Custom Content Integrations for your Digital Person

Overview

To integrate custom content with our platform, we offer a Skills API.

What are Skills?

Skills are modular conversational components that can be added to a Digital Person to enhance its capabilities. Skills can do anything from providing repeatable content to acting as an interface between the Digital Person and third-party applications or APIs. Skills reduce the time and effort required to implement these capabilities. When creating a project in Digital DNA Studio, customers can add one or more Skills to augment the Digital Person’s base conversation and deploy in minutes.

  • Skills can only be used with HumanOS 2.2 or higher Digital People.

  • Skills are only supported in English conversations.

  • Skills can only be used with the NLP platforms specified below.

NLP Platforms Supporting Skills

Check out Building Skills for more information on how you can build your own Skills.

Out-of-the-box Skills

Soul Machines offers the following out-of-the-box Skills for you to leverage: 

Currently, Skills are only created by Soul Machines and are vetted for potential risk to your projects. To support this, each one is documented with as much transparency as possible. 

Adding Skills to Soul Machines Studio

The video tutorial below demonstrates the process of adding Skills from Digital DNA Studio (refer to Creating a Project | 3. Connect conversation and add skills for detailed instructions).

Video: Add Skills to your Digital Person

Testing Skills

The sample conversation supports Skills, allowing you to create a project and try out Skills without having to configure an NLP platform.

Deleting Skills

Skills are independent of each other and of the base conversation, so you may remove a skill after it has been added to your project. However, if you remove the base conversation, all the skills connected to it will also be removed.

How Skills Work

Skills augment your conversation at runtime. The system works like a switchboard: Skills do not alter your conversation, but when the time is right the Digital Person switches away from your conversation to fill gaps. For example, adding a Math Skill to a Digital Person allows it to answer math-related questions.

  • A skill can be implemented in any service that can provide a turn-based conversation. Skills can be implemented on the supported NLP platforms, or custom-built using the Skills Webhook API.

  • A skill can provide sufficient capability to drive a turn-based interaction with a user by itself (i.e. as a base conversation in place of one implemented using a provider such as IBM Watson, Google DialogFlow ES/CX, or Microsoft Azure Bot Service), or be designed to work alongside other skills within a single interaction. There are different skill types, and their roles are covered on this page.

  • Some skills may require configuration prior to being added to a project. Configuration may involve providing credentials and setting the possible responses.
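As a rough illustration of the turn-based contract described above, the sketch below shows a single-turn handler a custom-built skill might expose. The request/response field names (`text`, `answer`, `matched`) are illustrative assumptions for this sketch, not the actual Skills Webhook API schema.

```python
# Minimal sketch of one conversation turn for a hypothetical custom skill.
# Field names below are assumptions, not the real Skills Webhook API payload.
def handle_turn(request: dict) -> dict:
    """Inspect the user's text and either answer (skill matched) or signal
    that this skill cannot handle the query, so routing can continue."""
    text = request.get("text", "").lower()
    if "hello" in text:
        return {"answer": "Hi there!", "matched": True}
    # matched=False tells the orchestrator to try the next skill or the
    # fallback-handling skill instead.
    return {"answer": "", "matched": False}
```

A real skill would typically serve this handler over HTTPS and may also receive the configuration values (credentials, response settings) mentioned above.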

Flow Diagram

The following flow diagram illustrates how a response from a Digital Person is generated when a base conversation corpus is used alongside a range of skills.

  1. User input is received and pre-processed by a Pre-process Skill (e.g. for translation from the user’s language to a language that subsequent skills can handle)

  2. The output from the Pre-process Skill is sent to the Base Conversation

  3. If the Base Conversation is unable to handle this request:

    a. A fallback is triggered and the request is routed to one of the other skills (Math, Weather, or Other in the diagram below).

    b. If none of the other skills are able to handle the request, it is instead routed to a fallback-handling skill (Elegant Failure in the diagram below).

    c. The response from one of the skills in (a) or (b) is processed by the Post-process Skill prior to being spoken by the Digital Person

  4. Otherwise, the response from the Base Conversation is processed by the Post-process Skill (e.g. for translation from the output language of prior skills to the target output language which the user can understand) prior to being spoken by the Digital Person
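The numbered steps above can be sketched as a single routing function. The callable signatures and the use of `None` to signal "cannot handle this request" are illustrative assumptions for this sketch, not the platform's actual interfaces.

```python
from typing import Callable, List, Optional

def generate_response(
    user_input: str,
    preprocess: Callable[[str], str],
    base_conversation: Callable[[str], Optional[str]],
    skills: List[Callable[[str], Optional[str]]],
    elegant_failure: Callable[[str], str],
    postprocess: Callable[[str], str],
) -> str:
    query = preprocess(user_input)          # Step 1: e.g. translation
    reply = base_conversation(query)        # Step 2: try the Base Conversation
    if reply is None:                       # Step 3: fallback triggered
        for skill in skills:                # 3a: Math, Weather, Other, ...
            reply = skill(query)
            if reply is not None:
                break
        if reply is None:                   # 3b: no skill could handle it
            reply = elegant_failure(query)
    return postprocess(reply)               # 3c / 4: post-process, then speak
```

For example, with a weather skill that only answers weather queries and a base conversation that declines them, a weather question is routed to the skill while other input stays with the base conversation.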


Skills Flow Diagram

Considerations

  • If the Skill (or other Skills) cannot handle the user query and there is no fallback handling skill (such as Elegant Failure), the Digital Person speaks the fallback response from the Base Conversation.

  • The next user query will be directed back to the original Base Conversation corpus after the Skill responds. 

  • After the digression, the Base Conversation corpus reverts to the same turn at which the skill-related digression occurred (for IBM Watson and Google DialogFlow ES only).
