Optimising your Conversation

What are speech markers?

Speech markers are commands that can be included within the Digital Person's text response to trigger an action. Speech markers always begin with an `@` symbol and often take one or more parameters.

Digital DNA Studio supports speech markers in NLP conversation scripts (created in Dialogflow, Watson, etc.). They enable Conversation Engineers/Writers to control specific elements and features of the UI, via commands or utterances from the user, during a conversation with the Digital Person.

Note: Depending on your conversation provider, you may need to escape speech markers. For example, in IBM Watson Assistant, @attendObject would have to be written as \@attendObject.

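If you generate response text programmatically, this escaping can be automated. A minimal Python sketch, assuming your middleware post-processes response text before it reaches Watson Assistant (the helper name is illustrative, not part of any SDK):

```python
import re

def escape_markers(text: str) -> str:
    """Prefix each speech marker's @ with a backslash for IBM Watson Assistant."""
    # Negative lookbehind avoids double-escaping markers that are already escaped.
    return re.sub(r"(?<!\\)@(?=\w)", r"\\@", text)

print(escape_markers("Sure! @attendObject(menuCard)"))
# -> Sure! \@attendObject(menuCard)
```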
In general, UI-related markers such as @showcards, @hidecards, @feature, and @close require the UI to implement the corresponding functionality. Our default UI implements all of these features, but their exact behavior is up to the UI developer.

Available Speech Markers

Content Awareness

@attendObject
The Digital Person will look towards the on-screen object with the given ID.
Example: @attendObject([object_id: str], [start_time (optional): float], [duration (optional): float])

@gestureObject
The Digital Person will gesture towards the on-screen object with the given ID.
Example: @gestureObject([object_id: str], [start_time (optional): float], [duration (optional): float])

@gestureObjectBothSides
The Digital Person will gesture with both hands simultaneously towards the two on-screen objects with the given IDs.
Example: @gestureObjectBothSides([object_id_1: str], [object_id_2: str], [start_time (optional): float], [duration (optional): float])

@PointObject
The Digital Person will point with either the right or left hand, depending on the placement of the content on screen relative to the Digital Person.

@PointObjectPalmUp
The Digital Person will point, with palm halfway up, at the on-screen object with the given ID.

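The timing parameters above can be combined in a single response line in your conversation script. A sketch, assuming an object with the ID menuCard has been registered by the UI (the ID and wording are illustrative):

```python
# Illustrative NLP response text: attend to the card half a second in,
# then gesture at it for two seconds. "menuCard" is a hypothetical object ID.
response = (
    "Here are today's specials. "
    "@attendObject(menuCard, 0.5) "
    "@gestureObject(menuCard, 1.0, 2.0)"
)
```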
Content Cards

@showcards
Shows content card(s) on screen; the Digital Person gestures at the content.

@showcardsnogesture
Similar to @showcards, but the Digital Person will not look at or gesture towards the content card(s). The UI still receives a @showcards marker event for this speech marker.

@hidecards
Hides the content card(s) currently on screen.

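For instance, one conversation turn could show a card while speaking and a later turn could clear it. The card name menuCard and the single-argument form of @showcards are assumptions for illustration, not taken from this section:

```python
# Hypothetical conversation turns; "menuCard" is an assumed card name.
show_turn = "Here is our menu. @showcards(menuCard)"
hide_turn = "Let me clear that away. @hidecards"
```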
Speech

@pronounce
Lets the Digital Person say one thing while displaying another as text. Use it for hard-to-pronounce words (e.g. speaking ABC as "A, B, C").

UI Feature

@feature
Activates or deactivates a UI feature (the microphone or the transcript window) in response to a user's verbal request.
Note: This command can be used only on the default UI.

End session

@close
Ends the session in response to a user's verbal request.
Note: This command can be used only on the default UI.

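Because speech markers travel inside the response text, a custom UI typically strips them before displaying a transcript to the user. A minimal, hypothetical helper (not part of any Soul Machines SDK):

```python
import re

# Matches a marker such as @showcards(card1) or @close, plus trailing spaces.
_MARKER = re.compile(r"@\w+(?:\([^)]*\))?\s*")

def strip_markers(text: str) -> str:
    """Remove speech markers from response text before display."""
    return _MARKER.sub("", text).strip()

print(strip_markers("Goodbye for now. @close"))
# -> Goodbye for now.
```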
Available Tags

Tags refer to short-term changes in the Digital Person's state.

Interruption using user hand gesture
Digital People are able to respond to hand gestures from the user.

  • #EnableStopTalkingGesture

  • #DisableStopTalkingGesture

Pause
Adds pauses to speech.

  • #PauseHalf

  • #PauseOne

  • #PauseOneandHalf

  • #PauseTwo

Digital Person gesture markup
Drives specific gesture animations to override autonomous animations.

  • #HeartSign

  • #ThumbsUpOneHand

  • #ThumbsUpBothHands

  • #OneHandToBrow

  • #Bow

  • #DisappointedHeadShake

  • #Listening

  • #Wave

  • #WaveWide

  • #WaveHand

  • #WaveShy

  • #WaveSauve

  • #Stop

  • #JazzHands

  • #TakenAback

  • #Confused

  • #HappySwayHighEnergy

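Tags are embedded inline in the response text, just like speech markers. A hypothetical greeting that waves, pauses for a second, then continues:

```python
# Hypothetical response text mixing a gesture tag with a pause tag.
greeting = "Hello there! #Wave #PauseOne Welcome back. How can I help?"
```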
Related Topics

For detailed tutorials see Persona & Conversation Design and Conversation Implementation on Soul Academy.

 
