...

 -create_character [char name] [config-filename] - creates a character with the given name and config file. You can specify multiple characters one after the other (see the launch example after these options).

 -data_folder_path: specifies the folder path to the xslt files and the rule_input_[culture].xml file.

 

Optional:

-write_to_file: specifies whether the output BML is sent out as a VH message or written to a file with the name given here. Default is "false".

...

-storypoint: specifies which story point should be loaded from the saliency-map XML; the saliency map is then updated accordingly, with priorities given to the appropriate pawns.
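
For example, a typical launch might combine these arguments as shown below. The executable name, character names, folder path, and file names here are placeholders, not fixed values; substitute the ones used in your installation:

NVBG.exe -create_character ChrBrad ChrBrad.ini -create_character ChrRachel ChrRachel.ini -data_folder_path ..\data\nvbg\ -write_to_file brad_rachel_bml.xml -storypoint story_point_1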

 

 


Configuring the character

To define a character, you can use a config file, which you specify as a command-line argument to NVBG. The structure of the config file is shown below:

 

****************ChrBrad.ini********************

[general]
rule_input_file=rule_input_ChrBrad.xml
posture=ChrBrad@Idle01
all_behavior=on
saliency_glance=on
saliency_idle_gaze=on
speaker_gaze=on
listener_gaze=on
nvbg_POS_rules=on
saliency_map=saliency_map_init_brad.xml


As we can see, all the information required to configure a character can be specified in the config file.

 

 

 

Sending input messages

 

NVBG subscribes to vrExpress messages and a few other control messages that allow for setting some options. Below is a list of the messages that NVBG subscribes to:

vrExpress

This message is sent to NVBG by the NLU, NPCEditor, or a similar module. It can be used to convey information about speech data, posture, status changes, emotion changes, gaze data, etc., as shown below.


Speech

The speech messages are characterized by the speech tag within them. They are interpreted, and the corresponding output BML is generated with speech time marks, animations, head nods, facial movements, etc. These animations are generated based on the content of the speech tag and the fml tag in the input message.

 vrExpress "harmony" "ranger" "harmony221" "<?xml version="1.0" encoding="UTF-8" standalone="no" ?><act>
<participant id="harmony" role="actor"/>
<fml>
<intention>
<object name="A316">
<attribute name="addressee">ranger</attribute>
<attribute name="speech-act">
<object name="A317">
<attribute name="content">
<object name="V28">
<attribute name="modality">
<object name="V29">
<attribute name="conditional">should</attribute>
</object>
</attribute>
<attribute name="polarity">negative</attribute>
<attribute name="attribute">jobAttribute</attribute>
<attribute name="value">bartender-job</attribute>
<attribute name="object-id">utah</attribute>
<attribute name="type">state</attribute>
<attribute name="time">present</attribute>
</object>
</attribute>
<attribute name="motivation">
<object name="V27">
<attribute name="reason">become-sheriff-harmony</attribute>
<attribute name="goal">address-problem</attribute>
</object>
</attribute>
<attribute name="addressee">ranger</attribute>
<attribute name="action">assert</attribute>
<attribute name="actor">harmony</attribute>
</object>
</attribute>
</object>
</intention>
</fml>
<bml>
<speech id="sp1" type="application/ssml+xml">ranger utah cant be bartender if he becomes sheriff</speech>
</bml>
</act>"vrExpress "utah" "ranger" "utah200" "<?xml version="1.0" encoding="UTF-8" standalone="no" ?><act>
<participant id="utah" role="actor"/>
<fml>
<intention>
<object name="A131">
<attribute name="addressee">ranger</attribute>
<attribute name="speech-act">
<object name="A133">
<attribute name="reason">become-sheriff</attribute>
<attribute name="content">
<object name="V6">
<attribute name="content">
<object name="V7">
<attribute name="q-slot">polarity</attribute>
<attribute name="type">question</attribute>
<attribute name="prop">
<object name="P24">
<attribute name="type">event</attribute>
<attribute name="location">town</attribute>
<attribute name="theme">sheriff-job</attribute>
<attribute name="event">providePublicServices</attribute>
<attribute name="agent">utah</attribute>
<attribute name="time">future</attribute>
</object>
</attribute>
</object>
</attribute>
<attribute name="type">csa</attribute>
<attribute name="action">info-req</attribute>
<attribute name="actor">ranger</attribute>
<attribute name="addressee">utah</attribute>
</object>
</attribute>
<attribute name="motivation">
<object name="V5">
<attribute name="reason">become-sheriff</attribute>
<attribute name="goal">avoid</attribute>
</object>
</attribute>
<attribute name="actionpolarity">avoid<>negative</attribute>
<attribute name="typeattribute">backward<>jobAttribute</attribute>
<attribute name="addresseevalue">ranger<>bartender-job</attribute>
<attribute name="actorobject-id">utah</attribute>
</object>
</attribute>
</object>
</intention>
</fml>
<bml>
<speech id="sp1" type="application/ssml+xml">im not sure about that</speech>
</bml>
</act>"

Posture change

These messages are characterized by the <body posture=""> tag, which lets NVBG know that there has been a change in posture.

<attribute name="type">state</attribute>
<attribute name="time">present</attribute>
</object>
</attribute>
<attribute name="motivation">
<object name="V27">
<attribute name="reason">become-sheriff-harmony</attribute>
<attribute name="goal">address-problem</attribute>
</object>
</attribute>
<attribute name="addressee">ranger</attribute>
<attribute name="action">assert</attribute>
<attribute name="actor">harmony</attribute>
</object>
</attribute>
</object>
</intention>
</fml>
<bml>
<speech id="sp1" type="application/ssml+xml">ranger utah cant be bartender if he becomes sheriff</speech>vrExpress "harmony" "None" "??" "<?xml version="1.0" encoding="UTF-8" standalone="no" ?><act>
<participant id="harmony" role="actor" />
<bml>
<body posture="HandsAtSide" />
</bml>
</act>"

Status / request

 

The idle_behavior and all_behavior attributes within the request tag allow NVBG to keep track of whether or not to generate the corresponding behavior.

vrExpress "harmony" "None" "??" "<?xml version="1.0" encoding="UTF-8" standalone="no" ?><act>
<participant id="harmony" role="actor" />
<fml>
<status type="present" />
<request type="idle_behavior" value="off" />
</fml>
</act>"

Gaze


These gaze tags, if present within the input message, are transferred unaltered to the output message.

vrExpress "harmony" "rangerNone" "constant103??" "<?xml version="1.0" encoding="UTF-8" standalone="no" ?><act>
<participant id="harmony" role="actor" />
<fml>
<gaze <status type="weak-focus" target="ranger" track="1" speed="normal" > "listen_to_speaker" </gaze>present" />
<request type="idle_behavior" value="off" />
</fml>
</act>"


vrExpress "utah" "ranger" "gaze199" "<?xml version="1.0" encoding="UTF-8" standalone="no" ?><act>
<participant id="utah" role="actor" />
<fml>
<gaze type="avert" target="ranger" track="eyes-offset" speed="slower" > "planning_speech_hold_turn" </gaze>
</fml>
</act>"


Emotion

The affect tag contains data about the emotional state the character is currently in. This can be used to influence output behavior.

vrExpress "harmony" "None" "schererharmony17" "<?xml version="1.0" encoding="UTF-8" standalone="no" ?><act>
<participant id="harmony" role="actor" />
<fml>
<affect type="Fear" STANCE="LEAKED" intensity="110.475"></affect>
</fml>
<bml> </bml>
</act>"

 

 

nvbg_set_option

These are control messages that set options for NVBG. They are listed below; a usage example follows the list:

nvbg_set_option [char-name] all_behavior true/false - sets/unsets flag that allows all behavior generated by NVBG.

nvbg_set_option [char-name] saliency_glance true/false - sets/unsets flag that allows saliency map generated gazes. These gazes are based on content in the speech tag and the information in the saliency map.

nvbg_set_option [char-name] saliency_idle_gaze true/false - sets/unsets flag that allows idle gazes generated by the saliency map. These idle gazes are based on the priority of pawns in the saliency map and are generated when the character is idle.

nvbg_set_option [char-name] speaker_gaze true/false - sets/unsets flag that allows for the character to look at the person he's speaking to.

nvbg_set_option [char-name] speaker_gesture true/false - sets/unsets flag that allows gestures to be generated when speaking.

nvbg_set_option [char-name] listener_gaze true/false - sets/unsets flag that allows the listener to gaze at the speaker when he speaks.

nvbg_set_option [char-name] nvbg_POS_rules true/false - sets/unsets flag that allows behavior to be generated based on parts of speech returned by the parser.

nvbg_set_option [char-name] saliency_map [filename] - lets you dynamically specify the saliency map that the character will use.

nvbg_set_option [char-name] rule_input_file [filename] - lets you dynamically specify the rule input file (behavior map) that the character will use.
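
For example, to switch off saliency-map glances for a character named ChrBrad and then point him at a different saliency map at runtime, you could send the following messages (the character name is illustrative; saliency_map_init_brad.xml is the file from the config example above):

nvbg_set_option ChrBrad saliency_glance false
nvbg_set_option ChrBrad saliency_map saliency_map_init_brad.xml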


Understanding output messages

...

<rule keyword="statement_animation" priority="2" >
<pattern>is</pattern>
<pattern>are</pattern>
<pattern>were</pattern>
<pattern>was</pattern>
<pattern>have been</pattern>
<pattern>has been</pattern>
<pattern>at</pattern>
<pattern>stands</pattern>
<pattern>come</pattern>
<pattern>like</pattern>
<animation>
<posture name="CrossedArms"> 
<clip>CrossedArms_RArm_GestureYou02</clip>
<clip>CrossedArms_RArm_GestureYouPalmUp</clip>
</posture> 
<posture name="HandsAtSide">
<clip>HandsAtSide_RArm_GestureOffer</clip>
<clip>HandsAtSide_RArm_LowBeat</clip>
<clip>HandsAtSide_RArm_MidBeat</clip>
<clip>HandsAtSide_Arms_Beat</clip>
<clip>HandsAtSide_Arms_Chop</clip>
<clip>HandsAtSide_RArm_Chop</clip>
<clip>HandsAtSide_RArm_FistsChop</clip>
<clip>HandsAtSide_RArm_LowBeat</clip>
<clip>HandsAtSide_RArm_MidBeat</clip>
</posture>
<posture name="HandsOnHip">
<clip>HandsOnHip_RArm_MidBeat</clip>
</posture>
<posture name="LHandOnHip">
<clip>LHandOnHip_RArm_You</clip>
<clip>LHandOnHip_RArm_GestureOffer</clip>
</posture>
<posture name="Chair">
<clip>Chair_You_Small</clip>
<clip>Chair_You</clip>
</posture>
</animation>
</rule>

 

The <pattern> tag contains the word that is to be matched, and the <clip> tags contain the animations that should be played when that word is spoken. The <posture> tag is used to set animations for each character posture. Only animations that belong to the character's current posture will be played by NVBG. The 'priority' attribute allows NVBG to prioritize between animations when multiple ones overlap.

...