...

See Release Notes for details.

  • Jul 11 2012 - We have released a minor update to the toolkit, fixing some usability and stability issues. See Release Notes for details.
  • May 31 2012 - An exciting new version of the toolkit is now available, offering the MultiSense framework, the Rapport research platform and the SBMonitor tool. MultiSense is a perception framework that enables multiple sensing and understanding modules to interoperate simultaneously, broadcasting data through the Perception Markup Language. MultiSense currently contains GAVAM, CLM FaceTracker and FAAST, which you can use with a webcam or Kinect. The Rapport agent is a “virtual human listener” that provides nonverbal feedback based on human nonverbal and verbal input. It has been used in a variety of international studies on establishing rapport between real and virtual humans. Finally, SBMonitor is a stand-alone tool for easy debugging of SmartBody applications, including testing available (facial) animations, gazes and more complex BML commands.
  • Mar 2 2012 - A minor update release of the toolkit is now available, updating the Unity version to 3.5 and providing incremental changes to the Unity/SmartBody debug tools in the Unity Editor (VH menu in Unity).
  • Dec 22 2011 - Happy holidays! The latest release of the toolkit includes the ability to interrupt Brad, improved support for higher resolutions, and a fix for text-to-speech not working properly.
  • Aug 10 2011 - Released a new version of the toolkit, which offers support for the free version of Unity 3D; users may now create scenes for Brad. Download Unity here: http://www.unity3d.com. For instructions on how to use the Unity 3D Editor with the toolkit, see the vhtoolkitUnity section. In addition, user interaction has been improved: Unity now launches in full-screen automatically, and users get visual feedback when talking to Brad. To talk to Brad directly, first make sure you have a microphone plugged in, wait for Brad to finish his introduction, and close the tips window. Then click and hold the left mouse button while asking Brad a question; release the mouse button when you are done talking. The recognized result is displayed above Brad in white font (toggle on/off with the O key), and Brad will answer your question. We advise updating Java and ActiveMQ, which are provided with the 3rd party installer versions.

For feedback, see the toolkit Google Groups.

News Archive

Toolkit Overview

Goal

The goal of the Virtual Human Toolkit developed by the University of Southern California Institute for Creative Technologies (ICT) is to make creating virtual humans easier and more accessible, and thus expand the realm of virtual human research and applications.

...