Non-Verbal Feedback
Non-verbal communication and feedback using animation blendshapes, eye movement, and traditional AI patterns to create a human/avatar feedback loop.

The avatar, Audrey, uses Tore Knabe's Realistic Eye Movements (REM) package to control the eyes' subtle saccadic movement. Overall character movement, touch input, and animation are controlled by Unity's Mecanim together with a custom finite state machine controller class I wrote in C# that switches between states of awareness and response.
Once the user crosses a distance threshold or triggers a specific event, the REM package hands control to a custom facial and gesture IK control class, also written in C#.
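As a rough sketch of how that state machine and handoff might look, the snippet below toggles between idle, aware, and responding states and switches control between the REM component and the custom IK class. All class, field, and Mecanim parameter names here ("AvatarStateController", "Awareness", and so on) are illustrative assumptions, not the project's actual code.

    using UnityEngine;

    // Hypothetical sketch of the awareness state machine described above.
    public class AvatarStateController : MonoBehaviour
    {
        public enum AwarenessState { Idle, Aware, Responding }

        [SerializeField] Transform user;                 // tracked player/camera
        [SerializeField] float awareDistance = 3f;       // awareness threshold in metres (assumed)
        [SerializeField] MonoBehaviour remEyeMovement;   // REM component (assumed reference)
        [SerializeField] MonoBehaviour facialGestureIk;  // custom facial/gesture IK class

        Animator animator;
        AwarenessState state = AwarenessState.Idle;

        void Awake() => animator = GetComponent<Animator>();

        void Update()
        {
            float distance = Vector3.Distance(user.position, transform.position);

            // The distance threshold drives the Idle <-> Aware transition.
            if (state == AwarenessState.Idle && distance < awareDistance)
                Enter(AwarenessState.Aware);
            else if (state == AwarenessState.Aware && distance >= awareDistance)
                Enter(AwarenessState.Idle);
        }

        // Specific events (e.g. a touch input) can force the response state.
        public void TriggerResponse() => Enter(AwarenessState.Responding);

        void Enter(AwarenessState next)
        {
            state = next;
            animator.SetInteger("Awareness", (int)next); // Mecanim picks the matching animation

            // Hand eye/face control from REM to the custom IK class while engaged.
            bool engaged = next != AwarenessState.Idle;
            remEyeMovement.enabled = !engaged;
            facialGestureIk.enabled = engaged;
        }
    }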

For example, "State = Aware" with Audrey as seen above.
Within the Aware state, a response action drives Audrey's facial expressions with blendshapes in the character rig.
The blendshape values (weights) are driven by a custom class called "Anxiety".
The Anxiety class continuously determines two things (sketched below):
Trust Level = a value that slowly restores over time as long as a personal-space distance threshold is respected.
Anxiety Level = a value calculated from the user's distance to Audrey, compared against Audrey's current trust level.
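A minimal sketch of how the Anxiety class might compute those two values and map the result onto a facial blendshape follows. The field names, tuning constants, and blendshape index are assumptions for illustration; note that Unity blendshape weights run 0-100.

    using UnityEngine;

    // Illustrative sketch of the "Anxiety" idea: trust slowly restores while
    // personal space is respected, anxiety rises as the user closes in relative
    // to trust, and the result drives a facial blendshape weight.
    public class Anxiety : MonoBehaviour
    {
        [SerializeField] Transform user;
        [SerializeField] SkinnedMeshRenderer face;
        [SerializeField] int anxiousBlendShapeIndex = 0;   // e.g. a "worried" shape (assumed)
        [SerializeField] float personalSpace = 1.5f;       // metres (assumed tuning)
        [SerializeField] float trustRecoveryPerSecond = 0.05f;

        [Range(0f, 1f)] public float trustLevel = 1f;
        [Range(0f, 1f)] public float anxietyLevel;

        void Update()
        {
            float distance = Vector3.Distance(user.position, transform.position);

            // Trust slowly restores while the personal-space threshold is
            // respected, and erodes faster while it is violated.
            float delta = distance > personalSpace
                ? trustRecoveryPerSecond
                : -trustRecoveryPerSecond * 4f;
            trustLevel = Mathf.Clamp01(trustLevel + delta * Time.deltaTime);

            // Anxiety: how far inside personal space the user is, scaled by distrust.
            float intrusion = Mathf.Clamp01(1f - distance / personalSpace);
            anxietyLevel = intrusion * (1f - trustLevel);

            // Unity blendshape weights run 0-100.
            face.SetBlendShapeWeight(anxiousBlendShapeIndex, anxietyLevel * 100f);
        }
    }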