Charlie AR Assistant:
Using augmented reality, voice recognition, and natural language processing to create an interactive experience: communicating with an engaging virtual assistant.
This is an academic project created by a five-person team at CMU. I was the UX designer and artist, responsible for exploring interaction in AR, prototyping and iterating the user interface, and designing the conversations between the user and the virtual character.
Interaction Design | Conversational Design | Flow Chart | VUI Design | Character Design
Sketch | Axure | Photoshop | Illustrator | Maya | Unity | API.AI | IBM Watson
Interaction Flow Chart
A young character helps adjust users' expectations and also allows for more surprising elements in the conversation.
Users subconsciously notice all the subtle facial expressions and animations. For example, if her blink is a bit slow, they will think she's tired.
Because users see her as a human-like character, they tend to treat her with human courtesy, and in return they expect her to act like a human. For example, it is okay for her to get things wrong or get distracted, but not to repeat the same sentence over and over.
Dialogflow (API.AI) can only provide standard answers. At this stage of the technology, we still need to manually author different states to give her answers a personal touch.
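As a minimal sketch of the manual "states" described above (the names, states, and lines here are illustrative, not the project's actual content), one way to keep the character from repeating herself verbatim is to pre-author several phrasings per intent and emotional state, and exclude whatever she said last:

```python
import random

# Hypothetical pre-authored content pool: a few phrasings per
# (intent, emotional state) pair, so the character's answers carry
# a personal touch instead of one standard reply.
RESPONSES = {
    ("greet", "cheerful"): ["Hi there!", "Hey, good to see you!"],
    ("greet", "tired"):    ["Oh... hi.", "Mm, hello."],
}

def pick_response(intent, state, last_utterance=None, rng=random):
    """Pick a pre-authored line for (intent, state), avoiding an
    immediate verbatim repeat of the character's last utterance."""
    pool = RESPONSES.get((intent, state), ["Sorry, I didn't catch that."])
    candidates = [r for r in pool if r != last_utterance] or pool
    return rng.choice(candidates)
```

In practice the webhook or client would track `last_utterance` per session; the point is that the variation lives in hand-authored content, not in the NLU engine.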
We are limited by the pool of content we can pre-author. To create a natural conversation, using the environment, objects, and animations to indirectly lead the user helps us optimize the experience.
A goal-based conversation is better suited to this kind of interaction. Defining a simple task is another way to limit the content pool while making the interaction meaningful: users tend to feel better about the experience when they achieve something.
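A goal-based conversation like this can be modeled as a small state machine, which is essentially what an interaction flow chart encodes. The sketch below is hypothetical (the task and intent names are invented for illustration, e.g. helping the user find an object in the AR scene), not the project's actual flow:

```python
# Hypothetical goal-based flow: each state maps recognized user
# intents to the next state. Unknown intents keep the current state,
# so the character can improvise without derailing the task.
FLOW = {
    "start":     {"yes": "searching", "no": "end"},
    "searching": {"found": "celebrate", "stuck": "hint"},
    "hint":      {"found": "celebrate", "stuck": "hint"},
    "celebrate": {},
    "end":       {},
}

def advance(state, user_intent):
    """Return the next state for a recognized intent, or stay put."""
    return FLOW.get(state, {}).get(user_intent, state)
```

Bounding the conversation to one task keeps the content pool small, and reaching the terminal "celebrate" state gives the user a concrete sense of achievement.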