DEVELOPING A 3D VIRTUAL PERSONAL ASSISTANT
Bringing a high-fidelity avatar to life across a multi-screen experience
As part of Conjure’s ongoing interface research and development programme, we tasked ourselves with developing a real-time 3D Virtual Personal Assistant (VPA). We opted to simulate an automotive in-vehicle infotainment (IVI) system on iOS with two goals:
- Test and integrate the functionality of the Houndify voice platform; and
- Evaluate how OpenGL-driven 3D graphics respond to voice input
Over the course of the programme, we:
- Designed and modelled a visual avatar to represent the VPA
- Developed a companion phone app that allows the VPA to move off the IVI onto the handset
- Integrated Houndify into an iOS framework mimicking an automotive IVI
- Created a 3D avatar running at 60fps
Can a 3D VPA avatar be integrated with a conversational UI?
Having reviewed a series of voice-control solutions such as Siri, Google Speech and IBM Watson, we opted for Houndify, the system powering Mercedes MBUX. Our challenge was to build the app so that the character maintained a high frame rate while responding rapidly to the Houndify API.
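One way to keep the response fast without touching the render loop is to decode the voice platform's result once and hand the renderer a simple animation state. A minimal sketch follows; `VoiceResult`, `commandKind` and the state names are illustrative assumptions, not the actual Houndify schema.

```swift
import Foundation

// The avatar states our renderer can play; these names are our own, not Houndify's.
enum AvatarState: String {
    case idle, thinking, weather, navigation, tyrePressure
}

// A hypothetical, simplified response shape -- the real Houndify JSON is far
// richer, but the principle is the same: decode once, off the render thread,
// then pass the renderer a lightweight state value.
struct VoiceResult: Codable {
    let commandKind: String
}

// Map a decoded command onto an avatar animation, defaulting to idle.
func avatarState(for result: VoiceResult) -> AvatarState {
    switch result.commandKind {
    case "WeatherCommand":      return .weather
    case "NavigationCommand":   return .navigation
    case "TirePressureCommand": return .tyrePressure
    default:                    return .idle
    }
}

let json = #"{"commandKind": "WeatherCommand"}"#.data(using: .utf8)!
let state = avatarState(for: try! JSONDecoder().decode(VoiceResult.self, from: json))
// state is .weather
```

Keeping the renderer's input to a small enum like this means the OpenGL side never blocks on parsing or networking.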
As an added feature, we also wanted the avatar to be able to disappear from one device and reappear on another, as if the VPA could free itself from the IVI.
The development plan spanned three key areas: importing the animated avatar, integrating Houndify, and establishing Bluetooth connectivity between tablet and phone. It quickly became clear that 3D importation would be the biggest challenge, and through a series of experiments across a range of export formats we settled on a pipeline that gave us the highest-fidelity model at a smooth frame rate.
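"Smooth" here concretely means hitting the 60fps target, which leaves roughly 16.7ms per frame for everything: rendering, animation blending and any voice-response handling. A tiny helper makes the budget explicit (the function names are ours, for illustration):

```swift
// At 60fps, each frame has 1000/60 ≈ 16.7ms; any work that can't fit in that
// window must move off the render thread or the avatar visibly stutters.
func frameBudgetMs(fps: Double) -> Double {
    1000.0 / fps
}

// True when a measured frame time blows the budget for the target frame rate.
func missedBudget(frameTimeMs: Double, fps: Double = 60) -> Bool {
    frameTimeMs > frameBudgetMs(fps: fps)
}
```

This kind of check is useful when comparing export formats: a higher-fidelity model is only acceptable if its per-frame cost still fits inside the budget.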
The overall interface was designed with a clean aesthetic, mimicking an IVI with choice elements of data while giving the avatar centre stage.
The team for this programme consisted of a product owner, two developers, a character artist, a 3D artist and a UI designer.
Artistic and technical challenges require diverse skill sets, so the product owner ensured all team members worked in close proximity and to a planned Agile programme. We wanted the avatar to be both futuristic and expressive, so the developers and the 3D artist focused on ensuring the subtle animation and shaders were ported exactly as the character artist intended.
The finished demo spanned both an iPhone and an iPad, with the user launching the application on both devices. The two would then invisibly connect via Bluetooth, with the VPA floating in space awaiting a command on the primary iPad. When the user asks a question (the state of the vehicle's tyre pressures, for instance), our avatar springs into life, first thinking, then animating around a render of the vehicle to show the state of each tyre.
We created a string of unique animations for tasks such as weather and GPS requests, allowing us to show the avatar moving in real time. Finally, when the user picks up the iPhone and walks away from the iPad, our avatar neatly disappears from the larger interface, bouncing into life on the small screen, ready to serve once again.
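The article doesn't specify exactly what triggers the handoff, but one plausible rule (sketched below as an assumption, not the shipped implementation) is to watch the Bluetooth signal strength (RSSI) of the paired iPhone and move the avatar when it weakens past a threshold, with hysteresis so the avatar doesn't flicker between screens near the boundary:

```swift
// Which screen currently hosts the avatar.
enum AvatarHost {
    case ipad, iphone
}

// Hypothetical handoff rule with hysteresis: the "leave" and "return"
// thresholds differ so a reading hovering around one value can't bounce
// the avatar back and forth. The dBm values here are illustrative.
struct HandoffPolicy {
    let leaveThreshold = -70   // dBm: weaker than this => user walked away
    let returnThreshold = -55  // dBm: stronger than this => user came back

    func nextHost(current: AvatarHost, rssi: Int) -> AvatarHost {
        switch current {
        case .ipad where rssi < leaveThreshold:    return .iphone
        case .iphone where rssi > returnThreshold: return .ipad
        default:                                   return current
        }
    }
}

var host = AvatarHost.ipad
host = HandoffPolicy().nextHost(current: host, rssi: -80)
// host is now .iphone: the avatar jumps to the handset
```

In a real app the RSSI readings would come from CoreBluetooth's peripheral callbacks; the pure decision logic above stays trivially testable either way.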