Bio-Motion in HMI
For the best part of a decade, travellers queuing at Gatwick airport's passport control could hardly have failed to notice a series of bright LEDs rotating smoothly around a suspended camera lens. White LEDs against a black surround were chosen for maximum contrast, and the smooth, trailing tail was designed specifically to create an illusion of biological movement.
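A trailing-tail effect like this can be sketched in a few lines: each LED's brightness decays exponentially with its angular distance behind a moving "head". The ring size, sweep speed, and decay constant below are illustrative guesses, not the actual Mflow parameters.

```python
import math

NUM_LEDS = 24      # assumed ring size; the real hardware is not documented here
TAIL = 6.0         # decay constant: larger values give a longer, softer tail

def frame(t: float, speed: float = 8.0) -> list[float]:
    """Brightness (0..1) of each LED at time t, with the head sweeping
    around the ring at `speed` LEDs per second."""
    head = (t * speed) % NUM_LEDS
    levels = []
    for i in range(NUM_LEDS):
        # distance *behind* the head, wrapping around the ring
        lag = (head - i) % NUM_LEDS
        levels.append(math.exp(-lag / TAIL))
    return levels
```

Rendering successive frames produces a bright head with a smoothly fading comet tail, which reads as continuous, organic motion rather than a blinking sequence.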
Called Mflow and created by Infobric, the cameras ran facial recognition software, and to function they needed the public to look up into the lens. While I've never been a fan of the term "hack" for, well, anything, this is about as close to a brain hack as you can get. To understand why, we need to know how our vision works.
At any given moment, only 1 to 2 degrees of our field of vision, the region served by a patch of retina known as the fovea, is in sharp focus. The rest of our vision is much blurrier and less detailed, but the brain does a remarkable job of filling in the gaps, using information from peripheral vision, memory, and even predictions based on past experience to create a seamless visual whole. If a particularly handsome individual catches your eye, and on looking directly at them you're surprised to find they are not as you expected, that was your brain predicting what they would look like from your preconceived beauty preferences, not the reality of bringing them into the fovea.
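That 1-to-2-degree figure is easy to translate into physical terms with the standard visual-angle formula, width = 2d·tan(θ/2). A quick sketch (the 60 cm viewing distance is an assumed, typical screen distance):

```python
import math

def foveal_span(distance_cm: float, angle_deg: float = 2.0) -> float:
    """Physical width covered by a given visual angle at a viewing distance,
    using width = 2 * d * tan(theta / 2)."""
    return 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)

print(round(foveal_span(60), 2))  # → 2.09
```

At a typical screen distance, the sharp region is only about 2 cm wide, roughly a thumbnail held at arm's length, which is why a bright attractor in the periphery has so much work to do.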
The extreme contrast of the LEDs captures your attention from outside the foveal range, but the real trick is the organic movement.
In 2005 neuroscientist Giorgio Vallortigara and his colleagues at the University of Trieste in Italy explored whether newly hatched chicks could distinguish biological motion patterns without prior visual experience. In the experiments, chicks were presented with computer screens displaying different animations:
A point-light animation of a hen walking.
A scrambled version of the hen's movement.
Randomly moving dots.
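As a rough illustration of how these three conditions differ, here is a toy sketch: the "biological" condition gives each dot a phase and position tied to a coherent structure, the "scrambled" condition keeps each dot's local trajectory but shuffles its base position, and the random condition has neither. This is a stand-in for intuition, not a reconstruction of the actual hen stimuli used in the study.

```python
import math
import random

N_POINTS = 11  # classic point-light walkers use roughly 11-13 joints

def biological(t: float) -> list[tuple[float, float]]:
    """Toy 'walker': joints oscillate with phases tied to body position."""
    return [(i * 0.1 + 0.05 * math.sin(t + i * 0.6),
             i * 0.2 + 0.05 * math.cos(t + i * 0.6)) for i in range(N_POINTS)]

def scrambled(t: float, seed: int = 42) -> list[tuple[float, float]]:
    """Same local motions, but base positions are shuffled: each dot's
    trajectory is preserved while the global body structure is destroyed."""
    rng = random.Random(seed)
    offsets = [(rng.uniform(0, 1), rng.uniform(0, 2)) for _ in range(N_POINTS)]
    return [(ox + 0.05 * math.sin(t + i * 0.6),
             oy + 0.05 * math.cos(t + i * 0.6))
            for i, (ox, oy) in enumerate(offsets)]

def random_dots(t: float, seed: int = 7) -> list[tuple[float, float]]:
    """Dots with unrelated positions and phases: no structure at all."""
    rng = random.Random(seed)
    return [(rng.uniform(0, 1) + 0.05 * math.sin(t + rng.uniform(0, 6.28)),
             rng.uniform(0, 2)) for _ in range(N_POINTS)]
```

The point of the scrambled control is that it rules out low-level explanations: every individual dot moves exactly as it does in the biological condition, so any preference must come from the global configuration.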
The results showed that the chicks exhibited a spontaneous preference for approaching the biological motion patterns, such as the walking hen, over the scrambled or random dot movements. This finding suggests that chicks are born with an innate ability to recognize and be attracted to the movement patterns characteristic of living beings. Interestingly, this predisposition extended beyond their species, as chicks also showed a preference for the biological motion of other vertebrates, including potential predators like cats. Humans are the same, and the snake-like movement of the Mflow LED circle taps into that predator instinct.
Not all uses of this technique are as nefarious.
In 2023, a team of four Chinese professors ran an experiment, creating an exterior human-machine interface (eHMI) for autonomous cars. Using the movement of a cheetah (I'm not making this up), they showed a series of videos to 32 participants, asking them to guess the intent of the vehicle. To quote their paper:
“In this experiment, we utilized a biomimetic point-light model inspired by the cheetah, for which we identified key light points representing its form and created a motion skeleton model in the 3D computer graphics software. After adjusting the biomimetic motion of the cheetah skeleton model in Blender, we imported the model into the Unity game engine to link preset cheetah actions, culminating in the creation of the experimental scenario. For instance, the ‘Giving Way while Moving’ includes running, pawing the ground and bowing”
This is the only picture in the white paper; it's not a rear-view mirror, but a car.
“These interactional scenarios were presented without any supplementary background cues. After each video, a two-question comprehension task was presented. The first question assesses one’s understanding of vehicle movement, offering choices such as ‘from stationary to moving’, ‘from moving to stationary’, ‘maintaining a stationary state’, and ‘continual movement’. The second question addresses the vehicle’s communicated intent with options like ‘Please cross’ and ‘Do not cross’.”
When compared to a simple text instruction, or to no display at all, their results were startling.
Now, I've not seen these videos, nor do I know whether the study was peer reviewed, but it offers a tantalizing glimpse of how biological motion could underpin HMI both inside and outside the vehicle in the years to come.
Finally, from a purely creative standpoint, weaving natural movement into cutting-edge 2D and 3D interfaces is an exciting prospect. When form and function combine to create a unique experience, everyone wins.