Nowadays, artificial intelligence (AI) and data are the buzzwords. Data is constantly being collated, analysed and used, in theory, to make our lives easier. But AI can typically only analyse the actual words we use and, by one widely cited estimate, what you say accounts for just seven per cent of communication. The remaining 93 per cent is how you say it, conveyed through gestures, vocal tone and body language, and this is not analysed at the moment.
So, if AI is to reach its potential, it has to develop and evolve to encompass this 93 per cent, not just raw data.
Sensum is an empathic technology company building emotion AI solutions to measure, understand and respond to human emotions, physiology and behaviour. But how does this technology work, and will it change the future of driving?
I spoke to Ben Bland, Chief Operations Officer at Sensum, to understand the technology better.
He explained that at the core of everything that Sensum does is a powerful ‘emotion translation’ engine called ‘Synsis’ that extracts emotional, physiological and behavioural signals from data derived from sensors.
Ben went on to explain what makes Sensum unique: “In order to increase the accuracy and reliability beyond what you would expect from just one data stream, e.g. facial coding or heart rate, Sensum uses multimodal sensor fusion. By fusing data from a wide range of sensors that detect physiological changes, and correlating that with contextual data and media, e.g. location, speed and environmental factors, we can get a more dynamic and holistic picture of the user’s state. This is what Synsis does in real time.
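In spirit, multimodal sensor fusion means combining several normalised signal streams into a single estimate of the user’s state. The sketch below is purely illustrative — the signal names, ranges and weights are assumptions for the example, not Synsis’s actual method or API:

```python
# Illustrative multimodal fusion: blend three normalised signals into one
# arousal score. Ranges and weights are assumptions, not Sensum's values.

def normalise(value, low, high):
    """Scale a raw reading into the 0-1 range, clamped at the edges."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

def fuse_arousal(heart_rate, skin_conductance, face_arousal,
                 weights=(0.4, 0.3, 0.3)):
    """Weighted fusion of three modalities into a single 0-1 arousal score."""
    hr = normalise(heart_rate, 50, 150)       # beats per minute
    eda = normalise(skin_conductance, 1, 20)  # microsiemens
    signals = (hr, eda, face_arousal)         # face_arousal assumed 0-1
    return sum(w * s for w, s in zip(weights, signals))

print(round(fuse_arousal(120, 12, 0.8), 3))  # → 0.694
```

In a real system the weights would be learned rather than fixed, and each modality would carry a confidence value so that a noisy or missing sensor degrades the estimate gracefully instead of corrupting it.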
“Having learned the hard way how to get over the headache of manually synchronising all the data and media together, we have created automated tools for this very purpose.
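Synchronising streams recorded at different rates is, at its simplest, a matter of aligning samples by timestamp. The following sketch (my illustration, not Sensum’s tooling) pairs each frame of one stream with the nearest-in-time sample of another:

```python
from bisect import bisect_left

# Illustrative stream alignment: pair each (time, value) sample in a
# reference stream with the nearest-in-time sample of a second stream.

def nearest(timestamps, t):
    """Index of the timestamp closest to t in a sorted list."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

def synchronise(ref, other):
    """For each (time, value) in ref, attach the nearest value from other."""
    times = [t for t, _ in other]
    return [(t, v, other[nearest(times, t)][1]) for t, v in ref]

video = [(0.00, "f0"), (0.04, "f1"), (0.08, "f2")]  # 25 fps video frames
heart = [(0.00, 72), (0.05, 74), (0.10, 73)]        # slower heart-rate samples
print(synchronise(video, heart))
```

Production pipelines also have to handle clock drift between devices and missing samples, which is precisely the “headache” that makes automated tools worth building.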
“We have also created a mobile solution and consider ourselves world leaders at measuring emotions outside of a lab environment, ‘in the wild’ so to speak. We’ve taken our technology to a huge range of environments from homes, shops and offices to extreme environments such as the Arctic, inside a volcano and in a range of extreme sports.”
But how does it all work?
At this stage of development, with the industry trying to understand the best uses for empathic technology, physical sensors are placed inside the car. Sensum’s research kit includes biometric sensors, video cameras and microphones so any vehicle can be rigged up with the necessary tools required to measure the driver or passenger’s current state.
With the capability to both measure emotions from these sensors and operate the equipment from an app on any standard smartphone, data can be collected at any time in the driving experience.
Ben explained: “Car owners might find it uncomfortable to be wired-up to sensors and recorded while they drive, but we find participants in our studies get used to it immediately.
“However, several top-tier companies are looking into sensors that are built into the vehicle, which has the advantage of being less intrusive than wearing a sensor. As this technology becomes increasingly embedded into the vehicle, we expect it will soon become part of everyday driving, with these sensors turning vehicles from passive machines into actively responsive, empathic ones.”
Sensum have recently enjoyed collaborating with Ford, who asked them to measure the ‘buzz moments’ experienced by drivers controlling a Ford performance car. Sensum then used the insights to supply the technology for a Ford ‘Buzz Car’ that generated visualisation displays around the car whenever the driver got excited.
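One simple way to flag a ‘buzz moment’ is to watch for a sudden jump in an arousal signal above its recent baseline. This detector is a hypothetical sketch of that idea, not the method Ford or Sensum actually used; the window size and threshold are arbitrary assumptions:

```python
# Hypothetical 'buzz moment' detector: flag samples where an arousal
# signal jumps well above its trailing average. Parameters are illustrative.

def buzz_moments(arousal, window=3, jump=0.25):
    """Return indices where the signal exceeds the trailing mean by `jump`."""
    moments = []
    for i in range(window, len(arousal)):
        baseline = sum(arousal[i - window:i]) / window
        if arousal[i] - baseline > jump:
            moments.append(i)
    return moments

trace = [0.2, 0.25, 0.22, 0.6, 0.3, 0.28, 0.27, 0.75]
print(buzz_moments(trace))  # → [3, 7]
```

In a demonstration like the ‘Buzz Car’, each flagged index would trigger a response, such as lighting up a visualisation, closing the loop from measuring emotion to responding to it.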
As well as being a successful PR campaign, this demonstrated the capability of modern technology to both measure and respond to emotions.
But can, and will, this technology change the future of driving?
With vehicles becoming more automated and connected, this empathic development is not just about the vehicle’s own technology. It is about how vehicles will become connected into a wider data infrastructure, communicating with other vehicles, with people and with the road infrastructure itself.
So what’s next for Sensum?
Ben said: “The mobility sector is exploding at the moment with new developments, and because that sector is uniquely savvy to the opportunities in empathic human-machine interaction, it is where we are concentrating our focus.
“That said, Sensum can be applied to almost any scenario and environment. Ultimately, our vision is to be a major part of the future ‘digital self’ that each of us takes with us everywhere we go.
“Everything we do relates to a better understanding of the emotions we, as humans, have.”
With AI making technology smarter and more useful, it seems that empathic AI is just the next step in smart technology.