In the Swiss Biathlon Arena Lenzerheide, TAWNY, together with young Swiss biathlon professionals and the innovation department of Red Bull Media House, conducted initial tests on how athletes' hit probability could be predicted from biometric patterns. Three-time Olympic biathlon champion Michael Greis and four other young professionals enthusiastically participated in the project to decode the DNA of their flow.
In fact, with the help of artificial intelligence (AI), a physiological pattern was identified that estimates the probability of hitting targets at a given moment. The most striking finding of the analysis is that there appears to be a physiological threshold: once athletes drop below it, the probability of hitting the target at the shooting range increases considerably. Consequently, helping athletes learn to reach this sub-threshold state can improve their shooting performance.
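To make the threshold finding concrete, here is a minimal, purely illustrative sketch. The feature (an arousal reading), the threshold value, and the two probabilities are all assumptions for demonstration; the article does not disclose TAWNY's actual model or numbers.

```python
# Illustrative sketch only: a single-feature threshold model of hit
# probability. The arousal measure, cutoff, and probabilities are
# invented for demonstration, not TAWNY's real model.

AROUSAL_THRESHOLD = 6.5  # assumed cutoff on some arousal scale

def hit_probability(arousal: float) -> float:
    """Assumed hit probability for a given arousal reading.

    Below the threshold (the calmer, sub-threshold state described
    in the text) the probability is assumed to jump considerably.
    """
    return 0.85 if arousal < AROUSAL_THRESHOLD else 0.55

# Example: readings taken just before shooting
for reading in (4.2, 6.4, 6.6, 8.1):
    print(reading, hit_probability(reading))
```

The step function is the simplest way to express "a threshold beyond which performance changes considerably"; a real model would presumably smooth this with something like logistic regression over many biometric features.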
This insight could not only change the training routine and method of athletes, but also revolutionize the sports experience of spectators through additional information on the TV screen.
The four levels of emotional intelligence
And yet, biathlon is just one of countless application examples where AI has become indispensable. Essentially, it's about improving machines or processes through automation and self-learning systems. TAWNY establishes a completely new position within this landscape: "Emotion AI" is the key term here and, similar to the development stages of autonomous driving, it can be described with a level model:
As the graphic shows, our goal is to make machines, products, and services empathetic. This is the next and still largely unexplored development path in the AI world: considering the human factor.
Level 0: EQ Zero
Level 0 stands for an EQ (=EI=Emotional Intelligence) of zero. Currently, this level probably applies to 99% of all machines on this planet. Industrial equipment or even a calculator have no access whatsoever to the emotional state of their users. They serve humans merely as tools and aids in completing tasks, such as complex mental arithmetic. So-called "smart chatbots" also belong to this level. Just ask Alexa, "Alexa, how am I feeling?"; the answer will disappoint you.
Level 1: simulated EI
Level 1 is found in rule-based systems and assistants that appeal to human emotions or assume that a person is in a certain state (such as stress). One example of this is the small coffee cup indicator in the car cockpit, introduced in 2009, which is intended to signal, depending on the mileage, whether the driver might be a bit tired and inattentive.
Another example is the success story of the Tamagotchi, which originated in Japan in the 1990s: an electronic chick that, depending on the attention it receives and the intensity of use, either thrives or dies, only to be revived. The term "Tamagotchi effect" is used in connection with the development of mutual emotional bonds between humans and machines or robots.
Level 2: partial EI
However, real emotional intelligence is only achieved at Level 2. The research project and startup TAWNY.ai is an example of this stage. Using wristbands, biometric data such as heart rate variability or electrodermal activity (skin conductance) are measured to classify human emotions and states of overstimulation, understimulation, and flow.
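As a rough illustration of how two wristband signals could map onto the three states named above, consider this rule-based sketch. The cutoff values and the mapping are illustrative assumptions; TAWNY's actual classifier is not described in the article.

```python
# Illustrative sketch: a rule-based mapping from two wristband
# signals -- heart rate variability (RMSSD, ms) and electrodermal
# activity (skin conductance, microsiemens) -- onto the three states
# named in the text. Thresholds are invented, not TAWNY's model.

def classify_state(rmssd_ms: float, eda_us: float) -> str:
    high_arousal = eda_us > 8.0     # assumed EDA cutoff
    low_recovery = rmssd_ms < 30.0  # assumed HRV cutoff
    if high_arousal and low_recovery:
        return "overstimulation"    # stressed: high arousal, low HRV
    if not high_arousal and not low_recovery:
        return "understimulation"   # disengaged: low arousal, relaxed
    return "flow"                   # engaged but not stressed

print(classify_state(rmssd_ms=22.0, eda_us=10.5))  # overstimulation
```

In practice such a classifier would be learned from labeled data rather than hand-written rules, but the structure (biometric features in, emotional state out) is the same.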
This information is then passed on to connected devices to empower them. Cars know how aggressive their drivers are and can autonomously adjust their driver assistance systems accordingly. Smart homes know which settings make residents feel most comfortable. Televisions make program recommendations based on the viewer's mood. The workplace adapts to the mental state of employees.
This level of emotionally intelligent products and services has the potential to be a game changer for entire industries, as it enables a completely new dimension in improving job satisfaction, workplace safety, and application customization. This is also supported by the exploding market for emotion detection and recognition (EDR) technology. While it was just under $7 billion in 2016, a market volume of $36 billion is expected for 2021.
In our exciting biathlon project, we were able to demonstrate that Level 2 is not just TAWNY's assumption or vision, but a lived reality. This isn't a drawing-board exercise, but real life!
Level 3: high EI
Level 3 will be achieved within the next 20 years. Through multimodal input from camera-based facial expression recognition, speech analysis methods, text-based sentiment analysis, and vital data, a 24/7 emotional profile will be compiled to adapt each person's environment to their emotional and mood worlds, as well as their general mental state. Given that humans do not fulfill the rationality assumption of homo oeconomicus, the ability to measure and predict emotion-driven actions would be a milestone in consumer research.
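One simple way to picture the multimodal combination described above is a weighted fusion of per-modality scores into one estimate. The modality names, weights, and the valence scale below are illustrative assumptions, not a description of any existing Level 3 system.

```python
# Illustrative sketch of Level 3 multimodal fusion: per-modality
# valence scores (facial expression, speech, text sentiment, vital
# data), each on a 0..1 scale, are merged with a weighted average.
# Modalities and weights are invented for demonstration.

MODALITY_WEIGHTS = {
    "face": 0.35,    # camera-based facial expression recognition
    "speech": 0.25,  # speech analysis
    "text": 0.15,    # text-based sentiment analysis
    "vitals": 0.25,  # wearable vital data
}

def fuse_valence(scores: dict) -> float:
    """Weighted average over whichever modalities are available."""
    available = {m: s for m, s in scores.items() if m in MODALITY_WEIGHTS}
    total_w = sum(MODALITY_WEIGHTS[m] for m in available)
    return sum(MODALITY_WEIGHTS[m] * s for m, s in available.items()) / total_w

# Only two modalities available at this moment:
print(round(fuse_valence({"face": 0.6, "vitals": 0.2}), 3))
```

Handling missing modalities gracefully matters for a 24/7 profile, since no sensor delivers data continuously; here the weights are simply renormalized over what is present.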
Level 4: Humanoid
Whether the pursuit of emotional intelligence should aim for Level 4 at all remains a more contentious question. Here, it is assumed that robots can develop their own emotions, and not just recognize those of humans in order to serve them better.
Many countries and organizations are currently considering how AI can or should be regulated. This debate is being fueled by prominent figures such as Elon Musk, Stephen Hawking, and Bill Gates. On the one hand, there is the image of a technological apocalypse caused by the unethical and self-determined spread of AI; on the other, there is the immense economic potential offered by autonomous systems and the resulting opportunity to satisfy the growth hunger of a globalized world.
About the author:
Dr. Michael Bartl is founder and CEO of TAWNY. The Munich-based startup has set itself the task of equipping machines with emotional intelligence.