How to build a responsible future for Emotional AI
JANUARY 6, 2020 RESEARCH REPORT
- Emotional AI will redefine products and services as we know them. But how can companies use the technology responsibly?
- Emotional AI is helping businesses detect people's emotions in real time—by decoding facial expressions, analyzing voice patterns and more.
- This technology offers opportunities for Communications, Technology and Platform companies to reinvent customer engagement, but there are risks.
- We share key considerations companies should take into account and next steps businesses can take to address risks associated with Emotional AI.
Learning to feel
Emotional AI technology can help businesses capture people's emotional reactions in real time—by decoding facial expressions, analyzing voice patterns, scanning e-mails for the tone of language, monitoring eye movements and measuring neurological immersion levels.
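As a toy illustration of how readings from several such signals might be combined into a single estimate, the sketch below fuses hypothetical per-modality confidence scores with a weighted average. The modality names, weights and scores are invented for illustration and are not drawn from any real product.

```python
# Toy sketch: fuse per-modality emotion scores into one estimate.
# All modality names, weights and scores below are hypothetical.

def fuse_emotion_scores(modality_scores, weights):
    """Weighted average of per-emotion confidence scores across modalities."""
    fused = {}
    total_weight = sum(weights[m] for m in modality_scores)
    for modality, scores in modality_scores.items():
        w = weights[modality] / total_weight
        for emotion, score in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + w * score
    return fused

readings = {
    "face":  {"joy": 0.70, "anger": 0.10},  # e.g. from facial-expression decoding
    "voice": {"joy": 0.55, "anger": 0.25},  # e.g. from voice-pattern analysis
    "gaze":  {"joy": 0.60, "anger": 0.05},  # e.g. from eye-movement tracking
}
weights = {"face": 0.5, "voice": 0.3, "gaze": 0.2}

fused = fuse_emotion_scores(readings, weights)
dominant = max(fused, key=fused.get)  # the emotion with the highest fused score
```

A real system would of course learn such a fusion from data rather than hand-set weights; the point here is only that each modality contributes partial evidence.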
Emotional AI will not only offer new metrics to understand people; it will redefine products and services as we know them. But Emotional AI also brings risks. The data collected using Emotional AI technology will test companies with a whole new set of ethical challenges that require responsible actions.
The data and responsibility opportunity
Emotional AI will be a powerful tool that will force businesses to reconsider their relationships with consumers.
Companies that lead in both responsibility and intensity of emotional data usage experience on average a 63% gain in total revenue compared to the average company within the industry.
The same companies experience on average a 103% gain in EBITDA (earnings before interest, taxes, depreciation and amortization) compared to the average company within the industry.

Leaders from the technology and platforms industries have a responsibility to act now to prepare for the Emotional AI era. No doubt privacy battles will erupt as our inner lives become a currency.
Coming to our senses
Emotional AI applications can lead to better experiences, better design and better service for customers. They also hold the potential to open up a completely new world of opportunities for Communications, Technology and Platform companies.
In brick-and-mortar stores across the world, cameras and sensors are gauging shoppers’ sentiments to target offerings and help make future decisions.
Enterprises are working on emotional-recognition training for digital voice assistants to aid in health monitoring and consumer sentiment analysis.
Video + audio
Emotional AI is improving auto safety. For example, cameras and microphones can pick up on passenger drowsiness—and jolt the seatbelt as a result.
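A minimal sketch of the kind of drowsiness logic such a system might use, assuming an upstream camera pipeline already yields a per-frame eye-openness score: flag an alert when the eyes stay closed for too many consecutive frames. The threshold and frame counts are illustrative, not taken from any real vehicle system.

```python
# Hypothetical drowsiness check: trigger an alert when per-frame eye
# openness stays below a threshold for too long (all values illustrative).

OPENNESS_THRESHOLD = 0.2   # below this, the eyes are treated as closed
ALERT_AFTER_FRAMES = 15    # roughly half a second at 30 fps

def detect_drowsiness(openness_per_frame):
    """Return True if eyes stay closed for ALERT_AFTER_FRAMES consecutive frames."""
    closed_run = 0
    for openness in openness_per_frame:
        closed_run = closed_run + 1 if openness < OPENNESS_THRESHOLD else 0
        if closed_run >= ALERT_AFTER_FRAMES:
            return True   # e.g. jolt the seatbelt here
    return False

alert = detect_drowsiness([0.8] * 10 + [0.1] * 20)  # sustained closure
no_alert = detect_drowsiness([0.1, 0.9] * 20)       # ordinary blinking
```

Requiring a sustained run of closed-eye frames, rather than a single low reading, is what keeps normal blinks from triggering the alert.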
Many applications have been created not only to detect emotion in text but also to make editorial recommendations based on reading level and other criteria.
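One simple way such tools detect tone in text, like the e-mail scanning mentioned above, is a lexicon lookup: count how many words carry positive or negative connotations. The word lists below are illustrative only and far smaller than any production lexicon.

```python
# Toy lexicon-based tone detector; the word lists are illustrative only.

POSITIVE = {"thanks", "great", "appreciate", "pleased", "happy"}
NEGATIVE = {"unacceptable", "disappointed", "frustrated", "angry", "delay"}

def detect_tone(text):
    """Label text positive/negative/neutral by counting lexicon hits."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

tone = detect_tone("Thanks, we really appreciate the quick turnaround!")
```

Modern systems replace the hand-built lexicon with trained language models, but the input and output of the task are the same: raw text in, a tone label out.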
Feeling the risks
Reading people’s emotions is a sensitive science. The data collected using Emotional AI technology will test companies with a whole new set of ethical challenges. Based on our research, we see four aspects of data collection and usage that merit close attention: Systems Design, Data Usage, Transparency and Privacy.
1. Systems design
As more and more companies incorporate Emotional AI in their operations and products, it’s going to be imperative that they’re aware of and actively work to prevent bias in their systems.
2. Data usage
Consumers may cry foul if their emotional data is being used for the company’s gain and not their own. The more value a consumer derives from sharing their data, the more they trust service providers.
Businesses often fail to explain the benefits to the user of collecting their emotional data. Companies should be transparent with the user about what is being collected, how and why.
As emotional data collections become more sophisticated, privacy and data protection will become more complex, and clarity around data ownership, usage and meaningful consent will become more urgent.
A new sense of responsibility
Communications, technology and platform companies design much of the emotional data collection and usage that other industries rely on, so their approach to responsibility is central to how responsibility is woven into Emotional AI as its use expands. This role needs to be taken seriously. To become responsible stewards of Emotional AI, companies need guiding principles for how data is captured and leveraged. In addition, there is a set of actions firms can take to drive stronger responsibility across three layers—individual, company and industry ecosystem—of operations.
Listen to your employees
As recent employee protests have shown, many employees bring a strong sense of ethics and a desire to do the right thing. Empower these employees to drive an ethical mindset.
Lean on diversity
Diversity in the workplace can help executives ensure that AI systems are designed and trained with as little bias as possible.
Bring responsibility to startup culture
A start-up mentality can take businesses a long way, but firms need to build a responsibility mindset into the "minimum viable product" culture.
Draw on outside experts for responsible design
Embedding an ecosystem of experts and collaborators into business processes ensures a better lens into the impact of collecting emotional data.
Extend the reach of risk assessment
Be honest and explicit about worst-case scenarios. Continuously revisiting risk assessment questions builds foresight and helps detect shifting trends.
Build responsibility into partnership agreements
Ensuring ethical principles can be fulfilled requires the entire partner ecosystem to operate on the same principles.