Prompts on how to use Amazon's Alexa personal assistant are seen in an Amazon ‘experience centre’ in Vallejo, California, U.S., May 8, 2018. (Photo: To match Insight AMAZON.COM-ALEXA/ REUTERS/Elijah Nouvelage)

A new leak claims that Amazon may be working on a wearable that can detect human emotions. It comes in the form of a wristband that will be marketed as a health and wellness device, based on the documents that have surfaced. This wristband is reportedly being developed by Lab126 and Amazon's Alexa voice software team.

The information was reported by Bloomberg, which claims it obtained details of the project from internal documents. The device is said to have a microphone that listens to the wearer's voice, which it uses to determine the wearer's emotional state.

Details of the project are sparse, including the status of the teams' work on it. However, the leaked documents indicate that Amazon intends the technology to help people communicate their feelings with others "more effectively." The wristband, according to the report, will also work with the user's smartphone.

The project is reportedly codenamed 'Dylan,' but Amazon hasn't confirmed that such a project actually exists. Beta testing is also said to be planned, but it's unknown whether the company intends to launch the alleged device commercially or make it exclusive to the healthcare industry. It's also unclear whether the project is just an experiment.

Questions remain over how broadly Amazon may anticipate using this technology, assuming it is in development as the leak claims. There are potential uses beyond offering a wellness device, including enabling Alexa to better understand the users speaking to her, as well as to gauge users' emotional state as they interact with the assistant in different ways.

Amazon's emotion detection ambitions are visible in two papers it published in recent months.

Both projects trained models using a University of Southern California (USC) data set of approximately 12 hours of dialogue read by actors and actresses. The data set of 10,000 sentences was then annotated to reflect emotion.

"Multimodal and Multi-view Models for Emotion Recognition" detects what Amazon Alexa senior applied science manager Chao Wang calls the big six: anger, disgust, fear, happiness, sadness, and surprise.

"Emotion can be described directly by numerical values along three dimensions: valence, which is talking about the positivity [or negativity] of the emotion, activation, which is the energy of the emotion, and then the dominance, which is the controlling impact of the emotion," Wang said.
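The dimensional description Wang gives can be sketched as a simple data structure. This is only an illustration of the three-axis idea, not Amazon's actual representation; the class name, the [0, 1] scale, and the helper method are all assumptions made for the example.

```python
from dataclasses import dataclass


@dataclass
class EmotionVAD:
    """Hypothetical container for the three dimensions Wang describes.

    Each value is assumed here to be normalized to [0, 1]:
    valence    - positivity (high) vs. negativity (low) of the emotion
    activation - energy of the emotion
    dominance  - controlling impact of the emotion
    """
    valence: float
    activation: float
    dominance: float

    def is_positive(self) -> bool:
        # Above the midpoint of the assumed [0, 1] valence scale
        return self.valence > 0.5


# Example: a high-energy, positive emotion such as excitement
excited = EmotionVAD(valence=0.9, activation=0.8, dominance=0.6)
print(excited.is_positive())  # True
```

A model trained on annotated dialogue would predict these three numbers per utterance, rather than (or alongside) a single categorical label like the "big six."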

Bloomberg notes that a 2017 Amazon patent details ways emotional knowledge could be used by Alexa, including crafting tailored responses to the user and providing recommendations for products or services that may suit the wearer's current emotional state.