Technology and AI have become an integral part of daily life. From basic interactions with self-service machines in supermarkets to AI-powered virtual assistants such as Siri and Galaxy, automated information and assistance are at our beck and call. AI is even becoming prominent in fields that were once the province of humanity alone. Modern human existence is intrinsically interlinked with our digital counterparts. What was once science fiction has become reality, from the creation of music and fashion trends to healthcare advice – Black Mirror writers take note.

So how does the concept of a device that can read human emotions and assess your current mental state grab you?

Emotional dialogue

Amazon is currently working on a wearable gadget which, given the market trend for wearables, will in all likelihood evolve into a fitness tracker or smartwatch. According to information made available to Bloomberg by an anonymous source, it is currently described as “a health and wellness product”.

‘Dylan’, the new ‘emotion-recognising’ AI wearable, is a collaboration between Lab126, Amazon’s hardware division (developers of the Fire electronics and Echo smart speakers), and Amazon’s software division (the voice development team behind Alexa). As reported by Bloomberg, according to “documents and a person familiar with the program”, Dylan, currently in early beta testing, can discern the wearer’s emotional state from the sound of his or her voice.

The current model of this device is a voice-activated, wrist-worn gadget, complete with a microphone and speakers, and designed to be paired with a smartphone app. The device provides detailed information on the wearer’s emotional state: “joy, anger, sorrow, sadness, fear, disgust, boredom, stress, or other emotional states.” Emotion recognition already appears in a patent for Alexa, the previous voice-pattern-analysis incarnation, and will potentially be built upon, enabling Dylan to advise the wearer on a course of action in response to his or her mental state. It is unclear whether the wearable is fully “trained” and encoded after purchase, and therefore specific to the wearer, or whether emotion recognition is part of a more comprehensive, generic range of vocal pattern understanding.
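Amazon has not published how Dylan actually classifies emotion from the voice. As a purely illustrative sketch, systems of this kind typically extract acoustic features from short voice clips and feed them to a trained classifier. The example below assumes MFCC features via librosa and a scikit-learn classifier with hypothetical training clips; it is not Amazon’s implementation.

```python
# Illustrative only: a generic voice-based emotion classification pipeline.
# Assumes librosa and scikit-learn; this does not reflect Amazon's actual system.
import librosa
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(wav_path: str) -> np.ndarray:
    """Summarise a voice clip as the mean of its MFCC frames."""
    audio, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

# Hypothetical labelled training clips: (file path, emotion label).
training_data = [
    ("clip_001.wav", "joy"),
    ("clip_002.wav", "anger"),
    ("clip_003.wav", "boredom"),
]

X = np.array([extract_features(path) for path, _ in training_data])
y = [label for _, label in training_data]
classifier = RandomForestClassifier().fit(X, y)

def detect_emotion(wav_path: str) -> str:
    """Predict the speaker's apparent emotional state for a new clip."""
    return classifier.predict([extract_features(wav_path)])[0]
```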

Her master’s voice

Alexa, Amazon’s commercially available generation of voice-activated technology, already has a patent apparently showing the technology assessing illnesses based on an individual’s “sniffling” voice. Alexa then offers support – most famously a chicken soup recipe – and health products based on that assessment. Amazon is reportedly moving further into the healthcare industry, so developing technology which recognises human behaviour is no surprise. Amazon’s purchase of online pharmacy PillPack back in June 2018 ties in neatly with these current developments.

AI technology is becoming so commonplace today that it is easy to forget that behind the mechanics of the systems which weave in and out of daily life there must be an equivalent system that develops and ‘teaches’ the gadgets the parameters of specific behaviours and appropriate responses. Behind Alexa, Amazon’s human team assesses random individuals’ voice samples, transcribing them and then feeding the information back into Alexa to close gaps in the system’s capabilities, capacity and understanding – AI pedagogy, in effect.
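A minimal, hypothetical sketch of such a human-in-the-loop feedback cycle might look like the following; the function names and data flow are assumptions for illustration, not Amazon’s internal tooling.

```python
# Hypothetical human-in-the-loop feedback loop for a voice assistant;
# names and data flow are illustrative, not Amazon's internal tooling.
import random
from dataclasses import dataclass

@dataclass
class Utterance:
    audio_id: str
    machine_transcript: str       # what the model heard
    human_transcript: str = ""    # corrected by a human annotator

def sample_for_review(utterances, rate=0.01):
    """Randomly select a small fraction of utterances for manual review."""
    return [u for u in utterances if random.random() < rate]

def collect_corrections(reviewed):
    """Pair each clip with its corrected, human-written transcript."""
    return [(u.audio_id, u.human_transcript) for u in reviewed if u.human_transcript]

def close_the_gap(model, corrections):
    """Feed the corrections back into training (placeholder for a real pipeline)."""
    model.fit(corrections)
    return model
```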

Whether individuals are aware that this human review is part of the process of AI learning is a matter for further debate, and it raises the question of whether privacy laws, or even ethical and moral codes, are being transgressed.

You don’t necessarily think of another human listening to what you’re telling your smart speaker in the intimacy of your home … I think we’ve been conditioned to the [assumption] that these machines are just doing magic machine learning. But the fact is there is still manual processing involved.

Florian Schaub, professor at the University of Michigan and privacy researcher and speaker

The recent Amazon re:MARS conference showed Alexa programming developing more ‘natural’, complex, reactive and proactive strands of ‘conversation’. This goes beyond the prediction of a single end-goal and into multiple-strand scenarios. And from a commercial perspective, with companies diversifying daily to capture wider markets, it is moving toward pushing advice – and sales.

With this new approach, Alexa will predict a customer’s latent goal from the direction of the dialogue and proactively enable the conversation flow across topics and skills. This is a big leap for conversational AI.

Rohit Prasad, Alexa VP and head scientist
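Prasad’s description is high-level; one toy way to picture ‘latent goal’ prediction is a model that scores likely follow-up goals from the dialogue so far and proactively offers the most probable one. The goal names and probabilities below are invented purely to illustrate the idea and have nothing to do with Alexa’s real system.

```python
# Toy illustration of latent-goal prediction across skills; the goals
# and probabilities below are invented, not Alexa's.
FOLLOW_UP_GOALS = {
    "ask_chicken_soup_recipe": [("add_ingredients_to_cart", 0.6),
                                ("set_cooking_timer", 0.3)],
    "ask_weather_forecast": [("suggest_umbrella_reminder", 0.4)],
}

def suggest_next_goal(current_goal, threshold=0.5):
    """Proactively offer the most likely follow-up goal, if confident enough."""
    candidates = FOLLOW_UP_GOALS.get(current_goal, [])
    if not candidates:
        return None
    goal, probability = max(candidates, key=lambda c: c[1])
    return goal if probability >= threshold else None

print(suggest_next_goal("ask_chicken_soup_recipe"))  # -> add_ingredients_to_cart
```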

Another such AI project by Amazon, which has more recently come to the fore despite having spent several years incubating on the back burner, is Vesta. Named for the Roman goddess of hearth and home, Vesta is a domestic AI, though it is currently unclear exactly which roles it is anticipated to play. Harnessing advancements in camera hardware and computer vision software enables Vesta to move around places where Alexa and the Echo smart speakers are not present, potentially providing a seamless AI-support network throughout the home.

Big data

Concerns are already being raised about the way Amazon will use the personal information it collects. Dylan’s data clearly has the potential to be used by the online sales giant to target its advertising and recommend specific products to suit the apparent needs of the device wearer – at times of heightened emotion.

The development of humanised AI is gathering pace at a time of ever-growing privacy concerns and recent unrest in the Amazon shareholder camp over the sale of its facial recognition software.

Concerning the facial recognition software, Amazon Web Services responded on the company’s blog:

New technology should not be banned or condemned because of its potential misuse. Instead, there should be open, honest and earnest dialogue among all parties involved to ensure that the technology is applied appropriately and is continuously enhanced.

Michael Punke, VP of Global Public Policy at Amazon Web Services

How these increasingly humanised technological advances are going to be used, and whether some of them will even get past the beta phase, remains unknown. If they support and enrich daily life, then they may well turn out to be a positive addition to the family. Currently, however, many questions around their development and application remain to be resolved.

One thing is certain: the technological capacity to recognise a human emotional response is a huge leap, and this progress is likely to continue gathering pace.