Can artificial intelligence read emotions now? Read on to find out.

It is early July and 30°C outside, but Mihkel Jaatma is thinking about Christmas. In a co-working space in Soho, the 39-year-old founder and CEO of Realeyes, an “emotion AI” startup that uses eye-tracking and facial expressions to determine mood, flicks through a list of 20 festive ads from 2018.

He settles on “The Boy and the Piano”, the John Lewis ad that tells the life story of Elton John backwards, from megastardom to the gift of a piano from his parents as a child, soundtracked by his evergreen tearjerker “Your Song”. The ad was widely praised, but Jaatma is clearly sceptical.

He hits play and the ad starts, but this time two lines – one grey (negative reactions), the other red (positive) – are traced across the screen. These follow the second-by-second reactions of a 200-person sample audience who watched the ad and allowed Realeyes to record them through the camera of their computer or smartphone. Realeyes then used its AI technology to analyse each individual’s facial expression and body language.

The company did this with all 20 Christmas ads from 2018 on Jaatma’s list, testing them on 4,000 people, before scoring each commercial for attention, emotion and sentiment and, finally, giving it a mark out of 10.
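Realeyes has not published how it computes these traces, but the basic idea is straightforward to sketch: classify each viewer’s expression once per second, then plot the share of the panel showing positive versus negative expressions over the ad’s runtime. Below is a minimal, hypothetical Python illustration; the function name, emotion labels and data shapes are assumptions for clarity, not Realeyes’ actual pipeline.

```python
from collections import defaultdict

# Hypothetical label sets; the split into "positive" and "negative"
# classes is an illustrative assumption, not Realeyes' taxonomy.
POSITIVE = {"joy", "surprise"}
NEGATIVE = {"anger", "contempt", "disgust", "fear", "sadness"}

def reaction_traces(observations):
    """observations: iterable of (viewer_id, second, emotion_label) tuples,
    one per viewer per second of the ad."""
    pos = defaultdict(int)
    neg = defaultdict(int)
    viewers = defaultdict(set)
    for viewer, second, emotion in observations:
        viewers[second].add(viewer)
        if emotion in POSITIVE:
            pos[second] += 1
        elif emotion in NEGATIVE:
            neg[second] += 1
    seconds = sorted(viewers)
    # Fraction of the panel showing a positive/negative expression each
    # second: plotted over time, these are the red and grey lines.
    return (
        [pos[s] / len(viewers[s]) for s in seconds],
        [neg[s] / len(viewers[s]) for s in seconds],
    )
```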

Affectiva offers software development kits (SDKs) that let developers embed this kind of emotional intelligence into their own applications. The company’s SDKs can detect anger, contempt, disgust, fear, joy, sadness and surprise. “With these seven, developers can capture a range of emotional states from their users using just the camera on their device,” said Affectiva’s Pitre.
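Affectiva’s own SDKs are proprietary, so as a hedged stand-in, the open-source DeepFace library (unaffiliated with Affectiva) shows what camera-based, single-frame emotion classification looks like in practice. Note that DeepFace’s pretrained model uses a slightly different label set, with “neutral” in place of “contempt”:

```python
# pip install deepface opencv-python
import cv2
from deepface import DeepFace

# Grab one frame from the default webcam (index 0).
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("could not read a frame from the webcam")

# Classify the expression in the frame. enforce_detection=False avoids an
# exception when no face is found; recent deepface versions return a list
# with one dict per detected face.
results = DeepFace.analyze(frame, actions=["emotion"], enforce_detection=False)
for face in results:
    # Prints e.g. "happy" plus a per-class confidence score for each label.
    print(face["dominant_emotion"], face["emotion"])
```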

Salisbury does concede that, as the technology advances, there will be huge positive potential to make technology friendlier and more considerate.

“Emotion AI’s true potential is on the horizon. It will reach billions of people and assist them in their daily lives, making us feel connected in a more human way,” True Emoji’s Dugar added.