smileML: AI That Makes Emotion Recognition An Exact Science

May 04, 2021


Miscommunication impacts everyone, from causing friction in teams that leads to toxic workplaces and churn, to impacting split-second decisions in an operating room where lives hang in the balance. Founded by Ethan Petersen and Shannon Anderson, Boston startup smileML takes the guesswork out of emotional interpretation by using AI to help individuals and teams communicate effectively. The startup completed the Air Force Accelerator Powered By Techstars in 2019.

We’ve all been there: A video call where colleagues don’t see eye to eye, or a remote team that is ineffective in its communication and output. Poor communication and misinterpreted emotions affect us all, and it is this complex human problem that one Boston startup is looking to crack with non-human assistance.

smileML Cofounder and CEO Ethan Petersen explains that his startup combines the power of artificial intelligence with the power of human emotion to offer deep insight.

“Miscommunication impacts everyone, from sales reps and the outcome of their negotiations to intelligence analysts showing signs of PTSD that go unnoticed,” Ethan says. “So, smileML reduces the amount of stress involved with emotional guesswork by leveraging emotion recognition to help people understand one another.”

The team’s primary commercial product analyzes sentiment on sales calls, giving sales managers immediate feedback on seller performance instead of making them wait until the end of a quarter to see results.

The startup has developed an edge-based emotion recognition model capable of running on any camera. All analysis is performed on the device itself, and video is never sent to the cloud, making smileML safe and secure for even the most sensitive environments, Ethan says.
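To make the on-device approach concrete, here is a minimal sketch of the edge-processing pattern described above: raw camera frames are classified locally, and only an aggregate summary of emotion scores would ever leave the device. The model, label set, and scoring logic below are illustrative placeholders, not smileML's actual implementation.

```python
# Hypothetical sketch of edge-based emotion analysis: frames stay on the
# device; only aggregate scores are ever eligible for transmission.
from dataclasses import dataclass
from typing import Dict, List

EMOTIONS = ["positive", "neutral", "negative"]  # assumed label set


@dataclass
class Frame:
    pixels: bytes  # raw camera frame; never leaves the device


def classify_frame(frame: Frame) -> Dict[str, float]:
    """Stand-in for the on-device model: returns per-emotion scores.

    A real implementation would run a compact neural network here;
    this placeholder just derives a score from mean pixel brightness.
    """
    brightness = sum(frame.pixels) / (len(frame.pixels) or 1)
    score = min(brightness / 255.0, 1.0)
    return {"positive": score, "neutral": 1.0 - score, "negative": 0.0}


def summarize_on_device(frames: List[Frame]) -> Dict[str, float]:
    """Average per-frame scores; only this summary is shared, not video."""
    totals = {e: 0.0 for e in EMOTIONS}
    for frame in frames:
        for emotion, score in classify_frame(frame).items():
            totals[emotion] += score
    n = len(frames) or 1
    return {e: totals[e] / n for e in totals}


# Example: two synthetic frames analyzed entirely in local memory.
frames = [Frame(pixels=bytes([200] * 64)), Frame(pixels=bytes([50] * 64))]
summary = summarize_on_device(frames)  # aggregate scores, not raw video
```

The key design property is that `summarize_on_device` is the only boundary a network call would ever cross, which is what keeps the raw video out of the cloud.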

smileML is an alumnus of Google Cloud for Startups and has received funding from the United States Air Force.

A New Application With The Air Force

The smileML of today is not the same company that Ethan and cofounder Shannon Anderson originally envisioned. The startup began as an iPhone app that had nothing to do with emotion recognition. However, Ethan and Shannon realized that getting honest user feedback was difficult and inconsistent. So, they started putting together a solution that became the platform for smileML.

In addition to the platform’s commercial applications, there are also very real applications in the intelligence world. This became quickly apparent when smileML joined the Air Force Accelerator Powered by Techstars and adapted its technology to an Android phone, with the intention of equipping intelligence operators with a lightweight tool to analyze interviews with informants.

Soon after the program, smileML won a $1.5M AFWERX Phase II SBIR contract to adapt its commercial emotion recognition technology to the analysis of in-person interviews for United States Special Operations Command personnel.

Finding The Right Product-Market Fit With Techstars

The pivot to intelligence was driven by the 2019 class of the Air Force Accelerator Powered By Techstars. “Techstars Air Force gave us the opportunity to speak with the most incredible mentors in commerce and government who introduced us to applications where our technology could even save lives,” says COO Shannon Anderson.

“Techstars helped us find the right product-market fit and distinguish between what is impressive and what is traction. Before Techstars, we had been distracted by impressive accomplishments that weren't moving our business forward, and our attention was split across different use cases. Techstars equipped us with a framework to systematically evaluate markets and allowed us to narrow our focus down to one specific use case.”

“Through Techstars,” Ethan says, “we discovered our technology could make an even larger impact than we had ever dreamed.”

The recent funding will help smileML bring to market the first application for emotion analytics on video calls, Shannon says. “We want to equip anyone who relies on video calling with tools to understand what parts of a conversation other people engage with, whether they react positively or negatively, and how a message is coming across,” Shannon says.