What I Learned About Emotion AI Through My Startup

Just over a year ago, at the end of 2018, we closed Netek, my emotion AI company. The reason? Like every funded startup, we eventually ran out of money. I tried for many months to raise more. We had the tech and the team. But when I took the reins as CEO, I knew that the technology was still in its nascent stage and would need to evolve well beyond what the industry was operating on.

I understood that the science behind it was based on six basic emotions, but I believed it would have to become more granular to fully understand emotions through a camera and EEG. So one of the first things I did was set up a small team of three people to investigate how we could extend the basic theory to cover emotions at a more granular level, all while we built our first product on the six-emotion model. I figured we needed to do this to get ahead of the curve and avoid a huge obstacle to the adoption of the technology: trust.
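For readers unfamiliar with the model, here is a minimal sketch of what a classifier built on the six-basic-emotion theory typically outputs. The names and scores are illustrative assumptions, not Netek’s actual code; the point is simply that every face frame gets forced into one of six coarse buckets.

```python
# Minimal sketch of a six-basic-emotion classifier output (hypothetical names,
# not Netek's code). Every frame is mapped to one of six coarse labels.
from dataclasses import dataclass

BASIC_EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

@dataclass
class EmotionEstimate:
    label: str          # one of BASIC_EMOTIONS
    confidence: float   # score in [0, 1] from some upstream model

def classify_frame(face_scores: dict) -> EmotionEstimate:
    """Pick the highest-scoring basic emotion for a single video frame."""
    label = max(face_scores, key=face_scores.get)
    return EmotionEstimate(label=label, confidence=face_scores[label])

# Whatever the upstream model produced, the output is still just one of six
# labels -- nuance like "anxious but hopeful" has no slot in this scheme.
print(classify_frame({"anger": 0.05, "disgust": 0.02, "fear": 0.10,
                      "happiness": 0.70, "sadness": 0.08, "surprise": 0.05}))
```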

I also understood that we would have to raise money, and this topic would come up. And to survive and keep going we would have to figure out a way to create something useful with what we had, which was a challenge. Plus, raising money would require partners who were willing to experiment with us and keep evolving the technology.

We were all aware that there would be pushback from the market. When I traveled in the U.S., I received the most pushback, specifically that the technology is based on outdated science. The main point of contention is that emotions are more complex than six basic categories: they are expressed in such a huge variety of ways that it is hard to reliably infer how someone feels from a simple set of facial movements, and therefore the technology can’t be trusted.

In the months since we closed, it has become evident that the science behind the technology has come under scrutiny: it can’t be trusted and is unreliable.

This point became even more evident to me toward the end of the company. In our last three months we engaged with a company that had built a product for hiring people remotely through a camera. When we talked, they saw immediate utility in our technology and decided to explore integrating it with their product. It took many months of discussion before we actually got to developing and experimenting. The end came when we started testing, saw the results, and realized how much more of it we would have to do.

I simply asked myself, “Would we use this product ourselves to hire people? And can we trust it?”

The answer was no. We had a lot of work to do to get to yes, and that would take time and money.

I have a lot of lessons learned, and many more stories, from my experience in this space. Hit me up if you want to learn more or are just curious about the topic.


Bottom line: emotions are a cluster. It’s not just micro facial expressions; it’s also your body language. It all goes together. With all our flaws, humans will always be better at judging people until we figure out how to implement all of this in a computer system.
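Since “it all goes together” is the crux, here is a minimal, hypothetical sketch of what combining signals could look like: a weighted late fusion of face and body-language scores. The weights, names, and numbers are illustrative assumptions, not a description of Netek’s system or any shipping product.

```python
# Hypothetical late-fusion sketch: combine per-modality emotion scores
# (face, body posture, ...) into one estimate. Weights are illustrative only.
BASIC_EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def fuse_modalities(scores_by_modality: dict, weights: dict) -> dict:
    """Weighted average of emotion scores across modalities."""
    fused = {emotion: 0.0 for emotion in BASIC_EMOTIONS}
    total_weight = sum(weights[m] for m in scores_by_modality)
    for modality, scores in scores_by_modality.items():
        for emotion in BASIC_EMOTIONS:
            fused[emotion] += weights[modality] * scores.get(emotion, 0.0)
    return {emotion: value / total_weight for emotion, value in fused.items()}

# Face alone says "happiness"; closed-off body language pulls the estimate
# toward "fear" -- the kind of disagreement a face-only system never sees.
face = {"happiness": 0.7, "fear": 0.1, "sadness": 0.1, "anger": 0.05,
        "disgust": 0.03, "surprise": 0.02}
body = {"fear": 0.5, "sadness": 0.3, "happiness": 0.1, "anger": 0.05,
        "disgust": 0.03, "surprise": 0.02}
print(fuse_modalities({"face": face, "body": body},
                      weights={"face": 0.5, "body": 0.5}))
```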