Confused about emotion AI? Our own Paty Hernandez sat down with 5 people – a child, teen, college student, grad student and expert – to explain emotional artificial intelligence.
While machine learning and deep learning are advancing the state of artificial intelligence, there’s still a long way to go before machines can replace us, if ever. Still, we have to recognize that today we are surrounded by computing power – devices, virtual assistants, robots – it’s everywhere.
Right now machines excel at repetitive tasks that demand heavy processing power and pattern recognition, often outperforming humans. But AI struggles to replicate the things humans are really good at: understanding, motivating, and interacting with other human beings.
Which is why you shouldn’t be worried machines will take your job: Machines will not replace us until they become emotionally intelligent.
This matters because as artificial intelligence continues its journey into the mainstream in 2018, and emotion AI becomes more present in discussions about AI, it’s worth asking: why does AI need emotional intelligence?
Before we can share our lives with machines, we must teach them to understand and mimic human emotion. Today machines can recognize faces, and they can also read our emotions.
With that said, beyond the “cool” factor, why does AI need emotion?
Here are three reasons why:
The future is already here. For a glimpse of what this looks like, check out our last post, where we outlined 10 useful things emotion AI can do.
Originally posted on the Vibetek blog.
This was the year nearly every large company took notice of the rise of artificial intelligence. But while AI encapsulates many categories – machine learning, deep learning, computer vision, natural language processing, speech recognition and others – one category still hasn’t been recognized beyond academia: emotion recognition, also known as affective computing.
Computers are increasingly able to figure out what we’re feeling. A recent report predicts that the global affective computing market will grow from $12.2 billion in 2016 to $53.98 billion by 2021. The report by research and consultancy firm MarketsandMarkets observed that enabling technologies have already been adopted in a wide range of industries and noted a rising demand for facial feature extraction software.
Affective computing is also referred to as emotion AI or artificial emotional intelligence.
On this episode of the Big Bang Podcast, I’m joined by Sergio Langarica, President of Netek – a neuro applications technology company – to talk about the future of emotionally intelligent technology.
Sergio has 20 years of experience in information technology, ranging from start-ups to global players. He is an avid promoter of these industries in Mexico and abroad through various board of directors roles at the Mexican Chamber for the Electronics, IT and Telecommunications Sectors.
Netek, based in Tijuana, Mexico, recently finished developing its technology after two years of R&D. Their first product, still in beta, is an affective computing platform called Vibetek.
Below are some of the questions we covered in our chat:
We invite you to experiment using our API or download our app.
The Big Bang is a weekly podcast. Tune in every Tuesday for more discussions on what’s possible.
Intro audio is by Arturo Arriaga, outro audio is Candyland by Guy J.