This AI detects 11 types of emotions from a selfie
Machine learning models that detect faces and track movement are already part of daily life through smartphone features like face unlocking and Animoji. What those models can't do is infer how we feel from our facial expressions. That's where EmoNet comes in: researchers from the University of Colorado and Duke University have developed a neural network that can accurately classify images into 11 emotional categories. To train the model, the researchers used 2,187 videos that had been clearly classified into 27 distinct emotion categories, including anxiety, surprise, and sadness. The team extracted 137,482 frames from these videos and then excluded sets of…
This story continues at The Next Web
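As a rough illustration of the pipeline the article describes (frames pulled from labeled videos and fed to an image classifier with a fixed set of emotion categories), here is a minimal PyTorch sketch. The AlexNet backbone, hyperparameters, and the training_step helper are assumptions for illustration only; the article does not specify EmoNet's architecture or training details.

```python
# Minimal sketch of fine-tuning an image classifier for a fixed set of
# emotion categories. This is NOT the authors' EmoNet code; the backbone,
# learning rate, and helper names below are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models

NUM_EMOTIONS = 11  # the article says EmoNet predicts 11 emotional categories

# Start from an ImageNet-pretrained backbone and swap the final layer
# for an 11-way emotion classification head.
backbone = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
backbone.classifier[6] = nn.Linear(backbone.classifier[6].in_features, NUM_EMOTIONS)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(backbone.parameters(), lr=1e-4)

def training_step(frames: torch.Tensor, labels: torch.Tensor) -> float:
    """One gradient step on a batch of video frames and their emotion labels."""
    optimizer.zero_grad()
    logits = backbone(frames)          # shape: (batch, NUM_EMOTIONS)
    loss = criterion(logits, labels)   # labels: integer category indices
    loss.backward()
    optimizer.step()
    return loss.item()
```

In a real setup, each extracted frame would be resized and normalized to the backbone's expected input (224×224 for AlexNet) and batches would come from a DataLoader over the labeled frames.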