Artificial emotional intelligence, or Emotion AI, uses optical sensors to measure facial expressions of emotion and cognitive states much as people do. The technology works by first identifying a human face, either in real time or in an image or video, and then using computer vision algorithms to locate key facial landmarks, e.g. the corners of the eyebrows, the tip of the nose, or the corners of the mouth. Machine learning algorithms then analyse the pixels in those regions to classify facial expressions. Combinations of these facial expressions are subsequently mapped to a range of emotions.
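The pipeline described above (detect face, locate landmarks, classify expressions, map combinations to an emotion) can be sketched in code. This is a toy illustration, not a real Emotion AI library: the function names, the landmark list, and the rule table are all invented for the example, and real systems use trained detectors and classifiers rather than hand-written rules.

```python
# Toy sketch of the emotion-AI pipeline: detect a face -> locate landmarks ->
# classify facial expressions -> map expression combinations to an emotion.
# All names and the rule table below are illustrative assumptions.

LANDMARKS = ["brow_inner", "brow_outer", "eye_corner", "nose_tip", "mouth_corner"]

def detect_face(frame):
    # Stand-in for a real face detector (e.g. a Haar cascade or a CNN):
    # returns a bounding box (x, y, w, h) if a face is present, else None.
    return (40, 30, 120, 120) if frame.get("has_face") else None

def locate_landmarks(frame, box):
    # Stand-in: a real model would regress pixel coordinates per landmark.
    x, y, w, h = box
    return {name: (x + w // 2, y + h // 2) for name in LANDMARKS}

def classify_expressions(frame, landmarks):
    # Stand-in: a real classifier analyses pixels around each landmark.
    # Here we simply read pre-labelled expressions from the toy frame.
    return set(frame.get("expressions", []))

# Illustrative mapping from expression combinations to emotions.
EMOTION_RULES = [
    ({"smile", "cheek_raise"}, "joy"),
    ({"brow_furrow", "lip_press"}, "anger"),
    ({"brow_raise", "jaw_drop"}, "surprise"),
]

def map_to_emotion(expressions):
    for required, emotion in EMOTION_RULES:
        if required <= expressions:  # all required expressions present
            return emotion
    return "neutral"

def analyse(frame):
    box = detect_face(frame)
    if box is None:
        return None
    landmarks = locate_landmarks(frame, box)
    expressions = classify_expressions(frame, landmarks)
    return map_to_emotion(expressions)

frame = {"has_face": True, "expressions": ["smile", "cheek_raise"]}
print(analyse(frame))  # joy
```

In a production system each stand-in stage would be a trained model, but the overall shape of the pipeline is the same: the output of one stage (a bounding box, then landmark coordinates, then expression labels) is the input to the next.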
When used alongside traditional market research methods, facial analysis can deliver an additional layer of insight by uncovering the emotional reaction to a stimulus, whether static or dynamic.