Electrical activity in the user's brain is monitored using the EEG headset. The brain-activity data collected by the headset is then interpreted by algorithms to derive measures of attention, stress, relaxation, eye blinks, and engagement.
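As a rough illustration of how such metrics can be derived, the sketch below computes spectral band power from a raw EEG trace and uses the beta/alpha power ratio as a simple attention proxy. This is a common textbook heuristic, not the platform's actual algorithm, and the signal here is synthetic.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` within [low, high) Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

def attention_index(signal, fs):
    """Beta/alpha power ratio: a simplified attention proxy."""
    alpha = band_power(signal, fs, 8, 13)   # alpha band: relaxation
    beta = band_power(signal, fs, 13, 30)   # beta band: focused attention
    return beta / alpha

# Synthetic one-second EEG-like trace sampled at 256 Hz
fs = 256
t = np.arange(fs) / fs
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
print(attention_index(eeg, fs))
```

A beta-dominant trace scores higher on this index than an alpha-dominant one, which is the direction a real attention metric would also move in.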
It allows you to capture all respondents' reactions via facial coding. It detects seven basic emotions (Delight, Sadness, Anger, Surprise, Fear, Scepticism, and Disgust) as well as Neutrality.
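Facial-coding engines typically score each video frame against every emotion class and report the strongest one. The sketch below shows that final step only: a stable softmax over hypothetical per-emotion scores (the model producing those scores is assumed, not shown).

```python
import numpy as np

# Seven basic emotions plus neutrality, as in the description above
EMOTIONS = ["delight", "sadness", "anger", "surprise",
            "fear", "scepticism", "disgust", "neutral"]

def classify_frame(scores):
    """Turn raw per-emotion scores from a (hypothetical) facial-coding
    model into probabilities and the winning emotion label."""
    exp = np.exp(scores - np.max(scores))   # numerically stable softmax
    probs = exp / exp.sum()
    return EMOTIONS[int(np.argmax(probs))], probs

label, probs = classify_frame(np.array([0.2, 0.1, 0.0, 2.5, 0.3, 0.1, 0.0, 1.0]))
print(label)  # the highest score (index 3) maps to "surprise"
```

Aggregating these per-frame labels over a session is what yields the respondent's overall emotional reaction profile.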
Using a near-infrared device or an integrated camera, discover what consumers really see. It allows you to observe viewing patterns, understand what people look at and where, and track their gaze path.
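A gaze path is usually built by grouping the tracker's raw (x, y) samples into fixations. A minimal sketch of one standard approach, dispersion-threshold identification (I-DT), is shown below; the thresholds are illustrative defaults, not values used by any particular tracker.

```python
def detect_fixations(samples, max_dispersion=30.0, min_count=4):
    """Group raw gaze samples (x, y) into fixations with a simple
    dispersion-threshold (I-DT) scheme: consecutive samples whose
    combined x/y spread stays under `max_dispersion` pixels, for at
    least `min_count` samples, form one fixation (returned as its
    centroid). The ordered centroids are the gaze path."""
    fixations, window = [], []
    for point in samples + [None]:          # None sentinel flushes the last window
        candidate = window + [point] if point else window
        xs = [p[0] for p in candidate]
        ys = [p[1] for p in candidate]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys)) if xs else 0
        if point and dispersion <= max_dispersion:
            window.append(point)
        else:
            if len(window) >= min_count:    # long enough to count as a fixation
                fixations.append((sum(x for x, _ in window) / len(window),
                                  sum(y for _, y in window) / len(window)))
            window = [point] if point else []
    return fixations

# Two clusters of samples -> a two-fixation gaze path
path = detect_fixations([(100, 100), (102, 101), (99, 100), (101, 103), (100, 99),
                         (300, 200), (301, 199), (299, 201), (300, 200), (302, 202)])
print(path)
```

Heatmaps and scan-path visualisations are then rendered from these fixation centroids and their durations.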
With computer vision and machine learning, the engine predicts the visual saliency of an image or video ad, bypassing the need for eye-tracking hardware, cameras, and a sample of users. It predicts which parts of an image or video will grab attention and draw consumers in.
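Commercial engines use learned models for this, but the idea of predicting where attention lands can be sketched with a classic lightweight algorithm, spectral-residual saliency (Hou & Zhang, 2007), implemented here from scratch for a grayscale image. This is a stand-in for illustration, not the engine's actual model.

```python
import numpy as np

def spectral_residual_saliency(image):
    """Spectral-residual saliency map for a 2-D grayscale image:
    regions whose spectrum deviates from the smoothed log-amplitude
    spectrum are flagged as likely to attract attention."""
    spectrum = np.fft.fft2(image)
    log_amp = np.log1p(np.abs(spectrum))
    phase = np.angle(spectrum)
    # Smooth the log-amplitude spectrum with a 3x3 mean filter;
    # the "residual" is what the smoothing removes.
    padded = np.pad(log_amp, 1, mode="edge")
    smoothed = sum(
        padded[i:i + log_amp.shape[0], j:j + log_amp.shape[1]]
        for i in range(3) for j in range(3)
    ) / 9.0
    residual = log_amp - smoothed
    # Recombine residual amplitude with the original phase
    saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return saliency / saliency.max()

# A flat gray image with one bright square: the square should stand out
img = np.full((64, 64), 0.5)
img[28:36, 28:36] = 1.0
sal = spectral_residual_saliency(img)
```

The map `sal` scores the bright square far above the uniform background, which is exactly the "what will grab attention" prediction described above, computed without any eye-tracking hardware.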