
Neuro-Tools: Emotion Detection January 16, 2017

Posted by eyetrackrob in Biometric, Captiv, neuromarketing, Tips And Tricks, Uncategorized.

Much of the research that requires biofeedback involves emotions and their detection or classification. But as you have surely noticed, emotions are a complex topic with many different aspects and challenges. Measuring when and how strongly these events occur is relatively easy using electrodermal, cardiovascular or respiratory sensors and software in which to process the raw data (and perhaps even apply emotion detection algorithms, as TEA Captiv does) – things I covered in my previous posts.
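
To make the "when and how strong" part concrete, here is a minimal sketch of event detection on an electrodermal (GSR) trace. It is a generic illustration, not TEA Captiv's algorithm; the filename, sampling rate and thresholds are assumptions you would tune to your own sensor and recording.

```python
# Minimal sketch: detect skin-conductance responses (SCRs) in a GSR trace.
# Generic illustration only -- not TEA Captiv's algorithm. The filename,
# sampling rate and thresholds below are assumptions.
import numpy as np
from scipy.signal import find_peaks

fs = 32  # assumed sampling rate in Hz
gsr = np.loadtxt("gsr_trace.csv")  # hypothetical file: one sample per line, in microsiemens

# Peaks rising at least 0.05 uS above their surroundings, at least 1 s apart.
peaks, props = find_peaks(gsr, prominence=0.05, distance=fs)

for t, amp in zip(peaks / fs, props["prominences"]):
    print(f"SCR at t={t:.2f} s, amplitude ~ {amp:.3f} uS")
```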

But as soon as we know when and how strongly an emotion occurred, the inevitable question comes up: "was it a positive or negative emotion?" (valence), with the usual follow-up question: "which emotion was it?" (classification).

We have seen that each tool has its merits when it comes to detecting emotions, but on their own most biofeedback sensors simply cannot answer those more interesting questions about valence or classification.

However, countless studies exist that cover specific emotions or sets of emotions and that use different sensors to measure the bodily reactions to them. If we could just review all those studies, we could surely come up with a map showing which emotions are best captured with which sensors, and how the measurements differ from one emotion to another, so that we could identify specific emotions (e.g. fear, happiness, anger) from those measurements.

Identifying emotions just by looking at the different biometric measurements. Is it possible?

Sylvia D. Kreibig probably had the same idea and reviewed 134 publications that report research on emotional effects, so that you don't have to. Her review "Autonomic nervous system activity in emotion", published in 2010 in Biological Psychology, holds some interesting findings as well as food for thought.

Before getting to the eye-opening results, there are a few take-aways from her research that might be interesting.

  1. Most research is done on negative emotions, and negative emotions have generally been associated with stronger autonomic reactions. However, her review does not report on the magnitude of changes, partly for the reasons described in point 3.
  2. Heart rate was the most used measurement in those 134 studies, followed by GSR; both were used far more often than HRV or respiration.
  3. Baselining! Do it! Some of the studies she reviewed did, others didn't. There are a number of ways to establish a baseline: neutral or benchmarking/variable. While there is no definitive way to do it (which makes comparing between studies more complicated), the important thing is that you use some kind of baseline to compare your results against (a minimal baseline-and-smoothing sketch follows this list).
  4. A rose is a rose is a rose. But with emotions, the same term can mean different things. Disgust can be induced by contamination (dirty toilets, foul food) or by injury (mutilations, blood). Sadness can provoke crying or not, and there are many other ambiguities: although anger, for example, is always a negative emotion, it can drive people away (withdrawal/avoidance motivation) or pull them closer in an aggressive move (approach motivation). Emotions that are semantically close, such as fear and anxiety or amusement and happiness, might still be based on distinct behavioural systems!
  5. Artifacts may influence the measured response. Watch out for the effects of posture and movement, ambient temperature and cognitive demands! Use sensors that give you good signal quality to begin with. If you then use TEA Captiv you can process the data, apply artifact detection algorithms, and filter to smooth the data and eliminate unwanted effects.
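
Here is that sketch: a minimal example of take-aways 3 and 5, assuming a plain-text file of raw samples, a 32 Hz sampling rate and a 60 s neutral resting period at the start of the recording. The window lengths are illustrative choices, not standards.

```python
# Minimal sketch of take-aways 3 and 5: baseline-correct a signal against a
# neutral resting period, then smooth it with a moving average. The filename,
# sampling rate, baseline length and window size are all assumptions.
import numpy as np

fs = 32                              # assumed sampling rate in Hz
signal = np.loadtxt("hr_trace.csv")  # hypothetical file of raw samples

baseline = signal[: 60 * fs].mean()  # mean over an assumed 60 s neutral baseline
corrected = signal - baseline        # reactions are now relative to rest

win = fs                             # 1 s moving-average window
smoothed = np.convolve(corrected, np.ones(win) / win, mode="same")
```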

There are a few more things to consider when comparing data from different studies, but these are my personal top 5 take-aways. Apart from the results, of course.

The table below shows my summary of her results. In her review she reports that HR increased in most emotions, including surprise. She concludes, however, that HR decreased in emotions that involve an element of passivity, such as non-crying sadness, contentment, visual anticipatory pleasure and suspense.

GSR increased in most emotions, probably reflecting motor preparation and an increased action tendency. Of the emotions more commonly induced in (commercial) research, only non-crying sadness lowers GSR; all other emotions tend to increase the response.

HRV has been shown in quite a few studies to be a useful indicator of cognitive workload: low HRV usually correlates with high stress levels. Since her review was focused on emotions, cognitive workload was not considered, and HRV turned out not to be too helpful here.
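
For readers who want to compute HRV themselves, RMSSD is one of the standard time-domain metrics. The sketch below uses made-up RR intervals; what counts as "low" is study-specific and not given here.

```python
# Minimal sketch: RMSSD, a common time-domain HRV metric, from RR intervals.
# The RR values below are made up for illustration.
import numpy as np

rr = np.array([812, 798, 806, 775, 790, 820, 801])  # RR intervals in ms
diffs = np.diff(rr)                                  # successive differences
rmssd = np.sqrt(np.mean(diffs ** 2))                 # root mean square of those differences
print(f"RMSSD = {rmssd:.1f} ms")
```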

The table shows different emotions and how they influence each measurement, which can increase (+), decrease (-), depend on different factors (D) or be indecisive (I).

So, what does this table tell us? Is there a specific fingerprint of biometric activities unique to each emotion?
Maybe!
Under very controlled conditions and also taking into account other types of measurements there might be potential to discover a unique signature to some emotions.

Unfortunately for many researchers, very distinct emotions such as anger and happiness, or joy and disgust, have very similar bodily reactions if we look only at HR, GSR and respiration rate. Different types of sadness can cause a variety of reactions, which in my opinion makes it a very interesting research subject, but doesn't make everyday research any easier. Especially in commercial research, you may not be able to control for every possible factor or to use very many sensors.

My personal conclusion is that while tools such as GSR, HR, respiration or heart rate variability can help us detect emotions, in most research projects they don't let us uncover which emotion it is, or even whether it is positive or negative.
But on the positive side, we still have a few other tools in our Neuro-Toolbox that can help us along the way: Facial Expression Analysis, for example, Implicit Association Tests, or even EEG can help us understand emotions, associations and motivations, and thus help us detect valence or even classify emotions.

With this in mind, I’ll be covering Facial Expression Analysis in my next post as it is probably the easiest to use out of the three.


If you want to dig deeper into the original report, you can find it here: http://www.sciencedirect.com/science/article/pii/S0301051110000827
Sylvia D. Kreibig has since been involved in some interesting research projects following up on these results. Take a look at her work on ResearchGate.

Tobii Eyetracking & Garmin GPS/ANT+ Data for Sports and Real-World Wayfinding Research October 31, 2016

Posted by Scott Hodgins in Biometric, eye tracking, Glasses, Tips And Tricks, Tobii, Uncategorized.

In Jon's previous blog post he mentioned me running and training with some friends over at Forest Fit Clubs and added a video link. I wanted to reply to the inevitable questions about data overlays and how we did this with the Tobii software. The short answer is that we didn't; here's a "how to" to get you started.

This version is based on running; other examples include:

  • Wayfinding – overlay GPS data on the eyetracking video so you can immediately see where pedestrians moved in the built environment, and understand how people use signage: do they see it? Do they understand and use it?
  • Driving & Flying – use the GPS and speed/G-Metrix data to understand the technique and the relationship between looking, engaging and acting on that information.
  • Data overlays are not limited to the Garmin descriptions – you can hack the overlays to change the title and add, say, GSR data, or cognitive and arousal metrics from an EEG such as the ABM X-10.

Overview.
We wanted to show the power of adding basic data overlays onto the eyetracking video so we could easily associate an action from the video with a resultant change in data. We had a Garmin VIRB XE that we had used for a technical demonstration with a customer. I had noticed that the VIRB produced MP4 files, as did the Tobii Glasses 2 (TG2), so the idea of hacking the workflow, swapping the VIRB video out and overlaying biometric and location data over the TG2 video data was born. Below is a video showing an overview of the software.

The kit list:
1 x Tobii Glasses 2 (any variant)
1 x Garmin Fenix 2 Sports watch (now replaced by the Fenix 3 and Fenix 3 HR, which may be an alternative source for HRM data)
1 x Garmin HRM RUN Heart rate monitor, we wanted this as it also offered specific run data, not just HR
1 x Garmin VIRB XE & Garmin VIRB Edit Software (we didn’t use the camera, just the software)
1 x Willing participant (me, and yes I calibrated myself, in daylight, outside, it is super-easy with the TG2) with a suitable pocket or running belt to stash the recording unit in.

The steps:

  1. Assemble the TG2:
    Connect Head Unit-Cable-Recording Unit, Insert SD Card, Insert Battery & power up. This took about 5 minutes instead of the normal 2 minutes as I threaded the cable under my base layer to control the cable movement and placed the recording unit in a neoprene running belt to control movement. (1)
  2. Power up the controlling computer; we use Dell Venue 11 Pros (now Dell Latitude 11 5000) running W7 Pro or W10 Pro x64.
  3. Connect to the TG2 WLAN, start Tobii Glasses Controller, select “New Recording”, add a participant name and description.
  4. Calibrate: select the icon in the bottom right of the screen to calibrate (easy on the tablet – just touch) and look at the calibration marker – for non-Tobii users, a fully 3D calibration typically completes in under 5 s. A major advantage of this ground-breaking 3D calibration model is that we don't have to try to "work around the data" during analysis. (2)
  5. Start the recording, then start the recording on the Garmin Fenix while looking at the screen – it's not perfect, but we should be able to sync the data to the frame where the recording started, and at 25 fps video we are sure that we are synced to within about 40 ms (one frame). (3) Turn the laptop off, or put it in sleep mode.
  6. Run around, enjoy the beautiful scenery at Somerley.
  7. Finish the session, cool down, stretch – at up to 90 minutes per battery we have plenty of time to head back to the car and stop the recording on both the Garmin Fenix and the TG2.
  8. Stop the recording, then select the recording and output the 1080p HD video.
  9. Sync the Garmin to the cloud – in this case via Bluetooth to Garmin Connect on my iPhone, then auto-synced to the cloud (connect.garmin.com).
  10. Login to your connect account, select the activity and download the FIT or GPX data from this session.
  11. Open VIRB Edit, create a new video and import the video you exported from the Tobii Glasses Controller, then add this to the video timeline.
  12. Import the FIT or GPX data, click on G-Metrix and then on Data and find your file.
  13. Sync the two files using one of the options at the foot of the G-Metrix>Data dialogue.
  14. Now use the Templates and Gauges options to add data overlays onto the video; you can use Appearance to change the colour of the gauges.
  15. Importing the logo and setting up a new template is more art than science – good luck with that. I think it took me about a dozen failed attempts before it magically worked. I took the time to try again while writing this, and it now shows as a beta function in the software.
  16. Export the video to your chosen resolution and quality.

The next post will look at doing something similar, using TEA Captiv as a more scientific option, with multiple video feeds and more data options.

The end result:

Notes:

  1. It is worth taking 5 minutes here to make sure you have free movement before starting; otherwise turning too sharply could lead to disconnection or discomfort. Because I used the wireless version, once I was wired up and adjusted I didn't need to touch the system again until I was retrieving data.
  2. Other wearable/head mounted eyetrackers have issues when we start looking in different planes. Their calibration model is typically a single-plane transform that is adequate when looking in the calibrated plane, but the calibration will suffer when looking in a different plane. For example, if we calibrate on a vertical wall (looking at the corners of a picture) and then place that picture flat on the desktop, we will see trapezoidal offsets; this is also true if we calibrate in a horizontal plane (desk) and look at a vertical target (wall). The result is that if we are not cognisant of this and take the distorted (erroneous) data at face value, we risk producing worthless results.
  3. There is a sync port on the Tobii that can send/receive an LVTTL pulse to/from an external device; however, the Garmin watch isn't designed for this, so we chose to keep it simple with a video based sync.
  4. Garmin data formats: I have always used the GPX download to import into VIRB Edit; the FIT data caused a few anomalies, specifically with time and GPS origin. The FIT file has all of the data recorded in Garmin Connect, the GPX has less, but there was still enough for this example.
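
If you want to sanity-check the GPX export from step 10 before pulling it into VIRB Edit, a few lines of scripting will do. This sketch reads only the core GPX 1.1 schema; the filename is a placeholder, and Garmin's heart-rate extension lives in a separate namespace that is left out here.

```python
# Minimal sketch: read timestamps and positions from a Garmin Connect GPX
# export, e.g. to check alignment against the video sync frame. Core GPX 1.1
# schema only; the filename is hypothetical.
import xml.etree.ElementTree as ET

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}
root = ET.parse("activity.gpx").getroot()  # hypothetical export filename

for pt in root.findall(".//gpx:trkpt", NS):
    time = pt.findtext("gpx:time", namespaces=NS)
    print(time, pt.get("lat"), pt.get("lon"))
```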

3D – The Key to Tobii’s Performance Lead October 17, 2016

Posted by Scott Hodgins in eye tracking, Glasses, Market Research, Marketing, Media, neuromarketing, Shopper Research, Technology, Tips And Tricks, Tobii, Updates, Usability & UX.

This post tries to answer some of the most common questions we get asked: why should I buy a Tobii? Why is it better? System "X" has a "better head box" and system "Y" is cheaper.

The answer from our point of view is simple: the eyetracking is more accurate than other systems, for more people, over a longer timeframe.

This is a pretty grand claim, so why are we so confident?

Let's start at the beginning. Eyetracking itself is straightforward: there are several well documented methods to find and follow the pupil, and Tobii uses a non-intrusive video based technique called "Pupil Centre Corneal Reflection" (PCCR). Essentially, an IR illuminator is used to help differentiate between the pupil and the iris; it also creates a highlight, or glint, that we use as well. The Tobii systems use an improved version of this idea, the secret sauce being a combination of two things: illumination and data modelling. These two areas allow the remote and wearable trackers to monitor the respondent's relative 3D position in space and adjust the calibration parameters in the 3D physiological model, and therefore afford a far greater range of movement than similar systems while keeping accuracy and precision.
(Figure below shows the native 3D data from the TG2)

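To make the PCCR idea concrete, here is an illustrative sketch of the pupil-detection half: threshold a dark pupil in an IR eye image and take the centroid of the largest blob. This shows the principle only; it is not Tobii's implementation, and the filename and threshold value are assumptions.

```python
# Illustrative sketch of pupil detection, the first half of PCCR: threshold a
# dark pupil in an IR image and take the centroid of the largest blob.
# Not Tobii's implementation; the filename and threshold are assumptions.
import cv2

eye = cv2.imread("ir_eye_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical IR frame

# Dark-pupil assumption: the pupil is the darkest large region in the image.
_, mask = cv2.threshold(eye, 40, 255, cv2.THRESH_BINARY_INV)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

pupil = max(contours, key=cv2.contourArea)          # largest dark blob
m = cv2.moments(pupil)
cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]   # pupil centre in pixels
print(f"pupil centre ~ ({cx:.1f}, {cy:.1f})")
```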

Illumination: Tobii can use up to two different lighting techniques, known as bright and dark pupil, to optimise the illumination for the participant in that location – and crucially, when they move, we can adapt the illumination to keep track of them. This allows a Tobii system to offer people greater freedom of movement while retaining tracking accuracy, without the need for constant drift correction from the system operator.

Data modelling: the Tobii method is different in that their research-grade eyetrackers have typically used multiple cameras, and have done since the launch of the T and X series systems in 2007/8. The advantage of using multiple cameras is that we can physically describe the location of the eye in space. That is to say, we know with a very high degree of accuracy where the centre of your eye is, and which eye it is, for every sample recorded. The slightly different images from the pair of cameras in an X2, for example, allow the creation of a 3D physiological model of the eyes being tracked during calibration. This approach allows Tobii to understand the movement of the eye, or of the eyetracker, should one or the other move, and to adjust the calibration accordingly with a high degree of precision.
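
The underlying geometric idea is classic stereo triangulation: two calibrated views of the same feature pin it down in 3D. The sketch below shows that general principle only, not Tobii's model; the projection matrices and pixel coordinates are invented for illustration.

```python
# Illustrative sketch: triangulate a 3D point (e.g. an eye centre) from two
# calibrated camera views. General stereo principle only, not Tobii's model;
# the projection matrices and pixel coordinates are invented.
import numpy as np
import cv2

# Assumed 3x4 projection matrices for a calibrated stereo pair (~60 mm baseline).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-60.0], [0.0], [0.0]])])

# The same feature as seen by each camera (2x1 image coordinates).
x1 = np.array([[320.0], [240.0]])
x2 = np.array([[300.0], [240.0]])

X_h = cv2.triangulatePoints(P1, P2, x1, x2)  # homogeneous 4x1 result
X = (X_h[:3] / X_h[3]).ravel()               # 3D position in camera units
print("triangulated point:", X)
```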

The net result is that these systems can accommodate movement, even if the head leaves the area trackable by the hardware, and can recover tracking when the eyes are visible again. This is one of the reasons people keep choosing Tobii for demanding applications like infant research and in-vivo commercial research. In a recent study Acuity Intelligence recruited 330 people as they were entering supermarkets and didn't have to turn away a single participant because they could not be tracked – a first for any data collection exercise with this number of people, regardless of the brand of technology used.

Don't just take our word for it, please challenge us: whether it is onscreen, in the real world or in the emerging AR and VR application areas, we can help.


Tobii Studio 3.2 Launched December 18, 2012

Posted by Jon Ward in eye tracking, Studio, Technology, Tips And Tricks, Tobii.

We are pleased to announce the latest version of the world's most popular eye tracking analysis software. Studio 3.2 has a number of new features, including task based segmentation, a new visualisation tool and a number of performance tweaks. Go to the Tobii website and download it now; if your support and upgrade contract isn't up to date, get in touch with us and we will give you a quote so you can continue to enjoy the constant improvements and new functionality of Tobii Studio.

Eye Tracking Ronaldo and Combining Eye Tracking With Motion Capture September 23, 2011

Posted by Jon Ward in Advertising, Dikablis, Ergoneers, eye tracking, Glasses, Technology, Tips And Tricks.

It's been a pretty hectic couple of weeks, with a lot of interest in the Dikablis eye tracker after the Ronaldo footage was shown across the world. I finally got around to posting the footage on YouTube; for those that didn't see it, go to http://www.youtube.com/user/AcuityETS#p/a/u/0/2NcUkvIX6no and check it out if you haven't already.

But the TV show didn't allow us to demonstrate the full capabilities of the Dikablis system, which is designed not only for eye tracking but also for integration with other software platforms, research tools and custom interfaces. Not only does the Dikablis allow fully wireless transmission and live viewing of data up to 5 km away (it was designed from inception for testing vehicles and ergonomics in real world scenarios), but we can also synchronise the data stream to any number of input devices. The SimLab driving simulator takes things a step further: with a choice of screen sizes and types you can create VR worlds and eye track drivers as they navigate, drive and interact with the vehicle. On top of this, every key switch, button press and acceleration is logged, marked and tied together, allowing you to accurately plot every aspect of the user's behaviour.

For other types of experiment, such as gait monitoring, sports studies and environmental testing, we can look at the D-Control interface. This allows us to add 4 additional video streams to the eye tracking data, so this could be different views of the participant, different elements the user will interact with, or perhaps different views of a cricketer's approach to bowl.

But we aren't finished there: we can also use the Dikablis system with the Vicon motion capture system, as seen tracking Ronaldo in the TV show and in this demonstration video: http://www.youtube.com/watch?v=tqVynnEXpY8. So as well as having multiple views of the participant filmed by traditional cameras, we can also monitor every aspect of their body motion and limb movement, and add accurate eye gaze as well. The possibilities for sports science and medical research are endless. And what about CGI and standard motion capture, where eye gaze has been "guesstimated" in the past? We can now say with 100% certainty where someone was gazing as they ran, walked, tackled and interacted.

The Dikablis has a very open API which can stream data live, so the options don’t end there either – you can integrate the system into your own systems, your own code and your own research. Give us a call on +44 (0) 1189 000795 or email us at sales@acuity-ets.com for more details or to request a demo.
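
As a rough idea of what consuming a live stream can look like, here is a deliberately generic sketch. The host, port and line format are invented for illustration; consult the Dikablis API documentation for the real protocol.

```python
# Hypothetical sketch of consuming a newline-delimited live data stream over
# TCP. The address, port and payload format are invented for illustration;
# the real Dikablis protocol is documented in its API materials.
import socket

with socket.create_connection(("192.168.1.50", 5000)) as sock:  # assumed address
    buf = b""
    while True:
        buf += sock.recv(4096)
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            print("sample:", line.decode(errors="replace"))
```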

Tobii Studio 2.3.2. Released! September 5, 2011

Posted by Jon Ward in eye tracking, Studio, Tips And Tricks, Tobii, Updates, Usability & UX.

Just a quick note to say that Tobii Studio 2.3.2 has been released. This latest update improves the processing speed of the web grouping tool, so it will be of massive benefit to all those web usability testers out there!

To download the latest version you need to have a valid support contract and go to HELP > CHECK FOR UPDATES within Studio. If you haven’t got a valid support contract get in touch on sales@acuity-ets.com and we can discuss your options… Studio 2.4 is scheduled for release at the end of the year, so keep your eyes out for that as well!

Please note Studio 2.3 is NOT compatible with X50, 1750 and 2150 eye trackers.

Tobii Mobile Device Test Platform May 27, 2011

Posted by Jon Ward in eye tracking, Studio, Technology, Tips And Tricks, Usability & UX.

Testing mobile devices with eye tracking has always posed problems. Screen based systems allow testing in a number of ways, from emulation to hiding a unit under a table (see the white paper on testing mobile devices, available on our Scribd page), and head mounted systems allow the freedom to test but sometimes suffer from low clarity in the forward facing camera. When it comes to working with the data to produce findings, easy aggregation hasn't been possible – meaning more qualitative analysis and longer lead times for projects.

Taking this into consideration, Tobii have designed a platform to utilise the current X series eye trackers, which allows repeatable, accurate testing of mobile devices and aggregated visual and statistical outputs using the Tobii Studio software platform. Because the X series is used, there is no requirement for the user to wear any eye tracking equipment, and the interactions can be very free and natural. There are a variety of additional mounts and ways to extend the system's capabilities to allow for testing of phones, tablets and also other devices or physical objects – such as documents or magazines.

With Tobii Studio's ability to group and compare different types of stimuli and tests, we can now accurately compare, tabulate and analyse multi-platform products, websites, interactions and user journeys with a consistent range of deliverable tools, visuals and metrics – a big leap forward for eye tracking!

The stand is due to be shipped from the middle of June, and pre-orders are already stacking up – have a look at the data sheet or contact us at sales@acuity-ets.com for more – or give us a call on 01189 000795. We also hope to have the stand at the Tobii UX Meeting next month…

Tobii_Brochure_MobileSolutions_18022011_UsENG_Korr2

Merging Projects In Tobii Studio May 23, 2011

Posted by Jon Ward in eye tracking, Glasses, Media, Studio, Tips And Tricks, Tobii.

One of the common questions we get from clients relates to merging data from different Tobii projects, and more often than not the question is "Why can't I merge data from two different Tobii Studio projects?", so I thought it was time to do a little blog post to help people through this and explain a workflow that avoids any potential issues.

The main thing to understand is that when a project folder is created within Studio it is given a ‘digital signature’ of sorts which is unique to that project and that copy of Studio. If you want to merge data later on YOU MUST USE THIS PROJECT AS THE TEMPLATE!

To give you an example of what I mean, let's assume we are collecting data in two locations. At each location I create the test using the stimuli required and put them into a timeline that is exactly the same on both machines. On paper these look identical and you would think you can merge the data – you can't. The digital signature of each test is different and therefore they are standalone projects, regardless of how identical they may appear. In this scenario you should create the blank project at one location, filling the timeline with the relevant stimuli and so on. Once the test is ready to go, export the project using the "EXPORT PROJECT ARCHIVE" option in the "FILE" menu within Studio. At the second (and each subsequent) location we then import the test template into the Studio licence there, thereby retaining the digital signature and allowing us to merge data sets at a later date.

Once data capture is completed we use the "EXPORT" and "IMPORT" project archive tools accordingly, and when we import we get the option either to merge the data sets or to keep them as separate tests in the library. Please note you cannot merge tests by doing the quick "copy and paste" style of export.

Hope this helps people and reduces confusion! As always fire over any questions to jon@acuity-ets.com or give me a call on 01189 000796 and I’ll be happy to help.

Tobii Glasses – Take 2 Cases To Do Research? Not Us, We Take 1 And Go…. April 15, 2011

Posted by Jon Ward in Advertising, eye tracking, Glasses, Market Research, Shopper Research, Studio, Tips And Tricks, Tobii.

Ok, so it is a really bad pun and only those who have seen the "Wash 'n' Go" adverts will understand it, but it gets the message across. Much as we love the innovative and amazing Tobii Glasses, having the IR markers and glasses system in two cases is a pain – especially when flying on, ahem, more cost effective airlines. So we searched for an alternative and came up with this hardened ABS case (the same stuff some of the Tobii screen-based eye trackers come protected in, and the cases we use for our loan PCs) and we love it!

The trays fit nicely on top of each other, with the markers on the bottom and the glasses layer on top. There is room for documents, manuals and mounting accessories in the lid, and it has four clip locks, as well as a combination lock and a great shoulder strap! An alternative option we are putting together is a layer to lock in your laptop and charger, so one case rules them all! (Ok, another awful reference but it's been a long week….)

Plus it looks so much more James Bond! And instead of being nervous about your expensive equipment being damaged, the hardened case means it is much more resilient to knocks and can even be put into hold baggage – or trusted with Scott or me! For more details, pricing and availability contact sales@acuity-ets.com or call us on 01189 000795.

Tobii SDK Version 3.0 – Available for FREE Beta Release! December 15, 2010

Posted by Jon Ward in eye tracking, Technology, Tips And Tricks, Tobii, Updates.

More good news for all those people wanting to use their Tobii system in some weird and wonderful way, or to integrate it into their own software as an input device! New functionality allows more access to gaze data – 3D plotting of the eye is now possible, as well as access to more detailed calibration and validation functionality – and more!

Follow this link for the product page and download : http://landningpages.tobii.com/landingpads/analysis_sdk_beta.aspx

And if you do something odd, cool, fun or amazing let me know!