
HTC Vive Eye Tracking Integration – See Virtual, In Reality July 5, 2017

Posted by Jon Ward in Uncategorized.
add a comment

Our most popular product at the moment is by far the new Tobii HTC Vive eye tracking integration and we are happy to announce we have several demo units available, packaged up and ready to hit the road to demonstrate to our customers.

The initial feedback, both internal and external, has been overwhelmingly positive, and with the headset and Razer Blade laptop all neatly packaged in a customised Peli case, we want to visit you and showcase the capabilities and possibilities of this amazing technology.

 

Drop us an email on sales@acuity-ets.com or call us on 01189 000795 to get a date in the diary.

Neuro-Tools: Emotion Detection January 16, 2017

Posted by eyetrackrob in Biometric, Captiv, neuromarketing, Tips And Tricks, Uncategorized.
1 comment so far

Much of the research that requires biofeedback involves emotions and their detection or classification. But as you have surely noticed, emotions are a complex topic with many different aspects and challenges. Measuring when and how strongly these events occur is relatively easy with electrodermal, cardiovascular or respiratory sensors and software to process the raw data (which may even apply emotion detection algorithms, as TEA Captiv does) – things I covered in my previous posts.

But as soon as we know when and how strongly an emotion occurred, the inevitable question comes up: “was it a positive or negative emotion?” (valence), with the usual follow-up: “which emotion was it?” (classification).

We have seen that each tool has its merits when it comes to finding emotions, but most biofeedback sensors on their own have the limitation that they can’t really give us answers regarding those rather interesting questions about valence or classification.

However, countless studies exist that cover specific emotions or sets of emotions and that use different sensors to measure the bodily reactions to them. If we could review all those studies, we could surely come up with a map showing which emotions are best captured with which sensors, and how the measurements differ from one emotion to another, so that specific emotions (i.e. fear, happiness, anger etc.) could be identified from those measurements.

Identifying emotions just by looking at the different biometric measurements. Is it possible?

Sylvia D. Kreibig probably had the same idea and reviewed 134 publications reporting research on emotional effects, so that you don’t have to. Her review “Autonomic nervous system activity in emotion”, published in 2010 in Biological Psychology, holds some interesting findings as well as food for thought.

Before getting to the eye opening results, there are a few take-aways from her research that might be interesting.

  1. Generally, most research is done on negative emotions, and negative emotions have in general been associated with stronger autonomic reactions. However, her review did not report on the magnitude of changes, partly for the reasons described in point 3.
  2. Heart rate was the most used measurement in those 134 studies, followed by GSR, and both were used far more often than HRV or respiration.
  3. Baselining! Do it! Some of the studies she reviewed did it, others didn’t. There are a number of ways to establish a baseline: a neutral condition or a benchmark/variable one. While there is no definitive way to do it (which makes it more complicated to compare between studies), the important thing is that you use some kind of baseline to compare your results to (a minimal sketch follows after this list).
  4. A rose is a rose is a rose. But with emotions, the same term can mean different things. Disgust can be induced by contamination (dirty toilets, foul food) or by injury (mutilations, blood). Sadness can provoke crying or not, and there are many other ambiguities: although anger, for example, is always a negative emotion, it can drive people away (withdrawal/avoidance motivation) or pull them closer in an aggressive move (approach motivation). Emotions that are semantically close, such as fear and anxiety or amusement and happiness, might still be based on distinct behavioural systems!
  5. Artifacts may influence the measured response. Watch out for the effects of posture and movements, ambient temperature and cognitive demands! Start with sensors that give you good signal quality. If you then use TEA Captiv you can process the data, apply artifact detection algorithms and filters to smooth the data and eliminate unwanted effects.
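To make the baselining point concrete, here is a minimal sketch (hypothetical numbers, not TEA Captiv code) of the simplest neutral-baseline approach: record a quiet period first, then express the task data relative to that participant's own baseline.

```python
# Minimal baseline-correction sketch (hypothetical data, not TEA Captiv code).
def baseline_correct(signal, baseline):
    """Subtract the mean of the neutral/baseline period from every sample."""
    baseline_mean = sum(baseline) / len(baseline)
    return [sample - baseline_mean for sample in signal]

# Example: GSR in microsiemens, recorded before (baseline) and during a task.
baseline_gsr = [2.1, 2.0, 2.2, 2.1]       # neutral viewing
task_gsr = [2.2, 2.6, 3.1, 2.8, 2.3]      # stimulus presentation

print(baseline_correct(task_gsr, baseline_gsr))
# Values above zero are increases relative to this participant's own baseline.
```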

There are a few more things that need to be considered when comparing data from different studies but these are my personal top 5 take-aways. Apart from the results of course.

The table below summarises her results. She reports that HR increased in most emotions, as well as in surprise, but concludes that HR decreased in emotions that involve an element of passivity, such as non-crying sadness, contentment, visual anticipatory pleasure and suspense.

GSR increased in most emotions, probably reflecting motor preparation and an increased action tendency; of the emotions more commonly induced in (commercial) research, only non-crying sadness lowers GSR. All other emotions tend to increase the reaction.

HRV has been shown in quite a few studies to be a useful indicator of cognitive workload: low HRV usually correlates with high stress levels. Since her review focused mainly on emotions, cognitive workload was not considered and HRV turned out not to be particularly helpful here.

The table shows different emotions and how they influence the measurements, which can increase (+), decrease (-), depend on different factors (D) or be indecisive (I).

So, what does this table tell us? Is there a specific fingerprint of biometric activities unique to each emotion?
Maybe!
Under very controlled conditions and also taking into account other types of measurements there might be potential to discover a unique signature to some emotions.

Unfortunately for many researchers, very distinct emotions such as anger and happiness, or joy and disgust, have very similar bodily reactions – if we look only at HR, GSR and respiration rate. Different types of sadness can cause a variety of reactions, which makes sadness a very interesting research subject in my opinion, but it doesn’t make everyday research any easier; especially in commercial research you might not be able to control for every possible factor or to use too many sensors.
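To make that ambiguity concrete, here is a toy lookup based only on the directions discussed above (HR and GSR up for most emotions, both down for non-crying sadness). It is an illustration of the problem, not a validated classifier, and the entries are deliberately limited to what is stated in this post.

```python
# Toy illustration of why HR/GSR direction alone cannot name an emotion.
# "+" = increase, "-" = decrease; entries reflect only the directions
# discussed in the text, NOT a validated emotion classifier.
signatures = {
    "anger":              ("+", "+"),   # (HR, GSR)
    "happiness":          ("+", "+"),
    "non-crying sadness": ("-", "-"),
}

def candidates(hr_direction, gsr_direction):
    return [emotion for emotion, sig in signatures.items()
            if sig == (hr_direction, gsr_direction)]

print(candidates("+", "+"))   # -> ['anger', 'happiness']  (ambiguous)
print(candidates("-", "-"))   # -> ['non-crying sadness']
```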

My personal conclusion is that while tools such as GSR, HR, respiration or heart rate variability can help us detect emotions, in most research projects they don’t let us uncover which emotion it is, or even whether it is a positive or negative one.
But on the positive side, we still have a few other tools in our Neuro-Toolbox that can help us along the way: Facial Expression Analysis for example, Implicit Association Tests or even EEG can help us to understand emotions, associations and motivations and thus help us to detect valence or even to classify emotions.

With this in mind, I’ll be covering Facial Expression Analysis in my next post as it is probably the easiest to use out of the three.

 

 

If you want to dig deeper into the original report, you can find it here: http://www.sciencedirect.com/science/article/pii/S0301051110000827
Sylvia D. Kreibig has since been involved in some interesting research projects following up on these results. Take a look at her work on ResearchGate.

Neuro-Tools : Heart Rate & Respiration November 21, 2016

Posted by eyetrackrob in Biometric, Captiv, eye tracking, Market Research, Marketing, neuromarketing, TEA, Technology, Uncategorized.
1 comment so far

Although not as quickly as I thought, step by step I’ll be covering the most relevant biofeedback sensors in this blog series. So far I’ve only managed to write about GSR, one of the sensors of the hour! Galvanic Skin Response has been around for a long time and in recent years it has gained lots of attention from researchers, but as you might have read in my last post, although it deserves all the attention it gets, it’s not always that simple to use.

Other measurements mentioned before that could tell you more about emotions or cognitive workload are respiration and heart rate, and from the latter also heart rate variability (HRV).

Heart Rate

Heart Rate (HR) is the number of complete heartbeats within a specific time window, typically expressed as beats per minute (bpm). HR is constantly and antagonistically influenced by the sympathetic nervous system (SNS) and the parasympathetic nervous system (PsNS), and like GSR it unfolds rather slowly – in fact, with peak effects observed after about 4 seconds and a return to baseline after about 20 seconds, it is much slower than GSR. Heart Rate Variability (HRV), on the other hand, expresses the quick variations in the interval between heart beats. The time between beats is measured in milliseconds (ms) and is called an “R-R interval” or “inter-beat interval” (IBI).
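As a concrete illustration, a few lines of Python (with made-up IBI values, not tied to any particular device) show how HR and two common HRV measures can be derived from a list of R-R intervals:

```python
import statistics

# Hypothetical inter-beat intervals (R-R intervals) in milliseconds.
ibi_ms = [810, 790, 805, 850, 820, 795, 830, 815]

# Heart rate: 60,000 ms per minute divided by the average interval.
mean_ibi = statistics.mean(ibi_ms)
heart_rate_bpm = 60000 / mean_ibi

# Two common HRV measures:
# SDNN  - standard deviation of all intervals
# RMSSD - root mean square of successive differences (beat-to-beat variability)
sdnn = statistics.stdev(ibi_ms)
successive_diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
rmssd = (sum(d * d for d in successive_diffs) / len(successive_diffs)) ** 0.5

print(f"HR: {heart_rate_bpm:.1f} bpm, SDNN: {sdnn:.1f} ms, RMSSD: {rmssd:.1f} ms")
```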


Image 1: shows a typical heart rhythm as recorded by an electrocardiogram (ECG). You can see heart rate (4bpm) as well as the differences in the inter-beat intervals.

Both measurements (HR and HRV) are closely related to emotional arousal, with HRV allowing for assessment of more sensitive and quicker changes, which also can be related to stress and cognitive workload (this might be a good topic for a follow up post).

While many fitness devices today measure heart rate in the context of fitness and well-being, those solutions might not be ideal for your research. One of the reasons for this is the processing and averaging of data that goes on inside the sensor.


Image 2: The same recording as an averaged data export (blue) and as it was displayed during the recording (orange). The data was recorded with a wrist-worn device measuring HR optically using light. In the averaged data the highest heart rate is around 100 bpm. In the live stream the same time frame shows much more variability (still averaging around 100 bpm), and it is clearly visible that this is not the highest value of the recording.

 

As mentioned above, heart rate has a relatively low sensitivity and a slow response. Many wearable fitness trackers don’t let you export the data for further analysis, or give access only to averaged data in which quick spikes have been eliminated as noise. The result of this preprocessing is that the effects of an emotion might be lost altogether. To compute HRV correctly, on the other hand, continuous and precise measurements must be guaranteed: just 2-3 missed data points can mean inaccurate calculations of the times between beats and thus, again, missed relevant events.
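A tiny sketch of why averaged exports can hide exactly the events you care about (the numbers are invented for illustration):

```python
# Made-up 1 Hz heart-rate samples containing a short spike (e.g. a startle).
hr_samples = [78, 79, 80, 81, 118, 117, 82, 80, 79, 78]

# A 10-second average, as many fitness exports provide, flattens the spike:
print(max(hr_samples))                     # 118 bpm visible in the raw stream
print(sum(hr_samples) / len(hr_samples))   # ~87 bpm in the averaged export
```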


Image 3: In the live visualization the highest heart rate reaches 145 bpm. However, the suspiciously round shape leading up to the peak value indicates that data points are missing and the data was interpolated. This becomes clear when looking at the averaged data. This data would not be suitable for interpretation of HR or HRV.

Another reason why many heart rate trackers available for fitness purposes are not necessarily suitable for researchers is that most of them are worn on the wrist and use light to measure blood flow, deriving the heart rate from that. Compared to sensors that are placed close to the heart and measure electrical impulses (electrocardiogram/ECG), sensors on the wrist have to compensate for movement, muscle tensing, sweating and potentially light interference. ECG sensors are therefore the recommended tool for research data collection, as they are more sensitive to the relevant signal characteristics.


Image 4: ECG Sensor as belt or as electrodes

Respiration

Research has associated respiration rate and depth with emotional impact and emotional valence. Interestingly, olfactory information ascends directly to limbic areas and is not relayed through the thalamus like other sensory input. The thalamus is a part of the brain that acts as a relay and pre-processing stage for sensory information and is considered relevant for regulating consciousness, arousal, wakefulness and alertness. Because olfactory information is not relayed through this part of the brain, a different mechanism makes it conscious, which leads to a quicker physiological response and an unconscious alteration of the respiratory pattern. Respiration patterns therefore make it possible to identify potentially unconscious liking or disliking as well as arousal. Deducing a unique emotion from respiration rate and depth does not seem to be possible, although more research is still needed in this area.

Respiration measurements can be obtained from dedicated clinical instruments or stretch-sensitive respiration belts, or they can be calculated from ECG data – the latter being the least invasive option for commercial research.


Image 5: Stretch-sensitive respiration belt

ECG data can be processed in TEA Captiv to obtain HR, HRV and even respiration rate, and as with GSR, all of these measurements can be synchronized with eyetracking to understand what visual information triggered a rise in HR, a change in HRV or an alteration of the respiration pattern.

In my next post I’ll take a look at how all these measurements can be combined and if through a combination it is possible to not only detect emotional events but also understand whether it is a positive or negative emotion and even which specific emotion it is. So, watch this space for more!

 

Tobii Eyetracking & Garmin GPS/ANT+ Data for Sports and Real-World Wayfinding Research October 31, 2016

Posted by Scott Hodgins in Biometric, eye tracking, Glasses, Tips And Tricks, Tobii, Uncategorized.
add a comment

In Jon’s previous blog post he mentioned me running and training with some friends over at Forest Fit Clubs and added a video link. I wanted to answer the inevitable questions about data overlays and how we did this with the Tobii software. The short answer is that we didn’t – here’s a “how to” to get you started.

This version is based on running; other examples include:

  •  Wayfinding – overlay GPS data on the eyetracking video so you can immediately see where pedestrians moved in the built environment, and understand how people use signage: do they see it? Do they understand and use it?
  • Driving & flying – use the GPS and speed/G-Metrix data to understand the technique and the relationship between looking, engaging and acting on that information.
  • Data overlays are not limited to the Garmin descriptions – you can hack the overlays to change the title and add, say, GSR data, or cognitive and arousal metrics from an EEG such as the ABM X-10.

Overview.
We wanted to show the power of adding basic data overlays onto the eyetracking video so we could easily associate an action from the video with a resultant change in data. We had a Garmin VIRB XE that we had used for a technical demonstration with a customer. I had noticed that the VIRB produced MP4 files, as did the Tobii Glasses 2 (TG2), so the idea of hacking the workflow, swapping the VIRB video out and overlaying biometric and location data over the TG2 video data was born. Below is a video showing an overview of the software.

The kit list:
1 x Tobii Glasses 2 (any variant)
1 x Garmin Fenix 2 Sports watch (now replaced by the Fenix 3 and Fenix 3 HR, which may be an alternative source for HRM data)
1 x Garmin HRM RUN Heart rate monitor, we wanted this as it also offered specific run data, not just HR
1 x Garmin VIRB XE & Garmin VIRB Edit Software (we didn’t use the camera, just the software)
1 x Willing participant (me, and yes I calibrated myself, in daylight, outside, it is super-easy with the TG2) with a suitable pocket or running belt to stash the recording unit in.

The steps:

  1. Assemble the TG2:
    Connect Head Unit-Cable-Recording Unit, Insert SD Card, Insert Battery & power up. This took about 5 minutes instead of the normal 2 minutes as I threaded the cable under my base layer to control the cable movement and placed the recording unit in a neoprene running belt to control movement. (1)
  2. Power up the controlling computer, we use Dell Venue 11 Pro’s (Now Dell Latitude 11 5000) running W7 Pro or W10 Pro x64.
  3. Connect to the TG2 WLAN, start Tobii Glasses Controller, select “New Recording”, add a participant name and description.
  4. Calibrate: Select the icon in the bottom right of the screen to calibrate (easy on the tablet – just touch) and look at the calibration marker – for non-Tobii users, a fully 3D calibration typically completes in under 5 seconds. A major advantage of this ground-breaking 3D calibration model is that we don’t have to try and “work around the data” during analysis. (2)
  5. Start the recording, then start the recording on the Garmin Fenix while looking at the screen – it’s not perfect but we should be able to sync the data to the frame where the recording started and at 25fps video we are sure that we are sync’d to about 40ms. (3) Turn the laptop off, or put it in sleep mode.
  6. Run around, enjoy the beautiful scenery at Somerley.
  7. Finish the session, cool down, stretch – with up to 90 minutes per battery we have plenty of time to head back to the car and stop the recording on both the Garmin Fenix and the TG2.
  8. With the recording stopped, select it in the Glasses Controller and start to output the 1080p HD video.
  9. Sync Garmin to the cloud – in this case it was Bluetooth to Garmin Connect on my iPhone then auto sync’d to the cloud (connect.garmin.com)
  10. Login to your connect account, select the activity and download the FIT or GPX data from this session.
  11. Open VIRB Edit, create a new video and import the video you exported from the Tobii Glasses Controller, then add this to the video timeline.
  12. Import the FIT or GPX data, click on G-Metrix and then on Data and find your file.
  13. Sync the two files using one of the options at the foot of the G-Metrix>Data dialogue (a rough sketch of doing the same alignment by hand follows after this list).
  14. Now use either the Templates or the Gauges options to add data overlays onto the video; you can use Appearance to change the colour of the gauges.
  15. Importing the logo and setting up a new template is more art than science – good luck with that. I think it took me about a dozen failed attempts before it magically worked; I took the time to try again while writing this and it now shows as a beta function in the software.
  16. Export the video to your chosen resolution and quality.
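VIRB Edit does the heavy lifting for you, but if you want to sanity-check the alignment yourself, a rough sketch of reading the GPX export and mapping its points onto the video timeline could look like the following. The file name and the video start time are assumptions for illustration; the real values come from your own Garmin Connect export and from the frame where you started the watch while looking at the screen.

```python
import xml.etree.ElementTree as ET
from datetime import datetime

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def load_track(gpx_path):
    """Return a list of (utc_time, lat, lon) tuples from a Garmin GPX export."""
    root = ET.parse(gpx_path).getroot()
    points = []
    for trkpt in root.iterfind(".//gpx:trkpt", NS):
        time_text = trkpt.find("gpx:time", NS).text
        timestamp = datetime.fromisoformat(time_text.replace("Z", "+00:00"))
        points.append((timestamp, float(trkpt.get("lat")), float(trkpt.get("lon"))))
    return points

# Assumed values for illustration: the GPX file name and the UTC time at which
# the eye-tracking video starts (read off the frame where the watch was started).
track = load_track("somerley_run.gpx")
video_start_utc = datetime.fromisoformat("2016-10-01T08:30:12+00:00")

for timestamp, lat, lon in track[:5]:
    video_seconds = (timestamp - video_start_utc).total_seconds()
    print(f"{video_seconds:7.1f} s into the video: {lat:.5f}, {lon:.5f}")
```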

The next post will look at doing something similar, using TEA Captiv as a more scientific option, with multiple video feeds and more data options.

The end result:

Notes:

  1. It is worth taking 5 minutes here to make sure you have free movement before starting, otherwise turning too sharply could lead to disconnection or discomfort. Because I used the wireless version, once I was wired up and adjusted I didn’t need to touch the system again until I was retrieving data.
  2. Other wearable/head-mounted eyetrackers have issues when we start looking in different planes. Their calibration model is typically a one-dimensional transform that is adequate when looking in the calibrated plane, but the calibration suffers when looking in a different plane. For example, if we calibrate on a vertical wall (looking at the corners of a picture) and then place that picture flat on the desktop, we will see trapezoidal offsets; the same is true if we calibrate in a horizontal plane (desk) and look at a vertical target (wall). The result is that if we are not cognisant of this and take the distorted (erroneous) data at face value, we risk producing worthless results.
  3. There is a sync port on the Tobii that can send/receive an LVTTL pulse to/from an external device, however the Garmin watch isn’t designed for this so we chose to keep it simple with a video based sync.
  4. Garmin data formats: I have always used the GPX download to import into VIRB Edit; the FIT data caused a few anomalies, specifically with time and GPS origin. The FIT file has all of the data recorded in Garmin Connect, the GPX has less, but there was still enough for this example.

Neuro-Tools : GSR October 24, 2016

Posted by eyetrackrob in Biometric, Captiv, eye tracking, Glasses, Market Research, neuromarketing, TEA, Tobii, Uncategorized.
3 comments

As mentioned in my first introduction to this blog, the central nervous system is divided into different branches which monitor and control different body functions. One of these branches, the sympathetic nervous system (SNS), is responsible for quick fight-or-flight reactions. It constantly assesses the surroundings, scanning for situations that could potentially be dangerous, and prepares an adequate fight-or-flight response. These preparations can be measured throughout the body and include changes in heart rate, respiration and the level of sweat on the hands and feet.

As we start to understand that these non-conscious reactions are strongly and inseparably tied to decision making processes and thus human behaviour, more and more researchers have become interested in using tools to measure these reactions.

In my first post a few weeks ago I wrote about the general rise of Neuro-Tools and mentioned eyetracking, EEG, facial expression analysis, GSR, heart rate and respiration as well as Implicit Association Tests as examples. This series aims to go through these tools one by one and review what they measure, how they work and, of course, where we run into their limitations. With the general objective of giving you a perspective on how these tools can become a valuable addition to your research, I’d like to continue the series by looking at GSR today. Initially I thought of covering GSR, heart rate and respiration in one post, as they could easily be summarized as “biometrics” or “biofeedback measurements”, but it turned out to be quite a long post, so I’ll split them into individual posts.

Enough of the introductions! Let’s dig into the exciting world of biometrics starting with:

Galvanic Skin Response

GSR isn’t simply about measuring sweat – there is an awful lot more to it than that – so before offering some general advice on what to look out for when considering GSR, I would like to explain the basics of this tool.

Electrodermal Activity (EDA), Skin Conductance (SC) and Galvanic Skin Response (GSR) all refer to the ability of the skin to conduct electricity, which changes with the activity of the sweat glands and thus the secretion of sweat. Those changes are closely related to psychological processes and can be triggered by emotional stimulation. Conductance is measured by applying an external, unnoticeable current of constant voltage: with more moisture on the skin, electrical resistance decreases and skin conductance increases measurably, even though the sweat might not be visible to the eye.

Skin conductance can be divided into tonic and phasic activity. The tonic level of conductivity changes constantly within each individual respondent, depending on their hydration, skin dryness and autonomic regulation in response to environmental factors such as temperature. Phasic responses, in turn, are short-term peaks in GSR reflecting reactions of the SNS to emotionally arousing events, largely independent of the tonic level. Most of the time we will be looking at these phasic reactions, which occur in the eccrine sweat glands.

GSR data is measured in microsiemens (μS) and the relevant phasic reactions can be quantified and analysed in different ways. Apart from the number of peaks occurring within a certain period after stimulus onset, the peak amplitude, the time to reach the peak value and the recovery time can all be used for analysis. GSR can be used to determine the strength of arousal but not the valence (like or dislike) of a reaction.
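To make these metrics concrete, here is a minimal sketch (hypothetical 1 Hz samples, far cruder than what TEA Captiv does) that estimates a slowly drifting tonic level with a moving average and then counts the phasic peaks that cross a threshold:

```python
# Minimal GSR sketch: separate a drifting tonic level from phasic peaks.
# Hypothetical 1 Hz samples in microsiemens; real tools use far better filters.
gsr = [2.0, 2.1, 2.1, 2.2, 2.9, 3.4, 2.8, 2.3, 2.3, 2.4, 2.4, 3.1, 3.6, 2.9, 2.5]

def moving_average(samples, window=5):
    """Crude centred moving average, used here as a stand-in tonic estimate."""
    half = window // 2
    averaged = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - half):i + half + 1]
        averaged.append(sum(chunk) / len(chunk))
    return averaged

tonic = moving_average(gsr)                    # slow drift
phasic = [s - t for s, t in zip(gsr, tonic)]   # short-term reactions

THRESHOLD = 0.3  # microsiemens above the tonic level (arbitrary for this sketch)
peaks = [i for i in range(1, len(phasic) - 1)
         if phasic[i] > THRESHOLD
         and phasic[i] >= phasic[i - 1] and phasic[i] >= phasic[i + 1]]

for i in peaks:
    print(f"phasic peak at t={i}s, amplitude {phasic[i]:.2f} uS")
```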


Image 1 is an example of data including tonic and phasic activity.

 

The density of sweat glands varies across the body, being highest on the head, the palms and fingers, and the soles of the feet. Most GSR tools are therefore built to be used on the fingers, where the reaction is strongest. However, some instruments on the market measure the change in sweat levels on the wrist, which often results in poorer data quality but might be necessary for experiments where the hands are needed to interact with objects (i.e. holding mobile devices/products or typing).

 


 

Image 2 shows eccrine sweat gland concentration. Red areas indicate a high concentration of eccrine sweat glands (glands/cm²), allowing sympathetic arousal of low intensity and minimal duration to be measured. Green zones indicate a low concentration of the relevant sweat glands, able to register only events of high intensity (for example on the wrist). (N. Taylor & C. Machado-Moreira, 2013)

 

Depending on the manufacturer and the kind of system used, sensors can be adhesive electrode pads that come pre-filled with conductive gel in order to reduce preparation time and avoid electrode movement. Conductive gel is not mandatory but can improve data quality and ensure a good, stable electrical connection. Many manufacturers of GSR devices for use on fingers and toes provide Velcro straps to hold the electrodes firmly in place. In any case, excessive respiration, movement and talking should be avoided, as these can cause noise in the data or variations in the signal that can be misinterpreted.


Image 3 shows a classic sensor (TEA T-Sens GSR) that is placed on the fingertips and adjusted with Velcro straps, next to an Empatica E4 wristband.

 

As written in the introduction, reaction times and strength are highly individual and therefore distinct for each participant; they can vary between 400 milliseconds and 5 seconds after a stimulus is presented. In a controlled lab environment a calibration procedure can help to understand individual differences in reactions, but it might not always be necessary. It is not advisable to use GSR in areas where many low- and high-impact events can occur uncontrolled at any time and can be mixed with all kinds of artifacts, as it might be complex, if not impossible, to relate an emotional arousal peak to a specific event.
If free movement is a requirement (for example in shopper research) it is highly recommended to calibrate the GSR reaction time and strength for each participant and to complement the GSR measure with a synchronized video and sound feed – ideally even with eyetracking – to understand the source of the arousing events. The synchronization of several feeds can sometimes be a challenge, but there are solutions that allow either live synchronization or post-recording synchronization.
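One way to picture that back-mapping is the rough sketch below; the event log, peak time and latency window are assumptions for illustration, with per-participant latencies coming from your calibration. Note how two events remain plausible causes, which is exactly why uncontrolled environments are problematic.

```python
# Rough sketch: map a detected arousal peak back to the events that could
# have caused it, given a participant's calibrated latency range.
# All numbers are assumptions for illustration.
events = [  # (seconds into recording, label) from a synchronized video/gaze log
    (12.0, "enters aisle"),
    (14.5, "picks up product A"),
    (21.0, "sees price label"),
]

peak_time = 16.2            # seconds, detected GSR peak
latency_range = (0.4, 5.0)  # this participant's calibrated response window

candidates = [label for t, label in events
              if latency_range[0] <= peak_time - t <= latency_range[1]]
print(candidates)   # -> ['enters aisle', 'picks up product A']  (still ambiguous)
```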

 


Image 4 shows a synchronized recording of different sensors such as ECG, HR, HRV, respiration and cognitive workload together with eyetracking (top right) and an additional video stream (bottom right). The synchronization can be done, for example, using the QR code visible on the screen (top left), which marks a synchronization point in both the video and the sensor feed.

 


Image 5 shows a TEA T-Log, a small mobile device that emits a short flash of light that can be picked up by a camera or by the video of the Tobii Glasses, marking a visible event in the video and a sync point in the sensor recordings.
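Once the flash is visible in the video and marked in the sensor file, the alignment itself is just a constant offset. A minimal sketch with illustrative numbers:

```python
# Once the T-Log flash is visible in the video and marked in the sensor file,
# aligning the two streams is a constant offset (illustrative values only).
flash_in_video_s = 3.48     # time of the flash frame in the glasses video
flash_in_sensors_s = 0.95   # sync marker time in the sensor recording

offset = flash_in_video_s - flash_in_sensors_s

def sensor_time_to_video_time(t_sensor):
    """Shift a sensor timestamp onto the video timeline."""
    return t_sensor + offset

print(sensor_time_to_video_time(10.0))   # a GSR peak at 10.0 s -> 12.53 s in video
```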

 

How GSR raw data, filtered data and emotion detection work, all synchronized with eyetracking, can be seen in the following short video recorded from TEA Captiv. I also imported data from a wrist-worn GSR device, but the data was not usable, which is why I chose to minimize those curves in the software. As you can see in Image 2, the concentration of eccrine sweat glands on the wrist is low, which very often means a very noisy signal or no signal at all. To improve the signal quality it is recommended to reach a minimum level of tonic sweating, for example through some physical exercise. Although I did this (as you can indirectly and briefly see at the very beginning of the video), it wasn’t enough to make the measurement from the wrist usable. For this type of study (researching and improving the emotional and visual impact of TV commercials) I would usually recommend a remote eyetracker such as the Tobii X2-60 together with sensors worn on the fingers (T-Sens GSR or similar); however, I also wanted to show that it can easily be done with a mobile eyetracker if needed, as shown below:

 

For comparison, you can also watch a video of a similar test (same commercials) using a remote eyetracker as mentioned above. You’ll notice similarities in the general gaze data as well as in the arousal detection, although you might also notice that each participant has a slightly different reaction time and that the emotional threshold influences how many emotional moments each person experiences:

 

There is still a bit more to know about GSR, and we at Acuity do offer training on methodologies, technology and best practices for your research. To give you a head start on some of the things to consider, have a think about these 4 questions and then maybe give us a call:

  1. Where will the data collection happen? Do you need to be completely mobile, or will it be a controlled environment close to a computer? If you go mobile, can you carry a small device to record the data or does the GSR device itself need to store the data?
  2. What type of sensor do you need? Is it a viable option to use sensors on the fingers, or will you need to use the hands to hold something or type for example?
  3. Do you know how to analyse the data? GSR raw data is rarely usable. Do you know how to remove the effects of tonic activity and artifacts, or do you need software that can do it for you and find the relevant events?
  4. Do you need to synchronize the data with other devices and do you want to accumulate data over several participants?

In the next post I’ll be covering heart rate and respiration to wrap up the more commonly used biofeedback tools before taking on EEG, facial expression analysis, Implicit association tests and others. Stay tuned!

 

Tobii Glasses 2 – Great for Human Performance Studies, Indoors or Out! October 4, 2016

Posted by Jon Ward in Biometric, eye tracking, Glasses, TEA, Technology, Tobii, Uncategorized.
add a comment

One of Acuity’s directors, Scott, likes to run around in the forest early in the morning and do some circuit training, so we thought this would be an ideal opportunity to test the Tobii Glasses alongside his fitness tracker and GPS watch, which lets us overlay his positional data, heart rate, speed and distance travelled on the eye tracking video output. This gives a researcher fantastic insight into participant performance during sports research, medical or clinical trials or military-type studies, and it is really simple to integrate. What really stood out for us was how well the Tobii Glasses 2 performed across a wide range of lighting conditions, movement and physical activity, and how resilient they stayed to the moisture in the air and the beads of sweat from Scott’s forehead!

By using its unique four-camera eyetracking system the Tobii Glasses can compensate for the slippage which occurs under normal use – and was even more extreme under these test conditions – and remain accurate, as you can see from the video! Even with the low sun coming up through the trees, the Tobii Glasses’ full HD scene camera worked fantastically and the eyetracking data remained solid, robust and accurate.

Don’t just take my word for it – have a look for yourself and if you want to discuss the Tobii Glasses, our range of biometric options or anything else then please don’t hesitate to get in touch via sales@acuity-ets.com or on +44 1189 000795.

Tobii Glasses 2 – 50hz and MEMS Sensor Update April 27, 2015

Posted by Jon Ward in Uncategorized.
add a comment

The Tobii Glasses 2 eyetracker has been an amazing success, with its unobtrusive design, wide field of view for tracking and slippage compensation due to the unique 4 camera tracking system – but now the system gets even better.


Firstly, the system will be updated in the next firmware release to track at 50hz on each eye camera. This benefit applies to all customers, new and existing, and will arrive via a standard firmware update – no need to return the system to us! As a quick note, if you are currently capturing data at 30hz and require consistency within your study, don’t update your glasses just yet, as the system cannot be downgraded after the update.

The second update is the activation of the MEMS sensors – the accelerometer and gyroscope in the device will now collect data alongside your eye tracking data. This unique innovation has many potential uses in your research and we are excited to see how people integrate it into their studies.

For those that want to know more about the update there is a webinar tomorrow (sorry for the short notice!) which you can see the details of here : http://www.tobii.com/en/eye-tracking-research/global/about-tobii-pro/event-calendar/tobii-events/tobii-glasses-2-webinar-join-us-for-tobii-pro-glasses-2-50-hz-mems-better-than-ever/ or of course feel free to give us a call!

 


Guest Blog Post – Cyber Duck Share Their Eye Tracking Experiences… August 27, 2013

Posted by Jon Ward in Uncategorized.
add a comment

Our friends over at Cyber-Duck (http://www.cyber-duck.co.uk/) have been Tobii users for quite a few years now and often speak at conferences and at their own events about the way they integrate eye tracking into their user testing. Below, Matthew shares a few of their thoughts on how to get the most from eye tracking and user testing…

10 ways to improve your web eye tracking studies

Web eye tracking technology offers valuable insights into how users behave when navigating a website or application. At Cyber-Duck we have been conducting our own in-house eye tracking studies since 2008, enhancing our usability testing processes and ultimately bettering our offering as user experience (UX) specialists. We use a Tobii T60 eye tracker hooked up to a Dell Precision M4500 laptop with Tobii Studio installed. Whilst your eye tracking tools and approach will inevitably differ from our own, the tips in this article will be just as valid for your business as they are for ours.

Best Practices

1.    Find appropriate recruits

The best kind of participants for any usability study are real users, current or future.

Whilst a client may wish to recruit participants from within their organisation for various reasons, it is important that they understand this is only appropriate if the project being tested is an internal tool for members of the organisation to use. However, if the project is a public-facing website or application, testing staff members and stakeholders will not be representative of real end users, and this will limit the effectiveness of the study.

It is also important to ensure diversity within test participants. If all recruits are sourced from the same organisation or profession, it is likely that they share similar user traits, which when tested could be detrimental to the findings of your studies. These similarities could result in some issues not being detected, or assumptions being made about a website based on what could be a minority of user behaviour. By testing a diverse pool which includes likely user groups (age, demographics and profession) you eliminate the risk of producing misleading data.

 

2.    Test up to five participants

Eye tracking tests do not need to be conducted on a vast number of participants. In most cases, provided the test has been well designed, the first five participants will identify around 80% of usability issues. The first few participants will surface most issues with the website or application; after that, the major issues have been identified and any new issues that arise become less frequent and usually have less of a negative impact on user experience.
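This rule of thumb follows from the widely cited problem-discovery model often attributed to Nielsen and Landauer, in which each participant uncovers a fixed proportion of the remaining issues. A quick sketch with the commonly quoted per-participant rate of about 31% (an assumption; real rates vary by study) lands close to the 80% figure:

```python
# Problem-discovery estimate behind the "five users" rule of thumb.
# per_user_rate = 0.31 is a commonly quoted average, used here as an assumption.
def proportion_found(n_participants, per_user_rate=0.31):
    return 1 - (1 - per_user_rate) ** n_participants

for n in (1, 3, 5, 10):
    print(n, f"{proportion_found(n):.0%}")
# 5 participants -> roughly 84%, consistent with the ~80% figure above.
```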

It is a good idea, if your budget allows, to consider implementing eye tracking tests throughout the project lifecycle. This allows for issues rectified following the first round of eye tracking to be tested and changes validated before the project goes live to the public.

 

3.    Test-run your equipment

It is essential to do a full check of all equipment before the client test. You should set up all equipment and test the eye-tracking device, the computer you will be using, as well as any software and peripherals you will be relying on well in advance of the testing date. This will allow for maintenance to be conducted if any of the kit needs servicing.

On the day of the test it is always a good idea to arrive early. You want to be able to set up all equipment and have plenty of time to conduct a run through of the test to iron out any issues before participants arrive.

 

4.    Ensure participants are relaxed

Most participants will have never taken part in eye tracking testing before, therefore it is important to ensure they are at ease before you start the test.

You should explain to them before you start that it is not them being tested but rather the system, so by making mistakes they are actually helping you to find issues with the project. Also make it clear that you are there as an observer and not to assist them; this should prevent them from breaking their gaze from the screen to seek assistance.

Keep task descriptions brief and simple, and refer to specific directions in a slightly abstracted manner to avoid inadvertent clues on how to accomplish the tasks. For example, if as part of the test the participant needs to sign in to their account and the button is labelled “Sign in”, you could ask them to “log in” to avoid giving away too much of a clue.

It is important to ensure that the testing environment is suitable. Ideally, the participant will forget their surroundings and their observer due to their concentration on the task at hand. If possible, it is advisable to have a dedicated testing lab which promotes the optimum environment for testing. However, if you have to test on location, here are some tips on how to set up the best field testing environment:

–       Make sure the room is quiet

–       Ensure there is an area close to the testing room for participants to wait.

–       Place notices on the door to the testing room stating where participants should wait and that they should not disturb the testing environment.

–       Have no more than two observers in the room with the participant and, if possible, have them seated out of the participant’s sight range.

–       Do not speak and be as quiet as possible during the test.

 

5.    Printed instructions

It is important that the participant has printed instructions for their task available to them. Whilst you should verbally introduce the task before the test and have on-screen instructions at the start, printed instructions ensure the participant has a constant reference throughout. It also means that participants won’t have to seek advice or help from you to complete tasks.


6.    Use real information

Encourage participants to use their real personal information when completing web forms in the test. This makes the way the user completes the form more natural and therefore more useful when making design decisions. Dummy details corrupt the test slightly as they aren’t a true measure of how long a form takes to complete (all they really test is the participant’s ability to copy information).

When participants use their own details it is far easier to identify problems with the website or application. Individuals differ in the way they enter certain data into web forms, such as telephone numbers. For example, if the dummy data presents a telephone number with no spaces, you may overlook an issue that prevents users from entering telephone numbers with spaces. Real data can help identify these kinds of web form issues. The test should consider how the system expects to receive information, how the user interprets this, and how easily the system can handle alternative formats of data.
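For instance, whether the back end normalises alternative formats or rejects them outright is exactly the kind of thing real data surfaces. A trivial sketch of the forgiving approach (illustrative only):

```python
# Trivial sketch of accepting alternate phone-number formats instead of
# rejecting them: strip spaces, dashes and brackets before validating.
import re

def normalise_phone(raw):
    return re.sub(r"[\s\-\(\)]", "", raw)

for entry in ("0118 900 0795", "01189000795", "(0118) 900-0795"):
    print(normalise_phone(entry))   # all three become the same string
```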

Some users may be uncomfortable providing their genuine data. You should ensure you have consent forms ready and that these explain clearly how the participant’s data will be used and assuring them that their data will remain private and is only being used for test purposes. Ensure you have dummy data prepared in case any participants do refuse to use genuine data.

7.    Take detailed notes

Eye-tracking is an extremely valuable way of collecting data about your users. However, the eye tracker will only provide data and information, which needs to be analysed and interpreted by the test initiator. The eye-tracker is unable to provide human insights regarding the data.

This is why it is essential to take detailed and comprehensive notes whilst the tests are being conducted. This enables the tester to record their own insights from observing the tests, such as what aspect of a task caused the participant to pause or hesitate. Notes should also record the participants’ personal details, such as skill level and affinity with using computers. As an observer you will also be noticing patterns in user behaviour which you can only record manually.

It is a good idea to conduct a short survey at the start of the testing session on each participant. This can help you to gain slightly more information about their skill levels and confidence using similar systems, as well as age and English language capabilities. These are all factors which can affect usability, and it is good to have these in mind when assessing participant’s results.

 

8.    Verbal questions

It is a good idea to follow the test with some verbal questions. The participant will be able to provide valuable qualitative information regarding the product being tested whilst it is fresh in their memory.

Importantly, phrase your questions in a manner which avoids suggesting particular answers or making assumptions about the answer. This has two benefits. Firstly, it encourages the participant to describe their experience instead of giving a yes or no answer; this feedback is often more valuable. Secondly, it stops you from suggesting answers to the participant. If a participant is unsure about what is expected of them, they may answer with what they think you want to hear.

 

9.    Present your findings clearly

Ensure that your findings are presented in a clear and accessible way. The client most probably will not have an in-depth knowledge of the terminology associated with user experience and eye tracking. Ways you can address this are to:

–       Include a glossary of any technical terms you include in your notes or recommendations. This ensures content clarity for the client.

–       Include your interpretations of visuals such as gaze plots and heatmaps. The client may not understand the importance or significance of these unless it is explained.

 

10.  Bring a designer’s perspective to testing

In the same way that you would bring creativity, attention to detail and empathy for participants into designing the product, you should apply these principles to your test. This will ensure a well-designed test and intelligent analysis of your results. This critical analysis will inevitably lead you to stronger solutions.

The New Tobii X2 Eye Tracker – The Smallest And Most Flexible Eye Tracker On The Market! February 11, 2013

Posted by Natasha French in Advertising, eye tracking, Market Research, Marketing, Media, Shopper Research, Technology, Tobii, Uncategorized, Updates, Usability & UX.
add a comment

Acuity are proud to present the new Tobii X2 eye tracker – a ground breaking development in delivering the smallest and most flexible eye tracker on the market!


The Tobii X2-30 Eye Tracker (available in Compact Edition and Wide Edition) is a revolutionary small eye tracking system, powered by the latest generation in innovative eye technology from Tobii.

The Tobii X2 family comprises eye tracking systems at 30 and 60 Hz. The X2 can easily be clipped onto a laptop, a PC monitor or even a tablet for a compact setup, and is our most portable system yet!

Research anywhere – Small footprint accommodates truly portable solutions and enables expansion of eye tracking from the lab to real-life environments.

Supreme efficiency – Ease of set-up and operation paired with very robust participant tracking allows for cost-efficient studies.

Trust your data – Unparalleled tracking accuracy within a revolutionary large head movement box ensures reliable and valid research results.

Choose between the Compact Edition and the Wide Edition – depending on your specific study context!

The Compact Edition is a smaller version of the eye tracker, measuring 184 mm (7.3’’) in length. You can use it as your portable lab or for studies that require a small eye tracker to track what participants see on:

  • Laptops and smaller PC monitors up to approx. 22’’
  • Tablets and mobile phones (dedicated mobile device accessories will be available soon)
  • Small real-world interfaces

The Wide Edition is designed for studies that require larger gaze angles (up to 37°) and enables studies that involve larger stimuli, being able to track interfaces such as:

  •  PC monitors up to approx. 27’’
  •  TV
  • Projections and simulators
  • Large real-world interfaces

Acuity are offering both rental and purchase options. As always, for more information please contact the Acuity team at sales@acuity-ets.com or (0)1189000795!

Why 2013 Is Going To Be Even Better For Our Clients…. February 5, 2013

Posted by Natasha French in eye tracking, Market Research, Shopper Research, Technology, Tobii, Uncategorized, Updates, Usability & UX.
add a comment

It’s been a busy couple of years with exciting developments here at Acuity. Key to this was the launch of Acuity Intelligence just over a year ago and the addition of Technical Director Dr Tim Holmes. I’ve had a few clients contact me, curious to find out the difference between the two companies: Acuity ETS and Acuity Intelligence (ETS and AI for short!). Reflecting on 2012 and what it held for the group, I thought the New Year might be a good time to clarify the difference between both and explain why 2013 is going to be even better for our clients.

Acuity ETS

Most of you reading this will already be aware of Acuity ETS. You’ll know our Directors and hopefully come to us when you have an eye tracking sale / rental requirement. Alongside this, ETS also offer training and support, giving you the tools and ‘know how’ to complete your own independent research effectively and use scientific technology to answer difficult commercial questions.

Now, if any of you know our Directors well, you’ll also be very aware that they are technology geeks (this enthusiasm is a must for anyone who joins the Acuity team!). With a (slightly obsessive) thirst for new tech, it wasn’t long before the guys found other tools that could supplement gaze data. Wanting to offer these to clients too, the idea of Acuity Intelligence was born!

Acuity Intelligence

Customer tracking and counting technology, facial and emotion recognition solutions, non-intrusive biometric sensors, browser-based eye tracking – to name but a few. The supplementary offering expanded quickly, along with the AI team.

The end of 2011 welcomed a new company Director and Vision Scientist, Dr Tim Holmes. Actively involved in scientific research as a university lecturer, Tim was the catalyst for AI offering additional research and consultancy services based on real science. These services include education about why technologies such as eye tracking and galvanic skin response are important measures of consumer attention and emotional engagement; the design of rigorous and innovative research projects that maximise the value of client investment by using the right combination of methodology and technologies to answer client questions; and innovative analysis and presentation of results that are scientifically accurate, replicable and focused on the meaning of the data rather than just the numbers themselves. With Matlab and E-Prime consultancy offerings, AI also supports experiment development and data analysis in a number of universities in the UK and beyond, and is actively involved in cutting-edge research at several institutions. In addition to all this, AI now has product integrators and software developers in house who are feverishly beavering away on some truly unique off-the-shelf solutions as well as bespoke solutions tailored to the needs of specific customers!

Working together for you in 2013

Both sister companies offer something very different, but they work together seamlessly on your behalf to provide a more complete research solution for 2013: full equipment rental, technical support and analysis, or simply the technology and knowledge that will enable you to complete your own independent research well. We hope that by extending our offering we can provide a more flexible way to make use of new technologies that add real scientific value to your research.