Neuro-Tools : Heart Rate & Respiration November 21, 2016Posted by eyetrackrob in Biometric, Captiv, eye tracking, Market Research, Marketing, neuromarketing, TEA, Technology, Uncategorized.
Although not as quickly as I had planned, step by step I’ll be covering the most relevant biofeedback sensors in this blog series. So far I’ve only managed to write about GSR, the sensor of the hour! Galvanic Skin Response has been around for a long time and in recent years it has gained a lot of attention from researchers, but as you might have read in my last post, although it deserves all the attention it gets, it’s not always simple to use.
Other measurements mentioned before that can tell you more about emotions or cognitive workload are respiration and heart rate, and from the latter it is also possible to calculate heart rate variability (HRV).
Heart Rate (HR) is the frequency of complete heartbeats within a specific time window, typically expressed in beats per minute (bpm). HR is constantly and antagonistically influenced by the sympathetic nervous system (SNS) and the parasympathetic nervous system (PsNS), and in general heart rate, similar to GSR, unfolds rather slowly; with peak effects observed after about 4 seconds and a return to baseline after about 20 seconds, it is in fact much slower than GSR. Heart Rate Variability (HRV), on the other hand, expresses the quick variations in frequency between individual heartbeats. The time between beats is measured in milliseconds (ms) and is called an “R-R interval” or “inter-beat interval” (IBI).
Image 1: A typical heart rhythm as recorded by an electrocardiogram (ECG). You can see the heart rate as well as the differences in the inter-beat intervals.
Both measurements (HR and HRV) are closely related to emotional arousal, with HRV allowing the assessment of more sensitive and quicker changes, which can also be related to stress and cognitive workload (a good topic for a follow-up post).
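To make these measures concrete, here is a minimal Python sketch computing mean HR and RMSSD (a common time-domain HRV metric) from a list of inter-beat intervals. The IBI values are made up for illustration and the formulas are the standard textbook ones, not any particular device’s algorithm:

```python
# Compute heart rate and a simple HRV metric (RMSSD) from
# inter-beat intervals (IBIs) in milliseconds. Illustrative values only.
ibis_ms = [810, 795, 830, 805, 790, 820, 800]

# Mean HR in beats per minute: 60,000 ms per minute / mean IBI.
mean_ibi = sum(ibis_ms) / len(ibis_ms)
hr_bpm = 60_000 / mean_ibi

# RMSSD: root mean square of successive differences between IBIs,
# sensitive to the quick beat-to-beat changes HRV is meant to capture.
diffs = [b - a for a, b in zip(ibis_ms, ibis_ms[1:])]
rmssd = (sum(d * d for d in diffs) / len(diffs)) ** 0.5

print(f"Mean HR: {hr_bpm:.1f} bpm, RMSSD: {rmssd:.1f} ms")
```

Note how a mean HR of around 74 bpm hides the beat-to-beat variation that RMSSD makes visible.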
While many fitness devices today measure heart rate in the context of fitness and well-being, those solutions might not be ideal for your research. One of the reasons for this is the processing and averaging of data that goes on inside the sensor.
Image 2: The same recording as an averaged data export (blue) and as it was displayed live during the recording (orange). The data was recorded with a wrist-worn device measuring HR optically, using light. In the averaged data the highest heart rate is around 100 bpm. In the live stream the same time frame shows much more variability (still averaging around 100 bpm), and it is clearly visible that this is not the highest value of the recording.
As mentioned above, heart rate has a relatively low sensitivity and a slow response. Many wearable fitness trackers don’t let you export the data for further analysis, or give access only to averaged data in which quick spikes have been eliminated as noise. The result of this preprocessing is that the effects of emotion might be lost altogether. To compute HRV correctly, on the other hand, continuous and precise measurements must be guaranteed: just 2-3 missed data points can mean inaccurate calculations of the times between beats, and thus again missing relevant events.
Image 3: In the live visualization the highest heart rate reaches 145 bpm. However, the suspiciously round shape leading up to the peak value indicates that data points are missing and the data was interpolated. This becomes clear when looking at the averaged data. This data would not be suited to interpreting HR or HRV.
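To illustrate why a few missed beats matter so much: if a beat detector skips one beat, two adjacent ~800 ms intervals merge into a single ~1600 ms interval, which badly inflates difference-based HRV metrics. A simple plausibility screen can flag such intervals before HRV is computed. The values and thresholds below are illustrative, not a clinical standard:

```python
# One missed beat merges two ~800 ms inter-beat intervals into ~1600 ms.
clean = [800, 810, 795, 805, 800]
with_dropout = [800, 810, 1600, 800]  # third beat was missed by the detector

def suspicious(ibis_ms, lo=300, hi=1300):
    """Flag intervals outside a plausible range (roughly 46-200 bpm)."""
    return [x for x in ibis_ms if not lo <= x <= hi]

print(suspicious(clean))         # no artefacts flagged
print(suspicious(with_dropout))  # the merged 1600 ms interval is flagged
```

Flagged intervals would then be excluded or corrected rather than fed straight into an HRV calculation.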
Another reason why many heart rate trackers made for fitness purposes are not necessarily suitable for researchers is that most of them are worn on the wrist and use light to measure blood flow, deriving the heart rate from that. Compared to sensors that are placed close to the heart and measure the heart’s electrical impulses (electrocardiogram/ECG), sensors on the wrist have to compensate for movement, muscle tension, sweating and potentially light interference. ECG sensors are therefore the recommended tool for research data collection, as they pick up the electrical signal directly and are far less affected by these artefacts.
Image 4: ECG Sensor as belt or as electrodes
Research has associated respiration rate and depth with emotional impact and emotional valence. Interestingly, olfactory information ascends directly to limbic areas and, unlike other sensory input, is not relayed through the thalamus. The thalamus is a part of the brain that acts as a relay and pre-processing station for sensory information and is considered relevant for regulating consciousness, arousal, wakefulness and alertness. Because olfactory information bypasses this part of the brain, a different mechanism makes it conscious, leading to a quicker physiological response and an unconscious alteration of the respiratory pattern. Respiration patterns therefore make it possible to identify potentially unconscious liking or disliking, as well as arousal. Deducing a specific emotion from respiration rate and depth does not seem to be possible, although more research is still needed in this area.
Respiration measurements can be obtained from dedicated clinical instruments or stretch-sensitive respiration belts, or they can be calculated from ECG data, the latter being the least invasive option for commercial research.
Figure 6. Stretch Sensitive Respiration Belt
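One common way respiration can be estimated from ECG is so-called ECG-derived respiration (EDR): breathing slightly modulates the amplitude of successive R peaks, so a slow oscillation in R-peak amplitude tracks the respiratory cycle. The toy sketch below illustrates the idea with synthetic amplitudes; real EDR pipelines (and whatever a commercial package does internally) are considerably more involved:

```python
import math

# Simulated R-peak amplitudes for 30 beats: a baseline plus a slow
# sinusoidal modulation (period of 10 beats) standing in for breathing.
amps = [1.0 + 0.1 * math.sin(2 * math.pi * (i + 0.5) / 10) for i in range(30)]

# Count upward zero-crossings of the demeaned amplitude series; each
# marks the start of a new respiratory cycle (cycles cut off at the
# edges of the window are missed).
mean_amp = sum(amps) / len(amps)
centered = [a - mean_amp for a in amps]
cycles = sum(1 for a, b in zip(centered, centered[1:]) if a < 0 <= b)
print(f"Respiratory cycle onsets detected across these beats: {cycles}")
```

Dividing such a count by the duration of the window would give a rough breaths-per-minute estimate.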
ECG data can be processed in TEA Captiv to obtain HR, HRV and even respiration rate, and as with GSR, all of these measurements can be synchronized with eyetracking to understand which visual information influenced a rise in HR, a change in HRV or an alteration of the respiration pattern.
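As a sketch of what that synchronization means in practice, the hypothetical helper below aligns an eye tracking event to the nearest sample of a heart rate stream recorded on the same clock. The names, timestamps and values are invented for illustration; Captiv handles this alignment internally:

```python
import bisect

# (timestamp_ms, bpm) samples from a heart rate stream; illustrative values.
hr_samples = [(0, 72), (1000, 74), (2000, 90), (3000, 88)]
hr_times = [t for t, _ in hr_samples]

def hr_at(ts_ms):
    """Return the HR sample nearest in time to an eye tracking event."""
    i = bisect.bisect_left(hr_times, ts_ms)
    if i == 0:
        return hr_samples[0][1]
    if i == len(hr_times):
        return hr_samples[-1][1]
    before, after = hr_samples[i - 1], hr_samples[i]
    return before[1] if ts_ms - before[0] <= after[0] - ts_ms else after[1]

# A fixation logged at t = 1900 ms coincides with the rise to 90 bpm.
print(hr_at(1900))
```

The same nearest-sample lookup works for GSR, HRV or respiration streams, as long as all sensors share a common clock.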
In my next post I’ll take a look at how all these measurements can be combined, and whether through such a combination it is possible not only to detect emotional events but also to understand whether an emotion is positive or negative, and even which specific emotion it is. So, watch this space for more!
Copy Cat Brands – Who is Trying to Steal Your Attention? October 31, 2016Posted by Jon Ward in Advertising, eye tracking, Market Research, Marketing, neuromarketing, Shopper Research.
Tim from Acuity has recently been speaking at a conference in Peru, where he presented some of the exciting findings from our parasitic brands research last year. Using the world-leading facilities at the GSK SSL and in partnership with the British Brands Group, we tested people’s recognition of famous brands and their not-so-famous imposters under a variety of conditions. Have a watch of the video below, and maybe head over to the Acuity Intelligence website to read more about the study here: http://www.acuity-intelligence.com/blog/statute-of-imitations
3D – The Key to Tobii’s Performance Lead October 17, 2016Posted by Scott Hodgins in eye tracking, Glasses, Market Research, Marketing, Media, neuromarketing, Shopper Research, Technology, Tips And Tricks, Tobii, Updates, Usability & UX.
Tags: eye tracking, eyetracking, Marketing, research, smi, Technology, Tobii
This post tries to answer some of the most common questions we get asked: Why should I buy a Tobii? Why is it better? System “X” has a “better head box” and system “Y” is cheaper.
The answer, from our point of view, is simple: the eyetracking is more accurate than with other systems, for more people, over a longer timeframe.
That is a pretty grand claim, so why are we so confident?
Let’s start at the beginning. Eyetracking itself is straightforward: there are several well-documented methods to find and follow the pupil, and Tobii uses a non-intrusive, video-based technique called “Pupil Centre Corneal Reflection” (PCCR). Essentially, an IR illuminator is used to help differentiate between the pupil and the iris; it also creates a highlight, or glint, that is used as well. The Tobii systems use an improved version of this idea, the secret sauce being a combination of two things: illumination and data modelling. Together these allow the remote and wearable trackers to monitor the respondent’s relative 3D position in space and adjust the calibration parameters in the 3D physiological model, and therefore afford a far greater range of movement than similar systems while maintaining accuracy and precision.
(Figure below shows the native 3D data from the TG2)
Illumination: Tobii can use two different lighting techniques, known as bright and dark pupil, to optimise the illumination for the participant in that location; crucially, when they move, the illumination can be adapted to keep track of them. This allows a Tobii system to offer people greater freedom of movement while retaining tracking accuracy, without the need for constant drift correction by the system operator.
Data modelling: The Tobii method is different in that its research-grade eyetrackers have typically used multiple cameras, and have done so since the launch of the T and X series systems in 2007/8. The advantage of using multiple cameras is that the location of the eye in space can be described physically. That is to say, we know with a very high degree of accuracy where the centre of your eye is, and which eye it is, for every sample recorded. The slightly different images from the pair of cameras in an X2, for example, allow the creation of a 3D physiological model of the eyes being tracked during calibration. This approach allows Tobii to understand the movement of the eye, or of the eyetracker should one or the other move, and to adjust the calibration accordingly with a high degree of precision.
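The depth part of such a 3D model can be illustrated with basic stereo geometry: two cameras a known distance apart see the pupil at slightly different horizontal image positions, and with a pinhole camera model the depth follows from similar triangles. The numbers below are invented for illustration, and this is textbook triangulation, not Tobii’s actual geometry or algorithm:

```python
# Stereo triangulation by similar triangles:
# depth = focal_length * baseline / disparity.
focal_px = 1000.0    # focal length expressed in pixels (illustrative)
baseline_mm = 50.0   # distance between the two cameras (illustrative)
x_left_px = 620.0    # pupil centre column in the left image
x_right_px = 540.0   # pupil centre column in the right image

disparity_px = x_left_px - x_right_px        # 80 px shift between views
depth_mm = focal_px * baseline_mm / disparity_px
print(f"Estimated eye distance from the tracker: {depth_mm:.0f} mm")
```

Knowing the eye’s position in space like this is what allows a calibration model to be re-evaluated as the head moves, rather than assuming a fixed viewing distance.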
The net result is that these systems can accommodate movement, even if the head leaves the area trackable by the hardware, and can recover tracking when the eyes are visible again. This is one of the reasons people keep choosing Tobii for demanding applications like infant research and in-vivo commercial research. In a recent study, Acuity Intelligence recruited 330 people as they entered supermarkets and didn’t have to turn away a single participant because they could not be tracked: a first for any data collection exercise with this number of people, regardless of the brand of technology used.
Don’t just take our word for it: please challenge us. Whether it is on screen, in the real world or in the emerging AR and VR application areas, we can help.
Where Is The Value From Eyetracking August 9, 2016Posted by Jon Ward in eye tracking, Glasses, Market Research, Marketing, Shopper Research, Technology, Tobii, Usability & UX.
Over the past decade we have seen eyetracking move out of research labs and academic institutes and begin to hit mainstream uses in markets such as gaming and operating system control, but if we look holistically across every possible use and application for this amazing technology, one question crops up quite regularly: “what can’t you eyetrack?”
Potentially this could prompt a simple answer in which we list some obvious limitations of eyetracking as a technology, but I think the bigger question is “will eyetracking add value to what I am doing?”, as it isn’t always obvious where the return on investment from eyetracking data lies.
There are actually very few situations where you can’t eyetrack people (or indeed some species of animal!). For example, Tobii equipment was recently used to eyetrack an F1 driver (https://www.youtube.com/watch?v=zjkUUMZnTnU), with the latest technology readily mounted inside the very snug, close-fitting helmet of Nico Hulkenberg. Staying with the sports theme, Zoe Wimshurst of Southampton Solent University used the Tobii Glasses on a gymnast who performed a number of backflips while the equipment not only stayed in place but also remained accurate, thanks to Tobii’s 4-camera binocular platform (https://twitter.com/ZoeWimshurst/status/760472938499936256). And while we are name-dropping, I eyetracked Cristiano Ronaldo some years ago (https://www.youtube.com/watch?v=2NcUkvIX6no) with a previous-generation platform. Sport is one thing, but what about something different, say primate research? Yup, we can tick that box as well, for example with some trials we did with Edinburgh Zoo (http://www.living-links.org/2012/11/). The typical uses in consumer, shopper and online research, psychology, linguistics and infant research are all areas where eyetracking is heavily involved, and this is now developing into the virtual world, with systems mounted in VR and AR units for the next generation of these fields, not to mention interaction within gaming, both for user testing and for control applications (http://www.tobii.com/xperience/apps/the-division/).
So you might say “great PR, but what does that give us apart from a YouTube video?” Well, let’s look at some examples, starting with F1: if by watching the eye movements and point of gaze of an F1 driver we can shave 0.1 seconds per lap from a 58-lap race, we gain almost 6 seconds. In the Australian GP, 7 of the 16 drivers that finished could have gained a position with this advantage, and one driver could have jumped 4 places, gaining 7 points in the drivers’ championship in the process. For a footballer, releasing the ball 0.25 seconds earlier, because visual performance training has given you the ability to read the field more efficiently, could be the difference between beating the offside trap and scoring, or dropping points in a multi-billion-pound race to the title. In elite performance the smallest of margins can mean winning or losing, and in today’s environment that could mean the difference between fame and fortune and fading into obscurity.
If we look at medical or clinical uses, being able to identify conditions like autism at an earlier stage (using non-verbal responses measured through eye movements) allows parents and clinicians to adapt and plan a child’s education to minimise the impact on their development, and lets the family be more prepared moving forward. Building up databases of typically and non-typically developing children from all walks of life, both in and out of the lab, allows milestones to be measured and new learning or rehabilitation techniques to be developed. Being able to extract information without the need for self-reporting or verbal communication breaks down barriers that might otherwise delay diagnosis by weeks, months or even years. And using the latest techniques for training, with both real and virtual presentation of scenarios, means we can now prepare healthcare professionals, surgeons and patients for situations that could be life-threatening, without the risk; understanding fully how they interact and engage gives us insights never before available.
When looking at process management, health and safety or manufacturing, there are always people in a workplace who are ‘naturals’ at what they do: they have either adapted to their task very comfortably and excelled or, more likely, have become expert through repetition and learning. Using eyetracking we can observe how these people operate and understand if and how they anticipate next steps, how they scan and search for elements, and what their situational awareness is like. Next we bring in the novice, or the person to improve, observe them, compare them to our experts and guide their interactions with a proven benchmark. An accident at work can be costly in both financial and possibly human terms, so use a simulator, VR environment or test area, monitor people’s actions and movements, and pre-empt possible bad situations. Does that forklift driver check either side of the load often enough? Why is one member of the QA team better at spotting defects in products: is their search strategy different? What makes one soldier better at finding ground disturbance in the field and locating IEDs? How can we be sure a mechanic checks every inch of an engine during a service and that a vehicle is safe to use?
Let’s think about consumer research, a mainstay of eyetracking and an ever-growing marketplace. With the adoption of mobile devices, on-screen real estate is smaller, we consume information more quickly, and we need to be more efficient at being noticed, getting our message across and, of course, helping the customer with their journey. A 1% increase in click-throughs, sign-ups or user experience could mean huge increases in a company’s KPIs, but selling ideas and changes to a stakeholder can often be challenging. Eyetracking provides a very visual way to demonstrate why customers aren’t (or indeed are!) doing what was expected on a website, image or menu system. Jumping into the retail space, we are bombarded with products, signage, offers, POS, noise, colour and a whole lot more every time we walk through a shop entrance, a mall or a petrol forecourt. Consumers’ self-reporting of their actions always has its limitations, and this is even more evident in a space as busy as a retail outlet. Our eyes are digesting heaps of information, our brain is processing and discarding things that aren’t pertinent to the task, and consumers simply can’t remember, never mind verbalise, all of this at the rate it happens. Unlock the subconscious by measuring the body’s leading input device: the visual system. Again, small performance gains at the checkout in one store quickly multiply into large increases across a brand, a retailer or globally. What distracts shoppers or draws their attention away from where we want them to look? What attracts them to our competitors? What elements do they use to navigate, make a decision or judge quality? Can people navigate around a virtual store before we invest in deploying the new layout?
Think about your project, objective or study. Is the interaction with the stimulus, product, environment or other people of interest? Do you want to know what visual information participants use, and when, at any stage of the trial to inform the decision-making process? Do you want to understand why someone is better at a task than someone else? Do you want a very visible way of demonstrating a participant’s behaviour to a stakeholder? If the answer to any of these questions (or many more like them) is yes, then there is value in eyetracking for you.
Speak to us about methodologies for your study, the different types of equipment on hand and how we can help you get the insights you need.
Tobii Glasses 2 Real World Mapping – Saving Time in the Real World! October 29, 2015Posted by Jon Ward in eye tracking, Glasses, Market Research, Marketing, Media, Shopper Research, Technology, Tobii, Updates, Usability & UX.
Tags: eye tracking, Tobii, upgrade
The Tobii Glasses 2 have been a huge success: with the wide-angle field of view, live wireless viewing and automatic slippage compensation thanks to the unique 4-camera binocular tracking, more and more people are able to do great fieldwork, both unassisted and in a more traditional context. However, as with all glasses-based eye tracking platforms, analysis of the data is more time-consuming than with a screen-based system, because you need to code the user data onto reference images to create aggregated visual outputs and statistical metrics. Depending on the type of interactions mapped (fixations or raw data), combined with the environment and task of the user (unboxing a product, retail purchases, driving and so on), coding can take anywhere from 10 to 20 times the duration of the recording, so 100 minutes of recorded interactions could take upwards of 16 hours to code before you can begin the analysis.
Tobii Real World Mapping can reduce this time considerably in many types of study by using advanced computer vision to automatically map the gaze points onto the reference images you upload, meaning that 10x or 20x multiplier comes down to as little as 2-5x. Processing can be queued, so the software can happily run along in the background, freeing up valuable staff resources to focus on other tasks. Once the automatic process is complete, you are presented with a chart plotting the automatically mapped points alongside a confidence level in their accuracy; any missed points (for example where there was a large amount of occlusion in the frame), or mapped points that need some adjustment, can then be manually corrected by a researcher.
This video gives a brief overview of this exciting feature, which will be part of the next update of the Tobii Glasses Analyzer software.
Of course, not every study will be able to take advantage of the new functionality: very dynamic content such as sports science studies has few or no fixed reference points to work with, and objects that are constantly and largely occluded, or at extreme distances, will not be viable. But for a large number of shopper studies, product interactions, mobile and tablet applications, advertising and navigation tasks, users should see significant time savings from using the tool. By running a pilot (see tip 28 on Tim’s Acuity Intelligence blog for a reminder of why this is important: http://www.acuity-intelligence.com/blog/eye-tracking-tips-26-30) you can quantify this benefit and also ensure that your reference images are correct and work well. More about that another time!
The Real World Mapping is ready to demonstrate to customers now – so get in touch and we will be happy to walk you through it, and show how it can help you process your glasses based eye tracking data quicker!
The New Tobii X2 Eye Tracker – The Smallest And Most Flexible Eye Tracker On The Market! February 11, 2013Posted by Natasha French in Advertising, eye tracking, Market Research, Marketing, Media, Shopper Research, Technology, Tobii, Uncategorized, Updates, Usability & UX.
Acuity are proud to present the new Tobii X2 eye tracker: a ground-breaking development delivering the smallest and most flexible eye tracker on the market!
The Tobii X2-30 Eye Tracker (available in Compact Edition and Wide Edition) is a revolutionary small eye tracking system, powered by the latest generation in innovative eye technology from Tobii.
The Tobii X2 family comprises eye tracking systems at 30 and 60 Hz. The X2 can easily be clipped onto a laptop, a PC monitor or even a tablet, making it our most compact and portable system yet!
Research anywhere – Small footprint accommodates truly portable solutions and enables expansion of eye tracking from lab to real-life environments.
Supreme efficiency – Ease of set up and operation paired with very robust participant tracking allow for cost efficient studies.
Trust your data – Unparalleled tracking accuracy within a revolutionary large head movement box ensures reliable and valid research results.
Choose between the Compact Edition and the Wide Edition – depending on your specific study context!
The Compact Edition is a smaller version of the eye tracker, measuring 184 mm (7.3’’) in length. You can use it as your portable lab or for studies that require a small eye tracker to track what participants see on:
- Laptops and smaller PC monitors up to approx. 22’’
- Tablets and mobile phones (dedicated mobile device accessories will be available soon)
- Small real-world interfaces
The Wide Edition is designed for studies that require larger gaze angles (up to 37°) and enables studies that involve larger stimuli, being able to track interfaces such as:
- PC monitors up to approx. 27’’
- Projections and simulators
- Large real-world interfaces
Acuity are offering both rental and purchase options. As always, for more information please contact the Acuity team at email@example.com or (0)1189000795!
UCD2012 Conference at Cavendish Square, London… See You There! November 8, 2012Posted by Natasha French in Advertising, eye tracking, Market Research, Marketing, Media, Technology, Uncategorized, Usability & UX.
Last weekend it was ‘The Ski and Snowboarding Show’, this weekend it’s UCD2012 Conference at Cavendish Square, London…. I suppose you could say we get around a bit!
So why UCD2012, and why Acuity? The show aims to give people the opportunity to enjoy real-world case studies, inspiring presentations and hands-on workshops focusing on user-centred design in the real world, offering opportunities for learning and sharing with like-minded people. As UCD2012 is a not-for-profit conference for the community (it has only been possible because the presenters donate their time!), Acuity felt it was important to show our support by attending the event and sharing our own experiences and knowledge on the day.
With that in mind, on Saturday 10th November Acuity will be holding interactive workshops on eye tracking and the other complementary technologies we offer, such as GSR, EEG and wireless biometrics. Please note that if you are attending, workshops are on a first come, first served basis, so getting there early is advisable!
In addition to this, our very own Jon Ward will be giving a talk on ‘Palm Reading for The 21st Century?’ (I’ll let the mystery with that continue until Saturday!) and with talks from our friends at Cyberduck, Amberlight, User Vision, NileHQ, Sapient Nitro and Foolproof – it’s set to be an interesting and informative weekend. With our Christmas night out planned for the evening, it could be eventful too.
UCD2012 starts on Friday 9th November at 9.00am and finishes on Saturday 10th November at 5.00pm. For information and event updates please go to http://www.ucd2012.org
Earlier this week Tobii launched another groundbreaking product – the X1 Light.
The X1 Light eye tracker is a truly portable lab that you can literally fit into your laptop bag and test in situ, wherever and whenever you want. With device stands for desktop, screen-based, laptop-based and real-world testing, this is a hugely flexible piece of equipment for market research, usability and more. There is plenty of information about the X1 Light on our website here: http://www.acuity-ets.com/products_x1-series.htm and you can also see an introduction video here: http://www.youtube.com/watch?v=_B1r3xuqDck
The X1 Light can be used with the free of charge Morae plug-in for usability studies or you can use it with Tobii Studio, the most popular eye tracking analysis software in the world!
For more information, specifications and pricing please don’t hesitate to contact firstname.lastname@example.org .
Tobii Studio 3.0 – What’s In The Update December 13, 2011Posted by Jon Ward in Advertising, eye tracking, Market Research, Marketing, Shopper Research, Studio, Technology, Tobii, Usability & UX.
Well, it is just over a week now since Studio 3.0 went live for Tobii users, and the feedback so far has been all positive. But some users on earlier versions, and other customers, have asked “So what’s new?”, so I am on hand to give you a quick rundown of the key functions and changes in 3.0…
General fixes: A number of stability and resource issues have been addressed, in line with updating the database type to accommodate the new data from the biggest new feature in 3.0: dynamic areas of interest.
New text export tool : For customers who like to play with the raw data and export data sets to packages such as SPSS, Matlab, R and so on there is a much more flexible and powerful text export tool. Not only do you have more data sets to choose from, you can also create templates of the reports you like and save them for future use – much as you can do with the fixation filter settings since the update in 2.3.
Remote viewer (Enterprise edition): The old Studio Remote Logger has been replaced with a sleeker, more stable remote viewer. Unfortunately the logger functionality has gone; however, it has been replaced with a solid, lightweight viewer that has much better buffering and stability, sending user cameras, screen data, gaze data and mouse interactions over a network.
Dynamic areas of interest : Areas of interest now have a life of their own! You can reshape, resize and move areas of interest to compensate or track movement in videos, web pages, Glasses recordings and more – the powerful tools allow you to quickly manipulate the AOI as you wish, press a button and get your statistics for the dynamic interaction, or indeed group them together using the AOI tools to get data aggregated across all sorts and types of stimuli!
Along with all these features there are also some visual changes to the GUI and a few little bits and pieces, which all add up to a great update for the world’s market-leading eye tracking software! See the data sheets for more information or contact us at email@example.com to find out more!
Tobii Glasses Case Study – 3D Labels! April 1, 2011Posted by Jon Ward in Advertising, eye tracking, Glasses, Market Research, Marketing, Shopper Research, Studio, Tobii.
We have a new case study based on work carried out on behalf of Rolling Optics, a Swedish producer of optical 3D product packaging labels. This is the first publicly available case study of the Tobii Glasses being used in a real shopper environment; more details and the link can be found below…
After a Rolling Optics 3D label was applied to Grazette of Sweden’s XL hair care product range, sales of the range soared by almost 90%. Rolling Optics wanted to know whether their 3D labels had anything to do with the sales increase, and wanted evidence they could use in customer dialogues. The aim of the study was therefore to test the theory that a Rolling Optics 3D label makes packaging more attractive to the consumer. Tobii Glasses were used in a real store environment to compare the performance of different premium shampoo bottle labels. By examining shoppers’ viewing patterns, both qualitatively and quantitatively, a correlation could be established between the use of Rolling Optics 3D labels and the sales increase. The case illustrates how the Tobii Glasses can be used to measure consumer attention in a store environment, including the use of IR markers to enable automated data aggregation in Tobii Studio.
For more information, pricing or to arrange a demonstration of the Tobii Glasses, Studio or the other products in our portfolio contact us on 01189 000795 or firstname.lastname@example.org