
Microsoft, An Xbox 360, Project Natal And Eye Tracking – The Future Of Market Research? October 8, 2009

Posted by Jon Ward in Uncategorized.

Now there's a strange headline, right? Well, maybe not – at least not in concept. For any avid gamers out there (I am the resident game addict at Acuity), Project Natal should be familiar: it is one of the most hyped and publicised new technologies in a long time. For those of you who aren't involved in gaming (or, as Scott would say, those with a life!), Project Natal won't mean anything. Project Natal is a new plug-in system for the Xbox 360 that allows you to use your whole body as a controller – and before you say "just like a Nintendo Wii", look back at what I said… you use YOUR BODY as the controller. The Wii works by using an IR camera in the controller to detect light emitted from the sensor bar. Project Natal uses a sensor bar and you: your arms, legs, head, body and everything in between! It tracks over 40 points on your body in real time and maps them into the virtual world.

So where does this fit into eye tracking and market research? Well, have a look at this video of Project Natal in action with a VR person called Milo…

Impressive? Scary? A bit like the first stages of the machines taking over? Maybe a bit of all three, but what it demonstrates is more accessible technology giving us better interaction with VR characters. Let's skip forward a few years and imagine we are using a virtual supermarket for some market research. We see an assistant who interacts with us (in the same way Milo does with the user in front of the TV), responding to the tone of our voice and our body language, while eye tracking lets us monitor the user's interaction. We can then show them an advert, shopping list or example package (just as the user does with the fish picture in the video) and the assistant takes us off around the store to where we want to go… now that, I am sure you will agree, would be impressive! You could reach out to a shelf and pick up an item, interact with other shoppers and freely look around at point of sale, promotional offers and even your feet if you wish! The eye tracking data could be fed in live so that people in the VR world react as realistically as possible; we could see how people walk down the aisles, and even include interactive point-of-sale displays that respond to your gaze just as they would in a real store. Pushy sales people could try to make eye contact with you, and you could even try to chat up the girl on the 'meals for one' aisle if you wanted!
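To make the "displays that respond to your gaze" idea a little more concrete, here is a minimal sketch of how that trigger logic might look. It is written in Python with entirely hypothetical region names and a stand-in gaze feed – not any real eye-tracking SDK or VR engine – and simply fires a display once the shopper's gaze has dwelt on it for half a second:

    # Hypothetical screen regions for point-of-sale displays in the virtual store,
    # given as (left, top, right, bottom) in normalised gaze coordinates (0-1).
    DISPLAYS = {
        "meal_deal_banner": (0.10, 0.20, 0.35, 0.50),
        "end_of_aisle_promo": (0.60, 0.30, 0.90, 0.70),
    }

    DWELL_SECONDS = 0.5  # how long gaze must rest on a display before it reacts

    def gaze_inside(gaze, region):
        """Return True if the (x, y) gaze point falls inside the region."""
        x, y = gaze
        left, top, right, bottom = region
        return left <= x <= right and top <= y <= bottom

    def run(gaze_samples):
        """Consume (timestamp, x, y) gaze samples and fire displays on dwell.

        `gaze_samples` stands in for whatever live feed the eye tracker provides.
        """
        dwell_start = {}  # display name -> time gaze first landed on it
        for ts, x, y in gaze_samples:
            for name, region in DISPLAYS.items():
                if gaze_inside((x, y), region):
                    dwell_start.setdefault(name, ts)
                    if ts - dwell_start[name] >= DWELL_SECONDS:
                        print(f"{ts:.2f}s: shopper looked at '{name}' - start promo animation")
                        dwell_start[name] = ts  # reset so the display does not re-fire every sample
                else:
                    dwell_start.pop(name, None)

    if __name__ == "__main__":
        # Fake 50 Hz gaze data drifting across the meal-deal banner.
        samples = [(i * 0.02, 0.12 + i * 0.001, 0.30) for i in range(100)]
        run(samples)

In a real study the fake samples would be replaced by the live gaze stream, and the print statement by whatever animation or logging the virtual store supports.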

As we constantly strive to improve MR, UX and usability testing to recreate situations as close to real life as possible, Natal gives us a little glimpse of what was originally military technology filtering down towards the mass market. And we think the future looks pretty good!

Comments »

1. Gareth Tuck - October 26, 2009

Jon, I am with you on your vision of the future of MR, but I don't think you're right about how long it will take to get there. FMCG companies in the USA can already use the extremely realistic virtual supermarkets they have designed to run research studies on new packs and the like. The virtual store is shown on a touch screen attached to a shopping "cart" controller. They already combine this with eye tracking to get a deeper understanding of the precursors to purchasing. Real store staff are green-screened in to interact with the shopper via two-way live video feeds in the virtual world, and "jump" them or lead them to places in the virtual store where the product they want is displayed. The shopper in the virtual store can already reach out and touch the screen, pick up and zoom in on virtual products, and see every small detail of the nutritional information if they wish. Recipe cards and magazines can also be viewed in detail in the virtual store, as can live in-store TV, which can be tailored to the exact shopper and only start when the viewer looks at, or near, the TV. What they can't do at present is interact with other shoppers in the virtual store, and if you ask most shoppers (and FMCG companies) they don't really want to interact with anyone other than store staff (with some experience to help them out), which can be achieved with the live video feed. While all this is going on, emotion and cognition can be, and are, measured using very small EEG sensors on the shopper's head (not the full 64 or 128 points), combined with eye tracking data points 50 times per second. The virtual store can track footfall down to the millimetre and provide instant feedback of all the footfall, emotion, cognition, eye tracking, product pick-up and purchase data to researchers in remote locations. The insights into what people see, think and feel during shopping are already there for those who use the virtual stores.

Project Natal and the associated developments have the chance to remove the need for a touch screen in this context, but other developments such as emotion recognition are always going to be very basic if they use facial coding as Natal does. Using a real person green-screened into the virtual store means the shopper's tone of voice and body language will likely be picked up far better than a PC ever could manage.

Jon Ward - October 27, 2009

Gareth, thanks for your response and insight into the goings-on in the US. The background to this post was really to touch on what is happening and what could be integrated at a more realistic cost than the kind of set-up FMCG clients can afford, with the luxury of green screens, actors and so on. With regard to interaction with other people – let's look beyond shopper research to social monitoring, offender rehabilitation, sports and sales training and much more. In much broader brush strokes, the former military technology that Natal is based on can be brought into a wide range of uses, and as AI and biometric / facial response coding advance, I can see much more affordable systems (for researchers, companies and smaller brands, for example) becoming available. Yes, they may lack the full interaction capabilities of having an actor green-screened in, but they will be accessible, and that is the key.

Eye tracking itself was classed as voodoo and witchcraft not long ago, and now, with the Tobii IS unit, all of a sudden we are thinking about which devices we can add eye tracking to in order to increase functionality – something unthought of until recently. LG have just launched a (decent) mobile phone watch, something from science fiction until just this year – and Natal, or at least the theory behind the technology, could give us a glimpse of AFFORDABLE interactive research…

