Using gesture control and eye tracking to let window shoppers interact with window items and receive follow-up information.
Impetus: Embedding computing into everyday objects makes a whole range of natural interactions possible again. Despite this, many designers I know still limit themselves to screen-mediated interactions - even when they're designing physical objects.
This project was designed almost as a protest piece to show how 'screen' technologies could be broken out into the wild, but it became much more. It's not quite the Internet of Things in its full realisation, but some things come in small steps.
Inspiration: It started when I was at a conference last year and eye tracking was all the rage. One company offering it had a large rig that didn't look very mobile. Long story to medium length: after a conversation, they said that while it could be used in any situation, it was meant for tracking people looking at a screen. I decided to see how eye tracking might be applied to things beyond the screen... even if you have a big stationary rig.
Future work: I'd love to really let people interact with the objects in the window - for example, having the object pointed at come closer to the glass and turn on. Taking this concept to the big department store Christmas windows to let kids move the elves and build presents would be a fun use. Technically speaking, using a Kinect for better movement tracking and a stronger infrared LED bar for better eye tracking would improve the reliability of the system.
Rough interaction flow:
- Person walks past the shop and the lights come on - humans love colour and movement.
- Specially designed points attached to objects in the store attract people’s attention and invite them to point at the object.
- As a person points at an object and the system begins to register the point, a stylised thumbs-up begins to rise. It continues to rise as long as the point is still registered at that object.
- Once the thumbs-up reaches horizontal, the object in the store pops up on the iPad with details about it. Additional objects are added to the iPad as people point to them.
- People are invited to have more information about the items they pointed at emailed to them. This was refined so that people could, at this point, select which objects they wanted more information on, because a) multiple people often interacted with the shopfront at any one time, and b) people often pointed at more objects than they were really interested in (just for the fun of it).
- Other interesting facts, such as 'most pointed at', were originally added to the iPad application as a sort of real-world analytics. These were removed because they were rarely used and cluttered an otherwise very clean interface.
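The dwell-to-select behaviour in the flow above - a point held on an object makes the thumbs-up rise, and the item is selected once it reaches horizontal - can be sketched as a small state machine. This is a minimal illustration, not the actual implementation; the class, callback, and two-second threshold are all assumptions for the example.

```python
DWELL_SECONDS = 2.0  # assumed time a point must be held before an item is selected


class DwellSelector:
    """Tracks how long a visitor has pointed at an object and fires a
    callback once the dwell threshold is reached - the moment the
    thumbs-up would reach horizontal."""

    def __init__(self, on_select, dwell=DWELL_SECONDS):
        self.on_select = on_select
        self.dwell = dwell
        self.target = None   # object currently pointed at, or None
        self.held = 0.0      # seconds the current point has been held
        self.fired = False   # whether we've already selected this target

    def update(self, pointed_object, dt):
        """Call once per tracking frame with the detected target
        (or None if no point is registered) and the frame time in seconds."""
        if pointed_object != self.target:
            # New target, or point lost: reset the thumbs-up.
            self.target = pointed_object
            self.held = 0.0
            self.fired = False
        elif pointed_object is not None and not self.fired:
            self.held += dt
            if self.held >= self.dwell:
                self.fired = True
                self.on_select(pointed_object)

    def progress(self):
        """Fraction from 0.0 to 1.0, suitable for driving the rising thumbs-up."""
        if self.target is None:
            return 0.0
        return min(self.held / self.dwell, 1.0)
```

For example, feeding it ten frames of a visitor pointing at a lamp (0.25 s per frame) would select the lamp once 2 seconds of dwell accumulate, while glancing away at any point resets the thumbs-up to zero.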