The writing's on the wall – and it's following you. Two new gesture-sensing innovations designed for large electronic screens in public places herald a future in which everything from street art to advertisements tracks your movements, is fully interactive and is nigh-on impossible to ignore.
Giant flat-screen displays powered by organic LEDs (OLEDs) are plunging in price, so screens tens of metres long could soon line urban corridors. Rather than have them simply fire messages at a tuned-out public, researchers at the Technical University of Berlin (TUB) in Germany have built two applications that they hope will captivate passers-by and inspire a new wave of interactive displays.
OLED flat-screen technology is already finding its way into our private spaces: homeowners will soon be able to cover whole living-room walls with screens like wallpaper. While TV companies ponder what content is best displayed on such vast indoor vistas, the TUB team has been working out what can be achieved with vast outdoor displays.
"We believe that in the future all surfaces in urban areas could be interactive displays," says team member Robert Walter. "This presents great opportunities and challenges as it will need to be attractive and work in an intelligent way." The researchers will reveal their first two street-smart applications – StrikeAPose and Screenfinity – next month at the CHI conference in Paris, France. They believe that while advertising could provide the impetus for the adoption of the technology, non-commercial apps will also appear – courtesy of artists or poets, perhaps.
StrikeAPose, developed by Walter's team, lets a person in the street perform a unique gesture to take control of anything from a bus-shelter advert screen to a large, Times-Square-style video wall. Once you are registered as the screen controller, software fed by the depth cameras used in Microsoft's Kinect system lets you control, say, a gesture-driven game. In trials in a university cafeteria, the team settled on a registration gesture they call The Teapot: users put both hands on their hips, their arms describing the profile of two teapot handles. This proved the most robust gesture, even when obscured by thick clothing.
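To give a flavour of how such a registration pose might be recognised, here is a minimal illustrative sketch in Python. It assumes the kind of 3D skeleton joints a depth camera such as the Kinect reports; the joint names, thresholds and rule-based check are assumptions for illustration, not the TUB team's actual software.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points (metres)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_teapot_pose(joints, hand_hip_max=0.15, elbow_out_min=0.10):
    """Hypothetical check: both hands rest on the hips and the elbows
    flare outward past the shoulders, like two teapot handles."""
    left_hand_on_hip = distance(joints["hand_left"], joints["hip_left"]) < hand_hip_max
    right_hand_on_hip = distance(joints["hand_right"], joints["hip_right"]) < hand_hip_max
    # x increases towards the person's right; elbows should stick out sideways.
    left_elbow_out = joints["elbow_left"][0] < joints["shoulder_left"][0] - elbow_out_min
    right_elbow_out = joints["elbow_right"][0] > joints["shoulder_right"][0] + elbow_out_min
    return left_hand_on_hip and right_hand_on_hip and left_elbow_out and right_elbow_out

# Example skeleton frame (x, y, z in metres), roughly a hands-on-hips stance.
frame = {
    "hand_left": (-0.20, 1.00, 2.5), "hip_left": (-0.15, 1.00, 2.5),
    "hand_right": (0.20, 1.00, 2.5), "hip_right": (0.15, 1.00, 2.5),
    "elbow_left": (-0.45, 1.15, 2.5), "shoulder_left": (-0.20, 1.45, 2.5),
    "elbow_right": (0.45, 1.15, 2.5), "shoulder_right": (0.20, 1.45, 2.5),
}
print(is_teapot_pose(frame))  # True for this frame
```

In practice such a rule would be run over many frames of tracking data before a passer-by is registered as the controller, to avoid false triggers from people simply standing with their hands on their hips.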
Screenfinity, led by Jorg Muller, generates content for large, long screens that follows the viewer as they walk along beside it. The system monitors passers-by with 10 Kinect cameras placed along the length of a screen. As a person approaches, text or pictures pop up and slide along in sync with their walking. If someone moves further away, the text gets bigger; if they come closer, it gets smaller, so it is equally legible all the time. In a recent trial on the TUB campus, cafe menus were displayed in a bustling concourse. Not only were people able to read the menus at varying distances without breaking stride, but the display proved so attention-grabbing that it had users looking behind the screen to see if a person was tracking them.
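The distance-dependent scaling can be captured in a few lines. The sketch below is purely illustrative and not the Screenfinity code: it assumes the tracked viewer's position along the screen and their distance from it, slides the content to match, and scales the text height linearly with distance so it subtends a roughly constant visual angle.

```python
def content_layout(viewer_x_m, viewer_distance_m,
                   base_height_m=0.03, base_distance_m=1.0):
    """Place content at the viewer's position along the screen and scale
    text height with distance so its apparent size stays constant.
    (Function name and constants are assumptions for illustration.)"""
    text_height_m = base_height_m * (viewer_distance_m / base_distance_m)
    return {"content_x_m": viewer_x_m, "text_height_m": text_height_m}

# A viewer walking along the screen while drifting further away from it:
for x, d in [(1.0, 1.0), (2.0, 2.0), (3.0, 4.0)]:
    print(content_layout(x, d))
# The text doubles in physical height each time the viewer's distance doubles,
# so it remains equally legible as they move.
```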
Walter says these screens "should never know more about you than you are willing to share". But users could be allowed to select a type of information from an on-screen menu with a gesture – sports scores, say – and have it follow them along the display. It is a tantalising proposition for advertisers, as it would knock spots off advertising on posters or static small screens.
"In London there is some great poetry posted up in the subways," says Walter. "Being able to read this while walking would make it even better."
Simon Parnall of News Digital Systems in Staines, UK, is developing floor-to-ceiling TV screens. StrikeAPose is user-friendly, he notes, since people only need simple gestures to interact with it. But he wonders how many people will want to "perform a potentially embarrassing gesture in a public space in order to interact". He foresees organisations like the London Underground making strong use of Screenfinity, however, as it will allow ads to move down the escalators, tethered to specific commuters.