I’m playing Fruit Ninja with my eyes, and the experience is on a different level to playing the game with your finger. I barely have to move, yet there’s still an element of skill involved: as a pineapple flies up on the screen, I fix it with my gaze and watch it slice satisfyingly in two.
This is the future of how we’ll play games and interact with our mobile devices, says Sune Alstrup Johansen, the man watching over my shoulder in a small meeting room in Copenhagen, Denmark.
Johansen is the CEO of The Eye Tribe, a startup that spun out of the University of Copenhagen in 2011 to make eye-tracking software, initially for the disabled and now for the mass market. What makes his company unique is how cheap the technology is.
Eye-tracking scanners typically cost around $10,000, and there are fewer than 100,000 of them in the world, mostly used for research and ad analytics. Johansen sells a rectangular bar that hooks onto a tablet for just $99, and in three years he’s shipped “many thousands” of them, mostly to developers who are creating apps that use eye tracking.
He got the price so low thanks to clever software and the falling cost of sensors. The components are so cheap that the device’s most expensive feature is the casing. So far, the Eye Tribe has raised $3 million in venture capital funding from wealthy Silicon Valley entrepreneurs like Richard Sanquini, in addition to a $2.4 million grant from the U.S. government.
The next step: integrate into smartphones and tablets.
New mobile devices should be launching in 2015 with The Eye Tribe’s technology in them for the first time, Johansen says. He won’t give names other than Sony Mobile, the one partnership that’s technically public. And officially, his startup will only say it’s “in discussions with dozens of [manufacturers] on the integration of eye tracking technology into mass-market consumer mobile devices.”
But the technology is coming. “We definitely expect to have some of the first integrated devices this year,” says Johansen, gesturing at the Microsoft Surface tablet in front of him. “We’re talking devices like these.”
The idea of eye-tracking technology still has to be lifted out of the realm of gimmick and into that of real use cases. But the demo I’m going through at The Eye Tribe’s office is undeniably compelling.
After playing Fruit Ninja, I’m guided to a page showing six icons to select from. I ask if I need to look at one for an extra second or two to select it. “No,” says Johansen. “Look at the icon, and then tap anywhere on the screen.”
At first that sounds counterintuitive. Then it becomes clear that you’re saving the split second of time and brain power it takes to reach over and touch a specific icon with your finger.
Forget for a moment that this points to a future where we become the smoothie-slurping humans from Wall-E who’ve lost all need for hand-eye coordination.
Looking at an icon and then tapping anywhere to select is not only easy, it feels almost instinctive. “The fastest way to do selection on tablet is combining the eye and pressing anywhere,” says Johansen. “It’s basically instant.”
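The look-then-tap selection Johansen describes can be sketched in a few lines. This is a minimal, hypothetical illustration of the interaction, not The Eye Tribe’s SDK: the icon layout and coordinates are invented, and the only idea taken from the article is that the tap can land anywhere because the gaze point, not the finger, picks the target.

```python
# Hypothetical sketch of "look at an icon, then tap anywhere" selection.
# Icon names, positions, and sizes below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Icon:
    name: str
    x: int  # top-left corner, in screen pixels
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def select_on_tap(icons, gaze_x, gaze_y):
    """On any tap, select whichever icon the user was looking at.
    The tap's own coordinates are deliberately ignored."""
    for icon in icons:
        if icon.contains(gaze_x, gaze_y):
            return icon.name
    return None  # gaze wasn't on any icon; the tap does nothing

icons = [Icon("mail", 0, 0, 100, 100), Icon("maps", 120, 0, 100, 100)]
print(select_on_tap(icons, gaze_x=150, gaze_y=40))  # prints "maps"
```

The design choice worth noting is that the tap serves only as a confirmation signal, which is why Johansen can describe the gesture as “basically instant”: the targeting work is already done by the time the finger lands.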
The challenge is educating consumers on an entirely new way of interacting with devices. “It’s retraining the end users not to use their fingers,” says Windsor Holden, an expert in human-computer interaction at Juniper Research. There’s also the risk that end users could become disoriented by such features, and put off.
That might not matter for those of a certain generation.
Johansen’s team recently created a demo app that guided you step-by-step through the creation of a Lego model. Instead of swiping, you looked at an arrow on the screen for a moment and the next step magically appeared. The team showed it off at a technology fair and found that while most adults needed a tutorial on how to use it, the kids in attendance didn’t need any explanation at all. “They just skipped the tutorial,” he says.
During the demo at The Eye Tribe’s office, I also used the tracker to scroll down the home page of the New York Times, and then scrolled through a grid of Netflix movies. If I settled my eyes on one movie for long enough, I’d get a pop-up with the synopsis.
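The Netflix pop-up behavior is a classic dwell-time trigger: fire an action once the gaze has rested on one region long enough. Here is a small sketch of that logic; the 0.8-second threshold and tile identifiers are assumptions for illustration, not values from The Eye Tribe.

```python
# Illustrative dwell-time detector for a "gaze lingers, synopsis pops up"
# interaction. Threshold and tile IDs are invented for this sketch.
DWELL_THRESHOLD = 0.8  # seconds; an assumed comfortable dwell time

class DwellDetector:
    def __init__(self, threshold=DWELL_THRESHOLD):
        self.threshold = threshold
        self.current_tile = None
        self.dwell_time = 0.0

    def update(self, tile_id, dt):
        """Feed one gaze sample: the tile under the gaze plus elapsed seconds.
        Returns the tile_id when a dwell completes, else None."""
        if tile_id != self.current_tile:
            # Gaze moved to a new tile: restart the dwell timer.
            self.current_tile = tile_id
            self.dwell_time = 0.0
            return None
        self.dwell_time += dt
        if self.dwell_time >= self.threshold:
            self.dwell_time = 0.0  # reset so the pop-up fires only once
            return tile_id
        return None

d = DwellDetector()
samples = [("movie_a", 0.3)] * 4  # gaze stays on one tile for 1.2 s total
fired = [d.update(tile, dt) for tile, dt in samples]
print(fired)  # [None, None, None, 'movie_a']
```

Tuning that threshold is the hard part in practice: too short and every glance spawns a pop-up, too long and the interface feels unresponsive.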
That creates a tantalizing prospect for the number crunchers behind Netflix’s predictive algorithms. Netflix observes what movies and TV shows you choose to get a better sense of what to recommend. With eye tracking, it could get an even more nuanced view on our preferences by noting what we look at, but don’t tap.
“Today most devices are passively waiting for our input,” says Johansen. “Eye tracking can make the device smarter to, in some way, predict what you want.”
The most interesting use case is still navigational, and the integration with the touch screen for launching apps or even unlocking the phone. Imagine taking a smartphone out of your pocket and activating it by pressing a button, while also letting its camera scan your eye – an extra layer of security for those who want it.
Johansen shows me the prototype tracker, which is an eight-inch long, black rectangular box that attaches to the bottom of his Microsoft Surface tablet. “It’s mostly air,” he says. The only thing inside is a camera in the middle and infrared lights on either end, along with a USB attachment cable.
A couple of phone manufacturers already offer similar, if in some cases not-quite-there-yet, eye-tracking tech. Amazon’s Fire Phone contains four infrared cameras on the front that track the user’s head for its tilt-based user interface, which creates the illusion of 3D images.
Samsung also marketed its Galaxy S4 smartphone as having eye tracking but Johansen calls that a stretch. “It’s a matter of definition,” he says grinning. “It doesn’t work in low light… To scroll you have to do this.” He then nods up and down.
So what makes The Eye Tribe’s tracking technology better? It actually has very little to do with the hardware, since the tracker works with a low-resolution phone camera, and much more to do with an advanced core algorithm that estimates where you’re looking based on the reflections of its infrared lights.
Mathematics PhD Javier San Agustin, from Spain, developed the core algorithm. It works by shining infrared light on the eye so the camera can identify two glints and the pupil, then using those points to estimate, “with high accuracy” according to Johansen, where the eye is pointing. “It’s basic but very difficult to do.” The Eye Tribe’s closest competitor is Tobii, a 10-year-old, Malmo, Sweden-based eye-tracking company that sells high-resolution trackers for $10,000.
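One standard way to turn pupil and glint positions into a gaze point, widely used in remote eye tracking, is interpolation-based estimation: during calibration the user looks at known screen targets, and a polynomial mapping is fitted from the pupil-glint vector to screen coordinates. The sketch below shows that general technique; it is not The Eye Tribe’s proprietary algorithm, and the feature set and calibration layout are assumptions.

```python
# Generic interpolation-based gaze estimation (a textbook technique,
# not The Eye Tribe's actual algorithm). The camera reports pupil and
# corneal-glint centres in image coordinates; calibration fits a
# polynomial mapping those to screen coordinates.
import numpy as np

def features(pupil, glint):
    """Pupil-glint difference vector, expanded to polynomial terms."""
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    return [1.0, dx, dy, dx * dy, dx * dx, dy * dy]

def calibrate(pupils, glints, screen_points):
    """Least-squares fit from pupil-glint features to known screen targets."""
    X = np.array([features(p, g) for p, g in zip(pupils, glints)])
    Y = np.array(screen_points, dtype=float)
    coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return coeffs  # shape (6, 2): one column for screen x, one for y

def estimate_gaze(pupil, glint, coeffs):
    """Map one new pupil/glint observation to a screen coordinate."""
    return np.array(features(pupil, glint)) @ coeffs
```

Using the glint as a reference point rather than the raw pupil position is what makes the mapping tolerant of small head movements, since both features shift together when the head moves; that robustness is a large part of why remote trackers pair a camera with fixed infrared lights at all.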
“I’m not claiming that our $99 [tracker] is just as good as a $20,000 tracker, but I’m claiming that it’s pretty close,” says Johansen. “For most people that’s sufficient. We’re not targeting rocket scientists. We’re targeting consumers.”
What’s interesting about The Eye Tribe is that it’s following the same historical pattern as one or two other types of technologies we use to interact with computers: before hitting the mass market, they’re sometimes built for the disabled first.
Remember T9, that early predictive-text feature on Nokia phones that came out during the Bush administration? That originally came from disabilities research at Seattle’s Tegic Communications, now owned by Nuance.
Aaron Sheedy, the former COO of keyboard app Swype and now head of mobile at Nuance, told me in a previous interview that even the computer mouse was partly born out of disabilities research.
Johansen himself first became interested in creating cheaper eye-tracking tech seven years ago, while conducting university research on people who suffered from ALS. For anyone unfamiliar with the disease beyond its association with the ice bucket challenge, the neurological disorder gradually leaves a person unable to move their muscles, save for, eventually, their eyes. Stephen Hawking suffers from a slow-progressing form of the disease.
Johansen was working with an ALS sufferer named Bjorn at the University of Copenhagen. “We gave him this prototype that we built of this free eye-tracking software, and he tried it and it worked,” Johansen remembers.
Bjorn asked if he could get the system himself, and the team were left floundering over what to do with their prototype. Bjorn then raised the possibility of getting it into the hands of disabled people who were priced out of using high-tech eye trackers. Johansen and his team talked about starting a company.
“When you’re exposed to it first-hand and you hear the life story of some of these guys,” Johansen says, “we thought, ‘We have to do this.’”
This article was written by Parmy Olson from Forbes and was legally licensed through the NewsCred publisher network.