Spotlight on eye control

Eye control keyboard

How can Windows empower every person and every organization on the planet to achieve more? That ongoing mission is behind a feature that Windows Insiders recently voted one of their favorites: eye control.

While eye control first shipped with the Fall Creators Update, it has seen some great improvements in the April 2018 Update. We sat down with Microsoft engineer Jake Cohen to get the full story behind the accessibility feature that lets users control Windows with their eyes, using a compatible eye-tracking device from vendors like Tobii or EyeTech.

“Accessibility has been super important for us for the past 20-plus years,” Jake said. “For the past few years, we’ve been working hard to really aspire toward our mission statement by empowering every person of every level of ability.”

So how has eye control gotten to where it is today, and how are Windows Insiders helping us determine what’s next? Let’s take a look!

Looking ahead with eye control

The Windows team started down the road to eye control during the 2014 Microsoft company-wide hackathon, when Steve Gleason, a former NFL player for the New Orleans Saints, emailed Microsoft with a challenge. Living with ALS, Steve wanted technology that could help him communicate more easily, play with his son, and move his wheelchair independently, all of which would be major advancements for everyone living with ALS.

Since that hackathon, Microsoft has worked closely with Team Gleason, Steve’s nonprofit foundation, which develops technologies to empower people living with ALS. Continuing that work, the Windows team set out to build eye tracking into Windows 10.

With a compatible device, eye control uses infrared lighting and cameras to determine where a user’s eyes are looking relative to the screen. Windows uses that information to let you control a mouse or keyboard with just the movement of your eyes.

“Eye control starts with a launch pad, which is UI that’s always present on the screen,” Jake said. “When you dwell your eyes on an icon, which is the act of fixating your eyes somewhere on the screen and waiting, it’ll activate a click. So it’s basically a press and hold with your eyes.”
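The dwell interaction Jake describes — fixate somewhere, wait, and a click fires — can be sketched as a simple loop over gaze samples: keep an anchor point, and activate once the gaze has stayed near that anchor long enough. This is a hypothetical illustration, not the actual Windows implementation; the sample format, thresholds, and function names here are all assumptions.

```python
# Hypothetical dwell-to-click sketch (not the actual Windows eye control code):
# a click activates once gaze samples stay within a small radius of an anchor
# point for a set dwell time.
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # horizontal gaze position, pixels
    y: float  # vertical gaze position, pixels
    t: float  # timestamp, seconds

DWELL_RADIUS = 30.0  # assumed: how steady the gaze must be, in pixels
DWELL_TIME = 0.8     # assumed: how long to fixate before a click, in seconds

def detect_dwell_click(samples):
    """Return (x, y) of the first dwell-activated click, or None."""
    anchor = None
    for s in samples:
        if anchor is None:
            anchor = s
            continue
        # If the gaze drifts outside the radius, restart the dwell timer
        # at the new position; otherwise check whether the fixation has
        # been held long enough to count as a "press and hold with your eyes".
        if (s.x - anchor.x) ** 2 + (s.y - anchor.y) ** 2 > DWELL_RADIUS ** 2:
            anchor = s
        elif s.t - anchor.t >= DWELL_TIME:
            return (anchor.x, anchor.y)
    return None
```

For example, a second of 60 Hz samples jittering a pixel or two around one point would trigger a click at that point, while a gaze sweeping across the screen would not.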

“You have access on the launch pad to the mouse, the keyboard, text-to-speech, and now in the April 2018 release, many more options like Quick Access, Start, Task View, Device Calibration, Settings, and more,” he said. “For browsing the web or scrolling an app, you can also fixate your eyes somewhere on the screen and then use the arrows that are provided to scroll up and down using your eyes.”

What’s next

What’s in store for the future of eye control? Jake’s team is still working with Microsoft Research and Team Gleason to collect feedback from the ALS community and make eye control even better.

“It’s really inspiring to get this feedback, because we hear people say, ‘This is amazing technology. This is really helping me,’ and also, ‘This is the next thing I need,’” Jake said. “It’s about empowering them to do everything they can think of, not just a subset of interactions or abilities.”

Jake also mentioned that sparking more third-party tools is a priority for the team. “The next step we’re taking is releasing public developer APIs and open-source libraries to allow third-party developers to build apps and experiences that can leverage eye tracking,” he said.

“I’m excited to see what developers can come up with in order to make an impact,” he said. “Imagine all of the gaps that third-party developers can fill for customers who are living with mobility impairments. It comes down to Microsoft’s core roots. We can’t fulfill this mission statement alone to empower everyone; we have to build a platform that empowers everyone to empower other people.”