In the latest Windows Insider Preview build of Windows 10 (build 16257), users with the right hardware can now use eye-tracking technology to operate Windows. Although it's pitched as an accessibility feature, there's a lot of potential on offer here to add another input to the system and boost productivity.
Moving the mouse around the screen to select interface elements is reasonably efficient; however, simply looking at what you want to select is faster. I'd love to be able to look at an application shortcut, press a key and have that app launch. Similarly, it'd be incredibly fast to look at the close button and press a key (or click) without moving your right hand to close an application. This would be particularly useful when using multiple displays, which take longer to navigate across. It does require rapid detection from the camera and pinpoint accuracy to ensure you select the intended object.
Eye Control allows you to operate an on-screen mouse, keyboard, and text-to-speech experience using only your eyes. To get up and running you'll need a camera that supports eye tracking; right now the recommended one is the Tobii Eye Tracker 4C. Paired with Eye Control, it lets you accomplish the tasks you could previously only do with a physical mouse and keyboard.
Right now the Tobii Eye Tracker 4C costs around US$220 plus taxes and delivery, but more options will arrive in the future as Microsoft adds support for the Tobii Dynavox PCEye Mini, PCEye Plus, EyeMobile Plus, and I-series cameras.
Once on the latest build, you'll need to ensure you have the hardware drivers installed, then enable Eye Control in Settings under Ease of Access.
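If you want to jump straight to the right page, Windows exposes ms-settings URIs for individual Settings pages. Here's a minimal sketch, assuming the ms-settings:easeofaccess-eyecontrol URI for the Eye control page (the exact URI may vary by build):

```python
import os
import platform

def open_eye_control_settings() -> None:
    """Open the Eye control page of the Settings app via its ms-settings URI."""
    if platform.system() != "Windows":
        raise RuntimeError("Eye Control is a Windows 10 feature")
    # os.startfile hands the URI to the shell, which launches the Settings app
    # on the requested page.
    os.startfile("ms-settings:easeofaccess-eyecontrol")

if __name__ == "__main__":
    open_eye_control_settings()
```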
Here we see it in action.
Thankfully Microsoft offers customisation over the dwell time: how long your gaze has to pause on something before it counts as a selection. This is similar to the dwell-based selection times in Mixed Reality experiences.
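To illustrate the idea, here's a conceptual sketch of dwell selection, not Eye Control's actual implementation; the GazeSample and DwellSelector names are hypothetical. A selection fires once the gaze has stayed inside a small radius for the configured time:

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeSample:
    """A hypothetical gaze reading: screen coordinates plus a timestamp (seconds)."""
    x: float
    y: float
    t: float

class DwellSelector:
    """Treats a sustained pause in eye movement as a selection."""

    def __init__(self, dwell_seconds: float = 0.8, radius_px: float = 30.0):
        self.dwell_seconds = dwell_seconds  # the user-tunable 'pause' time
        self.radius_px = radius_px          # how still the gaze must be
        self._anchor: Optional[GazeSample] = None

    def update(self, sample: GazeSample) -> bool:
        """Feed gaze samples in; returns True when a dwell selection fires."""
        if self._anchor is None:
            self._anchor = sample
            return False
        if math.hypot(sample.x - self._anchor.x, sample.y - self._anchor.y) > self.radius_px:
            # Gaze moved away: restart the dwell timer at the new position.
            self._anchor = sample
            return False
        if sample.t - self._anchor.t >= self.dwell_seconds:
            self._anchor = None  # reset so one pause doesn't fire repeatedly
            return True
        return False
```

Shortening the dwell time makes selection snappier but raises the risk of accidental clicks while you're just reading the screen, which is exactly the trade-off the setting lets you tune.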
Tobii have released a video on how to get started with Windows Hello (automatic login with facial recognition) and how to set up the Tobii Eye Tracker 4C. Like the introduction of (or rather, better support for) ink, touch and speech, eye tracking opens the door to another input mechanism for Windows. If that helps people who need accessibility options, that's awesome; just remember, it can help able-bodied people as well.