The Emotiv EPOC Headset is a spidery-looking bit of headgear that translates electrical impulses from your brain, head movements, and facial expressions into digital input. And, as discussed in depth here, it has a .NET API.
For serious. They have an API for your thoughts now.
Setting aside the obvious implications for gamers and people with disabilities (and, I guess, gamers with disabilities), what could this sort of device mean for the future of business applications?
Last year, Siri delivered a serious upgrade in human-machine interaction to the masses (at least, those masses who could swing an iPhone 4S in this economy). Android is still flailing to match it. Even the latest version of Windows bothers with the mouse and keyboard only under the heading of backward compatibility.
Tomorrow’s more successful nerds will need to learn to see around the ways in which the traditional KVM interface to software has boxed in our thinking. What will be possible and probable when you can regularly expect to speak and think at your applications? What will the screen look like?
Consider that it’s the thrilling year 2012 now, and we are still making plenty of audio-only telephone calls. That’s not because the technology for video phones isn’t there. The ubiquitous videophone never happened because the use case was fiction: a real telephone conversation happens while you’re walking, driving, or just not presentable or stationary in any way.
So, maybe we only bother with the screen under the heading of backward compatibility.
Maybe the new interface is that would-you-like-fries-with-that sort of headset, with the occasional display available when needed. Maybe the fact that I’m picturing a headset at all is paleofuture, and rooms will just come standard with transducers for your innermost twitches and ideas. Creepy.
Anyway. Back to mind-controlling your software. Here’s where you can check out the Emotiv development experience for free, complete with a software emulator for the headset.
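If you're curious what the .NET side of this looks like before downloading anything, here's a rough sketch of connecting to the emulator and listening for "thought" detections. The class, event, and method names (EmoEngine, EmoState, RemoteConnect, ProcessEvents, the Cognitiv getters) follow my reading of the SDK's .NET wrapper, so treat the exact signatures as assumptions and check the shipped samples before copying anything.

```csharp
using System;
using Emotiv; // .NET wrapper shipped with the Emotiv SDK (assumed namespace)

class MindReader
{
    static void Main()
    {
        // The SDK exposes a singleton engine object (assumed name).
        EmoEngine engine = EmoEngine.Instance;

        // Fires whenever the headset (or the emulator) pushes a new EmoState.
        engine.EmoStateUpdated += (sender, e) =>
        {
            EmoState state = e.emoState;
            // Cognitiv suite: the trained "thought" actions -- push, pull, lift, etc.
            Console.WriteLine("Detected action: {0} (power {1:F2})",
                state.CognitivGetCurrentAction(),
                state.CognitivGetCurrentActionPower());
        };

        // Connect to the EmoComposer emulator instead of real hardware;
        // 1726 is its default port in my install (assumed -- check yours).
        engine.RemoteConnect("127.0.0.1", 1726);

        // Pump the SDK's event queue; each call dispatches any pending events.
        while (!Console.KeyAvailable)
        {
            engine.ProcessEvents(1000);
        }

        engine.Disconnect();
    }
}
```

The nice part is that the emulator lets you fake blinks, smirks, and trained cognitive actions from a little control panel, so you can kick the tires on the event model without owning the spider hat.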