Startup Neurable Unveils the World's First Brain-Controlled VR Game
Posted 7 Aug 2017 | 13:10 GMT
Imagine putting on a VR headset, seeing the virtual world take shape around you, and then navigating through that world without waving any controllers around—instead steering with your thoughts alone.
That’s the new gaming experience offered by the startup Neurable, which unveiled the world’s first mind-controlled VR game at the SIGGRAPH conference this week.
In the Q&A below, Neurable CEO Ramses Alcaide tells IEEE Spectrum why he believes thought-controlled interfaces will make virtual reality a ubiquitous technology.
Neurable isn’t a gaming company; the Boston-based startup works on the brain-computer interfaces (BCIs) required for mind control. The most common type of BCI uses scalp electrodes to record electrical signals in the brain, then uses software to translate those signals into commands for external devices like computer cursors, robotic limbs, and even air sharks. Neurable designs that crucial BCI software.
The game on display at SIGGRAPH is a collaboration between Neurable and the Madrid-based VR graphics company estudiofuture, and it is meant merely as a demo of Neurable’s tech and its capabilities. It’s played on an HTC Vive headset by swapping out the Vive’s standard elastic strap for Neurable’s upgraded strap, which is studded with seven electrodes.

[Photo: An HTC Vive virtual reality headset with Neurable’s strap attached to allow for mind-controlled gaming.]
CEO Alcaide, who has a PhD in neuroscience, tells Spectrum that his tech doesn’t use the EEG brainwave patterns associated with focused concentration or relaxation as control signals. Those signals are used by a few BCI devices already on the market, such as the Muse brain-sensing headband designed to improve meditation and to play simple real-world games. Instead, Neurable’s software registers event-related potentials, more specific signals that occur when the brain responds to a stimulus, which allows for an intention-based interaction method.
IEEE Spectrum: Tell me about this game you’re showing off at SIGGRAPH.
Ramses Alcaide: We’re showing off a VR game called Awakening that you play with your mind. In the game, you actively pick up objects with your mind, you stop lasers with your mind, you turn a robot dog into a balloon animal. It’s a completely hands-free experience; you don’t use any controllers.
In the game you’re a child, you wake up inside a cell, and you’re trying to escape a government lab. There are items around you that kids would normally play with, like blocks. As you focus your attention on different items you interact with them, and you then discover clues on how to escape the room. Then you get to another room and battle robots and do other stuff. All the interactions—locomotion, grabbing objects, fighting robots—they’re all done hands-free.
Spectrum: How are people responding to the game demos?
Alcaide: We’ve had an amazing response. A lot of people come in highly skeptical, because BCI has been a disappointment so many times before. But as soon as they grab an object, there’s a smile that comes over their faces. You can see the satisfaction that it really works.
Spectrum: Will this first game, Awakening, be commercially available?
Alcaide: We’re targeting VR arcades in 2018. What we’re showing off right now is a shortened version of the arcade game. We’re not really a game company or a hardware company... But this game is the first thing we’re looking to provide to VR arcades that are using our technology. The Awakening game is a nice introduction because you’re awakening your powers as a child, and we’re also awakening people to the potential of next-generation BCIs.

[Photo: A seated man wearing a VR headset plays a game; the screen on the wall behind him displays his avatar.]
Spectrum: You made the strap that can be swapped into the HTC Vive as just a demonstration prototype, right? Will the gear get sleeker in the future? The current strap is pretty bulky, and the seven electrodes seem big!
Alcaide: Yes, exactly. For the short term, we’ll be providing those straps to VR arcades and to developers. In the long term, we see headset manufacturers adding electrodes to their form factor and design, and licensing our software to make use of that capability. For this strap we used off-the-shelf dry electrodes from a company called Wearable Sensing. But our software can work with smaller electrodes and fewer channels. The form factor is the easiest part of the problem to solve.
Spectrum: So what’s the hardest part?
Alcaide: Accurately and responsively interpreting a user’s intention. It’s been 13 years of my life’s work, and the technology is now finally ready to be used in a consumer product.
Spectrum: Can your software make sense of info from basically any EEG headset? Or do the electrodes need to be positioned in certain places?
Alcaide: We need the electrodes to be positioned in certain areas, because we have a scientific basis for what we do. The standard thing most companies do is record lots of stuff from the brain and hope they get enough data to make their devices work.

[Photo: A man wearing a VR headset featuring brain-sensing electrodes uses it to play a VR game. The game screen is displayed on the wall.]
Spectrum: How did you decide to use the brain signals called event-related potentials to control the action in the game? Are they particularly well suited for being translated into gaming commands?
Alcaide: There are a lot of different neural signals that can be used for BCI. As part of my first few years of grad school I investigated all of them. You can think of event-related potentials in a very simplified manner. If you get down to the core, they tell you what a person cares about. When a person cares about something, they want to take action on it. The game can predict what that action is, in real time.
Spectrum: Did you consider using brain signals related to imagined body movements, which have been used in many experimental BCIs? I’d think those would be easy to translate into game movements—you imagine moving your leg and your avatar kicks.
Alcaide: Because those signals are so physically defined, it’s not a natural interface for the user, and it takes a long time to train someone. Motor imagery is something we’re interested in integrating into future systems, but the current approaches aren’t fit for consumer technology. We wanted to create a system that would require basically no training. You just tell the player to want something in the game world, and that’s all they need to do. Their brain will naturally respond with an event-related potential.
Spectrum: How exactly does that work to control actions in the VR game?
Alcaide: In the virtual world you see objects in front of you, and the game uses small animations [to highlight the objects in turn]. As soon as you choose an object and think, “I want to take action”—some people say “grab” in their mind, others don’t even do that—you take action with that object. So you might telekinetically bring an object to yourself and throw it at the robot.
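Neurable hasn’t published its algorithm, but the highlight-and-respond scheme Alcaide describes resembles a classic event-related-potential selection paradigm: each candidate object is briefly animated, an EEG epoch time-locked to each highlight is scored, and the object whose highlights evoke the strongest response is taken as the intended target. The sketch below illustrates that general idea on synthetic data; the object names, scoring window, and signal parameters are all assumptions for the demo, not Neurable’s implementation.

```python
import numpy as np

FS = 256                      # assumed EEG sampling rate (Hz)
EPOCH_S = 0.8                 # epoch length after each highlight (seconds)
N_SAMPLES = int(FS * EPOCH_S)

def erp_score(epoch):
    """Score a 1-D epoch by its mean amplitude in the 250-450 ms window,
    where a P300-style event-related potential typically peaks."""
    lo, hi = int(0.25 * FS), int(0.45 * FS)
    return epoch[lo:hi].mean()

def select_object(epochs_per_object):
    """epochs_per_object: dict mapping object name -> list of 1-D epochs,
    each time-locked to a highlight of that object.
    Returns the object whose highlights evoked the strongest average response."""
    scores = {obj: np.mean([erp_score(e) for e in eps])
              for obj, eps in epochs_per_object.items()}
    return max(scores, key=scores.get)

# --- Synthetic demo: only the attended object carries an ERP-like bump. ---
rng = np.random.default_rng(0)
t = np.arange(N_SAMPLES) / FS
p300 = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # bump near 300 ms

def make_epoch(attended):
    noise = rng.normal(0.0, 1.0, N_SAMPLES)
    return noise + (p300 if attended else 0.0)

epochs = {
    "block":  [make_epoch(True)  for _ in range(10)],   # user attends the block
    "ball":   [make_epoch(False) for _ in range(10)],
    "button": [make_epoch(False) for _ in range(10)],
}
print(select_object(epochs))  # → block
```

Averaging several highlight-locked epochs per object is what makes this robust: the noise cancels while the stimulus-locked response adds up, which is why no per-user training is needed beyond telling the player to want the object.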
[The video clip below shows a user learning how to use the Neurable system to select objects in the game. His complete write-up of the experience is here.]
Spectrum: Are you working with other VR hardware platforms?
Alcaide: We’re hardware agnostic; our technology is all in the machine learning for the BCI software. We only have a strap system for the HTC Vive right now. But a developer could make their own for Oculus or HoloLens.
Spectrum: Why did you decide to make a BCI system for virtual reality and augmented reality?
Alcaide: Every major computing platform has entered a stage of rapid evolution once it got the right interface. Computers were around for a while, but it wasn’t until the graphical user interface and computer mouse came about that they became ubiquitous. When Apple integrated the touchscreen into the smartphone, smartphones became ubiquitous. What VR lacks right now is a hands-free mouse, and we’ve essentially created a brain mouse. We’re going to be the interaction method that allows for ubiquitous VR and AR.
Spectrum: Do you imagine your tech being used primarily in VR gaming, or also for other VR applications?
Alcaide: Gaming is the most exciting application. But the most valuable one is as a UX/UI platform. Right now, if you try to do anything like typing or dealing with high density data in virtual reality, it doesn’t work well. But BCI will allow VR to become that computing platform.
Spectrum: You mentioned that people have been disappointed by BCI tech before. How do you convince them that this time it’s different?
Alcaide: That’s really easy, and I only say it because we’ve already done it. We wouldn’t have been able to raise a penny if our technology didn’t work well. We raised our initial $2 million by putting an EEG headset on people’s heads, [putting them in front of a computer screen], and saying, “Drive this car around with your mind and see how easy it is.” And now we’re doing it with VR headsets. We can consistently provide quality that is leaps beyond what people expect. In time, new BCI systems are going to step out of the crowd of disappointments that currently exists.
Spectrum: What are your next steps?
Alcaide: We’re focused on getting our tech into existing headsets, targeting arcades to sell our technology to, and working with developers to make amazing content.