No words, no gestures, just your brain as control pad

The Straits Times - May 13, 2013

RECENTLY, engineers sniffing around the programming code for Google Glass found hidden examples of ways that people might interact with the wearable computers without having to say a word. Among them, a user could nod to turn the glasses on or off. A single wink might tell the glasses to take a picture.

But don't expect these gestures to be necessary for long. Soon, we might interact with our smartphones and computers by using our minds. In a couple of years, we could be turning on the lights at home just by thinking about it. Farther into the future, your robot assistant will appear by your side with a glass of lemonade because it knows you're thirsty.

Researchers in Samsung's Emerging Technology Lab are testing tablets that can be controlled by your brain, using a cap that resembles a ski hat studded with monitoring electrodes, the MIT Technology Review, the science and technology journal of the Massachusetts Institute of Technology, reported last month.

The technology, often called a brain computer interface, was conceived to enable people with paralysis and other disabilities to interact with computers or control robotic arms, all by simply thinking about such actions. Before long, these technologies could well be in consumer electronics, too.

Some crude brain-reading products already exist, letting people play easy games or move a mouse around a screen. Car manufacturers are exploring technologies packed into the back of the seat that detect when people fall asleep while driving and rattle the steering wheel to awaken them.

But the products commercially available today will soon look archaic. "The current brain technologies are like trying to listen to a conversation in a football stadium from a blimp," said Mr John Donoghue, a neuroscientist and director of the Brown Institute for Brain Science. "To really be able to understand what is going on with the brain today, you need to surgically implant an array of sensors into the brain." In other words, to gain access to the brain, for now you still need a chip in your head.

Last year, a project called BrainGate pioneered by Mr Donoghue enabled two people with full paralysis to use a robotic arm with a computer responding to their brain activity. One woman, who had not used her arms in 15 years, could grasp a bottle of coffee, serve herself a drink and then return the bottle to a table. All done by imagining the robotic arm's movements.

But that chip in the head could soon vanish, as scientists say we are poised to gain a much greater understanding of the brain and, in turn, technologies that empower brain computer interfaces. The Brain Activity Map project, a decade-long research initiative announced by the Obama administration this year, aims to build a comprehensive map of the brain.

Ms Miyoung Chun, a molecular biologist and vice-president for science programmes at the Kavli Foundation, is working on the project and although she said it would take a decade to completely map the brain, companies would be able to build new kinds of brain computer interface products within two years. "The Brain Activity Map will give hardware companies a lot of new tools that will change how we use smartphones and tablets," she said. "It will revolutionise everything from robotic implants and neural prosthetics, to remote controls, which could be history in the foreseeable future when you can change your television channel by thinking about it."

There are some fears to be addressed. On the website for Muse, a consumer brain-sensing headband, an FAQ is devoted to convincing customers that the device cannot siphon thoughts from people's minds.

Although we won't be flying planes with our minds any time soon, surfing the Web on our smartphones might be closer.

Mr Donoghue said one of the current techniques used to read people's brains is called P300, in which a computer can determine which letter of the alphabet someone is thinking about based on the area of the brain activated when he sees a screen full of letters.
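In a typical P300 "speller" of the kind Mr Donoghue describes, letters are laid out in a grid whose rows and columns are flashed in turn; the letter the user is attending to sits at the intersection of the row and column that evoke the strongest averaged brain response. A minimal sketch of that selection step, using simulated response scores rather than real EEG (the grid layout, function names and values here are illustrative, not from the article):

```python
import numpy as np

# Illustrative 6x6 P300-speller grid: rows and columns are flashed in turn,
# and the attended letter is found at the intersection of the row and
# column whose flashes evoke the largest averaged response.
GRID = np.array([list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
                 list("STUVWX"), list("YZ1234"), list("56789_")])

def pick_letter(row_scores, col_scores):
    """Return the letter at the strongest row/column intersection.

    row_scores, col_scores: one averaged response amplitude per flashed
    row/column (e.g. mean EEG amplitude ~300 ms after each flash).
    """
    r = int(np.argmax(row_scores))
    c = int(np.argmax(col_scores))
    return GRID[r, c]

rng = np.random.default_rng(0)
# Simulate noisy averaged responses; the user attends to row 2, column 3.
rows = rng.normal(0.0, 0.1, 6); rows[2] += 1.0
cols = rng.normal(0.0, 0.1, 6); cols[3] += 1.0
print(pick_letter(rows, cols))  # prints "P" for this simulated attention target
```

Real systems average many flash repetitions before scoring, precisely because, as Mr Donoghue notes, individual brain signals are buried in noise.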

But even when advances in brain-reading technologies speed up, there will be new challenges, as scientists will have to determine if the person wants to search the Web for something in particular, or if he is just thinking about a random topic. "Just because I'm thinking about a steak medium-rare at a restaurant doesn't mean I actually want that for dinner," Mr Donoghue said.

"Just like Google glasses, which will have to know if you're blinking because there is something in your eye or if you actually want to take a picture," brain computer interfaces will need to know if you're just thinking about that steak or really want to order it.
