Devices Controlled by Brain Waves – E. Landau
Posted by: 胡卜凱

Brain-controlled devices may help paralyzed people

 

Elizabeth Landau, CNN, 10/17/12

 

(CNN) -- Wouldn't moving objects with your mind be fun? But the implications go deeper: for the millions of Americans who live with paralysis, mentally controlling artificial limbs and mobility devices would be a major step toward more independent living.

 

Melody Moore Jackson, director of the BrainLab at the Georgia Institute of Technology, is trying to make that happen.

 

Jackson started the lab in 1998 to look at methods of brain control that didn't involve surgery. At that time, she estimates, there were about five labs working on brain-computer interfaces. Now there are about 300.

 

The BrainLab was one of the first to demonstrate that a person can control a robotic arm and a wheelchair with brain signals, Jackson said.

 

"We can literally influence the wiring of the brain, rewiring the brain, so to speak, to allow them to make new neural connections, and hopefully to restore movement to a paralyzed arm," Jackson said.

 

About 6 million Americans live with paralysis, according to the Christopher & Dana Reeve Foundation.

 

A smaller subset in need of such technologies consists of patients with locked-in syndrome, a rare neurological disorder. These patients feel, think, and understand language, but cannot move or speak -- they are "prisoners in their own bodies," Jackson explained.

 

A famous example is Jean-Dominique Bauby, who became locked-in after a stroke, and wrote the memoir "The Diving Bell and the Butterfly" by blinking to indicate individual letters. Jackson wants to open up possibilities for people with locked-in syndrome to communicate and move.

 

There has been a lot of activity in brain-computer interface research aimed at helping such people.

 

Another pioneering research group in this area is the laboratory of Miguel Nicolelis at the Duke University Center for Neuroengineering. Nicolelis and colleagues have shown that a rhesus monkey in North Carolina could, using only its brain, control the walking patterns of a robot in Japan. In 2011, they got a monkey to move a virtual arm and feel sensations from it.

 

This team is leading the Walk Again Project, an international consortium of research centers dedicated to creating brain-computer interfaces to restore movement.

 

One technique that Jackson and colleagues use to harness brain signals is called functional near-infrared spectroscopy. It involves shining light into the brain and examining the corresponding blood-oxygen levels to gauge how much neural activity is taking place.

 

Light at a specific wavelength is beamed into the brain, and oxygenated hemoglobin in the blood absorbs some of that light. This allows scientists to pick up on small differences in the blood's oxygenation.
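
The underlying relationship is the modified Beer-Lambert law (background physics, not detailed in the article): the measured change in light attenuation at a given wavelength scales with the concentration change of the absorbing hemoglobin,

\[
\Delta A(\lambda) = \varepsilon(\lambda)\,\Delta c\,d\,\mathrm{DPF}(\lambda)
\]

where \varepsilon(\lambda) is the hemoglobin extinction coefficient, \Delta c the concentration change, d the source-detector separation, and \mathrm{DPF}(\lambda) a differential pathlength factor. Because oxygenated and deoxygenated hemoglobin absorb differently, measuring at two wavelengths allows solving for the change in each.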

 

For example, scientists can place a sensor over Broca's area, a part of the brain essential for language. This area is activated when you talk to yourself inside your head or count silently, which is called subvocal speech.

 

Scientists can use the associated oxygen levels to build a system that lets a person answer "yes" or "no" just by thinking: subvocal counting signals "yes," while silence or nonsense syllables signal "no."

 

The original hardware for a device that utilizes this technique was developed by Hitachi, and it allows a person with locked-in syndrome to say "yes" or "no," Jackson said.
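
To make the yes/no scheme concrete, here is a minimal sketch in Python (hypothetical code, assuming a preprocessed oxygenation signal from a sensor over Broca's area; the threshold is illustrative, not the BrainLab's or Hitachi's actual method):

```python
import numpy as np

def classify_answer(answer_window: np.ndarray, baseline: np.ndarray,
                    threshold: float = 2.0) -> str:
    """Return "yes" if oxygenation over Broca's area rises well above the
    resting baseline (subvocal counting), else "no" (silence or nonsense
    syllables). Inputs are 1-D arrays of relative oxygenated-hemoglobin
    concentration."""
    # Standardize the answer-window mean against the resting baseline.
    z = (answer_window.mean() - baseline.mean()) / (baseline.std() + 1e-9)
    return "yes" if z > threshold else "no"

# Synthetic demo: a clear oxygenation rise during the answer window
# reads as "yes".
rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, 200)      # resting measurements
answer = rng.normal(3.0, 1.0, 200)        # elevated oxygenation
print(classify_answer(answer, baseline))  # -> yes
```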

 

But Jackson wanted to make the learning process more engaging. Her group created a hot-air balloon video game in which the balloon's movement reflects the blood-oxygenation level. Multiple locked-in syndrome patients can compete with each other in this game.

 

"It's not necessarily just for fun," Jackson said. "We can actually say, 'Well, they got 70% of the obstacles correct, they were able to jump over the mountains or get through the wind.' And so it also allows us to collect data."

 

In the stroke rehabilitation arena, Jackson's group hopes to restore movement in people who have paralysis or partial paralysis in a limb.

Researchers are looking at a rehabilitation robot called an exoskeleton: a device a person sits in that moves limbs they could not otherwise move. The robot can detect the brain signal corresponding to a person thinking about moving an arm, and then move the arm.
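
The detection the article describes is commonly implemented by watching for "event-related desynchronization": when a person imagines moving a limb, power in the mu rhythm (roughly 8 to 12 Hz) over the sensorimotor cortex drops. A minimal sketch of that check in Python, with an assumed sampling rate and an illustrative threshold (not the lab's actual software):

```python
import numpy as np

FS = 250  # EEG sampling rate in Hz (assumed)

def mu_band_power(eeg_window: np.ndarray) -> float:
    """Average 8-12 Hz power of a single-channel EEG window."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / FS)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].mean()

def imagining_movement(eeg_window: np.ndarray, rest_power: float) -> bool:
    """Imagined movement suppresses mu power relative to rest."""
    return mu_band_power(eeg_window) < 0.5 * rest_power  # illustrative ratio
```

In a working system this check would run continuously inside the acquisition loop, and a positive result would trigger the exoskeleton's arm actuator.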

 

"What we're trying to do is make new neural connections from the brain to the arm," Jackson said.

 

The lab has also developed a wheelchair that a person can drive by using brain signals, rather than moving a joystick or pressing buttons.

 

Such brain-computer interfaces require that the user wear an EEG cap to measure brain signals, but setting one up is very complicated. Jackson hopes to make the technology simple enough for anyone to use at home.

 

"You can imagine how much faster the therapy would go if you were doing it all the time," she said.

 

http://edition.cnn.com/2012/10/17/health/brain-computer-interfaces

Replies
Recent developments in mind-reading technology – N. Bilton
Posted by: 胡卜凱

Disruptions: No Words, No Gestures, Just Your Brain as a Control Pad

Nick Bilton, 04/28/13

 

Last week, engineers sniffing around the programming code for Google Glass found hidden examples of ways that people might interact with the wearable computers without having to say a word. Among them, a user could nod to turn the glasses on or off. A single wink might tell the glasses to take a picture.

 

But don’t expect these gestures to be necessary for long. Soon, we might interact with our smartphones and computers simply by using our minds. In a couple of years, we could be turning on the lights at home just by thinking about it, or sending an e-mail from our smartphone without even pulling the device from our pocket. Farther into the future, your robot assistant will appear by your side with a glass of lemonade simply because it knows you are thirsty.

 

Researchers in Samsung’s Emerging Technology Lab are testing tablets that can be controlled by your brain, using a cap that resembles a ski hat studded with monitoring electrodes, the MIT Technology Review, the science and technology journal of the Massachusetts Institute of Technology, reported this month.

 

The technology, often called a brain computer interface, was conceived to enable people with paralysis and other disabilities to interact with computers or control robotic arms, all by simply thinking about such actions. Before long, these technologies could well be in consumer electronics, too.

 

Some crude brain-reading products already exist, letting people play easy games or move a mouse around a screen.

 

NeuroSky, a company based in San Jose, Calif., recently released a Bluetooth-enabled headset that can monitor slight changes in brain waves and allow people to play concentration-based games on computers and smartphones. These include a zombie-chasing game, archery and a game where you dodge bullets — all these apps use your mind as the joystick.

Another company, Emotiv, sells a headset that looks like a large alien hand and can read brain waves associated with thoughts, feelings and expressions. The device can be used to play Tetris-like games or search through Flickr photos by thinking about an emotion the person is feeling — like happy, or excited — rather than searching by keywords.

Muse, a lightweight, wireless headband, can engage with an app that “exercises the brain” by forcing people to concentrate on aspects of a screen, almost like taking your mind to the gym.

 

Car manufacturers are exploring technologies packed into the back of the seat that detect when people fall asleep while driving and rattle the steering wheel to awaken them.

 

But the products commercially available today will soon look archaic. “The current brain technologies are like trying to listen to a conversation in a football stadium from a blimp,” said John Donoghue, a neuroscientist and director of the Brown Institute for Brain Science. “To really be able to understand what is going on with the brain today you need to surgically implant an array of sensors into the brain.” In other words, to gain access to the brain, for now you still need a chip in your head.

 

Last year, a project called BrainGate, pioneered by Dr. Donoghue, enabled two people with full paralysis to use a robotic arm driven by a computer responding to their brain activity. One woman, who had not used her arms in 15 years, could grasp a bottle of coffee, serve herself a drink and then return the bottle to a table. All done by imagining the robotic arm’s movements.

 

But that chip inside the head could soon vanish, as scientists say we are poised to gain a much greater understanding of the brain and, in turn, of the technologies that power brain computer interfaces. An Obama administration initiative announced this year, the decade-long Brain Activity Map project, aims to build a comprehensive map of the brain.

 

Miyoung Chun, a molecular biologist and vice president for science programs at the Kavli Foundation, is working on the project. Although she said it would take a decade to completely map the brain, she predicted that companies would be able to build new kinds of brain computer interface products within two years.

 

“The Brain Activity Map will give hardware companies a lot of new tools that will change how we use smartphones and tablets,” Dr. Chun said. “It will revolutionize everything from robotic implants and neural prosthetics, to remote controls, which could be history in the foreseeable future when you can change your television channel by thinking about it.”

 

There are some fears to be addressed. On the Muse Web site, an F.A.Q. is devoted to convincing customers that the device cannot siphon thoughts from people’s minds.

 

These brain-reading technologies have been the stuff of science fiction for decades.

 

In the 1982 movie “Firefox,” Clint Eastwood plays a fighter pilot on a mission to the Soviet Union to steal a prototype fighter jet that can be controlled by a brain neurolink. But Mr. Eastwood has to think in Russian for the plane to work, and he almost dies when he cannot get the missiles to fire during a dogfight. (Don’t worry, he survives.)

 

Although we won’t be flying planes with our minds anytime soon, surfing the Web on our smartphones might be closer.

 

Dr. Donoghue of Brown said one of the current techniques used to read people’s brains is called P300, in which a computer can determine which letter of the alphabet someone is focusing on from the characteristic brain response evoked when that letter flashes on a screen full of letters. But even as brain-reading technologies advance, there will be new challenges: scientists will have to determine whether the person wants to search the Web for something in particular, or is just thinking about a random topic.
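
To make the P300 approach concrete, here is a minimal sketch (hypothetical code, not Dr. Donoghue's implementation), assuming rows and columns of a 6x6 letter grid flash in random order and the EEG around each flash has been cut into epochs:

```python
import numpy as np

# 6x6 letter grid typical of P300 spellers (layout assumed).
GRID = np.array([list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
                 list("STUVWX"), list("YZ1234"), list("56789_")])

def pick_letter(row_epochs, col_epochs, fs=250):
    """row_epochs[i] / col_epochs[j]: (n_trials x n_samples) EEG epochs
    time-locked to flashes of row i / column j."""
    window = slice(int(0.25 * fs), int(0.45 * fs))  # ~250-450 ms post-flash

    def score(epochs):
        # Mean amplitude in the P300 window, averaged across trials: the
        # attended row/column should show the strongest evoked response.
        return np.asarray(epochs).mean(axis=0)[window].mean()

    best_row = max(range(6), key=lambda i: score(row_epochs[i]))
    best_col = max(range(6), key=lambda j: score(col_epochs[j]))
    return GRID[best_row, best_col]
```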

 

“Just because I’m thinking about a steak medium-rare at a restaurant doesn’t mean I actually want that for dinner,” Dr. Donoghue said. “Just like Google glasses, which will have to know if you’re blinking because there is something in your eye or if you actually want to take a picture,” brain computer interfaces will need to know if you’re just thinking about that steak or really want to order it.

 

http://bits.blogs.nytimes.com/2013/04/28/disruptionsnowordsnogesturesjustyourbrainasacontrolpad/?partner=yahoofinance


