May 2004

National Science Foundation

'Heads-up' display lives up to its name

Student-designed device helps the visually impaired avoid hazards, day and night



The head-mounted components of the Wearable Low Vision Aid. Shown are the camera with a ring of IR LEDs (left) and the head mounted display (right).
Credit: Human Interface Technology Laboratory at the University of Washington.

ARLINGTON, Va. -- Using a common laptop computer and a sophisticated head-mounted projection device, students at the University of Washington (UW) have created a system to help people with poor vision navigate around stationary objects.

The Wearable Low Vision Aid (WLVA) is the first portable device to draw attention to obstacles using an illuminated, vibrating crystal that projects a warning icon, a raster image much like a television's, onto the user's retina. The system was built entirely by graduate and undergraduate students over the past four years under the direction of Eric Seibel, research assistant professor of mechanical engineering at the Human Interface Technology Laboratory at UW. The team will unveil the latest prototype on May 27 at the annual Society for Information Display conference in Seattle, Wash.

Cheap and portable, the prototype consists of a backpack (containing the computer) connected to an imaging and display system mounted on a pair of glasses. The imaging system contains a ring of 24 infrared light-emitting diodes (LEDs) and a camera. The diodes fire periodically while the camera collects infrared video of the user's field of view.

The students created custom software to compare, in real time, the diode-illuminated scene with the ambient scene. Closer objects reflect more light than do distant objects; if the closer objects remain in view and grow in size, a collision is imminent. The WLVA recognizes the danger and sends a signal to the computer, which determines the location and type of object and triggers the raster display.
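The article does not describe the team's actual code, but the comparison it outlines can be sketched in a few lines of Python with NumPy. The thresholds, function names, and the 20-percent growth criterion below are illustrative assumptions, not details from the WLVA software:

```python
import numpy as np

# Illustrative values only -- the real system's tuning is not described in the article.
REFLECTANCE_THRESHOLD = 30   # minimum brightness gain under IR illumination to count as "near"
GROWTH_RATIO = 1.2           # nearby region must grow ~20% between frames to flag a collision

def near_object_mask(ir_frame, ambient_frame):
    """Pixels that brighten sharply when the IR LEDs fire are treated as nearby,
    since closer surfaces reflect more of the LED light back to the camera."""
    gain = ir_frame.astype(np.int16) - ambient_frame.astype(np.int16)
    return gain > REFLECTANCE_THRESHOLD

def collision_imminent(prev_mask, curr_mask):
    """If the nearby region stays in view and keeps growing, assume an approaching obstacle."""
    prev_area = prev_mask.sum()
    curr_area = curr_mask.sum()
    overlap = np.logical_and(prev_mask, curr_mask).sum()
    still_in_view = prev_area > 0 and overlap > 0.5 * prev_area
    growing = prev_area > 0 and curr_area > GROWTH_RATIO * prev_area
    return still_in_view and growing
```

In this sketch, a warning would be raised whenever collision_imminent returns True for consecutive pairs of LED-lit and ambient frames, at which point the laptop would select an icon and drive the raster display described below.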

The display is a vibrating crystal fiber, a component made of parts costing less than $1, which connects to a laser diode. The fiber vibrates more than 1,000 times per second, covering its entire scan area 60 times per second. The fiber traces a series of horizontal lines to form a complete, yet translucent, "screen," while the laser fires only at certain points during the trace. Each laser pulse equates to a single pixel, and from the WLVA user's perspective, the final result is a familiar image.
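Taking the quoted figures at face value, the timing of the raster works out to roughly 17 horizontal lines per refresh. The horizontal pixel count below is a hypothetical value chosen for illustration; the article specifies only the fiber and refresh rates:

```python
# Figures quoted in the article.
fiber_sweeps_per_sec = 1000      # fiber vibrates "more than 1,000 times per second"
frames_per_sec = 60              # entire scan area covered 60 times per second

# Assumed value, not from the article.
pixels_per_line = 100            # hypothetical horizontal resolution of the icon raster

lines_per_frame = fiber_sweeps_per_sec / frames_per_sec        # ~17 lines per refresh
laser_pulses_per_sec = pixels_per_line * fiber_sweeps_per_sec  # one pulse per pixel drawn

print(f"{lines_per_frame:.0f} lines per frame, "
      f"{laser_pulses_per_sec:,} laser pulses per second")
```

A coarse raster of that size is enough for a simple warning icon, which is consistent with the system's purpose of flagging hazards rather than reproducing the full scene.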

Working directly with low-vision volunteers, the researchers are developing customized icons that represent common walking hazards. The computer detects different obstacles, such as a branch or trash can, and flashes specific icons onto the back of the eye to warn of danger.
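Conceptually, that step is a lookup from a detected obstacle class to the icon flashed onto the retina. The mapping below is entirely hypothetical; the actual icon set is still being designed with the low-vision volunteers and is not specified in the article:

```python
# Hypothetical obstacle-to-icon table for illustration only.
WARNING_ICONS = {
    "overhead_branch": "icon_branch.pbm",
    "trash_can": "icon_trashcan.pbm",
    "unknown": "icon_generic_hazard.pbm",
}

def icon_for(obstacle_class: str) -> str:
    """Choose which warning icon the raster display should flash."""
    return WARNING_ICONS.get(obstacle_class, WARNING_ICONS["unknown"])
```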

The next-generation WLVA will be much smaller. Ryland Bryant, a recently graduated master's degree student who is lead author on the Seattle conference paper, has already created a new circuit board that reduces the system weight by about half a pound. As part of these ongoing modifications, the researchers will incorporate a component from a micro-endoscope they are developing: an improved scanner that has higher resolution (over 50 times the original pixel count), yet is 10 times smaller.

Comments from Seibel regarding the research: "Kris Lawrence, a visually impaired university employee and consultant for this project, assisted us with the systems design approach. Kris talks with all of us and gives us guidance about how to improve the WLVA. Sometimes the interaction is eye-opening, like traveling to another country for the first time, giving us exposure to a different way of thinking about things." - Eric Seibel

"People's eyesight changes and can get worse with time. The next stage is to use laser light to 'tickle,' or directly stimulate, neurons in the eye and cause them to 'see' objects even if the photoreceptors are dead. Unfortunately, this far-out futuristic device, a means to mimic the function of diseased rods and cones, is only in the proposal writing stage." - Eric Seibel

"This is another set of eyes looking out for you. Because audio is already the key sense of detection for people with vision disabilities, we chose not to add any audible cues and only augment the user's impaired visual system with more easily seen laser light." - Eric Seibel

Additional Contact

University of Washington: Rob Harrill, 206-543-2580, [email protected]

Images: http://www.nsf.gov/od/lpa/newsroom/pr_all_img.cfm?ni=103

NSF Program Officer and expert on disabilities research: Gil Devey, 703-292-7943, [email protected]

NSF Award #9978888
Principal Investigator: Eric Seibel, 206-543-5075, [email protected]

Eric Seibel's homepage: http://www.hitl.washington.edu/people/eseibel/

Wearable Low Vision Aid webpage: http://www.hitl.washington.edu/research/wlva/

Human Interface Technology Laboratory at University of Washington: http://www.hitl.washington.edu

The National Science Foundation is an independent federal agency that supports fundamental research and education across all fields of science and engineering, with an annual budget of nearly $5.58 billion. National Science Foundation funds reach all 50 states through grants to nearly 2,000 universities and institutions. Each year, NSF receives about 40,000 competitive requests for funding, and makes about 11,000 new funding awards. The National Science Foundation also awards over $200 million in professional and service contracts yearly.

Receive official National Science Foundation news electronically through the e-mail delivery system, NSFnews. To subscribe, send an e-mail message to [email protected]. In the body of the message, type "subscribe nsfnews" and then type your name. (Ex.: "subscribe nsfnews John Smith")

Useful National Science Foundation Web Sites:
NSF Home Page: http://www.nsf.gov
News Highlights: http://www.nsf.gov/od/lpa
Newsroom: http://www.nsf.gov/od/lpa/news/media/start.htm
Science Statistics: http://www.nsf.gov/sbe/srs/stats.htm
Awards Searches: http://www.fastlane.nsf.gov/a6/A6Start.htm



