An Arizona State University lab aims to help the blind do something even the sighted can’t.
Scientists at the Center for Cognitive Ubiquitous Computing, or CUbiC, are developing a pair of gloves that, within seconds of a spoken or gestured command, would create a virtual object for the wearer to feel. Called a haptic interface, the gloves would allow the blind — who often see by touching — to use their sense of touch to see distant objects or ones that can’t be touched.
"Let’s say you’re in a museum and there is a 1,000-year-old bowl, you can’t really touch it, but you can see it," says Daniel Villanueva, a CUbiC research assistant. Wearing the haptic interface equipment would allow someone to feel the shape and texture of such an object without touching it or seeing it, he said.
"It’s kind of like virtual reality but through your hands," Villanueva said.
So while sighted people could only eye that 1,000-year-old bowl behind its glass case, a person who is visually impaired and wearing the gloves could feel its shape and texture.
Devices like the gloves exemplify cognitive, ubiquitous computing. The science involves using computer technology in everyday life to do everyday things.
The haptic interface gloves are part of a group of devices in CUbiC’s flagship project called iCARE, short for Information technology Centric Assistive and Rehabilitative Environment.
The idea is to create unobtrusive computing devices to help the visually impaired with aspects of life such as studying, recognizing friends and family, visiting a museum and shopping in a store. So, at CUbiC, it’s what you don’t see that puts the lab on the cutting edge.
"It’s one of those labs that’s more internal than visual because we work for the blind," says Terry Adams, the center’s coordinator.
Like the pair of sunglasses with a camera embedded in the nosepiece and a speaker in the strap — they look like regular Oakleys or Ray-Bans, but they’ll tell someone who is blind if a familiar person is approaching.
"It looks really, really simple, but behind this is a ton of technology," Villanueva said.
One day that technology, in the form of the iCARE Interaction Assistant sunglasses, could help the visually impaired not only recognize familiar people, but learn details like their emotional state and whether they’ve changed their haircut or hair color.
To people who are blind, those simple details can matter most in many everyday situations. Shopping in retail stores where merchandise is constantly rotated on shelves is daunting, and touching every breakable item can be an expensive endeavor.
CUbiC proposes using radio frequency identification tags on merchandise in place of barcodes, along with PalmPilot-like devices that read the tags to the shopper, said Terri Hedgpeth, a research professional at the lab.
"It allows the person who is blind or visually impaired to shop independently without sighted assistance," said Hedgpeth, who is blind.
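The RFID shopping idea can be sketched in a few lines: each item carries a tag ID, and a handheld reader looks the ID up and speaks the product description aloud. This is a minimal, hypothetical illustration — the tag IDs, catalog, and function names are invented, and a real device would query a store inventory system and drive a speech synthesizer rather than returning strings.

```python
# Hypothetical catalog mapping RFID tag IDs to product descriptions.
# In a deployed system, this lookup would hit the retailer's inventory
# database and the result would be spoken aloud to the shopper.
CATALOG = {
    "04A1B2C3": "Glass vase, 12 inches, $24.99",
    "04D4E5F6": "Ceramic bowl, blue, $14.50",
}

def describe_item(tag_id: str) -> str:
    """Return the spoken description for a scanned tag, or a fallback."""
    return CATALOG.get(tag_id, "Unknown item")

if __name__ == "__main__":
    # Simulate scanning one tagged item on the shelf.
    print(describe_item("04A1B2C3"))
```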
CUbiC has a mock store full of breakable stuff in its lab at the Brickyard on Mill in Tempe for testing its devices designed for retail situations. Hedgpeth said the lab may use the store to show retailers what can be done with assistive technology.
Perhaps the lab’s best-known and most complete device is one that is crucial on a university campus: the iCARE Reader, which makes it more convenient to read and study written material.
The tabletop version looks like an overhead projector with a mounted camera that transmits pages of books to a computer as the reader turns them.
Software translates the written word into the spoken, allowing a person who can’t see to read the book.
The software allows for highlighting and skipping passages, and the computer’s speaking rate can be sped up or slowed down.
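The reading controls just described — skipping past passages and adjusting the speaking rate — can be sketched as a small class. This is an invented illustration, not CUbiC’s software: the class and method names are hypothetical, and a real reader would send each passage to a speech engine rather than return it as text.

```python
class PassageReader:
    """Toy model of a talking-book reader with skip and rate controls."""

    def __init__(self, passages, words_per_minute=150):
        self.passages = passages          # text chunks to be read aloud
        self.position = 0                 # index of the current passage
        self.words_per_minute = words_per_minute

    def set_rate(self, words_per_minute):
        """Speed up or slow down the synthesized speech."""
        self.words_per_minute = words_per_minute

    def skip(self):
        """Jump past the current passage."""
        self.position = min(self.position + 1, len(self.passages) - 1)

    def read_current(self):
        """Return the text that would be sent to the speech engine."""
        return self.passages[self.position]

if __name__ == "__main__":
    reader = PassageReader(["Chapter 1 text ...", "Chapter 2 text ..."])
    reader.skip()          # listener skips ahead one passage
    reader.set_rate(220)   # listener speeds up the voice
    print(reader.read_current())
```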
Tabletop iCARE Readers have been installed at the Disability Resources building on ASU’s campus, at the Foundation for Blind Children in Phoenix and in the CUbiC lab.
A portable version of the iCARE Reader is now in testing at CUbiC.