A new smartphone case could soon help people with visual impairments, tremors, and spasms use touchscreens independently.
The case, dubbed BrushLens, could let users perceive, locate, and tap buttons and keys on the touchscreen menus now ubiquitous in restaurant kiosks, ATMs, and other public terminals.
“So many technologies around us require some assumptions about users’ abilities, but seemingly intuitive interactions can actually be challenging for people,” says study first author Chen Liang, a doctoral student in computer science and engineering at the University of Michigan.
“People have to be able to operate these inaccessible touchscreens in the world. Our goal is to make that technology accessible to everyone,” Liang says.
Liang works in the lab of Anhong Guo, assistant professor of computer science and engineering. Guo led the development of BrushLens with Alanson Sample, an associate professor in the same department.
Users can comb through a touchscreen interface by holding a phone connected to BrushLens against the touchscreen and dragging it across the screen. The phone's camera captures what is on the screen, and the phone's built-in screen reader reads the options aloud. Users then indicate their menu choice through the screen reader or an enlarged, easy-to-tap button in the BrushLens app.
When given a target, BrushLens divides the screen into a grid, then guides the user’s hand toward the section of the screen containing their menu choice by speaking the coordinates of both the target and the device. Once those coordinates overlap, pushbuttons or autoclickers on the underside of the phone case, depending on the model, tap the screen for the user.
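The guidance step described above can be illustrated with a minimal sketch. This is not the BrushLens codebase (which the article does not show); the function names, 3x3 grid size, and coordinate convention are assumptions chosen for illustration.

```python
# Hypothetical sketch of BrushLens-style grid guidance: map the target and
# the device to grid cells, speak both, and trigger a tap once they overlap.

def cell_of(x, y, screen_w, screen_h, rows=3, cols=3):
    """Map a point on the touchscreen to a (row, col) grid cell."""
    row = min(int(y / screen_h * rows), rows - 1)
    col = min(int(x / screen_w * cols), cols - 1)
    return row, col

def guidance(target_xy, device_xy, screen_w, screen_h):
    """Return a spoken prompt, or 'tap' once the cells overlap."""
    t = cell_of(*target_xy, screen_w, screen_h)
    d = cell_of(*device_xy, screen_w, screen_h)
    if t == d:
        return "tap"  # here the pushbutton/autoclicker would fire
    return (f"target at row {t[0] + 1}, column {t[1] + 1}; "
            f"you are at row {d[0] + 1}, column {d[1] + 1}")

# Example on a 1080x1920 screen: device far from the target, then on it.
print(guidance((900, 1700), (100, 200), 1080, 1920))
print(guidance((900, 1700), (950, 1800), 1080, 1920))  # -> tap
```

Announcing coarse grid cells rather than exact pixel positions is what lets the user home in by dragging, without needing fine motor precision.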
“The user doesn’t have to precisely locate where the button is and perform the touch gesture,” Liang says.
Ten study participants, six with visual impairments and four with tremors or spasms, tested the hardware and app.
“As a blind person, touchscreens are pretty much inaccessible to me unless I have some help or I can plug headphones into the kiosk,” says study participant Sam Rau. “Somebody else has to order for you, or they have to help you out with it. I don’t want to be in a situation where I always have to rely on the kindness of others.”
It took some time for Rau to figure BrushLens out, but once he became familiar with the device, he was excited by the tool’s potential.
“I thought about myself going into a Panera Bread and being able to order from the kiosk,” Rau says. “I could actually see myself accomplishing something that I otherwise thought impossible.”
Likewise, BrushLens worked as intended for users whose tremors or spasms cause them to make unwanted selections on touchscreens. For one participant with cerebral palsy, BrushLens improved their accuracy by nearly 74%.
The inventors of BrushLens recently applied for a patent with the help of Innovation Partnerships, the University of Michigan’s central hub for research commercialization. The team hopes to bring the product to users as an affordable phone accessory.
“The parts that we used are relatively affordable. Each clicker costs only $1,” Liang says. “The whole device is definitely under $50, and that’s a conservative estimate.”
The team plans to further streamline their design so that it easily fits in a pocket. Offloading the battery and processing to the phone, for example, could make the design cheaper and less bulky.
“It doesn’t have to be much more complex than a TV remote,” says coauthor Yasha Iravantchi, a doctoral student in computer science and engineering.
The companion app could also be improved by allowing users to directly interface with it via voice commands, Liang says.
Participants were enrolled in the trial study with the help of the Disability Network, the University of Michigan Council for Disability Concerns, and the James Weiland research group in the U-M biomedical engineering department. A Google Research Scholar Award funded the work.
Liang will demo BrushLens at the Association for Computing Machinery Symposium on User Interface Software and Technology in San Francisco.
Source: Derek Smith for University of Michigan