
    Sensor turns almost anything into a touch screen

    A sensing system called SAWSense takes advantage of acoustic waves traveling along the surface of an object to enable touch inputs to devices almost everywhere. Here, a table is used in place of a laptop’s trackpad. (Credit: Interactive Sensing and Computing Lab/U. Michigan)

    A new sensing system can turn couches, tables, sleeves, and the human body into a high-fidelity input device for computers.

    The sensing system repurposes technology from new bone-conduction microphones, known as Voice Pickup Units (VPUs), which detect only those acoustic waves that travel along the surface of objects.

    It works in noisy environments, along odd geometries such as toys and arms, and on soft fabrics such as clothing and furniture.

    Called SAWSense, for the surface acoustic waves it relies on, the system recognizes different inputs, such as taps, scratches, and swipes, with 97% accuracy. In one demonstration, the team used a normal table to replace a laptop’s trackpad.

    “This technology will enable you to treat, for example, the whole surface of your body like an interactive surface,” says Yasha Iravantchi, a doctoral candidate at the University of Michigan. “If you put the device on your wrist, you can do gestures on your own skin. We have preliminary findings that demonstrate this is entirely feasible.”

    Taps, swipes, and other gestures send acoustic waves along the surfaces of materials. The sensing system then classifies these waves with machine learning to turn all touch into a robust set of inputs. The researchers presented the system at the 2023 Conference on Human Factors in Computing Systems, where it received a best paper award.
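
    The article describes this pipeline only at a high level; as a rough illustration of the detect-then-classify idea (not the team’s actual code), the sketch below segments a stream of surface-wave samples into candidate touch events by thresholding short-time energy. The function name, sample rate, and threshold are hypothetical.

    ```python
    import numpy as np

    def detect_events(signal, sr=48_000, frame_ms=10, threshold=5.0):
        """Split a 1-D surface-wave recording into candidate touch events.

        A frame is flagged as active when its short-time energy exceeds
        `threshold` times the median frame energy (an illustrative heuristic).
        Returns (start, end) sample indices of contiguous active regions.
        """
        frame_len = int(sr * frame_ms / 1000)
        n_frames = len(signal) // frame_len
        frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
        energy = (frames ** 2).mean(axis=1)
        active = energy > threshold * np.median(energy)

        events, start = [], None
        for i, flag in enumerate(active):
            if flag and start is None:
                start = i * frame_len
            elif not flag and start is not None:
                events.append((start, i * frame_len))
                start = None
        if start is not None:
            events.append((start, n_frames * frame_len))
        return events
    ```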

    As more objects incorporate smart or connected technology, designers face the challenge of giving them intuitive input mechanisms. The result is often a clunky retrofit of input methods such as touch screens, or mechanical and capacitive buttons, Iravantchi says.

    Touch screens may be too costly to enable gesture inputs across large surfaces like counters and refrigerators, while buttons only allow one kind of input at predefined locations.

    Past approaches to overcome these limitations have included the use of microphones and cameras for audio- and gesture-based inputs, but the authors say techniques like these have limited practicality in the real world.

    “When there’s a lot of background noise, or something comes between the user and the camera, audio and visual gesture inputs don’t work well,” Iravantchi says.

    To overcome these limitations, the sensors powering SAWSense are housed in a hermetically sealed chamber that completely blocks even very loud ambient noise. The only entryway is a mass-spring system that conducts surface acoustic waves into the housing while keeping the sensor isolated from airborne sound in the surrounding environment.

    When combined with the team’s signal processing software, which generates features from the data before feeding it into the machine learning model, the system can record and classify the events along an object’s surface.
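
    The article does not spell out which features or model the team uses, so the sketch below shows one plausible version of that step under assumptions: each detected event is summarized as log band energies from its spectrum and fed to an off-the-shelf scikit-learn classifier. All function names and parameter choices are illustrative.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def band_energy_features(event, n_bands=32):
        """Summarize one event waveform as log energies in n_bands frequency bands."""
        spectrum = np.abs(np.fft.rfft(event * np.hanning(len(event)))) ** 2
        bands = np.array_split(spectrum, n_bands)
        return np.log1p([b.sum() for b in bands])

    def train_gesture_model(events, labels):
        """Fit a classifier on labeled event waveforms (labels like 'tap', 'swipe')."""
        X = np.stack([band_energy_features(e) for e in events])
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X, labels)
        return clf
    ```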

    “There are other ways you could detect vibrations or surface-acoustic waves, like piezo-electric sensors or accelerometers,” says Alanson Sample, associate professor of electrical engineering and computer science, “but they can’t capture the broad range of frequencies that we need to tell the difference between a swipe and a scratch, for instance.”

    The high fidelity of the VPUs allows SAWSense to identify a wide range of activities on a surface beyond user touch events. For instance, a VPU on a kitchen countertop can detect chopping, stirring, blending, or whisking, and can identify electronic devices in use, such as a blender or microwave.

    “VPUs do a good job of sensing activities and events happening in a well-defined area,” Iravantchi says. “This allows the functionality that comes with a smart object without the privacy concerns of a standard microphone that senses the whole room, for example.”

    When multiple VPUs are used in combination, SAWSense could enable more specific and sensitive inputs, especially those that require spatial resolution, such as the keys on a keyboard or the buttons on a remote.
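
    The article doesn’t say how signals from multiple VPUs would be combined; one common way to get position from two sensors is time difference of arrival, sketched below under the assumption of a roughly constant wave speed in the surface material. The wave speed, sensor spacing, and sample rate are placeholder values.

    ```python
    import numpy as np

    def tdoa_position_1d(sig_a, sig_b, sr=48_000, wave_speed=500.0, spacing=0.3):
        """Estimate where a tap landed on the line between two VPUs (A and B).

        Cross-correlates the two recordings to find the arrival-time difference,
        then converts it to a position offset from the midpoint (positive values
        are closer to sensor B). wave_speed is in m/s, spacing in meters.
        """
        corr = np.correlate(sig_a, sig_b, mode="full")
        lag = np.argmax(corr) - (len(sig_b) - 1)   # >0 means A heard the tap later
        dt = lag / sr
        offset = 0.5 * wave_speed * dt             # tap is closer to B when A is later
        return float(np.clip(offset, -spacing / 2, spacing / 2))
    ```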

    In addition, the researchers are exploring the use of VPUs for medical sensing, including picking up delicate noises such as the sounds of joints and connective tissues as they move. The high-fidelity audio data VPUs provide could enable real-time analytics about a person’s health, Sample says.

    The research is partially funded by Meta Platforms Inc. The team has applied for patent protection with the assistance of University of Michigan Innovation Partnerships and is seeking partners to bring the technology to market.

    Source: Zachary Champion for University of Michigan


    Paint transforms walls into interactive touchpads

    (Credit: Getty Images)

    With a few applications of conductive paint and some electronics, researchers can create walls that sense human touch and detect gestures, as well as when appliances are in use.

    The researchers found that they could transform dumb walls into smart walls at relatively low cost—about $20 per square meter—using simple tools and techniques, such as a paint roller.

    These new capabilities might enable users to place or move light switches or other controls anywhere on a wall that’s convenient, or to control video games with gestures. By monitoring activity in the room, the system could adjust light levels when a TV is turned on or alert a user in another location when a laundry machine or electric kettle turns off.
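
    As a toy illustration of that kind of automation (not part of the published system), the snippet below maps hypothetical event labels from the wall’s classifier to actions; all names are made up for the example.

    ```python
    # Hypothetical mapping from sensed wall events to home-automation actions.
    ACTIONS = {
        "tv_on": lambda: print("Dimming lights"),
        "kettle_off": lambda: print("Notifying user: kettle is done"),
        "laundry_off": lambda: print("Notifying user: laundry is done"),
    }

    def handle_event(event_name):
        """Run the configured action for a classified event, if one exists."""
        action = ACTIONS.get(event_name)
        if action is not None:
            action()

    handle_event("tv_on")  # e.g., emitted when the EM sensor recognizes the TV
    ```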

    Researchers at CMU and Disney Research used simple tools and techniques to transform dumb walls into smart ones. (Credit: Carnegie Mellon)

    “Walls are usually the largest surface area in a room, yet we don’t make much use of them other than to separate spaces, and perhaps hold up pictures and shelves,” says Chris Harrison, assistant professor in Carnegie Mellon University’s Human-Computer Interaction Institute (HCII). “As the internet of things and ubiquitous computing become reality, it is tempting to think that walls can become active parts of our living and work environments.”

    Yang Zhang, a PhD student in the HCII, will present a research paper on this sensing approach, called Wall++, at CHI 2018, the Conference on Human Factors in Computing Systems.

    The researchers found that they could use conductive paint to create electrodes across the surface of a wall, enabling it to act both as a touchpad that tracks users’ touch and as an electromagnetic sensor that detects and tracks electrical devices and appliances.

    “Walls are large, so we knew that whatever technique we invented for smart walls would have to be low cost,” Zhang says. He and his colleagues thus dispensed with expensive paints, such as those containing silver, and chose a water-based paint containing nickel.

    They also wanted the special coating to be easy to apply with simple tools and no special skills. Using painter’s tape, they laid out a cross-hatched pattern on the wall, forming a grid of diamonds, which testing showed was the most effective electrode pattern. After applying two coats of conductive paint with a roller, they removed the tape and connected the electrodes. They then finished the wall with a top coat of standard latex paint to improve durability and hide the electrodes.


    The electrode wall can operate in two modes: capacitive sensing and electromagnetic (EM) sensing. In capacitive sensing mode, the wall functions like any other capacitive touchpad: when a person touches the wall, the touch distorts the wall’s electrostatic field at that point. In EM sensing mode, the electrodes can detect the distinctive electromagnetic signatures of electrical or electronic devices, enabling the system to identify the devices and their locations.

    Similarly, if a person is wearing a device that emits an EM signature, the system can track the location of that person, Zhang says.
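
    As a rough sketch of the capacitive mode described above (not the Wall++ implementation itself), the code below locates a touch on a row/column electrode grid, assuming some driver supplies per-intersection capacitance readings; the interface and threshold are hypothetical.

    ```python
    import numpy as np

    def locate_touch(cap_grid, baseline, threshold=0.15):
        """Find the strongest touch on a row/column electrode grid.

        cap_grid and baseline are 2-D arrays with one capacitance reading per
        row/column intersection. A finger shifts the local capacitance, so we
        look for the largest deviation from the no-touch baseline.
        Returns (row, col) of the touch, or None if nothing exceeds threshold.
        """
        delta = np.abs(cap_grid - baseline)
        if delta.max() < threshold:
            return None
        return tuple(np.unravel_index(np.argmax(delta), delta.shape))
    ```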

    Wall++ hasn’t been optimized for energy consumption, Zhang says, but he estimates the wall-sized electrodes consume about as much power as a standard touch screen.

    Additional researchers contributing to the work are from Carnegie Mellon University and Disney Research.

    Source: Carnegie Mellon University