How ultrasound sensing makes Nest displays more accessible

December 10, 2019

Last year, I gave my 74-year-old father a Nest Hub for Christmas. Over the following months, I noticed he would often walk up to the device to read the information on the screen, because he couldn’t see it easily from across the room. I wondered if other people were having the same issue. 

My team at Google Nest and I started having conversations with the older adults in our lives who use our products, asking how they use their devices and observing how they interact with them. In the course of our research, we learned that one in three people over the age of 65 has a vision-reducing eye disease, on top of the millions of people of all ages who deal with some form of vision impairment. 

We wanted to create a better experience for people who have low vision, so we set out to make the display easy to read from any distance in a room without sacrificing the detailed information it can show up close. The result is a feature we call ultrasound sensing. 

We needed a sensing technology that could detect whether you were close to the device or far away, adjust what's on screen accordingly, and still protect people's privacy. Our engineers landed on one that was completely new to Google Assistant products but has been used in the animal kingdom for eons: echolocation. 

Animals with low vision, like bats and dolphins, use echolocation to understand and navigate their environments. Bats emit ultrasonic "chirps" and listen to how those chirps bounce off objects and travel back to them. In the same way, Nest Hub and Nest Hub Max emit inaudible sound waves to gauge your proximity to the device. If you're close, the screen shows more details and touch controls; when you're farther away, it changes to show only the most important information in larger text. 
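To make the idea concrete, here is a minimal sketch of the physics behind echo-based ranging, assuming a simple time-of-flight model: the device emits a pulse, times how long the echo takes to return, and converts that round trip into distance using the speed of sound. The names and numbers are illustrative, not Nest's implementation.

```python
# Minimal time-of-flight ranging sketch (illustrative, not Nest's code).
# A reflector's distance follows from the round-trip delay of a pulse:
#   distance = (speed_of_sound * round_trip_time) / 2

SPEED_OF_SOUND_M_S = 343.0  # dry air at roughly 20 °C

def distance_from_echo(round_trip_seconds: float) -> float:
    """Estimate distance in meters from an echo's round-trip delay."""
    return SPEED_OF_SOUND_M_S * round_trip_seconds / 2

# An echo returning after 17.5 ms implies a reflector about 3 m away.
print(f"{distance_from_echo(0.0175):.2f} m")  # -> 3.00 m
```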

Ultrasound sensing allows your display to show the most important information when you’re far away, like your total commute time, and show more detail as you get close to the device.
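In practice, that far/near behavior amounts to choosing a layout from an estimated distance. A hypothetical sketch of such logic (the thresholds here are assumptions, not Nest's values) might add hysteresis so the screen doesn't flicker when someone lingers near the boundary:

```python
# Hypothetical distance-to-layout switching with hysteresis.
# Thresholds are illustrative assumptions, not Nest's actual values.

NEAR_ENTER_M = 1.2  # switch to the detailed layout once closer than this
NEAR_EXIT_M = 1.5   # switch back to the far layout once farther than this

class DisplayMode:
    def __init__(self) -> None:
        self.near = False  # start in the far (large-text) layout

    def update(self, distance_m: float) -> str:
        # Using different enter/exit thresholds prevents flicker when a
        # person stands right at a single cutoff distance.
        if self.near and distance_m > NEAR_EXIT_M:
            self.near = False
        elif not self.near and distance_m < NEAR_ENTER_M:
            self.near = True
        return "detailed view + touch controls" if self.near else "large summary text"

mode = DisplayMode()
for d in [3.0, 1.4, 1.1, 1.3, 1.6]:
    print(f"{d:.1f} m -> {mode.update(d)}")
```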

To develop the right screen designs, the team tested varying text heights, contrast levels and information density, and measured how easily people could read what's on the screen. It was refreshing when, regardless of age or visual impairment, testers would make comments like, "It just feels easier to read." It turned out that designing for people with low vision improved the experience for everyone.

Testing ultrasound sensing during the design process.

What ultrasound sensing “sees” on a smart display.

Ultrasound sensing already works for timers, commute times and weather. And over the coming week, your devices will also begin to show reminders, appointments and alerts when you approach the display. Because it uses a low-resolution sensing technology, ultrasound sensing happens entirely on the device and can only detect large-scale motion (like a person moving); it can't identify who that person is.
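As a rough illustration of why sensing this coarse can notice motion without identifying anyone, imagine comparing successive low-resolution echo profiles (a handful of range bins) and flagging only large frame-to-frame changes. This is a hedged sketch under that assumption, not the on-device algorithm:

```python
# Illustrative sketch: flagging large-scale motion from a coarse echo
# profile (a few range bins). Not Nest's on-device algorithm; a signal
# this low-resolution can show that something big moved, but it carries
# far too little detail to identify a person.

MOTION_THRESHOLD = 0.5  # illustrative amount of total change to call motion

def large_scale_motion(prev_profile: list[float], cur_profile: list[float]) -> bool:
    """Flag motion when the summed change across range bins is large."""
    change = sum(abs(c - p) for p, c in zip(prev_profile, cur_profile))
    return change > MOTION_THRESHOLD

# Eight coarse range bins: a person stepping closer shifts echo energy
# between bins, while sensor noise barely changes the total.
still = [0.1, 0.1, 0.8, 0.2, 0.1, 0.0, 0.0, 0.0]
noisy = [0.1, 0.1, 0.8, 0.2, 0.1, 0.1, 0.0, 0.0]
moved = [0.1, 0.7, 0.2, 0.1, 0.1, 0.0, 0.0, 0.0]

print(large_scale_motion(still, noisy))  # False: just sensor noise
print(large_scale_motion(still, moved))  # True: large-scale motion
```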

After we built the ultrasound sensing feature, I tested it with my dad. As soon as I saw him reading his cooking timer on the screen from across the kitchen, I knew we’d made something that would make our devices even more helpful to more people. 
