Digital assistants are rapidly becoming indispensable parts of our daily lives. We have assistants like Siri on smartphones to answer our questions, and smart speakers like the Amazon Echo stand ready to deliver the latest news and weather, play our favorite music, or even read a story to children. But this power can also be applied to health issues. For example, there are many free Alexa skills designed to help users remember to take prescription medications on time.

One of the most interesting developments is a new skill for the Amazon Echo Show. This device includes a screen and a camera, which are useful for video chats and other functions. These same features can also serve vision-impaired users, however, thanks to the "Show and Tell" skill.

The user can hold up a pantry item in front of the screen and ask, "Alexa, what am I holding?" Using computer vision and machine learning, the system recognizes the object and tells the user what it is. This is extremely helpful in the kitchen, where it can be difficult to identify the contents of a can or package. The Echo Show can help users be more independent and confident, improving their quality of life.
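Show and Tell itself is built into the Echo Show rather than implemented by third parties, but Alexa skills in general reply to the user through the publicly documented Alexa Skills Kit JSON response format. The sketch below is purely illustrative: it assumes a hypothetical `recognized_label` string supplied by some image classifier, and shows how such a label could be wrapped in a minimal spoken response.

```python
import json


def build_alexa_response(label: str) -> dict:
    """Wrap a recognized item label in a minimal Alexa Skills Kit
    JSON response so the device speaks it back to the user."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "PlainText",
                "text": f"This looks like {label}.",
            },
            "shouldEndSession": True,
        },
    }


# Hypothetical label, e.g. produced by an image classifier
recognized_label = "a can of tomato soup"
print(json.dumps(build_alexa_response(recognized_label), indent=2))
```

The real Show and Tell pipeline (camera capture, model inference, confidence handling) is internal to Amazon; only the response envelope shown here reflects the documented skill interface.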

This is one more small step in using inexpensive digital technology and artificial intelligence to help individuals compensate for various types of physical or cognitive impairments. The end result will be more convenience in all sorts of daily activities.