[Photo: a blind man sitting in a park, petting his service dogs]

AI Delivering Autonomy for People with Blindness

David Gibson

I don't know about you, but my AI-driven feeds are crammed with AI stories. Some frightening. Some exciting and mind-expanding, as is this one.

Keywords: AI, AT, Autonomy.

I recently attended the CSUN Assistive Technology Conference in oh-so-lovely Anaheim. The conference attracts leading academics, technologists, entrepreneurs, and accessibility advocates. Many canes and lots of guide dogs.

While my primary purpose in attending was to learn about current best practices in online accessibility related to what we do (mobile, SaaS, and website accessibility auditing and consulting), what really grabbed me were the tools designed to provide greater autonomy and safety for people with visual disabilities.

Since I last attended CSUN three years ago, my mother has developed macular degeneration, which, combined with cataracts, has dramatically reduced her vision. So this year, instead of navigating the show through a lens focused on online accessibility, I found myself seeking out tools to help my mother. I can report that I found it reassuring to see so many brilliant and driven people creating amazing devices that will enable my mom and others to live more autonomous lives.

One such person is my new friend Heather. She works for a friendly competitor, AccessibleWeb, just down the road from us in Burlington, VT. We flew out together, and Heather, who has blindness, turned me on to one of her key tools: Aira.

Remote Human Guides via App: Aira

Aira provides live, on-demand visual interpreter services for people who are blind or have low vision. I was really intrigued and sat in on their session. Their app connects customers, whom they refer to as Explorers, with a live Agent who interprets what the Explorer shows them through their phone's lens. This enables users to independently get assistance with things like shopping, navigating, and common professional and household tasks that would otherwise require asking for help. During the demo, Janine opened the app, connected with an Agent, and uploaded a travel receipt that she needed to submit for reimbursement. Within two minutes the instructions were conveyed, and she had back a cropped image of the receipt to submit.

Great, but not exactly cheap, and not especially scalable. 

 

AI Virtual Guides: Envision

Where Aira relies on live human Agents, Envision leverages GPT-4 AI to interpret what's seen through either the phone's lens or the headset. The AI can interpret objects and surroundings, and can read articles, packaging, and signs in over 60 languages.

Envision adds a headset, reminiscent of Google Glass, for hands-free assistance alongside their app (which can also integrate with Aira). When you're juggling a guide dog or a cane, hands-free is key.

The glasses are fantastic, but it was the AI and the app that were really eye-opening. In the Envision demo at CSUN, their CTO, Karthik Kannan, grabbed a menu from the hotel restaurant, held it up, and asked, "Show me the vegetarian options." The menu lacked a veggie section or any indicators of which dishes were vegetarian, yet the AI was able to interpret each dish, consider its ingredients, and guide the user through the dishes without meat.
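
For the technically curious, here's a rough sketch of what that kind of interaction looks like in principle. This is not Envision's actual pipeline (which isn't public); it simply sends a photo and a question to a GPT-4-class vision model using the standard OpenAI Python client, and the model name, prompt, and image file are placeholders of my own:

```python
# Rough sketch only: photo + question in, spoken-ready answer out.
# Not Envision's implementation; model, prompt, and file name are placeholders.
import base64
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

with open("menu.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Which dishes on this menu are vegetarian?"},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }],
)

print(response.choices[0].message.content)  # an app would read this aloud via text-to-speech
```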

Envision also has an older video demonstrating its ability to read labels in a different language.

There are a number of other virtual visual interpreters as well, although I didn't see these at CSUN. Apple, Google, and Microsoft each have their own, but leveling up to GPT-4 appears to give Envision deeper interpretive and interactive capabilities.

 

Hybrid Assistant (Human or AI): Be My Eyes

Be My Eyes is a hybrid, offering both human and GPT-4 AI guides. You can use the AI to answer basic questions, with the option to pull in a human guide only when needed, saving on that cost. Google and Microsoft have both partnered with Be My Eyes.
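
Conceptually, that hybrid is an "AI first, human on demand" fallback. Here's a toy sketch of the pattern; the function names, confidence score, and threshold are invented for illustration and are not Be My Eyes' actual API:

```python
# Toy illustration of an "AI first, human on demand" flow.
# ask_ai() and connect_to_volunteer() are invented stand-ins, not Be My Eyes' API.

def ask_ai(image_path: str) -> tuple[str, float]:
    """Placeholder: send the photo to a vision model, return (answer, confidence)."""
    return "It looks like a can of chopped tomatoes.", 0.62

def connect_to_volunteer(image_path: str) -> str:
    """Placeholder: open a live call with a sighted volunteer."""
    return "It's a 400 g can of chopped tomatoes, best before 2026."

def describe(image_path: str, min_confidence: float = 0.7) -> str:
    answer, confidence = ask_ai(image_path)         # free AI answer first
    if confidence < min_confidence:                 # escalate only when the AI is unsure
        answer = connect_to_volunteer(image_path)   # human guide only when needed
    return answer

print(describe("pantry_item.jpg"))
```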

Be My Eyes is also doing a good job of harnessing current marketing methods. Marketing is where I see most of these companies trailing, but Be My Eyes is tapping influencers within the community to get the word out.

Most exciting is that, once out of beta, the Be My Eyes AI app will be FREE!

 

Autonomy Tipping Point

The baby boomers are our first digital seniors, and they're converging with the disability community to form quite an attractive market for product developers. The market and the demand are there, and so is the technology. So I think we can expect some really exciting advances in products that will greatly expand and enhance the world for people with disabilities and provide the autonomy everyone deserves.

 

Photo credit: Gustavo Fring on Pexels