
Last week I attended the first in a series of events RNIB plan to run, looking at important areas of society and their impact on people with visual impairments. The first event was about AI and how it can create a better world for people with sight loss.

Daphne Mavroudi-Chocholi, Managing Director of RNIB Enterprises, introduced the event and talked about how RNIB want to see AI create a more equitable world for blind people. Technology needs to be for everyone. RNIB want to be involved in the AI conversation to help make sure there are positive outcomes.

[Photo: RNIB offices in London, the Grimaldi building, a large brick building with a white, classical frontage and the RNIB "See differently" logo above the front door.]
[Photo: poster with thick coloured lines and the text "Accessible design is creative design".]

The future of AI

The first speaker was Ant Morse, Head of Innovation at Virgin Media. Ant is a futurist and talked about the change AI is bringing. There are three drivers of change: technological, societal, and economic.

He gave the example of Virgin Media using glass technology to let older people near retirement remotely train beginners in the industry. Ant talked about the potential of AI digital assistants to really improve lives, and about ideas such as haptic wristbands that could help people navigate environments like the London Underground.

Ant also highlighted how everything is becoming a service, resulting in "life as a service", and raised the important question of what happens to our memories when we stop paying. Do we just lose everything? I reflected that enabling people to own their own data, and to access or export it in standard formats, would help, if commercial companies can be convinced to support that.

Ant finished by stating that society is the real driver of change, and challenged attendees to experiment with some of these tools to see what impact they could have on their work or lives.

AI accessibility tools from Google

Christopher Patnoe, Head of Accessibility and Disability Inclusion at Google, talked about the opportunity for AI to give equity and autonomy to people with sight loss. AI can help with things like missing alt text, unlabelled form fields, and empty buttons. Chrome currently supports AI translations, converting images in PDFs to text, and captions. While I advocate for websites and web apps being built properly in the first place, I can see how helpful browser tools that overcome poorly coded sites can be.
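To make that concrete, here's a minimal sketch (my own illustration, not Google's tooling) of the kind of gaps these assistants target. It scans HTML for images without alt text and form inputs without an associated label, the exact places where an AI could generate the missing description or name, using only Python's standard library:

```python
from html.parser import HTMLParser

class AccessibilityAudit(HTMLParser):
    """Collects two common accessibility gaps that AI tools could fill:
    images without alt text and inputs without a matching <label>."""

    def __init__(self):
        super().__init__()
        self.issues = []        # human-readable findings
        self.labelled = set()   # ids referenced by <label for="...">
        self.inputs = []        # ids of form inputs seen

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.issues.append(f"img missing alt: {attrs.get('src', '?')}")
        elif tag == "label" and "for" in attrs:
            self.labelled.add(attrs["for"])
        elif tag == "input" and attrs.get("type") != "hidden":
            self.inputs.append(attrs.get("id"))

    def report(self):
        """Findings so far, plus any inputs that never got a label."""
        return self.issues + [
            f"input missing label: id={i}"
            for i in self.inputs if i not in self.labelled
        ]

page = """
<img src="chart.png">
<label for="email">Email</label>
<input id="email" type="text">
<input id="phone" type="text">
"""

audit = AccessibilityAudit()
audit.feed(page)
for issue in audit.report():
    print(issue)
```

Detecting the gap is the easy part; the interesting work an AI assistant does is filling it, for example generating alt text from the image content itself.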

Christopher talked about Project Guideline, a research project exploring how AI can help blind people exercise independently. He showed a short clip from a video of Thomas Panek, who was diagnosed as legally blind at a young age, running unaided through a park. It was pretty amazing and moving to watch.

He next talked about Lookout on Android, designed to help people explore their environment through AI text and object detection. Another impressive tool, this one is already in public use. The information it gives people is fairly basic, but it works fast, and no data is sent to the cloud, so privacy is retained. Other tools exist, such as Be My AI, which has to send pictures to the cloud for analysis, so it's slower to use but more descriptive. The final demo was Guided Frame, which gives feedback to people taking selfies to help ensure they (and their friends) are in the frame even if they cannot see it. This seemed such a simple yet really useful idea, and it smashed any preconceptions you may have that people with sight loss don't want to take photos.

AI and health

Silvia Cerolini, Head of Digital Transformation & Innovation at Sanofi, talked about AI and health, and the ways AI is being used to speed up medical diagnosis. She gave examples of OCT scans of the back of the eye being analysed by AI to help detect Parkinson's two years early, and of large models being used to predict disease progression. Health is an area where AI seems to have made real progress, such as speeding up cancer treatments, though this is normally machine learning (ML) rather than the large language model (LLM) tech that has been making the news recently.

[Photo: three women and three men sitting on chairs in front of an audience at the AI panel event; the man on the right, in a dark blue jacket and glasses, has his hand up as he answers a question.]

Panel: how to ensure AI delivers a better future for people with sight loss

The event finished with a panel made up of the speakers, joined by Damon Rose (BBC News editor), Seema Flower (Blind Ambition), and Stacy Scott (Taylor & Francis), and hosted by Robin Spinks from RNIB.

There was a lot of talk around how AI can assist people with sight loss: can it make people more independent, or more time rich? The idea of a walking cane integrated with maps to help people navigate streets sounded amazing. There were also ideas around voice activation, and AI helping to make software and websites more accessible.

Damon works on the Access All podcast. He talked about how he's not allowed to use AI in his work, but he's interested in what AI is good for and how we learn about its risks. AI needs governance, and the privacy component is a challenge.

My reflections

The event was really interesting; the stories about how AI can help people live better lives were compelling and made me think differently about the technology's potential.

AI assistants making websites more accessible was an interesting area. Accessibility overlays are already used a lot on websites and come with a host of problems: they often cause technical issues, raise privacy concerns, and can mislead website owners into thinking they're all that's needed to make content accessible.

I firmly believe we need to build a better web that is accessible to all. However, that doesn't stop tools being used to help people navigate and use the web more effectively. The web is really a big mess of content, some great, some bad, and we should all be able to access it. If AI tools can help, that's a good thing. Web browsers have been fixing poor HTML on websites for many years, after all.
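That error tolerance is easy to demonstrate. The sketch below (using Python's standard html.parser as a rough stand-in for a browser's far more sophisticated parser) feeds in deliberately broken markup, with an unclosed tag, a stray closing tag, and an unquoted attribute, and still recovers the tag structure without raising an error:

```python
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    """Records every start tag the parser manages to recover."""

    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

# Deliberately broken HTML: unclosed <b>, stray </i>, unquoted attribute.
broken = "<p>Hello <b>world</p></i><img src=chart.png>"

collector = TagCollector()
collector.feed(broken)
print(collector.tags)  # ['p', 'b', 'img'], and no exception raised
```

Real browsers go much further: the HTML standard defines exactly how to repair malformed markup so every browser recovers the same document tree. AI reading aids sit naturally on top of that same forgiving pipeline.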

I recently read the A List Apart article on Opportunities for AI in accessibility. In the article, Aaron Gustafson talks about areas where AI is useful, including voice, which was mentioned a lot at the RNIB event. He also noted the importance of diverse teams and data in training these models.

There is clearly a lot of potential for AI to help people with sight loss, but it's important that the governance and ethical side of AI is discussed robustly, and not just by the commercial companies developing this stuff. Things like privacy, copyright, bias, and how useful LLMs are for anything where facts matter are all big unresolved issues. I think more voices need to contribute to the discussion around AI and to testing these tools, and AI companies need to collaborate externally more rather than develop things in isolation. Getting influential charities like RNIB to join the conversation will help move this in the right direction.

Near the end, Anna Tylor, chair of the board of trustees at RNIB, said her ambition is for AI to be mainstream and useful for all. That would be a really positive outcome, but we'll need to work together to get there.