Recently, I was reminded of something that often gets missed in formal conversations about accessibility. You can have the right systems in place, the right technology, and the right intentions, and still face a real-world barrier.
At Rix Inclusive Research, we are currently hosting an MA Social Work student on a 70-day placement. The student is visually impaired. They use a screen reader called NVDA (NonVisual Desktop Access) to access their laptop and complete placement-based tasks, including work on their portfolio. NVDA is a free, open-source screen reader that converts on-screen information into speech, enabling visually impaired users to navigate their desktop, documents, web browsers, and software independently using keyboard commands.
Like many visually impaired laptop users, the student also uses their own keyboard, which they connect to their laptop, not as a preference but as a practical accessibility tool. The keyboard's layout is familiar to them, with customised shortcuts to open software and navigate tabs and windows, which means they can use their laptop independently and without assistance. Then, one day, something small but significant happened. The student opened their laptop and there was no sound. For many people, it probably wouldn’t matter. For a visually impaired user, it can mean the device has effectively become inaccessible.
Sound is not an extra or a nice-to-have; it is the primary means through which the laptop communicates with a person who cannot see. The student told us that, normally, when the laptop turns on, there are audible cues, such as the start-up sound, followed by NVDA speaking aloud. This time, none of that happened.

What followed was a very real example of how accessibility can depend on everyday troubleshooting, not just infrastructure. On this particular day, three American interns, one from CAPA and two from AES, were working in the office alongside the student. They all immediately moved into problem-solving mode, each one approaching the issue from a different angle. One intern was familiarising themselves with NVDA, one was troubleshooting the Windows sound settings, and one, who was assisting the student by being their eyes, asked a thoughtful question: would the student like them to take over the laptop briefly to try troubleshooting directly? The student said yes. That decision mattered. It reduced the pressure on them to direct every step. The interns’ actions shifted the dynamic from “your equipment isn’t working” to “we’re solving this together”. In inclusive settings, I realised, this kind of consent-led support, offering help but not assuming it is needed, is essential.
While all this was going on, the student was on the phone to the RNIB helpline seeking IT guidance and support. This was the first time any of us had experienced IT support being provided over the phone to a person who cannot see their laptop. After the student had spoken with two very helpful and knowledgeable RNIB IT support personnel and carried out their suggested checks with no joy, one of the interns suggested playing a YouTube video. When the video was played, there was no sound. This showed that the problem wasn’t with the screen reader but with the laptop’s audio.
As time passed and our attempts to solve the problem weren’t working, the student’s frustration and anxiety understandably increased. It’s difficult enough to troubleshoot technology; it’s even harder when you can’t see what others are seeing, and when your usual tools for independence (sound and screen reader output) are unavailable. By this time, there were four of us engaged in troubleshooting: checking settings, testing outputs, trying different steps, searching support guidance, and cross-referencing advice. It took some time, and we can’t say with certainty which specific action resolved the problem. But eventually the sound returned, and NVDA resumed speaking. The student’s relief was immediate, and their independence returned with it.
There’s a temptation to end this story here with its happy ending. However, the learning gained from this experience was profound. This is the day-to-day reality of accessibility: institutions can have assistive software available, policies in place, and inclusive intentions on paper. Nevertheless, barriers still emerge in ordinary moments – a silent laptop, an update that changes settings, a device that behaves unpredictably. When that happens, the question becomes: who has the knowledge, time, and confidence to respond quickly and calmly? And how do we ensure that troubleshooting doesn’t become another hidden burden placed on a person with a sensory need or disability?
What I am taking from this experience is simple: accessibility is not a one-time provision; it is an ongoing practice. It lives in the small decisions: how we respond when assistive technology fails, and how we build teams that can learn from these realities together.
We are proud of the collaborative, respectful approach we saw from our interns Ali, Lobsang and Tara. They became active allies, not only offering support but also gaining insight into how quickly independence can be disrupted by something as basic as missing sound. And we are grateful to our MA student for navigating a difficult moment with persistence, clarity, and trust.
Kanchan Kerai
