On the first day of the WebExpo I attended a talk on accessibility that really made me stop and think not just about design in general, but specifically about my own research topic on EV charging stations. The session started by showing the common issues people with disabilities face in daily life when interacting with digital interfaces. Then the presenters (including three people with real-life impairments) gave us a deep look into their world.
One of the speakers was visually impaired, with only 1% vision. Another used a wheelchair, and a third had a chronic condition, diabetes. Hearing them speak about their everyday struggles with things most of us take for granted, like picking up a package from a parcel pickup station or using a touchscreen, was eye-opening. It made me realize how exclusive some of our current designs still are.

One key problem they highlighted was the rise of touchscreen-only interfaces. These give no tactile feedback and are often completely inaccessible to blind users. As a solution, they showed us a great concept: when a user holds their finger on the screen longer, a voice (through text-to-speech) reads aloud what the button does. This gives blind or visually impaired users the confidence to use touch interfaces, especially when there are no physical buttons or guidance cues.

They mentioned the use of the Web Speech API, which made the solution sound very practical and implementable. What I found really interesting was how this solution could relate to my own research on EV charging stations. Right now, many charging stations already have touch displays. But what happens if a blind passenger, maybe not the driver, wants to start the charging process? Or what if we think further into the future, where self-driving cars are common, and blind or wheelchair users are traveling alone?
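Out of curiosity, I tried sketching how such a long-press-to-speak interaction could look in the browser. This is only a minimal sketch based on my understanding of the idea: the 600 ms threshold, the `attachLongPressSpeech` helper, and the use of `aria-label` as the spoken text are my own assumptions, not details from the talk.

```javascript
// Illustrative sketch: long press on a button reads its label aloud
// via the Web Speech API. Threshold and wiring are assumptions.
const LONG_PRESS_MS = 600;

// Pure helper: decide whether a press counts as a long press.
function isLongPress(pressedAtMs, releasedAtMs, thresholdMs = LONG_PRESS_MS) {
  return releasedAtMs - pressedAtMs >= thresholdMs;
}

// Speak a label with text-to-speech (no-op outside the browser).
function announce(label) {
  if (typeof speechSynthesis !== "undefined") {
    speechSynthesis.cancel(); // stop any ongoing speech first
    speechSynthesis.speak(new SpeechSynthesisUtterance(label));
  }
}

// Wire up a button: holding it past the threshold announces its
// accessible label instead of activating it.
function attachLongPressSpeech(button) {
  let timer = null;
  button.addEventListener("pointerdown", () => {
    const label = button.getAttribute("aria-label") || button.textContent;
    timer = setTimeout(() => announce(label), LONG_PRESS_MS);
  });
  button.addEventListener("pointerup", () => clearTimeout(timer));
  button.addEventListener("pointercancel", () => clearTimeout(timer));
}
```

Even as a rough sketch, it shows how little extra code this kind of accessibility feature might require, which makes the "it's too expensive" argument hard to defend.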
This made me realize: accessibility shouldn’t be an “extra”, it should be part of the core design, especially for public infrastructure. I was also thinking about how stakeholders or companies sometimes don’t believe accessibility is needed because they assume disabled people are not part of their target audience. This is a dangerous assumption. Everyone deserves access.
Regarding the text-to-speech interface, I also asked myself: “How do visually impaired people even know that a product has a long-press text-to-speech function?” I need to write to the speakers about this, because they didn’t mention it.
The talk has truly influenced how I think about my EV charging station prototype. I now feel it’s essential to at least consider how someone with limited sight or mobility might interact with the interface. Whether that means adding text-to-speech, voice control, or rethinking the flow entirely, accessibility should be part of the process.
I’m also planning to write to the speaker to ask some follow-up questions. It’s clear to me now: accessible UX is not just nice to have, it’s a necessity for a more inclusive future.