by Florian Beijers
A Vision of Coding, Without Opening your Eyes
I’m a coder. I’m also blind. Blind as a bat, you might say. And I was born this way.
When I mention this to my fellow human beings — the ones who’ve never suffered any form of visual impairment — they usually ask one of the following questions:
- Then, how can you even read what I’m typing?
- Wow. How are you even able to code?
- Or, the crowd favorite — Do you dream?
I get these questions again and again. So let me answer these three questions in this blog post. I’ll try and sketch out an image for those of you who are curious about accessibility, and how blind people use computers to code, and to do the work of the 21st century.
How do you even know what I’m typing?
I like this question, because it allows me to immediately explain how blind people actually use computers.
A lot of people are under the impression that blind people require specially adapted computers in order to get anything done. Even some of my fellow Visually Impaired Persons (VIPs) tend to think this.
Well, let me debunk this myth right here and now. I am currently typing this on a normal Dell Inspiron 15R SE notebook, which can be bought in any store that sells (somewhat less recent) laptops. The machine runs Windows 8 (not my personal choice, but UEFI is too much of a pain to downgrade). All I did to adapt it was install an open-source screen reader called NVDA.
A screen reader, at its most basic level — wait for it — reads the screen. It speaks the textual content of the screen aloud in a synthesized, Siri-like text-to-speech voice. Screen readers also allow for the use of a braille display, a device with a line of refreshable braille cells that form letters according to the content currently in focus on the screen.
This is really all the adaptation a blind computer user needs. Using this program, I can do many things you probably wouldn’t imagine being able to do with your eyes closed, such as:
- Browsing the web using Firefox
- Writing up reports in Microsoft Word, then marking them up to conform to college professors’ stringent layout demands.
- Writing up snazzy blog posts like this one
- Recording, editing, mixing and publishing audio (My hobbies include singing and making music)
- Using audio production apps like Reaper, Goldwave, Audacity and Sonar
- Coding websites and applications using Eclipse, (the ironically named) Visual Studio, and good old Notepad++
The reason I’m naming all these mainstream technologies is to show you that I can use them just like people who aren’t ocularly challenged.
If you’re writing the next big application, with a stunning UI and a great workflow, I humbly ask you to consider accessibility as part of the equation. In this day and age, there’s really no reason not to use the UI toolkits available. It’s a lot easier than you may think. Yes, these include the Android Activities, iOS UIViews and HTML5 widgets you may be thinking of.
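To make that concrete, here is a minimal sketch of what “considering accessibility” can look like in desktop Java with Swing. The button and its labels are hypothetical examples of mine, not from the post, but the `javax.accessibility` API shown is the standard hook that screen readers reach through (via the Java Access Bridge):

```java
// A minimal sketch: giving assistive technology something textual to read.
// The button and its labels are hypothetical; AccessibleContext is standard Java.
import javax.accessibility.AccessibleContext;
import javax.swing.JButton;

public class AccessibleButtonDemo {
    public static void main(String[] args) {
        // An icon-only button: visually obvious, but silent to a screen
        // reader unless we attach textual metadata to it.
        JButton saveButton = new JButton();

        AccessibleContext ctx = saveButton.getAccessibleContext();
        ctx.setAccessibleName("Save");
        ctx.setAccessibleDescription("Saves the current document to disk");

        // A screen reader can now announce "Save" instead of nothing at all.
        System.out.println(ctx.getAccessibleName() + ": " + ctx.getAccessibleDescription());
    }
}
```

Two short lines of metadata are often the entire difference between a control I can use and one I cannot.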
I joined Free Code Camp a few weeks back and really loved the concept. I’ve been pursuing a degree in Computer Science for the last few years, and had failed a semester that involved a lot of work with the MEAN stack. So I was really happy to find such an amazing community to be a part of and learn with. I’m sure I’ll pass my semester with flying colors this time.
I sadly ran into accessibility issues when working through the now-famous Dash tutorials by General Assembly. The tutorials are undoubtedly good, but they were completely unreadable for me, because all their text is embedded in image slides with no textual description or alternative content for screen readers to work with. Recall that screen readers read out the textual content of the screen; they aren’t smart enough to interpret graphics.
Fortunately, some fellow campers at Free Code Camp were sympathetic to my plight and volunteered to transcribe all these slides for me. This offer left me ‘flabbergasted’, as our dear western neighbors across the sea would say. I’m very grateful for the work these people have done to further my studies. You guys know who you are. Thanks a lot!
But …how do you code?
If left paren x equals five right paren left brace print left paren quote hello world exclaim quote right paren right brace.
This is how a typical if-block in a Java-ish programming language would be read to me. You can see that it’s rather verbose. I tend to turn off the notifications for parenthesis and brackets until I find I need to match brackets while debugging, so that I don’t go crazy from the rather wordy descriptions of these signs. Others have solved this problem by substituting the default ‘left brace’ for something like ‘lace’ or ‘begin’, just to save a few milliseconds. The rate of speech is extremely fast for people who aren’t used to it.
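For comparison, here is roughly the snippet that produces that mouthful — a hypothetical reconstruction in Java, since the post only gives the spoken version:

```java
// Hypothetical reconstruction of the if-block the screen reader reads aloud.
public class HelloDemo {
    public static void main(String[] args) {
        int x = 5;
        // Every parenthesis, brace and quotation mark below becomes a spoken word.
        if (x == 5) {
            System.out.println("hello world!");
        }
    }
}
```

Four short lines of code, yet a dozen extra spoken tokens for punctuation alone — which is exactly why those verbosity settings matter so much.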
So, how I code doesn’t actually differ all that much from how others code. I’ve learned how to touch type, and mentally conceptualize my code so that I can work with it just like you guys do. The only difference is that I barely ever use a mouse for anything. I tend to stick with hotkeys and the command line instead.
Sadly, though, in this field all is not well. Premier tools that coders use every day, like the IntelliJ IDEA editor and all its offshoots (PhpStorm, WebStorm, PyCharm), are completely inaccessible, simply because the developers of these programs have not adhered to the accessibility guidelines. They’ve failed to give screen readers textual labels or accessibility descriptions to work with. The same goes for applications like SourceTree, which is slowly getting better, but is still a pain to use.
I therefore have to keep looking for tutorials, programs and tools that are accessible, and cannot simply pick up any off-the-shelf IDE.
How do you dream?
I promised to answer all three questions, so I’ll keep that promise. Don’t expect anything too resounding, though.
I dream just like you guys do. My mind translates experiences and impulses I’ve received during the day into dreams I have at night. The difference being that I don’t actually see anything.
Instead, I hear, smell and feel everything, just like in real life. The reason for this is simple: dreams pull from your already stored visual knowledge to construct their imagery. Since I’ve been blind since birth, I have no visual frame of reference to draw on. The visual portion of my dreams runs into a big fat 404 error: image not found.
Code with me
A Free Code Camp volunteer asked me to write this blog post to share my way of doing things with the world. After the welcome I’ve received from this community, I was all too happy to write this. I really hope you guys have learned something from it.
I could talk about this for hours, and this article has already grown far longer than I initially planned. If you have questions, come find me in the Free Code Camp chatrooms. I am Zersiax there, and I can be found by that name on Twitter as well.
Thanks for reading this. I’ll see you later! (Sorry. I really couldn’t resist that one) :)
I thought it would be fitting to repost — as my first-ever Medium post — the article that threw my life for a loop one year ago, back in January 2015.