The touch-screen experience on the new iPad should be nothing short of amazing, if the iPhone is anything to go by. The user’s experience on this new device will tend towards a perfect one, a nirvana of human-machine connectedness, but it comes with one major caveat: you have to be able to see the interface.
The iPad will hopefully be bundled with Apple’s Shining Knight of Accessibility, VoiceOver, which should allay some fears of it being truly inaccessible, but I can’t help thinking that visually impaired users are getting the shaft once again. Sure, you could plug in an external keyboard and use headphones to hear your way through the system, but that detracts from the portability and practicality of a device like the iPad.
Accessibility and Responsibility
This weekend also saw the introduction of a Technology-based Bill of Rights for the Blind in the United States. Such a bill would put some limits on what devices government-run institutions can purchase, tending towards devices that can be used by every member of society.
The iPad is more than just a portable information-consuming device; it is a powerful application platform. Will these apps be accessible to visually impaired users? Will there be, or should there be, guidelines for creating accessible applications?
Creating accessible hardware is relatively easy compared to enforcing policy or creating software accessibility standards. It’s been done for the web through the WCAG, but will we see a ‘SAAG’ emerge any time soon?
This problem of neglecting the needs of non-seeing users has its roots in the touch interface itself: it’s a flat, almost non-tactile surface. You can feel that it’s there, but you can’t feel where it begins, where it ends, or where it differs.
As the hype machine accelerates towards our material oblivion, we’ll see more touch-based devices enter the market: devices born of jealous and incompetent mimicry for the sole purpose of making money. Do you think these competing devices will come with an accessibility plugin? Highly doubtful.
So, how do we avoid leaving the visually impaired behind in our gadget-driven pursuit? Well, for one, visually impaired users will simply buy the devices they can use effectively. But for truly holistic user interfaces, what we really need is a miracle, and I’m not necessarily talking about restoring sight to the blind.
Waiting for a Miracle
Most of the possible solutions are born of sci-fi, concepts such as:
- Tactile display surfaces,
- Retinal or visual cortex implants,
- and the unholy grail: the Neural Interface.
Tactile Display Surfaces
Imagine a molten-glass-like surface, cool to the touch, that could rise towards the user wherever the visuals beneath it required height differences. A height-map would accompany the visuals, describing the pits and mounds of such things as buttons and form input fields. When you focus on a text field, a keyboard would rise out of the display surface, and you would be able to feel where the home keys are.
Think of this surface as a cross between an LED display panel and a refreshable Braille display. Apple have shown their commitment to visually impaired users by supporting Braille displays and creating such wonderful software as VoiceOver. I wonder if they’ve filed a patent for a truly tactile display surface? I imagine it won’t be long before such a device is available to the general public.
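To make the height-map idea concrete, here is a purely speculative sketch of how one might be represented in software. Everything here is invented for illustration: the element kinds, the relief heights, and the grid representation are all assumptions, not any real tactile-display API.

```python
# Speculative sketch: a "height-map" layer that a hypothetical tactile
# display could use to raise buttons and text fields out of a flat
# surface. All names and relief values below are invented.

from dataclasses import dataclass

# Hypothetical relief heights (in millimetres) for common UI elements.
RELIEF_MM = {"button": 1.0, "text_field": 0.5, "keyboard_key": 1.5}

@dataclass
class Element:
    kind: str    # e.g. "button", "text_field"
    x: int       # top-left cell of the element on the grid
    y: int
    width: int   # size in grid cells
    height: int

def build_height_map(cols, rows, elements):
    """Rasterise UI elements into a grid of relief heights (mm).

    Cells not covered by any element stay flat (0.0), mirroring the
    idea that the surface only rises where the visuals require it.
    """
    grid = [[0.0] * cols for _ in range(rows)]
    for el in elements:
        relief = RELIEF_MM.get(el.kind, 0.0)
        for r in range(el.y, min(el.y + el.height, rows)):
            for c in range(el.x, min(el.x + el.width, cols)):
                # Where elements overlap, keep the tallest relief.
                grid[r][c] = max(grid[r][c], relief)
    return grid

# Example: one button and one text field on an 8x4 surface.
ui = [Element("button", 0, 0, 3, 2), Element("text_field", 4, 1, 4, 1)]
hmap = build_height_map(8, 4, ui)
```

In this sketch the display hardware would simply read the grid each frame and drive its pins (or molten-glass equivalent) to the given heights, much as a refreshable Braille display drives its cells.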
Retinal or Visual Cortex Implants
It also won’t be long before ‘visual prosthesis’ becomes more than just a research term. Visually impaired patients could have an electronic retina implanted, or instead opt for the Rolls-Royce of artificial sight: a visual cortex implant. By the time these implants become available, I reckon we will have grown out of our fad-like need for non-tactile touch screens, opting instead for the unholy grail of cyborgism: the Neural Interface.
The Neural Interface
No sci-fi attribution would be complete without a mention of Peter F. Hamilton’s epic, The Night’s Dawn Trilogy. In it, he described a neural implant dubbed a Sensevise: a full sensory experience could be recorded and shared with anyone else carrying the same implant. With such a device, gone is the need for eyes and fingers if we can feed such experiences directly into the brain. Hamilton took the concept even further by describing a gene that grants the bearer an ability called Affinity, a form of telepathy. As time has shown, it’s just a matter of discipline, imagination and believing the impossible possible before we ‘see’ such advancements.
Back to Reality
In the meantime, let’s start getting realistic about including every member of society in these technology-driven times. If you create software or hardware, I implore you to consider those on the periphery of our ‘norms’ by creating holistic computing experiences. Accessible computing is good computing.