Accessibility as paradigm


Posts like this remind me why I never settled for something “in the meantime” for my co-working plans.

Martin Pilkington (of MCubedSoftware) wrote about The Accessible Mac:

I’ve not got a disability that limits my ability to use the computer and don’t personally know anyone with one, but I do have a strong sense that we should all be treated equal and have equal opportunities. When we can do something to help someone else and it is very cheap or very easy to do, then we should do it. Making accessible applications can be very easy and not take too much time, but can make a world of difference to some people.
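Pilkington’s point that accessibility can be very cheap to add is easy to demonstrate in Cocoa. A minimal sketch in modern Swift (the button and its strings are hypothetical, not from the post): giving a control a spoken label and a help string is often a one- or two-line change, and it is the difference between VoiceOver reading a bare glyph and reading something meaningful.

```swift
import Cocoa

// Hypothetical toolbar button whose visible title is just a glyph.
let exportButton = NSButton(title: "⬆", target: nil, action: nil)

// VoiceOver speaks this label instead of the unpronounceable glyph.
exportButton.setAccessibilityLabel("Export document")

// Spoken as additional help when the user lingers on the control.
exportButton.setAccessibilityHelp("Exports the current document as a PDF")
```

The same attributes can be set without code in Xcode’s Interface Builder, which exposes an Accessibility section per control – so for many apps this costs nothing but a few minutes of attention.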

Working with my Dad on the Mac highlighted some of the difficulties to me. My Dad is on the Blind Persons Register as well as having severe motor function loss due to nerve damage. Despite this he soldiers on, maintaining a Mac mini hooked up to a commodity LCD TV. I use BackToMyMac every now and then to guide him through some steps or fix an actual technical issue (iTunes is a PITA for blind folk), but he manages quite well with a combination of VoiceOver and holding down Control while scrolling, which on the Mac zooms the interface – a useful tip for those of us who aren’t nearsighted too.

We don’t have great haptics on computers – some engineers build a vibrating component into a tool and we call it haptic feedback. No. It’s. Not. A vibration without some sort of context is useless. We need to be thinking bigger than that – I’m faintly disappointed to learn that the new iPod Shuffle gets all of its text-to-speech conversions from the hosting Mac or PC, and that it’s not a widget inside the tiny device itself. That said, Apple had text-to-speech working on the Newton over a decade ago (and I’ll save my rant about how cool the Routing menu was for another day). We shouldn’t rely on visuals and a vibration in a device to provide our interfaces when we have computers which are capable of much more.

The point being – my Dad isn’t even an extreme case. We have so many examples of digital ‘illiteracy’ due to poverty, ability, experience, or fear, and yet every day we hear about new services and new applications. There are times when I feel a little threadbare, stretched across a frame due to having so many inputs and outlets (and no, I’m not talking about Interface Builder and Xcode here). That feeling happens to me every couple of months; I can’t imagine what it must be like to be confronted with this every single day without all five senses and a brain that’s been pretty much wired into the Internet for nearly two decades.
