PennyTel is an incredibly cheap VoIP provider serving Australia (among other countries) which I have been using for several years. For the most part, I am fairly happy with them, especially the price. Unfortunately, their customer portal has many accessibility problems, and despite a polite request from me quite some time ago, nothing has been done to rectify this.
The biggest issue is that there are many buttons on the site which are presented using clickable graphics, but they have been marked with @alt="", indicating that the graphics are for visual presentation/layout only and suggesting to screen readers that they shouldn't be presented to the user. Obviously, this is very wrong, since these graphics are buttons which the user might wish to activate. It's bad enough that no text alternative is provided, but specifying empty text is extremely incorrect. With the current version of NVDA, this issue makes the portal practically unusable.
So I decided to fix the problem myself with Greasemonkey, and this turned out to be a great success. I now have a Greasemonkey script that not only gives friendly labels to many graphic buttons, but also injects ARIA to expose these graphics as buttons. In addition, parts of the portal use graphics to indicate which option has been selected; the script turns these into radio buttons using ARIA. There is a navigation bar whose items are only clickable text, which the script changes into links for quicker navigation, again using ARIA. Finally, @alt="" is removed from all other clickable graphics which the script doesn't yet know about, which at least allows screen readers to present each graphic using their own algorithms to determine a label. Once the script is installed, all of this happens transparently without any special action.
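For the curious, the kind of repair described above can be sketched in a few lines of JavaScript. This is only an illustrative sketch, not the actual script: the file names, labels and selector logic below are invented assumptions, since the real PennyTel markup isn't reproduced here.

```javascript
// Hypothetical table mapping known button graphics to friendly labels.
// (These file names are illustrative, not the real PennyTel ones.)
const KNOWN_LABELS = {
  "btn_pay.gif": "Pay",
  "btn_topup.gif": "Top up",
};

// Given the src of a clickable graphic marked alt="", decide which
// attributes a userscript should set to repair it.
function repairAttributes(src) {
  const file = src.split("/").pop();
  if (file in KNOWN_LABELS) {
    // Known button: expose it to ATs as a button with a friendly label.
    return { role: "button", "aria-label": KNOWN_LABELS[file] };
  }
  // Unknown clickable graphic: just drop alt="" so screen readers fall
  // back to their own heuristics (e.g. deriving a label from the file name).
  return { removeAlt: true };
}
```

In a real Greasemonkey script, this decision would then be applied to the live page, e.g. looping over `document.querySelectorAll('img[alt=""]')` and calling `setAttribute`/`removeAttribute` on each element accordingly.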
If you happen to have a PennyTel account and would find this useful yourself, you can grab the script. I'm sure there are more things I can improve, but this is sufficient to make the site quite usable.
There once was a guy named James Teh
Who disliked the start of the day.
He hated awaking,
Was bored by fast-breaking
And wished it would all go away.
When an assistive technology (AT) user discovers an application that is inaccessible in some way, they will generally hold one of two parties responsible: the AT developer or the application developer.
In the Apple world, the application developer is generally responsible for ensuring accessibility. Users don't tend to complain to Apple when an application is inaccessible; they complain to the application developer. More often than not, this is correct. An accessibility framework has been provided to facilitate application accessibility and Apple's assistive technologies utilise this framework, so it's up to the application to fulfil its part of the bargain.
In contrast, in the Windows world, the AT developer is generally held responsible. In the past, before there were standard, fully functional accessibility frameworks, I guess this was fair to some extent because application developers had no way of making their applications accessible out-of-the-box. As a result, AT developers worked around these problems themselves through application specific scripting and hacks. However, Windows has had standard rich accessibility frameworks such as IAccessible2 and UI Automation for several years now. Therefore, this is no longer an acceptable justification. Despite this, the general expectation still seems to be that AT developers are primarily responsible. For example, we constantly receive bug reports stating that a certain application does not work with NVDA.
Some might argue another reason for this situation is that application developers have previously been unable to test the accessibility of their applications because of the high cost of commercial ATs. With the availability of free ATs such as NVDA for several years now, this too is no longer an acceptable excuse.
So why is this still the case in the Windows world? If it's simply a ghost from the past, we need to move on. Maybe it's due to a desire for competitive advantage among AT vendors, but the mission of improving accessibility and serving users as well as possible should be more important. If it stems from poor or incomplete support for standard accessibility frameworks, ATs need to resolve this. Inadequate or missing support for accessibility in GUI toolkits is probably part of the problem; we need to work to fix this. Perhaps it's because of a lack of documentation and common knowledge, in which case the accessibility/AT industry needs to rectify this. Maybe there just needs to be more advocacy about application accessibility. Are there other reasons? I'd appreciate your thoughts.
Whatever the reasons, I believe it's important that this changes. Proprietary solutions implemented for individual ATs are often suboptimal. Even if this wasn't the case, implementing such solutions in multiple ATs seems redundant and wasteful. Finally, the more applications that are accessible using standard mechanisms, the more users will benefit.
Disclaimer: This is primarily based on my own personal experience. Also, this all happened last year, so some of the finer details are a bit vague now.
Early last year, my trusty 6 year old Nokia 6600 was finally starting to die and I decided it was past time to move on. (Amusingly, that phone even survived being accidentally dunked in a glass of wine.) My aim was to satisfy all of my portable technology needs with one device, including phone, email, web browser, audio player (standard 3.5 mm audio socket essential), synchronisable calendar, synchronisable contact manager, note taker, ebook reader and portable file storage. And so the quest began.
Making the (First) Choice
My ideal mobile platform was Android. Aside from satisfying all of my needs, it is an open, modern platform. Unfortunately, although I seriously entertained the idea, a great deal of research and playing with the Android emulator led me to realise that Android's accessibility was in an unacceptably poor state for me.
I considered the iPhone. I've written about the iPhone's out-of-the-box accessibility to blind people before and have played with an iPhone several times since. Although I was pretty impressed, especially by the fact that VoiceOver comes out-of-the-box, there were two major problems with the iPhone for me. First, I dislike the closed nature of the iPhone environment, commonly known as the "walled garden" or "do things Apple's way or not at all". Aside from the principle (you all know I'm a big advocate for openness), I would be unable to play ogg vorbis (my audio format of choice) in iPod, I would have to transfer music and files with iTunes (which I detest), I couldn't use it as a portable file storage device, and I would be limited to apps that Apple permitted (unless I wanted to jailbreak). Second, I wanted a device with a physical keyboard. In the end, I decided against the iPhone.
I briefly considered another Symbian Series 60 phone. However, based on past experience (both mine and others'), I didn't think I would be able to play audio and use the screen reader simultaneously, which immediately disqualified it for me, although I've since been informed that is no longer true on some newer phones. I also feel it is a dying platform. There are probably some other reasons I discounted it, but I can't remember them now. If nothing else, I wasn't entirely happy with it and wanted a change.
Finally, I settled on Windows Mobile, specifically the Sony Ericsson Xperia X1, with Mobile Speak. I guess Windows Mobile is a dying platform too, but at the time, I felt it was perhaps less so and provided more functionality for me. While the operating system itself isn't open, you can develop and install apps as you please without being restricted to a single app store. There are several ogg vorbis players for Windows Mobile. I also had the option of buying Mobile Geo for navigation if I wanted to later. I was warned by someone who played with an older phone that Mobile Speak on Windows Mobile was fairly unresponsive, but unable to test this myself, I hoped that it might be better with a less resource intensive voice such as Fonix and/or a newer phone or that I'd get used to it.
Frustration, Pain and Misery
A bit less than $800 later, I got my new phone and Mobile Speak in June. I expected and accepted that it would take me some time to get used to it. I loved finally being able to access the internet from my phone and play audio through proper stereo headphones. However, despite this, the next few months were just downright painful and frequently induced rage bordering on violence; I had the urge to throw my new phone across the room several times a day.
Primarily, this was due to Mobile Speak. I found it to be hideously unresponsive, unstable, unreliable, inconsistent and otherwise buggy as all hell.
It's worth noting that I'm not saying that this is all entirely Mobile Speak's fault. I suspect Windows Mobile and other applications may play a part in this dodginess.
- The unresponsiveness proved to be unacceptable for me, often taking around half a second to respond to input and events, even using Fonix. Aside from the general inefficiency this caused, this made reading text incredibly tedious, and despite the physical keyboard, typing was painful due to the slow response to backspace and cursor keys.
- Mobile Speak crashed or froze far too often and there was no way to resurrect it without restarting the phone.
- In Internet Explorer, working with form controls was extremely inconsistent and unreliable, especially multi-line editable text fields. Quick navigation (moving by heading, etc.) was very slow and failed to work altogether in many cases. On my phone, Google services (including Google Search, even the mobile version, of all things!) refused to render at all.
- I encountered problems when reading email as well. Sometimes, Mobile Speak wouldn't render emails. Other times, it wouldn't let me navigate to the headers of the email, which is essential if you want to download the rest of a message that hasn't been fully downloaded.
- Reading text in Word Mobile was even slower than everywhere else, which made reading ebooks infeasible.
- Braille display scrolling was either broken or unintuitive. On my 40 cell display, Mobile Speak only seemed to scroll half the display and I couldn't find a way to change this.
- There were quite a few other bugs whose details I can no longer remember...
I had two other major gripes with Mobile Speak.
- Despite years of experience with screen readers, I found the Mobile Speak commands, especially the touch interface, to be tedious and difficult to learn. The touch interface is inherently slow to use because you need to wait after certain taps to avoid them being construed as double or triple taps.
- Mobile Speak's phone number licensing model was a major annoyance for me when I went overseas for a few days. Mobile Speak allows you to use it with a SIM card with a different number for 12 hours, but you have to reinsert the original SIM card after 12 hours if you want Mobile Speak to continue functioning as a licensed copy. Also, I seem to recall that this also applied if the phone was in airplane mode.
There were other things that irritated me about my phone and its applications.
- I found Windows Mobile in general to be very sluggish. Even something as fundamental to a phone as dialling on the phone keypad or adjusting the volume was incredibly laggy, sometimes taking several seconds to respond to key presses.
- Far too many apps, including Google Maps and both of the free ogg vorbis players I tried, had significant accessibility problems.
- Windows Media doesn't support bookmarking, which made reading audio books infeasible. There was a paid app that provided this and other functionality I wanted, but I wasn't willing to pay for it in case I discovered it too had major accessibility problems.
- Windows Mobile doesn't have enough levels on its volume control.
- If the phone is switched to silent, all audio is silenced, including Mobile Speak.
- Internet Explorer doesn't support tabbed browsing!
- Windows Mobile only supports one Exchange Active Sync account, which meant I couldn't maintain separate personal and work calendars.
The Snapping Point
In the end, after less than 6 months, I just couldn't take it any more. I tried to learn to live with it for at least a couple of years, as I'd already spent so much money on it, but it was truly unbearable. It's worth noting that a close friend of mine had a very similar experience with Windows Mobile and also gave up in equivalent disgust around the same time. It particularly angers me that I paid $315 for a piece of software as buggy as Mobile Speak. I started playing with Jen's iPhone a bit more, and finally, I gave in and got my own.
The iPhone: Peace at Last
For reasons I mentioned above, I felt like I was going to the dark side when I made the decision to switch to the iPhone. Among other things, it's a bit hypocritical of me, given my belief in and advocacy for openness. Nevertheless, I have not looked back once since I got it. It has truly changed my life.
The in-built accessibility of the iPhone is amazing. I strongly believe that accessibility should not incur an extra cost for the user, and Apple have delivered exactly that. VoiceOver is very responsive. Usage is fairly intuitive. All of the in-built apps and the majority of third party apps are accessible. Once you get used to it and start to remember where things are on the screen, navigating with the touch screen becomes incredibly efficient. The support for braille displays is excellent; I can see this being very useful next time I need to give a presentation. The triple-click home feature means that I can even toggle VoiceOver on Jen's phone when needed, which has been really useful for us when she is driving and needs me to read directions.
I still hate iTunes, but thankfully, I rarely have to use it. I manage music and other audio on my phone using the iPod manager component for my audio player, foobar2000, which is excellent and even transcodes files in unsupported formats on the fly. The iPod app is great, supporting gapless playback for music and automatic bookmarks for audio books and podcasts.
Other things I love about the iPhone include:
- Very nice email and web browsing experience.
- Push notifications for mail, Twitter, Facebook and instant messaging.
- Multiple calendars.
- Skype on my phone, which is far nicer than being tied to my computer when on a Skype call.
- Voice control, which works well most of the time.
- Smooth reading of ebooks using iBooks.
As a result of all this, I find I spend far less time in front of my computer outside of work hours. Also, when I'm away from home, even on holiday for a week, I often just take my phone. Previously, I had to take my notebook almost everywhere.
Like all things, the iPhone isn't perfect. I still dislike the walled garden, and I have to live with transcoded audio and can't use my iPhone as a USB storage device because of it. I am definitely slower at typing on the iPhone than I was on the numeric keypad on my old Nokia 6600, although perhaps surprisingly, I'm probably faster on the iPhone than I was on the Xperia X1. There are definitely bugs, some of which I encounter on a daily basis and are very annoying.
Even so, I love the iPhone. I'm willing to make some sacrifices, and I can live with bugs that are at least consistent and easy enough to work around. On the rare occasions that VoiceOver crashes, I can easily restart it with a triple click of the Home button. I wish I'd gone for the iPhone in the first place and not wasted so much money on the Windows Mobile solution, but ah well, live and learn.
The following email sent to NV Access administration yesterday is an amazing mastery of politeness, eloquence, intellect and linguistic ability. I've reproduced it verbatim below, except for the obfuscation of some words for reasons that will become clear as you read. Enjoy!
From: Dave. I lost my cookie at the disco. <computerguy125@****>
Date: Sun, 13 Feb 2011 23:40:16 -0500
hEY MOTHER F****R YOU SCREWED UP MY LAPTOP. fIX YOUR SCREEN READER. bLIND ASS MOTHER F****R. f***ING BLINKY. cHANGE YOUR SHORTCUT KEY SO IT DOESN'T CONFLICT WITH SYSTEM ACCESS TOO. yOU DON'T KNOW THAT THEN LOOK IT UP. mOTEHR F****R.
Email services provided by the System Access Mobile Network. Visit www.serotek.com to learn more about accessibility anywhere.
The Android mobile platform really excites me. It is open (which cannot be said of the iPhone) and is incredibly successful in many respects. I would almost certainly choose an Android phone... except for the poor state of Android accessibility.
Note: I will primarily discuss access for blind users here, since that is what I am most familiar with. However, some of this applies to other disabilities as well.
In the Beginning
In the beginning, there was no accessibility whatsoever in Android. It would have made sense to design it from the start with accessibility in mind, which would have made it much easier, but as is sadly so often the case, this wasn't done. Nevertheless, many other platforms have managed to recover from this oversight, some with great success.
Then came the Eyes-Free Project, which created a suite of self-voicing applications to enable blind users to use many functions of the phone. Requiring blind users to use these special applications limits the functionality they can access and completely isolates them from the experience of other users. This is just a small step away from a device designed only for blind users. I guess this is better than nothing, but in the long-term, this is unacceptable.
Integrated Accessibility API and Services
With the release of Android 1.6 came an accessibility API integrated into the core of Android, as well as a screen reader (Talkback) and other accessibility services. A developer outside Google also began working on a screen reader called Spiel. This meant that blind users could now access standard Android applications just like everyone else.
Unfortunately, the Android accessibility API is severely limited. All it can do is send events when something notable happens in the user interface. An accessibility service such as a screen reader can query these events for specific information (such as the text of an object which has been activated), but no other interaction or queries are possible. This means it isn't possible to retrieve information about other objects on the screen unless they are activated, which makes screen review impossible among other things. Even the now very dated Microsoft Active Accessibility (the core accessibility API used in Windows), with its many limitations and flaws, allows you to explore, query and interact with objects.
Inability to Globally Intercept Input
In addition, it is not possible for an accessibility service to globally intercept presses on the keyboard or touch screen. Not only does this mean that an accessibility service cannot provide keyboard/touch screen commands for screen review, silencing speech, changing settings, etc., but it also makes touch screen accessibility for blind users impossible. A blind user needs to be able to explore the touch screen without unintentionally activating controls, which can't be done unless the screen reader can provide special handling of the touch screen.
Inaccessible Web Rendering Engine
The web rendering engine used in Android is inaccessible. In fact, it's probably impossible to make it accessible at present due to Android's severely limited accessibility framework, as a user needs to be able to explore all objects on a web page. This means that the in-built web browser, email client and most other applications that display web content are inaccessible. This is totally unacceptable for a modern smart phone.
IDEAL Apps4Android's Accessible Email Client and Web Browser
IDEAL Apps4Android released both an accessible email client and web browser. The accessibility enhancements to the K9 email client (on which their application is based) have since been incorporated into K9 itself, which is fantastic. However, access to the web still requires a separate "accessible" web browser. While other developers can also integrate this web accessibility support into their applications, it is essentially a set of self-voicing scripts which need to be embedded in the application. This is rather inelegant and is very much "bolt-on accessibility" instead of accessibility being integrated into the web rendering engine itself. This isn't to criticise IDEAL: they did the best they could given the limitations of the Android accessibility API and should be commended. Nevertheless, it is an unsatisfactory situation.
More "Accessible" Apps
There are quite a few other applications aside from those mentioned above that have been designed specifically as "accessible" applications, again isolating disabled users from the normal applications used by everyone else. This isolating redundancy is largely due to Android's severely limited accessibility framework.
Unfortunately, even though Android is open source, solving this problem is rather difficult for people outside the core Android development team because it will require changes to the core of Android. The current accessibility framework needs to be significantly enhanced or perhaps even redesigned, and core applications need to take advantage of this improved framework.
While significant headway has been made concerning accessibility in Android 1.6 and beyond, the situation is far from satisfactory. Android is usable by blind users now, but it is certainly not optimal or straightforward. In addition, the implementation is poorly designed and inelegant. This situation is only going to get messier until this problem is solved.
I find it extremely frustrating that Android accessibility is in such a poor state. It seems that Google learnt nothing from the accessibility lessons of the past. This mess could have been avoided if the accessibility framework had been carefully designed, rather than the half-done job we have now. Good, thorough design is one of the reasons that iPhone accessibility is so brilliant and "just works".
As Windows screen reader users will know, there is no screen reader included in Windows. Instead, users requiring a screen reader must obtain and install a third party product. Yes, there is Microsoft Narrator, but even Microsoft know that this is hardly worthy of the name "screen reader". :)
A few years ago, Apple revolutionised the accessibility industry by building a fully fledged screen reader, VoiceOver, right into Mac OS X. Ever since, many have asked why Microsoft can't do the same for Windows. Many are angry with Microsoft for this continued lack of built-in accessibility, some using it as support for the "why Apple is better than Microsoft" argument.
Here's some food for thought. I'm not sure Microsoft could do this even if they wanted to; their hands are probably tied in a legal sense. If they did, they could very likely be sued by assistive technology vendors for anti-competitive conduct, just as they have been sued several times concerning their bundling of Internet Explorer with Windows. Once again, Apple don't have to be concerned with this because there wasn't an existing screen reader on Mac OS X and they don't have the dominant position in the market.
I have no evidence for this argument. Perhaps I'm wrong, but history suggests that it is highly likely that I'm not.
Even as one of the lead developers of NVDA, I'm first and foremost a blind user who wants the best possible access, both for myself and other blind users. As such, I would very much welcome a screen reader built into Windows. Competition is good. A built-in screen reader doesn't mean that other screen readers can't exist. If the built-in solution were good enough, then there would be no need for NVDA to exist. If it weren't, NVDA would drive accessibility to improve through innovation and competition.
Many (perhaps the majority of) notebook/laptop computers now have a strip of touch sensitive keys above the normal keyboard. These keys are not tactile in any way and, depending on the computer, provide such things as multimedia controls (volume, mute, play, stop, etc.), toggles for wireless radios and other special functions. I've always been concerned about how I would access such controls without any sight. Unfortunately, in my search for a new notebook, all of the notebooks which interested me included them, so I decided to just live with it. I bought an Acer Aspire 3935 which, among other things, has touch sensitive keys to toggle bluetooth and wifi. I began to ponder ways to place some sort of tactile marker on or near these keys so I can find them. However, it recently occurred to me that there's already a perfectly good tactile locator for these keys: the function keys on the normal keyboard, which lie immediately beneath the touch sensitive strip. For example, on my computer, the key for toggling bluetooth is just above the f11 key, so all I have to do is locate the f11 key and move my finger directly up to hit the bluetooth toggle key. This is blindingly obvious in hindsight, but well... hindsight is a wonderful thing. :) Of course, sighted assistance may be required initially to find the keys if trial and error is insufficient.
On Saturday, 12 December 2009, Jen and I got married. :) The wedding was perfect; I could not have asked for anything more. First, my groomsmen were fantastic and I had a great time getting ready with them. The ceremony was absolutely beautiful; there were a lot of happy tears in the chapel. Although I knew all of the music and had spent time arranging and rehearsing it, this was the time for me to just open my mind and heart, to exist entirely in the moment, to truly listen to and feel its meaning. Every word of the ceremony - the celebrant's sections, the music, the reading and the vows - was sincere and deeply significant to us. The reception was very enjoyable and memorable, especially the speeches, all of which were terrific and touching.
Throughout the wedding, I was quite proud and happy to be the centre of attention alongside Jen. :) I was truly humbled and awed by the love and respect for us that everyone - our family and friends - showed. And of course, the whole point of all of this was that Jen and I were publicly declaring our love for each other and intent to be together for the rest of our lives.
The wedding also helped me on a personal level in ways I had not expected. I am a self-critical perfectionist by nature, sometimes to an almost self-destructive extent. I waste so much time regretting, wishing I could do things better and worrying about both the past and future that I almost miss out on the present. Nothing I do is ever good enough for me. However, the wedding was a transcendental experience and helped me to see beyond this. I have made many mistakes, I've downright failed sometimes, but I realised that my path, with all of its ups and downs, had led me to this moment and I wouldn't change it for the world. If I'd found such happiness and love and earnt the respect of so many, especially Jen :), I must have done something right overall. It has left me with a lasting, wondrous sense of clarity, relief, peace, contentment and confidence. I have a great deal to live and learn, but I am who and where I want to be. Now I just have to try to take this state of mind into the new year and beyond. :)
Note: This post has been floating around on my computer for years, but I never did finish it. I've decided to just post it in its incomplete form; something is better than nothing. :)
Jamie: The last destination of our honeymoon was Kuala Lumpur, Malaysia. Aside from breaking up our long flight back to Australia, we wanted to catch up with some of my relatives: my Uncle Jerry, Auntie Janet and Dad's cousin S.Y. Upon arriving in KL, we were picked up by my uncle and aunt, who (very generously) drove us to our hotel, also giving us a brief tour of KL on the way. KL is a city that never seems to sleep. It is incredibly well lit at night. Most shops don't close until 10pm, some even later. The city is busy with people on the streets and in shops even late on a weekday evening.
The 5 star Traders Hotel where we stayed was absolutely fantastic and incredibly well priced. Located in the heart of the new city centre, it was walking distance from everything we wanted to visit.
The first thing we did after emerging from our room on Tuesday morning was to find some brunch. Both Dad and Uncle Jerry recommended that we visit the food court in the Pavilion, which is a huge shopping centre. And wow, what a food court it was, sprawling across most of one level of the complex. Food in Malaysia is so damned cheap and so delicious. Among other things, we had Malaysian satay, which is just incredible and miles above the satay one gets in Australia.
Later in the day, we visited KL's aquarium, located in the KL Convention Centre. Although much of the display was obviously visual and informational, there were also three touch pools. I was able to touch a little shark, a stingray, a horse-shoe crab and a sea cucumber, all of which were fascinating. The most bizarre was definitely the last, which was just... squishy. You can literally squish it in your hand; it's a little creepy.
On Wednesday, we met my aunt and uncle again, along with S.Y. I'd never met S.Y. before, despite having heard a lot about her over the years, so it was nice to finally meet her. After spending a very nice hour or so chatting at the hotel, during which we had to call hotel staff to rescue Auntie Janet from the bathroom due to a broken lock :), we went to lunch at a restaurant specialising in Penang food and ate a hell of a lot of it. We were introduced to two delicious side dishes and desserts which I'd never had or seen in Australia, all of which I will miss.
On a random impulse, Uncle Jerry, Jen and I decided to visit the music science section of KL's science centre. ...
As I write this, I'm on the plane back to Australia. The honeymoon is over. I'm a little sad, but also glad to be coming home and looking forward to seeing everyone again. It has been a fantastic and memorable trip. We really have had the time of our lives.