Frustrations with NVDA-support

Because NVDA is free software, we do not have the resources to provide free, direct technical support to users. Therefore, the NVDA-support email list was set up as "a place where users of the NVDA screen reader are able to ask questions about how to use NVDA".

A common complaint about mailing lists like this is that they produce a lot of messages. Some users cannot (or do not wish to) handle this high email traffic and therefore end up unsubscribing from the list fairly quickly, thus limiting its usefulness. Instead, users contact us directly or are driven away from the project. When directed to the tracker and email lists, one user who contacted me directly complained about having to "sign up to a thousand lists".

To combat this, we decided to selectively moderate the list, as full moderation is too time-consuming. Users who broke (or bordered on breaking) list rules or otherwise had the potential to generate a lot of unnecessary traffic were placed on moderation. Any post from those users that was irrelevant, unnecessary or likely to start an unnecessary thread was rejected.

Unfortunately, several users have been unhappy with or even outright offended by this. Today in particular, I rejected a post from a user (previously moderated for an off-topic post) which, while intended to be helpful, gave an incorrect (or at least very indirect) answer that I believed would raise more questions than it answered. No accusation was made, but the user took the rejection very personally and made it clear that he would no longer support the project in any way.

Another common gripe is that users are often told to read the documentation when they ask questions. If it seems that a user hasn't even tried to read the documentation before asking a question, I do not think this is unwarranted. If they've at least tried and don't understand, this is a different matter entirely. If they don't wish to make the effort to at least try to understand the documentation, they should not expect free support.

It seems I can't win. I tried to do what I thought best for the NVDA community in limiting the traffic on the list so more users would be encouraged to use it. As a result, I'm accused of being unfair, draconian and ungrateful. Therefore, I've disabled all moderation on the list and I am withdrawing from the list myself for a while. I am done with support for now.

Inexcusable Inaccessibility: APC Goalball Coverage

Australia are currently hosting the IBSA Africa Oceania Goalball Regional Championships, where both men's and women's teams are playing to qualify for the London 2012 Paralympic Games. Go Australia! :) They've provided a live internet stream with commentary, which is fantastic. Unfortunately, the Flash video player they're using is completely inaccessible to screen reader users. (Technically, it uses windowless Flash, which is not accessible.) Worse, the page doesn't even play the video automatically when it opens, which means you have no choice but to use the video player controls. Allow me to emphasise the absolute, inexcusable absurdity of this situation: they are broadcasting a sport for the blind, but the broadcast is inaccessible to blind people.

Digging through their code, it's not too hard to work around this. I was able to come up with a link which enables auto-play, so at least it begins playing automatically, avoiding the need to use the video player controls. However, the average user would not have been able to do this themselves.
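
For the technically inclined: the workaround simply amounts to passing the player's auto-play setting in the URL. The sketch below only illustrates the general technique; the URL and the "autoPlay" parameter name are hypothetical, not the actual ones used on the APC page.

```javascript
// Illustration only: embeddable Flash players commonly accept configuration
// via the query string or flashvars. Both the URL and the "autoPlay" parameter
// name here are hypothetical; the real ones were found by reading the page source.
var playerUrl = "http://example.com/goalball/player";
var separator = playerUrl.indexOf("?") === -1 ? "?" : "&";
var autoPlayUrl = playerUrl + separator + "autoPlay=true";
// A link like this starts the stream without touching the inaccessible player controls.
console.log(autoPlayUrl);
```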

Ideally, everything should be accessible to all users. Sometimes, for whatever reasons (valid or not), this isn't possible. When it isn't, at least consider your target audience. If, for example, a large number of them are probably going to be blind, it might just make sense to implement and test accessibility for screen reader users. The APC are using an external service to provide the stream. Regardless, they should have tested and resolved the problem somehow or, at the very least, openly provided a workaround such as the one I gave above.

It's worth noting that Adobe clearly document that windowless (transparent or opaque) Flash is inaccessible.
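
For those wondering what "windowless" means in practice: a Flash embed is windowless when its wmode parameter is set to "transparent" or "opaque" rather than the default "window". As a rough, unofficial heuristic, something like the following can be pasted into the browser console to spot such embeds on a page:

```javascript
// Rough, unofficial check for windowless Flash embeds on the current page.
// wmode is an attribute on <embed> elements and a <param> child on <object> elements.
var elements = document.querySelectorAll("object, embed");
for (var i = 0; i < elements.length; i++) {
  var el = elements[i];
  var wmode = el.getAttribute("wmode");
  if (!wmode) {
    var param = el.querySelector('param[name="wmode"]');
    wmode = param ? param.getAttribute("value") : "window"; // "window" is the default
  }
  if (/^(transparent|opaque)$/i.test(wmode)) {
    console.log("Windowless (screen reader inaccessible) Flash:", el);
  }
}
```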

Making PennyTel Accessible with Greasemonkey

PennyTel is an incredibly cheap VoIP provider serving Australia (among other countries) which I have been using for several years. For the most part, I am fairly happy with them, especially the price. Unfortunately, their customer portal has many accessibility problems, and despite a polite request from me quite some time ago, nothing has been done to rectify this.

The biggest issue is that many buttons on the site are presented as clickable graphics which have been marked with @alt="", indicating that the graphics are for visual presentation/layout only and telling screen readers not to present them to the user. Obviously, this is very wrong, since these graphics are buttons which the user might wish to activate. It's bad enough that no useful text alternative is provided, but explicitly marking the buttons as decorative with empty text is even worse. With the current version of NVDA, this issue makes the portal practically unusable.

It recently occurred to me that it might be possible to hack around this with Greasemonkey. In short, Greasemonkey is a Firefox add-on which allows you to "customize the way a web page displays or behaves, by using small bits of JavaScript".
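
For anyone who hasn't seen one before, a user script is just JavaScript preceded by a small metadata block that tells Greasemonkey which pages to run it on. A minimal skeleton looks something like this; note that the PennyTel URL pattern is a guess for illustration only:

```javascript
// ==UserScript==
// @name        PennyTel portal accessibility fixes
// @namespace   http://example.com/userscripts
// @description Labels graphic buttons and injects ARIA into the PennyTel customer portal.
// @include     https://*.pennytel.com/*
// ==/UserScript==

// Greasemonkey runs everything below automatically whenever a matching page loads.
console.log("PennyTel accessibility fixes active");
```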

This turned out to be a great success. I now have a Greasemonkey script that not only gives friendly labels to many graphic buttons, but also injects ARIA to expose these graphics as buttons. In addition, parts of the portal use graphics to indicate which option has been selected; the script turns these into radio buttons using ARIA. There is also a navigation bar whose items are just clickable text; the script exposes these as links using ARIA for quicker navigation. Finally, the script removes @alt="" from any other clickable graphics it doesn't yet know about, which at least allows screen readers to use their own heuristics to determine a label. Once the script is installed, all of this happens transparently without any special action.
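
To give a concrete sense of how little code this takes, here is a heavily simplified sketch of the approach. The image file names, selectors and labels below are invented for illustration; the real script obviously targets the actual markup used by the PennyTel portal.

```javascript
// Heavily simplified sketch; all file names, selectors and labels are hypothetical.

// 1. Give known clickable graphics a friendly label and expose them as buttons.
var buttonLabels = {
  "btn_call.gif": "Make a call",
  "btn_topup.gif": "Top up credit"
};
var images = document.getElementsByTagName("img");
for (var i = 0; i < images.length; i++) {
  var img = images[i];
  var file = img.src.split("/").pop();
  if (buttonLabels.hasOwnProperty(file)) {
    img.setAttribute("alt", buttonLabels[file]); // friendly label
    img.setAttribute("role", "button");          // ARIA: this graphic is a button
    img.setAttribute("tabindex", "0");           // make it keyboard focusable
  } else if (img.getAttribute("alt") === "" && img.onclick) {
    // 2. Unknown clickable graphics: drop the empty alt so screen readers can
    // fall back to their own heuristics to determine a label.
    img.removeAttribute("alt");
  }
}

// 3. Graphics which merely indicate the selected option become radio buttons.
var indicators = document.querySelectorAll("img.optionIndicator"); // hypothetical class
for (var j = 0; j < indicators.length; j++) {
  indicators[j].setAttribute("role", "radio");
  indicators[j].setAttribute("aria-checked",
    indicators[j].src.indexOf("selected") !== -1 ? "true" : "false");
}

// 4. Clickable text in the navigation bar becomes links for quicker navigation.
var navItems = document.querySelectorAll("#navBar span"); // hypothetical selector
for (var k = 0; k < navItems.length; k++) {
  navItems[k].setAttribute("role", "link");
  navItems[k].setAttribute("tabindex", "0");
}
```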

This took me a few hours, though this was mostly because I had never used Greasemonkey and only had a very basic knowledge of JavaScript before. Aside from the fact that PennyTel is now quite usable for me, this is also an exciting demonstration of how accessibility improvements for a site can be "scripted" within the browser like this, independent of any particular screen reader.

If you happen to have a PennyTel account and would find this useful yourself, you can grab the script. I'm sure there are more things I can improve, but this is sufficient to make the site quite usable.

Morning Limerick

There once was a guy named James Teh
Who disliked the start of the day.
He hated awaking,
Was bored by fast-breaking
And wished it would all go away.

Responsibility for Windows Application Accessibility

When an assistive technology (AT) user discovers an application that is inaccessible in some way, they will generally hold one of two parties responsible: the AT developer or the application developer.

In the Apple world, the application developer is generally responsible for ensuring accessibility. Users don't tend to complain to Apple when an application is inaccessible; they complain to the application developer. More often than not, this is correct. An accessibility framework has been provided to facilitate application accessibility and Apple's assistive technologies utilise this framework, so it's up to the application to fulfil its part of the bargain.

In contrast, in the Windows world, the AT developer is generally held responsible. In the past, before there were standard, fully functional accessibility frameworks, I guess this was fair to some extent because application developers had no way of making their applications accessible out-of-the-box. As a result, AT developers worked around these problems themselves through application specific scripting and hacks. However, Windows has had standard rich accessibility frameworks such as IAccessible2 and UI Automation for several years now. Therefore, this is no longer an acceptable justification. Despite this, the general expectation still seems to be that AT developers are primarily responsible. For example, we constantly receive bug reports stating that a certain application does not work with NVDA.

Some might argue another reason for this situation is that application developers have previously been unable to test the accessibility of their applications because of the high cost of commercial ATs. With the availability of free ATs such as NVDA for several years now, this too is no longer an acceptable excuse.

So why is this still the case in the Windows world? If it's simply a ghost from the past, we need to move on. Maybe it's due to a desire for competitive advantage among AT vendors, but the mission of improving accessibility and serving users as well as possible should be more important. If it stems from poor or incomplete support for standard accessibility frameworks, ATs need to resolve this. Inadequate or missing support for accessibility in GUI toolkits is probably part of the problem; we need to work to fix this. Perhaps it's because of a lack of documentation and common knowledge; in that case, the accessibility/AT industry needs to work to rectify this. Maybe there just needs to be more advocacy about application accessibility. Are there other reasons? I'd appreciate your thoughts.

Whatever the reasons, I believe it's important that this changes. Proprietary solutions implemented for individual ATs are often suboptimal. Even if this wasn't the case, implementing such solutions in multiple ATs seems redundant and wasteful. Finally, the more applications that are accessible using standard mechanisms, the more users will benefit.

My Quest for a New Mobile Phone

Disclaimer: This is primarily based on my own personal experience. Also, this all happened last year, so some of the finer details are a bit vague now.

Early last year, my trusty 6 year old Nokia 6600 was finally starting to die and I decided it was past time to move on. (Amusingly, that phone even survived being accidentally dunked in a glass of wine.) My aim was to satisfy all of my portable technology needs with one device, including phone, email, web browser, audio player (standard 3.5 mm audio socket essential), synchronisable calendar, synchronisable contact manager, note taker, ebook reader and portable file storage. And so the quest began.

Making the (First) Choice


My ideal mobile platform was Android. Aside from satisfying all of my needs, it is an open, modern platform. Unfortunately, although I seriously entertained the idea, a great deal of research and playing with the Android emulator led me to realise that Android's accessibility was in an unacceptably poor state for me.

I considered the iPhone. I've written about the iPhone's out-of-the-box accessibility to blind people before and have played with an iPhone several times since. Although I was pretty impressed, especially by the fact that VoiceOver comes out-of-the-box, there were two major problems with the iPhone for me. First, I dislike the closed nature of the iPhone environment, commonly known as the "walled garden" or "do things Apple's way or not at all". Aside from the principle (you all know I'm a big advocate for openness), I would be unable to play ogg vorbis (my audio format of choice) in the iPod app, I would have to transfer music and files with iTunes (which I detest), I couldn't use it as a portable file storage device, and I would be limited to apps that Apple permitted (unless I wanted to jailbreak). Second, I wanted a device with a physical keyboard. In the end, I decided against the iPhone.

I briefly considered another Symbian Series 60 phone. However, based on past experience (both mine and others'), I didn't think I would be able to play audio and use the screen reader simultaneously, which immediately disqualified it for me, although I've since been informed that this is no longer true on some newer phones. I also feel it is a dying platform. There are probably some other reasons I discounted it, but I can't remember them now. If nothing else, I wasn't entirely happy with it and wanted a change.

Finally, I settled on Windows Mobile, specifically the Sony Ericsson Xperia X1, with Mobile Speak. I guess Windows Mobile is a dying platform too, but at the time, I felt it was perhaps less so and provided more functionality for me. While the operating system itself isn't open, you can develop and install apps as you please without being restricted to a single app store. There are several ogg vorbis players for Windows Mobile. I also had the option of buying Mobile Geo for navigation if I wanted to later. I was warned by someone who had played with an older phone that Mobile Speak on Windows Mobile was fairly unresponsive, but, unable to test this myself, I hoped it might be better with a less resource intensive voice such as Fonix and/or a newer phone, or that I'd get used to it.

Frustration, Pain and Misery


A bit less than $800 later, I got my new phone and Mobile Speak in June. I expected and accepted that it would take me some time to get used to it. I loved finally being able to access the internet from my phone and play audio through proper stereo headphones. Despite this, the next few months were just downright painful and frequently induced rage bordering on violence; I had the urge to throw my new phone across the room several times a day.

Primarily, this was due to Mobile Speak. I found it to be hideously unresponsive, unstable, unreliable, inconsistent and otherwise buggy as all hell. It's worth noting that I'm not saying this is all entirely Mobile Speak's fault; I suspect Windows Mobile and other applications may play a part in this dodginess.

I had two other major gripes with Mobile Speak.

There were other things that irritated me about my phone and its applications.

The Snapping Point


In the end, after less than 6 months, I just couldn't take it any more. I tried to learn to live with it for at least a couple of years, as I'd already spent so much money on it, but it was truly unbearable. It's worth noting that a close friend of mine had a very similar experience with Windows Mobile and also gave up in equivalent disgust around the same time. It particularly angers me that I paid $315 for a piece of software as buggy as Mobile Speak. I started playing with Jen's iPhone a bit more, and finally, I gave in and got my own.

The iPhone: Peace at Last


For reasons I mentioned above, I felt like I was going to the dark side when I made the decision to switch to the iPhone. Among other things, it's a bit hypocritical of me, given my belief in and advocacy for openness. Nevertheless, I have not looked back once since I got it. It has truly changed my life.

The in-built accessibility of the iPhone is amazing. I strongly believe that accessibility should not incur an extra cost for the user, and Apple have delivered exactly that. VoiceOver is very responsive. Usage is fairly intuitive. All of the in-built apps and the majority of third party apps are accessible. Once you get used to it and start to remember where things are on the screen, navigating with the touch screen becomes incredibly efficient. The support for braille displays is excellent; I can see this being very useful next time I need to give a presentation. The triple-click home feature means that I can even toggle VoiceOver on Jen's phone when needed, which has been really useful for us when she is driving and needs me to read directions.

I still hate iTunes, but thankfully, I rarely have to use it. I manage music and other audio on my phone using the iPod manager component for my audio player, foobar2000, which is excellent and even transcodes files in unsupported formats on the fly. The iPod app is great, supporting gapless playback for music and automatic bookmarks for audio books and podcasts.

Other highlights:

As a result of all of this, I find I spend far less time in front of my computer outside of work hours. Also, when I'm away from home, even on holiday for a week, I often just take my phone. Previously, I had to take my notebook almost everywhere.

Like all things, the iPhone isn't perfect. I still dislike the walled garden, and I have to live with transcoded audio and can't use my iPhone as a USB storage device because of it. I am definitely slower at typing on the iPhone than I was on the numeric keypad on my old Nokia 6600, although perhaps surprisingly, I'm probably faster on the iPhone than I was on the Xperia X1. There are definitely bugs, some of which I encounter on a daily basis and are very annoying.

Even so, I love the iPhone. I'm willing to make some sacrifices, and I can live with bugs that are at least consistent and easy enough to work around. On the rare occasions that VoiceOver crashes, I can easily restart it with a triple click of the Home button. I wish I'd gone for the iPhone in the first place and not wasted so much money on the Windows Mobile solution, but ah well, live and learn.

Brilliant Email

The following email sent to NV Access administration yesterday is an amazing mastery of politeness, eloquence, intellect and linguistic ability. I've reproduced it verbatim below, except for the obfuscation of some words for reasons that will become clear as you read. Enjoy!
From: Dave. I lost my cookie at the disco. <computerguy125@****>
To: admin@****
Date: Sun, 13 Feb 2011 23:40:16 -0500
Subject:

hEY MOTHER F****R YOU SCREWED UP MY LAPTOP.  fIX YOUR SCREEN READER.  bLIND ASS MOTHER F****R.  f***ING BLINKY.  cHANGE YOUR SHORTCUT KEY SO IT DOESN'T CONFLICT WITH SYSTEM ACCESS TOO.  yOU DON'T KNOW THAT THEN LOOK IT UP.  mOTEHR F****R.

-- 
Email services provided by the System Access Mobile Network.  Visit www.serotek.com to learn more about accessibility anywhere.

The Poor State of Android Accessibility

The Android mobile platform really excites me. It is open (which cannot be said of the iPhone) and is incredibly successful in many respects. I would almost certainly choose an Android phone... except for the poor state of Android accessibility.

Note: I will primarily discuss access for blind users here, since that is what I am most familiar with. However, some of this applies to other disabilities as well.

In the Beginning


In the beginning, there was no accessibility whatsoever in Android. It would have made sense to design it from the start with accessibility in mind, which would have made it much easier, but as is sadly so often the case, this wasn't done. Nevertheless, many other platforms have managed to recover from this oversight, some with great success.

Eyes-Free Project


Then came the Eyes-Free Project, which created a suite of self-voicing applications to enable blind users to use many functions of the phone. Requiring blind users to use these special applications limits the functionality they can access and completely isolates them from the experience of other users. This is just a small step away from a device designed only for blind users. I guess this is better than nothing, but in the long-term, this is unacceptable.

Integrated Accessibility API and Services


With the release of Android 1.6 came an accessibility API integrated into the core of Android, as well as a screen reader (TalkBack) and other accessibility services. A developer outside Google also began working on a screen reader called Spiel. This meant that blind users could now access standard Android applications just like everyone else.

Unfortunately, the Android accessibility API is severely limited. All it can do is send events when something notable happens in the user interface. An accessibility service such as a screen reader can query these events for specific information (such as the text of an object which has been activated), but no other interaction or queries are possible. This means it isn't possible to retrieve information about other objects on the screen unless they are activated, which, among other things, makes screen review impossible. Even the now very dated Microsoft Active Accessibility (the core accessibility API used in Windows), with its many limitations and flaws, allows you to explore, query and interact with objects.

Inability to Globally Intercept Input


In addition, it is not possible for an accessibility service to globally intercept presses on the keyboard or touch screen. Not only does this mean that an accessibility service cannot provide keyboard/touch screen commands for screen review, silencing speech, changing settings, etc., but it also makes touch screen accessibility for blind users impossible. A blind user needs to be able to explore the touch screen without unintentionally activating controls, which can't be done unless the screen reader can provide special handling of the touch screen.

Inaccessible Web Rendering Engine


The web rendering engine used in Android is inaccessible. In fact, it's probably impossible to make it accessible at present due to Android's severely limited accessibility framework, as a user needs to be able to explore all objects on a web page. This means that the in-built web browser, email client and most other applications that display web content are inaccessible. This is totally unacceptable for a modern smart phone.

IDEAL Apps4Android's Accessible Email Client and Web Browser


IDEAL Apps4Android released both an accessible email client and web browser. The accessibility enhancements to the K9 email client (on which their application is based) have since been incorporated into K9 itself, which is fantastic. However, access to the web still requires a separate "accessible" web browser. While other developers can also integrate this web accessibility support into their applications, it is essentially a set of self-voicing scripts which need to be embedded in the application. This is rather inelegant and is very much "bolt-on accessibility" instead of accessibility being integrated into the web rendering engine itself. This isn't to criticise IDEAL: they did the best they could given the limitations of the Android accessibility API and should be commended. Nevertheless, it is an unsatisfactory situation.

More "Accessible" Apps


There are quite a few other applications aside from those mentioned above that have been designed specifically as "accessible" applications, again isolating disabled users from the normal applications used by everyone else. This isolating redundancy is largely due to Android's severely limited accessibility framework.

Solution


Unfortunately, even though Android is open source, solving this problem is rather difficult for people outside the core Android development team because it will require changes to the core of Android. The current accessibility framework needs to be significantly enhanced or perhaps even redesigned, and core applications need to take advantage of this improved framework.

Conclusion


While significant headway has been made concerning accessibility in Android 1.6 and beyond, the situation is far from satisfactory. Android is usable by blind users now, but it is certainly not optimal or straightforward. In addition, the implementation is poorly designed and inelegant. This situation is only going to get messier until this problem is solved.

I find it extremely frustrating that Android accessibility is in such a poor state. It seems that Google learnt nothing from the accessibility lessons of the past. This mess could have been avoided if the accessibility framework had been carefully designed, rather than the half-done job we have now. Good, thorough design is one of the reasons that iPhone accessibility is so brilliant and "just works".

Why Can't Microsoft Build a Screen Reader into Windows?

As Windows screen reader users will know, there is no screen reader included in Windows. Instead, users requiring a screen reader must obtain and install a third party product. Yes, there is Microsoft Narrator, but even Microsoft know that this is hardly worthy of the name "screen reader". :)

A few years ago, Apple revolutionised the accessibility industry by building a fully fledged screen reader, VoiceOver, right into Mac OS X. Ever since, many have asked why Microsoft can't do the same for Windows. Many are angry with Microsoft for this continued lack of built-in accessibility, some using it as support for the "why Apple is better than Microsoft" argument.

Here's some food for thought. I'm not sure Microsoft could do this even if they wanted to; their hands are probably tied in a legal sense. If they did, they could very likely be sued by assistive technology vendors for anti-competitive conduct, just as they have been sued several times concerning their bundling of Internet Explorer with Windows. Once again, Apple don't have to be concerned with this because there wasn't an existing screen reader on Mac OS X and they don't have the dominant position in the market.

I have no evidence for this argument. Perhaps I'm wrong, but history suggests that it is highly likely that I'm not.

Even as one of the lead developers of NVDA, I'm first and foremost a blind user who wants the best possible access, both for myself and other blind users. As such, I would very much welcome a screen reader built into Windows. Competition is good. A built-in screen reader doesn't mean that other screen readers can't exist. If the built-in solution were good enough, then there would be no need for NVDA to exist. If it weren't, NVDA would drive accessibility to improve through innovation and competition.

Using Touch Sensitive Buttons on Modern Notebooks without Sight

Many (perhaps the majority of) notebook/laptop computers now have a strip of touch sensitive keys above the normal keyboard. These keys are not tactile in any way and, depending on the computer, provide such things as multimedia controls (volume, mute, play, stop, etc.), toggles for wireless radios and other special functions. I've always been concerned about how I would access such controls without any sight. Unfortunately, in my search for a new notebook, all of the notebooks which interested me included them, so I decided to just live with it. I bought an Acer Aspire 3935 which, among other things, has touch sensitive keys to toggle bluetooth and wifi.

I began to ponder ways to place some sort of tactile marker on or near these keys so I can find them. However, it recently occurred to me that there's already a perfectly good tactile locator for these keys: the function keys on the normal keyboard, which lie immediately beneath the touch sensitive strip. For example, on my computer, the key for toggling bluetooth is just above the f11 key, so all I have to do is locate the f11 key and move my finger directly up to hit the bluetooth toggle key. This is blindingly obvious in hindsight, but well... hindsight is a wonderful thing. :) Of course, sighted assistance may be required initially to find the keys if trial and error is insufficient.