Xcode Accessibility

This week, AppleVis posted an open letter to Apple regarding the accessibility of its Xcode IDE. As someone who uses Xcode on a daily basis, I wanted to draw some attention to this effort and comment on a few aspects of it.

On the whole, I agree that Xcode could be improved in several key areas, most notably in the way Interface Builder interacts with VoiceOver. There are definitely issues that must be worked around when developing software with it, and I very much want to see Apple address them. So, I applaud the writing of this letter. Whether or not open letters are an effective way of encouraging change is another matter entirely.

In its desire to hammer home some important points, the letter omits other, equally important, items, and I fear these omissions will have two unintended side effects. The first, and most damaging, is that they will discourage visually impaired coders from trying out Xcode at all. The second is that, by omitting certain details, the authors may inadvertently cause the letter not to be taken seriously.

Since this is an important issue that affects all developers who use VoiceOver to access Xcode, I want to add my own thoughts to this discussion.

History

I’ve been using Xcode since version 3. While there is still quite a lot of room for improvement, it should be noted that we’ve come an extraordinarily long way since then in terms of accessibility. It was, in fact, entirely impossible to use Interface Builder in those days. Now, though cumbersome for some tasks, it is at least possible.

When Xcode 4 came along, the UI was entirely revamped. For a VoiceOver user, it was a mess of hundreds, perhaps thousands, of controls all crammed into a never-ending sea in the Xcode main window. It was extremely difficult to use, and more than a little overwhelming. Over the next couple of point releases, a tremendous amount of accessibility work was done, and the interface became immeasurably more VoiceOver-friendly, thanks in part to the logical organization of related controls into groups.

Since then, most, if not all, updates to Xcode have provided improvements to accessibility. Xcode 6 has continued this trend. Many controls that VoiceOver reported as simply “unknown” now have accessibility labels, making them easier to identify. Not all of these labels are terribly intuitive, at least not to me, so there is still room to improve in this regard, but it is yet another step in the right direction.

Tools

All screen readers, regardless of platform, have a variety of tools at their disposal for dealing with difficult situations. JAWS for Windows relies heavily on scripts and configurations, Window-Eyes uses scripts and set files, and VoiceOver relies on Activities and scripts. In broad strokes, all of these strategies are similar, though the details can vary widely between screen readers.

Some of the scenarios described in the open letter can be improved upon by using the tools made available by either VoiceOver or Xcode. HotSpots, in particular, can drastically reduce the amount of navigating and interacting required to perform common tasks in Xcode. I know, because I use this technique every day. Setting VoiceOver HotSpots on the various group views in Xcode not only moves you immediately to a group if it is visible on the screen, but also automatically interacts with it.

Navigation can also be sped up by judicious use of VoiceOver’s Trackpad Commander. This does, however, require that you familiarize yourself with the layout of the various views in Xcode.

Xcode itself has innumerable hotkeys and tricks for increasing efficiency. Some of these are readily obvious in the UI, and some take digging to discover. For instance, Control-Command-Up/Down Arrow jumps quickly between related header and source files, and Command-L jumps to any line number in the current file.

Connecting interface elements to code is hands down the most cumbersome task for VoiceOver users, but if performed correctly, I’ve found it to work extremely consistently. I have received many emails and tweets about this aspect of the process, and in almost every case, the system did not work for one of two reasons: either the user was missing a step, or they were trying to connect an element to an object that it could not be connected to. The latter is particularly problematic because, in OS X development, controls are permitted to be connected to the AppDelegate. In iOS, they are not, and must typically be connected to a ViewController. If you try to connect, for instance, a button to the AppDelegate in an iOS project, nothing happens, nor should it. In the case of users who have missed a step, this illustrates the cumbersome nature of the task and shows why the process needs to be improved.
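
To make the distinction concrete, here is a minimal sketch of the kind of object an iOS control can legitimately be connected to. The class and member names are illustrative, not taken from any particular project, and the syntax assumes Swift as it shipped in the Xcode 6 era. The point is simply that on iOS, outlets and actions belong on a UIViewController subclass, not on the AppDelegate.

    import UIKit

    // Illustrative view controller: on iOS, Interface Builder connections
    // (outlets and actions) are made to a UIViewController subclass like
    // this one, never to the AppDelegate.
    class ViewController: UIViewController {

        // Outlet: in Interface Builder, connect a label to this property.
        @IBOutlet weak var statusLabel: UILabel!

        // Action: connect a button's "Touch Up Inside" event to this method.
        @IBAction func buttonTapped(sender: AnyObject) {
            statusLabel.text = "Button tapped"
        }
    }

Attempting to drag the same connection to the AppDelegate in an iOS project will simply do nothing, which is exactly the silent failure described above.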

Finally, while I have not found the “busy” problem to be quite as bad as implied in the AppleVis letter, it can be largely mitigated by using the VoiceOver option to disable Cursor Tracking (VO-Shift-F3). This is admittedly not ideal, and it is an area I would like to see improved across OS X, in both Apple’s apps and third-party apps. The one place where it becomes a serious issue is the Library group, but turning off Cursor Tracking eliminates the problem there as well.

Wrap Up

Apple can, and should, continue to improve the accessibility of Xcode for VoiceOver users. There is a lot of room for improvement, and hopefully open letters like the one posted by AppleVis will draw more attention to the fact that a growing number of VoiceOver users will benefit from those improvements. To imply that Apple has not been paying attention to this problem, however, is an assertion that is not supported by the history of the product.

If more VoiceOver users embrace Xcode as a development platform, we can have, collectively, a stronger voice to influence change, especially in this more open era that Apple seems to be moving into. We should not, however, avoid making use of the accessibility tools at our disposal, or, however inadvertently, discourage VoiceOver users from learning to use Xcode as it is today. We need more users to strengthen our case to Apple. The more of us there are, the higher a priority these issues can be given. Engineering resources are not infinite.

Let’s discuss the problems of using Xcode with VoiceOver; let’s try to effect change; but let’s also encourage users to make use of all the tools and tricks at their disposal.

Xcode has come a long way since version 3. Here’s hoping Apple continues to push accessibility forward for VoiceOver users in its development tools.


The Maccessibility Dev Podcast #1 – Using Xcode with VoiceOver


In this test pilot podcast, we discuss using the Xcode development environment with VoiceOver, including some useful settings, the layout of the main interface, and the basic usage of Interface Builder. If you find this developer-focused podcast useful, please let us know here. Whether or not we produce more shows like this one will depend on the interest.


From Marco Zehe: An overview of accessible app.net clients

In this blog post, I will cover accessible clients for the app.net service, ordered by platform. It will be updated with new information as I become aware of it. I will also mention some apps in each platform’s “Other” section that I’ve tried and not found accessible, or which have problems severe enough to prevent productive use.

A great piece by Marco Zehe on the accessibility of app.net clients. Not only are Mac and iOS clients covered, but Android, Windows, and web clients are also discussed.


Maccessibility to Livestream Coverage of the WWDC Keynote

This Monday, June 11, Apple will kick off its Worldwide Developers Conference with a Keynote event. It is not known what will be discussed at the Keynote, but at the very least, Apple is expected to preview updates to both OS X and iOS (its desktop and mobile operating systems).

As usual, the regulars of the Maccessibility Podcast will be on hand, streaming coverage of the event. If you would like to listen, point your media player of choice to http://darcy.serverroom.us:8576/listen.pls?sid=1. The Keynote is scheduled to start at 1 PM Eastern, 10 AM Pacific. Our plan is to start streaming at around 12:30 PM Eastern.

If you would like to interact with us during the stream, the best way to do that is to use the #VoLive hashtag on Twitter.