This Monday, June 11, Apple will kick off their Worldwide Developers Conference with a Keynote event. It is not known what will be discussed at the Keynote, but at the very least, Apple is expected to preview updates to both OS X and iOS (their desktop and mobile operating systems).
As usual, the regulars of the Maccessibility Podcast will be on hand, streaming coverage of the event. If you would like to listen, point your media player of choice to http://darcy.serverroom.us:8576/listen.pls?sid=1. The keynote is scheduled to start at 1 PM eastern, 10 AM pacific. Our plan is to start streaming at around 12:30 PM eastern.
If you would like to interact with us during the stream, the best way to do so is to use the #VoLive hashtag on Twitter.
Today at 1 PM eastern, 10 AM pacific, Apple will be holding a press event. The presumption is that they will be announcing the next-generation iPad. The regulars of the Maccessibility Round Table podcast will be offering live commentary of the event.
As is our custom at Mac-cessibility, this review won’t cover the more general enhancements and additions to the iOS operating system which powers Apple’s mobile devices. While some of the new features will certainly be relevant to our discussion, we’re going to focus primarily on what’s new in terms of accessibility, especially where VoiceOver is concerned.
A Note on Timing
A few people have asked us why we waited so long to publish our review of iOS 5 and the new features found in VoiceOver. The fact is that any users publishing reviews, blogs, and additional information on iOS 5 and the new VoiceOver features prior to October 12, 2011, beyond that which Apple has specifically released, were doing so in violation of Apple’s Non-Disclosure Agreement (NDA), which they had to accept before gaining access to the software. We at Mac-cessibility believe that such violations show a lack of respect to Apple, and most especially to the developers who comprise the VoiceOver team. We, therefore, will never publish details of Apple’s products before embargoes are lifted, nor will we provide links to sites which publish such content, unless the information comes straight from Apple itself.
A subtle but welcome change in iOS 5.0 for totally blind users is the default setting for the triple-click of the Home button. Previously, this defaulted to a screen which would allow the user to select an accessibility feature (i.e. Zoom or VoiceOver). The obvious problem with this was, of course, that you needed at least some vision to select the feature. In iOS 5, triple-click of the Home button defaults to VoiceOver. This, coupled with the PC-Free setup feature which no longer requires your device to be tethered to your Mac or PC, means that a visually impaired user can simply remove their iOS device from the box, turn it on, triple-click the Home button, and have full access to the device, including the entire initial setup process.
Low-vision users can use a three-finger triple tap to toggle the Zoom magnification feature on or off at any time. This, in conjunction with the triple-click of the Home button, means that users with some limited vision can rapidly alternate between VoiceOver and Zoom as needed. To accomplish this, simply activate Zoom while VoiceOver is inactive, and then toggle VoiceOver on or off with the triple-click of the Home button when desired.
As with ringtones, users can now define custom vibration patterns and assign them to specific contacts, to be played when those contacts call or text. The feature must be turned on in the General->Accessibility->Hearing section of the Settings app, and patterns can be assigned or created by editing a contact’s information. Patterns are created by tapping the screen in the rhythm you wish to define, and the process works perfectly with VoiceOver thanks to a new "Direct Touch" mode available via the rotor here. Aside from the obvious benefits to users with hearing impairments, I can also see this feature being of particular use to VoiceOver users who wish to identify a caller while their iPhone is in silent mode.
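For the curious, the rhythm-to-pattern conversion can be imagined as recording when each tap begins and ends. The following Swift sketch is purely illustrative; the function name and data representation are our own assumptions, not Apple’s implementation:

```swift
// Illustrative only: one way a tapped rhythm could be turned into a
// vibration pattern. Each tap contributes a vibration lasting as long
// as the finger was down, followed by a pause until the next tap begins.
func vibrationPattern(fromTaps taps: [(down: Double, up: Double)]) -> [(vibrate: Double, pause: Double)] {
    var pattern: [(vibrate: Double, pause: Double)] = []
    for (index, tap) in taps.enumerated() {
        let vibrate = tap.up - tap.down
        // The pause is the silent gap before the next tap begins;
        // the last tap simply ends the pattern.
        let pause = index + 1 < taps.count ? taps[index + 1].down - tap.up : 0
        pattern.append((vibrate: vibrate, pause: pause))
    }
    return pattern
}
```

Two quick taps, for example, would yield two short vibrations separated by the gap you left between them.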
New settings exist for individuals who require assistive devices to operate their Apple product, for better compatibility with hearing aids, and to activate the iPhone’s LED flash when there is an incoming call.
VoiceOver in General
VoiceOver’s settings have been expanded and refined in iOS 5. The Web Rotor and its customizable options are now simply part of the rotor in general, and all items in the rotor can be removed or reordered to suit the user’s personal preference. Items that only make sense on web sites, for example, will only ever appear in the rotor when VoiceOver is focused on web content. Items that can serve double duty, such as "Headings", now navigate through headings both on a web site, and within applications which may make use of headings to identify groups of on-screen elements.
Some of the new rotor options include:
Hints - Users can now use a rotor item to turn VoiceOver hints on or off on the fly.
Containers - Previously, iPad users could use four-finger flicks left/right to move through various large container sections on the screen. (i.e. from the list of messages in Mail to the message body.) Now, this is accomplished via this rotor item.
Volume - VoiceOver volume can now be controlled somewhat independent of the device’s main volume. Unfortunately, the maximum level of VoiceOver is still dictated by the current system volume setting. This means that you can’t turn iPod music playback down low and still have VoiceOver speaking at a comfortable volume, which limits the usefulness of this feature significantly.
Vertical Navigation - When the rotor is set to this option, the user experiences control of VoiceOver which is similar to using arrow keys on a desktop computer. Flicking up/down moves the user up/down, while flicking left/right continues to move the user horizontally. This feature will be especially helpful to those who frequently navigate with VoiceOver via a connected Bluetooth keyboard, as well as in grid-based games or table layouts. The feature, interestingly, ignores artificial boundaries, meaning that if you navigate upward from the top row of a Home screen, you will find that VoiceOver focus has moved to the status bar.
All Items - This rotor option causes the up/down flick gestures to behave more or less the same as the left/right ones. At first, I was perplexed as to the usefulness of this feature. However, I believe that this may be helpful to those users who have difficulty with horizontal flicks when navigating through large numbers of items on web pages. In these cases, flicking up/down may be more comfortable over prolonged periods.
Additionally, VoiceOver now includes the ability to use either a high-quality or compact voice for text-to-speech output, much like what is available in Mac OS X 10.7 Lion. The default voice, dictated by the device’s localization setting, is downloaded in a higher quality format in the background, and once it is available, a switch in the VoiceOver settings allows you to turn on use of the compact voice. While the higher quality voices are nice, we found that older devices struggled a bit when using them. Our first generation iPad, for instance, occasionally crackles when using the higher quality voice. Your mileage may vary.
A new switch in the VoiceOver settings allows the user to choose whether or not notifications are automatically read when the lock screen becomes visible. If this switch is off, only the Time will be announced, along with a notice of the number of notifications waiting to be read.
VoiceOver now includes what was at the top of my list of desired features: the ability to label controls. Some third-party applications have on-screen elements and controls that are either not labeled at all, or provide extraneous or misleading information to VoiceOver. Using a two-finger double-tap-and-hold, you can now label elements in any way you see fit. In our tests, this new functionality works extremely well, and is one of the most welcome additions to the VoiceOver commands.
One of my favorite VoiceOver features in Mac OS X has always been the Item Chooser. It allows for rapid searches of screens or web content for specific items, moving VoiceOver focus to the desired location instantly. The implementation of this feature in iOS 5 is superb.
Using a two-finger triple-tap, VoiceOver users can bring up a window which displays an alphabetized list of all available items on the current screen or web page. On the iPhone, this is presented as a single column, but the iPad version, making excellent use of the extra screen real-estate, displays this as a grid of items. On the right side in either case is the familiar table-index, which allows the user to rapidly move to items which begin with the selected letter. A search field at the top of the screen provides the ability to type a few characters to shorten the list of available items to those which include the characters provided, just as the Item Chooser works on the Mac.
Like the ability to add labels to controls, this is an incredibly useful and well implemented new feature for VoiceOver users, which definitely enhances the iOS experience.
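As a rough illustration of how the search field narrows the list, the Swift sketch below alphabetizes the item names and keeps only those containing the typed characters. The matching rule (case-insensitive substring) is our assumption; Apple hasn’t documented the exact behavior:

```swift
import Foundation

// Sketch of the Item Chooser's narrowing behavior: alphabetize the
// available item names, then keep only those containing the typed
// characters. Case-insensitive substring matching is assumed.
func itemChooserResults(items: [String], query: String) -> [String] {
    let alphabetized = items.sorted { $0.lowercased() < $1.lowercased() }
    guard !query.isEmpty else { return alphabetized }
    return alphabetized.filter { $0.lowercased().contains(query.lowercased()) }
}
```

With the items "Messages", "Mail", "Safari", and "Maps" on screen, typing "ma" would narrow the list to "Mail" and "Maps".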
In addition to a variety of new rotor options which provide quick navigation between web elements, VoiceOver provides new keyboard shortcuts for those using their iOS device with a hardware keyboard. For instance, pressing "H" will jump through headings.
Changing the rotor setting now announces the number of elements of the specified type that exist on the page. For example, when "Headings" is selected with the rotor, VoiceOver might announce, "5 headings." On pages which contain an excessive number of elements of a given type, VoiceOver will simply announce "Many headings." This is undoubtedly done for performance reasons, and, let’s face it, who really needs to know that there are 478 headings in your Facebook timeline, anyway?
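The behavior can be pictured as a simple threshold check, as in this Swift sketch; note that the cutoff value below is a guess on our part, as Apple’s actual threshold is undocumented:

```swift
// Guesswork sketch of the rotor's count announcement: exact counts for
// modest numbers of elements, "Many" beyond an assumed cutoff.
func rotorAnnouncement(count: Int, ofType elementType: String) -> String {
    let cutoff = 100 // assumed value; Apple's actual threshold is undocumented
    if count > cutoff { return "Many \(elementType)s" }
    return "\(count) \(elementType)\(count == 1 ? "" : "s")"
}
```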
Perhaps the most significant change in web browsing, and certainly the one I’ve found the most transformative, is available only on iPad devices. Since the beginning, VoiceOver has played a distinctive sound to indicate when moving in or out of significant containers on the iPad’s larger screen. For instance, the sound plays when entering the list of messages or the message body in Mail, when entering the Dock or status bar on the Home screens, and so on.
Now, this indication is also provided when moving into or out of significant containers on web pages as well. Examples of this might be table cells, web site navigation bars, advertisement frames, etc.
This feature is, of course, partially limited by how well the HTML code in question has been written, but in my testing it works incredibly well, and has very quickly changed how I browse the web. This is one of those changes that, on the surface, seems trivial, but in practice adds a whole new dimension to the speed and efficiency of browsing on the iPad, and I can’t say enough good things about it.
Also available on the iPad version of Safari is tabbed browsing. Not surprisingly, the tab bar is fully accessible with VoiceOver, tabs can be rearranged using the double-tap-and-hold gesture with appropriate feedback, and the feature greatly enhances the experience of browsing multiple pages at once on the iPad.
It’s the Little Things
Like any operating system update, there are countless fixes, refinements, and small changes that enhance the user experience. VoiceOver in iOS 5 is no exception.
The three-finger single-tap, typically used for checking one’s position in a scrolling list or web page, has had its functionality expanded. In addition to the information it has always provided, this gesture also describes the location of an item and its approximate size. This can be invaluable if a user has navigated to an item using VoiceOver navigation gestures, and now wishes to learn its physical location on the screen. For example, if one touches the leftmost item in the Dock on the Home screen, then performs the three-finger single-tap, VoiceOver would describe that item as being in the lower left corner, with the approximate width of a thumbnail.
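One can imagine the location description being derived from a simple grid over the screen; the thirds-based split and the wording in this Swift sketch are our assumptions, not Apple’s actual algorithm:

```swift
// Hypothetical sketch: map an element's center point to a spoken
// position like "lower left", using a three-by-three grid of the screen.
func positionDescription(x: Double, y: Double, screenWidth: Double, screenHeight: Double) -> String {
    let column = x < screenWidth / 3 ? "left" : (x > screenWidth * 2 / 3 ? "right" : "center")
    let row = y < screenHeight / 3 ? "upper" : (y > screenHeight * 2 / 3 ? "lower" : "middle")
    return "\(row) \(column)"
}
```

A point near the bottom-left of a portrait iPad screen, for instance, would come back as "lower left", matching the Dock example above.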
The Home screen page indicator, located just above the Dock, has always indicated the number of the currently active Home screen page. Previously, double tapping this control would advance forward through the pages. There was no way to move backward through the pages, short of using the three-finger flick-right gesture. This made navigating through Home screens difficult when operating the device one handed. In iOS 5, this control is now of the type that VoiceOver calls adjustable. Flicking up/down will navigate forward/backward through the Home screens.
iOS 5 includes a new split keyboard feature. When active, the on-screen keyboard is split in half, each side is reduced in size, and moved into the lower left and right corners. This allows the user to type with only their thumbs, much as they would on an iPhone, while holding the device in both hands. This is especially useful if one needs to enter text while standing, or when there is no available place to set down the iPad. To split the keyboard with VoiceOver, simply perform the scrub gesture while the keyboard has focus. Perform the gesture again to merge the two halves of the keyboard together and return it to normal operation.
A couple of features we were unable to test at this time include the new face detection capabilities, available only on the iPad 2 and iPhone 4S. When taking pictures with the camera, VoiceOver will announce the number of faces it can detect in the frame, and the approximate positioning of those faces. This is an incredible leap forward for visually impaired photographers, and we look forward to seeing it in action.
Available only on the iPad 2, VoiceOver will allow the user to flip through active applications using the four-finger flick left/right gestures.
By far, this update to VoiceOver in iOS is the most significant we’ve seen since its introduction with the iPhone 3GS. The new features and refinements to VoiceOver add tremendous benefits to the iOS experience, and we’ve only covered the highest profile changes here.
Today at 1 PM eastern, 10 AM pacific, Apple will be holding a press event to announce their newest iPhone and the latest version of the iOS operating system. The regulars of the Maccessibility Round Table Podcast will be live streaming at that time offering up coverage and commentary of the event. If you’d like to listen, point your media player of choice to “http://stream.atmaine.com:8214/”.
If you’d like to interact with us during the stream, the best way to do it is by using the #VOLive hashtag on Twitter.
If you can’t join us live, then look for a Maccessibility Round Table podcast later in the week summing up today’s announcements.