This week, AppleVis posted an open letter to Apple, regarding the accessibility of its Xcode IDE. As someone who uses Xcode on a daily basis, I wanted to draw some attention to this effort and comment on a few aspects of it.
On the whole, I agree that Xcode could be improved in several key areas, most notably the way Interface Builder interacts with VoiceOver. There are definitely issues that must be worked around when developing software with it, and I very much want to see Apple address these issues. So, I applaud the writing of this letter. Whether or not open letters are an effective way of encouraging change is another matter entirely.
In its desire to hammer home some important points, the letter omits other, equally important, items, and I fear these omissions will potentially result in two unintended side effects. The first, and most damaging, is that they will discourage visually impaired coders from trying out Xcode at all. The second is that, by omitting certain details, the letter may inadvertently fail to be taken seriously.
Since this is an important issue that affects all developers who use VoiceOver to access Xcode, I want to add my own thoughts to this discussion.
I’ve been using Xcode since version 3. While there is still quite a lot of room for improvement, it should be noted that we’ve come an extraordinarily long way since then in terms of accessibility. It was, in fact, entirely impossible to use Interface Builder in those days. Now, though cumbersome for some tasks, it is at least possible.
When Xcode 4 came along, the UI was entirely revamped. For a VoiceOver user, it was a mess of hundreds, perhaps thousands, of controls all crammed into a never-ending sea in the Xcode main window. It was extremely difficult to use, and more than a little overwhelming. Over the next couple of point releases, a tremendous amount of accessibility work was done, and the interface became immeasurably more VoiceOver friendly, thanks in part to the logical organization of related controls into groups.
Since then, most, if not all, updates to Xcode have provided improvements to accessibility. Xcode 6 has continued this trend. Many controls that VoiceOver reported as simply “unknown” now have accessibility labels, making them easier to identify. Not all of these labels are terribly intuitive, at least not to me, so there is still room to improve in this regard, but it is yet another step in the right direction.
All screen readers, regardless of platform, have a variety of tools at their disposal for dealing with difficult situations. Jaws for Windows relies heavily on scripts and configurations, Window-Eyes uses scripts and set files, and VoiceOver relies on Activities and scripts. In broad strokes, all of these strategies are similar, though the details can vary widely between screen readers.
Some of the scenarios described in the open letter can be improved upon by using the tools made available by either VoiceOver or Xcode. HotSpots, in particular, can drastically minimize the amount of navigating and interacting required to perform common tasks in Xcode. I know, because I use this technique every day. Setting VoiceOver HotSpots to the various group views in Xcode not only moves you immediately to that group if it is visible on the screen, but it also automatically interacts with it.
Navigation can also be sped up by judicious use of VoiceOver’s Trackpad Commander. This does, however, require that you familiarize yourself with the layout of the various views in Xcode.
Xcode itself has innumerable hotkeys and tricks for increasing efficiency. Some of these are readily obvious in the UI, and some take a bit of digging to discover. For instance, Control-Command-Up/Down Arrow jumps quickly between related header and source files, and Command-L jumps to any line number in the current file.
Connecting interface elements to code is hands down the most cumbersome task for VoiceOver users, but when performed correctly, I’ve found it to work extremely consistently. I have received many emails and tweets about this aspect of the process, and in almost every case, the system failed for one of two reasons: either the user was missing a step, or they were trying to connect an element to an object that it could not be connected to. The latter is particularly problematic, because in OS X development, controls are permitted to be connected to the AppDelegate; in iOS, they are not, and must typically be connected to a ViewController. If you try to connect, for instance, a button to the AppDelegate in an iOS project, nothing happens, as it shouldn’t. As for users who have missed a step, that is illustrative of the cumbersome nature of the task, and shows why the process needs to be improved.
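As a rough illustration of why the connection target matters, here is a minimal Objective-C sketch. The class and method names are hypothetical, not taken from the letter or from any particular project; the point is only that, in an iOS project, the action must be declared on a view controller subclass for Interface Builder to wire a control to it.

```objc
// Hypothetical sketch: MyViewController and buttonTapped: are
// illustrative names, not part of any real project or Apple API.
#import <UIKit/UIKit.h>

@interface MyViewController : UIViewController
// A button’s “Touch Up Inside” event can be connected to this action
// in Interface Builder, because it lives in a view controller.
- (IBAction)buttonTapped:(id)sender;
@end

@implementation MyViewController
- (IBAction)buttonTapped:(id)sender {
    NSLog(@"Button tapped.");
}
@end
```

Declaring the same method on the AppDelegate of an iOS project and attempting the connection there is the failure mode described above: the connection simply does not take.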
Finally, while I have not found the “busy” problem to be quite as bad as implied in the AppleVis letter, it can be largely mitigated by using VoiceOver’s option to disable Cursor Tracking (VO-Shift-F3). This is admittedly not ideal, and it is an area I would like to see improved across OS X, both in Apple’s apps and in third-party apps. The one place where it becomes a serious issue is in the Library group, but turning off Cursor Tracking eliminates the problem.
Apple can, and should, continue to improve the accessibility of Xcode for VoiceOver users. There is a lot of room for improvement, and hopefully open letters, like the one posted by AppleVis, will draw more attention to the fact that there are a growing number of VoiceOver users who will benefit from said improvements. To imply that Apple has not been paying attention to this problem, however, is an assertion that is not upheld by the history of the product.
If more VoiceOver users embrace Xcode as a development platform, we can have, collectively, a stronger voice to influence change, especially in this more open era that Apple seems to be moving into. We should not, however, avoid making use of the accessibility tools at our disposal, or, however inadvertently, discourage VoiceOver users from learning to use Xcode as it is today. We need more users to strengthen our case to Apple. The more of us there are, the higher a priority these issues can be given. Engineering resources are not infinite.
Let’s discuss the problems of using Xcode with VoiceOver; let’s try to effect change; but let’s also encourage users to utilize all the tools and tricks at their disposal.
Xcode has come a long way since version 3. Here’s hoping Apple continues to push accessibility forward for VoiceOver users in its development tools.
Apple’s sixth major release of iOS has finally arrived. Along with the scores of new features for users of all stripes come changes and enhancements to the mobile operating system’s accessibility features, including VoiceOver.
This article is not meant to be an exhaustive list of everything that has changed in iOS 6. Instead, it should provide highlights of some of the most interesting additions and enhancements to the system’s accessibility feature set.
Although iOS 6 is available on the iPad 2, 3rd-generation iPad, iPhone 3GS, iPhone 4, and iPhone 4S, as well as Apple’s forthcoming iPhone 5, not all features are available on all devices. We will try to clarify when a feature being discussed is not available on all devices.
Let’s take a look at iOS 6.
A Note On Timing
From time to time we are asked why we wait so long to publish our reviews of new iOS releases and the new features found in VoiceOver. The fact is that any users publishing information about iOS and the new VoiceOver features prior to the official release, beyond that which Apple has specifically announced, are doing so in violation of Apple’s Non-Disclosure Agreement (NDA), which they had to accept before gaining access to the software. We at Mac-cessibility believe that such violations show a lack of respect to Apple, and most especially to the developers who comprise the VoiceOver team. We, therefore, do not publish details of Apple’s products before embargoes are lifted.
Punctuating the Point
Since the dawn of accessible computing, screen reading solutions have offered various settings to control how much textual information is presented in spoken form to the user. Chief among these were the now familiar “Some”, “Most”, “All”, and “None” values for the amount of punctuation which should be explicitly detailed. Early on, there was little use in a mobile operating system for this information. As iOS has matured, however, its utility has grown. Word processing apps such as Pages have expanded the ways in which we use our mobile devices, and the time has come to add this kind of customization to VoiceOver’s settings.
The Punctuation settings have been simplified to the most common three possibilities: “Some”, “All”, and “None”. To gain access to these features, simply add the Punctuation rotor option to the VoiceOver rotor via the VoiceOver Rotor settings. Selecting “Punctuation” from the rotor and then flicking up and down will adjust the setting as needed.
The exact meaning of these settings has varied over the years between assistive tech products. What VoiceOver calls “All” punctuation is what many long-time AT users will perhaps think of as “Most”. That is to say that all punctuation marks and various other textual symbols are announced, but white space is not.
The multi-touch paradigm has brought with it the idea that many different kinds of actions can be performed on an on-screen element. Swiping, tapping, rotating, etc, can all have different results. VoiceOver provides its own gestures for the most common of these, but it would be virtually impossible to provide alternatives for every conceivable command and combination.
VoiceOver has long offered the “Pass-through” gesture which partially overrides VoiceOver’s gestures in favor of what the user is performing on-screen, but as iOS applications became more mature and capable, the pass-through gesture was insufficient to cover all cases.
One of the simplest examples of how gestures are used in iOS is in the stock Mail app. Tapping a message in the inbox list selects and opens it. Alternatively, a sighted user may swipe across the message’s entry in the list to delete it more fluidly.
In iOS 6, VoiceOver introduces “Actions”. “Actions” appear in the rotor when VoiceOver focus is on a control which accepts gestures other than the defaults, and which has been properly set up to use the new accessibility Actions feature.
The VoiceOver user can flick up and down when the “Actions” rotor option is selected and choose from the available commands. Often, the “Actions” setting of the rotor is auto-selected when it becomes available, so the user need not manually select it.
For an example of this in practice, open the Mail app and place VoiceOver focus on a message you wish to delete. Flick downward and VoiceOver will announce “Delete”. Double-tap and the message is instantly deleted. Focus will automatically move to the next message in the list, and VoiceOver will announce it automatically. Using this technique, the VoiceOver user can quickly move through and delete a large number of messages.
Essentially, the “Actions” rotor option provides a kind of menu in which commands which are usually performed by gestures can be offered to a VoiceOver user in an elegant and intuitive format.
In iOS 6, Apple has replaced the Maps app with a brand-new offering. The new implementation uses Apple’s own map data, rather than that provided by Google.
On capable devices, Maps accessibility is quite remarkable. It particularly shines on the 3rd-generation iPad, where the larger screen provides a more expansive look at the visible map for VoiceOver users.
Detailing all of the changes in the Maps application is beyond the scope of this article. We are, therefore, going to focus on the map display itself, something which hitherto has been wholly inaccessible.
In the VoiceOver rotor, users will find a “Zoom” option when a map is visible on screen. This allows for zooming in on portions of the map for more detailed examination, or zooming out for a bird’s eye view of a larger area.
Lower values for the zoom can give global, continental, or country-wide views, while higher zoom levels let you examine streets and neighborhoods. As the zoom level is changed, VoiceOver announces some of the major points of interest, roads, and highways currently visible on the map.
When viewing, for instance, North America as a whole, users can explore the screen and get an idea of the locations of various major cities in relation to one another. (San Francisco on the west coast, New York on the east, St. Louis in the mid-west, and so on.) Three-finger flicks to the left, right, up, or down, will pan the view as one would expect, allowing the VoiceOver user to explore further.
When exploring an area in high detail, such as the streets of a neighborhood, VoiceOver announces the streets as you touch them, and invites you to “Pause to follow.” If you continue to touch a street, VoiceOver enters a mode which provides auditory feedback via a low thudding sound. You can continue to explore, and as long as your finger is on the road, the sound persists. If you move off of the road it stops. In this way, it is possible to follow the route of roads, streets, and highways.
VoiceOver provides spoken information about the orientation of the roads, as well as announcements of intersections as you encounter them. You may venture down intersecting roads and VoiceOver will change to following the new road fluidly.
Finally, a “Points of Interest” option in the rotor provides a way to jump through the visible points of interest on the map quickly. Depending on the zoom level of the map, this may include cities, roads, or even restaurants and landmarks.
The new Maps application is a massive leap forward for accessibility of map information. Due to maps’ visual nature, some users may need some time to become accustomed to exploring them and following roads with VoiceOver, but in time, this functionality will likely become indispensable for visually impaired users wishing to better understand their geographical surroundings. Users who have, or have ever had, some vision may adapt to using Maps more quickly.
It’s the Little Things
As usual, users will find a myriad of small improvements and changes as they use iOS 6. A couple of these merit brief mentions.
In Messages, messages which contain links or other elements are treated as single items by VoiceOver, rather than being broken up into their component parts. To access a link inside a message, simply use the rotor to select “Links” and flick downward.
When reading a web page continuously using the VoiceOver two-finger flick down, the VoiceOver focus click auditory indicator is suppressed for elements which are contained within other elements. For example, links which are inside a block of text, such as a paragraph, are simply announced as links without triggering the focus click. This makes for a much more fluid and pleasant reading experience.
All in All
The iOS 6 update brings many enhancements to the VoiceOver experience that users are certain to appreciate. As has been, and should be, the case, the real pleasure of using the update is in the system’s new mainstream features, to which VoiceOver users have equal access.
As is our custom at Mac-cessibility, we won’t be covering the more general enhancements and additions to the iOS operating system which powers Apple’s mobile devices in this review. While some of the new features will certainly be relevant to our discussion, we’re going to be primarily focusing on what’s new in terms of accessibility, especially as regards VoiceOver.
A Note on Timing
A few people have asked us why we waited so long to publish our review of iOS 5 and the new features found in VoiceOver. The fact is that any users publishing reviews, blogs, and additional information on iOS 5 and the new VoiceOver features prior to 12/October/2011, beyond that which Apple has specifically released, are doing so in violation of Apple’s Non-Disclosure Agreement (NDA), which they had to accept before gaining access to the software. We at Mac-cessibility believe that such violations show a lack of respect to Apple, and most especially to the developers who comprise the VoiceOver team. We, therefore, will never publish details of Apple’s products before embargoes are lifted, nor will we provide links to sites which publish such content, unless the information comes straight from Apple itself.
A subtle but welcome change in iOS 5.0 for totally blind users is the default setting for the triple-click of the Home button. Previously, this defaulted to a screen which would allow the user to select an accessibility feature (e.g. Zoom or VoiceOver). The obvious problem with this was, of course, that you needed at least some vision to select the feature. In iOS 5, triple-click of the Home button defaults to VoiceOver. This, coupled with the PC-Free setup feature which no longer requires your device to be tethered to your Mac or PC, means that a visually impaired user can simply remove their iOS device from the box, turn it on, triple-click the Home button, and have full access to the device, including the entire initial setup process.
Low-vision users can use a three-finger triple tap to toggle the Zoom magnification feature on or off at any time. This, in conjunction with the triple-click of the Home button, means that users with some limited vision can rapidly alternate between VoiceOver and Zoom as needed. To accomplish this, simply activate Zoom while VoiceOver is inactive, and then toggle VoiceOver on or off with the triple-click of the Home button when desired.
Like ringtones, users can now define custom vibration patterns, and assign these patterns to specific contacts when they call or text. The feature must be turned on in the General->Accessibility->Hearing section of the Settings app, and patterns can be assigned or created by editing a contact’s information. Patterns are created by tapping the screen in the rhythm you wish to define, and it works perfectly with VoiceOver thanks to a new "Direct Touch" mode available via the rotor here. Aside from the obvious benefits to users with hearing impairments, I can also see this feature being of particular use to VoiceOver users who wish to be able to identify a caller when their iPhone is in silent mode.
New settings exist for individuals who require assistive devices to operate their Apple product, for better compatibility with hearing aids, and to activate the iPhone’s LED flash when there is an incoming call.
VoiceOver in General
VoiceOver’s settings have been expanded and refined in iOS 5. The Web Rotor and its customizable options are now simply part of the rotor in general, and all items in the rotor can be removed or reordered to suit the user’s personal preference. Items that only make sense on web sites, for example, will only ever appear in the rotor when VoiceOver is focused on web content. Items that can serve double duty, such as "Headings", now navigate through headings both on a web site, and within applications which may make use of headings to identify groups of on-screen elements.
Some of the new rotor options include:
Hints - Users can now use a rotor item to turn VoiceOver hints on or off on the fly.
Containers - Previously, iPad users could use four-finger flicks left/right to move through various large container sections on the screen. (i.e. from the list of messages in Mail to the message body.) Now, this is accomplished via this rotor item.
Volume - VoiceOver volume can now be controlled somewhat independently of the device’s main volume. Unfortunately, the maximum level of VoiceOver is still dictated by the current system volume setting. This means that you can’t turn iPod music playback down low and still have VoiceOver speaking at a comfortable volume, which limits the usefulness of this feature significantly.
Vertical Navigation - When the rotor is set to this option, the user experiences control of VoiceOver which is similar to using arrow keys on a desktop computer. Flicking up/down moves the user up/down, while flicking left/right continues to move the user horizontally. This feature will be especially helpful to those who frequently navigate with VoiceOver via a connected Bluetooth keyboard, as well as in grid-based games or table layouts. The feature, interestingly, ignores artificial boundaries, meaning that if you navigate upward from the top row of a Home screen, you will find that VoiceOver focus has moved to the status bar.
All Items - This rotor option causes the up/down flick gestures to behave more or less the same as the left/right ones. At first, I was perplexed as to the usefulness of this feature. However, I believe that this may be helpful to those users who have difficulty with horizontal flicks when navigating through large numbers of items on web pages. In these cases, flicking up/down may be more comfortable over prolonged periods.
Additionally, VoiceOver now includes the ability to use either a high-quality or compact voice for text-to-speech output, much like what is available in Mac OS X 10.7 Lion. The default voice, dictated by the localization setting of the device, is downloaded in a higher quality format in the background, and when available, a switch in the VoiceOver settings allows you to turn on use of the compact voice. While the higher quality voices are nice, we found that older devices struggled a bit when using them. Our first generation iPad, for instance, occasionally crackles when using the higher quality voice. Your mileage may vary.
A new switch in the VoiceOver settings allows the user to choose whether or not notifications are automatically read when the lock screen becomes visible. If this switch is off, only the Time will be announced, along with a notice of the number of notifications waiting to be read.
VoiceOver now includes what was at the top of my list of desired features: the ability to label controls. Some third-party applications have on-screen elements and controls that are either not labeled at all, or provide extraneous or misleading information to VoiceOver. Using a two-finger double-tap-and-hold, you can now label elements in any way you see fit. In our tests, this new functionality works extremely well, and is one of the most welcome additions to the VoiceOver commands.
One of my favorite VoiceOver features in Mac OS X has always been the Item Chooser. It allows for rapid searches of screens or web content for specific items, moving VoiceOver focus to the desired location instantly. The implementation of this feature in iOS 5 is superb.
Using a two-finger triple-tap, VoiceOver users can bring up a window which displays an alphabetized list of all available items on the current screen or web page. On the iPhone, this is presented as a single column, but the iPad version, making excellent use of the extra screen real estate, displays this as a grid of items. On the right side in either case is the familiar table index, which allows the user to rapidly move to items which begin with the selected letter. A search field at the top of the screen provides the ability to type a few characters to shorten the list of available items to those which include the characters provided, just as the Item Chooser works on the Mac.
Like the ability to add labels to controls, this is an incredibly useful and well implemented new feature for VoiceOver users, which definitely enhances the iOS experience.
In addition to a variety of new rotor options which provide quick navigation between web elements, VoiceOver provides new keyboard shortcuts for those using their iOS device with a hardware keyboard. For instance, "H" will jump through headings when pressed.
Changing the rotor setting now announces the number of elements of the specified type that exist on the page. For example, when "Headings" is selected with the rotor, VoiceOver might announce, "5 headings." On pages which contain an excessive number of elements of a given type, VoiceOver will simply announce "Many headings." This is undoubtedly done for performance reasons; and, let’s face it, who really needs to know that there are 478 headings in your Facebook timeline, anyway?
Perhaps the most significant change in web browsing, and certainly the one I’ve found the most transformative, is available only on iPad devices. Since the beginning, VoiceOver has played a distinctive sound to indicate when moving in or out of significant containers on the iPad’s larger screen: for instance, when entering the list of messages or the message body in Mail, or when entering the Dock or status bar on a Home screen.
Now, this indication is also provided when moving into or out of significant containers on web pages as well. Examples of this might be table cells, web site navigation bars, advertisement frames, etc.
This feature is, of course, partially limited by how well the HTML code in question has been written, but in my testing it works incredibly well, and has very quickly changed how I browse the web. This is one of those changes that, on the surface seems trivial, but in practice adds a whole new dimension to the speed and efficiency of browsing on the iPad, and I can’t say enough good things about it.
Also available on the iPad version of Safari is tabbed browsing. Not surprisingly, the tab bar is fully accessible with VoiceOver, tabs can be rearranged using the double-tap-and-hold gesture with appropriate feedback, and the feature greatly enhances the experience of browsing multiple pages at once on the iPad.
It’s the Little Things
Like any operating system update, there are countless fixes, refinements, and small changes that enhance the user experience. VoiceOver in iOS 5 is no exception.
The three-finger single-tap, typically used for checking one’s position in a scrolling list or web page, has had its functionality expanded. In addition to the information it has always provided, this gesture also describes the location of an item and its approximate size. This can be invaluable if a user has navigated to an item using VoiceOver navigation gestures, and now wishes to learn its physical location on the screen. For example, if one touches the leftmost item in the Dock on the Home screen, then performs the three-finger single-tap, VoiceOver would describe that item as being in the lower left corner, with the approximate width of a thumbnail.
The Home screen page indicator, located just above the Dock, has always indicated the number of the currently active Home screen page. Previously, double tapping this control would advance forward through the pages. There was no way to move backward through the pages, short of using the three-finger flick-right gesture. This made navigating through Home screens difficult when operating the device one handed. In iOS 5, this control is now of the type that VoiceOver calls adjustable. Flicking up/down will navigate forward/backward through the Home screens.
iOS 5 includes a new split keyboard feature. When active, the on-screen keyboard is split in half, each side is reduced in size, and moved into the lower left and right corners. This allows the user to type with only their thumbs, much as they would on an iPhone, while holding the device in both hands. This is especially useful if one needs to enter text while standing, or when there is no available place to set down the iPad. To split the keyboard with VoiceOver, simply perform the scrub gesture while the keyboard has focus. Perform the gesture again to merge the two halves of the keyboard together and return it to normal operation.
A couple of features we were unable to test at this time include the new face detection capabilities, available only on the iPad 2 and iPhone 4S. When taking pictures with the camera, VoiceOver will announce the number of faces it can detect in the frame, and the approximate positioning of those faces. This is an incredible leap forward for visually impaired photographers, and we look forward to seeing it in action.
Available only on the iPad 2, VoiceOver will allow the user to flip through active applications using the four-finger flick left/right gestures.
By far, this update to VoiceOver in iOS is the most significant we’ve seen since its introduction with the iPhone 3GS. The new features and refinements to VoiceOver add tremendous benefits to the iOS experience, and we’ve only covered the highest profile changes here.
If that list seems incongruous, it isn’t. There are few throughout history who have had as profound an impact on the lives of the visually impaired as Steve Jobs.
He wasn’t an engineer, or a scientist, or a mathematician. He was, at his core, a man who saw beyond the limitations of the present to the possibilities of the future, and how that future could, and in fact should, be inclusive to all, regardless of an individual’s limitations or abilities.
A few of those who have contributed to this site over the years have shared their thoughts on the loss of Steve Jobs. The pieces below do not try to tell a cohesive story. They are meant only to offer a glimpse into the lives of a handful of individuals among the multitudes whom Steve Jobs’s vision touched.
From Josh de Lioncourt
There are few areas of my life that have not been touched by Apple Inc., and there is no other company that more purely embodies the vision and soul of its founder than does Apple of Steve Jobs.
As a young, blind child, the first computer I ever touched was an Apple IIe. It was made accessible with an Echo voice synthesizer, and my school had only a handful of diskettes with applications we could use. I wrote my first significant story on that computer, which went on to win a district-wide competition.
I wrote my first few lines of programming code on that machine as well, painstakingly picking apart the applications we had, figuring out how they worked and why, and then creating my own programs from what I had learned.
A few years later, the first computer of my own was an Apple IIGS, and I expanded my writing and coding abilities on that machine.
Fast-forward to today, as I sit here typing this piece on another Apple-made computer, and things have somehow changed both tremendously, and not at all.
From the books I read, to the software I’ve developed, to the prose I write, to the music I listen to or compose, to the friends and family with whom I keep in touch everyday, no part of my life has been left untouched by Steve’s vision of the future. My independence as a blind adult in the 21st century is far beyond what it would have been without Steve and the team he assembled at Apple, and my quality of life exceeds that of my wildest imaginings as a child.
But these wonderful opportunities and accomplishments aren’t nearly all of the story. If it wasn’t for Steve Jobs, his remarkable vision, and the thousands of talented people he gathered to realize his dream of a better future for the world, I would never have made the acquaintance of some of the greatest friends I have ever had in my life. Through the power of independence and equality he placed in the hands of the visually impaired community, he brought people together, including the group of fine individuals who contribute to Mac-cessibility, and who have enriched my life every bit as much as Apple’s products have. Steve knew how to incorporate a human element into technology, and he instilled that vision into the DNA of the company he founded, nurtured, and brought back from the brink in its darkest hour.
We mourn the loss of a great man today, but we will celebrate his vision of a brighter tomorrow forever.
Thank you, Steve.
From Darcy Bernard
As a rule, I’m not the type of person to become emotional at the death of someone I don’t know personally. However, that was definitely not the case when I heard yesterday of the passing of Steve Jobs. As I looked through my Twitter feed, I found that I was by no means the only one. For a while there, just about every tweet was someone expressing their sadness at Steve’s death.
I’ve always had tremendous respect for Steve Jobs. He always strove to make technology something for everyone, and of course this included those of us with disabilities. Back in 2006, he said during the WWDC keynote that one of the most important aspects of the Mac was that anyone could use it. These weren’t just words. Now, every product that his company makes has accessibility built into it. To date, no other mainstream company can make this claim. I don’t know how much Steve Jobs himself was concerned with accessibility, but two things are clear. First, given his hands-on approach, if he didn’t think accessibility was important, it wouldn’t be in Apple’s products. Second, if Steve hadn’t returned to Apple in 1997, the company would likely have gone under, and we definitely wouldn’t have those accessible products.
I saw a quote last night, in one of the tributes to Steve, from author Neil Gaiman that I think we can all agree with: "Steve Jobs left the world a much better and more interesting place than he found it."
From John D. Panarese
I don’t think that the impact Steve Jobs has had on the world and technology will be fully realized, or seen in its full perspective, until years from now. That is the mark of a true visionary. Yes, many are aware of a visionary’s accomplishments and contributions as they occur, but the “big picture” isn’t fully seen until their passing, unfortunately.
To the family of Mr. Jobs, and to Apple, my thoughts, prayers, and condolences go out to you. How much Steve impacted my life through his innovations and contributions to technology, and my independence as a blind person, cannot be measured. I am truly saddened by the news of his passing, but I am thankful beyond measure for what he has done for us all. May he rest in well-deserved peace.
From Holly Anderson
I am so very sad to hear about the death of Steve Jobs. While he was not someone I knew personally or even met, I feel a profound loss. While leading Apple, he made several decisions that impacted my life greatly. I know that he was not totally responsible for accessibility in Apple products; it was a team effort, but he was the one who had final say at Apple. If Steve had felt accessibility was not worth doing, it wouldn’t have been done. I can’t remember now the exact quote, but Steve said he wanted everyone to be able to use Apple products. Apple changed the landscape of accessibility forever, and I can’t help but feel he was somewhat responsible for that. He left behind an amazing legacy, and was taken from us too soon.
From M.J. Phoenix
Words are hard to come by on a day as sadly historic as the one that brought the loss of Steve Jobs, but I hope I can express an ounce of what I’m feeling today. The world lost someone so great.
Three years ago I invested in the world of Apple, purchasing my first MacBook and iPod on the same day. Suddenly a world of mainstream technology was opened up to me for the same price as anyone else. And that was because a visionary was born fifty-six years ago: someone who delivered countless innovative ideas that have changed the way in which the entire world interacts with technology. I bought into the accessible world of Apple three years ago, but the influence of Steve Jobs’ work stretches back long before then. From the first time I touched a mouse on a computer as a schoolchild, to each time I changed a font on my schoolwork, to the days of buying and listening to music and watching animated movies, Steve Jobs was behind all of that. His particular attention to detail made everything he did stand out from the rest of the market, and his vision of a world where technology should be accessible to all made it possible for all of us to use computers on every level, from iMacs to our iPhones and iPads, to buying music online, and so much more. He revolutionised the world in which we live through his vision for something that seemed to most out of reach, but his determination, passion, drive, and genius made Steve Jobs the man that we all came to admire and respect.
His long battle with the illness that sadly took him from us only made us admire him so much more, because his greatest innovations came from those struggling times. I’ll always remember the cheerful manner in which he delivered every keynote and speech we saw, the determination to achieve perfection, no matter the cost, and every great thing he contributed to our world that changed our lives. Very few can say they ever truly changed the world, but Steve Jobs, you certainly achieved that.
I never knew you as a person but I know your innovative spirit lives on in all of our lives. Rest in peace our dear friend. You are already missed!
From Anne Robertson
I woke up this morning to the news that Steve Jobs has died. Although not unexpected, it came sooner than I had thought.
When he had his liver transplant, I hoped it would be a new start for him as it was for me.
Anyone who thinks that making Apple computers accessible to the visually impaired was just a marketing strategy is sorely mistaken. When Steve Jobs became a millionaire in 1979, he gave a lot of money to a charitable organization set up to help blind people in India and Nepal. I’m sure it struck him as doubly unfair to make people who have more difficulty than others earning money pay extra for accessibility.
Steve’s suffering is over now, and I wish his family strength to get through the coming weeks and months.
From Keith Reedy
One of the saddest bulletins that I have ever received on my iPhone was the bulletin which told me that Steve Jobs had died.
I never had the pleasure of meeting Steve Jobs, but he changed my life completely, in that he changed the way I access technology; first with the Mac in 2005, with the release of Tiger and VoiceOver, and then with the invention and release of my constant companion, the iPhone.
I was there in the old days when the naysayers said that VoiceOver was just a passing thing meant to comply with government regulations, but Steve knew better. Better accessibility for blind and low vision people was not the only thing on Steve Jobs’ mind, but it was part of the vision that became a reality and will live on long after Steve Jobs is buried.
This community would not have been born were it not for Steve Jobs. You and I would not have the freedom of technology that we have today were it not for Steve Jobs. Today and for some days to come, I mourn the passing of maybe the greatest innovator the world has ever known. I am praying for Steve’s family and I am trusting that Steve Jobs will truly rest in peace.
From Eric Troup
This morning was a typical morning in my house, which is to say, my life. I woke up to the sound of my iPhone’s alarm. I turned over, shut off the alarm, and set my phone to play my Wake-up playlist with its iPod feature. I then got myself a drink and read my morning news articles with my iPad. After that was done, I made plans to go to the Portland Apple store, that being the closest to me, to replace my computer’s battery. In fact, as I write this on my MacBook, we are driving to Portland. What do all these things have in common? Apart from my drink and the car (and Portland), almost every noun in this paragraph was made possible by a man who changed the world for many, many people.
That man is Steve Jobs.
And he has ruined my life.
As a blind computer user, I used to be quite content to type away on my Windows desktop, having been granted the ability to read its screen by paying upwards of $800 above and beyond the cost of the computer itself. I was content to only use a cell phone for phone calls, unless I wanted to pay an additional $300 or more for an add-on screen reading software package. I was content to look down my nose with smug superiority at people who spent so much time browsing the web, texting, and doing all manner of things with their phones other than making phone calls. I justified this with statements like, "If their phone breaks, they’re really in trouble, huh?" or, "How lazy are we getting as a society? I like everything to have its purpose."
And then Steve Jobs ruined me.
How, you ask?
Steve Jobs made a screen reading software package part of the operating system that powers the Mac, and eventually took that a step further to power the iPhone as well, not to mention every other Apple product line. No longer must I pay extra for access to my technology. No longer can I sit idly by when companies like Amazon give the blind consumer bare-bones attempts at accessibility in a not-so-subtle attempt at mollification. I am now forced by my conscience to stand up and shout to anyone who’ll listen that it isn’t enough. How do I know it isn’t enough? Apple proved it.
Perhaps lawsuits were instrumental in granting this accessibility, and perhaps they weren’t. We’ve seen what lawsuits bring about in terms of obligatory accessibility. (I’m looking at you, Kindle.) Even if lawsuits played a part, Steve Jobs went hundreds of miles above and beyond the call of duty, and has forever changed not only what is possible for accessibility, but also what is (or should be) expected from companies when providing accessibility.
No longer am I able to settle for whatever drippings the lords of the manor see fit to bestow upon me. I now have to live life knowing there’s a better way. Ignorance was bliss. I, and all of us who use Apple products, have been forever changed.
We lost Steve Jobs a few days ago, and he has left a vacancy which, I fear, will not soon be filled. If ever a man or woman embodied the creative spirit, it was Steve Jobs.
Thank you, Mr. Jobs. You may have ruined my life, but you’ve done so in the best possible way…
…and I wouldn’t change it for the world.
From Gordon Smith
Steve Jobs was one of those rare individuals with a gift for spotting an opportunity and a flair for turning that opportunity into something a bit special. In Steve’s case, it started off as a dream, culminating in something which changed the lives of millions upon millions of people around the globe, in all walks of life, irrespective of computer literacy or IT skill levels.
Steve’s innovative talents gave us the Apple Macintosh range of desktop, notebook and tablet computers; followed shortly thereafter by an amazing range of mobile devices unsurpassed by anything else anywhere.
Steve, in conjunction with the team of software and hardware designers and engineers he expertly assembled, set the standards for 21st-century information technology, encompassing devices usable by all, irrespective of abilities.
Those of us who never had the pleasure of meeting Steve personally nevertheless benefitted from his genius; and he’ll be sadly missed by his family, friends and customers alike.
Thank you Steve, on behalf of all of us whose lives you changed for the better. You will live on in our memories for many years to come, and your legacy will endure in the decades ahead, shaping the minds of millions.
From Cara Quinn
I just want to say thank you. You’ve opened up opportunities for me that mean so very much. In just a short time, you, and those you inspired did what many said was impossible, and in doing so, you changed my life.
How can I say anything else but ‘thank you!’ -And I’m not the only one whose life you’ve affected. There are many.
I hope that wherever you are now, you know how you’ve touched so many lives in such a profound way. You’ve shared the gifts of opportunity, possibility, and inspiration with so many people. So I’ll say once again, simply, thank you so much, Steven! -God speed to you. -Love and support to your loved ones. I wish you well on your journey.