From Bag to Brain Interface
Apple's four-decade accessibility journey — from a 1984 talking Macintosh to a 2025 brain-computer interface — is the most sustained corporate investment in accessibility in consumer-technology history.
Building Blocks on the Early Mac 1984-2002
Apple's accessibility history begins almost by accident. On January 24, 1984, Steve Jobs pulled the Macintosh from a bag at the Flint Center and let it speak: "Hello. I'm Macintosh. It sure is great to get out of that bag."[66] The underlying MacinTalk text-to-speech engine, developed by Joseph Katz and Mark Barton, made the Macintosh the first mass-market personal computer to ship with built-in speech synthesis — though Apple never officially supported MacinTalk, distributing it only via a Technical Note. Hasmig Seropian of the original Mac team later told MacUser that Jobs had vetoed a higher-quality Apple II speech synthesizer as "too expensive" — the first recorded instance of his cost discipline outweighing a potential accessibility upgrade.
The structural foundation came next. Mac System 4.2 (1987) introduced Easy Access — Sticky Keys, Mouse Keys, Slow Keys — for users with motor impairments.[43] System 6.0 (1988) bundled CloseView, a 2x-16x screen magnifier with color inversion for low vision, alongside HyperCard. PlainTalk speech synthesis and recognition debuted on the AV Quadras in 1993, becoming standard in System 7.1.2.[2] Most consequentially, Mac OS X 10.2 "Jaguar" (August 24, 2002) introduced the Universal Access preference pane, consolidating magnification, speech, sticky keys, slow keys, and mouse keys into one control surface — the scaffolding onto which VoiceOver would later be bolted.[1]
Critically, none of this made the Mac competitive as an accessibility platform. A February 2001 TidBITS survey bluntly concluded that "anyone who requires a screen reader is better off using Windows, and nearly all blind people do." The only full-featured Mac screen reader was Alva Access Group's OutSpoken 9 at roughly $700, and it couldn't read HTML. Windows had JAWS, Window-Eyes, and IBM Home Page Reader. The blind community had, in effect, written off the Mac.
Apple's institutional commitment was nonetheless older than its products suggested. In 1985, Apple founded the Office of Disability — later the Worldwide Disability Solutions Group — under Alan Brightman, PhD. This is generally recognized as the first corporate accessibility team in the tech industry, predating every comparable effort at Microsoft, IBM, or AT&T. In 1987, Apple co-founded the Alliance for Technology Access, a national assistive-technology network.[72] The infrastructure was there; the consumer products had not yet caught up.
VoiceOver on the Mac & the iPod's Delayed Reckoning
The transformation began with Mac OS X 10.4 "Tiger" on April 29, 2005.[3] VoiceOver shipped as a fully integrated keyboard-driven screen reader at no extra cost — the first time any mainstream OS vendor had built one into the platform rather than leaving it to third-party vendors. Product manager Mike Shebanek had pitched leadership that accessibility "should get the same love and care and quality that we give everything else in this company, and it should be on everything we make, always and forever."[36]
Jobs gave final approval. The launch voice (Bruce) was widely mocked — an AppleInsider forum thread called it "an embarrassment" compared to AT&T Natural Voices[4] — but the economic model was revolutionary. A blind user could buy a $499 Mac mini and get a screen reader for less than the cost of a single JAWS license.[37]
The iPod told the opposite story. For the first seven years of its existence (2001-2008), the iPod had no screen reader; the click wheel navigated visual menus with no spoken feedback. iTunes itself was inaccessible to Windows screen readers. The standoff came in 2008, when the National Federation of the Blind sent Apple a demand letter and Massachusetts Attorney General Martha Coakley opened an investigation[6] — prompted partly by universities moving course materials onto iTunes U.
On September 26, 2008, Apple signed a cooperative agreement (often mischaracterized as a lawsuit; NFB's Mark Riccobono has publicly corrected Reuters on this point[49]), committing to make iTunes U fully accessible by year-end, to make iTunes and the iTunes Store fully accessible by June 30, 2009, and to contribute $250,000 to the Massachusetts Commission for the Blind.[48] The same month, the iPod nano 4th generation became the first iPod with spoken menus.[67] The iPod shuffle 3rd generation (March 11, 2009), which had no screen at all, shipped with VoiceOver in 20 languages and was marketed as "the first music player that talks to you."[44] The shuffle was also a powerful example of the curb-cut effect: because the device had no screen whatsoever, every user relied on VoiceOver's spoken track and artist announcements, normalizing the talking interface for the general population and demonstrating that accessibility features could enhance everyone's experience.[79]
The 36 Seconds That Changed Mobile
The iPhone's unveiling in January 2007 "caused the blind community more or less to go into panic mode,"[69] per AFB's AccessWorld — a flat sheet of glass seemed fundamentally hostile to non-visual use. The answer came at WWDC on June 8, 2009, in a keynote segment later chronicled in Shelly Brisbin's "36 Seconds That Changed Everything" and MacStories' iOS accessibility timeline.[7]
With Jobs on medical leave recovering from a liver transplant, Phil Schiller spent exactly 36 seconds introducing VoiceOver, Zoom, White-on-Black, and Mono Audio for the iPhone 3GS, which shipped June 19, 2009 running iPhone OS 3.0.
The engineering breakthrough was a new gesture grammar. A single finger dragging across the screen announced whatever it touched and placed a VoiceOver cursor; a double-tap anywhere activated the selected item; three-finger swipes scrolled; and a two-finger rotation gesture — the "rotor" — switched navigation granularity between headings, links, and characters. This was the first mainstream touchscreen device usable out of the box by blind people, with no "blindness tax" — the same price sighted users paid.
"Apple has developed an ingenious 'gesture' technology... the accessibility comes at the same price our sighted friends and colleagues pay."
"The blind community around the world was very sceptical about whether a flat sheet of glass would ever give them the reasonable access they enjoyed" — skepticism that flipped to enthusiasm within weeks.
Two years later, in September 2011, Stevie Wonder publicly thanked Jobs from the Echoplex stage in Los Angeles, weeks before Jobs's death:
"There's nothing on the iPhone or the iPad that you can do that I can't do."
In a subsequent interview, Wonder articulated the Jobs-era philosophy more cleanly than Jobs ever did himself: "His company was the first to come up with technology that made it accessible without screaming out loud, 'This is for the blind, this is for the deaf.' He made it part of the actual unit itself."[11]
Jobs's Implicit Philosophy
Jobs left remarkably few direct quotes about accessibility. His canonical design maxim — "Design is not just what it looks like and feels like. Design is how it works" (New York Times Magazine, 2003)[12] — has been retroactively claimed by accessibility advocates, though Jobs never framed it in those terms.
The closest he came on stage was the WWDC 2006 Leopard preview, where he spent a brief segment on Universal Access, noting that "Mac OS X is so great that we want everyone to be able to use it," before pivoting to mock Windows Vista's text-to-speech.[13] Accessibility segments in Jobs-era keynotes were almost always delivered by deputies.
The consequential Jobs-era decisions were structural rather than rhetorical. Bundling VoiceOver free with every Mac and every iOS device, rather than selling it as an add-on, meant that the UIAccessibility (iOS) and NSAccessibility (Mac) APIs became core to the platform — so any third-party developer using standard controls inherited accessibility by default. That architectural choice, not any speech, is Jobs's accessibility legacy.
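That inheritance mechanism can be sketched in a few lines of UIKit. A standard control is announced by VoiceOver with no developer effort, while a custom view opts in through the same UIAccessibility properties — here `RatingView` is a hypothetical custom control used purely for illustration:

```swift
import UIKit

// A standard control: VoiceOver already knows how to announce and activate it.
let playButton = UIButton(type: .system)
playButton.setTitle("Play", for: .normal)
// VoiceOver speaks "Play, button" automatically — the developer does nothing extra.

// A custom view must opt in through the same UIAccessibility properties.
final class RatingView: UIView {  // hypothetical custom star-rating control
    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Rating"          // what VoiceOver announces
        accessibilityValue = "3 of 5 stars"    // the control's current state
        accessibilityTraits = .adjustable      // enables swipe up/down to change
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) not implemented") }
}
```

The asymmetry is the architectural point: apps built from standard controls were accessible by default, so the platform's accessibility grew with its app ecosystem rather than lagging behind it.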
The Five Pillars of Apple Accessibility
Apple organizes its accessibility features around five categories that together cover the full spectrum of human ability. Each has grown from a single feature to an interconnected ecosystem.
Vision
From a $700 third-party screen reader to a platform where blindness imposes zero extra cost.
- 1988: CloseView — 2x-16x magnifier with color inversion, bundled in System 6.0
- 2002: Universal Access pane — consolidated magnification, speech, keyboard/mouse access in Jaguar[1]
- 2005: VoiceOver — first built-in screen reader in any mainstream OS (Tiger)[36]
- 2009: VoiceOver on iPhone 3GS — touch-gesture screen reader with rotor navigation[7]
- 2013: Dynamic Type — systemwide text scaling (iOS 7)
- 2016: Magnifier — camera as digital loupe (iOS 10)[19]
- 2020: VoiceOver Recognition — on-device ML auto-labels unlabeled buttons (iOS 14)
- 2022: Door Detection — LiDAR-based, reads signage and locates doors (iOS 16)[25]
- 2025: Braille Access — full braille notetaker with Nemeth math, BRF files, Live Captions on braille displays[21]
- 2025: Magnifier for Mac via Continuity Camera; Accessibility Reader for dyslexia and low vision[51]
Hearing
Turning every Apple device into an auditory prosthesis — from hearing aids to haptic music.
- 2009: Mono Audio — collapses stereo to one channel for single-sided hearing loss (iPhone 3GS)
- 2014: Live Listen — iPhone as remote microphone (extended to AirPods in iOS 12, 2018)
- 2020: Sound Recognition — on-device ML detects fire alarms, doorbells, crying babies (iOS 14)[16]
- 2020: Headphone Accommodations — custom audio profiles with amplification and frequency adjustments tuned to an individual's hearing (iOS 14)
- 2021: SignTime — browser-based ASL, BSL, LSF interpreter service for AppleCare[78]
- 2022: Live Captions — systemwide real-time speech-to-text (iOS 16)[25]
- 2024: Music Haptics — Taptic Engine renders audio for deaf users (iOS 18)[20]
- 2024: AirPods Pro 2 Hearing Aid — FDA-cleared, clinical-grade hearing-aid functionality in consumer earbuds, blurring the line between consumer tech and medical device[75]
Physical & Motor
From keyboard shortcuts to brain-computer interfaces — every input modality Apple has opened.
- 1987: Easy Access — Sticky Keys, Mouse Keys, Slow Keys (System 4.2)[43]
- 2011: AssistiveTouch — on-screen gesture menu for motor-impaired users (iOS 5)
- 2013: Switch Control — full device control via external switches for severe motor limitations (iOS 7)
- 2019: Voice Control — full offline device control via on-device speech recognition (macOS Catalina, iOS 13)
- 2020: Back Tap — double- or triple-tap the back of the iPhone to trigger actions (iOS 14)
- 2021: Apple Watch AssistiveTouch — pinch and clench detection using optical heart sensor + gyroscope + ML[17]
- 2022: Apple Watch Mirroring + Buddy Controller (iOS 16)[25]
- 2023: Double Tap — productized AssistiveTouch research into a mainstream gesture (Apple Watch Series 9)
- 2024: Eye Tracking — front camera + on-device ML, no external hardware (iOS 18)[20]
- 2024: Vision Pro — Pointer Control (head/wrist/finger), Dwell Control, full accessibility suite at launch
- 2025: BCI HID Protocol — neural interfaces become native input; Synchron Stentrode enables thought control of an iPad[21]
Cognitive
Reducing complexity without reducing dignity — interfaces that adapt to the user, not the other way around.
- 2011: Siri — voice-based device control benefits motor, vision, and cognitive accessibility; later extended with Type to Siri for speech-impaired users
- 2012: Guided Access — locks iOS to a single app for users with autism or for test-taking; demonstrated on stage by Scott Forstall (iOS 6)
- 2013: Reduce Motion — minimizes parallax and animation effects for vestibular/cognitive sensitivity (iOS 7)
- 2014: Speak Screen — swipe down from the top with two fingers to have the entire screen read aloud; aids reading disabilities (iOS 8)
- 2021: Background Sounds — built-in ambient noise (rain, ocean, dark noise) for focus and anxiety reduction (iOS 15)
- 2023: Assistive Access — distills Phone, FaceTime, Messages, Camera, Photos, and Music into a high-contrast, cognitively simplified interface, co-designed with cognitive disability users (iOS 17)[24]
- 2025: Accessibility Reader — systemwide reading mode for dyslexia and low vision[21]
- 2025: Accessibility Nutrition Labels — App Store product-page section disclosing which accessibility features each app supports, analogous to the 2020 privacy labels[21]
Speech
Full circle: from the Mac's first words in 1984 to synthesizing a dying person's voice in 2023.
- 1984: MacinTalk — first built-in speech synthesis in shipping system software (Katz & Barton)[66]
- 1993: PlainTalk — speech synthesis and recognition on AV Quadras[2]
- 2008: iPod nano spoken menus — first iPod with audio navigation[67]
- 2009: iPod shuffle VoiceOver — 20 languages, "the first music player that talks to you"[44]
- 2023: Personal Voice — ALS patients record ~15 min of prompts; on-device ML generates a synthetic replica of their voice, trained overnight[22]
- 2023: Live Speech — type-to-speak for calls and in-person conversations (iOS 17)[24]
- 2024: Vocal Shortcuts + Listen for Atypical Speech — built with the Speech Accessibility Project, University of Illinois (iOS 18)[20]
Cook's Era of Expansion 2011-2025
Tim Cook became CEO on August 24, 2011, and the pace of accessibility additions quickly accelerated from the Jobs-era trickle to an annual flood. The features in the pillar cards above tell the product story. But the quotes tell the cultural one.
At the February 28, 2014 shareholder meeting, after the National Center for Public Policy Research demanded Apple commit only to initiatives with a clear return on investment, Cook delivered the quote that now defines his public stance:
"When we work on making our devices accessible by the blind, I don't consider the bloody ROI... If you want me to do things only for ROI reasons, you should get out of this stock."
In a 2017 Global Accessibility Awareness Day conversation with Rikki Poynter, Cook was explicit:
"We've always viewed accessibility as a human right. And so just like human rights are for everyone, we want our products to be accessible to everyone. It's a basic core value of Apple."
The AI & Spatial Computing Wave 2020-2025
The most recent cycle exploits on-device machine learning in ways that would have been impossible a decade earlier. Personal Voice (iOS 17, September 2023) lets users at risk of losing their voice record roughly 15 minutes of randomized prompts from which on-device ML generates a synthetic voice that sounds like them, trained overnight while the device charges.[22] Assistive Access (iOS 17) distills core apps into a cognitively simplified interface co-designed with users with cognitive disabilities.[24]
Apple Vision Pro, released February 2, 2024 at $3,499, shipped with VoiceOver, Zoom, Pointer Control (head/wrist/index-finger alternatives to eye tracking), Dwell Control, Voice Control, Switch Control, and Guided Access on day one — the first spatial computing platform to ship with a complete accessibility suite at launch. Reimagining accessibility for 3D required novel approaches: VoiceOver uses spatial audio to describe where objects are located in a room, and Dwell Control lets users interact solely through sustained eye gaze. For developers, the ManipulationComponent API lets them configure how 3D entities respond to gestures, ensuring spatial content remains interactive for users with limited mobility.[82]
The May 13, 2025 GAAD announcement[21] — shipping in fall 2025 with iOS 26 — raised the stakes further:
- Accessibility Nutrition Labels — App Store disclosure of accessibility support, analogous to privacy labels
- Braille Access — full braille notetaker with Nemeth math, BRF file support, Live Captions on braille displays (transformative for DeafBlind users)
- Accessibility Reader — systemwide reading mode for dyslexia and low vision
- Magnifier for Mac — leveraging Continuity Camera[51]
- BCI HID Protocol — neural interfaces become a native input category; Synchron's Stentrode enables thought-control of iPad[21]
The on-device ML theme extends to personalization: Hover Typing displays enlarged text in a user-chosen font as each character is typed, and Vocal Shortcuts lets users trigger actions with custom sounds rather than standard voice commands — tools that adapt the device to the user's specific neural and physical profile rather than demanding the user adapt to the device.[80]
Vehicle Motion Cues (iOS 18) reduces motion sickness by displaying animated dots aligned with vehicle movement — a feature that, while primarily comfort-focused, underscores Apple's expanding definition of accessibility beyond the traditional five pillars.[75]
Institutionalizing Accessibility
Beneath the feature cadence sits a more profound organizational shift. The 1985 Office of Disability grew into a multi-division structure: Accessibility Engineering, Accessibility Design & Quality, and Global Accessibility Policy & Initiatives.
Dean Hudson, who joined in 2006 and is blind himself, describes a team that started with "three of us" and has "expanded greatly."[34] Sarah Herrlinger joined Apple in September 2003 (Special Markets / Education), moved into product management for accessibility in January 2012, and is now Senior Director of Global Accessibility Policy and Initiatives.[35] She was elected to the American Foundation for the Blind Board of Trustees in February 2018.[33]
"Accessibility for us is one of our core corporate values... we're built around six values and accessibility has been one of them right from the start."
The apple.com/accessibility website — redesigned and unveiled by Cook at the October 27, 2016 "Hello Again" MacBook Pro event alongside the Sady Paulson film[38] — turned accessibility into a public-facing brand pillar. Since 2016, Apple has issued a press release every May timed to Global Accessibility Awareness Day[45], previewing features that ship in the fall. The ritual has become so reliable that it now functions as an unofficial second keynote for the disability community.
The cross-functional depth is visible in specific products. The development of Face ID required hardware, software, and machine learning teams to collaborate to ensure the technology worked for users who are blind or have low vision, as well as those with diverse skin tones and facial structures. An internal Accessibility Center of Excellence within Apple's Information Systems and Technology (IS&T) group works to ensure both Apple and third-party software meet accessibility standards.[83]
Apple's approach to feature development increasingly follows the disability community's principle of "nothing about us without us." Assistive Access was co-designed with cognitive disability users. Personal Voice and Live Speech were developed in partnership with ALS advocacy organizations including Team Gleason. Listen for Atypical Speech was built with the Speech Accessibility Project at the University of Illinois. This consultative model ensures features are tested by the people who actually use them.[82]
Apple also publishes detailed Voluntary Product Accessibility Templates (VPATs)[40] documenting conformance with U.S. Revised Section 508, W3C WCAG, and the European EN 301 549 standard.
External Validation
- American Foundation for the Blind: Access Award (2009); Helen Keller Achievement Award (2015)[26]. "Apple's products are intuitive and accessible right out of the box. Apple is truly in a league of its own." — AFB CEO Carl Augusto
- National Federation of the Blind: Special Award (2009); Dr. Jacob Bolotin Award, $10,000 (July 8, 2010)[55]. "Apple has done more for accessibility than any other company to date." — NFB President Mark Riccobono, 2014[27]
- American Council of the Blind: Robert S. Bray Award, given jointly to Apple and Sarah Herrlinger (July 4, 2016)[28]
- Louis Braille Award (2017) for built-in braille support across Mac and iOS
- Helen Keller Achievement Award (2019) for improvements to the quality of life of people with vision loss
- AccessAbility Award (2024) recognizing 40 years of accessibility leadership and innovation
- The Royal National Institute of Blind People officially recommends Apple products as "excellent options for blind and partially sighted people."[41]
Beyond corporate awards, Apple has cultivated an ecosystem where third-party developers extend accessibility's reach. Be My Eyes, an app connecting blind users with sighted volunteers via live video, won an App Store Award for Cultural Impact — demonstrating that Apple's platform and APIs enable accessibility innovation far beyond what Apple builds alone.[79]
Where Criticism Sharpens
Disability-tech journalist Steven Aquino has repeatedly called for Apple to name a Chief Accessibility Officer — a role Microsoft filled with Jenny Lay-Flurrie — arguing that even Cook's Apple lacks the C-suite formalization the work deserves. The annual AppleVis Report Card documents real complaints about VoiceOver bugs, braille focus handling, and localization gaps. Apple is not listed on the AAPD Disability Equality Index for employment, inviting scrutiny about whether the advocacy extends fully into hiring.
Design Tensions & Regressions
Apple's accessibility record is not without failures. The tension between aesthetic ambition and functional inclusion has surfaced repeatedly, most dramatically in design overhauls that broke what previously worked.
The iOS 7 reset
The total visual redesign of iOS 7 (2013) was, for low-vision users, a regression. Jony Ive's new direction introduced thin fonts, translucent layers, and reduced contrast that made the interface significantly harder to use for people with low vision. Apple responded with corrective settings — Bold Text, Button Shapes, Increase Contrast, and Reduce Transparency — but the episode highlighted a structural risk: when design leadership prioritizes visual novelty, accessibility can become collateral damage unless it has a seat at the table early enough in the process.[80]
Gesture-heavy interfaces
Modern iOS increasingly relies on hidden gestures and unlabeled icons, violating the usability heuristic of recognition over recall. Swipe-to-dismiss, drag handles, and icon-only toolbars create an unnecessary learning curve for users with cognitive disabilities or motor tremors. AssistiveTouch and Switch Control serve as correctives, but the need for them underscores the tension between minimalist aesthetics and discoverable interaction.[84]
Siri's inconsistency
Siri, which functions as a primary accessibility interface for many motor- and vision-impaired users, has drawn persistent criticism for quality regressions — particularly for users with atypical speech patterns, motor tremors, or non-native accents. The Listen for Atypical Speech feature (iOS 18) directly addresses one dimension of this problem, but the broader complaint reflects the stakes when a mainstream feature doubles as an accessibility lifeline.
Two Philosophies, One Architecture
Jobs: Structural, Cautious, Mute
Vetoed a higher-quality Apple II speech synthesizer as too expensive. Let the Mac fall two decades behind Windows in screen-reader support. Left the original iPhone unusable by blind people for its first two years. Never gave accessibility a proper keynote segment.
Yet the architectural decisions he approved — bundling VoiceOver free in Tiger, extending it to the touchscreen on the 3GS, baking accessibility into platform APIs — proved more consequential than any speech he didn't give.
"He made it part of the actual unit itself." — Stevie Wonder[11]
Cook: Vocal, Political, Expansive
Frames accessibility as human rights, invokes it alongside environmental and LGBTQ advocacy, and has made annual GAAD announcements a company ritual.[74]
Annual feature releases have grown from four at the iPhone 3GS launch to dozens per cycle, covering cognitive, motor, speech, hearing, vision, and now neural-interface accessibility.
"If you want me to do things only for ROI reasons, you should get out of this stock." — Tim Cook[29]
The two approaches are complementary rather than opposed: structural decisions made under Jobs enabled features announced under Cook, and Cook's public advocacy has in turn created internal air cover for the engineering investments that make features like Braille Access and thought-controlled iPads possible.
The most striking fact about Apple's accessibility record is not any single feature or quote, but the 41-year continuity of an institutional commitment that began with Alan Brightman's Office of Disability in 1985, survived a founder's death, and has accelerated rather than decayed — a rare example of corporate values outlasting the corporate personalities associated with them.
That continuity faces its next test now. In April 2026, Apple announced that Tim Cook will transition to Executive Chairman, with hardware chief John Ternus becoming CEO.[81] The succession is the second leadership handoff in the company's accessibility history. If the pattern holds — if the institutional commitment again proves stronger than any individual leader — then the architecture Jobs built and Cook amplified will continue to evolve under its third steward. The disability community will be watching.
Citations & Sources
- [1] Apple Newsroom, 2002
- [2] Wikipedia: PlainTalk
- [3] betawiki: Mac OS X Tiger
- [4] AppleInsider Forums, 2005
- [5] NFB: Tactile Access to iPhone
- [6] NFB & Massachusetts Agreement
- [7] MacStories: 36 Seconds
- [8] AFB AccessWorld, VoiceOver iPhone
- [9] NFB Braille Monitor, 2009
- [10] idownloadblog: Stevie Wonder iOS
- [11] blindgadget: Stevie Wonder on Jobs
- [12] Steve Jobs design quote, NYT 2003
- [13] Engadget: WWDC 2006 keynote
- [14] MacRumors: First MFi hearing aid
- [15] CNN: Apple ReSound hearing aids
- [16] MacRumors: Sound Recognition iOS 14
- [17] MacRumors: Apple Watch AssistiveTouch
- [18] 9to5Mac: 2021 accessibility features
- [19] 9to5Mac: iOS 10 Magnifier
- [20] Apple Newsroom: 2024 accessibility
- [21] Apple Newsroom: 2025 accessibility
- [22] Apple ML Research: Personal Voice
- [23] TechCrunch: 2023 accessibility
- [24] 9to5Mac: iOS 17 accessibility
- [25] AppleInsider: Door Detection, 2022
- [26] AFB: Helen Keller Award to Apple
- [27] AppleInsider: NFB on Apple, 2014
- [28] ACB: Robert S. Bray Award
- [29] Tim Cook on ROI, 2014
- [30] Tim Cook: accessibility as human right
- [31] MacRumors: Cook accessibility meetings
- [32] iMore: Accessibility as human right
- [33] AFB: Sarah Herrlinger welcome
- [34] AppleVis: Herrlinger & Hudson interview
- [35] DePauw: Herrlinger commencement
- [36] AppleVis: VoiceOver turns 10
- [37] doubletaponair: 20 years of VoiceOver
- [38] Gantnews: Sady Paulson, 2016
- [39] Apple Accessibility
- [40] Apple VPATs
- [41] RNIB: Apple guide for sight loss
- [42] AbilityNet: VoiceOver iPhone turns 10
- [43] Wikipedia: Mouse keys
- [44] Wikipedia: iPod Shuffle
- [45] Wikipedia: GAAD
- [46] 9to5Mac: Apple accessibility team interview
- [47] AppleInsider: iOS apps with VoiceOver
- [48] CBS News: iTunes access for blind
- [49] LoopInsight: Reuters correction
- [50] NFB: Apple and NFB resolution
- [51] Engadget: Magnifier for Mac, 2025
- [52] Thurrott: Magnifier for Mac
- [53] idownloadblog: 2025 accessibility preview
- [54] PRN: NFB commends Apple iPad VoiceOver
- [55] PRN: NFB Bolotin Award
- [56] Tim Cook on equality, 2013
- [57] CurbCuts blog, 2026
- [58] Taylor Arndt: reaction to accessibility
- [59] ZeroCon24: Herrlinger profile
- [60] MJTsai: 2023 accessibility preview
- [61] WAP Journal: Lion speech
- [62] webaxe: NFB coverage
- [63] Cult of Mac: accessibility
- [64] BlindAbilities podcast
- [65] twit.tv: iOS Today 615
- [66] Medium: Thus Spake Macintosh
- [67] Alastair: iPods get speech
- [68] AFB AccessWorld 16/6
- [69] AFB AccessWorld 10/1
- [70] everymac: iPod spoken menus
- [71] BAE Systems: macOS Mojave magnifier
- [72] AbilityMagazine: 20 years ATA
- [73] iOS Accessibility book
- [74] EJBM: Corporate culture under Cook
- [75] Apple Support: iOS 18 features
- [76] Tim Cook quote
- [77] Williamsville: Mac Universal Tools
- [78] Apple Newsroom: 2021 accessibility
- [79] CurbCuts: Apple at 50
- [80] DigitalA11Y: History of Digital Accessibility
- [81] Apple Newsroom: Cook to Executive Chairman
- [82] YouTube: Design for everyone at Apple
- [83] Mashable: Tim Cook CEO achievements
- [84] Medium: Leadership 2.0, Jobs & Cook