Apple today unveiled new software features that make it easier for people with disabilities to explore, interact with, and get the most out of their Apple devices.
These enhancements combine Apple’s latest technologies to give users unique and customizable tools, and they continue the company’s long-standing commitment to building products that work for everyone.
People who are blind or have low vision can use Door Detection on iPhone and iPad to navigate the final few feet to their destination; users with physical and motor disabilities who rely on assistive features like Voice Control and Switch Control can fully control Apple Watch from their iPhone with Apple Watch Mirroring; and the Deaf and hard of hearing community can follow Live Captions on iPhone, iPad, and Mac thanks to advancements in hardware, software, and machine learning. Apple is also adding more than 20 languages and locales to VoiceOver, its industry-leading screen reader. These features will arrive later this year in software updates across Apple platforms.
“Apple embeds accessibility into every area of our work, and we are committed to designing the finest products and services for everyone,” said Sarah Herrlinger, Apple’s senior director of Accessibility Policy and Initiatives. “We’re excited to introduce these new features, which combine the best of Apple’s ingenuity and creativity to give users more options to use our devices in the ways that best suit their needs and lives.”
Door Detection for Users with Low Vision or Blindness
Apple is introducing Door Detection, a cutting-edge navigation feature for users who are blind or have low vision. When they arrive at a new destination, Door Detection can help them locate a door, judge how far away it is, and describe its attributes, such as whether it is open or closed and, if closed, whether it opens by pushing, turning a knob, or pulling a handle. Door Detection can also read signs and symbols around the door, such as an office’s room number or the presence of an accessible entrance symbol. The feature will be available on iPhone and iPad models with the LiDAR Scanner and combines the power of LiDAR, the camera, and on-device machine learning.
Door Detection will live in a new Detection Mode within Magnifier, Apple’s built-in app for users who are blind or have low vision. In Detection Mode, users with vision disabilities can run Door Detection, People Detection, and Image Descriptions individually or together, giving them a one-stop shop of customizable tools to help them navigate and get rich descriptions of their surroundings. In addition to the navigation tools in Magnifier, Apple Maps will offer sound and haptic feedback to help VoiceOver users identify the starting point for walking directions.
Increasing Physical and Motor Accessibility on Apple Watch
Apple Watch is becoming more accessible than ever for people with physical and motor disabilities with Apple Watch Mirroring, which lets users control Apple Watch remotely from their paired iPhone.
With Apple Watch Mirroring, users can control Apple Watch using iPhone’s assistive features such as Voice Control and Switch Control, and use inputs including voice commands, sound actions, head tracking, or external Made for iPhone switches as alternatives to tapping the Apple Watch display. Apple Watch Mirroring combines hardware and software, including advances built on AirPlay, to ensure users who rely on these mobility features can benefit from Apple Watch apps like Blood Oxygen, Heart Rate, Mindfulness, and more.

In addition, users can control Apple Watch with simple hand gestures. A double-pinch gesture can now answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout, among other actions. This builds on Apple Watch’s groundbreaking AssistiveTouch technology, which lets users with upper body limb differences control Apple Watch with gestures such as a pinch or a clench rather than tapping the display.

Live Captions Come to iPhone, iPad, and Mac for Deaf and Hard of Hearing Users
Apple is introducing Live Captions on iPhone, iPad, and Mac for the Deaf and hard of hearing communities. Users can follow along more easily with any audio content, whether they are on a phone or FaceTime call, using a video conferencing or social media app, streaming media, or having a conversation with someone nearby. Users can also adjust the text size for easier reading. Live Captions in FaceTime attribute the auto-transcribed dialogue to call participants, making group video calls even more convenient for users with hearing disabilities. When using Live Captions for calls on Mac, users can type a response and have it spoken aloud in real time to the other people in the conversation. And because Live Captions are generated on device, user information stays private and secure.
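Apple has not published how Live Captions are built, but the on-device, privacy-preserving approach it describes is something developers can already experiment with through the public Speech framework. The sketch below is a hypothetical illustration, not Apple’s Live Captions code: it streams microphone audio into SFSpeechRecognizer with requiresOnDeviceRecognition enabled so transcription never leaves the device. The OnDeviceCaptioner class name and the English (US) locale are assumptions, and the app is assumed to have already obtained microphone and speech recognition permission.

```swift
import AVFoundation
import Speech

/// Illustrative sketch only: stream microphone audio into an on-device speech
/// recognizer and print partial transcriptions as they arrive. This is not
/// Apple's Live Captions implementation, just a demo of on-device transcription.
final class OnDeviceCaptioner {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func start() throws {
        guard let recognizer = recognizer, recognizer.supportsOnDeviceRecognition else {
            return // on-device recognition isn't offered for this locale/device
        }

        // Keep audio and transcripts on device; nothing is sent to a server.
        request.requiresOnDeviceRecognition = true
        request.shouldReportPartialResults = true

        // Feed microphone buffers into the recognition request.
        let inputNode = audioEngine.inputNode
        let recordingFormat = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { [request = self.request] buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Print each partial transcription; a real app would render it in a caption view.
        task = recognizer.recognitionTask(with: request) { [weak self] result, error in
            if let result = result {
                print(result.bestTranscription.formattedString)
            }
            if error != nil || (result?.isFinal ?? false) {
                self?.audioEngine.stop()
                inputNode.removeTap(onBus: 0)
            }
        }
    }
}
```

In a shipping app, the printed partial transcriptions would be rendered in a caption overlay, and availability should be checked per locale, since on-device recognition is only offered for some languages and devices.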
New Languages and More for VoiceOver
VoiceOver, Apple’s industry-leading screen reader for users who are blind or have low vision, is adding support for more than 20 new languages and locales, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese.
Users will also be able to choose from dozens of new voices across languages that are optimized for assistive features. The new languages, locales, and voices will also be available in the Speak Selection and Speak Screen accessibility features. In addition, VoiceOver users on Mac can use the new Text Checker tool to catch common formatting issues such as duplicated spaces or misplaced capital letters, making proofreading documents and emails easier than ever.
Additional Features
- With Buddy Controller, users can ask a care provider or friend to help them play a game; Buddy Controller combines any two game controllers into one, so multiple controllers can drive the input for a single player.
- With Siri Pause Time, users with speech disabilities can adjust how long Siri waits before responding to a request.
- Voice Control Spelling Mode gives users the option to dictate custom spellings using letter-by-letter input.
- Sound Recognition can be customized to recognize sounds that are specific to a person’s environment, like their home’s unique alarm, doorbell, or appliances.
- The Apple Books app will offer new themes, and introduce customization options such as bolding text and adjusting line, character, and word spacing for an even more accessible reading experience.