Kibbles and Bytes

Apple Announces New Accessibility Features

Written by Don Mayer | May 20, 2021 4:01:07 PM

Apple quietly announced a major new accessibility initiative this week. It includes software updates that add new features for people with disabilities and a new service that connects customers with sign language interpreters.

The new features, including the sign language interpreter service, will be released later this year. They are aimed at improving accessibility for people with mobility, vision, hearing, or cognitive disabilities.

Apple said in announcing these new features: "These next-generation technologies showcase Apple's belief that accessibility is a human right and advance the company's long history of delivering industry-leading features that make Apple products customizable for all users."

AssistiveTouch for Apple Watch

AssistiveTouch for watchOS will allow users to control their Apple Watch without ever having to touch the display. This is a very cool feature that many will use, and it should be especially useful to users with limited mobility. You can double-clench your hand to initiate actions or pinch your fingers in the air, and the Apple Watch will recognize those motions and act accordingly. Check out the video demo here. Here are the details:

To support users with limited mobility, Apple is introducing a revolutionary new accessibility feature for Apple Watch. AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls.

Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench. AssistiveTouch on Apple Watch enables customers who have limb differences to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Center, Control Center, and more.

Apple already offers AssistiveTouch on other iOS devices such as the iPad, iPhone, and iPod (I still have a few iPods, including one that delivers my tunes on my motorcycle). You can use AssistiveTouch to adjust volume, lock your screen, use multi-finger gestures, restart your device, or replace pressing buttons with just a tap. You can see how to activate and use AssistiveTouch for iOS at Apple's AssistiveTouch page.

Eye-Tracking Support for iPad

Later this year, iPadOS will support third-party eye-tracking devices, making it possible for people to control iPad using just their eyes. Compatible MFi (Made for iPhone) devices will track where a person is looking onscreen, the pointer will move to follow the person's gaze, and extended eye contact will perform an action, like a tap. I remember the headband mouse that tried to do the same thing way back when, and a room full of folks checking it out while bobbing their heads around.
Explore Images with VoiceOver

VoiceOver is an amazing software feature that Apple has offered for some time, and it has been a huge benefit to the blind and low-vision communities. Building on recent updates that brought Image Descriptions to VoiceOver, users can now explore even more details about the people, text, table data, and other objects within images. Users can navigate a photo of a receipt like a table: by row and column, complete with table headers. VoiceOver can also describe a person's position along with other objects within images — so people can relive memories in detail, and with Markup, users can add their own image descriptions to personalize family photos.
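For the developers in the audience, here is a minimal sketch of my own (not part of Apple's announcement) showing the long-standing way an app hands VoiceOver a spoken description for an image; the asset name and label text below are just placeholders.

    import UIKit

    // Hypothetical example: an image view with a description VoiceOver can read aloud.
    // "family-photo" and the label text are placeholders, not real assets.
    let photoView = UIImageView(image: UIImage(named: "family-photo"))
    photoView.isAccessibilityElement = true   // image views are not accessibility elements by default
    photoView.accessibilityLabel = "Grandpa and the kids on the porch, summer 2019"

The new Explore Images capability goes beyond app-supplied labels like this by letting VoiceOver describe what is in the photo itself, and Markup lets users add their own descriptions without any code at all.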

Made for iPhone Hearing Aids and Audiogram Support

I have been a big fan of Apple's commitment to the hearing impaired. Since I suffer from some hearing loss (probably as a result of a life of rock and roll music and motorcycles), I have watched Apple help the hearing device industry make hearing aids more and more useful. In a significant update to the MFi hearing devices program, Apple is adding support for new bi-directional hearing aids. The microphones in these new hearing aids enable those who are deaf or hard of hearing to have hands-free phone and FaceTime conversations. The next-generation models from MFi partners will be available later this year.

Apple is also bringing support for recognizing audiograms — charts that show the results of a hearing test — to Headphone Accommodations. Users can quickly customize their audio with their latest hearing test results imported from a paper or PDF audiogram. Headphone Accommodations amplify soft sounds and adjust certain frequencies to suit a user's hearing.

Background Sounds

Since everyday sounds can be distracting, discomforting, or overwhelming, and in support of neurodiversity, Apple is introducing new background sounds to help minimize distractions and help users focus, stay calm, or rest. Balanced, bright, or dark noise, as well as ocean, rain, or stream sounds continuously play in the background to mask unwanted environmental or external noise, and the sounds mix into or duck under other audio and system sounds.

SignTime

Apple has launched a new service called SignTime. This enables customers to communicate with AppleCare and Retail Customer Care by using American Sign Language (ASL) in the US, British Sign Language (BSL) in the UK, or French Sign Language (LSF) in France, right in their web browsers. Customers visiting Apple Store locations can also use SignTime to remotely access a sign language interpreter without booking ahead of time. SignTime will initially launch in the US, UK, and France, with plans to expand to additional countries in the future.

More Accessibility Features

  • Sound Actions for Switch Control replaces physical buttons and switches with mouth sounds — such as a click, pop, or “ee” sound — for users who are non-speaking and have limited mobility.
  • Display and Text Size settings will be customizable on an app-by-app basis, so users with colorblindness or other vision challenges can make the screen easier to see in each supported app.
  • New Memoji customizations better represent users with oxygen tubes, cochlear implants, and a soft helmet for headwear.

Apple's commitment to accessibility is industry-leading and remarkable. In addition to using MFi hearing aids, I have activated other accessibility features that might be useful, and I look forward to checking these out as they are released.