Which Apple WWDC2016 Announcements Could Be Useful to People With Disabilities?
During its annual WWDC conference for software developers, held last week in San Francisco, Apple announced several improvements to its systems, such as opening Siri up to third-party app developers, bringing Siri to the Mac, and making Apple Pay available online.
While none of the announcements could be considered earth-shattering, and some argue they are just part of playing catch-up to major platforms like Amazon, Google, and Facebook, these enhancements could definitely be useful to us all, including people living with disabilities.
Haben Girma, a featured speaker at WWDC2016
Apple has always had a good reputation in the assistive technology realm, so it wasn’t surprising that one of the featured speakers was accessibility and inclusion advocate Haben Girma, the first deafblind person to graduate from Harvard Law School. Girma, who uses some of Apple’s products, spoke about the importance and value of designing products with people with disabilities in mind, thus removing barriers from the start and across platforms.
Girma praised Apple for the Taptic Engine in the Watch and its vibration alert technology, for VoiceOver and Dynamic Type and their features designed for the visually impaired, and even for the fact that all conference video presentations at WWDC were captioned.
WWDC2016 announcement highlights
Though some of the announced improvements don’t strictly fall into the category of assistive tech, they could still be particularly useful to people living with disabilities.
As Venkat Rao of the Assistive Technology Blog noted:
“It seems like a lot of new features will enable people who have visual impairment, impaired motor skills, muscular dystrophies/atrophy, and other similar disabilities to interact with the devices just with their voice and get desired results very quickly and efficiently. Using the new features with an additional eye tracking device for those who need it can definitely enhance their experience as well.”
- Allowing third-party app developers to take advantage of voice controls for Siri
- Enhanced textual prediction
- Enhanced search capabilities (e.g., searching only for images, or only for images of the beach)
- Apple Pay becoming available online
- Improved Apple Pay checkout process
- Faster app upload
- Streamlined Apple Pay usage
- Including wheelchair-specific workouts in fitness features, like tracking physical activity via pushing techniques with different speeds and terrains, and activity notifications
- Ability to reach emergency services and personal emergency contacts at the push of a button
Siri is coming to Mac, and will be able to help with search, calendar settings, questions, and everything else Siri already does for the iPhone.
Playing catch-up to Facebook Messenger, Alexa, etc.
As Mark Sullivan noted in his article on Apple’s latest efforts in what he calls the “AI Personal Assistant Wars,” Google, Facebook, Amazon, and Microsoft are concerning themselves greatly with new technologies spanning such fields as “artificial intelligence, natural language processing, machine learning, and computer vision.”
Apple is on board too, of course. Sullivan lists several recent developments that demonstrate that Apple has been working hard in those areas:
- “[…] iOS has started to consult user information from within app silos to provide more useful recommendations…”
- Siri has gotten better at handling nested conversations (searching for a specific type of file, then narrowing the results to files from a specific person) within the same query
- Apple has been working on computer vision, “the science of teaching computer systems to analyze, understand, and act on the visual aspects of images”
- There has been progress in using facial recognition technology (like Facebook uses for its Messenger) to improve search capabilities
While Sullivan comments that it’s “hugely important to Siri that Apple is now allowing third-party developers to use the personal assistant as an entry point to their apps,” because Siri can now perform tasks like summoning an Uber car or composing a WhatsApp message (similar to Amazon’s Alexa), an OS division remains for Siri. This means that Siri is different on each device: Siri on the iPhone, say, doesn’t know what Siri on the Watch is doing. This isn’t the case with Alexa, for example, which transcends devices.
Farhad Manjoo, who attended the keynote and penned his commentary in The New York Times, also noticed Apple’s change of tactic in its relationship with third-party app developers. He wrote:
“Apple now allows third-party developers to create new experiences in Apple Maps, to interact with Siri, to get access to its Messages service and to get access to the lock screen and phone interface. Similar access has been available to developers on Google’s Android operating system. Now Apple seems to be going out of its way to cater to developers, too.”
This may be a departure from what he calls “a kind of old-fashioned view of tech” at Apple, in which the tech world is seen not as a uniform, Internet-powered operating system containing all the data, but as separate devices holding their own, with limited cloud-based integration and the Internet “merely the glue between them.”
“The next era of the tech industry will be defined by which of these worldviews becomes dominant,” predicts Manjoo.