The Latest Strides in Assistive Tech
As we’ve previously discussed, the development and implementation of assistive technology and accessibility requirements still have a long way to go, so any strides made by tech companies, app developers, or other businesses are met with enthusiasm and hope.
Assistive tech is for everyone
Assistive technology can help, and can be used by, everyone, not just people living with disabilities.
Take ChromeVox, for instance. As Kim Krause Berg notes in her article on test-driving Google’s accessibility apps for Android:
I love text to speech software, not because I’m blind but because I multi-task. Assistive technology can be assistive in more ways than just being for screen reader usage and that’s one of the areas where the accessibility industry strives to educate companies. […]
Applications have a long way to go too. Developers ignore accessibility requirements, likely due to the lack of understanding what accessibility is, who depends on the enhancements and not having the staff to add the necessary code, let alone perform testing. Not only are special needs users customers and clients, they are developers themselves.
Assistive tech regulations and impediments
The U.S. government has regulatory requirements for its federal programs and systems to be accessible to everyone, including people with disabilities. This includes the so-called Cloud First policy, which has been in place since 2010.
The National Institute of Standards and Technology (NIST) has also come up with the following five recommendations for federal IT staff to ensure that everyone, including disabled employees, has equal access (as reported by Aaron Boyd for Federal Times):
- Version control. Automated updates in cloud computing can disrupt accessibility apps.
- Reliance on a browser. Boyd wrote: “Because cloud apps have to be accessed through a connection — rather than the local hard drive — users often have to go through a browser, which adds another layer to the ‘accessibility value chain’.”
- Learning a new platform. Moving to a unified platform can cause problems for people with disabilities, “who now have to become experts in that platform and the associated accessibility options.” This could be easy for tech-savvy employees, but difficult for others.
- Use of thin clients. According to Boyd: “Thin clients — in which the device is merely used to display information and receive inputs, while the actual computing takes place on cloud servers — are an essential part of many cloud systems. However, if a client is too thin, accessibility tools might not work as they should.” Apparently it’s an issue with screen enlargers in particular, which are hard to run on a remote server.
- Rich data visualizations. Boyd noted: “Visualizations are a great way to make large data sets more digestible for the user but only if that user can see the visualization. There are tools and methods for getting around this but those aren’t always available in a cloud setup. Similar to the thin client issue, if the associated data lives on the cloud servers and isn’t accessible to the user, assistive tools might not be helpful.”
In another article for Federal Times, “9 examples of tech making it harder for people with disabilities,” Boyd lists the cases NIST has come up with to illustrate how current tech solutions can hinder access to government information for people with disabilities. He also cites concrete examples, using hypothetical situations, that drive home how someone who is, say, visually impaired may not be able to perform his or her job unless a change is implemented.
These examples apply to all businesses, not just government agencies:
- Unexpected software updates. “Unexpected software updates to the internal cloud application sometimes change the layout and cause the Braille display to lose focus.”
- Lack of resources to fix a fixable problem, like simplifying record-keeping or the fulfillment of maintenance orders.
- Strict digital processes can hamper productivity. Switching to new applications, some possibly in the cloud, may make it hard for some people to keep track of all their logins and passwords, and maintaining a “cheat sheet” is against policy for security reasons. Some agencies offer “low-vision” solutions like high-contrast settings or larger monitors, but not every employee may know these are available, and therefore may not ask for one.
- Captioning issues can be confusing and lead to missed opportunities. Some training videos companies use may have poor-quality captioning, or transcripts that don’t accurately reflect what’s being said in a particular part of the video.
- Telecommuting is another example: sign language interpretation is often unavailable during meeting calls, and real-time captioning (CART) is not always available, or is of poor quality. As a result, a person with disabilities may miss out on opportunities to fully participate in a meeting by asking questions and making comments, or may not pass a test at the end of a training video (some are mandatory).
- Not all cloud service providers support speech recognition systems. If, say, your employer starts using a cloud service provider whose virtual desktop does not recognize the speech recognition software an employee with disabilities uses to perform his or her job, that person may not be able to work unless a solution is found to support the current configuration.
- Lack of compatibility requirements for vendors. Service providers often have trouble getting assistive technology to work on virtual desktops running on remote servers. Even if the employer is aware of the problem (has tested it and found it lacking), there are currently no defined requirements for vendors to eliminate this compatibility issue.
- Online courses and training materials are graphics-heavy. This may render them unusable by blind and visually impaired employees, which, in turn, would hinder those employees’ professional development.
- Latest updates don’t work with old interfaces. Some apps were not designed “according to the accessibility recommendations for the operating system, which often change.” With each update, some functionality may be lost or become inaccessible to a person with a disability. As a result, that person’s value to the company may be diminished, leading to a lot of frustration.
Google’s disability initiatives
Google has been making impressive changes within the accessibility industry, not to mention awarding sizable sums to various disability initiatives. As Shaun Heasley reported on April 13 on the Disability Scoop website, it’s been one of the company’s missions for a while to help create and spread technology that would increase independence for people with disabilities. He wrote:
The company’s charitable arm, Google.org, said this week that it has selected 30 organizations to receive grants through its “Google Impact Challenge: Disabilities” initiative.
All told, Google is distributing more than $20 million to groups located in 13 different countries through the effort.
Google already has guidelines in place for developers on how to make their technology more accessible to people with disabilities. On April 11, Google also announced a new tool for Android on its blog.
Called Accessibility Scanner, it “lets developers test their own apps and receive suggestions on ways to enhance accessibility. For example, the tool might recommend enlarging small buttons, increasing the contrast between text and its background and more.”
To use it, open the app you want to scan, then tap the Accessibility Scanner button to find elements that might need accessibility improvements, such as missing content descriptions. Accessibility Scanner is available now for free from Google’s Play Store.
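To make the scanner’s suggestions more concrete, here is a minimal, hypothetical Android layout fragment showing the kinds of fixes it typically flags: adding a content description for screen readers and meeting the commonly recommended 48dp minimum touch-target size. The attribute names are standard Android layout XML, but the element’s id, drawable, and string resources are made up for illustration:

```xml
<!-- Hypothetical layout sketch: fixes of the kind Accessibility Scanner suggests. -->
<ImageButton
    android:id="@+id/share_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:minWidth="48dp"
    android:minHeight="48dp"
    android:src="@drawable/ic_share"
    android:contentDescription="@string/share_button_label" />
```

Without the `contentDescription`, a screen reader such as TalkBack would have nothing meaningful to announce for this button; without the minimum dimensions, the touch target may be too small for users with motor impairments.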
Reviews have been pouring in, and they are largely positive. Accessibility Scanner has been declared easy to use, with reviewers noting, of course, that the app suggests improvements but does not perform them.
Santiago Tiongco noted in Tech Times how the app “makes discovering accessibility shortcomings in an application more concrete and thus, lowers ‘the barrier for entry’ to discussions on accessibility.”
He also points out that reviewers liked the color contrast and target size features, as well as the screenshot feature, “which easily captures and outlines accessibility problems on an app’s user interface (UI), making it easier to address the problem and compiled for a ‘product team’.”
Android N preview
A few weeks ago, Google also announced a developer preview of Android N, which updates a few accessibility features:
Vision Settings on the Welcome screen. Vision Settings, which let users control magnification, font size, display size, and TalkBack, are brought front and center to the Welcome screen, which appears when the device is activated. That way, these features can be set up and activated right from the start.
Improved ChromeVox. ChromeVox, a built-in screen reader, lets users navigate the screen using text-to-speech software. The latest version, ChromeVox Next Beta, includes “a simplified keyboard shortcut model, a new caption panel to display speech and Braille output, and a new set of navigation sounds.”
Edit Google Docs with your voice. You can type, edit, and format documents in Google Docs using voice commands, such as “copy” or “insert table” — “making it easier for people who can’t use a touchscreen to edit documents.” Google continues to work with Freedom Scientific “to improve the Google Docs and Drive experience” with the JAWS screen reader.
Voice commands. Voice Access Beta is an app that “allows people who have difficulty manipulating a touch screen due to paralysis, tremor, temporary injury or other reasons to control their Android devices by voice. For example, you can say ‘open Chrome’ or ‘go home’ to navigate around the phone, or interact with the screen by saying ‘click next’ or ‘scroll down’.”