
OpenAIR Winners Leverage IBM Accessibility Tools


The Knowbility OpenAIR web accessibility challenge announced the winners of its 19th annual competition, which pairs teams of web developers and designers with registered nonprofits to create or improve the nonprofits' websites and make them more accessible to people with disabilities.

As part of this year’s competition, IBM made its accessibility testing tools available to help participants create accessible websites and content and to manage conformance with industry standards. The tools included IBM AbilityLab™ Dynamic Assessment Plugin (DAP), Digital Content Checker, and Automated Accessibility Tester.

The judges also used DAP to test the accessibility of the contestants’ websites, which were created for more than a dozen local and international nonprofits. DAP is a browser extension that integrates automated accessibility testing with standard browser development tools. It was designed to let developers quickly identify accessibility issues and evaluate potential fixes, including those related to Accessible Rich Internet Applications (ARIA) implementations.
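DAP itself ships as a browser extension rather than a library, so its internals aren't shown here. As a rough illustration of what this kind of automated ARIA audit looks like in code, here is a minimal sketch using the open-source axe-core engine as a stand-in (this is not DAP's API):

```typescript
import axe from "axe-core";

// Run automated checks against the current document, restricted to
// axe-core's ARIA rule category, and log each violation found.
axe
  .run(document, { runOnly: { type: "tag", values: ["cat.aria"] } })
  .then((results) => {
    for (const violation of results.violations) {
      console.log(`${violation.id}: ${violation.help}`);
      for (const node of violation.nodes) {
        // node.target holds the CSS selector(s) of the offending element.
        console.log(`  at ${node.target.join(", ")}`);
      }
    }
  });
```

Tools in this space typically surface each failed rule alongside the element that triggered it, which is what makes evaluating a candidate fix fast: re-run the audit and see whether the violation disappears.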

Jessica Looney, Program Director at Knowbility, said, “DAP was a remarkably easy tool to provide teams, mentors and judges. Unlike some of the other accessibility testing tools, the OpenAIR training committee was able to produce a simple video that demonstrated how to load and locate the tool. Once open, the tool usage is self-evident providing insight into standards and best practices in an easy to understand format.”

Several of this year’s teams leveraged DAP, including Microassist, the second-place team in the industry professional track, which used it in its submission for 4C (Christian Churches Collaborating for Changes). Microassist is an Austin, Texas-based learning and development firm that delivers and hosts customer usability and accessibility testing.

Shashi Ganesan, Team Lead from Microassist, commented on his group’s work with DAP. “Our team has experience with accessibility due to our extensive work with government and higher education. We found the DAP tool reduced our need to use the W3C (World Wide Web Consortium) code validator to check code syntax and was good at giving guidance and best practices. For example, the data table audit was the best we have seen in an automated tool.”
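For context on what a data table audit checks: accessible tables name themselves with a caption and explicitly scope their header cells so assistive technology can announce which headers apply to each data cell. A minimal sketch of that pattern, with made-up figures, built via the DOM:

```typescript
// Minimal sketch of the markup pattern a data table audit looks for:
// a <caption> naming the table, and <th> cells scoped to their column
// or row. (All table contents are invented for illustration.)
const table = document.createElement("table");
table.innerHTML = `
  <caption>Donations by program (illustrative)</caption>
  <thead>
    <tr><th scope="col">Program</th><th scope="col">Q1</th><th scope="col">Q2</th></tr>
  </thead>
  <tbody>
    <tr><th scope="row">Outreach</th><td>$4,200</td><td>$5,100</td></tr>
    <tr><th scope="row">Training</th><td>$2,800</td><td>$3,350</td></tr>
  </tbody>`;
document.body.append(table);
```

A layout-only table with bare `<td>` cells, or headers left unscoped in a complex table, is exactly the kind of issue such an audit flags.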

IBM congratulates all the teams that entered this year’s competition and are dedicated to making the world more accessible.

For more information:

  • For those interested in trying DAP, please contact us.
  • Digital Content Checker and Automated Accessibility Tester are currently available as experimental services on IBM Bluemix. These services are designed to give designers and developers an easy, fast way to incorporate and strengthen accessibility in web applications and content; a hypothetical sketch of calling such a service appears below.
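As a purely hypothetical sketch of how a developer might call a REST-style accessibility checking service from code (the endpoint, headers, and response handling below are invented for illustration and are not the documented API of either service):

```typescript
// Hypothetical sketch only: the URL and request/response shapes here
// are invented for illustration; consult each service's own
// documentation for its real API.
async function checkAccessibility(html: string): Promise<void> {
  const response = await fetch("https://checker.example.com/api/check", {
    method: "POST",
    headers: { "Content-Type": "text/html" },
    body: html,
  });
  if (!response.ok) {
    throw new Error(`Checker returned HTTP ${response.status}`);
  }
  const report = await response.json();
  console.log(report); // e.g., a list of issues found in the submitted markup
}

checkAccessibility("<main><img src='hero.png'></main>"); // image missing alt text
```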

Social Media Manager & Webmaster, IBM Accessibility
