SXSW Spotlight: Accessible Complex Data

Editor’s note: This brief Q&A series will feature IBM researchers making presentations at the 2012 South by Southwest Interactive Conference in Austin, Texas.

Join the conversation: #sxswibm #accessdata

Susann Keohane and Brian Cragun, consultants for IBM Research’s Human Ability & Accessibility Center (referred to as the AbilityLab), will present Beyond a Thousand Words: Accessible Complex Data – a discussion about the accessibility challenges of analyzing, visualizing, and using today’s big data – on Tuesday, March 13.

Q: What kinds of solutions does the IBM AbilityLab develop? What is a recent example?

Susann Keohane

Our lab develops solutions to help everyone participate in technology. For example, Accessible Workplace Connections is a web-based application that lets employees with disabilities have the work accommodations they need delivered, changed, supported, and maintained effectively and efficiently.

And our Access My City application delivers real-time transit data, geo-location and mapping technologies, and publicly available accessibility information to mobile devices, helping residents and visitors with disabilities navigate a city.

Check out some of our other research projects here.

Q: At SXSW, you will be discussing “Accessible Complex Data.” What kinds of new accessibility challenges are being posed by complex data? 

Brian Cragun

We all struggle to find pearls in the ocean of complex data. Well-chosen graphical visualizations have the ability to communicate key information quickly.

But as generally implemented, these complex visualizations are inaccessible to the blind. The question we are working to answer is: how can blind users approximate the high-bandwidth understanding and autonomous discovery of key information that sighted users gain from complex visualizations, such as stock market history, census trends, or scientific data?

Q: What about smart devices – phones, televisions, etc. – that access the data? How are they a part of making information accessible (or preventing accessibility)?

Smart devices make information available anywhere, at any time. When users move to a smart device, many will be affected by what we call “situational” disability: bright outdoor light, a tiny screen, having only one hand free, riding on a bumpy road, or needing to access information without touching or looking at the device while driving.

More than ever, these situations emphasize the need for inclusive design. The research we work on for core disability types (deaf, blind, mobility impaired) will benefit all users of smart devices.

Q: How is IBM making today’s flood of data, and the way it’s analyzed and shown, more accessible?

This is a great question – and the core of our presentation.

In current products, we provide user interaction with graphs, allowing the user to sift, sort, scale, and filter the information. These capabilities are already available to the visually impaired. Now, research is looking at navigating the graphs with audible cues, so users can explore the visualization for themselves.
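
As a rough illustration of the audible-cue idea (a sketch only, not IBM’s implementation), the example below maps a data series to a sequence of tones so that a rising trend is heard as rising pitch. The function names, pitch range, and sample data are assumptions chosen for the example.

```python
# Illustrative sonification sketch: one tone per data point, pitch tracks value.
# Hypothetical example code, not a product API.
import math
import struct
import wave

SAMPLE_RATE = 44100  # samples per second

def value_to_frequency(value, lo, hi, f_min=220.0, f_max=880.0):
    """Linearly map a data value onto a pitch range (roughly A3 to A5)."""
    if hi == lo:
        return (f_min + f_max) / 2
    return f_min + (value - lo) / (hi - lo) * (f_max - f_min)

def tone(frequency, duration=0.25, amplitude=0.4):
    """Generate raw 16-bit PCM samples for a single sine-wave tone."""
    n = int(SAMPLE_RATE * duration)
    return b"".join(
        struct.pack("<h", int(amplitude * 32767 *
                              math.sin(2 * math.pi * frequency * i / SAMPLE_RATE)))
        for i in range(n)
    )

def sonify(series, path="series.wav"):
    """Write one tone per data point; rising values produce rising pitch."""
    lo, hi = min(series), max(series)
    with wave.open(path, "wb") as out:
        out.setnchannels(1)
        out.setsampwidth(2)       # 16-bit samples
        out.setframerate(SAMPLE_RATE)
        for value in series:
            out.writeframes(tone(value_to_frequency(value, lo, hi)))

if __name__ == "__main__":
    # e.g. a week of closing stock prices: a listener can "hear" the overall
    # trend before stepping through individual points
    sonify([101.2, 103.5, 99.8, 104.1, 107.3, 106.0, 110.4])
```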

We’re also looking at how to convert the visualizations into descriptive text, so any user needing information in a hands-free or eyes-free environment can benefit. Technologies on the horizon, such as electrostatic screens, electrical sensations, and other tactile feedback tools, will provide additional sensory ways to explore and effectively utilize complex data.
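
A minimal sketch of the descriptive-text idea, assuming a simple numeric series: the summary wording and the describe_series helper below are hypothetical, not part of any IBM product.

```python
# Hypothetical sketch: turn a chart's underlying series into a short
# eyes-free textual summary (trend, endpoints, and range).
def describe_series(name, values, unit=""):
    """Produce a one-sentence summary of a numeric series."""
    start, end = values[0], values[-1]
    lo, hi = min(values), max(values)
    trend = "rose" if end > start else "fell" if end < start else "held steady"
    return (
        f"{name} {trend} from {start}{unit} to {end}{unit} "
        f"over {len(values)} points, ranging between {lo}{unit} and {hi}{unit}."
    )

print(describe_series("Average commute time", [34, 36, 35, 39, 41], " min"))
# -> Average commute time rose from 34 min to 41 min over 5 points,
#    ranging between 34 min and 41 min.
```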

Q: What needs to happen to make accessibility an automatic part of the process in expressing data?

Better mappings of visual information to other sensory modes need to be researched and proven.

A taxonomy of graphs and content, with corresponding navigation and audible output, can standardize interactions and provide a foundation for new graph types in the future, as sketched below.
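
Purely as an illustration of what such a taxonomy might look like, the sketch below pairs a few common chart types with candidate navigation commands and audible output rules. The categories and field names are assumptions for the example, not a published standard.

```python
# Illustrative taxonomy sketch: chart types mapped to standardized
# navigation commands and audible output conventions (hypothetical).
GRAPH_TAXONOMY = {
    "line": {
        "navigation": ["next point", "previous point", "jump to min", "jump to max"],
        "audio": "pitch tracks the y-value; a click marks each x-axis gridline",
    },
    "bar": {
        "navigation": ["next category", "previous category", "sort by value"],
        "audio": "tone length is proportional to bar height; label is spoken first",
    },
    "scatter": {
        "navigation": ["sweep left to right", "cluster summary", "select region"],
        "audio": "point density within each sweep step is rendered as loudness",
    },
}

def interaction_spec(chart_type):
    """Look up the standardized interaction pattern for a chart type."""
    return GRAPH_TAXONOMY.get(chart_type, GRAPH_TAXONOMY["line"])
```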

Attending SXSW? Add Susann and Brian’s presentation to your schedule.
