What do you see? A cognitive app for visual accessibility
Add audio descriptions to photographs taken with your smartphone
From the developerWorks archives
Date archived: October 4, 2017 | First published: March 7, 2016
Learn how you can use IBM Bluemix, Apache Cordova, and the IBM Watson Visual Recognition and Text to Speech services to quickly build and run an accessible mobile app for users with visual impairments.
This content is no longer being updated or maintained. The full article is provided "as is" in a PDF file. Given the rapid evolution of technology, some content, steps, or illustrations may have changed.
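The pipeline the abstract describes (photograph captured with Cordova, labeled by Watson Visual Recognition, then read aloud by Text to Speech) can be sketched in the app's own language, JavaScript. This is a minimal illustration only: the credential, version date, and response shape below are placeholder assumptions, and the exact endpoints and parameters are in the archived PDF.

```javascript
// Sketch of the photo -> labels -> spoken-description flow. The API key,
// version date, and classifier response format are hypothetical placeholders,
// not the archived article's exact values.

// Build a Visual Recognition v3-style classify URL for an image upload.
function classifyUrl(apiKey) {
  const base =
    'https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classify';
  const params = new URLSearchParams({
    api_key: apiKey,       // service credential from Bluemix (placeholder)
    version: '2016-05-20', // API version date (assumption)
  });
  return `${base}?${params}`;
}

// Turn the classifier's labels into one short sentence that the
// Text to Speech service can synthesize for the user.
function describe(classes) {
  const labels = classes
    .filter((c) => c.score >= 0.5) // keep only confident labels
    .map((c) => c.class);
  return labels.length
    ? `This photo may contain: ${labels.join(', ')}.`
    : 'No description available.';
}

console.log(
  describe([
    { class: 'dog', score: 0.92 },
    { class: 'grass', score: 0.61 },
    { class: 'frisbee', score: 0.31 }, // below threshold, dropped
  ])
);
```

The sentence returned by `describe` would then be sent to the Text to Speech synthesize endpoint and played back in the Cordova app; the 0.5 confidence threshold is an arbitrary choice for the sketch.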