What do you see? A cognitive app for visual accessibility

Add audio descriptions to photographs taken with your smartphone

From the developerWorks archives

Leandro Cordeiro David

Date archived: October 4, 2017 | First published: March 07, 2016

Learn how to use IBM Bluemix, Apache Cordova, and the IBM Watson Visual Recognition and Text to Speech services to quickly build and run an accessible mobile app for users with visual impairments.

This content is no longer being updated or maintained. The full article is provided "as is" in a PDF file. Given the rapid evolution of technology, some content, steps, or illustrations may have changed.
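The summary above describes a pipeline: send a photo to Watson Visual Recognition, turn the returned class labels into a short description, and have Watson Text to Speech read it aloud. The sketch below illustrates that flow in Python. It is a minimal illustration, not the article's code: the endpoint URLs reflect the historical Watson REST APIs of that era, the credentials are placeholders, and the helper names (`description_from_classes`, `describe_photo`) are invented for this example.

```python
"""Hedged sketch of the photo-description pipeline the article summary
describes. Endpoint URLs, parameter names, and helpers are illustrative
assumptions, not the article's actual implementation."""

# Historical Watson endpoint URLs (assumptions for illustration).
VR_URL = "https://gateway.watsonplatform.net/visual-recognition/api/v3/classify"
TTS_URL = "https://stream.watsonplatform.net/text-to-speech/api/v1/synthesize"


def description_from_classes(classes, max_items=3):
    """Turn Visual Recognition class results (dicts with 'class' and
    'score' keys) into a single spoken-description sentence."""
    names = [c["class"] for c in
             sorted(classes, key=lambda c: c["score"], reverse=True)[:max_items]]
    if not names:
        return "I could not recognize anything in this photo."
    return "This photo may contain: " + ", ".join(names) + "."


def describe_photo(image_path, vr_api_key, tts_username, tts_password):
    """Classify a photo, then synthesize an audio description of it.

    Returns (description_text, wav_audio_bytes). Requires the third-party
    'requests' library and valid service credentials.
    """
    import requests  # third-party; imported lazily so the helpers above work without it

    # 1) Classify the photo with Visual Recognition.
    with open(image_path, "rb") as f:
        resp = requests.post(
            VR_URL,
            params={"api_key": vr_api_key, "version": "2016-05-20"},
            files={"images_file": f},
        )
    classes = resp.json()["images"][0]["classifiers"][0]["classes"]
    text = description_from_classes(classes)

    # 2) Synthesize the description with Text to Speech (returns WAV bytes).
    audio = requests.get(
        TTS_URL,
        params={"text": text, "voice": "en-US_AllisonVoice", "accept": "audio/wav"},
        auth=(tts_username, tts_password),
    )
    return text, audio.content
```

In the article's Cordova app the same two calls would be made from the device after the camera plugin captures a photo, with the resulting audio played back to the user.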


