
Comments (2)

1. localhost commented:

I had a friend at the University of Illinois Urbana-Champaign who did research on sonification to assist the visually impaired. I don't have a link to their research online, but I believe they concluded that, for certain types of abstract data, sonified data was actually more effective at expressing trends to visually impaired subjects than visual data was at expressing trends to sighted subjects. Furthermore, their study concluded that the subjects given sonified data found more interesting and non-obvious patterns than the subjects given visual data.

Personally, I would like to see a combination of several modes of representation: visual, auditory, and haptic (touch). Using several modes of feedback allows developers to express several dimensions of data to a user at once. Just look at how rich the user experience of a modern video game has become. Could you imagine the potential of an administrator hooked up to a system that provided a multi-dimensional feed of real-time data expressed through several channels at once? Critical data could be reinforced through both sight and sound; alerts and warnings could be accented through force feedback; and, like a video game, I'd imagine that the administrators' responses would eventually become as autonomic as the system they're working with.
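The basic sonification technique the comment describes can be sketched with nothing but the Python standard library: map each data value onto an audible pitch range, then render each point as a short sine tone, so an upward trend in the data is heard as a rising melody. This is an illustrative toy, not the method used in the research mentioned above; the function names `value_to_freq` and `sonify` and the chosen pitch range are my own assumptions.

```python
import math
import struct
import wave

def value_to_freq(v, vmin, vmax, f_lo=220.0, f_hi=880.0):
    """Linearly map a data value into an audible pitch range (Hz)."""
    if vmax == vmin:
        return (f_lo + f_hi) / 2.0
    t = (v - vmin) / (vmax - vmin)
    return f_lo + t * (f_hi - f_lo)

def sonify(series, path="trend.wav", rate=8000, note_sec=0.2):
    """Render each data point as a short sine tone in a mono 16-bit WAV file."""
    vmin, vmax = min(series), max(series)
    frames = bytearray()
    for v in series:
        freq = value_to_freq(v, vmin, vmax)
        for n in range(int(rate * note_sec)):
            # Half-amplitude sine sample, packed as little-endian signed 16-bit
            sample = int(32767 * 0.5 * math.sin(2 * math.pi * freq * n / rate))
            frames += struct.pack("<h", sample)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(bytes(frames))

# An upward trend becomes a rising sequence of pitches.
sonify([1, 3, 2, 5, 8, 13])
```

Richer sonifications might map additional data dimensions to loudness, timbre, or stereo position, which is exactly the multi-channel idea the comment goes on to describe.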

2. localhost commented:

Great comment, thank you. I think there is significant potential in your idea and couldn't agree with you more. Let's face it: humans are designed to use their senses in combination more than in isolation. I have also observed that people vary as to which sense plays the dominant role. Personally, I think I would find an auditory-based system more intuitive than a visual or haptic one, for example.
