Interactive Sonification of Abstract Data - Framework, Design Space, Evaluation, and User Tool

dc.contributor.advisor: Shneiderman, Ben
dc.contributor.author: Zhao, Haixia
dc.contributor.department: Computer Science
dc.contributor.publisher: Digital Repository at the University of Maryland
dc.contributor.publisher: University of Maryland (College Park, Md.)
dc.date.accessioned: 2006-06-14T05:37:56Z
dc.date.available: 2006-06-14T05:37:56Z
dc.date.issued: 2006-04-24
dc.description.abstract: For people with visual impairments, sound is an important information channel. The traditional accommodation for visually impaired users to access data is to rely on screen readers that speak the data in tabular form. While speech can describe information accurately, such presentations tend to be long and make complex information hard to grasp. This is particularly true in exploratory data analysis, in which users often need to examine the data from different perspectives. Sonification, the use of non-speech sound, has been shown to aid data comprehension. Previous data sonifications focus on data-to-sound attribute mappings and typically lack support for task-oriented data interaction. This dissertation makes four contributions. (1) An Action-by-Design-Component (ADC) framework guides auditory interface design for exploratory data analysis. The framework characterizes data interaction in the auditory mode as a set of Auditory Information Seeking Actions (AISAs). It also discusses design considerations for a set of Design Components that support AISAs, contrasted with actions in visualizations. (2) Applying the framework to geo-referenced statistical data, I explore its design space. Through user evaluations, effective design options were identified, and insights were gained into the human ability to perceive complex information, especially information with spatial structure, from interactive sounds. (3) A tool, iSonic, was developed with synchronized visual and auditory displays. Forty-two hours of case studies with seven blind users show that iSonic enables them to explore data effectively in highly coordinated map and table views without special devices, and to find facts and discover data trends even in unfamiliar geographical contexts. Preliminary algorithms are also described that automatically generate spatial sweep orders of geographical regions for arbitrary maps. (4) The application to geo-referenced data demonstrated that the ADC framework provides a rich set of task-oriented actions (AISAs) that are effective for blind users accomplishing complex tasks with multiple highly coordinated data views. It also showed that some widely used visualization techniques can be adapted to the auditory mode. By applying the framework to scatterplots and line graphs, I show that the framework generalizes and leads to the design of a unified auditory workspace for general exploratory data analysis.
dc.format.extent: 2213994 bytes
dc.format.mimetype: application/pdf
dc.identifier.uri: http://hdl.handle.net/1903/3394
dc.language.iso: en_US
dc.subject.pqcontrolled: Computer Science
dc.subject.pqcontrolled: Information Science
dc.subject.pqcontrolled: Psychology, Cognitive
dc.subject.pquncontrolled: Vision impairment
dc.subject.pquncontrolled: Sonification
dc.subject.pquncontrolled: Auditory user interfaces
dc.subject.pquncontrolled: Universal usability
dc.subject.pquncontrolled: Accessibility
dc.subject.pquncontrolled: Maps
dc.title: Interactive Sonification of Abstract Data - Framework, Design Space, Evaluation, and User Tool
dc.type: Dissertation

Files

Original bundle
Name: umi-umd-3206.pdf
Size: 2.11 MB
Format: Adobe Portable Document Format