HandSight: A Touch-Based Wearable System to Increase Information Accessibility for People with Visual Impairments

dc.contributor.advisor: Froehlich, Jon E
dc.contributor.advisor: Chellappa, Rama
dc.contributor.author: Stearns, Lee Stephen
dc.contributor.department: Computer Science
dc.contributor.publisher: Digital Repository at the University of Maryland
dc.contributor.publisher: University of Maryland (College Park, Md.)
dc.date.accessioned: 2019-02-01T06:34:35Z
dc.date.available: 2019-02-01T06:34:35Z
dc.date.issued: 2018
dc.description.abstract: Many activities of daily living, such as getting dressed, preparing food, wayfinding, or shopping, rely heavily on visual information, and the inability to access that information can negatively impact the quality of life of people with vision impairments. While numerous researchers have explored solutions for assisting with visual tasks performed at a distance, such as identifying landmarks for navigation or recognizing people and objects, few have attempted to provide access to nearby visual information through touch. Touch is a highly attuned means of acquiring tactile and spatial information, especially for people with vision impairments. By supporting touch-based access to information, we may help users better understand how a surface appears (e.g., document layout, clothing patterns), thereby improving their quality of life. To address this gap, this dissertation explores methods to augment a visually impaired user's sense of touch with interactive, real-time computer vision to access information about the physical world. These explorations span three application areas: reading and exploring printed documents, controlling mobile devices, and identifying colors and visual textures. At the core of each application is a system called HandSight that uses wearable cameras and other sensors to detect touch events and identify surface content beneath the user's finger. To create HandSight, we designed and implemented the physical hardware, developed signal processing and computer vision algorithms, and designed real-time feedback that enables users to interpret visual or digital content. We involved visually impaired users throughout the design and development process, conducting several user studies to assess usability and robustness and to improve our prototype designs.
The contributions of this dissertation include: (i) developing and iteratively refining HandSight, a novel wearable system to assist visually impaired users in their daily lives; (ii) evaluating HandSight across a diverse set of tasks and identifying tradeoffs of a finger-worn approach in terms of physical design, algorithmic complexity and robustness, and usability; and (iii) identifying broader design implications for future wearable systems and for the fields of accessibility, computer vision, augmented and virtual reality, and human-computer interaction.
dc.identifier: https://doi.org/10.13016/wfbj-ulxc
dc.identifier.uri: http://hdl.handle.net/1903/21613
dc.language.iso: en
dc.subject.pqcontrolled: Computer science
dc.subject.pquncontrolled: accessibility
dc.subject.pquncontrolled: augmented reality
dc.subject.pquncontrolled: computer vision applications
dc.subject.pquncontrolled: visually impaired users
dc.subject.pquncontrolled: wearable cameras
dc.title: HandSight: A Touch-Based Wearable System to Increase Information Accessibility for People with Visual Impairments
dc.type: Dissertation

Files

Original bundle

Name: Stearns_umd_0117E_19443.pdf
Size: 19.55 MB
Format: Adobe Portable Document Format