Theses and Dissertations from UMD
Permanent URI for this community: http://hdl.handle.net/1903/2
New submissions to the thesis/dissertation collections are added automatically as they are received from the Graduate School. Currently, the Graduate School deposits all theses and dissertations from a given semester after the official graduation date. This means that there may be up to a four-month delay in the appearance of a given thesis/dissertation in DRUM.
More information is available at Theses and Dissertations at University of Maryland Libraries.
5 results
Search Results
Item LUCID DREAMS: AN EXPLORATION IN IMMERSIVE INTERACTIVE STORYTELLING WITH AUGMENTED REALITY (2024) Lazar, Rashonda; Kachman, Misha; Theatre; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
This thesis examines my design process and discoveries while investigating how live performance and immersive storytelling can act as a form of augmented reality, and explores whether incorporating traditional forms of augmented reality can enhance a performance and build on the narrative agency audiences experience in immersive theater. The production opened on April 8th, 2024, in the Herman Maril Gallery at the Parren J. Mitchell Art and Sociology Building at the University of Maryland.

Item EFFECTIVENESS OF AUGMENTED REALITY INTERFACES FOR REMOTE HUMAN SWARM INTERACTION (2021) Oradiambalam Sachidanandam, Sarjana; Diaz-Mercado, Yancy; Systems Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
Human Swarm Interaction (HSI) is a fast-growing research area in swarm robotics. One challenging aspect of HSI is facilitating how humans can effectively handle the many degrees of freedom present in a swarm of robots. One emergent option is the use of Augmented Reality (AR) systems. AR-based interfaces are attractive because they can provide a human operator with visual cues about the swarm's states and control to facilitate decision-making. In research settings, AR systems can address issues such as limited availability of lab spaces, limited access to robotics resources, and the need to simulate dynamic environments with which robots and humans can interact. Further, to make swarm robotics more accessible and ubiquitous, HSI systems that support remote interaction would allow humans to interact with robot swarms and multi-robot systems regardless of the geographical distance between humans and swarms.
Taking these into consideration, this thesis aims to investigate the effectiveness of AR-based interfaces as tools for remote interaction in HSI systems. We develop a simple AR-based interface and evaluate its effectiveness against an unaugmented interface by means of remote human user studies. The results of these studies help demonstrate the effectiveness of AR-based interfaces for remote HSI.

Item AUGMENTED REALITY SYSTEMS AND USER INTERACTION TECHNIQUES FOR STEM LEARNING (2020) Kang, Seokbin; Jacobs, David; Computer Science; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
Learning practices and crosscutting concepts in science, technology, engineering, and mathematics (STEM) subjects pose challenges to young learners. Without external support to foster long-term interest and scaffold learning, children might lose interest in STEM subjects. While prior research has investigated how Augmented Reality (AR) may enhance learning of scientific concepts and increase student engagement, only a few studies have considered young children, who require developmentally appropriate approaches. The primary goal of my dissertation is to design, develop, and evaluate AR learning systems that engage children (ages 5-11) with STEM experiences. Leveraging advanced computer vision, machine learning, and sensing technologies, my dissertation explores novel user interaction techniques. The proposed techniques can give learners a chance to investigate STEM ideas in their own setting, what educators call contextual learning, and lower barriers to STEM learning practices. Using these systems, my research further investigates Human-Artificial Intelligence (AI) interaction: how children understand, use, and react to intelligent systems.
Specifically, my research has four major objectives: (i) gathering design ideas for AR applications that promote children's STEM learning; (ii) exploring AR user interaction techniques that utilize personally meaningful material for learning; (iii) developing and evaluating AR learning systems and learning applications; and (iv) building design implications for AR systems for education.

Item Real-time Audio Reverberation for Virtual Room Acoustics (2020) Shen, Justin M; Duraiswami, Ramani; Computer Science; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
For virtual and augmented reality applications, it is desirable to render audio sources in the space the user is in, in real time, without sacrificing the perceptual quality of the sound. One aspect of the rendering that is perceptually important for a listener is the late reverberation, or "echo," of the sound within a room environment. A popular method of generating plausible late reverberation in real time is the use of a Feedback Delay Network (FDN). However, an FDN first has to be tuned (usually manually) for a particular room before the late reverberation it generates becomes perceptually accurate. In this thesis, we propose a data-driven approach to automatically generate a pre-tuned FDN for any given room described by a set of room parameters. When combined with existing methods for rendering the direct path and early reflections of a sound source, we demonstrate the feasibility of rendering audio sources in real time for interactive applications.

Item Augmented Reality for Space Applications (2008-08-08) Di Capua, Massimiliano; Akin, David L.; Aerospace Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
Future space exploration will inevitably require astronauts to have a higher degree of autonomy in decision-making and in contingency identification and resolution.
Space robotics will eventually become a major aspect of this new challenge; therefore, the ability to access digital information will become crucial for mission success. In order to give suited astronauts the ability to operate robots and access all necessary information for nominal operations and contingencies, this thesis proposes the introduction of in-field-of-view head-mounted display systems in current extravehicular activity spacesuits. The system will be capable of feeding task-specific information on request and, through Augmented Reality technology, recognizing and overlaying information on the real world for error-checking and status purposes. The system will increase the astronaut's overall situational awareness and nominal task accuracy, reducing execution time and the risk of human error. The aim of this system is to relieve astronauts of trivial cognitive workload by guiding them through their operations and checking on them. Secondary objectives of the system include the introduction of electronic checklists and the ability to display the status of the suit and surrounding systems, as well as interaction capabilities. The features that could be introduced are virtually endless due to the nature of the system, allowing extreme flexibility and future evolution without major design changes. This work focuses on the preliminary design of an experimental head-mounted display and its testing for initial evaluation and comparison with existing information feed methods. The system will also be integrated and tested in the University of Maryland Space Systems Laboratory MX-2 experimental spacesuit analogue.