Minimal Perception: Enabling Autonomy on Resource-Constrained Robots

dc.contributor.advisor: Aloimonos, Yiannis (en_US)
dc.contributor.author: Singh, Chahat Deep (en_US)
dc.contributor.department: Computer Science (en_US)
dc.contributor.publisher: Digital Repository at the University of Maryland (en_US)
dc.contributor.publisher: University of Maryland (College Park, Md.) (en_US)
dc.date.accessioned: 2023-10-07T05:32:04Z
dc.date.available: 2023-10-07T05:32:04Z
dc.date.issued: 2023 (en_US)
dc.description.abstract: Mobile robots are widely used across diverse fields because they can perform tasks autonomously. They improve efficiency and safety and enable novel applications such as precision agriculture, environmental monitoring, disaster management, and inspection. Perception, a robot's ability to gather, process, and interpret environmental data, is central to this autonomy: it supports navigation, object identification, and real-time reaction. With onboard perception, robots can operate without constant human intervention, even in remote or hazardous areas, improving adaptability and scalability.

This thesis addresses the challenge of building autonomous systems for small robots used in precise tasks such as confined-space inspection and robotic pollination, where computing, power, and sensing constraints limit real-time perception. We draw inspiration from small organisms such as insects and hummingbirds, which achieve sophisticated perception, navigation, and survival despite minimalistic sensory and neural systems. The research provides insights into designing compact, efficient, and minimal perception systems for tiny autonomous robots: by streamlining their design and functionality, these robots can maximize efficiency and overcome the limitations imposed by their size.

We propose a Minimal Perception framework that enables onboard autonomy in resource-constrained robots at scales (as small as a credit card) that were not possible before. Minimal perception is a simplified, efficient, and selective approach, in both hardware and software, to gathering and processing sensory information, and a task-centric perspective refines it further. Jumping spiders, for instance, are only half an inch long, yet their sparse vision, distributed across multiple eyes, lets them perceive their surroundings and capture prey with remarkable agility.

The contributions of this work are:

1. Exploiting minimal quantities, such as the uncertainty in optical flow, to enable autonomous navigation, static and dynamic obstacle avoidance, and flight through unknown gaps (see the first sketch below).
2. Applying the principles of interactive perception to segment objects in cluttered environments without relying on neural-network training for object recognition.
3. Introducing WorldGen, a generative simulator that can produce countless cities and petabytes of high-quality annotated data, reducing the need for laborious 3D modeling and annotation in perception and autonomy tasks.
4. Predicting dense metric depth maps in never-seen, out-of-domain environments by fusing a conventional RGB camera with a sparse 64-pixel depth sensor (see the second sketch below).
5. Demonstrating these capabilities on both aerial and ground robots: (a) an autonomous car smaller than a credit card (70 mm) and (b) a bee drone 120 mm in length, showing navigation, depth perception in all four principal directions, and avoidance of both static and dynamic obstacles.

In conclusion, integrating the minimal perception framework into tiny mobile robots unlocks their perception and autonomy potential and points toward a future in which tiny robots operate synergistically in swarms, advancing fields such as exploration, disaster response, and distributed sensing. (en_US)
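Contribution 1 hinges on treating optical-flow uncertainty itself as a navigation signal. As a minimal sketch of that idea, the Python snippet below estimates per-pixel flow uncertainty via forward-backward consistency and uses it as a cheap obstacle cue; the function name, the Farneback parameters, and the threshold are illustrative assumptions, not the dissertation's actual pipeline.

```python
# Hypothetical sketch: per-pixel optical-flow uncertainty from
# forward-backward consistency. Assumes OpenCV and two grayscale frames.
import cv2
import numpy as np

def flow_uncertainty(prev_gray, curr_gray):
    """Return dense flow and a per-pixel uncertainty proxy.

    Pixels whose forward flow is not undone by the backward flow are
    unreliable; such inconsistency often coincides with occlusions,
    i.e. obstacle boundaries.
    """
    fwd = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                       0.5, 3, 15, 3, 5, 1.2, 0)
    bwd = cv2.calcOpticalFlowFarneback(curr_gray, prev_gray, None,
                                       0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Follow the forward flow, then sample the backward flow there.
    map_x, map_y = xs + fwd[..., 0], ys + fwd[..., 1]
    bwd_at_fwd = cv2.remap(bwd, map_x, map_y, cv2.INTER_LINEAR)
    # A consistent round trip returns each pixel to its start.
    err = np.linalg.norm(fwd + bwd_at_fwd, axis=-1)
    return fwd, err

# Usage: treat high-uncertainty regions as candidate obstacles.
# flow, unc = flow_uncertainty(prev_frame, curr_frame)
# obstacle_mask = unc > 1.5   # threshold in pixels; tuning is assumed
```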
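Contribution 4 fuses a dense RGB image with a sparse 64-pixel metric depth grid. The sketch below shows one plausible way to wire such a fusion network in PyTorch; the architecture, layer sizes, and the name `SparseDepthFusion` are assumptions for illustration, not the network proposed in the thesis.

```python
# Hypothetical sketch: fusing an RGB image with an 8x8 (64-pixel) metric
# depth grid to predict dense metric depth.
import torch
import torch.nn as nn

class SparseDepthFusion(nn.Module):
    def __init__(self):
        super().__init__()
        # Shallow RGB encoder; a real system would use a stronger backbone.
        self.rgb_enc = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        )
        # Decoder maps fused features to a single metric depth channel.
        self.decoder = nn.Sequential(
            nn.Conv2d(33, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1), nn.Softplus(),  # depth >= 0
        )

    def forward(self, rgb, sparse_depth):
        # rgb: (B, 3, H, W); sparse_depth: (B, 1, 8, 8) in metres.
        feats = self.rgb_enc(rgb)
        # Upsample the 64-pixel grid to image resolution; the sparse metric
        # readings anchor the scale that RGB alone cannot recover.
        dense_prior = nn.functional.interpolate(
            sparse_depth, size=rgb.shape[-2:], mode="bilinear",
            align_corners=False)
        fused = torch.cat([feats, dense_prior], dim=1)
        return self.decoder(fused)

# Usage with dummy tensors:
# net = SparseDepthFusion()
# depth = net(torch.rand(1, 3, 240, 320), torch.rand(1, 1, 8, 8) * 4.0)
```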
dc.identifier: https://doi.org/10.13016/dspace/hy65-kdus
dc.identifier.uri: http://hdl.handle.net/1903/30819
dc.language.iso: en (en_US)
dc.subject.pqcontrolled: Robotics (en_US)
dc.subject.pqcontrolled: Artificial intelligence (en_US)
dc.subject.pquncontrolled: Artificial Intelligence (en_US)
dc.subject.pquncontrolled: Autonomy (en_US)
dc.subject.pquncontrolled: Minimal (en_US)
dc.subject.pquncontrolled: Perception (en_US)
dc.subject.pquncontrolled: Robotics (en_US)
dc.subject.pquncontrolled: Tiny Robots (en_US)
dc.title: Minimal Perception: Enabling Autonomy on Resource-Constrained Robots (en_US)
dc.type: Dissertation (en_US)

Files

Original bundle

Name: Singh_umd_0117E_23632.pdf
Size: 61.17 MB
Format: Adobe Portable Document Format