BIO-INSPIRED MOTION PERCEPTION: FROM GANGLION CELLS TO AUTONOMOUS VEHICLES

dc.contributor.advisor: Aloimonos, Yiannis (en_US)
dc.contributor.advisor: Fermüller, Cornelia (en_US)
dc.contributor.author: Parameshwara, Chethan Mysore (en_US)
dc.contributor.department: Neuroscience and Cognitive Science (en_US)
dc.contributor.publisher: Digital Repository at the University of Maryland (en_US)
dc.contributor.publisher: University of Maryland (College Park, Md.) (en_US)
dc.date.accessioned: 2022-09-23T05:33:49Z
dc.date.available: 2022-09-23T05:33:49Z
dc.date.issued: 2022 (en_US)
dc.description.abstract (en_US):
Animals are remarkable at navigation, even in extreme situations. Through motion perception, animals compute their own movement (egomotion) and detect other objects (prey, predators, obstacles) and their motions in the environment. Analogously, artificial systems such as robots need to know where they are relative to scene structure and must segment out obstacles to avoid collisions. Even though substantial progress has been made in the development of artificial visual systems, they still struggle to achieve robust and generalizable solutions. To this end, I propose a bio-inspired framework that narrows the gap between natural and artificial systems. Standard approaches to robot motion perception seek to reconstruct a three-dimensional model of the scene and then use this model to estimate egomotion and segment objects. However, scene reconstruction is data-heavy and computationally expensive, and it fails in high-speed and dynamic scenarios. Biological visual systems, by contrast, excel in these difficult situations by extracting only the minimal information sufficient for motion perception. Throughout this thesis, I derive minimalist/purposive ideas from biological processes and develop mathematical solutions to robot motion perception problems. I develop a full range of solutions that use bio-inspired motion representations and learning approaches for motion perception tasks, focusing on egomotion estimation and motion segmentation. I make four main contributions:
1. I introduce NFlowNet, a neural network that estimates normal flow (bio-inspired motion filters; see the sketch after this record). Normal flow estimation opens a new avenue for solving egomotion in a robust, qualitative framework.
2. Building on normal flow, I propose the DiffPoseNet framework, which estimates egomotion by formulating the qualitative constraint as a differentiable optimization layer, enabling end-to-end learning.
3. Using a neuromorphic event camera, a retina-inspired vision sensor, I develop 0-MMS, a model-based optimization approach that uses event spikes to segment the scene into multiple moving parts in high-speed and dynamic lighting scenarios.
4. To improve the precision of event-based motion perception over time, I develop SpikeMS, a novel bio-inspired learning approach that fully capitalizes on the rich temporal information in event spikes.
dc.identifier: https://doi.org/10.13016/2ijr-ceza
dc.identifier.uri: http://hdl.handle.net/1903/29252
dc.language.iso: en (en_US)
dc.subject.pqcontrolled: Neurosciences (en_US)
dc.subject.pqcontrolled: Computer science (en_US)
dc.subject.pqcontrolled: Robotics (en_US)
dc.subject.pquncontrolled: Camera Pose Estimation (en_US)
dc.subject.pquncontrolled: Differentiable Programming (en_US)
dc.subject.pquncontrolled: Event Vision (en_US)
dc.subject.pquncontrolled: Multi Motion Segmentation (en_US)
dc.subject.pquncontrolled: Normal Flow (en_US)
dc.subject.pquncontrolled: Spiking Neural Network (en_US)
dc.title: BIO-INSPIRED MOTION PERCEPTION: FROM GANGLION CELLS TO AUTONOMOUS VEHICLES (en_US)
dc.type: Dissertation (en_US)
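
Terminology note: normal flow, the quantity estimated by NFlowNet in contribution 1 of the abstract, is the component of image motion along the local brightness gradient. The following is a minimal LaTeX sketch of the standard textbook definition derived from brightness constancy; it is background material, not a formulation quoted from the dissertation.

% Standard normal-flow definition (background sketch, not quoted from the thesis).
% Brightness constancy: intensity I(x, y, t) is conserved along the image motion (u, v).
\begin{align*}
  I_x u + I_y v + I_t &= 0, \\
  % Only the flow component along the gradient (I_x, I_y) is locally observable
  % (the aperture problem); that component is the normal flow v_n:
  v_n &= \frac{-I_t}{\sqrt{I_x^{2} + I_y^{2}}}.
\end{align*}

Because only v_n is observable from local brightness measurements (the aperture problem), working directly with normal flow fits the abstract's minimalist framing: the local measurement is used as-is rather than first reconstructing full optical flow or a three-dimensional scene model.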

Files

Original bundle
Name: Parameshwara_umd_0117E_22654.pdf
Size: 26.31 MB
Format: Adobe Portable Document Format