UMD Theses and Dissertations

Permanent URI for this collection: http://hdl.handle.net/1903/3

New submissions to the thesis/dissertation collections are added automatically as they are received from the Graduate School. Currently, the Graduate School deposits all theses and dissertations from a given semester after the official graduation date. This means that there may be up to a four-month delay before a given thesis/dissertation appears in DRUM.

More information is available at Theses and Dissertations at University of Maryland Libraries.

Search Results

Now showing 1 - 3 of 3
  • Item
    A FRAMEWORK FOR DEXTEROUS MANIPULATION THROUGH TACTILE PERCEPTION
    (2022) Ganguly, Kanishka; Aloimonos, Yiannis; Computer Science; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    A long-anticipated, yet hitherto unfulfilled goal in robotics research has been to have robotic agents seamlessly integrate with humans in their natural environments and perform useful tasks alongside them. While tremendous progress has been made in allowing robots to perceive visually, and to understand and reason about the scene, the act of manipulating that environment remains a challenging and incomplete task. For robotic agents to perform useful tasks in environments that are not specifically designed for their operation, dexterous manipulation guided by some form of tactile perception is crucial. While visual perception provides a large-scale understanding of the environment, tactile perception allows fine-grained understanding of objects and textures. For truly useful robotic agents, a tightly coupled system comprising both visual and tactile perception is a necessity. Tactile sensing hardware can be placed on a spectrum, with form factor at one end and sensing accuracy and robustness at the other; most off-the-shelf sensors available today trade one of these features for the other. The tactile sensor used in this research, the BioTac SP, was selected for its anthropomorphic qualities, such as its shape and sensing mechanism, while compromising on the quality of its sensory outputs. The sensor provides a sensing surface and returns 24 tactile data points at each timestamp, along with pressure values.
    We first present a novel method for contact and motion estimation through visual perception, in which we perform non-rigid registration of a human performing actions and compute dense motion-estimation trajectories. These are used to compute topological scene changes and are refined to obtain object and contact segmentation. We then ground these contact points and motion trajectories to an intermediate action graph, which can then be executed by a robot agent. Secondly, we introduce the concept of computational tactile flow, inspired by fMRI studies on humans in which the same parts of the brain that react to optical motion stimuli were found to also react to tactile stimuli. We mathematically model the BioTac SP sensor and interpolate surfaces in two and three dimensions, on which we compute tactile flow fields. We demonstrate the flow fields on various surfaces and suggest several useful applications of tactile flow. We next apply tactile feedback to a novel controller that is able to grasp objects without any prior knowledge of their shape, material, or weight. We apply tactile flow to detect slippage during grasping and adjust the finger forces to maintain a stable grasp during motion. We demonstrate success on transparent and soft, deformable objects, alongside other regularly shaped samples. Lastly, we take a different approach to processing tactile data, in which we compute tactile events, taking inspiration from the neuromorphic computing literature. We compute spatio-temporal gradients on the raw tactile data to generate event surfaces, which are more robust and reduce sensor noise. This intermediate surface is then used to track contact regions over the BioTac SP sensor skin, and allows us to detect slippage, track spatial edge contours, and estimate the magnitude of applied forces. (A minimal, illustrative sketch of the tactile-flow and slip-detection idea appears after this results list.)
  • Item
    MODEL-BASED SYSTEMS ENGINEERING SIMULATION FRAMEWORK FOR ROBOT GRASPING
    (2021) Menaka Sekar, Praveen Kumar; Baras, John S; Systems Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    The constant rise in industrial use of robots for commercial applications has created a need for rapid, efficient, and reliable robotic system development processes. Integrating tools from various disciplines to perform design space exploration, taking stakeholder and system requirements into consideration, is one major step in this direction. In this thesis, we apply Model-Based Systems Engineering (MBSE) principles to a simple pick-and-place task. We do this by integrating the Cameo Systems Modeling Language (SysML) tool, the CoppeliaSim robot simulator, and the Gurobi Optimizer to facilitate and accelerate the design process for a robot grasping system. A simulation-based Verification & Validation approach supports design space exploration to obtain optimal design solutions, thereby leading to successful and profitable deployment and operation. (An illustrative sketch of such a design-space-exploration loop appears after this results list.)
  • Item
    A Self-Sealing Suction Technology for Versatile Grasping
    (2018) Kessens, Chad; Desai, Jaydev P; Mechanical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
    This thesis describes the design, development, and evaluation of a novel "self-sealing" suction technology for grasping. As humans desire robots capable of handling an increasingly diverse set of tasks, end effectors that are able to grasp the widest possible range of object shapes and sizes will be needed to achieve the desired versatility. Technologies enabling the exertion of local pulling contact forces (e.g., suction) can be extraordinarily useful toward this end by handling objects that do not have features smaller than the grasper, a challenge for traditional grippers. However, simple operation and cost effectiveness are also highly desirable. To achieve these goals, we have developed a self-sealing suction technology for grasping. A small valve inside each suction cup nominally seals the suction port to maintain a vacuum within the system. Through the reaction forces of object contact, a lever action passively lifts the valve to engage suction on the object. Any cups not contacting the object remain sealed. In this way, a system with a large number of cups may effectively operate using any subset of its cups, even just one, to grasp an object. All cups may be connected to a central vacuum source without the need for local sensors or powered actuators, forming a simple, compact, cost-effective system. This thesis begins with the detailed design and analysis of the self-sealing suction technology. An extensive evaluation of the technology's robustness and performance demonstrates its features and limits, including self-seal quality and leakage, object seal and reseal, cycle performance, and normal and shear force-displacement behavior, among other characterizations. It then describes the development of several devices utilizing the technology. The potential impact of the technology is highlighted through applications of human-controlled, robotic, and aerial grasping and perching. Finally, mathematical tools are developed to analyze potential grasps formed using the technology. (An illustrative holding-force check in the spirit of this grasp analysis appears after this results list.)
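
The first abstract above describes interpolating the BioTac SP's 24 tactile readings into a dense "tactile image" and computing flow fields and spatio-temporal gradients on it. The Python sketch below is a minimal illustration of that idea, not the thesis's implementation: the electrode layout, grid resolution, normal-flow formulation, and slip threshold are all assumptions made for the example.

```python
# Minimal sketch, assuming a hypothetical 2D electrode layout for the BioTac SP:
# interpolate 24 raw readings onto a grid, estimate a normal-flow field from two
# consecutive frames, and use its mean magnitude as a coarse slip cue.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
electrode_xy = rng.uniform(-1.0, 1.0, size=(24, 2))   # placeholder electrode positions
grid_x, grid_y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))

def tactile_image(readings):
    """Interpolate 24 electrode values onto a dense 2D 'tactile image'."""
    img = griddata(electrode_xy, readings, (grid_x, grid_y), method='cubic')
    return np.nan_to_num(img)  # points outside the convex hull become 0

def normal_flow(prev_img, next_img, dt=1.0, eps=1e-6):
    """Flow component along the local intensity gradient (normal flow)."""
    gy, gx = np.gradient(prev_img)        # spatial gradients
    gt = (next_img - prev_img) / dt       # temporal gradient
    mag2 = gx**2 + gy**2 + eps
    u = -gt * gx / mag2
    v = -gt * gy / mag2
    return u, v

def slip_detected(u, v, threshold=0.05):
    """Coarse slip cue: mean tactile-flow magnitude over the patch (threshold is arbitrary)."""
    return float(np.mean(np.hypot(u, v))) > threshold

# Usage with two synthetic frames standing in for consecutive sensor readings.
frame_a = tactile_image(rng.normal(size=24))
frame_b = tactile_image(rng.normal(size=24))
u, v = normal_flow(frame_a, frame_b)
print("slip?", slip_detected(u, v))
```

Only the flow component along the local gradient is recoverable from two frames without extra regularization, which is why this sketch stops at normal flow rather than a full optical-flow solve.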
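The second abstract describes coupling a SysML model, the CoppeliaSim simulator, and the Gurobi Optimizer for design space exploration of a pick-and-place system. The sketch below only illustrates the general explore-and-select loop under stated assumptions: `simulate_pick_and_place` is a hypothetical stand-in for a simulator run, and the brute-force selection stands in for the optimization step, so neither tool's real API appears here.

```python
# Minimal sketch of a design-space-exploration loop: sweep candidate designs,
# score each with a (placeholder) simulator, and keep the fastest design that
# satisfies the requirements. All parameters and thresholds are illustrative.
from dataclasses import dataclass
from itertools import product

@dataclass
class Design:
    grip_force_n: float        # commanded grip force [N]
    approach_speed_mps: float  # end-effector approach speed [m/s]

def simulate_pick_and_place(design: Design) -> tuple[bool, float]:
    """Placeholder for a simulator call: returns (grasp_success, cycle_time_s)."""
    success = design.grip_force_n >= 5.0 and design.approach_speed_mps <= 0.5
    cycle_time = 2.0 / max(design.approach_speed_mps, 1e-3)
    return success, cycle_time

def explore(forces, speeds, max_cycle_time_s=20.0):
    """Return the fastest requirement-satisfying design, or None."""
    best = None
    for f, s in product(forces, speeds):
        candidate = Design(f, s)
        ok, t = simulate_pick_and_place(candidate)
        if ok and t <= max_cycle_time_s and (best is None or t < best[1]):
            best = (candidate, t)
    return best

print(explore(forces=[2.0, 5.0, 8.0], speeds=[0.1, 0.3, 0.5]))
```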
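The third abstract mentions mathematical tools for analyzing grasps formed by the self-sealing suction cups. The sketch below is a back-of-the-envelope holding-force check under idealized assumptions (fully sealed cups sharing one vacuum level, Coulomb friction for shear loads); the cup size, vacuum level, friction coefficient, and object mass are illustrative values, not figures from the thesis.

```python
# Minimal sketch: can the currently engaged (sealed) suction cups hold an object?
# Normal capacity per cup is vacuum pressure times cup area; shear capacity is
# taken as a simple Coulomb-friction limit on that normal force.
import math

def cup_area_m2(diameter_m: float) -> float:
    return math.pi * (diameter_m / 2.0) ** 2

def holding_capacity(n_engaged_cups: int, vacuum_pa: float,
                     cup_diameter_m: float, friction_coeff: float) -> tuple[float, float]:
    """Return (normal_capacity_N, shear_capacity_N) for the engaged cups."""
    normal = n_engaged_cups * vacuum_pa * cup_area_m2(cup_diameter_m)
    shear = friction_coeff * normal
    return normal, shear

def can_hold(object_mass_kg: float, **kwargs) -> bool:
    """Conservative check: capacity must cover the full weight in pure pull and pure shear."""
    weight = 9.81 * object_mass_kg
    normal, shear = holding_capacity(**kwargs)
    return normal >= weight and shear >= weight

# e.g. two sealed 20 mm cups at 70 kPa vacuum, mu = 0.5, holding a 1.5 kg object
print(can_hold(1.5, n_engaged_cups=2, vacuum_pa=70e3,
               cup_diameter_m=0.02, friction_coeff=0.5))
```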