Mechanical Engineering
Permanent URI for this community: http://hdl.handle.net/1903/2263
Search Results
8 results
Item: Modeling and Experimental Techniques to Demonstrate Nanomanipulation With Optical Tweezers (2011)
Balijepalli, Arvind K.; Gupta, Satyandra K.; LeBrun, Thomas W.; Mechanical Engineering

The development of truly three-dimensional nanodevices is currently impeded by the absence of effective prototyping tools at the nanoscale. Optical trapping is well established for flexible three-dimensional manipulation of components at the microscale, but it has not yet been shown to confine nanoparticles long enough to be useful in nanoassembly applications. As part of this work, we therefore demonstrate new techniques that successfully extend optical trapping to nanoscale manipulation. Extending optical trapping to the nanoscale requires overcoming certain challenges. For the same incident beam power, the optical trapping forces acting on a nanoparticle are very weak compared with those acting on microscale particles. Consequently, Brownian motion often drives the nanoparticle out of the trap in a very short time. We improve the performance of optical traps at the nanoscale by using closed-loop control. Furthermore, we show through laboratory experiments that the controller localizes nanoparticles in the trap long enough to be useful in nanoassembly applications, under conditions in which a static trap operating at the same power cannot confine a particle of the same size. Before controlled optical trapping can be demonstrated in the laboratory, key tools must first be developed. We implement Langevin dynamics simulations to model the interaction of nanoparticles with an optical trap. Physically accurate simulations provide a robust platform for testing new methods to characterize and improve the performance of optical tweezers at the nanoscale, but they depend on accurate trapping force models. We have therefore also developed two new laboratory-based force measurement techniques that overcome the drawbacks of conventional force measurements, which do not accurately account for the weak interaction of nanoparticles with an optical trap. Finally, we use numerical simulations to develop new control algorithms that demonstrate significantly enhanced trapping of nanoparticles, and we implement these techniques in the laboratory. The algorithms and characterization tools developed in this work will allow the development of optical trapping instruments that can confine nanoparticles for longer periods than is currently possible at a given beam power. The low average power achieved by the controller makes this technique especially suitable for manipulating biological specimens, and it is also generally beneficial to nanoscale prototyping applications. The capabilities developed in this work, and the technology that results from them, may enable the prototyping of three-dimensional nanodevices that are critically needed in many applications.
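The abstract does not spell out the simulation details, but the kind of Langevin dynamics model it refers to can be sketched in a few lines: an overdamped bead in a harmonic trap with Stokes drag, integrated with the Euler-Maruyama method. All parameter values below (bead radius, trap stiffness, time step) are illustrative assumptions, not numbers from this work.

```python
import numpy as np

# All values are illustrative, not parameters from the dissertation.
kB_T   = 4.11e-21                    # thermal energy at ~298 K, J
radius = 50e-9                       # bead radius, m
eta    = 8.9e-4                      # viscosity of water, Pa*s
gamma  = 6 * np.pi * eta * radius    # Stokes drag coefficient, kg/s
kappa  = 1e-7                        # trap stiffness, N/m (weak at the nanoscale)
dt     = 1e-6                        # integration time step, s
n      = 200_000

rng = np.random.default_rng(0)
x = np.zeros(n)                      # bead position along one axis, m
for i in range(1, n):
    # Overdamped Langevin step: harmonic trap force plus thermal noise
    drift = -(kappa / gamma) * x[i - 1] * dt
    kick  = np.sqrt(2 * kB_T / gamma * dt) * rng.standard_normal()
    x[i] = x[i - 1] + drift + kick

print(f"rms displacement: {x.std():.3e} m")
# Equipartition sanity check: kappa * <x^2> should approach kB*T
print(f"kappa*var(x) / kB_T = {kappa * x.var() / kB_T:.2f}")
```

The closing equipartition check is a standard sanity test for such simulations; a physically accurate model of the kind described above would replace the simple harmonic force with a measured or computed trapping force profile.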
Item: Real-Time Path Planning for Automating Optical Tweezers based Particle Transport Operations (2009)
Banerjee, Ashis Gopal; Gupta, Satyandra K.; Mechanical Engineering

Optical tweezers (OT) have been developed to successfully trap, orient, and transport micro- and nanoscale components of many different sizes and shapes in a fluid medium. They can be viewed as robots made out of light. Components can be released from optical traps simply by switching off the laser beams, and by using time sharing or holograms, multiple optical traps can perform several operations in parallel. These characteristics make optical tweezers a very promising technology for creating directed micro- and nanoscale assemblies. In the infrared regime, they are useful in a large number of biological applications as well. This dissertation explores the problem of real-time path planning for autonomous OT-based transport operations. Such operations pose interesting challenges because the environment is uncertain and dynamic, owing to the random Brownian motion of the particles and to noise in the imaging-based measurements. Silica microspheres with diameters between 1 and 20 µm are selected as model components. Offline simulations are performed to gather trapping probability data, which serves as a measure of trap strength and reliability as a function of the particle's position relative to the trap focus and of the trap velocity. Simplified models are generated using Gaussian radial basis functions to represent the data in a compact form. These metamodels can be queried at run time to obtain estimated probability values accurately and efficiently. The trapping probability models are then used in a stochastic dynamic programming framework to compute optimal trap locations and velocities that minimize the total expected transport time, incorporating collision avoidance and recovery steps. A discrete version of an approximate partially observable Markov decision process algorithm, called QMDP_NLTDV, is developed. Real-time performance is ensured by pruning the search space, and convergence is enhanced by introducing a non-linear value function. The algorithm is validated both in a simulator and on a physical holographic tweezers setup. Successful runs show that the automated planner is flexible, works well in reasonably crowded scenes, and is capable of transporting a specific particle to a given goal location while avoiding collisions, either by circumventing or by trapping other freely diffusing particles. This technique for transporting individual particles is then used within a decoupled, prioritized approach to move multiple particles simultaneously. An iterative version of a bipartite graph matching algorithm assigns goal locations to target objects optimally. As in the single-particle case, simulations and some physical experiments are performed to validate the multi-particle planning approach.
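The metamodeling step lends itself to a short sketch. The probabilities below are invented stand-ins for the offline simulation output, and the kernel centers and width are arbitrary tuning choices, but the structure follows the approach described: fit Gaussian radial basis function weights offline, then query the compact model cheaply at run time.

```python
import numpy as np

# Stand-in for offline simulation output: trapping probability sampled on a
# grid of (radial offset from trap focus, trap velocity). Values invented.
offsets    = np.linspace(0.0, 2.0, 9)          # um
velocities = np.linspace(0.0, 10.0, 9)         # um/s
X = np.array([(r, v) for r in offsets for v in velocities])
p = np.exp(-0.8 * X[:, 0]**2 - 0.02 * X[:, 1]**2)   # fake probabilities

# Gaussian RBF metamodel: p(x) ~ sum_j w_j exp(-||x - c_j||^2 / s^2)
centers = X[::4]          # every 4th sample as a kernel center (a choice)
s = 3.0                   # shared kernel width (a tuning choice)

def phi(pts, ctrs):
    d2 = ((pts[:, None, :] - ctrs[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / s**2)

w, *_ = np.linalg.lstsq(phi(X, centers), p, rcond=None)

def trap_probability(offset_um, vel_um_s):
    """Run-time query: cheap compared with re-running the simulations."""
    q = np.array([[offset_um, vel_um_s]])
    return float(phi(q, centers) @ w)

print(f"{trap_probability(0.5, 2.0):.3f}")
```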
Item: Towards A Formal And Scalable Approach For Quantifying Software Reliability At Early Development Stages (2009)
Kong, Wende; Smidts, Carol; Reliability Engineering

Problems that originate in early development stages can have a lasting influence on the reliability, safety, and cost of a software system. The requirements document, which is usually available at the requirements analysis stage, must be correct, unambiguous, and complete if the rest of the development effort is to succeed. The ability to identify faults in requirements and to predict the reliability of a software system early in its development can help organizations make informed decisions about corrective actions and improve the system's quality in a cost-effective manner.

A review of the literature reveals that existing approaches are unsuited to providing trustworthy reliability predictions, either because they ignore the requirements document or because they detect requirements faults in an informal and fairly sketchy way. This study explores the use of a preselected software reliability measurement for early software fault detection and reliability prediction. This measurement, originally a black-box testing technique, was broadly recognized for its ability to detect incomplete and ambiguous requirements, although no information was found in the literature about how to take advantage of this power. This study mathematically formalized the measurement to enhance its rigor, repeatability, and scalability, and further extended it into an effective requirements fault detection technique. An automation-oriented algorithm was developed for quantifying the impact of the detected requirements faults on software reliability. The feasibility and scalability of the proposed approach were examined using two real applications. The results clearly confirmed its feasibility and usefulness, particularly when no failure data are available and other methods are not applicable. Scalability barriers were also identified in the approach, and an empirical study was conducted to gain insight into the nature of these technical barriers. To overcome them, a set of rules was proposed based on the observed patterns. Finally, a preliminary controlled experiment was conducted to evaluate the usability of the proposed rules. This study will enable software project stakeholders to effectively detect requirements faults and assess the quality of requirements early in development, ultimately leading to improved software reliability if the identified faults are removed in time. Software project practitioners, regulators, and policy makers involved in the certification of software systems can benefit most from the techniques proposed in this study.

Item: Virtual Reality Modeling of a Car Suspension with Active Control Capability (2009)
Smoker, Jason James; Baz, Amr M.; Mechanical Engineering

This thesis presents the evolution of a full-car model into a virtual reality environment to visually demonstrate the dynamics of a car responding to various inputs under both passive and active control. The model is a seven-degree-of-freedom system that can be configured to be excited by either a bump or a harmonic input. Active controls available to the system include the well-known Linear Quadratic Regulator (LQR) as well as a new Nonlinear Energy Absorber (NEA), which utilizes both nonlinear springs and a nonlinear damper. The mathematics of the plant, the kinematics of the system, and the visual specifications of the scene are integrated into a three-dimensional environment in which the user can be immersed and witness in real time the response of a specific configuration. This project was developed with the mindset that dynamic models of systems can be better understood through visual realization and interaction.
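For a sense of what the LQR option involves, the sketch below synthesizes such a controller for a two-degree-of-freedom quarter-car model rather than the thesis's seven-degree-of-freedom full car; all masses, stiffnesses, and weighting matrices are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Quarter-car stand-in for the thesis's 7-DOF full-car model; all values
# are illustrative.
ms, mu = 250.0, 40.0                 # sprung / unsprung mass, kg
ks, kt = 16_000.0, 190_000.0         # suspension / tire stiffness, N/m
bs = 1_000.0                         # suspension damping, N*s/m

# State: [suspension deflection, body velocity, tire deflection, wheel velocity]
A = np.array([
    [0.0,     1.0,    0.0,     -1.0],
    [-ks/ms, -bs/ms,  0.0,      bs/ms],
    [0.0,     0.0,    0.0,      1.0],
    [ks/mu,   bs/mu, -kt/mu,   -bs/mu],
])
B = np.array([[0.0], [1.0/ms], [0.0], [-1.0/mu]])   # active actuator force

Q = np.diag([1e4, 1e2, 1e4, 1.0])    # penalize deflections and body motion
R = np.array([[1e-4]])               # actuator effort weight

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)      # optimal state feedback: u = -K @ x
print("LQR gain K:", K.round(1))
```

The resulting gain gives the active suspension force u = -Kx; the full-car case follows the same recipe with larger state and input matrices.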
Item: Advanced Honeypot Architecture for Network Threats Quantification (2009)
Berthier, Robin G.; Cukier, Michel; Reliability Engineering

Today's world relies increasingly on computer networks, and the growing use of network resources is accompanied by a rising volume of security problems. New threats and vulnerabilities are discovered every day and affect users and companies at critical levels, from privacy issues to financial losses. Monitoring network activity is a mandatory step for researchers and security analysts seeking to understand these threats and build better protections. Honeypots were introduced to monitor unused IP spaces and learn about attackers. Their advantage over other monitoring solutions is that they collect only suspicious activity. However, current honeypots are expensive to deploy and complex to administer, especially in the context of large organization networks. This study addresses the challenge of improving the scalability and flexibility of honeypots by introducing a novel hybrid honeypot architecture. The architecture is based on a Decision Engine and a Redirection Engine that automatically filter attacks, save resources by reducing the size of the attack data collection, and allow researchers to specify the types of attacks they want to collect. For better integration into the organization's network, this architecture was combined with network flows collected at the border of the production network. By offering an exhaustive view of all communications between internal and external hosts, network flows can 1) assist the configuration of honeypots, and 2) extend the scope of honeypot data analysis by providing a comprehensive profile of network activity with which to track attackers in the organization's network. These capabilities were made possible through the development of a passive scanner and a server discovery algorithm working on top of network flows. The algorithm and the hybrid honeypot architecture were deployed and evaluated at the University of Maryland, on a network of 40,000 computers. This study marks a major step toward turning honeypots into a powerful security solution. Its contributions will enable security analysts and network operators to make a precise assessment of the malicious activity targeting their networks.
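The server discovery algorithm itself is not reproduced in the abstract, but the idea of passive discovery from flow records can be illustrated with a toy heuristic: an internal address and port contacted by many distinct external hosts is probably a server. The flow tuples and the internal-address check below are simplified assumptions.

```python
from collections import defaultdict

# Toy flow records: (src_ip, src_port, dst_ip, dst_port). Real network flows
# carry far more (timestamps, byte counts, TCP flags); fields invented here.
flows = [
    ("203.0.113.5",  51514, "10.1.2.3", 443),
    ("198.51.100.7", 40222, "10.1.2.3", 443),
    ("203.0.113.9",  33001, "10.1.2.3", 443),
    ("203.0.113.5",  51515, "10.1.9.8", 22),
]

def discover_servers(flows, min_peers=2):
    """Flag internal (ip, port) pairs contacted by many distinct external
    hosts. A crude stand-in for the dissertation's passive server-discovery
    algorithm, which also weighs flow direction, flags, and timing."""
    peers = defaultdict(set)
    for src_ip, _sport, dst_ip, dst_port in flows:
        if dst_ip.startswith("10."):         # toy internal-address test
            peers[(dst_ip, dst_port)].add(src_ip)
    return {svc: len(p) for svc, p in peers.items() if len(p) >= min_peers}

print(discover_servers(flows))               # {('10.1.2.3', 443): 3}
```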
Item: INTEGRATING SOFTWARE BEHAVIOR INTO DYNAMIC PROBABILISTIC RISK ASSESSMENT (2005-12-21)
Zhu, Dongfeng; Smidts, Carol; Mosleh, Ali; Reliability Engineering

Software plays an increasingly important role in modern safety-critical systems. Although research has been done to integrate software into the classical Probabilistic Risk Assessment (PRA) framework, current PRA practice overwhelmingly neglects the contribution of software to system risk. The objective of this research is to develop a methodology for integrating software contributions into the Dynamic Probabilistic Risk Assessment (DPRA) environment. DPRA is considered the next generation of PRA techniques: a set of methods in which simulation models representing the behavior of a system's elements are exercised in order to identify the system's risks and vulnerabilities. DPRA allows consideration of dynamic interactions among system elements and physical variables. The fact remains, however, that modeling software for use in the DPRA framework is quite complex, and very little has been done to address the question directly and comprehensively.

This dissertation describes a framework and a set of techniques that extend the DPRA approach to allow consideration of software contributions to system risk. The framework includes a software representation, an approach for incorporating that representation into the DPRA environment SimPRA, and an experimental demonstration of the methodology. The dissertation also proposes a framework for simulating multi-level objects in the simulation-based DPRA environment, a new methodology that addresses the state explosion problem. The results indicate that DPRA simulation performance improves under the new approach. The entire methodology is implemented in the SimPRA software, and an easy-to-use tool was developed to help the analyst build the software model. This study is the first systematic effort to integrate software risk contributions into the dynamic PRA environment.
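To make the simulation-based flavor of DPRA concrete, here is a toy Monte Carlo sketch in which a physical variable evolves over simulated time and the software controller appears as a branch point in each history. The scenario, the physics, and all probabilities are invented for illustration; this is not SimPRA's model.

```python
import random

def run_history(rng, p_sw_fail=1e-2, p_valve_fail=5e-3, steps=200):
    """One simulated history: a tank level rises under random disturbances,
    and each high-level demand requires the software controller to command
    a relief valve. Scenario and probabilities invented for illustration."""
    level = 0.0
    for _ in range(steps):
        level += rng.uniform(0.0, 0.3)       # stochastic disturbance
        if level > 5.0:                      # demand on the control system
            # Branch point: software and hardware may each fail on demand
            if rng.random() < p_sw_fail or rng.random() < p_valve_fail:
                return True                  # failed relief -> top event
            level = 0.0                      # successful relief
    return False

rng = random.Random(1)
n = 20_000
hits = sum(run_history(rng) for _ in range(n))
print(f"estimated top-event probability: {hits / n:.2e}")
```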
Item: DOMAIN SPECIFIC TEST CASE GENERATION USING HIGHER ORDERED TYPED LANGUAGES FOR SPECIFICATION (2005-04-14)
Sinha, Avik; Smidts, Carol S.; Reliability Engineering

Model-based testing is an approach for automatically generating test cases from a representative model of the system. Recent studies show that model-based testing has many possible advantages over manual test generation techniques, including gains in effectiveness, efficiency, and reuse. The effectiveness of a model-based testing process, that is, its ability to uncover faults in a system, is determined by the correctness of the model and by the number of requirements it represents. In practice, test models are usually created from the requirements or design specifications of the software, so these techniques rely heavily on such specifications for the completeness of the test models. This can leave critical domain-specific requirements untested, because the user who helps define the requirements may fail to consider them: some may appear too trivial to state explicitly in the requirements document, and others may simply be forgotten. Even when the requirements document does include domain-specific requirements, testers may not recognize their criticality or may find them too complex to model. In all such cases, testing is incomplete and ineffective. This dissertation describes a new model-based testing technique developed to remedy such situations. The technique models the system under test using a strongly typed domain-specific language (DSL). Information about the domain-specific requirements of an application is captured automatically by exploiting properties of the DSL and is subsequently introduced into the test model. The technique is applied to generate test cases for applications that interface with relational databases, with HaskellDB as the example DSL. Test suites generated using the new technique are enriched with test cases that address implicit domain-specific requirements and are therefore more effective at finding faults. The dissertation presents details of the technique and describes an experiment and a case study exploring its effectiveness, efficiency, usability, and industrial applicability.

Item: FASTER DISPLAY OF MECHANICAL ASSEMBLIES BY DETERMINATION OF PART VISIBILITY (2004-05-04)
Ou, Jeremy; Magrab, Edward B.; Mechanical Engineering

We present algorithms that greatly decrease the time required to display large 3-D mechanical assemblies by removing all interior parts that cannot be seen from any viewing angle. The algorithms are based on the minimum axis-aligned bounding box of each part, which avoids the complicated computations often needed to determine the interactions of the parts' geometry. The major contribution of this work is the use of exterior traces of cross sections of the bounding boxes to determine the parts' visibility. Processing time is shown to increase almost linearly with the number of parts in an assembly. A test on an assembly of 490 parts shows that the algorithms cut the display time in half while misclassifying only two parts as invisible that should have been identified as visible.
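The exterior-trace computation is too involved for a short example, but the overall culling structure can be illustrated: slice the parts' bounding boxes into cross sections and discard a part only if it is enclosed at every sampled section. The sketch below simplifies the enclosure test to containment within a single neighboring box, so it catches fewer hidden parts than the thesis's union-based trace; the assembly and all dimensions are invented.

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Minimum axis-aligned bounding box of a part (corners lo and hi)."""
    name: str
    lo: tuple   # (xmin, ymin, zmin)
    hi: tuple   # (xmax, ymax, zmax)

def rect_at(box, z):
    """2-D cross-section rectangle of the box at height z, or None."""
    if not (box.lo[2] <= z <= box.hi[2]):
        return None
    return (box.lo[0], box.lo[1], box.hi[0], box.hi[1])

def contains(outer, inner):
    return (outer[0] <= inner[0] and outer[1] <= inner[1] and
            outer[2] >= inner[2] and outer[3] >= inner[3])

def cull_hidden(boxes, n_sections=16):
    """Flag a part as interior (never visible) if, at every sampled height,
    its section is enclosed by some other part's section. Simplified: the
    thesis traces the exterior of the union of sections, which also catches
    parts enclosed jointly by several neighbors."""
    hidden = []
    for b in boxes:
        zs = [b.lo[2] + (b.hi[2] - b.lo[2]) * i / (n_sections - 1)
              for i in range(n_sections)]
        if all(any(o is not b and (r := rect_at(o, z)) is not None
                   and contains(r, rect_at(b, z))
                   for o in boxes)
               for z in zs):
            hidden.append(b.name)
    return hidden

# Invented toy assembly: a gear fully inside its housing, a handle sticking out.
housing = AABB("housing", (0, 0, 0), (10, 10, 10))
gear    = AABB("gear",    (2, 2, 2), (8, 8, 8))
handle  = AABB("handle",  (9, 4, 4), (14, 6, 6))
print(cull_hidden([housing, gear, handle]))   # -> ['gear']
```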