A. James Clark School of Engineering
Permanent URI for this community: http://hdl.handle.net/1903/1654
The collections in this community comprise faculty research works, as well as graduate theses and dissertations.
Search Results
19 results
Item
Advanced methodologies for reliability-based design optimization and structural health prognostics (2010)
Wang, Pingfeng; Youn, Byeng Dong; Mechanical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

Failures of engineered systems can lead to significant economic and societal losses. To minimize these losses, reliability must be ensured throughout the system's lifecycle in the presence of manufacturing variability and uncertain operational conditions. Many reliability-based design optimization (RBDO) techniques have been developed to ensure high reliability of engineered system designs under manufacturing variability. Schedule-based maintenance, although expensive, has been a popular method for maintaining highly reliable engineered systems under uncertain operational conditions. However, there is so far no cost-effective and systematic approach that ensures high reliability of engineered systems throughout their lifecycles while accounting for both manufacturing variability and uncertain operational conditions. Inspired by the intrinsic ability of systems in ecology, economics, and other fields to proactively adjust their functioning to avoid potential failures, this dissertation attempts to adaptively manage engineered system reliability over the lifecycle by advancing two essential and closely related research areas: system RBDO and prognostics and health management (PHM). System RBDO ensures high reliability of an engineered system in the early design stage, whereas capitalizing on PHM technology enables the system to proactively avoid failures in its operation stage. Extensive literature reviews in these areas have identified four key research issues: (1) how system failure modes and their interactions can be analyzed in a statistical sense; (2) how limited data for input manufacturing variability can be used for RBDO; (3) how sensor networks can be designed to effectively monitor system health degradation under highly uncertain operational conditions; and (4) how the remaining useful lives of systems can be predicted accurately and in a timely manner under highly uncertain operational conditions. To address these key research issues, the dissertation lays out four research thrusts in the following chapters: Chapter 3 - Complementary Intersection Method for System Reliability Analysis, Chapter 4 - Bayesian Approach to RBDO, Chapter 5 - Sensing Function Design for Structural Health Prognostics, and Chapter 6 - A Generic Framework for Structural Health Prognostics. Multiple engineering case studies are presented to demonstrate the feasibility and effectiveness of the proposed RBDO and PHM techniques for ensuring and improving the reliability of engineered systems within their lifecycles.

Item
SPILL AND BURNING BEHAVIOR OF FLAMMABLE LIQUIDS (2010)
Benfer, Matthew; Quintiere, James G; Fire Protection Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

Unconfined liquid spill depths were measured for two liquid fuels and three non-flammable liquids atop a smooth concrete pad. Unconfined liquid spill thicknesses were found to be less than 0.1 cm for all of the fuels and fuel-like liquids. Spill fires were conducted with volumes ranging from 0.2 ml to 450 ml for gasoline and denatured alcohol. Average burning rates for both unconfined liquid fuel spill fires increased linearly with increasing volume spilled. A liquid spill thickness model was developed and compared to experimental data. Comparisons showed good predictions for half of the liquids used. In addition, a liquid spill fire burning rate model was developed and checked against experimental data. This model provided good qualitative results; however, further development is still needed.
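To make the scale of such thin spills concrete, the following minimal Python sketch (not the thesis model; the function name and default value are illustrative) converts a spilled volume into an approximate coverage area, assuming a uniform depth equal to the 0.1 cm upper bound reported above.

```python
# Minimal sketch (not the thesis model): estimate unconfined spill coverage
# area from spilled volume, assuming a thin uniform spill of fixed depth.

def spill_area_cm2(volume_ml: float, depth_cm: float = 0.1) -> float:
    """Area covered by a spill of `volume_ml` at a uniform depth `depth_cm`.

    1 ml = 1 cm^3, so area = volume / depth. The 0.1 cm default reflects the
    upper-bound thickness reported in the abstract; real spill depths vary
    with the liquid and the surface.
    """
    if depth_cm <= 0:
        raise ValueError("depth must be positive")
    return volume_ml / depth_cm

# Example: a 450 ml spill at 0.1 cm depth covers roughly 4500 cm^2 (0.45 m^2).
print(spill_area_cm2(450.0))
```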
Item
A DATA-INFORMED MODEL OF PERFORMANCE SHAPING FACTORS FOR USE IN HUMAN RELIABILITY ANALYSIS (2009)
Groth, Katrina M.; Mosleh, Ali; Mechanical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

Many Human Reliability Analysis (HRA) models use Performance Shaping Factors (PSFs) to incorporate human elements into system safety analysis and to calculate the Human Error Probability (HEP). Current HRA methods rely on different sets of PSFs that range from a few to over 50 PSFs, with varying degrees of interdependency among the PSFs. This interdependency is observed in almost every set of PSFs, yet few HRA methods offer a way to account for dependency among PSFs. The methods that do address interdependencies generally do so by varying different multipliers in linear or log-linear formulas. These relationships could be more accurately represented in a causal model of PSF interdependencies. This dissertation introduces a methodology to produce a Bayesian Belief Network (BBN) of interactions among PSFs. The dissertation also presents a set of fundamental guidelines for the creation of a PSF set, a hierarchy of PSFs developed specifically for causal modeling, and a set of models developed using currently available data. The models, methodology, and PSF set were developed using nuclear power plant data available from two sources: information collected by the University of Maryland for the Information-Decision-Action model [1] and data from the Human Events Repository and Analysis (HERA) database [2], currently under development by the United States Nuclear Regulatory Commission. Creation of the methodology, the PSF hierarchy, and the models was an iterative process that incorporated information from available data, current HRA methods, and expert workshops. The fundamental guidelines are the result of insights gathered while developing the methodology; these guidelines were applied to the final PSF hierarchy. The PSF hierarchy reduces overlap among the PSFs so that patterns of dependency observed in the data can be attributed to PSF interdependencies instead of overlapping definitions. It includes multiple levels of generic PSFs that can be expanded or collapsed for different applications. The model development methodology employs correlation and factor analysis to systematically collapse the PSF hierarchy and form the model structure. Factor analysis is also used to identify Error Contexts (ECs) – specific PSF combinations that together produce an increased probability of human error (versus the net effect of the PSFs acting alone). Three models were created to demonstrate how the methodology can be used to provide different types of data-informed insights. By employing Bayes' Theorem, the resulting model can be used to replace the linear calculations for HEPs used in Probabilistic Risk Assessment. When additional data become available, the methodology can be used to produce updated causal models to further refine HEP values.
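As a toy illustration of why a causal, BBN-style treatment of PSF interactions can differ from multiplier-based formulas, the sketch below compares the two on a hypothetical two-PSF example; all probabilities and multipliers are invented for illustration and are not values from the dissertation or its data sources.

```python
# Toy illustration (not the dissertation's model): a two-PSF conditional
# probability table versus a multiplier-style HEP calculation, showing why
# PSF interactions ("error contexts") can matter.

NOMINAL_HEP = 1e-3  # hypothetical nominal human error probability

# Multiplier approach: each degraded PSF scales the nominal HEP independently.
MULTIPLIERS = {"high_stress": 5.0, "poor_training": 10.0}

def hep_multiplier(psf_states: dict) -> float:
    hep = NOMINAL_HEP
    for psf, degraded in psf_states.items():
        if degraded:
            hep *= MULTIPLIERS[psf]
    return min(hep, 1.0)

# Causal (BBN-style) approach: the joint effect comes from a conditional
# probability table, so interacting PSFs need not combine multiplicatively.
CPT = {  # P(error | high_stress, poor_training), hypothetical values
    (False, False): 1e-3,
    (True, False): 4e-3,
    (False, True): 8e-3,
    (True, True): 1e-1,   # error context: joint effect exceeds the product of the two
}

def hep_cpt(high_stress: bool, poor_training: bool) -> float:
    return CPT[(high_stress, poor_training)]

print(hep_multiplier({"high_stress": True, "poor_training": True}))  # 0.05
print(hep_cpt(True, True))                                           # 0.1
```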
Item
BUTANOL PRODUCTION FROM GLYCEROL BY Clostridium pasteurianum IN DEFINED CULTURE MEDIA - A PHENOTYPIC APPROACH (2009)
Ramos Sanchez, David Leonardo; Wang, Nam S; Chemical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

Fluctuations in oil prices have stimulated the production of renewable biofuels, in particular bioethanol and biodiesel. The production of biodiesel has expanded almost sixfold in recent years. Crude glycerol accounts for roughly ten wt% of the biodiesel process output. Once a valuable product, glycerol is now considered a waste and surplus material. Its current low price makes it an attractive substrate for a fermentation process. Molecular genetics has unveiled new insights into solvent production in Clostridia. It has been recognized that endospore development and solvent formation share a regulatory mechanism. Solvent production, particularly the butanol fermentation of glycerol by Clostridium pasteurianum, was studied. The butanol fermentation was investigated by taking advantage of the characteristics of the sporulation phenotype, and a relation between spore formation and butanol production was found in C. pasteurianum by applying molecular genetics concepts.

Item
Particle Filtering for Stochastic Control and Global Optimization (2009)
Zhou, Enlu; Marcus, Steven I.; Electrical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

This thesis explores new algorithms and results in stochastic control and global optimization through the use of particle filtering. Stochastic control and global optimization are two areas that have many applications but are often difficult to solve. In stochastic control, an important class of problems, namely partially observable Markov decision processes (POMDPs), provides an ideal paradigm for modeling discrete-time sequential decision making under uncertainty and partial observation. However, POMDPs usually do not admit analytical solutions and are, most of the time, computationally very expensive to solve. While many efficient numerical algorithms have been developed for finite-state POMDPs, only a few have been proposed for continuous-state POMDPs, and even sparser are relevant analytical results regarding convergence and error bounds. From the modeling viewpoint, many application problems are modeled more naturally by continuous-state POMDPs than by finite-state POMDPs. Therefore, one part of the thesis is devoted to developing a new efficient algorithm for continuous-state POMDPs and studying the performance of the algorithm both analytically and numerically. Based on the idea of density projection with particle filtering, the proposed algorithm reduces the infinite-dimensional problem to a finite, low-dimensional one, and also has the flexibility and scalability to improve its approximation when given more computational power. Error bounds are proved for the algorithm, and numerical experiments are carried out on an inventory control problem. In global optimization, many problems are very difficult to solve due to the presence of multiple local optima or badly scaled objective functions. Many approximate solution methods have been developed and studied. Among them, a recent class of simulation-based methods shares the common characteristic of repeatedly drawing candidate solutions from an intermediate probability distribution and then updating the distribution using these candidate solutions, until the probability distribution becomes concentrated on the optimal solution. The efficiency and accuracy of these algorithms depend very much on the choice of the intermediate probability distributions and the updating schemes. Using a novel interpretation of particle filtering, these algorithms are unified under one framework, and hence many new insights are revealed. By better understanding these existing algorithms, the framework also holds promise for developing new, improved algorithms. Some directions for new, improved algorithms are proposed, and numerical experiments are carried out on a few benchmark problems.
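The "draw candidates, then update the sampling distribution" loop described above can be illustrated with a minimal cross-entropy-style sketch on a one-dimensional toy problem; this is a generic member of the class of simulation-based methods being unified, not the particle-filtering framework itself, and the objective function and parameters are hypothetical.

```python
# Minimal sketch of the simulation-based optimization loop described above
# (cross-entropy-style, with a Gaussian sampling distribution). Generic
# illustration only, not the algorithm developed in the thesis.
import numpy as np

def objective(x: np.ndarray) -> np.ndarray:
    # Hypothetical multimodal test function (to be maximized).
    return np.exp(-(x - 2.0) ** 2) + 0.8 * np.exp(-(x + 2.0) ** 2)

rng = np.random.default_rng(0)
mu, sigma = 0.0, 5.0          # intermediate distribution: N(mu, sigma^2)
for _ in range(50):
    candidates = rng.normal(mu, sigma, size=200)   # draw candidate solutions
    scores = objective(candidates)
    elites = candidates[np.argsort(scores)[-20:]]  # keep the best 10%
    mu, sigma = elites.mean(), elites.std() + 1e-6 # update the distribution
print(round(mu, 3))  # the distribution concentrates near the better optimum (x ~ 2)
```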
Item
Simulation and Optimization of Production Control for Lean Manufacturing Transition (2008-08-06)
Gahagan, Sean Michael; Herrmann, Jeffrey W; Mechanical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

Lean manufacturing is an operations management philosophy that advocates eliminating waste, including work-in-process (WIP) inventory. A common mechanism for controlling WIP is "pull" production control, which limits the amount of WIP at each stage. The process of transforming a system from push production control to pull is not well understood or studied. This dissertation explores the events of a production control transition, quantifies its costs, and develops techniques to minimize them. Simulation models of systems undergoing transition from push to pull are used to study this transient behavior. The transition of a single-stage system is modeled. An objective function is introduced that defines transition cost in terms of the holding cost of orders in backlog and material in inventory. It incorporates two techniques for mitigating cost: temporarily deferring orders and adding extra capacity. It is shown that, except when backlog costs are high, it is better to transform the system quickly. It is also demonstrated that simulation-based optimization is a viable tool for finding the optimal transition strategy. The transition of a two-stage system is also modeled. The performance of two simple multi-stage transition strategies is measured: in the first, all of the stages are transformed at the same time; in the second, they are transformed one at a time. It is shown that the latter strategy is superior. Other strategies are also discussed. A new modeling formalism, the Production Control Framework (PCF), is introduced to facilitate automated searches for transition strategies in more complex systems. It is a hierarchical description of a manufacturing system built on a novel extension of the classic queue-server model, which can express production control policy parametrically. The PCF is implemented in the form of a software template, and its utility is shown as it is used to model and then find the optimal production control policy for a five-stage system. This work provides the first practical guidance and insight into the behavior and cost of Lean production control transition, and it lays the groundwork for the development of optimal transition strategies for even the most complex manufacturing systems.
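A minimal sketch of the kind of transition-cost objective described above, summing holding costs on backlogged orders and on inventory over the simulated horizon; the coefficients and trajectories below are hypothetical, not results from the dissertation.

```python
# Minimal sketch of a transition-cost objective: per-period holding costs on
# backlogged orders and on material in inventory, summed over the horizon.
# All coefficients and trajectories are hypothetical.

def transition_cost(backlog, inventory, backlog_cost=5.0, holding_cost=1.0):
    """Total cost over a horizon, given per-period backlog and inventory levels."""
    return sum(backlog_cost * b + holding_cost * i
               for b, i in zip(backlog, inventory))

# Two hypothetical push-to-pull trajectories: a fast transition accepts a brief
# backlog spike but drains inventory sooner; a slow one avoids backlog entirely.
fast = transition_cost(backlog=[0, 2, 0, 0], inventory=[8, 4, 2, 2])
slow = transition_cost(backlog=[0, 0, 0, 0], inventory=[8, 8, 7, 6])
print(fast, slow)                                  # 26.0 29.0 -> fast is cheaper here
print(transition_cost([0, 2, 0, 0], [8, 4, 2, 2],  # but with a high backlog cost,
                      backlog_cost=20.0))          # 56.0 -> the slow transition wins
```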
Item
Development and Evaluation of Algorithms for Scheduling Two Unrelated Parallel Processors (2007-08-09)
Leber, Dennis D; Herrmann, Jeffrey W; Mechanical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

Given a group of tasks and two non-identical processors with the ability to complete each task, how should the tasks be assigned to complete the group of tasks as quickly as possible? This thesis considers this unrelated parallel machine scheduling problem with the objective of minimizing the completion time of a group of tasks (the makespan) from the perspective of a local printed circuit board manufacturer. An analytical model representing the job-dependent processing time for each manufacturing line is developed, and actual job data supplied by the manufacturer is used for analysis. Two versions of a complete enumeration algorithm that identify the optimal assignment schedule are presented. Several classic assignment heuristics are considered, along with several additional heuristics developed as part of this work. The algorithms are evaluated and their performance compared for jobs built at the local manufacturing site. Finally, a cost-benefit tradeoff for the algorithms considered is presented.
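As a small illustration of the kind of assignment heuristic the thesis evaluates, the sketch below applies a generic list-scheduling rule (larger jobs first, each assigned to the line that finishes it earliest) to hypothetical processing times; it is not one of the thesis's algorithms and does not guarantee the optimal makespan.

```python
# Minimal sketch of a classic list-scheduling heuristic for two unrelated
# (non-identical) lines: process jobs in decreasing order of work, assigning
# each to whichever line finishes it earliest. Processing times are hypothetical.

def schedule(jobs: dict) -> tuple:
    """jobs maps job name -> (time on line 0, time on line 1).
    Returns (makespan, chosen line for each job)."""
    loads = [0.0, 0.0]
    assignment = {}
    # Order jobs by decreasing size (using each job's smaller processing time).
    for name, times in sorted(jobs.items(), key=lambda kv: -min(kv[1])):
        line = min((0, 1), key=lambda m: loads[m] + times[m])  # earliest finish
        loads[line] += times[line]
        assignment[name] = line
    return max(loads), assignment

makespan, plan = schedule({"A": (4, 7), "B": (6, 3), "C": (5, 5), "D": (2, 8)})
print(makespan, plan)  # heuristic makespan and line assignments
```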
Item
On the Theoretical Foundations and Principles of Organizational Safety Risk Analysis (2007-08-02)
Mohaghegh-Ahmadabadi, Zahra; Mosleh, Ali; Mechanical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

This research covers a targeted review of relevant theories and technical domains related to the incorporation of organizational factors into technological systems risk. In the absence of a comprehensive set of principles and modeling guidelines rooted in theory and empirical studies, all models look equally good, or equally poor, with very little basis to discriminate and build confidence. Therefore, this research focused on improving the theoretical foundations and principles of the field of Organizational Safety Risk Analysis. A process for adapting a hybrid modeling technique, in order to operationalize the theoretical organizational safety frameworks, is also proposed. Candidate ingredients are techniques from Risk Assessment, Human Reliability, Social and Behavioral Science, Business Process Modeling, and Dynamic Modeling. Then, as a realization of the aforementioned modeling principles, an organizational safety risk framework named Socio-Technical Risk Analysis (SoTeRiA) is developed. The proposed framework considers the theoretical relation between organizational safety culture, organizational safety structure/practices, and organizational safety climate, with a specific distinction between safety culture and safety climate. A systematic view of safety culture and safety climate fills an important gap in modeling complex system safety risk, and the proposed organizational safety risk theory describes the theoretical relation between the two concepts to bridge this gap. In contrast to current safety causal models, which do not adequately consider the multilevel nature of the issue, the proposed multilevel causal model explicitly recognizes the relationships among constructs at multiple levels of analysis. Other contributions of this research lie in implementing the proposed organizational safety framework in the aviation domain, particularly the airline maintenance system. The US Federal Aviation Administration (FAA), which has sponsored this research over the past three years, has recognized the issue of organizational factors as one of the most critical questions in the quest to achieve an 80% reduction in aviation accidents. An example of the proposed hybrid modeling environment, integrating System Dynamics (SD), Bayesian Belief Networks (BBNs), Event Sequence Diagrams (ESDs), and Fault Trees (FTs), is also applied in order to demonstrate the value of hybrid frameworks. This hybrid technique integrates deterministic and probabilistic modeling perspectives and provides a flexible risk management tool.

Item
Numerical Simulation of Ignition and Transient Combustion in Fuel Vapor Clouds (2007-07-31)
Wiley, Jennifer; Trouvé, Arnaud; Fire Protection Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

The Large-Eddy Simulation (LES) approach is used to model partially premixed combustion (PPC) in confined and unconfined fuel vapor clouds. The model is based on the concept of a filtered reaction progress variable to describe the premixed combustion. The premixed combustion model is implemented into the Fire Dynamics Simulator (FDS), developed at the National Institute of Standards and Technology, USA, and is coupled with either an equilibrium-chemistry, mixture-fraction-based model (FDS Version 4) or an eddy dissipation model (FDS Version 5) for non-premixed combustion. Modifications to the model are developed and implemented with the goal of reducing the grid resolution requirement while still producing physically sound results. The modified formulation is tested using both versions of the non-premixed combustion model, and the results are compared. It is found that the modifications are capable of reducing errors associated with poorly resolved simulations in both versions of the model.
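For context, a reaction progress variable c is typically defined so that c = 0 in fresh reactants and c = 1 in fully burnt products, and its filtered LES transport equation takes the generic textbook form shown below; this is a standard formulation given for reference, not necessarily the exact closure implemented in FDS for this work.

$$\frac{\partial \bar{\rho}\,\widetilde{c}}{\partial t} + \nabla\cdot\left(\bar{\rho}\,\widetilde{\mathbf{u}}\,\widetilde{c}\right) = \nabla\cdot\left[\bar{\rho}\,(D + D_t)\,\nabla\widetilde{c}\right] + \overline{\dot{\omega}}_c$$

Here $D_t$ is a subgrid turbulent diffusivity modeling the unresolved scalar flux, and $\overline{\dot{\omega}}_c$ is the filtered reaction source term, which requires a premixed combustion closure such as a turbulent flame speed or flame surface density model.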
Item
Online Inventory Replenishment and Fleet Routing Decisions under Real-Time Information (2007-04-27)
Giesen, Ricardo; Mahmassani, Hani S; Civil Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)

Logistics managers rely on increasingly sophisticated technologies to track demand and associated inventories, allowing rapid response to meet anticipated demand and avoid shortfalls while minimizing transportation and inventory carrying costs. This ability to respond gives rise to complex decision problems, characterized by combinatorial underlying problems under progressively unfolding demand. Real-time information also increases the ability to effectively coordinate inventory management and transportation service. The advantages of coordinating inventory replenishment with vehicle routing decisions have long been recognized, giving rise to the inventory routing problem, which arises in the context of vendor-managed inventories. These typify an emerging class of collaborative logistics arrangements facilitated by information and communication technologies. The ability to coordinate inventory with routing decisions in real time adds an important dimension to the problem. While fleet management decisions under real-time information have been studied extensively, coupling these with inventory replenishment decisions in real time remains in the early stages of conceptualization and development. The main objective of this dissertation is to examine the effectiveness of policies for managing inventories, taking into consideration the interaction among inventory replenishment, retailer sequencing, and transportation cost. A major motivation for the online inventory routing problem is the presence of uncertainty about future consumption rates at different facilities. The possibility of continuously updating plans based on real-time information about demand realizations makes it possible to modify the set and/or the sequence of subsequent facilities to be visited, to divert a truck from its current destination to visit a different facility, and to adjust the amounts to be delivered to subsequent customers in the route. This dissertation proposes two decomposition approaches, in which a simplified version of either the inventory-control or the routing side is solved first, and that solution is then used as a soft constraint when solving the other side. For each approach, different operational policies are proposed, reflecting different degrees of sophistication in terms of technology and optimization capabilities. These operational policies are based on a rolling-horizon framework, wherein new plans are repeatedly generated based on updated information. Finally, the performance of the proposed strategies is simulated, and the impacts of using sophisticated real-time strategies are discussed.
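The rolling-horizon framework described above can be sketched as a simple re-planning loop; every function and number below is a hypothetical placeholder standing in for the dissertation's forecasting, inventory-control, and routing components.

```python
# Minimal sketch of a rolling-horizon re-planning loop: at each epoch, observe
# current inventories, re-solve a simplified plan over a short look-ahead
# horizon, execute only the first decision, and roll forward as demand unfolds.
# Every function and value here is a hypothetical placeholder.
import random

def forecast(inventories, horizon):
    """Placeholder per-facility demand forecast over the look-ahead horizon."""
    return {s: [random.uniform(1, 3) for _ in range(horizon)] for s in inventories}

def replan(inventories, demand_forecast):
    """Placeholder re-optimization: visit the facility projected to run out first."""
    projected = {s: inventories[s] - sum(demand_forecast[s]) for s in inventories}
    target = min(projected, key=projected.get)
    return [(target, 10.0)]  # (next facility to visit, quantity to deliver)

inventories = {"A": 12.0, "B": 5.0, "C": 9.0}
for epoch in range(3):                          # successive re-planning epochs
    plan = replan(inventories, forecast(inventories, horizon=4))
    site, qty = plan[0]                         # execute only the first decision
    inventories[site] += qty
    for s in inventories:                       # demand realizes before the next epoch
        inventories[s] -= random.uniform(1, 3)
    print(epoch, site, {k: round(v, 1) for k, v in inventories.items()})
```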