UMD Theses and Dissertations
Permanent URI for this collection: http://hdl.handle.net/1903/3
New submissions to the thesis/dissertation collections are added automatically as they are received from the Graduate School. Currently, the Graduate School deposits all theses and dissertations from a given semester after the official graduation date. This means that there may be up to a four-month delay before a given thesis or dissertation appears in DRUM.
More information is available at Theses and Dissertations at University of Maryland Libraries.
19 results
Search Results
Item Single- and Multi-Objective Feasibility Robust Optimization under Interval Uncertainty with Surrogate Modeling (2022)
Kania, Randall Joseph; Azarm, Shapour; Mechanical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
This dissertation presents new methods for solving single- and multi-objective optimization problems when there are uncertain parameter values. The uncertainty in these problems is considered to come from sources with no known or assumed probability distribution, bounded only by an interval. The goal is to obtain a single solution (for single-objective optimization problems) or multiple solutions (for multi-objective optimization problems) that are optimal and “feasibly robust”. A feasibly robust solution is one that remains feasible for all values of the uncertain parameters within the uncertainty interval. Obtaining such a solution can become computationally costly and require many function calls. To reduce the computational cost, the presented methods use surrogate modeling to approximate the functions in the optimization problem. This dissertation aims to address several key research questions. The first Research Question (RQ1) is: How can the computational cost of solving single-objective robust optimization problems be reduced with surrogate modeling when compared to previous work? RQ2 is: How can the computational cost of solving bi-objective robust optimization problems be improved by using surrogates in concert with a Bayesian optimization technique when compared to previous work? And RQ3 is: How can surrogate modeling be leveraged to make multi-objective robust optimization computationally less expensive when compared to previous work? In addressing RQ1, a new single-objective robust optimization method has been developed with improvements over an existing method from the literature.
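The notion of feasibility robustness under interval uncertainty can be illustrated with a minimal sketch. This is not the dissertation's surrogate-based method (which pairs a surrogate model with a local solver); it only shows the underlying idea that a design is robust-feasible when a constraint holds at the worst case over the parameter interval. The constraint function, interval bounds, and grid-scan search below are all invented for illustration.

```python
# Illustrative sketch only: a design x is "feasibly robust" if the
# constraint g(x, p) <= 0 holds for EVERY value of the uncertain
# parameter p inside its interval [p_lo, p_hi]. Here the worst case
# is found by a simple grid scan; the dissertation instead uses
# surrogate modeling and a deterministic local solver.

def worst_case_g(g, x, p_lo, p_hi, n=201):
    """Scan the uncertainty interval for the worst (largest) constraint value."""
    step = (p_hi - p_lo) / (n - 1)
    return max(g(x, p_lo + i * step) for i in range(n))

def is_feasibly_robust(g, x, p_lo, p_hi):
    """True if g(x, p) <= 0 for all p in the interval (by grid scan)."""
    return worst_case_g(g, x, p_lo, p_hi) <= 0.0

# Toy constraint g(x, p) = p*x - 1 with p uncertain in [0.5, 2.0]:
g = lambda x, p: p * x - 1.0
print(is_feasibly_robust(g, 0.4, 0.5, 2.0))  # worst case 2.0*0.4 - 1 = -0.2 -> True
print(is_feasibly_robust(g, 0.8, 0.5, 2.0))  # worst case 2.0*0.8 - 1 = +0.6 -> False
```

Note the design choice the abstract alludes to: every robustness check requires a full inner search over the interval, which is why each candidate solution costs many function calls and why surrogate approximation pays off.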
This method uses a deterministic, local solver paired with a surrogate modeling technique for finding the worst-case scenario of parameter configurations. Using this single-objective robust optimization method, improved large-scale performance and robust feasibility were demonstrated. The second method presented solves bi-objective robust optimization problems under interval uncertainty by introducing a relaxation technique to facilitate combining iterative robust optimization and Bayesian optimization techniques. This method showed improved feasibility robustness and performance at larger problem sizes over existing methods. The third method presented in this dissertation extends the current literature by considering multiple (beyond two) competing objectives for surrogate robust optimization. Increasing the number of objectives adds more dimensions and complexity to the search for solutions and can greatly increase the computational cost. In the third method, the robust optimization strategy from the bi-objective second method was combined with a new Monte Carlo approximation method. The key contributions of this dissertation are 1) a new single-objective robust optimization method combining a local optimization solver and surrogate modeling for robustness, 2) a bi-objective robust optimization method that employs an iterative Bayesian optimization technique in tandem with iterative robust optimization techniques, and 3) a new acquisition function for robust optimization in problems with more than two objectives.

Item ESTIMATING THE RELIABILITY OF A NEW CONSUMER PRODUCT USING USER SURVEY DATA AND RELIABILITY TEST DATA (2022)
Shafiei, Neda; Modarres, Mohammad; Herrmann, Jeffrey W.; Mechanical Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
Because new products enter the market rapidly, estimating their reliability is challenging due to insufficient historical data.
User survey data about similar devices (e.g., older versions of the new device) can be used as prior information in a Bayesian analysis, integrated with evidence in the form of product returns, reliability tests, and other reliability data sources, to improve reliability estimation and test specification of the new product. User surveys are usually designed for purposes other than reliability estimation. Therefore, extracting reliability information from these surveys may be tricky or impossible. Even when possible, the extracted reliability information contains significant uncertainties. This dissertation introduces the critical elements of a reliability-informed user survey and offers methods for collecting them. A generic and flexible mathematical approach is then proposed. This approach uses the survey and reliability test data of similar products (for example, an older generation of the same product) as prior knowledge. It then combines them, through a formal Bayesian analysis, with the reliability test data to estimate the life distribution of the new product. The approach models continuous life distributions for products exposed to many damage-induced cycles and proposes discrete life distribution models for products whose failures occur within several damaging cycles. The actual cycles for various applicable damaging stress profiles are converted into equivalent (pseudo) cycles under a reference stress profile. When damage-induced cycles are estimated from user surveys, they may involve biases, as is the nature of most nontechnical users’ responses. This bias is minimized using an approach based on the Kullback-Leibler divergence method. The survey data and other evidence from similar products are then combined with the test data of the new product to estimate the parameters of the new product’s reliability model. The dissertation also develops approaches to design reliability test specifications for a new product with unknown failure modes.
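The Bayesian combination of prior knowledge from a similar product with test data on the new product can be sketched in its simplest conjugate form. This toy version is far simpler than the dissertation's approach (which models full life distributions, pseudo-cycles, and KL-based bias correction); it only shows the prior-plus-evidence update pattern, and every number below is invented.

```python
# Illustrative sketch only: a conjugate Beta-Binomial update of the
# probability that a unit survives the warranty period, combining prior
# knowledge from a similar (older) product with new-product test data.
# The dissertation's formal Bayesian analysis is much richer than this.

def beta_binomial_update(a_prior, b_prior, survived, failed):
    """Posterior Beta(a, b) for survival probability after observing tests."""
    return a_prior + survived, b_prior + failed

def posterior_mean(a, b):
    return a / (a + b)

# Hypothetical prior from the older product's field data: roughly 90%
# of units survive the warranty period (Beta(9, 1)).
a0, b0 = 9.0, 1.0
# Hypothetical accelerated test of the new product: 28 of 30 samples survive.
a1, b1 = beta_binomial_update(a0, b0, survived=28, failed=2)
print(posterior_mean(a1, b1))  # 37/40 = 0.925
```

The posterior pulls the test estimate (28/30 ≈ 0.933) toward the prior (0.9), which is exactly the role the abstract assigns to survey-derived evidence from similar products when the new product's own data are scarce.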
The number of samples, stress levels, and number of cycles for the accelerated life test are determined based on the manufacturer’s requirements, including the desired warranty time, the desired reliability with some confidence level at the warranty time, and the maximum number of samples. The actual use conditions (i.e., actual stress profiles and usage cycles) are grouped using clustering techniques. The centers of the clusters are then used to design frequency-accelerated or stress-accelerated reliability tests. The application of the proposed reliability estimation approach and the test specification design approach is illustrated, and the proposed algorithms are validated, using simulated datasets for a hypothetical handheld electronic device with the failure mode of cracking caused by accidental drops. The proposed approaches can adequately estimate the reliability model and design test specifications for a wide range of consumer products. However, these approaches require reliability data about an existing product that is similar to the new product.

Item APPLICATION OF A BAYESIAN NETWORK BASED FAILURE DETECTION AND DIAGNOSIS FRAMEWORK ON MARITIME DIESEL ENGINES (2022)
Reynolds, Steven; Groth, Katrina; Systems Engineering; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
Diesel engine propulsion has been the largest driver of maritime trade and transportation since its development in the early 20th century. The technology surrounding the operation and maintenance of these systems has grown in complexity, leading to rapid advancement in the amount and variety of data being collected. This increase in reliability data provides a fantastic opportunity to improve upon the existing troubleshooting and decision support tools used within the maritime engine community and to enable a more robust understanding of engine reliability.
This work leverages this opportunity and applies it to the Coast Guard and its acquisition of the Fast Response Cutter (FRC) fleet, powered by two MTU20V4000M93 engines integrated with top-of-the-line monitoring and control equipment. The purpose of this research is to create procedures for building a Failure Detection and Diagnosis (FDD) model of a maritime diesel engine that updates existing Probabilistic Risk Analysis (PRA) data with input from the engine monitoring and control system using Bayesian inference. A literature review of existing work within the PRA and Prognostics and Health Management (PHM) fields was conducted, with a specific focus on advances and gaps relevant to maritime engine applications. Following this, a hierarchical ruleset was created that outlines procedures for integrating existing PRA data and PHM metrics into a Bayesian Network structure. This methodology was then used to build a Bayesian Network-based FDD model of the FRC engine. The model was validated by Coast Guard engineers and run through a diagnostic use case scenario, demonstrating its suitability in the diagnostic space.

Item Inference of Mass Anomalies in Planetary Interiors Using a Bayesian Global Gravity Field Inversion (2020)
Izquierdo Gonzalez, Kristel Del Carmen; Montesi, Laurent G. J.; Lekic, Vedran; Geology; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
Knowledge about the interior density distribution of a planetary body can constrain geophysical processes and reveal information about the origin and evolution of the body. Properties of this interior distribution can be inferred by analyzing gravity acceleration data sampled by orbiting satellites. Usually, the gravity data is complemented with additional laser ranging or seismic data in order to reduce the range of possible density models of the interior.
However, additional data might not be available, and tight prior constraints on model parameters might not be justified. In this case, the flexibility of using non-informative priors and the ability to quantify the non-uniqueness of the gravity inversions are of even greater importance. In this work, we present a gravity inversion algorithm, THeBOOGIe, that samples the posterior distribution of density in the interior of a planet or moon according to Bayes’ theorem, following an iterative Metropolis-Hastings algorithm. It uses non-informative priors on the number, location, shape, and magnitude of density anomalies. Different samples of the posterior show different density models of the interior consistent with the observed gravity data. Inversions of synthetic gravity data are run using point masses, spherical caps, and Voronoi regions (VRs) to parametrize density anomalies. THeBOOGIe is able to retrieve the lateral location of shallow density anomalies and the shape, depth, and magnitude of a mid-mantle anomaly. The uncertainty of the model parameters increases with depth, as expected. Bouguer gravity data of the Moon obtained by the GRAIL mission was inverted using a VR parametrization. Shallow anomalies related to the SPA basin, the crustal dichotomy, and the near-side basins were found at the correct latitudes and longitudes, with a trade-off between their thickness and magnitude. Positive and negative density anomalies were found in the depth range 500-1141 km.
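The Metropolis-Hastings sampling that the abstract describes can be sketched in miniature. This is not THeBOOGIe itself (which samples density-anomaly models with non-informative priors in a trans-dimensional setting); it is a generic one-dimensional chain on an invented standard-normal target, shown only to make the accept/reject structure concrete.

```python
# Minimal Metropolis-Hastings sketch (NOT the THeBOOGIe algorithm):
# draw correlated samples from a target known only up to a constant,
# using a symmetric Gaussian proposal and the usual accept/reject rule.
import math
import random

def metropolis_hastings(log_post, x0, n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    samples, x, lp = [], x0, log_post(x0)
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)            # symmetric random-walk proposal
        lp_prop = log_post(prop)
        # Accept with probability min(1, post(prop)/post(x)):
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        samples.append(x)                          # rejected moves repeat x
    return samples

# Toy target: standard normal, log-density up to an additive constant.
log_std_normal = lambda x: -0.5 * x * x
chain = metropolis_hastings(log_std_normal, x0=0.0, n_samples=20000)
mean = sum(chain) / len(chain)                     # should land near 0
```

As in the abstract, each retained sample is itself a model consistent with the target, and the spread of the chain is what quantifies the non-uniqueness of the inversion.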
The locations of deep moonquakes do not have a clear relation to the locations of these density anomalies.

Item Patterns of oyster natural mortality in Chesapeake Bay, Maryland during 1991-2017 and its relationships with environmental factors and disease (2019)
Doering, Kathryn Leah; Wilberg, Michael J; Marine-Estuarine-Environmental Sciences; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
A common method of estimating natural mortality in bivalves includes several assumptions that are likely violated for oysters Crassostrea virginica in Chesapeake Bay, Maryland. In addition, while oyster disease dynamics are well studied spatially and temporally in the mid-Atlantic region, changes in disease-related relationships have not been investigated in Maryland. We developed a Bayesian estimator for natural mortality and applied it to oysters in Maryland. We then used the model output, along with environmental factors and disease data, to explore changes in the disease system over time. We found the largest differences in natural mortality estimates between the box count method and the Bayesian model 1-3 years after a high mortality event. Some relationships in the disease system changed over time, most notably those associated with MSX, suggesting that resistance to MSX has potentially developed. This work improves our estimates of natural mortality and our understanding of oyster disease dynamics in Maryland.

Item Estimation of a Function of a Large Covariance Matrix Using Classical and Bayesian Methods (2018)
Law, Judith N.; Lahiri, Partha; Mathematics; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
In this dissertation, we consider the problem of estimating a high-dimensional covariance matrix in the presence of a small sample size.
The proposed Bayesian solution is general and can be applied to different functions of the covariance matrix in a wide range of scientific applications, though we narrowly focus on a specific application of allocation of assets in a portfolio, where the function is vector-valued with components that sum to unity. While a long time series of data often exists, in practice only a shorter length is tenable, to avoid violating the critical assumption that the covariance matrix of investment returns is equal over the period. Using Monte Carlo simulations and real data analysis, we show that for small sample sizes, allocation estimates based on the sample covariance matrix can perform poorly in terms of the traditional measures used to evaluate an allocation for portfolio analysis. When the sample size is less than the dimension of the covariance matrix, we encounter difficulty computing the allocation estimates because of the singularity of the sample covariance matrix. We evaluate a few classical estimators. Among them, the allocation estimator based on the well-known POET estimator is developed using a factor model. While our simulation and data analysis illustrate the good behavior of POET for large sample sizes (consistent with the asymptotic theory), our study indicates that it does not perform well in small samples when compared to our proposed Bayesian estimator. A constrained Bayes estimator of the allocation vector is proposed that is the best in terms of the posterior risk under a given prior among all estimators that satisfy the constraint. In this sense, it is better than all classical plug-in estimators, including POET and the proposed Bayesian estimator. We compare the proposed Bayesian method with the constrained Bayes estimator using the traditional evaluation measures used in portfolio analysis and find that they show similar behavior.
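One concrete example of a plug-in allocation of the kind the abstract critiques is the classical global minimum-variance portfolio, whose weights are w = S⁻¹1 / (1ᵀS⁻¹1) and therefore sum to unity by construction. The abstract does not specify this particular function; it is offered here only as a standard instance of a unit-sum, covariance-based allocation, with an invented toy covariance matrix.

```python
# Illustrative sketch only: a classical plug-in allocation built from a
# sample covariance matrix S. The dissertation's Bayesian and constrained
# Bayes estimators go well beyond this plug-in rule; the numbers are toys.
import numpy as np

def min_variance_weights(cov):
    """Global minimum-variance weights w = S^{-1} 1 / (1' S^{-1} 1)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)   # S^{-1} 1 (fails if S is singular)
    return w / w.sum()               # normalize: components sum to unity

S = np.array([[0.04, 0.01],
              [0.01, 0.09]])         # hypothetical 2-asset sample covariance
w = min_variance_weights(S)          # components sum to 1 by construction
```

Note that `np.linalg.solve` raises an error when the sample covariance matrix is singular, which is precisely the small-sample failure mode (sample size below the matrix dimension) that the abstract points out.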
In addition to point estimation, the proposed Bayesian approach yields a straightforward measure of uncertainty of the estimate and allows construction of credible intervals for a wide range of parameters.

Item Mortality and Movement of Adult Atlantic Menhaden During 1966-1969 Estimated from Mark-Recapture Models (2017)
Liljestrand, Emily Morgan; Wilberg, Michael J; Marine-Estuarine-Environmental Sciences; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
Atlantic Menhaden Brevoortia tyrannus is an economically and ecologically important forage fish. I built a multi-state mark-recapture model to estimate movement, fishing mortality, and natural mortality rates during 1966-1969. Movement from mid-Atlantic regions to North and South Carolina in the winter was lower than previously described, and natural mortality was approximately three times greater than previously estimated. Fishing mortality was highest in North and South Carolina. We evaluated the model’s performance by generating mark-recapture data sets from known values of mortality and movement and then fitting the mark-recapture model to those data. The model estimated movement rates > 0.05 to within 33% of the true value, even under different scenarios of spatiotemporally distributed releases and fishing effort. Distributing the fishing effort more evenly across regions substantially improved the estimates of movement and fishing mortality, and increasing the number of marked fish released had a small positive effect on the accuracy of estimates.

Item Understanding information use in multiattribute decision making (2016)
Chrabaszcz, Jeffrey Stephen; Dougherty, Michael R; Neuroscience and Cognitive Science; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
An inference task is one in which some known set of information is used to produce an estimate about an unknown quantity.
Existing theories of how humans make inferences include specialized heuristics that allow people to make these inferences in familiar environments quickly and without unnecessarily complex computation. Specialized heuristic processing may be unnecessary, however; other research suggests that the same patterns in judgment can be explained by existing patterns in encoding and retrieving memories. This dissertation compares and attempts to reconcile three alternate explanations of human inference. After justifying hierarchical Bayesian versions of three existing inference models, the three models are compared on simulated, observed, and experimental data. The results suggest that the three models capture different patterns in human behavior but, based on posterior prediction using laboratory data, potentially ignore important determinants of the decision process.

Item Bayesian Estimation of the Inbreeding Coefficient for Single Nucleotide Polymorphism Using Complex Survey Data (2015)
Xue, Zhenyi; Lahiri, Partha; Li, Yan; Mathematics; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
In genome-wide association studies (GWAS), single nucleotide polymorphism (SNP) is often used as a genetic marker to study gene-disease association. Some large-scale health sample surveys have recently started collecting genetic data. There is now growing interest in developing statistical procedures using genetic survey data. This calls for innovative statistical methods that incorporate both genetic and statistical sampling. Under simple random sampling, the traditional estimator of the inbreeding coefficient is given by 1 - (number of observed heterozygotes) / (number of expected heterozygotes).
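The traditional estimator quoted above can be computed directly from genotype counts at a single SNP. One piece of standard background not spelled out in the abstract: under Hardy-Weinberg equilibrium, the expected number of heterozygotes is 2p(1-p)n for allele frequency p and n genotyped individuals. The genotype counts below are invented for illustration.

```python
# Sketch of the simple-random-sampling estimator quoted in the abstract:
#   F_hat = 1 - (observed heterozygotes) / (expected heterozygotes),
# where the expected count uses the Hardy-Weinberg formula 2*p*(1-p)*n
# (standard background, not stated in the abstract).

def inbreeding_coefficient(n_aa, n_ab, n_bb):
    """Simple estimator of F from genotype counts at one biallelic SNP."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)      # frequency of allele A
    expected_het = 2 * p * (1 - p) * n   # Hardy-Weinberg expectation
    return 1 - n_ab / expected_het

# Toy SNP in Hardy-Weinberg proportions: 360 AA, 480 AB, 160 BB
# gives p = 0.6, expected heterozygotes = 480, so F is (numerically) 0.
f_hwe = inbreeding_coefficient(360, 480, 160)
# A heterozygote deficit (400 AB observed vs. 480 expected) gives F = 1/6.
f_deficit = inbreeding_coefficient(400, 400, 200)
```

A positive F flags a heterozygote deficit relative to expectation, which is why, as the next sentences note, NHANES and HRS use this estimator as a quality control screen for problems such as genotyping error.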
Genetic data quality control reports published by the National Health and Nutrition Examination Survey (NHANES) and the Health and Retirement Study (HRS) use this simple estimator, which serves as a reasonable quality control tool to identify problems such as genotyping error. There is, however, a need to improve on this estimator by considering different features of the complex survey design. The main goal of this dissertation is to fill this important research gap. First, a design-based estimator and its associated jackknife standard error estimator are proposed. Secondly, a hierarchical Bayesian methodology is developed using the effective sample size and genotype count. Lastly, a Bayesian pseudo-empirical likelihood estimator is proposed that uses the expected number of heterozygotes in the estimating equation as a constraint when maximizing the pseudo-empirical likelihood. One of the advantages of the proposed Bayesian methodology is that the prior distribution can be used to restrict the parameter space induced by the general inbreeding model. The proposed estimators are evaluated using Monte Carlo simulation studies. Moreover, the proposed estimates of the inbreeding coefficients of SNPs from the APOC1 and BDNF genes are compared using data from the 2006 Health and Retirement Study.

Item SENSITIVITY ANALYSIS OF STRUCTURAL PARAMETERS TO MEASUREMENT NONINVARIANCE: A BAYESIAN APPROACH (2014)
Kang, Yoon Jeong; Hancock, Gregory R; Measurement, Statistics and Evaluation; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
Most previous studies have argued that the validity of group comparisons of structural parameters depends on the extent to which measurement invariance is met. Although some researchers have supported the concept of partial invariance, there is still no clear-cut level of partial invariance needed to make valid group comparisons.
In addition, relatively little attention has been paid to the implications of failing measurement invariance (e.g., partial measurement invariance) for group comparisons of the underlying latent constructs in the multiple-group confirmatory factor analysis (MGCFA) framework. Given this, the purpose of the current study was to examine the extent to which measurement noninvariance affects structural parameter comparisons across populations in the MGCFA framework. In particular, this study takes a Bayesian approach to investigate the sensitivity of the posterior distribution of the structural parameter difference to varying types and magnitudes of noninvariance across two populations. A Monte Carlo simulation was performed to empirically investigate the sensitivity of structural parameters to varying types and magnitudes of noninvariant measurement models across two populations. To assess the sensitivity to noninvariance conditions, three outcome variables were evaluated: (1) the accuracy of the statistical conclusion on the structural parameter difference, (2) the precision of the estimated structural parameter difference, and (3) the bias in the posterior mean of the structural parameter difference. Inconsistent with the findings of previous studies, the results of this study showed that the three outcome variables were not sensitive to varying types and magnitudes of noninvariance across all conditions. Instead, the three outcome variables were sensitive to sample size, factor loading size, and the prior distribution. These results indicate that even under a large magnitude of measurement noninvariance, accurate conclusions and inferences on structural parameter differences across populations can be obtained in the MGCFA framework. Implications for practice are discussed for applied researchers who wish to compare structural parameters across populations under measurement noninvariance.