Browsing by Author "Tsai, Sun-Ting"
Item BUILDING KINETIC MODELS FOR COMPLEX SYSTEMS WITH ARBITRARY MEMORIES (2022)
Tsai, Sun-Ting; Tiwary, Pratyush; Physics; Digital Repository at the University of Maryland; University of Maryland (College Park, Md.)
Analyzing time series from complex dynamical systems in nature is a common yet challenging task in scientific computation, since these time series are usually high-dimensional. Applying physical intuition to such systems often requires projecting the time series onto a few low-dimensional degrees of freedom, which typically introduces complicated memory effects. The simplest classic example is a two-dimensional coupled differential equation: looking at only one of the Cartesian coordinates, one loses the ability to predict what will happen next given the current one-dimensional coordinate. The well-known solution is to describe the system in the eigenvector basis, in which the coupled equations decouple into independent one-dimensional memoryless equations. In a more complicated system, however, one may have to look many more time steps into the past, and a simple one-dimensional eigenvector may not exist. In this work, we examine such memory effects in time series generated from Langevin dynamics, Molecular Dynamics (MD) simulations, and experiments. We also develop computational methods to minimize and model these memory effects using statistical mechanics and machine learning. In recent years, MD simulation has become a powerful tool for modeling complex molecular dynamics in physics, chemistry, materials science, biology, and many other fields. However, rare events such as droplet formation, nucleation, and protein conformational changes are hard to sample with MD simulations, since they happen on timescales far beyond what all-atom MD can reach. This makes MD simulation less useful for studying the mechanisms of rare-event kinetics.
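The eigenvector decoupling described above can be sketched for a linear two-dimensional system. The coupling matrix and initial condition below are illustrative stand-ins, not a system from the thesis: projecting onto the eigenbasis turns the coupled equations into two independent, memoryless one-dimensional decays.

```python
import numpy as np

# Illustrative 2x2 coupling matrix for the linear system dx/dt = -A x
# (a hypothetical example, not a system from the thesis).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# In the eigenvector basis the equations decouple: each eigencoordinate
# z_k(t) = exp(-lambda_k t) z_k(0) evolves independently and memorylessly.
lam, V = np.linalg.eigh(A)

x0 = np.array([1.0, 0.0])
t_final = 1.5

z0 = V.T @ x0                      # project onto the eigenbasis
zt = np.exp(-lam * t_final) * z0   # decoupled, memoryless evolution
xt = V @ zt                        # back to Cartesian coordinates

# Cross-check against brute-force Euler integration of the coupled system.
dt, x = 1e-5, x0.copy()
for _ in range(int(t_final / dt)):
    x = x + dt * (-A @ x)
```

The eigenbasis solution agrees with direct integration of the coupled system, but each eigencoordinate can be propagated knowing only its own current value.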
Therefore, it is common practice to use enhanced sampling techniques to help sample rare events, which requires a dimensionality reduction from atomic coordinates to a low-dimensional representation with a minimal memory effect. In the first part of this study, we focus on reducing the memory effect by capturing slow degrees of freedom with a set of low-dimensional reaction coordinates (RCs). The RCs are a low-dimensional surrogate for the eigenvector in the coupled-equation example: when the system is described by its RCs, the remaining degrees of freedom reduce to fast, randomly fluctuating noise. These RCs can then be used to reproduce the correct kinetic connectivity between metastable states with enhanced sampling methods such as metadynamics. We demonstrate the utility of our method by applying it to droplet formation from the gaseous phase of Lennard-Jones particles and to the conformational changes of the small peptide Ace-Ala3-Nme. The second part of the study models another type of memory, the intrinsic long-term dependencies induced by ignored fast degrees of freedom, using a fundamental machine learning technique, the recurrent neural network, to capture non-Markovianity in time series generated from MD simulations. This method has been shown to work not only on the molecular model of alanine dipeptide but also on experimental time series from single-molecule force spectroscopy.
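A minimal way to see why a good RC removes memory: for a two-dimensional Ornstein-Uhlenbeck process (a toy stand-in for projected dynamics; the coupling matrix is illustrative, not from the thesis), the autocorrelation of a single Cartesian coordinate is a sum of exponentials, while that of the slow eigencoordinate decays as a single clean exponential.

```python
import numpy as np

# Illustrative symmetric coupling matrix (not a system from the thesis)
# for the 2D Ornstein-Uhlenbeck process dx = -A x dt + dW.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, V = np.linalg.eigh(A)            # relaxation rates: 1 and 3

# Stationary covariance: for symmetric A with unit noise, Sigma = A^{-1}/2
# (it solves the Lyapunov equation A Sigma + Sigma A = I).
Sigma = np.linalg.inv(A) / 2.0

def autocorr_cartesian(t):
    """<x1(t) x1(0)>: a SUM of exponentials -- the projection has memory."""
    expAt = V @ np.diag(np.exp(-lam * t)) @ V.T
    return (expAt @ Sigma)[0, 0]

def autocorr_eigen(t):
    """<z1(t) z1(0)> for the slow eigencoordinate: a SINGLE exponential."""
    var_z1 = (V.T @ Sigma @ V)[0, 0]
    return np.exp(-lam[0] * t) * var_z1
```

The eigencoordinate's normalized autocorrelation equals exp(-lambda_1 t) exactly; the Cartesian projection's does not, which is the signature of memory introduced by a poor choice of coordinate.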
At the end of this second part, we also improve this method to extrapolate physics that the neural network never saw in the training dataset, by incorporating static or dynamical constraints on the path ensemble it generates.

Item Path sampling of recurrent neural networks by incorporating known physics (Springer, 2022-11-24)
Tsai, Sun-Ting; Fields, Eric; Xu, Yijia; Kuo, En-Jui; Tiwary, Pratyush
Recurrent neural networks have seen widespread use in modeling dynamical systems in varied domains such as weather prediction, text prediction, and several others. Often one wishes to supplement the experimentally observed dynamics with prior knowledge or intuition about the system. While the recurrent nature of these networks allows them to model arbitrarily long memories in the time series used for training, it makes it harder to impose prior knowledge or intuition through generic constraints. In this work, we present a path sampling approach based on the principle of Maximum Caliber that allows us to include generic thermodynamic or kinetic constraints in recurrent neural networks. We demonstrate the method for a widely used type of recurrent neural network, the long short-term memory network, in the context of supplementing time series collected from different application domains. These include classical Molecular Dynamics of a protein and Monte Carlo simulations of an open quantum system continuously losing photons to the environment and displaying Rabi oscillations. Our method can be easily generalized to other generative artificial intelligence models and to generic time series in different areas of the physical and social sciences, where one wishes to supplement limited data with intuition- or theory-based corrections.
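The Maximum Caliber idea of constraining a path ensemble can be sketched as exponential reweighting: tilt each path's probability by exp(lambda * s) and solve for the Lagrange multiplier lambda that makes the ensemble average of an observable s match a target. The "paths" and observable below are toy stand-ins, not the paper's LSTM-generated trajectories.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for trajectories sampled from a generative model such as
# an LSTM (illustrative only -- not the paper's actual networks or data):
# each "path" is a binary trajectory and s(path) is a path observable
# whose ensemble average we wish to constrain to a known target.
paths = rng.integers(0, 2, size=(5000, 20))
s = paths.mean(axis=1)          # per-path observable
target = 0.7                    # desired constrained ensemble average

def tilted_mean(lam):
    """Ensemble average of s under Maximum Caliber weights exp(lam * s)."""
    w = np.exp(lam * (s - s.max()))   # shift exponent for numerical stability
    w /= w.sum()
    return w @ s

# tilted_mean is monotonically increasing in lam (its derivative is the
# tilted variance of s), so the multiplier can be found by bisection.
lo, hi = 0.0, 100.0
for _ in range(80):
    mid = 0.5 * (lo + hi)
    if tilted_mean(mid) < target:
        lo = mid
    else:
        hi = mid
lam_star = 0.5 * (lo + hi)

weights = np.exp(lam_star * (s - s.max()))
weights /= weights.sum()        # reweighted path ensemble obeying the constraint
```

The reweighted ensemble satisfies the imposed constraint while staying as close as possible (in the Maximum Caliber sense) to the original sampled distribution.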