Compositional Analysis of Dynamic Bayesian Networks and Applications to CPS

dc.contributor.author: Yang, ShahAn
dc.contributor.author: Zhou, Yuchen
dc.contributor.author: Baras, John
dc.date.accessioned: 2012-12-17T19:35:45Z
dc.date.available: 2012-12-17T19:35:45Z
dc.date.issued: 2012
dc.description.abstract: Dynamic Bayesian networks (DBNs) can be effectively used to model various problems in complex dynamic systems. We perform an empirical investigation of compositional analysis of DBNs using abstraction. In static systems and hidden Markov models, computation of a metric called treewidth induces a tree decomposition that can be used to perform logical or probabilistic inference and max+ optimizations in time exponential in the treewidth and linear in overall system size. Intuitively, the linear scaling means that very large systems can be analyzed as long as they are sufficiently sparse and well structured. In these simple cases, summary propagation, which uses two operations, summation (projection) and product (composition), suffices to perform the inference or optimization. Here, we begin an extension of this approach to structured networks of communicating dynamic systems. We define generalizations of the projection and composition operators that treat labeled Markov chains as primitive objects. The projection operation, corresponding to summation, is implemented as label deletion followed by exact state reduction for Markov chains, similar to Hopcroft’s DFA minimization algorithm, with O(n log m) complexity. The composition operation is the product of state machines. We use canonical MDDs, similar to BDDs, to capture logical dependencies symbolically. Combining symbolic representations with Markov chain lumping algorithms is a novel contribution. Using this approach, we have created a tool leveraging model-based systems engineering technologies. The communicating Markov chains are specified using UML Statecharts via Papyrus, extended with an ANTLR-parsed domain-specific language (DSL). The tool reduces the number of states in networks of Markov chains by several orders of magnitude. In one example, a network having a product state space of more than 600 million states is reduced to about 500 states. A feature of this technique is that the state space is examined incrementally, meaning that the full state space is never explicitly represented, even as an input to the reduction algorithm. The primary reduction appears to come from symmetry, which is surprising because the technique includes no explicit symmetry handling. We note that composition is efficient at least for systems with high symmetry. We describe applications to a hospital intensive care unit (ICU) from a systems engineering perspective.
dc.description.sponsorship: Research supported in part by the NSF under grant CNS-1035655, by NIST under contract 70NANB11H148, and by a grant from the Lockheed Martin Corporation.
dc.identifier.uri: http://hdl.handle.net/1903/13335
dc.language.iso: en_US
dc.relation.isAvailableAt: Institute for Systems Research
dc.relation.isAvailableAt: Digital Repository at the University of Maryland
dc.relation.isAvailableAt: University of Maryland (College Park, MD)
dc.relation.ispartofseries: TR_2012-12
dc.subject: Dynamic Bayesian Network
dc.subject: Markov Chain
dc.subject: composition
dc.subject: ICU
dc.subject: Multiple Decision Diagram
dc.subject: CPS
dc.title: Compositional Analysis of Dynamic Bayesian Networks and Applications to CPS
dc.type: Technical Report
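
The projection step described in the abstract (label deletion followed by exact state reduction of a Markov chain) can be illustrated with a small sketch. The code below is not the report's tool: it uses a simple iterative partition refinement to compute an ordinary lumping of a labeled Markov chain, whereas the report describes an O(n log m) Hopcroft-style refinement over symbolic (MDD-based) representations. The state names, the transition-table format, and the label map are illustrative assumptions.

from collections import defaultdict

def lump(states, transitions, labels):
    """Coarsest ordinary lumping of a Markov chain that refines the label partition.

    states      : iterable of state names
    transitions : dict mapping state -> dict of successor state -> probability
    labels      : dict mapping state -> observable label (labels slated for
                  projection are assumed to have been deleted already)
    """
    # Initial partition: group states by their remaining labels.
    by_label = defaultdict(set)
    for s in states:
        by_label[labels[s]].add(s)
    partition = list(by_label.values())

    changed = True
    while changed:
        changed = False
        block_of = {s: i for i, block in enumerate(partition) for s in block}
        refined = []
        for block in partition:
            # Split the block by each state's aggregate probability into every block.
            groups = defaultdict(set)
            for s in block:
                agg = defaultdict(float)
                for t, p in transitions.get(s, {}).items():
                    agg[block_of[t]] += p
                groups[tuple(sorted(agg.items()))].add(s)
            refined.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = refined
    return partition

if __name__ == "__main__":
    # Two symmetric states b1 and b2 end up in the same block after lumping,
    # mirroring the symmetry-driven reduction noted in the abstract.
    states = ["a", "b1", "b2", "c"]
    transitions = {
        "a":  {"b1": 0.5, "b2": 0.5},
        "b1": {"c": 1.0},
        "b2": {"c": 1.0},
        "c":  {"a": 1.0},
    }
    labels = {"a": "idle", "b1": "busy", "b2": "busy", "c": "done"}
    for block in lump(states, transitions, labels):
        print(sorted(block))   # prints ['a'], ['b1', 'b2'], ['c'] in some order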

Files

Original bundle
Name: TR_2012-12.pdf
Size: 1.17 MB
Format: Adobe Portable Document Format