##### Abstract

This paper studies almost sure convergence of a dynamic average consensus algorithm that allows distributed computation of the product of $n$ time-varying conditional probability density functions, known as beliefs, corresponding to the $n$ nodes of a sensor network. The network topology is modeled as an undirected graph, and the average consensus algorithm is used within a distributed hidden Markov model (HMM) filter. We use the ordinary differential equation (ODE) technique to analyze the convergence of this stochastic approximation type algorithm for average consensus with constant step size, which allows each node to track the time-varying average of the likelihoods of the beliefs belonging to the different nodes in the network. It is shown that, for a connected graph, under mild assumptions on the first and second moments of the observation probability distributions and a geometric ergodicity condition on an extended Markov chain, the consensus filter state of each individual sensor converges ${\mathbb{P}\mbox{--a.s. }}$ to the true average of the likelihoods of the beliefs of all the sensors. To prove convergence, we introduce a perturbed stochastic Lyapunov function and show that the error between the consensus filter state at each node and the true average visits some compact set infinitely often ${\mathbb{P}\mbox{--w.p.}1}$; from this it follows that the error process is bounded ${\mathbb{P}\mbox{--w.p.}1}$.
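As an illustration of the kind of algorithm the abstract describes, the following is a minimal sketch of a dynamic average consensus update with constant step size on an undirected graph. The notation (states `x`, reference signals `r`, Laplacian `L`, step size `mu`) and the specific update rule are illustrative assumptions, not the paper's exact filter.

```python
import numpy as np

def consensus_step(x, r_new, r_old, L, mu):
    """One dynamic average consensus update (illustrative form):
    x <- x - mu * L @ x + (r_new - r_old),
    where L is the graph Laplacian of an undirected graph and mu is a
    constant step size. Because L has zero row sums, the sum of the
    states tracks the sum of the reference signals exactly."""
    return x - mu * (L @ x) + (r_new - r_old)

# 4-node path graph (connected, undirected)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

rng = np.random.default_rng(0)
n = 4
r = rng.normal(size=n)   # each node's local time-varying reference signal
x = r.copy()             # initialize each state at its local reference
mu = 0.3                 # constant step size (must satisfy mu * lambda_max(L) < 2)

for t in range(500):
    r_new = r + 0.001 * rng.normal(size=n)  # slowly varying references
    x = consensus_step(x, r_new, r, L, mu)
    r = r_new

err = np.abs(x - r.mean()).max()  # each node tracks the network-wide average
```

For a connected graph and a slowly varying reference signal, every node's state stays close to the instantaneous network average, which is the tracking behavior the abstract's convergence analysis makes precise.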