Seminar by Akshay Sundararaman

Approximate verification of symbolic dynamics of Markov chains

Akshay Sundararaman
IRISA, Rennes, France

    Date:    Friday, August 10th, 2012
    Time:    11AM
    Venue:   CS101.

Abstract:

A finite-state Markov chain M is often viewed as a probabilistic transition system. An alternative view - which we follow here - is to regard M as a linear transform operating on the space of probability distributions over its set of nodes {1,...,n}. We are then interested in the sequences of probability distributions obtained by repeatedly applying M to an initial set of distributions. The novel idea here is to discretize the probability value space [0,1] into a finite set of intervals. Each probability distribution μ is then associated with a discretized distribution d, which is the unique n-tuple of intervals such that μ(i) belongs to d(i) for all i. Viewing the set of discretized distributions as a finite alphabet, each sequence of probability distributions generates an infinite string over this alphabet. Thus, given an initial set of distributions, the symbolic dynamics of M consists of a language of infinite strings over the finite alphabet of discretized distributions. We investigate whether the symbolic dynamics meets a specification given as a linear time temporal logic formula, whose atomic propositions assert that the current probability of a node falls in some interval. We develop exact and approximate approaches to tackle this problem.
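
To illustrate the construction, here is a minimal Python sketch (not from the talk; the toy 3-node chain, the quarter-interval discretization, and the function names are illustrative assumptions, and numpy is assumed to be available). It discretizes [0,1] into intervals, maps each distribution to its tuple of intervals, and generates the symbolic trajectory obtained by repeatedly applying M.

    import numpy as np

    # Toy row-stochastic transition matrix of a 3-node Markov chain (illustrative).
    M = np.array([[0.5, 0.5, 0.0],
                  [0.2, 0.3, 0.5],
                  [0.0, 0.4, 0.6]])

    # Cut points discretizing the probability value space [0,1] into four intervals.
    CUTS = [0.0, 0.25, 0.5, 0.75, 1.0]

    def discretize(mu):
        """Map a distribution mu to its discretized distribution: the n-tuple of
        interval indices d with mu[i] in interval d[i] (intervals [c_k, c_{k+1}),
        the last one closed at 1)."""
        idx = np.searchsorted(CUTS, mu, side="right") - 1
        return tuple(int(k) for k in np.minimum(idx, len(CUTS) - 2))

    def symbolic_trajectory(mu0, steps):
        """First `steps` letters of the symbolic trajectory of mu0:
        the discretizations of mu0, mu0*M, mu0*M^2, ..."""
        mu = np.asarray(mu0, dtype=float)
        letters = []
        for _ in range(steps):
            letters.append(discretize(mu))
            mu = mu @ M  # one application of the linear transform
        return letters

    # Starting from the point distribution on node 1:
    print(symbolic_trajectory([1.0, 0.0, 0.0], steps=5))

An atomic proposition of the kind used in the specification, e.g. "the probability of node 1 lies in [0.25, 0.5)", then simply inspects one coordinate of each letter of this symbolic trajectory.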

The motivation for considering this symbolic dynamics is to avoid the complications - and complexity - caused by numerically tracking sequences of probability distributions exactly. In applications, such as those arising from biochemical networks, exact estimates of the distributions are neither feasible nor important. The study of the symbolic dynamics of Markov chains initiated here is of independent interest and can lead to other applications.
