Hyper and structural Markov laws for graphical models
My thesis focuses on the parameterisation and estimation of graphical models, based on the concepts of hyper and meta Markov properties. These state that the parameters should exhibit conditional independencies analogous to those on the sample space. When these properties are satisfied, parameter estimation may be performed locally: the estimators for certain subsets of the graph are determined entirely by the data corresponding to those subsets.

Firstly, I discuss the application of these properties to the analysis of case-control studies. It has long been established that the maximum likelihood estimate of the odds-ratio may be found by logistic regression; in other words, the "incorrect" prospective model is equivalent to the correct retrospective model. I use a generalisation of the hyper Markov properties to identify necessary and sufficient conditions for the corresponding result to hold in a Bayesian analysis, that is, for the posterior distribution of the odds-ratio to be the same under both the prospective and retrospective likelihoods. These conditions are used to derive a parametric family of prior laws suitable for such an analysis.

The second part focuses on the problem of inferring the structure of the underlying graph. I propose an extension of the meta and hyper Markov properties, which I term structural Markov properties, for both undirected decomposable graphs and directed acyclic graphs. Roughly speaking, these require that the structures of distinct components of the graph be conditionally independent given the existence of a separating component. This allows the analysis and comparison of multiple graphical structures, while taking advantage of common conditional independence constraints. Moreover, I show that these properties characterise exponential families, which form conjugate priors under sampling from compatible Markov distributions.
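The local-estimation idea can be illustrated on the simplest decomposable graph. The following sketch (my own illustration, not drawn from the thesis; all names are hypothetical) fits the chain A - B - C, whose cliques are {A, B} and {B, C} with separator {B}: the standard clique-marginal factorisation of the maximum likelihood estimate means each factor is computed from the counts on that subset of variables alone.

```python
# Illustrative sketch: local MLE in a decomposable model on the chain
# A - B - C, where the joint MLE factorises over cliques and separators as
#   p(a, b, c) = p(a, b) * p(b, c) / p(b).
from collections import Counter

def fit_chain_mle(samples):
    """Estimate the joint MLE for the chain A - B - C from (a, b, c) samples.

    Each factor is fitted locally: the clique {A, B} uses only the (a, b)
    margins of the data, and similarly for {B, C} and the separator {B}.
    """
    n = len(samples)
    ab = Counter((a, b) for a, b, _ in samples)   # clique {A, B} counts
    bc = Counter((b, c) for _, b, c in samples)   # clique {B, C} counts
    b_ = Counter(b for _, b, _ in samples)        # separator {B} counts

    def p(a, b, c):
        if b_[b] == 0:
            return 0.0
        # product of clique marginals, divided by the separator marginal
        return (ab[(a, b)] / n) * (bc[(b, c)] / n) / (b_[b] / n)

    return p

# Toy binary data consistent with A independent of C given B.
data = [(0, 0, 0), (0, 0, 1), (1, 0, 0), (1, 1, 1)]
p = fit_chain_mle(data)
total = sum(p(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
```

The point of the sketch is that changing the (b, c) observations cannot alter the estimated factor for the clique {A, B}; this is the locality that the hyper Markov properties guarantee in general.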