3C. Bayesian Inference for Graphical Models

16:50 - 18:05, Aula 11


Organizer: Antonino Abbruzzo

Chair: Antonino Abbruzzo


Log-likelihood approximation in Stochastic EM for Multilevel Latent Class Models


Silvia Columbu, Nicola Piras and Jeroen K. Vermunt


Abstract: Multilevel cross-classified Latent Class Models extend standard latent class models to data in which each observation is simultaneously nested within two groups. The likelihood associated with the model is intractable, and approximation methods such as stochastic versions of the EM algorithm can be applied. A final estimate of the log-likelihood is helpful for evaluating parameter estimates and for model selection purposes. We propose two alternative log-likelihood approximation procedures and test their performance on the Hierarchical Multilevel Latent Class Model, for which a final estimate of the likelihood is provided through a special version of the EM algorithm.


MCMC Sampling in Bayesian Gaussian Structure Learning


Antonino Abbruzzo, Nicola Argentino, Reza Mohammadi, Maria De Iorio, Willem van den Boom and Alexandros Beskos


Abstract: This work discusses structure learning algorithms for Bayesian Gaussian graphical models. The challenge lies in selecting and estimating the precision matrix to determine the graph structure. An algorithm based on a birth-death process within an MCMC sampler (BD-MCMC) was introduced in [2]. This algorithm combines continuous and discrete Markov processes; however, such a combination may not guarantee convergence to the stationary distribution. To overcome this, we propose a strategy that discretises the continuous component of the birth-death MCMC. A simulation study shows that this procedure helps in some extreme cases where the BD-MCMC fails, and at the very least matches its performance in the other scenarios.


Large-scale Bayesian Structure Learning for Gaussian Graphical Models using Marginal Pseudo-likelihood


Reza Mohammadi, Marit Schoonhoven, Lucas Vogels and Ş. İlker Birbil


Abstract: Bayesian methods for learning Gaussian graphical models offer a robust framework that accounts for model uncertainty and incorporates prior knowledge. Despite their theoretical strengths, the applicability of Bayesian methods is often constrained by computational demands, especially in modern contexts involving thousands of variables. To overcome this issue, we introduce two novel Markov chain Monte Carlo (MCMC) search algorithms with a significantly lower computational cost than leading Bayesian approaches. Our proposed MCMC-based search algorithms use the marginal pseudo-likelihood approach to bypass the complexities of computing intractable normalizing constants and of iterative precision matrix sampling.



A work by Gianluca Sottile

(on behalf of the local organizing committee)