Combinatorial Algorithms with Applications in Learning Graphical Models

Lecturer: Juho-Kustaa Kangas
Event type: Doctoral dissertation
Respondent: Juho-Kustaa Kangas
Opponent: James Cussens, Senior Lecturer, University of York
Custos: Professor Petri Myllymäki, University of Helsinki
Event time: 2016-12-09 14:00 to 17:00
Place: Exactum, Auditorium CK112, Gustaf Hällströmin katu 2b, Helsinki
Description: 

Abstract

Graphical models are a framework for representing joint distributions over random variables. By capturing the structure of conditional independencies between the variables, a graphical model can express the distribution in a concise factored form that is often efficient to store and reason about.
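As a standard illustration of such a factored form (not specific to this thesis): a Bayesian network over variables X_1, ..., X_n with parent sets Pa(X_i) represents the joint distribution as

    P(X_1, \dots, X_n) = \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{Pa}(X_i)\bigr),

so each variable needs to be modeled only together with its typically few parents rather than with all other variables.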

As constructing graphical models by hand is often infeasible, much work has been devoted to learning them automatically from observational data. Of particular interest is the so-called structure learning problem of finding a graph that encodes the structure of probabilistic dependencies. Once the learner has decided what constitutes a good fit to the data, finding optimal structures typically requires solving an NP-hard combinatorial optimization problem. While the first algorithms for structure learning thus resorted to local search, there has been growing interest in solving the problem to a global optimum. Indeed, during the past decade multiple exact algorithms have been proposed that are guaranteed to find optimal structures for the family of Bayesian networks, while the first steps have been taken for the family of decomposable graphical models.
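In the common score-based formulation (given here only as an illustrative sketch), the learner fixes a scoring function and then seeks

    G^{*} = \arg\max_{G \in \mathcal{G}} \; \mathrm{score}(G; D),

where \mathcal{G} is the family of candidate structures (for instance, directed acyclic graphs) and D is the data. For decomposable scores the objective splits into per-variable local terms, but the number of candidate structures grows super-exponentially in the number of variables, and finding a maximizer is NP-hard in general, as noted above.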

This thesis presents combinatorial algorithms and analytical results with applications in the structure learning problem. For decomposable models, we present exact algorithms for the so-called full Bayesian approach, which involves not only finding individual structures of good fit but also computing posterior expectations of graph features, either by exact computation or via Monte Carlo methods.
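In general terms (the precise setting of the thesis may differ), the full Bayesian approach asks for posterior expectations of the form

    E[f \mid D] = \sum_{G} f(G)\, P(G \mid D),

where f is a graph feature of interest, such as the presence of a particular edge, and the sum runs over all structures in the model family. A Monte Carlo estimate replaces the sum with an average (1/N) \sum_{t=1}^{N} f(G_t) over samples G_t drawn from the posterior P(G \mid D).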

For Bayesian networks, we study the empirical hardness of the structure learning problem, with the aim of predicting the running time of various structure learning algorithms on a given problem instance. As a result, we obtain a hybrid algorithm that effectively combines the best-case performance of multiple existing techniques.

Lastly, we study two combinatorial problems of wider interest with relevance in structure learning. First, we present algorithms for counting linear extensions of partially ordered sets, which is required to correct bias in MCMC methods for sampling Bayesian network structures. Second, we give results in the extremal combinatorics of connected vertex sets, whose number bounds the running time of certain algorithms for structure learning and various other problems.
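To make the first of these problems concrete, the following is a minimal sketch (not the algorithm developed in the thesis) that counts linear extensions of a small poset by dynamic programming over subsets; the function name and input format are illustrative assumptions, and the approach is feasible only for small ground sets.

```python
from functools import lru_cache

def count_linear_extensions(n, relations):
    """Count linear extensions of a poset on elements 0..n-1.

    `relations` is a list of pairs (a, b) meaning "a must come before b".
    Dynamic programming over bitmask-encoded subsets: practical only for
    small n (roughly n <= 20).
    """
    # preds[y]: bitmask of elements required to appear before y.
    preds = [0] * n
    for a, b in relations:
        preds[b] |= 1 << a

    full = (1 << n) - 1

    @lru_cache(maxsize=None)
    def extensions(placed):
        # placed: bitmask of elements already laid out in the linear order.
        if placed == full:
            return 1
        total = 0
        for y in range(n):
            bit = 1 << y
            # y may come next if it is unplaced and all its predecessors are placed.
            if not (placed & bit) and (preds[y] & ~placed) == 0:
                total += extensions(placed | bit)
        return total

    return extensions(0)

# Example: the poset with relations 0<2, 1<2, 1<3 has 5 linear extensions.
print(count_linear_extensions(4, [(0, 2), (1, 2), (1, 3)]))
```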

