Graphical Model

Graphical models (GMs) have been one of the most powerful paradigms for succinctly representing joint probability distributions in various scientific fields, including information theory, statistical physics and artificial intelligence. A GM represents a joint distribution of random variables by a graph-structured model, where each vertex corresponds to a random variable and each edge captures a conditional dependence between random variables. We have worked on solving inference and learning problems arising in GMs and related applications.
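
As a toy illustration of this factorization (a minimal Python sketch with assumed potentials, not taken from any paper below), the joint distribution of a 3-variable pairwise binary GM on a chain can be written down and normalized by brute force:

    import itertools
    import math

    edges = [(0, 1), (1, 2)]                          # a 3-vertex chain
    theta_node = [0.2, -0.1, 0.3]                     # assumed vertex potentials
    theta_edge = {(0, 1): 0.5, (1, 2): -0.4}          # assumed edge potentials

    def unnormalized(x):
        # exp( sum_i theta_i x_i + sum_{(i,j)} theta_ij x_i x_j ), x_i in {-1,+1}
        e = sum(theta_node[i] * x[i] for i in range(3))
        e += sum(t * x[i] * x[j] for (i, j), t in theta_edge.items())
        return math.exp(e)

    states = list(itertools.product([-1, 1], repeat=3))
    Z = sum(unnormalized(x) for x in states)          # partition function (brute force)
    joint = {x: unnormalized(x) / Z for x in states}
    print(sum(joint.values()))                        # ~1.0

Computing Z exactly is only feasible for tiny models; approximating it (or marginals) efficiently is the inference problem addressed below.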

Optimality of BP

Belief Propagation (BP) is the most popular deterministic algorithm for approximating intractable inference in GMs. BP has shown remarkable performance in practice, but its theoretical understanding is not yet fully established. We have found several examples where BP provably guarantees the desired optimality: independent set, matching, shortest path, traveling salesman, cycle packing, vertex/edge cover, network flow and crowd-sourcing.
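
The following is a minimal Python sketch of sum-product BP on a small pairwise binary chain (illustrative only, with assumed toy potentials; it is not one of the algorithms analyzed in the papers below). On a tree such as this chain, the resulting beliefs are the exact marginals:

    import math

    edges = [(0, 1), (1, 2)]                      # a chain, so BP is exact here
    theta_node = [0.2, -0.1, 0.3]                 # assumed vertex potentials
    theta_edge = {(0, 1): 0.5, (1, 2): -0.4}      # assumed edge potentials
    nbrs = {i: [] for i in range(3)}
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)

    def node_pot(i, xi):
        return math.exp(theta_node[i] * xi)

    def edge_pot(i, j, xi, xj):
        t = theta_edge.get((i, j), theta_edge.get((j, i)))
        return math.exp(t * xi * xj)

    # msgs[(i, j)][k]: message from i to j evaluated at x_j = (-1, +1)[k]
    msgs = {(i, j): [1.0, 1.0] for i in nbrs for j in nbrs[i]}
    for _ in range(20):                           # a fixed number of sweeps
        for (i, j) in list(msgs):
            new = []
            for xj in (-1, 1):
                total = 0.0
                for xi in (-1, 1):
                    term = node_pot(i, xi) * edge_pot(i, j, xi, xj)
                    for k in nbrs[i]:
                        if k != j:
                            term *= msgs[(k, i)][(xi + 1) // 2]
                    total += term
                new.append(total)
            s = sum(new)
            msgs[(i, j)] = [v / s for v in new]

    for i in nbrs:                                # beliefs = (approximate) marginals
        b = [node_pot(i, xi) for xi in (-1, 1)]
        for k in nbrs[i]:
            b = [b[m] * msgs[(k, i)][m] for m in range(2)]
        s = sum(b)
        print(i, [v / s for v in b])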

Efficient Learning for Crowdsourced Regression

Jungseul Ok, Sewoong Oh, Jinwoo Shin, Yunhun Jang and Yung Yi
submitted

Optimality of Belief Propagation for Crowdsourced Classification

Jungseul Ok, Sewoong Oh, Jinwoo Shin and Yung Yi
ICML 2016

Minimum Weight Perfect Matching via Blossom Belief Propagation

Sungsoo Ahn, Sejun Park, Michael Chertkov and Jinwoo Shin
NIPS 2015

Max-Product Belief Propagation for Linear Programming: Applications to Combinatorial Optimization

Sejun Park and Jinwoo Shin
UAI 2015

A Graphical Transformation for Belief Propagation: Maximum Weight Matchings and Odd-Sized Cycles

Jinwoo Shin, Andrew E. Gelfand and Michael Chertkov
NIPS 2013

Counting Independent Sets using the Bethe Approximation

Venkat Chandrasekaran, Misha Chertkov, David Gamarnik, Devavrat Shah and Jinwoo Shin
SIAM Journal on Discrete Mathematics, vol. 25, no. 2, pp. 1012-1034, 2011

Fast Mixing MCMC

Markov Chain Monte Carlo (MCMC) is the most popular randomized algorithm for approximating intractable inference in GMs. In contrast to BP, MCMC is an asymptotically exact probabilistic method, which, however, often suffers from (exponentially) slow mixing. We have found several examples and regimes of fast (polynomial-time) mixing MCMC.
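
As a minimal illustration (a toy Python sketch with assumed potentials, not one of the samplers studied below), a single-site Gibbs sampler for the same kind of pairwise binary model looks as follows; how quickly such chains mix is precisely what the results in this section characterize:

    import math
    import random

    edges = [(0, 1), (1, 2)]
    theta_node = [0.2, -0.1, 0.3]                 # assumed vertex potentials
    theta_edge = {(0, 1): 0.5, (1, 2): -0.4}      # assumed edge potentials
    nbrs = {i: [] for i in range(3)}
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)

    def prob_plus(x, i):
        # P(x_i = +1 | rest) under the pairwise model
        field = theta_node[i] + sum(
            theta_edge.get((i, j), theta_edge.get((j, i))) * x[j] for j in nbrs[i])
        return 1.0 / (1.0 + math.exp(-2.0 * field))

    random.seed(0)
    x = [random.choice([-1, 1]) for _ in range(3)]
    counts = [0, 0, 0]
    for t in range(20000):
        i = random.randrange(3)                   # resample one coordinate at a time
        x[i] = 1 if random.random() < prob_plus(x, i) else -1
        if t >= 5000:                             # discard burn-in
            for k in range(3):
                counts[k] += (x[k] == 1)
    print([c / 15000 for c in counts])            # estimated marginals P(x_k = +1)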

Rapid Mixing Swendsen-Wang Sampler for Stochastic Partitioned Attractive Models

Sejun Park, Yunhun Jang, Andreas Galanis, Jinwoo Shin, Daniel Stefankovic and Eric Vigoda
AISTATS 2017

Synthesis of MCMC and Belief Propagation (code)

Sungsoo Ahn, Michael Chertkov, and Jinwoo Shin
NIPS 2016

Improved Mixing Condition on the Grid for Counting and Sampling Independent Sets

Ricardo Restrepo, Jinwoo Shin, Prasad Tetali, and Eric Vigoda
FOCS 2011

Distributed Averaging via Lifted Markov Chains

Kyomin Jung, Devavrat Shah, and Jinwoo Shin
IEEE Transactions on Information Theory, vol. 56, no. 1, pp. 634-647, 2010

Dynamics in Congestion Games

Devavrat Shah and Jinwoo Shin
SIGMETRICS 2010

Advanced Alternatives to BP and MCMC

We have also worked on developing advanced alternatives to BP and MCMC algorithms for improved inference in GMs. First, BP is an iterative algorithm that often diverges; however, a fixed point of the BP iteration always exists. We developed a message-passing algorithm that computes a BP fixed point in a polynomial number of operations for general binary graphical models of n variables whose maximum degree in the underlying graph is O(log n). Our algorithm can be used as an alternative to BP that fixes its convergence issue, and it is the first fully polynomial-time approximation scheme for BP fixed-point computation in such a large class of graphical models, whereas the approximate fixed-point computation is (PPAD-)hard in general.
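
The actual algorithm is described in the paper below; for a generic (and much weaker) illustration, the following Python sketch shows damped fixed-point iteration m <- (1 - alpha) m + alpha F(m), a standard heuristic that often restores convergence when plainly iterating a message-update map F oscillates:

    def damped_fixed_point(F, m0, alpha=0.5, tol=1e-8, max_iter=1000):
        # iterate m <- (1 - alpha) * m + alpha * F(m) until (approximately) fixed
        m = list(m0)
        for _ in range(max_iter):
            fm = F(m)
            new = [(1 - alpha) * mi + alpha * fi for mi, fi in zip(m, fm)]
            if max(abs(a - b) for a, b in zip(new, m)) < tol:
                return new
            m = new
        return m

    # toy usage: F(m) = 0.5 * m + 1 has the fixed point m = 2
    print(damped_fixed_point(lambda m: [0.5 * mi + 1.0 for mi in m], [0.0]))

Unlike such heuristics, the result below gives provable polynomial-time guarantees for the fixed-point computation.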

The Complexity of Approximating a Bethe Equilibrium

Jinwoo Shin
IEEE Transactions on Information Theory, vol. 60, no. 7, pp. 3959-3969, 2014 (preliminary version appeared in AISTATS 2012)

Second, we propose novel frameworks that synthesize the complementary benefits of BP and MCMC. They can also be naturally combined with other popular MCMC techniques, such as annealed/sequential/adaptive importance sampling, to boost performance.
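
As a rough illustration of this idea (a Python sketch with assumed toy potentials and proposal, not the exact framework of the papers below), importance sampling can correct a tractable product-form proposal q, e.g. one built from BP beliefs, toward the true unnormalized model, yielding an unbiased estimate of the partition function:

    import itertools
    import math
    import random

    def p_tilde(x):
        # unnormalized pairwise model on 3 binary spins (assumed toy potentials)
        theta_node = [0.2, -0.1, 0.3]
        theta_edge = {(0, 1): 0.5, (1, 2): -0.4}
        e = sum(theta_node[i] * x[i] for i in range(3))
        e += sum(t * x[i] * x[j] for (i, j), t in theta_edge.items())
        return math.exp(e)

    q_plus = [0.6, 0.45, 0.65]                    # hypothetical proposal: P_q(x_i = +1)

    def sample_q():
        return [1 if random.random() < q_plus[i] else -1 for i in range(3)]

    def q_prob(x):
        p = 1.0
        for i in range(3):
            p *= q_plus[i] if x[i] == 1 else 1.0 - q_plus[i]
        return p

    random.seed(0)
    n = 20000
    weights = []
    for _ in range(n):
        x = sample_q()
        weights.append(p_tilde(x) / q_prob(x))    # importance weight p_tilde / q
    Z_hat = sum(weights) / n                      # unbiased estimate of Z
    Z_exact = sum(p_tilde(x) for x in itertools.product([-1, 1], repeat=3))
    print(Z_hat, Z_exact)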

Gauge Optimization via ADMM for Approximate Inference

Sungsoo Ahn, Michael Chertkov, and Jinwoo Shin
submitted

Synthesis of MCMC and Belief Propagation (code)

Sungsoo Ahn, Michael Chertkov, and Jinwoo Shin
NIPS 2016

Learning Parameters of GM

Learning the parameters of a GM involves computationally intractable inference tasks, for which BP and MCMC can be used as approximate inference routines. Furthermore, learning latent GMs is inherently much harder than learning non-latent ones, since the latent variables make the corresponding log-likelihood non-concave. Nevertheless, Expectation-Maximization (EM) schemes are popular in practice, but they typically get stuck in bad local optima. We have worked on rigorously addressing these issues in learning GMs.
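
As a minimal illustration of the issue (a toy Python sketch, not taken from the papers below), even EM for a two-component 1-D Gaussian mixture, the simplest latent-variable model, optimizes a non-concave log-likelihood, so the answer it returns depends on the initialization:

    import math
    import random

    random.seed(0)
    # synthetic data from two components (assumed means -2 and +2, unit variance)
    data = ([random.gauss(-2, 1) for _ in range(200)]
            + [random.gauss(2, 1) for _ in range(200)])

    def normal_pdf(x, mu):
        return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

    mu = [-0.5, 0.5]                              # initialization; local optima depend on it
    pi = [0.5, 0.5]
    for _ in range(50):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            w = [pi[k] * normal_pdf(x, mu[k]) for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: re-estimate mixing weights and means from the responsibilities
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
    print(pi, mu)                                 # should approach [0.5, 0.5], [-2, +2]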

Sequential Local Learning for Latent Graphical Models

Sejun Park, Eunho Yang and Jinwoo Shin
submitted

Adiabatic Persistent Contrastive Divergence Learning

Hyeryung Jang, Hyungwon Choi, Yung Yi and Jinwoo Shin
submitted

Rapid Mixing Swendsen-Wang Sampler for Stochastic Partitioned Attractive Models

Sejun Park, Yunhun Jang, Andreas Galanis, Jinwoo Shin, Daniel Stefankovic and Eric Vigoda
AISTATS 2017