## Graphical Model

The graphical model (GM) has been one of the most powerful paradigms for succinctly representing joint probability distributions in various scientific fields, including information theory, statistical physics and artificial intelligence. A GM represents a joint distribution over random variables by a graph-structured model in which each vertex corresponds to a random variable and each edge captures a conditional dependence between random variables. We have worked on solving inference and learning problems arising in GMs and related applications.

## Optimality of BP for Combinatorial Optimization

Belief Propagation (BP) is the most popular deterministic algorithm for solving intractable computational inference problems in GMs. BP has shown remarkable performance in practice, but its theoretical understanding is not yet fully established. We have found several combinatorial problems for which BP provably guarantees the desired optimality: independent set, matching, shortest path, traveling salesman, cycle packing, vertex/edge cover, network flow and crowdsourcing.

- Efficient Learning for Crowdsourced Regression
Jungseul Ok, Sewoong Oh, Jinwoo Shin, Yunhun Jang and Yung Yi
Working paper
- Optimality of Belief Propagation for Crowdsourced Classification
Jungseul Ok, Sewoong Oh, Jinwoo Shin and Yung Yi
*ICML* 2016
- Convergence and Correctness of Max-Product Belief Propagation for Linear Programming
Sejun Park and Jinwoo Shin
*SIAM Journal on Discrete Mathematics*, vol. 31, no. 3, pp. 2228-2236, 2017 (preliminary version appeared in *UAI* 2015)
- Counting Independent Sets using the Bethe Approximation
Venkat Chandrasekaran, Misha Chertkov, David Gamarnik, Devavrat Shah and Jinwoo Shin
*SIAM Journal on Discrete Mathematics*, vol. 25, no. 2, pp. 1012-1034, 2011
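To make the message-passing iterations studied above concrete, here is a minimal max-product BP sketch on a binary chain, where BP is exact because the graph is a tree. All potentials below are made-up illustrative numbers, not taken from any of the papers.

```python
import itertools

# Max-product BP on a binary chain x1 - x2 - x3 (a tree, so BP is exact).
# Potentials are illustrative, not from the papers above.
unary = [[0.6, 0.4], [0.5, 0.5], [0.3, 0.7]]          # phi_i(x_i)
pair = [[[1.0, 0.5], [0.5, 1.0]],                     # psi_12(x1, x2)
        [[1.0, 0.5], [0.5, 1.0]]]                     # psi_23(x2, x3)

def score(x):
    s = 1.0
    for i in range(3):
        s *= unary[i][x[i]]
    for i in range(2):
        s *= pair[i][x[i]][x[i + 1]]
    return s

# Forward max-product messages m_{i -> i+1}(x_{i+1})
msg_fwd = [[1.0, 1.0]]
for i in range(2):
    m = [max(unary[i][xi] * msg_fwd[i][xi] * pair[i][xi][xj] for xi in (0, 1))
         for xj in (0, 1)]
    msg_fwd.append(m)

# Backward messages m_{i+1 -> i}(x_i)
msg_bwd = [None, None, [1.0, 1.0]]
for i in (1, 0):
    m = [max(unary[i + 1][xj] * msg_bwd[i + 1][xj] * pair[i][xi][xj] for xj in (0, 1))
         for xi in (0, 1)]
    msg_bwd[i] = m

# MAP assignment from the max-marginals at each variable
map_bp = tuple(max((0, 1), key=lambda xi: unary[i][xi] * msg_fwd[i][xi] * msg_bwd[i][xi])
               for i in range(3))

# Brute-force check: on a tree, max-product BP recovers the exact MAP.
map_bf = max(itertools.product((0, 1), repeat=3), key=score)
print(map_bp, map_bf)
```

On loopy graphs such as those arising from matching or vertex cover, the same iteration is no longer guaranteed to be correct, which is exactly the gap the optimality results above address.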
In particular, for the minimum weight matching problem, we developed sequential, cutting-plane-like approaches that force the optimality of BP.

- Minimum Weight Perfect Matching via Blossom Belief Propagation
Sungsoo Ahn, Sejun Park, Michael Chertkov and Jinwoo Shin
*NIPS* 2015
- A Graphical Transformation for Belief Propagation: Maximum Weight Matchings and Odd-Sized Cycles
Jinwoo Shin, Andrew E. Gelfand and Michael Chertkov
*NIPS* 2013
Besides the above theoretical guarantees, we also provide practical guidelines for the parallel implementation of BP on large-scale problems.

- Practical Message-passing Framework for Large-scale Combinatorial Optimization
Inho Cho, Soya Park, Sejun Park, Dongsu Han, and Jinwoo Shin
*IEEE BigData* 2015
(edited by Jinwoo Shin, Mar 2017)

## Fast Mixing MCMC

Markov Chain Monte Carlo (MCMC) is the most popular randomized algorithm for solving intractable computational inference problems in GMs. In contrast to BP, MCMC is an exact probabilistic method which, however, often suffers from (exponentially) slow mixing. We have found several examples/regimes of fast (polynomial-time) mixing MCMC.

- Rapid Mixing Swendsen-Wang Sampler for Stochastic Partitioned Attractive Models
Sejun Park, Yunhun Jang, Andreas Galanis, Jinwoo Shin, Daniel Stefankovic and Eric Vigoda
*AISTATS* 2017
- Synthesis of MCMC and Belief Propagation (code)
Sungsoo Ahn, Michael Chertkov, and Jinwoo Shin
*NIPS* 2016
- Improved Mixing Condition on the Grid for Counting and Sampling Independent Sets
Ricardo Restrepo, Jinwoo Shin, Prasad Tetali, and Eric Vigoda
*FOCS* 2011
- Distributed Averaging via Lifted Markov Chains
Kyomin Jung, Devavrat Shah, and Jinwoo Shin
*IEEE Transactions on Information Theory*, vol. 56, no. 1, pp. 634-647, 2010
- Dynamics in Congestion Games
Devavrat Shah and Jinwoo Shin
*SIGMETRICS* 2010
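For intuition on what such samplers do, here is a single-site Gibbs sampler for a tiny attractive Ising model, the simplest MCMC for GMs. This is only an illustration of the general scheme with made-up parameters; the Swendsen-Wang sampler analyzed above instead flips whole clusters of spins jointly, which is what makes it mix rapidly in the attractive regime.

```python
import itertools
import math
import random

# Single-site Gibbs sampler for an attractive Ising model on a 2x2 grid.
# Illustrative toy, not the Swendsen-Wang sampler from the paper above.
random.seed(0)
edges = [(0, 1), (2, 3), (0, 2), (1, 3)]
nbrs = {0: [1, 2], 1: [0, 3], 2: [3, 0], 3: [2, 1]}
beta = 0.4                                    # attractive (ferromagnetic) coupling

def weight(x):
    return math.exp(beta * sum(x[i] * x[j] for i, j in edges))

# Exact edge-agreement probability P(x0 = x1) by enumeration (only 4 spins).
states = list(itertools.product((-1, 1), repeat=4))
Z = sum(weight(x) for x in states)
exact = sum(weight(x) for x in states if x[0] == x[1]) / Z

# Gibbs sweeps: resample each spin from its conditional given its neighbors.
x = [1, 1, 1, 1]
count, kept = 0, 0
for sweep in range(20000):
    for i in range(4):
        field = beta * sum(x[j] for j in nbrs[i])
        p_plus = 1.0 / (1.0 + math.exp(-2.0 * field))   # P(x_i = +1 | rest)
        x[i] = 1 if random.random() < p_plus else -1
    if sweep >= 1000:                         # discard burn-in
        kept += 1
        count += (x[0] == x[1])
estimate = count / kept
print(round(exact, 3), round(estimate, 3))
```

The estimate matches the exact value here because this 4-spin chain mixes quickly; the mixing-time results above characterize when such agreement is reached in polynomial time on large models.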
(edited by Jinwoo Shin, Mar 2017)

## Improved Approximate Inference

We have also worked on developing advanced alternatives to BP for improved inference on GMs. BP is an iterative algorithm and often diverges; however, a fixed point of the BP iteration always exists. We developed a message-passing algorithm that computes a BP fixed point in a polynomial number of operations for general binary graphical models of n variables whose underlying graph has maximum degree O(log n). Our algorithm can be used as an alternative to BP that fixes its convergence issue, and it is the first fully polynomial-time approximation scheme for the BP fixed-point computation in such a large class of graphical models, while the approximate fixed-point computation is known to be (PPAD-)hard in general.

- The Complexity of Approximating a Bethe Equilibrium
Jinwoo Shin
*IEEE Transactions on Information Theory*, vol. 60, no. 7, pp. 3959-3969, 2014 (preliminary version appeared in *AISTATS* 2012)
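To see what a "BP fixed point" is, here is the plain damped sum-product iteration on a binary 3-cycle, the smallest loopy GM. This is the standard iteration whose fixed points are the object of study above, not the paper's polynomial-time algorithm; the couplings and fields are made-up numbers.

```python
import math

# Damped sum-product BP on a binary 3-cycle Ising model (illustrative numbers).
nbrs = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
J, h, damp = 0.5, [0.3, -0.2, 0.1], 0.5

def update(msgs, i, j):
    # m_{i->j}(x_j) = sum_{x_i} phi_i(x_i) psi(x_i,x_j) prod_{k != j} m_{k->i}(x_i)
    out = {}
    for xj in (-1, 1):
        out[xj] = sum(math.exp(h[i] * xi + J * xi * xj) *
                      math.prod(msgs[(k, i)][xi] for k in nbrs[i] if k != j)
                      for xi in (-1, 1))
    z = out[-1] + out[1]
    return {s: out[s] / z for s in (-1, 1)}

m = {(i, j): {-1: 0.5, 1: 0.5} for i in nbrs for j in nbrs[i]}
for it in range(1000):
    new = {k: update(m, *k) for k in m}
    residual = max(abs(new[k][s] - m[k][s]) for k in m for s in (-1, 1))
    m = {k: {s: damp * m[k][s] + (1 - damp) * new[k][s] for s in (-1, 1)} for k in m}
    if residual < 1e-12:
        break

# At a fixed point, re-applying the BP update leaves every message unchanged.
gap = max(abs(update(m, *k)[s] - m[k][s]) for k in m for s in (-1, 1))

# Belief (approximate marginal) at node 0 from its incoming messages.
b = {s: math.exp(h[0] * s) * math.prod(m[(k, 0)][s] for k in nbrs[0]) for s in (-1, 1)}
b0 = b[1] / (b[-1] + b[1])
print(gap, round(b0, 3))
```

On this weakly coupled cycle the damped iteration happens to converge; the hardness result above concerns models where no such straightforward iteration is guaranteed to find a fixed point.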
We also propose novel frameworks that improve approximate inference schemes by utilizing the so-called gauge transformation, which modifies the factors of a GM while keeping the inference task invariant.

- Gauged Mini-Bucket Elimination for Approximate Inference
Sungsoo Ahn, Michael Chertkov, Jinwoo Shin and Adrian Weller
*AISTATS* 2018
- Gauging Variational Inference
Sungsoo Ahn, Michael Chertkov, and Jinwoo Shin
*NIPS* 2017
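The invariance at the heart of the gauge transformation can be checked numerically on a toy example: inserting G and G^{-1} on an edge of the factor graph changes the factors but not the partition function Z. The factors and gauge matrix below are arbitrary illustrative numbers.

```python
import itertools

# Gauge transformation in a nutshell: insert G * G^{-1} on the x2 edge of a
# two-factor chain over binary variables. The factors change; Z does not.
f1 = [[1.0, 2.0], [0.5, 1.5]]        # f1[x1][x2], illustrative numbers
f2 = [[2.0, 0.3], [1.0, 4.0]]        # f2[x2][x3]

def Z(fa, fb):
    return sum(fa[x1][x2] * fb[x2][x3]
               for x1, x2, x3 in itertools.product((0, 1), repeat=3))

# Any invertible 2x2 gauge matrix G and its inverse.
G = [[1.0, 0.5], [0.2, 1.0]]
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
Ginv = [[G[1][1] / det, -G[0][1] / det],
        [-G[1][0] / det, G[0][0] / det]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

g1 = matmul(f1, G)                   # gauged factors: f1 -> f1 G
g2 = matmul(Ginv, f2)                #                 f2 -> G^{-1} f2

print(Z(f1, f2), Z(g1, g2))          # identical up to floating-point error
```

The frameworks above exploit this freedom by optimizing over gauge choices so that the transformed factors yield tighter approximate inference.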
(edited by Jinwoo Shin, Sep 2017)

## Learning Parameters of GM

Learning the parameters of a GM involves computationally intractable inference tasks, for which BP and MCMC can be used as approximate inference schemes. Furthermore, learning a latent GM is inherently much harder than learning a non-latent one, since the latent variables make the corresponding log-likelihood non-concave. Nevertheless, Expectation-Maximization (EM) schemes are popular in practice, although they typically get stuck in bad local optima. We have worked on rigorously resolving these issues in learning GMs.

- Sequential Local Learning for Latent Graphical Models
Sejun Park, Eunho Yang and Jinwoo Shin
Working paper
- Adiabatic Persistent Contrastive Divergence Learning
Hyeryung Jang, Hyungwon Choi, Yung Yi and Jinwoo Shin
*ISIT* 2017
- Rapid Mixing Swendsen-Wang Sampler for Stochastic Partitioned Attractive Models
Sejun Park, Yunhun Jang, Andreas Galanis, Jinwoo Shin, Daniel Stefankovic and Eric Vigoda
*AISTATS* 2017
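The EM scheme mentioned above can be sketched on the simplest latent GM, a 1D mixture of two unit-variance Gaussians with equal weights and unknown means. This toy (all numbers made up) shows EM's defining property, a monotonically increasing log-likelihood, and also why initialization matters: EM is a local search, so a bad starting point can land in a bad local optimum.

```python
import math
import random

# EM for a two-component 1D Gaussian mixture (unit variance, equal weights).
# Illustrative toy, not an algorithm from the papers above.
random.seed(1)
true_means = (-2.0, 2.0)
data = [random.gauss(random.choice(true_means), 1.0) for _ in range(500)]

def loglik(mu):
    c = math.sqrt(2 * math.pi)
    return sum(math.log(0.5 * math.exp(-0.5 * (x - mu[0]) ** 2) / c
                        + 0.5 * math.exp(-0.5 * (x - mu[1]) ** 2) / c)
               for x in data)

mu = [-0.5, 0.5]                     # initialization: EM is a local search
lls = [loglik(mu)]
for _ in range(50):
    # E-step: posterior responsibility of component 0 for each point
    r = [1.0 / (1.0 + math.exp(0.5 * ((x - mu[0]) ** 2 - (x - mu[1]) ** 2)))
         for x in data]
    # M-step: responsibility-weighted means
    mu = [sum(ri * x for ri, x in zip(r, data)) / sum(r),
          sum((1 - ri) * x for ri, x in zip(r, data)) / sum(1 - ri for ri in r)]
    lls.append(loglik(mu))

print([round(v, 2) for v in sorted(mu)])
```

With well-separated components and this symmetric initialization EM recovers the true means, but the log-likelihood is non-concave, and the local-learning and persistent-divergence approaches above target exactly the regimes where plain EM fails.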
(edited by Jinwoo Shin, Mar 2017)