## Graphical Model

Graphical models (GMs) are a powerful paradigm for succinct representations of joint probability distributions in various scientific fields, including information theory, statistical physics and artificial intelligence. A GM represents a joint distribution over random variables by a graph-structured model, where each vertex corresponds to a random variable and each edge captures a conditional dependence between random variables. We have worked on solving inference and learning problems arising in GMs and related applications.

## Belief Propagation

Computing the partition function or the MAP (maximum a posteriori) assignment is the most important statistical inference task arising in applications of GMs. Belief propagation (BP) is the most popular deterministic algorithm for approximating such intractable computations. BP has shown remarkable performance in practice, but its theoretical understanding is not yet fully established. We have found several combinatorial problems for which BP provably guarantees the desired optimality: independent set, matching, shortest path, traveling salesman, cycle packing, vertex/edge cover, network flow and crowd-sourcing.

- Iterative Bayesian Learning for Crowdsourced Regression
Jungseul Ok, Sewoong Oh, Yunhun Jang, Jinwoo Shin and Yung Yi
working paper
- Synthesis of MCMC and Belief Propagation (code)
Sungsoo Ahn, Michael Chertkov, and Jinwoo Shin
*NIPS* 2016
- Optimality of Belief Propagation for Crowdsourced Classification
Jungseul Ok, Sewoong Oh, Jinwoo Shin and Yung Yi
*ICML* 2016
- Minimum Weight Perfect Matching via Blossom Belief Propagation
Sungsoo Ahn, Sejun Park, Michael Chertkov and Jinwoo Shin
*NIPS* 2015
- Convergence and Correctness of Max-Product Belief Propagation for Linear Programming
Sejun Park and Jinwoo Shin
*UAI* 2015
- A Graphical Transformation for Belief Propagation: Maximum Weight Matchings and Odd-Sized Cycles
Jinwoo Shin, Andrew E. Gelfand and Michael Chertkov
*NIPS* 2013
- The Complexity of Approximating a Bethe Equilibrium
Jinwoo Shin
*AISTATS* 2012
- Counting Independent Sets using the Bethe Approximation
Venkat Chandrasekaran, Misha Chertkov, David Gamarnik, Devavrat Shah and Jinwoo Shin
*SIAM Journal on Discrete Mathematics*, vol. 25, no. 2, pp. 1012-1034, 2011
Besides the above theoretical guarantees, we also provide practical guidelines on the parallel implementation of BP for large-scale problems.

- Practical Message-passing Framework for Large-scale Combinatorial Optimization
Inho Cho, Soya Park, Sejun Park, Dongsu Han, and Jinwoo Shin
*IEEE BigData* 2015
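To make the message-passing idea concrete, here is a minimal, illustrative sum-product BP sketch (a toy, not code from the papers above) on a three-variable chain of binary variables, where BP computes the partition function exactly because the graph is a tree:

```python
import itertools

# Minimal sum-product BP on a chain x0 - x1 - x2 of binary variables.
# psi[(i, j)][xi][xj] is the pairwise potential; on a tree, BP is exact.
psi = {
    (0, 1): [[1.0, 2.0], [3.0, 1.0]],
    (1, 2): [[2.0, 1.0], [1.0, 4.0]],
}

def send(potential, incoming):
    """Message m(x_j) = sum_{x_i} potential[x_i][x_j] * incoming[x_i]."""
    return [sum(potential[xi][xj] * incoming[xi] for xi in (0, 1)) for xj in (0, 1)]

m01 = send(psi[(0, 1)], [1.0, 1.0])   # message x0 -> x1
m12 = send(psi[(1, 2)], m01)          # message x1 -> x2
Z_bp = sum(m12)                       # partition function read off at the root

# Brute-force check over all 2^3 configurations.
Z_exact = sum(
    psi[(0, 1)][x0][x1] * psi[(1, 2)][x1][x2]
    for x0, x1, x2 in itertools.product((0, 1), repeat=3)
)
print(Z_bp, Z_exact)  # 27.0 27.0
```

On loopy graphs the same updates are run iteratively and are only approximate, which is where the optimality guarantees above become nontrivial.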
(edited by Jinwoo Shin, Mar 2017)

## Markov Chain Monte Carlo

Markov chain Monte Carlo (MCMC) is the most popular randomized algorithm for solving intractable computational inference in GMs. In contrast to BP, MCMC is an asymptotically exact probabilistic method, but it often suffers from (exponentially) slow mixing. We found several examples/regimes of fast, i.e., polynomial-time, mixing MCMCs.

- Rapid Mixing Swendsen-Wang Sampler for Stochastic Partitioned Attractive Models
Sejun Park, Yunhun Jang, Andreas Galanis, Jinwoo Shin, Daniel Stefankovic and Eric Vigoda
*AISTATS* 2017
- Synthesis of MCMC and Belief Propagation (code)
Sungsoo Ahn, Michael Chertkov, and Jinwoo Shin
*NIPS* 2016
- Improved Mixing Condition on the Grid for Counting and Sampling Independent Sets
Ricardo Restrepo, Jinwoo Shin, Prasad Tetali, and Eric Vigoda
*FOCS* 2011
- Dynamics in Congestion Games
Devavrat Shah and Jinwoo Shin
*SIGMETRICS* 2010
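As a toy illustration of MCMC for GMs (not code from the papers above), the following Gibbs sampler targets a small ferromagnetic (attractive) Ising model, where each step resamples one spin from its exact conditional; on such small attractive models the chain mixes quickly:

```python
import itertools
import math
import random

random.seed(0)

# Gibbs sampler for a tiny ferromagnetic Ising model on a triangle:
# p(x) proportional to exp(beta * sum_{i<j} x_i * x_j), with x_i in {-1, +1}.
beta = 0.5
edges = [(0, 1), (1, 2), (0, 2)]

def gibbs_step(x):
    """Resample one spin from its exact conditional given the others."""
    i = random.randrange(3)
    field = sum(x[j] for j in range(3) if j != i)   # every pair is an edge
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
    x[i] = 1 if random.random() < p_plus else -1

x = [1, 1, 1]
samples, agree = 0, 0
for t in range(200_000):
    gibbs_step(x)
    if t >= 10_000:                                 # discard burn-in
        samples += 1
        agree += x[0] == x[1]
est = agree / samples                               # estimates P(x0 == x1)

# Exact answer by brute-force enumeration of all 2^3 states.
def weight(x):
    return math.exp(beta * sum(x[i] * x[j] for i, j in edges))

states = list(itertools.product((-1, 1), repeat=3))
Z = sum(weight(s) for s in states)
exact = sum(weight(s) for s in states if s[0] == s[1]) / Z
print(est, exact)
```

The mixing-time results above characterize when such chains reach the target distribution in polynomially many steps, rather than exponentially many.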
(edited by Jinwoo Shin, Mar 2018)

## Variational Inference

Variational inference (VI) is an approach that states the inference task as an optimization problem. Here, the main challenge is to design a tractable optimization formulation without sacrificing too much accuracy. We have worked on developing new optimization schemes that advance prior VI algorithms for improved inference on GMs.

- Bucket Renormalization for Approximate Inference
Sungsoo Ahn, Michael Chertkov, Adrian Weller and Jinwoo Shin
*ICML* 2018
- Gauged Mini-Bucket Elimination for Approximate Inference
Sungsoo Ahn, Michael Chertkov, Jinwoo Shin and Adrian Weller
*AISTATS* 2018
- Gauging Variational Inference
Sungsoo Ahn, Michael Chertkov, and Jinwoo Shin
*NIPS* 2017
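To illustrate the "inference as optimization" view, here is a minimal naive mean-field sketch (a classical toy baseline, not the gauging or renormalization methods above): a fully factorized q(x) is fit by coordinate updates, and its objective (the ELBO) lower-bounds the log-partition function:

```python
import itertools
import math

# Naive mean-field VI for a small ferromagnetic Ising model on a triangle:
# p(x) proportional to exp(beta * sum_{i<j} x_i * x_j), x_i in {-1, +1}.
# Mean-field posits q(x) = prod_i q_i(x_i), parameterized by means m_i, and
# maximizes the ELBO; the stationary condition is m_i = tanh(beta * sum_j m_j).
beta = 1.0
edges = [(0, 1), (1, 2), (0, 2)]

m = [0.5, 0.5, 0.5]          # break symmetry: m = 0 is also a fixed point
for _ in range(200):
    m = [math.tanh(beta * sum(m[j] for j in range(3) if j != i))
         for i in range(3)]

def entropy(mi):
    q = (1.0 + mi) / 2.0
    return -sum(p * math.log(p) for p in (q, 1.0 - q) if p > 0)

# ELBO = E_q[energy] + H(q), a lower bound on log Z.
elbo = beta * sum(m[i] * m[j] for i, j in edges) + sum(entropy(mi) for mi in m)

# Exact log-partition function by enumeration of all 2^3 states.
logZ = math.log(sum(
    math.exp(beta * sum(x[i] * x[j] for i, j in edges))
    for x in itertools.product((-1, 1), repeat=3)
))
print(elbo, logZ)  # the ELBO lower-bounds log Z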
(edited by Jinwoo Shin, Mar 2018)

## Learning Parameters of GM

Learning the parameters of a GM involves computationally intractable inference tasks, where BP and MCMC can be used for approximate inference. Furthermore, learning a latent GM is inherently much harder than learning a non-latent one, since the latent variables make the corresponding log-likelihood non-concave. Nevertheless, Expectation-Maximization (EM) schemes are popular in practice, although they typically get stuck in bad local optima. We have worked on rigorously relaxing these issues of learning GMs.

- Sequential Local Learning for Latent Graphical Models
Sejun Park, Eunho Yang and Jinwoo Shin
working paper
- Adiabatic Persistent Contrastive Divergence Learning
Hyeryung Jang, Hyungwon Choi, Yung Yi and Jinwoo Shin
*ISIT* 2017
- Rapid Mixing Swendsen-Wang Sampler for Stochastic Partitioned Attractive Models
Sejun Park, Yunhun Jang, Andreas Galanis, Jinwoo Shin, Daniel Stefankovic and Eric Vigoda
*AISTATS* 2017
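As a minimal illustration of EM on a latent model (a textbook toy, not the methods in the papers above), the sketch below fits a two-component Bernoulli mixture: the latent z picks a coin, we observe only head counts, and EM alternates soft assignments (E-step) with parameter reestimation (M-step) to climb the non-concave log-likelihood:

```python
import random

random.seed(1)

# Two-component Bernoulli mixture: latent z in {0, 1} picks a coin with bias
# true_p[z]; we observe the number of heads h in n flips of that coin.
true_p = [0.2, 0.8]
n = 20
data = []
for _ in range(500):
    z = random.randrange(2)
    data.append(sum(random.random() < true_p[z] for _ in range(n)))

pi, p = [0.5, 0.5], [0.3, 0.6]          # initial guess for weights and biases
for _ in range(100):
    # E-step: responsibilities r[k] for each head count h (the binomial
    # coefficient cancels, so it is omitted from the weights).
    resp = []
    for h in data:
        w = [pi[k] * p[k] ** h * (1 - p[k]) ** (n - h) for k in range(2)]
        s = w[0] + w[1]
        resp.append([w[0] / s, w[1] / s])
    # M-step: reestimate mixing weights and biases from the soft counts.
    for k in range(2):
        nk = sum(r[k] for r in resp)
        pi[k] = nk / len(data)
        p[k] = sum(r[k] * h for r, h in zip(resp, data)) / (nk * n)

print(sorted(p))  # recovered biases, close to the true [0.2, 0.8]
```

With a poor initialization (e.g. both biases equal), the same updates stall at a symmetric local optimum, which is exactly the failure mode the works above aim to mitigate.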
(edited by Jinwoo Shin, Mar 2017)