Perturbation, Optimization and Statistics for Effective Machine Learning

Tamir Hazan, TTI Chicago

Predictions in modern statistical inference problems can increasingly be understood in terms of discrete structures, such as arrangements of objects in computer vision, phonemes in speech recognition, parses in natural language processing, or molecular structures in computational biology. For example, in image scene understanding one needs to jointly predict a discrete semantic label for every pixel, e.g., whether it depicts a person, bicycle, bed, etc. A fully probabilistic treatment considers all possible alternative assignments, which requires estimating exponentially many structures together with their respective weights. To relax this exponential complexity we describe two different approaches: dual decomposition (e.g., convex belief propagation) and prediction under random perturbations. The second approach leads us to a new approximate inference framework based on max-statistics, which capture long-range interactions, in contrast to the current dual decomposition framework, which relies on pseudo-probabilities. We demonstrate the effectiveness of our approaches on a range of complex tasks, outperforming state-of-the-art results in scene understanding, depth estimation, semantic segmentation, and phoneme recognition.
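The idea of prediction under random perturbations can be illustrated with a minimal sketch (not taken from the talk itself): the Gumbel-max trick, in which sampling from a Gibbs distribution over discrete alternatives reduces to a maximization problem once each unnormalized score is perturbed with Gumbel noise. The function names below are illustrative, and the toy three-label example stands in for the exponentially many structures discussed in the abstract.

```python
import numpy as np

def gumbel_max_sample(scores, rng):
    """Draw one sample from softmax(scores) by perturbing each score
    with i.i.d. Gumbel noise and returning the argmax (MAP) index."""
    gumbel_noise = rng.gumbel(size=len(scores))
    return int(np.argmax(scores + gumbel_noise))

def empirical_distribution(scores, n_samples=100_000, seed=0):
    """Estimate the sampling distribution induced by perturb-and-maximize."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(len(scores))
    for _ in range(n_samples):
        counts[gumbel_max_sample(scores, rng)] += 1
    return counts / n_samples

# Toy example: three labels with unnormalized log-weights.
scores = np.array([1.0, 2.0, 0.5])
softmax = np.exp(scores) / np.exp(scores).sum()

# The empirical frequencies of the perturbed maximizer approximate
# the Gibbs (softmax) distribution, without ever normalizing over
# all alternatives explicitly.
print(empirical_distribution(scores))
print(softmax)
```

In structured problems the argmax over perturbed scores is itself a combinatorial optimization (e.g., solved by dynamic programming or graph cuts), which is what makes this reduction from summation to maximization attractive.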

Speaker Biography

Tamir Hazan received his PhD from the Hebrew University of Jerusalem in 2009 and is currently a research assistant professor at TTI Chicago. His research develops efficient methods for reasoning about complex models. His work on random perturbations was presented in the machine learning best papers track at AAAI 2012, and his research also includes the primal-dual norm-product belief propagation algorithm, which received a best paper award at UAI 2008. These techniques currently outperform the state of the art on a range of computer vision and language processing tasks.