
We are very pleased to announce that our paper "Training restricted Boltzmann machines: An introduction" by Asja Fischer and Christian Igel (Pattern Recognition, Volume 47, Issue 1, Jan. 2014, Pages 25-39) was awarded the Pattern Recognition Journal Best Paper Award 2014. The biennial award is given to the best paper published in the journal Pattern Recognition, the official journal of the Pattern Recognition Society.

The idea behind the paper was to provide an accessible introduction to restricted Boltzmann machines (RBMs), probabilistic graphical models that can be interpreted as stochastic neural networks. RBMs have attracted much attention as building blocks of deep learning systems called deep belief networks, and variants and extensions of RBMs have found application in a wide range of pattern recognition tasks. The article introduces RBMs from the viewpoint of Markov random fields (undirected graphical models).

Abstract
Restricted Boltzmann machines (RBMs) are probabilistic graphical models that can be interpreted as stochastic neural networks. They have attracted much attention as building blocks for the multi-layer learning systems called deep belief networks, and variants and extensions of RBMs have found application in a wide range of pattern recognition tasks. This tutorial introduces RBMs from the viewpoint of Markov random fields, starting with the required concepts of undirected graphical models. Different learning algorithms for RBMs, including contrastive divergence learning and parallel tempering, are discussed. As sampling from RBMs, and therefore also most of their learning algorithms, are based on Markov chain Monte Carlo (MCMC) methods, an introduction to Markov chains and MCMC techniques is provided. Experiments demonstrate relevant aspects of RBM training.
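To give a flavor of the contrastive divergence learning mentioned in the abstract, here is a minimal sketch of CD-1 for a small binary RBM: hidden units are sampled given the data, one Gibbs step produces a model sample, and the weight update is the difference between the two sets of statistics. The layer sizes, learning rate, and toy data below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy binary training data: 6 visible units, 4 example vectors
# (purely illustrative, not from the paper).
data = np.array([[1, 1, 1, 0, 0, 0],
                 [1, 0, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1],
                 [0, 0, 0, 1, 0, 1]], dtype=float)

n_visible, n_hidden = 6, 3
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b = np.zeros(n_visible)   # visible biases
c = np.zeros(n_hidden)    # hidden biases
lr = 0.1                  # assumed learning rate

for epoch in range(1000):
    # Positive phase: hidden activations and samples given the data.
    ph = sigmoid(data @ W + c)
    h = (rng.random(ph.shape) < ph).astype(float)
    # Negative phase: one Gibbs step back to the visible layer (CD-1).
    pv = sigmoid(h @ W.T + b)
    v = (rng.random(pv.shape) < pv).astype(float)
    ph2 = sigmoid(v @ W + c)
    # Update: data-driven statistics minus (approximate) model statistics.
    W += lr * (data.T @ ph - v.T @ ph2) / len(data)
    b += lr * (data - v).mean(axis=0)
    c += lr * (ph - ph2).mean(axis=0)

# Deterministic reconstruction probabilities of the training data.
recon = sigmoid(sigmoid(data @ W + c) @ W.T + b)
```

Because CD-1 truncates the Markov chain after a single Gibbs step, it yields only a biased approximation of the log-likelihood gradient; that bias is exactly why the paper also discusses alternatives such as parallel tempering.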

Stay tuned for more news :)