Survey on Concepts behind and Applications of the Information Bottleneck Theory



Abstract

This paper introduces the ideas of unsupervised machine learning as motivation for presenting the underlying information-theoretic theory from the perspective of channel coding, and then transitions into a discussion of extensions of information bottleneck ideas and algorithms to applications. In particular, several examples of the optimization models applied to toy problems demonstrate their usage and the equivalences between the models. Though these examples only scratch the surface of the full range of ideas and implementations explored in the paper, they provide motivation and background for such extensions. The paper concludes with an overview of applications of information-theoretic concepts to learning techniques and data sets, relating various modern techniques to the information bottleneck and proposing several ideas for implementation.
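For readers unfamiliar with the term, the information bottleneck objective referenced throughout (due to Tishby, Pereira, and Bialek) seeks a compressed representation T of a source X that retains information about a relevant variable Y. A standard statement of the Lagrangian form is sketched below; the notation here is for reference and may differ slightly from that used in the paper.

```latex
% Standard information bottleneck Lagrangian (Tishby, Pereira, and Bialek):
% minimize, over stochastic encoders p(t|x), the mutual information kept about
% the source X while retaining (weighted by beta) information about Y.
\min_{p(t \mid x)} \; \mathcal{L}\bigl[p(t \mid x)\bigr] = I(X;T) - \beta\, I(T;Y)
```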

Final Paper and Code

Please contact Caleb Chuck (caleb_chuck AT yahoo DOT com) for further information.