Document worth reading: “Divergence, Entropy, Information: An Opinionated Introduction to Information Theory”

Information theory is a mathematical theory of learning with deep connections to fields as diverse as artificial intelligence, statistical physics, and biological evolution. Many primers on the subject paint a broad picture with relatively little mathematical sophistication, while many others develop specific application areas in detail. In contrast, these informal notes aim to outline some elements of the information-theoretic 'way of thinking,' by cutting a brisk and interesting path through some of the theory's foundational concepts and theorems. We take the Kullback-Leibler divergence as our foundational concept, and then proceed to develop the entropy and mutual information. We discuss a few of the main foundational results, including the Chernoff bounds as a characterization of the divergence; Gibbs' Theorem; and the Data Processing Inequality. A recurring theme is that the definitions of information theory support natural theorems that sound 'obvious' when translated into English. More pithily, 'information theory makes common sense precise.' Since the focus of the notes is not primarily on technical details, proofs are provided only where the relevant techniques are illustrative of broader themes. Otherwise, proofs and intriguing tangents are referenced in liberally sprinkled footnotes. The notes close with a highly non-exhaustive list of references to texts and other perspectives on the field.
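The following is not from the notes themselves, but a minimal sketch of the progression the abstract describes: taking the Kullback-Leibler divergence as the starting point and recovering entropy and mutual information from it, for discrete distributions. The function names and the toy joint distribution are illustrative choices of this post, not the authors'.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between discrete distributions, in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing to the sum
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def entropy(p):
    """Shannon entropy via the divergence from the uniform distribution:
    H(p) = log|X| - D(p || uniform)."""
    n = len(p)
    return np.log(n) - kl_divergence(p, np.full(n, 1.0 / n))

def mutual_information(joint):
    """Mutual information I(X;Y) = D(P_XY || P_X P_Y), with the joint pmf as a 2-D array."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)  # marginal of X
    py = joint.sum(axis=0, keepdims=True)  # marginal of Y
    return kl_divergence(joint.ravel(), (px * py).ravel())

# Toy example: two correlated binary variables.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(entropy([0.5, 0.5]))        # log 2 ~ 0.693 nats: a fair coin is maximally uncertain
print(mutual_information(joint))  # > 0, consistent with Gibbs' inequality D(p || q) >= 0
```

In this framing, entropy and mutual information are just divergences in disguise, which is why results such as Gibbs' Theorem and the Data Processing Inequality read as 'obvious' statements once translated into English.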