"Information Theory, Inference & Learning Algorithms" - Information and Links:

Information Theory, Inference & Learning Algorithms - Info and Reading Options

Book cover: "Information Theory, Inference & Learning Algorithms" (via Open Library).

"Information Theory, Inference & Learning Algorithms" was published by Cambridge University Press in 2003 - Cambridge, UK, it has 640 pages and the language of the book is English.


“Information Theory, Inference & Learning Algorithms” Metadata:

  • Title: Information Theory, Inference & Learning Algorithms
  • Author: David J. C. MacKay
  • Language: English
  • Number of Pages: 640
  • Publisher: Cambridge University Press
  • Publish Date: 2003
  • Publish Location: Cambridge, UK

Edition Specifications:

  • Format: Hardcover
  • Weight: 3.3 pounds
  • Dimensions: 9.8 x 7.6 x 1.3 inches
  • Pagination: xii, 628 p.

"Information Theory, Inference & Learning Algorithms" Table Of Contents:

  • 1- Introduction to information theory
  • 2- Probability, entropy, and inference
  • 3- More about inference
  • 4- Data Compression
  • 5- The source coding theorem
  • 6- Symbol codes
  • 7- Stream codes
  • 8- Codes for integers
  • 9- Noisy-Channel Coding
  • 10- Correlated random variables
  • 11- Communication over a noisy channel
  • 12- The noisy-channel coding theorem
  • 13- Error-correcting codes and real channels
  • 14- Further Topics in Information Theory
  • 15- Hash codes: codes for efficient information retrieval
  • 16- Binary codes
  • 17- Very good linear codes exist
  • 18- Further exercises on information theory
  • 19- Message passing
  • 20- Communication over constrained noiseless channels
  • 21- An aside: crosswords and codebreaking
  • 22- Why have sex? Information acquisition and evolution
  • 23- Probabilities and Inference
  • 24- An example inference task: clustering
  • 25- Exact inference by complete enumeration
  • 26- Maximum likelihood and clustering
  • 27- Useful probability distributions
  • 28- Exact marginalization
  • 29- Exact marginalization in trellises
  • 30- Exact marginalization in graphs
  • 31- Laplace's method
  • 32- Model comparison and Occam's razor
  • 33- Monte Carlo methods
  • 34- Efficient Monte Carlo methods
  • 35- Ising models
  • 36- Exact Monte Carlo sampling
  • 37- Variational methods
  • 38- Independent component analysis and latent variable modelling
  • 39- Random inference topics
  • 40- Decision theory
  • 41- Bayesian inference and sampling theory
  • 42- Neural Networks
  • 43- Introduction to neural networks
  • 44- The single neuron as a classifier
  • 45- Capacity of a single neuron
  • 46- Learning as inference
  • 47- Hopfield networks
  • 48- Boltzmann machines
  • 49- Supervised learning in multilayer networks
  • 50- Gaussian processes
  • 51- Deconvolution
  • 52- Sparse Graph Codes
  • 53- Low-density parity-check codes
  • 54- Convolutional codes and turbo codes
  • 55- Repeat-accumulate codes
  • 56- Digital fountain codes
  • 57- Appendices
  • 58- Notation
  • 59- Some physics
  • 60- Some mathematics

Snippets and Summary:

In this chapter we discuss how to measure the information content of the outcome of a random experiment.
You cannot do inference without making assumptions.
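
The first snippet points at the book's central quantity: the information content of a random outcome. As a minimal sketch (not code from the book), the following assumes the standard Shannon definitions that the opening chapters build on: h(x) = log2(1/P(x)) for a single outcome, and entropy H(X) as its average over outcomes.

    # Minimal sketch of Shannon information content and entropy,
    # assuming the standard base-2 definitions (not the book's own code).
    import math

    def information_content(p: float) -> float:
        """h(x) = log2(1/p): the 'surprise' of an outcome with probability p."""
        return math.log2(1.0 / p)

    def entropy(probs: list[float]) -> float:
        """H(X) = sum over x of p(x) * log2(1/p(x)): average information content."""
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0.0)

    # A coin that lands heads with probability 0.9:
    print(information_content(0.9))  # ~0.15 bits: a likely outcome is unsurprising
    print(information_content(0.1))  # ~3.32 bits: a rare outcome is informative
    print(entropy([0.9, 0.1]))       # ~0.47 bits per flip on average

Rare outcomes carry more information than likely ones, which is why the biased coin above yields less than one bit per flip on average.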

Read “Information Theory, Inference & Learning Algorithms”:

Read “Information Theory, Inference & Learning Algorithms” by choosing from the options below.

Search for “Information Theory, Inference & Learning Algorithms” downloads:

Visit our Downloads Search page to see if downloads are available.

Find “Information Theory, Inference & Learning Algorithms” in Libraries Near You:

Read or borrow “Information Theory, Inference & Learning Algorithms” from your local library.

Buy “Information Theory, Inference & Learning Algorithms” online:

Shop for “Information Theory, Inference & Learning Algorithms” on popular online marketplaces.