Information Theory

From Coding to Learning

Wu, Yihong; Polyanskiy, Yury

Cambridge University Press

09/2024

550 pages

Hardcover

ISBN: 9781108832908

Pre-order: ships 15 to 20 days after publication.

Contents:

Part I. Information Measures: 1. Entropy; 2. Divergence; 3. Mutual information; 4. Variational characterizations and continuity of information measures; 5. Extremization of mutual information: capacity saddle point; 6. Tensorization and information rates; 7. f-divergences; 8. Entropy method in combinatorics and geometry; 9. Random number generators.

Part II. Lossless Data Compression: 10. Variable-length compression; 11. Fixed-length compression and Slepian-Wolf theorem; 12. Entropy of ergodic processes; 13. Universal compression.

Part III. Hypothesis Testing and Large Deviations: 14. Neyman-Pearson lemma; 15. Information projection and large deviations; 16. Hypothesis testing: error exponents.

Part IV. Channel Coding: 17. Error correcting codes; 18. Random and maximal coding; 19. Channel capacity; 20. Channels with input constraints: Gaussian channels; 21. Capacity per unit cost; 22. Strong converse, channel dispersion, error exponents, finite blocklength; 23. Channel coding with feedback.

Part V. Rate-Distortion Theory and Metric Entropy: 24. Rate-distortion theory; 25. Rate distortion: achievability bounds; 26. Evaluating the rate-distortion function: lossy source-channel separation; 27. Metric entropy.

Part VI. Statistical Applications: 28. Basics of statistical decision theory; 29. Classical large-sample asymptotics; 30. Mutual information method; 31. Lower bounds via reduction to hypothesis testing; 32. Entropic bounds for statistical estimation; 33. Strong data processing inequality.