Information-Theoretic Methods in Data Science
Editors: Rodrigues, Miguel R. D.; Eldar, Yonina C.
Publisher: Cambridge University Press
Publication date: 04/2021
Pages: 560
Binding: Hardcover
Language: English
ISBN: 9781108427135
Description not available.
Contents:
1. Introduction (Miguel Rodrigues, Stark Draper, Waheed Bajwa and Yonina Eldar)
2. An information-theoretic approach to analog-to-digital compression (Alon Kipnis, Yonina Eldar and Andrea Goldsmith)
3. Compressed sensing via compression codes (Shirin Jalali and H. Vincent Poor)
4. Information-theoretic bounds on sketching (Mert Pilanci)
5. Sample complexity bounds for dictionary learning from vector- and tensor-valued data (Zahra Shakeri, Anand Sarwate and Waheed Bajwa)
6. Uncertainty relations and sparse signal recovery (Erwin Riegler and Helmut Bölcskei)
7. Understanding phase transitions via mutual information and MMSE (Galen Reeves and Henry Pfister)
8. Computing choice: learning distributions over permutations (Devavrat Shah)
9. Universal clustering (Ravi Kiran Raman and Lav Varshney)
10. Information-theoretic stability and generalization (Maxim Raginsky, Alexander Rakhlin and Aolin Xu)
11. Information bottleneck and representation learning (Pablo Piantanida and Leonardo Rey Vega)
12. Fundamental limits in model selection for modern data analysis (Jie Ding, Yuhong Yang and Vahid Tarokh)
13. Statistical problems with planted structures: information-theoretic and computational limits (Yihong Wu and Jiaming Xu)
14. Distributed statistical inference with compressed data (Wenwen Zhao and Lifeng Lai)
15. Network functional compression (Soheil Feizi and Muriel Médard)
16. An introductory guide to Fano's inequality with applications in statistical estimation (Jonathan Scarlett and Volkan Cevher)