Information Theory, Inference & Learning Algorithms (Hardcover)
Provisional Chinese title: 資訊理論、推斷與學習演算法 (Hardcover)
David J. C. MacKay
- Publisher: Cambridge
- Publication date: 2003-10-06
- List price: $2,950
- Member price: 5% off, $2,803
- Language: English
- Pages: 640
- Binding: Hardcover
- ISBN: 0521642981
- ISBN-13: 9780521642989
Related categories:
Algorithms-data-structures
In stock, ships immediately (stock = 1)
Customers who bought this item also bought:
- $1,078 Machine Learning (IE-Paperback)
- $3,325 The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2/e (Hardcover)
- $1,620 Practical Recommender Systems (Paperback)
- $2,124 Software Engineering at Google: Lessons Learned from Programming Over Time (Paperback)
- $2,622 Practical Statistics for Data Scientists: 50+ Essential Concepts Using R and Python, 2/e (Paperback)
- $2,070 Machine Learning Design Patterns: Solutions to Common Challenges in Data Preparation, Model Building, and MLOps (Paperback)
- $2,280 Designing Machine Learning Systems: An Iterative Process for Production-Ready Applications (Paperback)
- $2,594 Reliable Machine Learning: Applying SRE Principles to ML in Production (Paperback)
- $2,520 Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems, 3/e (Paperback)
Description
Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology: communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error-correction. Inference techniques, including message-passing algorithms, Monte Carlo methods and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes, the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, the book is ideal for self-study and for undergraduate or graduate courses. It also provides an unparalleled entry point for professionals in areas as diverse as computational biology, financial engineering and machine learning.
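The quantity at the heart of the data-compression chapters described above is Shannon entropy, H(X) = -Σᵢ pᵢ log₂ pᵢ, the average number of bits needed per outcome. As a minimal illustration of the idea (a sketch for orientation, not code from the book):

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum(p * log2(p)) in bits, with 0*log(0) taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# which is exactly why its output stream is compressible.
print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.47 bits
```

The source coding theorem (Chapter 4) makes this precise: no lossless compressor can do better than H(X) bits per symbol on average, and schemes such as the arithmetic coding covered in the book approach that limit.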
Contents
1. Introduction to information theory; 2. Probability, entropy, and inference; 3. More about inference; Part I. Data Compression: 4. The source coding theorem; 5. Symbol codes; 6. Stream codes; 7. Codes for integers; Part II. Noisy-Channel Coding: 8. Correlated random variables; 9. Communication over a noisy channel; 10. The noisy-channel coding theorem; 11. Error-correcting codes and real channels; Part III. Further Topics in Information Theory: 12. Hash codes: codes for efficient information retrieval; 13. Binary codes; 14. Very good linear codes exist; 15. Further exercises on information theory; 16. Message passing; 17. Communication over constrained noiseless channels; 18. Crosswords and codebreaking; 19. Why have sex? Information acquisition and evolution; Part IV. Probabilities and Inference: 20. An example inference task: clustering; 21. Exact inference by complete enumeration; 22. Maximum likelihood and clustering; 23. Useful probability distributions; 24. Exact marginalization; 25. Exact marginalization in trellises; 26. Exact marginalization in graphs; 27. Laplace's method; 28. Model comparison and Occam's razor; 29. Monte Carlo methods; 30. Efficient Monte Carlo methods; 31. Ising models; 32. Exact Monte Carlo sampling; 33. Variational methods; 34. Independent component analysis and latent variable modelling; 35. Random inference topics; 36. Decision theory; 37. Bayesian inference and sampling theory; Part V. Neural Networks: 38. Introduction to neural networks; 39. The single neuron as a classifier; 40. Capacity of a single neuron; 41. Learning as inference; 42. Hopfield networks; 43. Boltzmann machines; 44. Supervised learning in multilayer networks; 45. Gaussian processes; 46. Deconvolution; Part VI. Sparse Graph Codes: 47. Low-density parity-check codes; 48. Convolutional codes and turbo codes; 49. Repeat-accumulate codes; 50. Digital fountain codes; Part VII. Appendices: A. Notation; B. Some physics; C. Some mathematics; Bibliography; Index.