Book Introduction

Entropy and Information Theory (English Edition) [2025 | PDF | EPUB | MOBI | Kindle e-book editions, Baidu cloud download]

Entropy and Information Theory (English Edition)
  • Author: Robert M. Gray
  • Publisher: Science Press, Beijing
  • ISBN: 9787030344731
  • Publication year: 2012
  • Listed page count: 409 pages
  • File size: 14 MB
  • File page count: 434 pages
  • Subject headings: Entropy (information theory), English

PDF Download


  • Online PDF e-book download [recommended: cloud unzip, fast and convenient]: download the PDF directly; works on both mobile and PC.
  • Torrent download [BT, fast]. Tip: use the BT client FDM (see the software download page).
  • Direct-link download [convenient but slower].
  • Read the book online.
  • Get the unzip password online.

Download Instructions

Entropy and Information Theory (English Edition), PDF e-book download

The downloaded file is a RAR archive; use extraction software to unpack it and obtain the PDF.

We recommend downloading with Free Download Manager (FDM), a free, ad-free, cross-platform BT client. All resources on this site are packaged as BT torrents, so a dedicated BT client such as BitComet, qBittorrent, or uTorrent is required. Xunlei (Thunder) is not recommended for now, because this site's resources are not yet popular; once a resource becomes popular, Xunlei will work as well.

(The file page count should exceed the listed page count, except for multi-volume e-books.)

Note: every archive on this site has an unzip password. Click to download an archive extraction tool.
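
For readers who prefer to unpack the archive programmatically, below is a minimal sketch using the third-party Python rarfile library, which mimics the zipfile API and requires an unrar backend installed on the system. The archive name book.rar and the password value are placeholders, not real values; the actual password comes from the "get the unzip password online" link above.

    # Minimal sketch: extract a password-protected RAR archive with the
    # third-party rarfile library (pip install rarfile). rarfile mimics the
    # zipfile API and needs an unrar backend (e.g. the unrar tool) installed.
    import rarfile

    ARCHIVE = "book.rar"       # placeholder name for the downloaded archive
    PASSWORD = "<unzip code>"  # placeholder; get the real code from the link above

    with rarfile.RarFile(ARCHIVE) as rf:
        rf.setpassword(PASSWORD)    # archives on this site are password-protected
        for name in rf.namelist():  # list the contents before extracting
            print(name)
        rf.extractall()             # unpack the PDF into the current directory

Any general-purpose extraction tool (WinRAR, 7-Zip, unar) will of course do the same job interactively.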

Table of Contents

1 Information Sources
  1.1 Probability Spaces and Random Variables
  1.2 Random Processes and Dynamical Systems
  1.3 Distributions
  1.4 Standard Alphabets
  1.5 Expectation
  1.6 Asymptotic Mean Stationarity
  1.7 Ergodic Properties

2 Pair Processes: Channels, Codes, and Couplings
  2.1 Pair Processes
  2.2 Channels
  2.3 Stationarity Properties of Channels
  2.4 Extremes: Noiseless and Completely Random Channels
  2.5 Deterministic Channels and Sequence Coders
  2.6 Stationary and Sliding-Block Codes
  2.7 Block Codes
  2.8 Random Punctuation Sequences
  2.9 Memoryless Channels
  2.10 Finite-Memory Channels
  2.11 Output Mixing Channels
  2.12 Block Independent Channels
  2.13 Conditionally Block Independent Channels
  2.14 Stationarizing Block Independent Channels
  2.15 Primitive Channels
  2.16 Additive Noise Channels
  2.17 Markov Channels
  2.18 Finite-State Channels and Codes
  2.19 Cascade Channels
  2.20 Communication Systems
  2.21 Couplings
  2.22 Block to Sliding-Block: The Rohlin-Kakutani Theorem

3 Entropy
  3.1 Entropy and Entropy Rate
  3.2 Divergence Inequality and Relative Entropy
  3.3 Basic Properties of Entropy
  3.4 Entropy Rate
  3.5 Relative Entropy Rate
  3.6 Conditional Entropy and Mutual Information
  3.7 Entropy Rate Revisited
  3.8 Markov Approximations
  3.9 Relative Entropy Densities

4 The Entropy Ergodic Theorem
  4.1 History
  4.2 Stationary Ergodic Sources
  4.3 Stationary Nonergodic Sources
  4.4 AMS Sources
  4.5 The Asymptotic Equipartition Property

5 Distortion and Approximation
  5.1 Distortion Measures
  5.2 Fidelity Criteria
  5.3 Average Limiting Distortion
  5.4 Communications Systems Performance
  5.5 Optimal Performance
  5.6 Code Approximation
  5.7 Approximating Random Vectors and Processes
  5.8 The Monge/Kantorovich/Vasershtein Distance
  5.9 Variation and Distribution Distance
  5.10 Coupling Discrete Spaces with the Hamming Distance
  5.11 Process Distance and Approximation
  5.12 Source Approximation and Codes
  5.13 d-bar Continuous Channels

6 Distortion and Entropy
  6.1 The Fano Inequality
  6.2 Code Approximation and Entropy Rate
  6.3 Pinsker's and Marton's Inequalities
  6.4 Entropy and Isomorphism
  6.5 Almost Lossless Source Coding
  6.6 Asymptotically Optimal Almost Lossless Codes
  6.7 Modeling and Simulation

7 Relative Entropy
  7.1 Divergence
  7.2 Conditional Relative Entropy
  7.3 Limiting Entropy Densities
  7.4 Information for General Alphabets
  7.5 Convergence Results

8 Information Rates
  8.1 Information Rates for Finite Alphabets
  8.2 Information Rates for General Alphabets
  8.3 A Mean Ergodic Theorem for Densities
  8.4 Information Rates of Stationary Processes
  8.5 The Data Processing Theorem
  8.6 Memoryless Channels and Sources

9 Distortion and Information
  9.1 The Shannon Distortion-Rate Function
  9.2 Basic Properties
  9.3 Process Definitions of the Distortion-Rate Function
  9.4 The Distortion-Rate Function as a Lower Bound
  9.5 Evaluating the Rate-Distortion Function

10 Relative Entropy Rates
  10.1 Relative Entropy Densities and Rates
  10.2 Markov Dominating Measures
  10.3 Stationary Processes
  10.4 Mean Ergodic Theorems

11 Ergodic Theorems for Densities
  11.1 Stationary Ergodic Sources
  11.2 Stationary Nonergodic Sources
  11.3 AMS Sources
  11.4 Ergodic Theorems for Information Densities

12 Source Coding Theorems
  12.1 Source Coding and Channel Coding
  12.2 Block Source Codes for AMS Sources
  12.3 Block Source Code Mismatch
  12.4 Block Coding Stationary Sources
  12.5 Block Coding AMS Ergodic Sources
  12.6 Subadditive Fidelity Criteria
  12.7 Asynchronous Block Codes
  12.8 Sliding-Block Source Codes
  12.9 A Geometric Interpretation

13 Properties of Good Source Codes
  13.1 Optimal and Asymptotically Optimal Codes
  13.2 Block Codes
  13.3 Sliding-Block Codes

14 Coding for Noisy Channels
  14.1 Noisy Channels
  14.2 Feinstein's Lemma
  14.3 Feinstein's Theorem
  14.4 Channel Capacity
  14.5 Robust Block Codes
  14.6 Block Coding Theorems for Noisy Channels
  14.7 Joint Source and Channel Block Codes
  14.8 Synchronizing Block Channel Codes
  14.9 Sliding-Block Source and Channel Coding

References

Index
