Coding and Information Theory [2nd ed.] 9780131390720, 0131390724

The second edition is mainly an expansion of the material in the first edition. The changes were made to (1) increase th…

Language: English. Pages: 269. Year: 1986.

Table of contents:
Title page
Date-line
Contents
Preface to the Second Edition
Preface to the First Edition
1.1 A Very Abstract Summary
1.2 History
1.3 Model of the Signaling System
1.4 Information Source
1.5 Encoding a Source Alphabet
1.6 Some Particular Codes
1.7 The ASCII Code
1.8 Some Other Codes
1.10 Escape Characters
1.11 Outline of the Course
2.1 Why Error-Detecting Codes?
2.2 Simple Parity Checks
2.3 Error-Detecting Codes
2.4 Independent Errors: White Noise
2.5 Retransmission of Message
2.6 Simple Burst Error-Detecting Codes
2.7 Alphabet Plus Number Codes: Weighted Codes
2.8 Review of Modular Arithmetic
2.9 ISBN Book Numbers
3.1 Need for Error Correction
3.2 Rectangular Codes
3.3 Triangular, Cubic, and $n$-Dimensional Codes
3.4 Hamming Error-Correcting Codes
3.5 Equivalent Codes
3.6 Geometric Approach
3.7 Single-Error-Correction Plus Double-Error-Detection Codes
3.9 Applications of the Ideas
3.10 Summary
4.1 Introduction
4.2 Unique Decoding
4.3 Instantaneous Codes
4.4 Construction of Instantaneous Codes
4.5 The Kraft Inequality
4.6 Shortened Block Codes
4.7 The McMillan Inequality
4.8 Huffman Codes
4.9 Special Cases of Huffman Coding
4.10 Extensions of a Code
4.11 Huffman Codes, Radix $r$
4.12 Noise in Huffman Coding Probabilities
4.13 Use of Huffman Codes
4.14 Hamming-Huffman Coding
5.1 Introduction
5.2 What Is a Markov Process?
5.3 Ergodic Markov Processes
5.4 Efficient Coding of an Ergodic Markov Process
5.5 Extensions of a Markov Process
5.6 Predictive Run Encoding
5.7 The Predictive Encoder
5.8 The Decoder
5.9 Run Lengths
5.11 What Is Hashing?
5.12 Handling Collisions
5.14 Summary of Hashing
5.15 Purpose of the Gray Code
5.16 Details of a Gray Code
5.18 Anti-Gray Code
5.19 Delta Modulation
5.20 Other Codes
6.1 Introduction
6.2 Information
6.3 Entropy
6.4 Mathematical Properties of the Entropy Function
6.5 Entropy and Coding
6.6 Shannon-Fano Coding
6.7 How Bad Is Shannon-Fano Coding?
6.8 Extensions of a Code
6.9 Examples of Extensions
6.10 Entropy of a Markov Process
6.11 Example of a Markov Process
6.12 The Adjoint System
6.13 The Robustness of Entropy
6.14 Summary
7.1 Introduction
7.2 The Information Channel
7.3 Channel Relationships
7.4 Example of the Binary Symmetric Channel
7.5 System Entropies
7.6 Mutual Information
8.1 Definition of Channel Capacity
8.2 The Uniform Channel
8.3 Uniform Input
8.4 Error-Correcting Codes
8.5 Capacity of a Binary Symmetric Channel
8.6 Conditional Mutual Information
9.1 Introduction
9.2 The Stirling Approximation to $n!$
9.3 A Binomial Bound
9.4 The Gamma Function $\Gamma(n)$
9.5 $n$-Dimensional Euclidean Space
9.6 A Paradox
9.7 Chebyshev's Inequality and the Variance
9.8 The Law of Large Numbers
9.9 Other Metrics
10.1 Introduction
10.2 Decision Rules
10.3 The Binary Symmetric Channel
10.4 Random Encoding
10.5 Average Random Code
10.6 The General Case
10.7 The Fano Bound
10.8 The Converse of Shannon's Theorem
11.1 Introduction
11.2 Error-Detecting Parity Codes Revisited
11.3 Hamming Codes Revisited
11.4 Double-Error-Detecting Codes Revisited
11.5 Polynomials versus Vectors
11.6 Prime Polynomials
11.7 Primitive Roots
11.8 A Special Case
11.9 Shift Registers for Encoding
11.10 Decoding Single-Error-Correcting Codes
11.11 A Double-Error-Correcting Code
11.13 Summary
A.1 Introduction
A.2 The Fourier Integral
A.3 The Sampling Theorem
A.5 AM Signaling
A.6 FM Signaling
A.7 Pulse Signaling
A.8 Bandwidth Generally
A.9 Continuous Signals
Appendix B: Some Tables for Entropy Calculations
References
Index
