David Middleton, An Introduction to Statistical Communication Theory (2nd reprint edition, IEEE Press, 1996, 1175 pp.)

CONTENTS

Foreword to the IEEE PRESS Reissue
Preface to the Second Reprint Edition (1996)
Preface to the First Reprint Edition (1987-1995)
Preface to the First Edition (1960)

PART 1. AN INTRODUCTION TO STATISTICAL COMMUNICATION THEORY

Chapter 1. Statistical Preliminaries
1.1. Introductory Remarks
1.2. Probability Distributions and Distribution Densities
  Distribution Functions. Functions of a Random Variable. Discrete and Continuous Distributions. Distribution Densities. Mean Values and Moments. The Characteristic Function. Generalizations; Semi-invariants. Two Random Variables. Conditional Distributions; Mean Values and Moments. Statistical Independence. The Covariance. Multivariate Cases. Characteristic Functions and Semi-invariants. Generalizations.
1.3. Description of Random Processes
  A Random Noise Process. Mathematical Description of a Random Process. Some Other Types of Process. Deterministic Processes. Signals and Noise. Stationarity and Nonstationarity. Examples of Stationary and Nonstationary Processes. Distribution Densities for Deterministic Processes.
1.4. Further Classification of Random Processes
  Conditional Distribution Densities. The Purely Random Process. The Simple Markoff Process. The Smoluchowski Equation. Higher-order Processes and Projections.
1.5. Time and Ensemble Averages
  Time Averages. Ensemble Averages.
1.6. Ergodicity and Applications
  The Ergodic Theorem. Applications to Signal and Noise Ensembles.

Chapter 2. Operations on Ensembles
2.1. Operations on Ensembles
  Stochastic Convergence. Stochastic Differentiation and Integration. Remarks on General Transformations.
2.2. Linear Systems; Representations
  Remarks on General Networks. The Transient State; Differential Equations and Solutions. The Weighting Function. The Transfer Function. System Functions and the Steady State. Narrowband Filters. Instantaneous Envelope, Phase, and Frequency, and the Analytic Signal.
2.3. Nonlinear Systems; Representations
  Some Nonlinear Responses. Representation of Dynamic Characteristics with Zero Memory. Equivalent Contours.
2.4. Objectives of the Present Study

Chapter 3. Spectra, Covariance, and Correlation Functions
3.1. The Autocorrelation Function
  Definition of the Autocorrelation Function. Ergodic Processes. Some Properties of R(t). The Discrete Case. Periodic Waves. Mixed Processes.
3.2. The Intensity Spectrum
  Introductory Remarks. The Intensity Spectrum and the Wiener-Khintchine Theorem. Spectra, Correlation Functions, and Spectral Moments. Periodic Waves and Mixed Processes.
3.3. Autovariance, Autocorrelation, and Spectra of Linearly Filtered Waves
  General Linear Filters (I). Narrowband Filters (I). General Linear Filters (II). Narrowband Filters (II). Approximations; Time Moments. Mixed Processes; Examples.
3.4. Cross-correlation Functions, Spectra, and Generalizations
  Definitions and Properties. Linearly Filtered Waves. Examples. Truncated Waves. Nonstationary Processes.

Chapter 4. Sampling, Interpolation, and Random Pulse Trains
4.1. Discrete Sampling (Continuous Random Series)
4.2. Discrete Sampling (Interpolation)
  The Sampled Wave. Properties of the Sampling Function u(t). Spectra and Covariance Functions. Generalizations.
4.3. Random Pulse Trains with Periodic Structure
  Periodic Pulsed Sampling of a Random Wave. Pulse-time Modulation.
4.4. Aperiodic Nonoverlapping Random Pulse Trains
  Spectrum and Covariance Functions. An Example.
4.5. Overlapping Random Pulses
  Spectrum and Covariance Functions. Campbell's Theorem.

Chapter 5. Signals and Noise in Nonlinear Systems
5.1. Rectification with Zero-memory Devices
  The Second-moment Function M(t). The Spectrum. Narrowband Input Ensembles. Envelope and Phase Representations.
5.2. Three Examples of Rectification
  Half-wave Linear Rectification of Normal Noise. Full-wave Square-law Rectification of a Carrier and Noise. A Simple Spectrum Analyzer.
5.3. Noise Figures and Signal-to-Noise Ratios
  Linear-network Criteria; Noise Figures (I). Linear-network Criteria; Noise Figures (II). Criteria for Nonlinear Systems (I). Criteria for Nonlinear Systems (II). The Signal-to-Noise Ratio. Summary Remarks on Second-moment Criteria.

Chapter 6. An Introduction to Information Theory
6.1. Preliminary Remarks
6.2. Measures of Information: The Discrete Case
  Uncertainty, Ignorance, and Information. Formulation. Structure of U. An Example. Properties of I(x|x), I(x|y).
6.3. Entropy and Other Average Measures: Discrete Cases (Continued)
  Communication Entropy. Average Information Gains. Sequences and Sources.
6.4. Measures of Information: Continuous Cases
  Entropy and Information Gains. Maximum Entropies. Sampling, Entropy Loss, and the Destruction of Information.
6.5. Rates and Channel Capacity
  Discrete Noiseless Channel. Discrete Noisy Channel. The Continuous Noisy Channel. Continuous Noisy Channel with a Continuous Message Source. Discussion.

PART 2. RANDOM NOISE PROCESSES

Chapter 7. The Normal Random Process: Gaussian Variates
7.1. The Normal Random Variable
7.2. The Bivariate Normal Distribution
7.3. The Multivariate Normal Distribution
  The Characteristic Function, Moments, Semi-invariants, and Statistical Independence.
7.4. The Normal Random Process
7.5. Some Properties of the Normal Process
  Additivity. Linear Transformations. Examples.
7.6. Classification (Doob's Theorem)
7.7. Unrestricted Random Walk and the Central-limit Theorem
  The Unrestricted Random Walk. Random Walk with a Large Number of Steps. The Central-limit Theorem.

Chapter 8. The Normal Random Process: Gaussian Functionals
8.1. Gaussian Functionals
  Derivatives of a Normal Process. Integrals of a Gauss Process. Joint Distributions.
8.2. Orthogonal Expansions of a Random Process
  General Expansions. Orthogonal Expansion of a Random Process. Applications to the Gauss Process.
8.3. Fourier-series Expansions

Chapter 9. Processes Derived from the Normal
9.1. Statistical Properties of the Envelope and Phase of Narrowband Normal Noise
  Distribution Densities and Moments of the Envelope. Moments and Distribution Densities of the Phase.
9.2. Statistical Properties of Additive Narrowband Signal and Normal Noise Processes
  Statistics of the Envelope of Signal and Noise. Statistics of the Phase of Narrowband Signals and Normal Noise.
9.3. Statistics of Signals and Broadband Normal Noise Processes
9.4. Zero Crossings and Extrema of a Random Process
  Zero Crossings. Extrema. Remarks.

Chapter 10. The Equations of Langevin, Fokker-Planck, and Boltzmann
10.1. Formulation in Terms of a Stochastic Differential Equation: The Langevin Equation
10.2. Some Examples Leading to a Diffusion Equation
  Random Walk of a Free Particle. Random Walk with a Harmonic Restoring Force.
10.3. The Equation of Fokker-Planck and Its Relation to the Langevin Equation
  The Fokker-Planck Equation. The Moments A_n(y). The Equations of Boltzmann and Smoluchowski.
10.4. The Gaussian Random Process
  Assumptions on the Force Term F(t). Two Examples. A Solution of the One-dimensional Fokker-Planck Equation. Moments.

Chapter 11. Thermal, Shot, and Impulse Noise
11.1. Thermal Noise
  The Intensity Spectrum (Kinetic Derivation). The Intensity Spectrum (Thermodynamical Argument). Generalizations. The Equipartition Theorem. Nonequilibrium Conditions. Probability Distributions for Thermal Noise. Comments.
11.2. Impulse Noise
  A General Model. Poisson Noise. Example and Discussion.
11.3. Temperature-limited Shot Noise
  Probability Densities. Waveform of the Induced Current. Moments and Spectra.

PART 3. APPLICATIONS TO SPECIAL SYSTEMS

Chapter 12. Amplitude Modulation and Conversion
12.1. Methods of Modulation
12.2. Covariance Functions and Intensity Spectra
12.3. Conversion I. Preliminary Remarks
12.4. Conversion II. Second-moment Theory

Chapter 13. Rectification of Amplitude-modulated Waves: Second-moment Theory
13.1. Rectification of Broad- and Narrowband Noise Processes
  Broadband Noise. Narrowband Noise.
13.2. Rectification of Noise and a Sinusoidal Carrier
  Carrier and Broadband Noise. Carrier and Narrowband Noise.
13.3. Detection of a Sinusoidally Modulated Carrier in Noise
  The Quadratic Detector (ν = 2). The Linear Detector (ν = 1).
13.4. Other Rectification Problems
  Bias, Saturation. Full-wave Rectification. A Method of Residues.

Chapter 14. Phase and Frequency Modulation
14.1. Covariance Functions and Spectra
  Periodic Modulations. Random Modulations. Examples of Phase Modulation and Frequency Modulation by Stationary Normal Noise. Modulation Indices. Mixed Angle Modulation.
14.2. Limiting Forms
  Low Modulation Indices. High Modulation Indices.
14.3. Generalizations
  Simultaneous Amplitude and Angle Modulation. The Steady-state Effects of a Narrowband Linear Filter.

Chapter 15. Detection of Frequency-modulated Waves; Second-moment Theory
15.1. The FM Receiver
  RF and IF Receiver Stages. The Limiter. The Discriminator. Output of the Ideal Limiter-Discriminator Elements.
15.2. The Mean and Mean-square Output of an FM Receiver
  The Mean Output. The Mean-square Output.
15.3. The Reception of Narrowband Frequency Modulation
  Signal-to-Noise Ratios. Signal-to-Noise Ratios (Narrowband Frequency Modulation). Narrowband Frequency Modulation vs. Amplitude Modulation.
15.4. Covariance Function and Spectrum of the Output of the FM Receiver (General Theory)
15.5. Special Cases for Broadband Frequency Modulation
  Signal Output; Arbitrary Limiting. Covariance Functions and Spectra with Strong Carriers and Moderate to Heavy Limiting. Covariance Functions and Spectra with Strong Carriers; Little or No Limiting. Threshold Signals. Noise Alone; Arbitrary Limiting. Noise and an Unmodulated Carrier. Signal-to-Noise Ratios (Broadband Frequency Modulation).

Chapter 16. Linear Measurements, Prediction, and Optimum Filtering
16.1. Linear Finite-time Measurements
  Statistical Errors. An Example: Mean Intensity of a Random Wave. Optimum Linear Finite-time Measurements; Noise Alone.
16.2. Optimum Linear Prediction and Filtering
  Formulation. Noise Signals in Noise Backgrounds; the Theory of Wiener and Kolmogoroff. Some Examples. Extensions.
16.3. Maximization of Signal-to-Noise Ratios—Matched Filters
  Maximization of (S/N). Examples. Matched Filters. Summary Remarks.
Chapter 17. Some Distribution Problems
17.1. Preliminary Results: Reduction of det (I + yG)
  The Eigenvalue Method. The Trace Method.
17.2. Distribution Densities and Functionals
  Evaluation of Integrals. Moments and Semi-invariants. Probability Distribution of the Spectral Density of a Normal Process.
17.3. Distribution Densities after Nonlinear Operations and Filtering
  First-order Distribution Densities Following a Quadratic Rectifier and Linear Filter. Further Examples. Concluding Remarks.

PART 4. A STATISTICAL THEORY OF RECEPTION

Chapter 18. Reception as a Decision Problem
18.1. Introduction
18.2. Signal Detection and Extraction
  Detection. Types of Extraction. Other Reception Problems.
18.3. The Reception Situation in General Terms
  Assumptions. The Decision Rule. The Decision Problem. The Generic Similarity of Detection and Extraction.
18.4. System Evaluation
  Evaluation Functions. System Comparisons and Error Probabilities. Optimization: Bayes Systems. Optimization: Minimax Systems.
18.5. A Summary of Basic Definitions and Principal Theorems
  Some General Properties of Optimum Decision Rules. Definitions. Principal Theorems. Remarks.

Chapter 19. Binary Detection Systems Minimizing Average Risk. General Theory
19.1. Formulation
  The Average Risk. Optimum Detection. Some Further Properties of the Bayes Detection Rule.
19.2. Special Optimum Detection Systems
  The Neyman-Pearson Detection System. The Ideal Observer Detection System. Minimax Detection Rule.
19.3. Evaluation of Performance
  Error Probabilities; Optimum Systems. Error Probabilities; Suboptimum Systems. Decision Curves and System Comparisons.
19.4. Structure; Threshold Detection
  Discrete Sampling. Continuous Sampling. General Remarks on Optimum Threshold Detection.

Chapter 20. Binary Detection Systems Minimizing Average Risk. Examples
20.1. Threshold Structure I. Discrete Sampling
  Coherent Detection. Incoherent Detection I. General Signals. Incoherent Detection II. Narrowband Signals.
20.2. Threshold Structure II. Continuous Sampling
  Coherent Detection. Incoherent Detection I. General Signals. Incoherent Detection II. Narrowband Signals. Bayes Matched Filters.
20.3. System Evaluation: Error Probabilities and Average Risk
  Optimum Threshold Systems. A Suboptimum System. Information Loss in Binary Detection. General Remarks.
20.4. Examples
  Coherent Detection (Optimum Simple-alternative Cases). Coherent Detection (Suboptimum Simple-alternative Cases). Optimum Incoherent Detection (A Simple Radar Problem). A Simple Suboptimum Radar System (Incoherent Detection). Optimum Incoherent Detection (A Communication Problem). A General Radar Detection Problem. Stochastic Signals in Normal Noise. Remarks.

Chapter 21. Extraction Systems Minimizing Average Risk; Signal Analysis
21.1. Some Results of Classical Estimation Theory
  Estimates, Estimators, and the Cramér-Rao Inequality. Maximum Likelihood Estimation. Three Examples of Signal Extraction by Maximum Likelihood.
21.2. Decision-theory Formulation
  Bayes Extraction with a Simple Cost Function. Bayes Extraction with a Quadratic Cost Function. Other Cost Functions. Information Loss in Extraction. Structure, System Comparisons, and Distributions.
21.3. Estimation of Amplitude (Deterministic Signals and Normal Noise)
  Coherent Estimation of Signal Amplitude (Quadratic Cost Function). Incoherent Estimation of Signal Amplitude (Quadratic Cost Function). Incoherent Estimation of Signal Amplitude (Simple Cost Function).
21.4. Waveform Estimation (Stochastic Signals)
  Normal Noise Signals in Normal Noise (Quadratic Cost Function). Smoothing and Prediction (Gaussian Signals and Quadratic Cost Functions). Minimax Smoothing and Prediction of Deterministic Signals (Quadratic Cost Functions).
21.5. Remarks

Chapter 22. Information Measures in Reception
22.1. Information and Sufficiency
  Sufficient Statistics. Information Measures.
22.2. Information-loss Criterion for Detection
  Equivocation of Binary Detectors. Detectors That Minimize Equivocation.
22.3. Information-loss Criterion for Extraction
  Average Information Loss and Its Extrema in Extraction. Minimax and Minimum Equivocation Extraction. Remarks on Maximum Likelihood Extractors.

Chapter 23. Generalizations and Extensions
23.1. Multiple-alternative Detection and Estimation
  Detection. Detection with Decision Rejection. Multiple Estimation.
23.2. Cost Coding: Joint Optimization of Transmission and Reception by Choice of Signal Waveform
  Single Signals in Noise (Detection and Extraction). Detection of S2 versus S1 in Noise.
23.3. Relations to Game Theory
  Game Theory.
23.4. A Critique of the Decision-theory Approach
23.5. Some Future Problems

Appendix 1. Special Functions and Integrals
A.1.1. The Error Function and Its Derivatives
A.1.2. The Confluent Hypergeometric Function
A.1.3. The Gaussian Hypergeometric Function
A.1.4. Auxiliary Relations
A.1.5. Some Special Integrals

Appendix 2. Solutions of Selected Integral Equations
A.2.1. Introduction
A.2.2. Homogeneous Integral Equations with Rational Kernels
A.2.3. Inhomogeneous Equations with Rational Kernels
A.2.4. Examples
  Example 1: The RC Kernel. Example 2: The LRC Kernel. Example 3: Mixed Kernels; RC Kernels and White Noise. Example 4: A Mixed Kernel, with White Noise.

Supplementary References and Bibliography
Selected Supplementary References (1996)
Name Index to Selected Supplementary References
Glossary of Principal Symbols
Name Index
Subject Index
Author's Biography

An Introduction to Statistical Communication Theory

DAVID MIDDLETON

IEEE Communications Society, Sponsor
IEEE Information Theory Society, Sponsor

IEEE PRESS
The Institute of Electrical and Electronics Engineers, Inc., New York

IEEE Press
445 Hoes Lane, P.O. Box 1331
Piscataway, NJ 08855-1331

Editorial Board
John B. Anderson, Editor in Chief
Eden, G. F. Hoffnagle, R. S. Muller, E. El-Hawary, R. F. Hoyt, W. D. Reeve, Furui, S. Kartalopoulos, E. Sanchez-Sinencio, Herrick, P. Laplante, D. J. Wells

Dudley R. Kay, Director of Book Publishing
Lisa Dayne, Review Coordinator
Savoula Amanatidis, Production Editor

IEEE Communications Society, Sponsor
CS Liaison to IEEE Press, Tom Robertazzi

IEEE Information Theory Society, Sponsor
IT-S Liaison to IEEE Press, Stuart Schwartz

© 1996 by David Middleton. All rights reserved. No part of this book may be reproduced in any form, nor may it be stored in a retrieval system or transmitted in any form, without written permission from the publisher.

First printing (1960) by McGraw-Hill, Inc. Second printing (1987) by Peninsula Publishing.
Printed in the United States of America

ISBN 0-7803-1178-7
IEEE Order Number: PC5648

Library of Congress Cataloging-in-Publication Data
Middleton, David, 1920-
An introduction to statistical communication theory / David Middleton; sponsored by IEEE Communications Society, IEEE Information Theory Society.
p. cm.
Reprint. Originally published: New York, McGraw-Hill, 1960.
Includes bibliographical references and indexes.
ISBN 0-7803-1178-7
1. Statistical communication theory. I. Title.
TK5101.M45 1996
621.382—dc20    96-1241    CIP

This book and other IEEE PRESS books may be purchased at a discount from the publisher when ordered in bulk quantities. Contact: IEEE Press Marketing, Attn: Special Sales, 445 Hoes Lane, P.O. Box 1331, Piscataway, NJ 08855-1331. Fax: (908) 981-9334.

For more information about IEEE PRESS products, visit the IEEE Home Page: http://www.ieee.org/

Also of interest from IEEE PRESS:

MICROWAVE MOBILE COMMUNICATIONS
William C. Jakes, AT&T Bell Labs (retired)
1994, Hardcover, 656 pp., IEEE Order No. PC4234, ISBN 0-7803-1069-1

COMMUNICATION SYSTEMS AND TECHNIQUES
Mischa Schwartz, Columbia University; Seymour Stein, Columbia University; and William Bennett, Jr., SCPE, Inc.
1996, Hardcover, 632 pp., IEEE Order No. PC5639, ISBN 0-7803-1166-3

THE MOBILE COMMUNICATIONS HANDBOOK
Edited by Jerry D. Gibson, Texas A&M University; published in cooperation with CRC Press
1996, Hardcover, 624 pp., IEEE Order No. PC5633, ISBN 0-8493-8573-3

To J. B. Wiesner (1915-1994)

FOREWORD TO THE IEEE PRESS REISSUE

The statistical theory of communication has proven to be a powerful methodology for the design, analysis, and understanding of practical systems for electronic communications and related applications. From its origins in the 1940's to the present day, this theory has remained remarkably vibrant and useful, while continuing to evolve as new modes of communication emerge.

The publication in 1960 of Introduction to Statistical Communication Theory (ISCT) was a landmark for the field of statistical communication. For the first time, the disciplines comprising statistical communication theory—random processes, modulation and detection, signal extraction, information theory—were combined into a single, unified treatment at a level of depth and degree of completeness which have not been matched in any subsequent comprehensive work. Moreover, ISCT introduced a further interdisciplinary feature, in which relevant physical characteristics of communication channels were incorporated into many topics.

Today, some thirty-five years since its first appearance, this book remains a unique sourcebook in statistical communications. Most of the topics treated in this book are still taught in the classroom today, and they continue to arise in the day-to-day work of engineers in fields such as electronic communications, radar, sonar, and radio astronomy, among many others. Although there are a great many excellent books on communications in print, no other book treats the mathematical and physical foundations of the discipline in the comprehensive, interdisciplinary way found here. This unified presentation offers the reader a distinct advantage in terms of understanding, as well as a significant value in terms of compactness of resources.

ISCT is a classic book in the field of communications. As such, it ranks with Davenport and Root's An Introduction to the Theory of Random Signals and Noise and Wozencraft and Jacobs' Principles of Communications Engineering.
However, even without its historical stature, this book stands today as a valuable resource for engineers, researchers, and students in communications and related fields. By republishing this book, the IEEE is providing a valuable service to its members and to the electronic communications community in general. This reissue is a very welcome event indeed.

H. Vincent Poor
Princeton University
November 1995

PREFACE TO THE SECOND REPRINT EDITION (1996)

It has been thirty-six years since An Introduction to Statistical Communication Theory was originally published, in 1960 by McGraw-Hill in its International Series in Pure and Applied Physics [1]. Nine years have also passed since its republication as a Reprint Edition by the Peninsula Publishing Co., of Los Altos, California, in 1987 [1a]. The present volume is the reincarnation of the previous editions, through the kind offices of the IEEE Press and under the welcome sponsorship of the Communication and Information Theory Societies of the IEEE, with selected changes and additions described below. From a historical viewpoint, An Introduction follows, at an advanced level of application, along the path, primarily, of the earlier work of Lawson and Uhlenbeck in the MIT Radiation Laboratory Series [2] and of Davenport and Root [3]. Because so much of the original material of An Introduction appears still to be pertinent and useful today and because almost three full generations of engineers and scientists have arrived since the book's inception, it seems appropriate once more to make it available to the concerned technical community.

Unlike the first Reprint Edition (1987-1995), for which no "updating" was attempted (save for the brief list of references cited in the Preface thereof), this edition does provide selected references which connect the earlier work of the original book's period with many new concepts, methods, and results which have appeared since 1960 and which may broadly be considered to follow from and extend the basic ideas and techniques of statistical communication theory (SCT), as exposited in [1, 1a]. These include, for example, (1) developments in such areas as detection and estimation in non-Gaussian noise environments involving propagation and scattering in nonhomogeneous channels [4-9, 14-15]; (2) state-space, model-based signal processing (estimation and control) [10]; (3) digital signal processing [11]; (4) spectrum analysis and array processing [12, 13]; as well as (5) other extensions of classical signal detection and estimation [14-16], including fuzzy logic [17] and neural networks [18]; (6) the role of information theory in physical science [20]; and, of course, (7) the essential developments in computing technology which have made the practical implementation of SCT more fully possible.

In addition to these important topics, selectively referenced below, a rather extensive list of some 125 references has been added at the end of this Edition, with few exceptions, on a chapter-by-chapter basis. These represent mainly books which in the author's opinion describe the technical progress in the greatly expanded, interdisciplinary field of SCT from 1960 to the present. By and large, they themselves also remain vital and useful today, as well as of historical interest, and serve to connect the still active past with the ongoing present. This list is not intended to be, nor could it be, complete in any practical sense, nor is any slight to omitted worthy work intended.
In any case, many useful references are also to be sought and found in these volumes.

Finally, the author has taken the opportunity of this Reprint Edition to correct the errors and misprints (known to him), fortunately few since the many printings (1960-1972) of the original work [1].

The author is deeply indebted to Professor Poor (Princeton University) for his support and interest in the republication of this book and for his generous foreword thereto. The author also wishes to extend his appreciation to the various reviewers and to Mr. Dudley Kay and the excellent editorial staff at the IEEE Press for their essential efforts in bringing this work to publication again.

David Middleton
New York, 1996

[1] D. Middleton, An Introduction to Statistical Communication Theory, International Series in Pure and Applied Physics, McGraw-Hill, New York, 1960.
[1a] D. Middleton, An Introduction to Statistical Communication Theory, (First) Reprint Edition, Peninsula, Los Altos, CA, 1987-1995; see following Preface.
[2] J. L. Lawson and G. E. Uhlenbeck, Threshold Signals, MIT Radiation Laboratory Series, Vol. 24, McGraw-Hill, New York, 1950.
[3] W. B. Davenport and W. L. Root, An Introduction to Random Signals and Noise, McGraw-Hill, New York, 1958; Reprint Edition, IEEE Press, Piscataway, NJ, 1984.
[4] D. Middleton, Threshold Signal Processing (with Applications to Ocean Acoustics, Non-Gaussian Random Media, and Generalized Telecommunications), American Institute of Physics (AIP) Series in "Modern Acoustics and Signal Processing," American Institute of Physics Press, New York, in press (1997).
[5] D. Middleton, "Threshold Detection in Correlated Non-Gaussian Noise Fields," IEEE Trans. Information Theory, Vol. 41, No. 4, pp. 976-1000, July 1995.
[6] D. Middleton and A. D. Spaulding, "Optimum Reception in Non-Gaussian Electromagnetic Environments II. Optimum and Suboptimum Threshold Signal Detection in Class A and B Noise," NTIA Report 83-120 (NTIS Pub. No. PB 83-241141), ITS (NTIA), U.S. Department of Commerce, 325 Broadway, Boulder, CO 80303, May 1983.
[7] D. Middleton and A. D. Spaulding, Elements of Weak Signal Detection in Non-Gaussian Noise Environments, Chapter 5 of Advances in Statistical Signal Processing, Volume 2: Detection, eds. H. V. Poor and J. B. Thomas, JAI Press, 55 Old Post Road, No. 2, P.O. Box 1678, Greenwich, CT 06836-1678, December 1993.
[8] S. A. Kassam, Signal Detection in Non-Gaussian Noise, Springer-Verlag, New York, 1988.
[9] E. J. Wegman, S. C. Schwartz, and J. B. Thomas, eds., Topics in Non-Gaussian Signal Processing, Springer-Verlag, New York, 1989. See also E. J. Wegman and J. G. Smith, eds., Statistical Signal Processing, Marcel Dekker, New York, 1984.
[10] J. V. Candy, Signal Processing: A Model-Based Approach, McGraw-Hill, New York, 1986.
[11] A. V. Oppenheim and R. W. Schafer, Digital Signal Processing, Prentice Hall, Englewood Cliffs, NJ, 1975.
[12] A. M. Yaglom, Correlation Theory of Stationary and Related Random Functions: I. Basic Results (1987); II. Supplementary Notes and References (1987), Springer-Verlag, New York.
[13] S. Haykin, ed., Advances in Spectrum Analysis and Array Processing, Vols. 1 and 2, Prentice Hall, Englewood Cliffs, NJ, 1991.
[14] S. M. Rytov, Yu. A. Kravtsov, and V. I. Tatarskii, Principles of Statistical Radio Physics (English Edition); Vol. 1, Elements of Random Process Theory (1987); Vol. 2, Correlation Theory of Random Processes (1988); Vol. 3, Elements of Random Fields (1989); Vol. 4, Wave Propagation Through Random Media (1989), Springer-Verlag, New York. (See also [1] above.)
[15] D. Middleton, "Space-Time Processing for Weak-Signal Detection in Non-Gaussian and Non-Uniform Electromagnetic Interference (EMI) Fields," Contractor Report 86-36 (NTIS: PB-86-193406), ITS (NTIA), U.S. Department of Commerce, 325 Broadway, Boulder, CO 80303, February 1986.
[16] C. W. Helstrom, Elements of Signal Detection and Estimation, Prentice Hall, Englewood Cliffs, NJ, 1995.
[17] L. A. Zadeh, Fuzzy Sets and Applications: Selected Papers by L. A. Zadeh, eds. R. R. Yager et al., John Wiley, New York, 1987.
[18] S. Haykin, Neural Networks, IEEE Press (Macmillan College Publishing Co.), New York, 1994.
[19] R. Young, Wavelet Theory and Its Applications, Kluwer Academic Publishers, Boston, 1993.
[20] L. Brillouin, Science and Information Theory, 2nd edition, Academic Press, New York, 1962.

See also the earlier selected references at the end of the Preface to the First Reprint Edition following. Additional references, by chapter, are given at the end of this book.

PREFACE TO THE FIRST REPRINT EDITION (1987-1995)

It has been over a quarter of a century since An Introduction to Statistical Communication Theory was first published, in 1960, by McGraw-Hill (New York) in its International Series in Pure and Applied Physics. Since then almost two generations of scientists and engineers have appeared and many specialized works within the broad domain of Statistical Communication Theory (SCT) have been produced. Today, what is often referred to as "signal processing" has become a standard engineering discipline in all areas and applications which require the transmission and reception of "information." Signal processing itself has become an ensemble of techniques whose foundations were introduced and broadly described in the original edition of the present book.

Because so much of the material of An Introduction appears to be continuously useful and because this work has been out-of-print since 1972, it seems particularly appropriate to reintroduce the book at this time. Indeed, inasmuch as the concepts and methods, and many of the examples described therein, are canonical, that is, have a general, functional form independent of specific physical models and analytical detail, they remain relevant to current and future applications. It is in this spirit that this Reprint Edition (Peninsula Publishing) is presented.

No attempt to "update" the book itself has been made here. To do so with the same emphasis on comprehensiveness and detail covered in the original would now require a whole series of books. Apart from losing the advantages of self-containment in a single volume, such a task does not appear attractive, in the face of the many excellent books which have subsequently appeared and which treat in more detail many of the topics originally discussed in the author's treatise.

Of course, new methods and ideas in Statistical Communication Theory (SCT) have appeared since the early sixties.
In addition to the topics treated generally in the original edition, such as noise theory, noise physics, information theory, statistical decision theory (SDT), and data processing, one should add within the domain of SCT important new developments: (1) nongaussian noise and interference models [10], [12], [18]; (2) threshold signal detection and estimation [11]-[13], [15], [16]; (3) spatial processing, as well as temporal sampling, including general arrays and apertures [17]; and (4) statistical-physical approaches [10], [17]-[20], to describe the channel itself, including scattering and various types of propagation encountered in real-world environments.

Moreover, as predicted at the time (1960), cf. Sec. 23.5, [1], the many areas of then future study noted therein have indeed been (and are being) investigated, as well as others not then known. For example, whole areas of current importance such as Electromagnetic Compatibility (EMC) and Spectrum Management [11], [12], [13], [16], quantum optical signal processing [8], generalized arrays and spatial sampling [17]-[19], as well as the development of powerful and economical computers and programs for handling the vast data loads incurred in practice, were either unknown or barely beginning to emerge. In fact, the recognition (even today) that one lives in a "nongaussian" world [10], [12] is perhaps one of the more significant features of the current evolution of SCT and its specific applications.

Finally, to assist the reader in relating the SCT fundamentals of 1960 with the developing practicalities of the present, the author has appended below his own brief, highly personalized list of books and papers. No attempt at completeness is made and no slight to worthy work not mentioned here is intended.

David Middleton
New York, 1987

I. Books

[1] D. Middleton, An Introduction to Statistical Communication Theory, McGraw-Hill (New York), 1960.
[1a] D. Middleton, An Introduction to Statistical Communication Theory (Russian translation), Vol. 1, 1961; Vol. 2, 1962, Soviet Radio, Moscow, U.S.S.R.
[2] L. A. Wainstein and V. D. Zubakov, Extraction of Signals from Noise (translated from the Russian by R. L. Silverman), Prentice Hall (New Jersey), 1962.
[3] D. Middleton, Topics in Communication Theory, McGraw-Hill (New York), 1965.
[3a] D. Middleton, Topics in Communication Theory (Russian translation), Soviet Radio, Moscow, 1966.
[4] C. W. Helstrom, Statistical Theory of Signal Detection, Second Edition, Pergamon Press (New York), 1968.
[5] J. B. Thomas, An Introduction to Statistical Communication Theory, John Wiley (New York), 1969.
[6] H. Van Trees, Detection, Estimation, and Modulation Theory, Part I (1968), Part II (1971), Part III (1971), John Wiley (New York).
[7] B. R. Levin, Theoretical Bases of Statistical Radio Engineering (in Russian), Soviet Radio, Moscow, Vol. 1, 1974, Vol. 2, 1975, Vol. 3, 1976.
[8] C. W. Helstrom, Quantum Detection and Estimation Theory, Academic Press (New York), 1976.
[9] N. H. Blachman, Noise and Its Effect on Communication, 2nd Edition, R. E. Krieger Pub. Co. (Malabar, Florida), 1982.

II. Papers

[10] D. Middleton, "Statistical-Physical Models of Electromagnetic Interference," IEEE Trans. on Electromagnetic Compatibility, Vol. EMC-17, No. 3, pp. 106-127, Aug. 1977.
[11] A. D. Spaulding and D. Middleton, "Optimum Reception in an Impulsive Interference Environment — Part I: Coherent Detection; Part II: Incoherent Reception," IEEE Trans. Commun., Vol. COM-25, pp. 910-934, Sept. 1977.
[12] D. Middleton, "Canonical Non-Gaussian Noise Models: Their Implications for Measurement and for Prediction of Receiver Performance," IEEE Trans. Electromag. Compat., Vol. EMC-21, No. 3, pp. 209-220, Aug. 1979.
[13] D. Middleton, "Threshold Detection in Non-Gaussian Interference Environments: Exposition and Interpretation of New Results for EMC Applications," IEEE Trans. on Electromag. Compat., Vol. EMC-26, No. 1, pp. 19-28, Feb. 1984.
[14] S. A. Kassam and H. V. Poor, "Robust Techniques for Signal Processing — A Survey," Proc. IEEE, Vol. 73, No. 3, pp. 433-481, March 1985.
[15] D. Middleton, "Threshold Signal and Parameter Estimation in Non-Gaussian EMC Environments," pp. 429-435, Proceedings, International Symp. on EMC, Zurich, Switzerland, March 5-7, 1985.
[16] A. D. Spaulding, "Locally Optimum and Suboptimum Detector Performance in a Non-Gaussian Interference Environment," IEEE Trans. Commun., Vol. COM-33, No. 6, pp. 509-517, June 1985.
[17] D. Middleton, "Space-Time Processing for Weak Signal Detection in Non-Gaussian and Non-Uniform Electromagnetic Interference (EMI) Fields," Contractor Report 86-36, Feb. 1986, ITS/NTIA, U.S. Dept. of Commerce, 325 Broadway, Boulder, CO 80303. NTIS: PB-86-193406.
[18] D. Middleton, "Second-Order Non-Gaussian Probability Distributions and Their Application to 'Classical' Nonlinear Processing Problems in Communication Theory," Proceedings of 1986 Conf. on Information Sciences and Systems, Princeton University, March 19-21, 1986.
[19] D. Middleton, "A Statistical Theory of Reverberation and Similar First-Order Scattered Fields," Parts I, II, IEEE Trans. Information Theory, Vol. IT-13, pp. 372-392; 393-414 (1967); Parts III, IV, ibid., Vol. IT-18, pp. 35-67; 68-90 (1972).
[20] D. Middleton, "Channel Modeling and Threshold Signal Processing in Underwater Acoustics: An Analytical Overview," IEEE J. of Oceanic Engineering, Vol. OE-12, No. 1, Jan. 1987.

PREFACE TO THE FIRST EDITION (1960)

Statistical communication theory may be broadly described as a theory which applies probability concepts and statistical methods to the communication process. In the wide sense, this includes not only the transmission and reception of messages, the measurement and processing of data, measures of information, coding techniques, and the design and evaluation of decision systems for these purposes, but also the statistical study of language, the bio- and psychophysical mechanisms of communication, the human observer, and his role in the group environment. Here, however, we consider statistical communication theory from the narrower viewpoint of the physical systems that are specifically involved, e.g., radio, radar, etc.
For this purpose, it is convenient to regard statistical communication theory as consisting of the following contiguous and, to varying degrees, overlapping areas of interest: (1) noise theory, which embraces the mathematical description of random processes and their properties under various linear and nonlinear transformations; (2) noise physics, which is concerned mainly with the underlying physical mechanisms of the noise processes encountered in applications; (3) information theory, defined here in the strict sense as a theory of information measures and coding; and (4) statistical decision theory, as modified and extended to system design, evaluation, and the comparison of optimum and suboptimum systems, where the desired end product is some definite decision—a "yes" or "no" or a measurement. With (3) and (4), it is natural to consider also data processing, in the course of transmission and reception, the physics and the technology of the components, e.g., tubes, transistors, etc., by which actual systems for the above purposes are to be realized, and, of course, the various concepts and methods of probability and statistics that are required for a quantitative treatment, subject now to the different constraints imposed by the communication process itself.
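By way of a minimal illustration, assuming a completely known signal in additive white Gaussian noise, the binary "yes" or "no" decision referred to under (4) can be realized as a comparison of a log-likelihood ratio with a fixed threshold; the signal shape, noise level, and prior probability in the sketch below are arbitrary choices made only for the example, not quantities taken from the text.

```python
# Illustrative only, not from the book: a Bayes (ideal-observer) test for
# H1: x = s + n versus H0: x = n, with n white Gaussian noise of known
# variance sigma^2 and a completely known signal s.
import numpy as np

rng = np.random.default_rng(0)

def decide(x, s, sigma, p1=0.5):
    """Return True ("yes, signal present") or False ("no")."""
    # log-likelihood ratio for a known signal in white Gaussian noise
    log_lr = (s @ x) / sigma**2 - (s @ s) / (2.0 * sigma**2)
    # ideal-observer (minimum-error-probability) threshold ln(q/p)
    return log_lr >= np.log((1.0 - p1) / p1)

# toy run: a weak, known pulse observed in noise (parameters are arbitrary)
N, sigma = 64, 1.0
s = 0.4 * np.sin(2 * np.pi * 4 * np.arange(N) / N)
x_present = s + sigma * rng.standard_normal(N)
x_absent = sigma * rng.standard_normal(N)
print(decide(x_present, s, sigma), decide(x_absent, s, sigma))
```

The correlation statistic s·x appearing here is the discrete form of the matched-filter structure that Part 4 develops for coherent threshold detection.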
The present book is addressed principally to engineers, physicists, and applied mathematicians. Its aims are threefold: (1) to outline a systematic approach to the design of optimal communication systems of various fundamental types, including an evaluation of performance and a comparison with nonoptimum systems for similar purposes; (2) to incorporate within a framework of a unified theoretical approach the principal results of earlier work, as well as to indicate some of the more important new ones; and finally (3) to be used as a text at various levels, as an instrument for the research worker in current problems, and, it is hoped, as a starting point for new developments. The emphasis here is mainly on (1) and (2), although (3) has not been neglected: method as well as results are stressed. About 300 problems have been included, not only as exercises but also as a source of additional results which could not otherwise be presented. While the bibliography is necessarily selective, 500 references (including the supplement) are available, from which more specialized interests may in turn be served. The mathematical exposition is for the most part heuristic, and although a detailed, rigorous treatment is outside the scope and intent here, in this respect reference is made to the appropriate literature. Moreover, little space is devoted to the probabilistic and statistical background, the elements of which the reader is assumed to possess. A knowledge of Fourier- and Laplace-transform methods, contour integration, matrices, simple integral equations, and the usual techniques of advanced calculus courses is also required, along with the elements of circuit theory and the principles of radio, radar, and other types of electronic communication systems.

The book is divided into four main parts. Part 1 introduces and describes some of the statistical techniques required in the analysis of communication systems and concludes with an introductory chapter on information theory. Part 2 considers the normal process and some of the processes derived from it and gives a short account of the physical models of shot and thermal noise. Part 3 is concerned mainly with various nonlinear operations that are common in transmission and reception, such as modulation and demodulation, and the calculation of signal-to-noise ratios. Linear measurement, filtering, and prediction and more general distribution problems, the results of which are needed in the general analysis of Part 4, are also examined here. Finally, Part 4 gives a detailed development of a statistical communication theory for the basic single-link communication system, consisting of message or signal source, transmitter, medium of propagation (or channel), and receiver and decision-making elements. The attention here is on optimization and evaluation of receiver performance, e.g., signal detection and extraction, with a short introduction (in Chap. 23) to the still more general question of simultaneous optimization of both the reception and transmission operations.

About a third of the material of Parts 1 to 3 coincides with that in recent publications on noise theory. Part 4 has not appeared in book form before, and much of the rest, Part 3 particularly, is also believed to be new in this respect. While about three-quarters of the subject matter of the book as a whole may be found scattered through the original technical literature, many of the results mentioned in Chaps. 12, 17, 19 to 21, and 23 have not been presented previously.

In spite of its size, this work must still be considered as an introduction, for it has not been possible to treat more than a few important topics, such as detection theory, with any great degree of completeness and still maintain the attempt at a broad coverage which is one aim of the book. In fact, a number of important subjects have been omitted, because they require a separate volume for an adequate treatment, or because they are already so handled in the literature, or in some instances because no theory of any stature is as yet available. Among the topics which have been regretfully dropped are linear and nonlinear communication feedback (and related control) systems, coding methods, sequential detection and estimation, various types of noise-measuring systems, and many more specialized results in the theory of noise, such as the probability distributions of zeros and maxima of random waves. Problems involving large-scale, i.e., many-link, communication networks, game-theory applications, and their relationship to the single-link systems examined in Part 4 are likewise omitted. Some attention is given to Markoff processes, the Fokker-Planck and related equations, with applications to shot and thermal noise, and, later, to questions of optimum linear measurements, their statistical errors, filtering, prediction, and the like. However, a more extensive coverage of the second-moment theory (involving spectra and signal-to-noise ratios) of AM and FM transmission and reception is provided. The same is true for spectra and covariance functions of linearly filtered random and mixed processes, including random pulse trains. The treatment of the normal process, and those processes derived from it, including gaussian and nongaussian functionals, is intermediate, sufficient to satisfy the needs of subsequent chapters. The discussion of optimum systems for detecting and extracting signals from noise backgrounds, on the other hand, is comparatively detailed, though by no means complete, since this whole subject is still in a state of rapid development.
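The second-moment theory referred to in the preceding paragraph rests, for linearly filtered stationary processes, on the relation W_out(f) = |H(f)|^2 W_in(f) together with the Wiener-Khintchine link between spectrum and covariance. The short numerical sketch below checks that relation for a one-pole recursive filter driven by white noise; the filter coefficient, block length, and amount of averaging are arbitrary illustrative choices rather than values from the text.

```python
# Illustrative only, not from the book: numerical check of the second-moment
# relation W_out = |H|^2 * W_in for a linearly filtered stationary process
# (white noise passed through a one-pole recursive filter).
import numpy as np

rng = np.random.default_rng(1)
a, N, blocks, sigma = 0.9, 1024, 200, 1.0   # arbitrary illustrative choices

def one_pole(x, a):
    """y[n] = a*y[n-1] + (1 - a)*x[n], zero initial state."""
    y, acc = np.empty_like(x), 0.0
    for n, xn in enumerate(x):
        acc = a * acc + (1.0 - a) * xn
        y[n] = acc
    return y

# averaged periodograms of the filter output driven by white noise
psd_est = np.zeros(N)
for _ in range(blocks):
    x = sigma * rng.standard_normal(N)
    Y = np.fft.fft(one_pole(x, a))
    psd_est += np.abs(Y) ** 2 / N
psd_est /= blocks

# second-moment theory: W_out(w) = |H(w)|^2 * W_in, with W_in = sigma^2
w = 2.0 * np.pi * np.arange(N) / N
H = (1.0 - a) / (1.0 - a * np.exp(-1j * w))
psd_theory = np.abs(H) ** 2 * sigma**2

# mean relative error is typically a few percent with this much averaging
print(np.mean(np.abs(psd_est - psd_theory) / psd_theory))
```

This is simply a discrete-time analogue of the filtered-spectrum results developed in Chap. 3.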
Finally, besides the analytical features of system optimization and evaluation, considerable attention is given to the interpretation of system structure in such cases: the representation of optimum systems in terms of ordered sets of physically realizable elements. Threshold or weak-signal systems receive particular emphasis in the present study, since they represent the furthest extent to which performance can be pushed under a given environment. Moreover, optimum design for threshold operation, while not usually optimum for strong signals, is nevertheless almost always satisfactory in the latter instances, so that a system designed to make the most of threshold operation will also function acceptably when the interference is weak or ignorable. Threshold performance is important whenever we are required to operate at the limit of system capabilities. Terrestrial, satellite, and space communications offer many conspicuous examples.

As a text, the selection of material may be made rather arbitrarily, depending on the level and scope of the intended course. For example, one might use Part 1 followed by Chaps. 7, 9, 11 and parts of Chaps. 12 to 16. A more advanced program might use portions of Part 1, all of Part 2, most of Part 3, and, as an introduction to decision-theory methods in communica-
