Elements of Information Theory

Binding:
Hardcover
EAN:
9780471241959
Language:
English
Author:
Thomas M. Cover, Joy A. Thomas
Publisher:
John Wiley & Sons
Edition:
2nd ed.
Number of pages:
784
Publication date:
08.09.2006
ISBN:
0471241954

Elements of Information Theory, Second Edition, covers the standard topics of information theory, such as entropy, data compression, channel capacity, rate distortion, multi-user theory, and hypothesis testing. It presents applications to communications, statistics, complexity theory, and investment. Chapters 1-9 cover the asymptotic equipartition property, data compression, and channel capacity, culminating in the capacity of the Gaussian channel. Chapters 10-17 include rate distortion, the method of types, Kolmogorov complexity, network information theory, universal source coding, and portfolio theory. The first edition of this book is the most successful book on information theory on the market today. Adoptions have remained strong since the book's publication in 1991.
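For readers new to the subject, the book's central quantity, entropy, can be illustrated in a few lines of code. The Python sketch below is not from the book; it simply evaluates the definition H(X) = -sum p(x) log2 p(x) for a made-up two-symbol source, purely as an illustration of the kind of quantity the text studies.

    import math

    def entropy(probs):
        # Shannon entropy in bits of a discrete distribution,
        # given as a list of probabilities that sum to 1.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A biased coin with P(heads) = 0.9 carries well under 1 bit per toss ...
    print(entropy([0.9, 0.1]))  # ~0.469 bits
    # ... while a fair coin attains the two-symbol maximum of exactly 1 bit.
    print(entropy([0.5, 0.5]))  # 1.0 bit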


"This book is recommended reading, both as a textbook and as a reference." (Computing Reviews.com, December 28, 2006)

About the authors
THOMAS M. COVER, PhD, is Professor in the departments of electrical engineering and statistics, Stanford University. A recipient of the 1991 IEEE Claude E. Shannon Award, Dr. Cover is a past president of the IEEE Information Theory Society, a Fellow of the IEEE and the Institute of Mathematical Statistics, and a member of the National Academy of Engineering and the American Academy of Arts and Sciences. He has authored more than 100 technical papers and is coeditor of Open Problems in Communication and Computation. JOY A. THOMAS, PhD, is the Chief Scientist at Stratify, Inc., a Silicon Valley start-up specializing in organizing unstructured information. After receiving his PhD at Stanford, Dr. Thomas spent more than nine years at the IBM T. J. Watson Research Center in Yorktown Heights, New York. Dr. Thomas is a recipient of the IEEE Charles LeGeyt Fortescue Fellowship.

Back cover text
The latest edition of this classic is updated with new problem sets and material.

The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.

The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.

Contents
Preface to the Second Edition.
Preface to the First Edition.
Acknowledgments for the Second Edition.
Acknowledgments for the First Edition.
1. Introduction and Preview.
2. Entropy, Relative Entropy, and Mutual Information.
3. Asymptotic Equipartition Property.
4. Entropy Rates of a Stochastic Process.
5. Data Compression.
6. Gambling and Data Compression.
7. Channel Capacity.
8. Differential Entropy.
9. Gaussian Channel.
10. Rate Distortion Theory.
11. Information Theory and Statistics.
12. Maximum Entropy.
13. Universal Source Coding.
14. Kolmogorov Complexity.
15. Network Information Theory.
16. Information Theory and Portfolio Theory.
17. Inequalities in Information Theory.
Bibliography.
List of Symbols.
Index.

