Background
Robert Molten Gray was born on November 1, 1943, in San Diego, California, United States, the son of Augustine Heard and Elizabeth DuBois Gray.
Source coding theory has as its goal the characterization of the optimal performance achievable in idealized communication systems which must code an information source for transmission over a digital communication or storage channel to a user. The user must decode the information into a form that is a good approximation to the original. A code is optimal within some class if it achieves the best possible fidelity given whatever constraints are imposed on the code by the available channel. In theory, the primary constraint imposed on a code by the channel is its rate or resolution, the number of bits per second or per input symbol that it can transmit from sender to receiver. In the real world, complexity may be as important as rate. The origins and the basic form of much of the theory date from Shannon's classical development of noiseless source coding and source coding subject to a fidelity criterion (also called rate-distortion theory) [73, 74]. Shannon combined a probabilistic notion of information with limit theorems from ergodic theory and a random coding technique to describe the optimal performance of systems with a constrained rate but with unconstrained complexity and delay. An alternative approach, called asymptotic or high-rate quantization theory, based on different techniques and approximations, was introduced by Bennett at approximately the same time [4]. This approach constrained the delay but allowed the rate to grow large.
http://www.amazon.com/gp/product/0792390482/?tag=2022091-20
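The rate discussed above is measured in bits per input symbol, and for noiseless source coding the optimal achievable rate of a memoryless source is its Shannon entropy. As a minimal illustration (my sketch, not an example from the book), a skewed source can be coded in fewer bits per symbol than a fixed-length code would use:

```python
import math

def entropy(probs):
    """Shannon entropy in bits per symbol: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A skewed four-symbol source needs only 1.75 bits per symbol on average,
# versus the 2 bits a fixed-length binary code would spend.
skewed = [0.5, 0.25, 0.125, 0.125]
uniform = [0.25] * 4
print(entropy(skewed))   # 1.75
print(entropy(uniform))  # 2.0
```

The uniform source is incompressible in this sense: entropy equals the fixed-length rate exactly when all symbols are equally likely.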
Random Processes: A Mathematical Approach for Engineers (Prentice-Hall...), by Robert M. Gray and Lee D. Davisson
http://www.amazon.com/gp/product/0137528825/?tag=2022091-20
Probability, Random Processes, and Ergodic Properties is for mathematically inclined information/communication theorists and people working in signal processing. It will also interest those working with random or stochastic processes, including mathematicians, statisticians, and economists.
Highlights:
• A complete tour of the book and guidelines for its use are given in the Introduction, so readers can see at a glance the topics of interest.
• Structures the mathematics for an engineering audience, with emphasis on engineering applications.
New in the Second Edition:
• Much of the material has been rearranged and revised for pedagogical reasons.
• The original first chapter has been split in order to allow a more thorough treatment of basic probability before tackling random processes and dynamical systems.
• The final chapter has been broken into two pieces to provide separate emphasis on process metrics and the ergodic decomposition of affine functionals.
• Many classic inequalities are now incorporated into the text, along with proofs, and many citations have been added.
http://www.amazon.com/gp/product/1441910891/?tag=2022091-20
This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to the Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New in this edition:
• Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
• Expanded discussion of results from ergodic theory relevant to information theory
• Expanded treatment of B-processes, processes formed by stationary coding of memoryless sources
• New material on trading off information and distortion, including the Marton inequality
• New material on the properties of optimal and asymptotically optimal source codes
• New material on the relationships between source coding and rate-constrained simulation or modeling of random processes
Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.
http://www.amazon.com/gp/product/1441979697/?tag=2022091-20
Herb Caen, a popular columnist for the San Francisco Chronicle, recently quoted a Voice of America press release as saying that it was reorganizing in order to "eliminate duplication and redundancy." This quote both states a goal of data compression and illustrates its common need: the removal of duplication (or redundancy) can provide a more efficient representation of data, and the quoted phrase is itself a candidate for such surgery. Not only can the number of words in the quote be reduced without losing information, but the statement would actually be enhanced by such compression, since it will no longer exemplify the wrong that the policy is supposed to correct. Here compression can streamline the phrase and minimize the embarrassment while improving the English style. Compression in general is intended to provide efficient representations of data while preserving the essential information contained in the data. This book is devoted to the theory and practice of signal compression, i.e., data compression applied to signals such as speech, audio, images, and video signals (excluding other data types such as financial data or general-purpose computer data). The emphasis is on the conversion of analog waveforms into efficient digital representations and on the compression of digital information into the fewest possible bits. Both operations should yield the highest possible reconstruction fidelity subject to constraints on the bit rate and implementation complexity.
http://www.amazon.com/gp/product/0792391810/?tag=2022091-20
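As a toy illustration of the redundancy removal described above (my sketch, not an example from the book), a general-purpose compressor shrinks highly repetitive data dramatically, and losslessly, so nothing essential is discarded:

```python
import zlib

# Text full of duplication and redundancy compresses very well.
redundant = b"duplication and redundancy, " * 20
compressed = zlib.compress(redundant)
print(len(redundant), len(compressed))  # the compressed form is far smaller

# Lossless: the original is recovered exactly.
assert zlib.decompress(compressed) == redundant
```

Signal compression as treated in the book goes further than this lossless example: it trades a controlled amount of reconstruction fidelity for additional rate savings.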
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
http://www.amazon.com/gp/product/0387973710/?tag=2022091-20
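Two of the information measures named above, relative entropy (discrimination) and mutual information, are simple to compute for finite alphabets; mutual information is exactly the discrimination between a joint distribution and the product of its marginals. A minimal sketch under that definition (my illustration, not the book's notation):

```python
import math

def relative_entropy(p, q):
    """Discrimination / relative entropy D(p||q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """I(X;Y) = D(joint || product of marginals); joint is a 2-D list of probabilities."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(
        joint[i][j] * math.log2(joint[i][j] / (px[i] * py[j]))
        for i in range(len(joint)) for j in range(len(joint[i]))
        if joint[i][j] > 0
    )

# Independent bits share no information; perfectly correlated bits share one full bit.
indep = [[0.25, 0.25], [0.25, 0.25]]
corr = [[0.5, 0.0], [0.0, 0.5]]
print(mutual_information(indep))  # 0.0
print(mutual_information(corr))   # 1.0
```

The entropy-rate and information-rate quantities in the book are the limiting per-symbol normalizations of such measures over long blocks.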
The Fourier transform is one of the most important mathematical tools in a wide variety of fields in science and engineering. In the abstract it can be viewed as the transformation of a signal in one domain (typically time or space) into another domain, the frequency domain. Applications of Fourier transforms, often called Fourier analysis or harmonic analysis, provide useful decompositions of signals into fundamental or "primitive" components, provide shortcuts to the computation of complicated sums and integrals, and often reveal hidden structure in data. Fourier analysis lies at the base of many theories of science and plays a fundamental role in practical engineering design. The origins of Fourier analysis in science can be found in Ptolemy's decomposing celestial orbits into cycles and epicycles and Pythagoras' decomposing music into consonances. Its modern history began with the eighteenth-century work of Bernoulli, Euler, and Gauss on what later came to be known as Fourier series. J. Fourier in his 1822 Théorie analytique de la chaleur [16] (still available as a Dover reprint) was the first to claim that arbitrary periodic functions could be expanded in a trigonometric (later called a Fourier) series, a claim that was eventually shown to be incorrect, although not too far from the truth. It is an amusing historical sidelight that this work won a prize from the French Academy, in spite of serious concerns expressed by the judges (Laplace, Lagrange, and Legendre) regarding Fourier's lack of rigor.
http://www.amazon.com/gp/product/0792395859/?tag=2022091-20
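The decomposition into "primitive" components described above can be seen directly with the discrete Fourier transform: a signal built from two sinusoids produces a spectrum whose peaks sit exactly at their frequencies. A small stdlib-only sketch (my illustration, not an example from the book):

```python
import cmath
import math

def dft(x):
    """Discrete Fourier transform: X[k] = sum_t x[t] * exp(-2*pi*i*k*t/N)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

# A time-domain signal built from two sinusoidal components,
# at 3 and 7 cycles over the N-sample window.
N = 64
signal = [math.sin(2 * math.pi * 3 * t / N) + 0.5 * math.sin(2 * math.pi * 7 * t / N)
          for t in range(N)]

# The frequency domain reveals the hidden structure: the two largest
# magnitudes below the Nyquist index are at k = 3 and k = 7.
spectrum = [abs(c) for c in dft(signal)]
peaks = sorted(sorted(range(N // 2), key=lambda k: -spectrum[k])[:2])
print(peaks)  # [3, 7]
```

In practice one would use a fast Fourier transform (FFT) rather than this O(N^2) direct sum, but the decomposition it computes is the same.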
Electrical engineering educator
Bachelor of Science in Electrical Engineering, Massachusetts Institute of Technology, 1966. Master of Science in Electrical Engineering, Massachusetts Institute of Technology, 1966. Doctor of Philosophy in Electrical Engineering, University of Southern California, Los Angeles, 1969.
He is best known for his contributions to quantization and compression, particularly the development of vector quantization. Born in 1943 in San Diego, Gray grew up in Coronado, California. He was a middle child in a family of five.
Gray followed his two older brothers to the Massachusetts Institute of Technology.
Gray earned the Doctor of Philosophy in Electrical Engineering from the University of Southern California in 1969; his doctoral advisor was Robert A. Scholtz.
During his graduate school years, he played guitar in the rock band "MCF". He began his career at the United States Naval Ordnance Laboratory, following a family naval tradition.
Gray is currently Editor-in-Chief of Foundations and Trends in Signal Processing.
He has also been Editor-in-Chief of the Institute of Electrical and Electronics Engineers Transactions on Information Theory (1981–1983), and served on the Institute of Electrical and Electronics Engineers Information Theory Society Board of Governors (1974–1980, 1984–1987) and the Institute of Electrical and Electronics Engineers Signal Processing Society Board of Governors (1999–2001).
Fireman, La Honda Volunteer Fire Brigade, California, 1970-1980; president, 1971-1972. Coach, American Youth Soccer Organization, La Honda, 1971-1978; commissioner, 1976-1978. Fellow, Institute of Electrical and Electronics Engineers (Centennial Medal 1984, Third Millennium Medal 2000, Jack S. Kilby Signal Processing Medal 2008) and Institute of Mathematical Statistics.
Member, Information Theory Society, Institute of Electrical and Electronics Engineers (associate editor of the Transactions 1977-1980, editor-in-chief 1980-1983, paper prize 1976, Golden Jubilee award for technological achievement 1998); Signal Processing Society, Institute of Electrical and Electronics Engineers (Senior award 1983, Society award 1993, program co-chairman of the 1997 International Conference on Image Processing, Technology Achievement award 1998, Presidential Mentoring award 2002, Distinguished Alumni award, University of Southern California, 2003, Meritorious Service award 2006); National Academy of Engineering.
Married Arlene Frances Ericson. Children: Timothy M., Lori A.