000 05679nam a2200685 i 4500
001 6813326
003 IEEE
005 20200413152913.0
006 m eo d
007 cr cn |||m|||a
008 140113s2014 caua foab 000 0 eng d
020 _a9781598298086
_qebook
020 _z9781598298079
_qpaperback
024 7 _a10.2200/S00556ED1V01Y201312COM009
_2doi
035 _a(CaBNVSL)swl00403030
035 _a(OCoLC)868156115
040 _aCaBNVSL
_beng
_erda
_cCaBNVSL
_dCaBNVSL
050 4 _aQ385
_b.G535 2014
082 0 4 _a001.539
_223
090 _a
_bMoCl
_e201312COM009
100 1 _aGibson, Jerry D.,
_eauthor.
245 1 0 _aInformation theory and rate distortion theory for communications and compression /
_cJerry Gibson.
264 1 _aSan Rafael, California (1537 Fourth Street, San Rafael, CA 94901 USA) :
_bMorgan & Claypool,
_c2014.
300 _a1 PDF (xii, 115 pages) :
_billustrations.
336 _atext
_2rdacontent
337 _aelectronic
_2isbdmedia
338 _aonline resource
_2rdacarrier
490 1 _aSynthesis lectures on communications,
_x1932-1708 ;
_v# 9
538 _aMode of access: World Wide Web.
538 _aSystem requirements: Adobe Acrobat Reader.
500 _aPart of: Synthesis digital library of engineering and computer science.
500 _aSeries from website.
504 _aIncludes bibliographical references (pages 111-113).
505 0 _a1. Communications, compression and fundamental limits -- 1.1 Shannon's three theorems -- 1.2 The information transmission theorem or separation theorem -- 1.3 Notes and additional references --
505 8 _a2. Entropy and mutual information -- 2.1 Entropy and mutual information -- 2.2 Chain rules for entropy and mutual information -- 2.3 Differential entropy and mutual information for continuous random variables -- 2.4 Relative entropy and mutual information -- 2.5 Data processing inequality -- 2.6 Notes and additional references --
505 8 _a3. Lossless source coding -- 3.1 The lossless source coding problem -- 3.2 Definitions, properties, and the source coding theorem -- 3.3 Huffman coding and code trees -- 3.4 Elias coding and arithmetic coding -- 3.5 Lempel-Ziv coding -- 3.6 Kraft inequality -- 3.7 The AEP and data compression -- 3.8 Notes and additional references --
505 8 _a4. Channel capacity -- 4.1 The definition of channel capacity -- 4.2 Properties of channel capacity -- 4.3 Calculating capacity for discrete memoryless channels -- 4.4 The channel coding theorem -- 4.5 Decoding and jointly typical sequences -- 4.6 Fano's inequality and the converse to the coding theorem -- 4.7 The additive Gaussian noise channel and capacity -- 4.8 Converse to the coding theorem for Gaussian channels -- 4.9 Expressions for capacity and the Gaussian channel -- 4.9.1 Parallel Gaussian channels [4, 5] -- 4.9.2 Channels with colored Gaussian noise [4, 5] -- 4.10 Band-limited channels -- 4.11 Notes and additional references --
505 8 _a5. Rate distortion theory and lossy source coding -- 5.1 The rate distortion function for discrete memoryless sources -- 5.2 The rate distortion function for continuous amplitude sources -- 5.3 The Shannon lower bound and the optimum backward channel -- 5.3.1 Binary symmetric source -- 5.3.2 Gaussian source -- 5.4 Stationary Gaussian sources with memory -- 5.5 The rate distortion function for a Gaussian autoregressive source -- 5.6 Composite source models and conditional rate distortion functions -- 5.7 The rate distortion theorem for independent Gaussian sources-revisited -- 5.8 Applications of R(D) to scalar quantization -- 5.9 Notes and additional references --
505 8 _aA. Useful inequalities -- B. Laws of large numbers -- B.1 Inequalities and laws of large numbers -- B.1.1 Markov's inequality -- B.1.2 Chebychev's inequality -- B.1.3 Weak law of large numbers -- B.1.4 Strong law of large numbers -- C. Kuhn-Tucker conditions -- Bibliography -- Author's biography.
506 1 _aAbstract freely available; full-text restricted to subscribers or individual document purchasers.
510 0 _aCompendex
510 0 _aINSPEC
510 0 _aGoogle Scholar
510 0 _aGoogle Book Search
520 3 _aThis book is targeted specifically at problems in communications and compression, providing the fundamental principles and results of information theory and rate distortion theory for these applications and presenting methods that have proved, and will prove, useful in analyzing and designing real systems. The chapters contain treatments of entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; however, it is the selection, ordering, and presentation of the topics within these broad categories that is unique to this concise book.
530 _aAlso available in print.
588 _aTitle from PDF title page (viewed on January 14, 2014).
650 0 _aInformation theory
_xMathematics.
650 0 _aRate distortion theory
_xMathematics.
650 0 _aData compression (Telecommunication)
_xMathematics.
653 _ainformation theory
653 _arate distortion theory
653 _afundamental limits on communications
653 _afundamental limits on compression
776 0 8 _iPrint version:
_z9781598298079
830 0 _aSynthesis digital library of engineering and computer science.
830 0 _aSynthesis lectures on communications ;
_v# 9.
_x1932-1708
856 4 2 _3Abstract with links to resource
_uhttp://ieeexplore.ieee.org/servlet/opac?bknumber=6813326
856 4 0 _3Abstract with links to full text
_uhttp://dx.doi.org/10.2200/S00556ED1V01Y201312COM009
999 _c562047
_d562047