Thumbnail
Access Restriction
Open

Author Linder, Tamás ♦ Zamir, Ram
Source CiteSeerX
Content type Text
File Format PDF
Subject Domain (in DDC) Computer science, information & general works ♦ Data processing & computer science
Subject Keyword Asymptotic Tightness ♦ Stationary Source ♦ Source Distribution ♦ Relaxed Condition ♦ Source Vector ♦ Informational Divergence ♦ Fundamental Property ♦ Rate Distortion Theory ♦ SLB Relative ♦ New Result ♦ Difference Distortion Measure ♦ Finite Differential Entropy ♦ General Difference Distortion Measure ♦ Coordinated Science Laboratory ♦ Norm-based Distortion ♦ Rate Distortion Function ♦ Source Density ♦ Single Letter Difference Distortion ♦ Finite αth Moment ♦ Weak Assumption ♦ Key Convergence Result
Abstract New results are proved on the convergence of the Shannon lower bound (SLB) to the rate distortion function as the distortion decreases to zero. The key convergence result is proved using a fundamental property of informational divergence. As a corollary, it is shown that the SLB is asymptotically tight for norm-based distortions, when the source vector has a finite differential entropy and a finite αth moment for some α > 0, with respect to the given norm. Moreover, we derive a theorem of Linkov on the asymptotic tightness of the SLB for general difference distortion measures with more relaxed conditions on the source density. We also show that the SLB relative to a stationary source and single letter difference distortion is asymptotically tight under very weak assumptions on the source distribution. Key words: rate distortion theory, Shannon lower bound, difference distortion measures, stationary sources T. Linder is with the Coordinated Science Laboratory, University of Illinoi...
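For orientation, a rough sketch of the quantities the abstract refers to (standard rate–distortion notation assumed; this is not reproduced from the paper itself). For a source X with density and a difference distortion measure d(x, y) = ρ(x − y), the Shannon lower bound and the asymptotic-tightness statement take the form:

```latex
% Shannon lower bound for a difference distortion d(x,y) = \rho(x-y):
% R_{\mathrm{SLB}}(D) = h(X) - \phi(D), where \phi(D) is the maximum
% differential entropy over noise densities with distortion at most D.
\[
  R_{\mathrm{SLB}}(D) \;=\; h(X) \;-\; \max_{Z:\ \mathbb{E}[\rho(Z)] \le D} h(Z)
  \;\le\; R(D),
\]
% Asymptotic tightness (the convergence result the abstract describes):
\[
  \lim_{D \to 0} \bigl( R(D) - R_{\mathrm{SLB}}(D) \bigr) \;=\; 0 .
\]
```

Here h(·) denotes differential entropy and R(D) the rate distortion function; the paper's conditions (finite h(X) and a finite αth moment for some α > 0 under a norm-based distortion) are what guarantee the limit above.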
Educational Role Student ♦ Teacher
Age Range above 22 years
Educational Use Research
Education Level UG and PG ♦ Career/Technical Study
Publisher Date 1997-01-01