Lossless Coding (无失真编码) — Lecture Slides
5.1 Lossless encoder

General model of a discrete, lossless (无失真), memoryless source:

Input symbol set:  $S = (s_1, s_2, \dots, s_q)$
Code symbol set:   $X = (x_1, x_2, \dots, x_r)$
Code word set:     $s_i \rightarrow W_i = (x_{i1}, x_{i2}, \dots, x_{i l_i}), \quad i = 1, 2, \dots, q$

How to code losslessly? If the statistical characteristics of the source are ignored, coding blocks of N source symbols into code words of length l must satisfy the condition

$r^l \ge q^N$

which guarantees that there are enough code words for all possible source blocks. From this condition we obtain the inequality

$\frac{l}{N} \ge \frac{\log q}{\log r}$
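As a quick illustration (added here, not on the slides), a minimal Python sketch of this bound; the values q = 8 and r = 2 anticipate the eight-letter alphabet and binary code of E.g. 5.1 below.

    import math

    def min_fixed_length(q: int, r: int, N: int = 1) -> int:
        """Smallest l with r**l >= q**N, i.e. l >= N * log(q) / log(r)."""
        return math.ceil(N * math.log(q) / math.log(r))

    # q = 8 source letters, binary code symbols (r = 2), blocks of N = 1 letter:
    print(min_fixed_length(8, 2))   # 3 -> matches the 3-bit FLC of E.g. 5.1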

E.g. 5.1  Suppose we only have the first eight letters of the English alphabet (A to H) in our vocabulary. The Fixed Length Code (FLC) for this set of letters would be

Fixed Length Code
Letter  Codeword    Letter  Codeword
A       000         E       100
B       001         F       101
C       010         G       110
D       011         H       111

A Variable Length Code (VLC) for the same set of letters can be

Variable Length Code 1 (Huffman coding)
Letter  Codeword    Letter  Codeword
A       00          E       101
B       010         F       110
C       011         G       1110
D       100         H       1111

Suppose we have to code the series of letters "A BAD CAB". The fixed length code and variable length code representations of the 7 letters are

Fixed Length Code:        000 001 000 011 010 000 001    (total bits = 21)
Variable Length Code 1:   00 010 00 100 011 00 010       (total bits = 18)

It can be seen that the VLC uses fewer bits than the FLC, because the letters that appear more frequently in the sentence (for example A) are represented with fewer bits (00).
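A minimal sketch (added here, not on the slides) that reproduces this bit count from the two code tables above:

    # Code tables of E.g. 5.1: fixed length code and Variable Length Code 1.
    FLC = {"A": "000", "B": "001", "C": "010", "D": "011",
           "E": "100", "F": "101", "G": "110", "H": "111"}
    VLC1 = {"A": "00", "B": "010", "C": "011", "D": "100",
            "E": "101", "F": "110", "G": "1110", "H": "1111"}

    def encode(message: str, code: dict) -> str:
        """Concatenate the codewords of the letters (spaces are skipped)."""
        return "".join(code[ch] for ch in message if ch != " ")

    for name, code in (("FLC", FLC), ("VLC1", VLC1)):
        bits = encode("A BAD CAB", code)
        print(name, bits, "total bits =", len(bits))   # 21 bits vs 18 bits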

Instantaneous code (prefix code)
(1) It is a uniquely decodable code.
(2) In such a variable length code, no codeword is the prefix of any other codeword.
(3) When decoding, there is no need to look at the following code symbols: each codeword can be recognized immediately, so decoding is carried out without delay.
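To make property (3) concrete, here is a small decoder sketch (added for illustration) using the prefix code Variable Length Code 1 of E.g. 5.1; it outputs a letter as soon as the buffered bits form a codeword, never looking ahead.

    VLC1 = {"A": "00", "B": "010", "C": "011", "D": "100",
            "E": "101", "F": "110", "G": "1110", "H": "1111"}
    DECODE = {w: s for s, w in VLC1.items()}   # codeword -> letter

    def decode_instantaneous(bits: str) -> str:
        """Decode a prefix code symbol by symbol, with no look-ahead."""
        out, buf = [], ""
        for b in bits:
            buf += b
            if buf in DECODE:          # a complete codeword has been read
                out.append(DECODE[buf])
                buf = ""               # the decision is final; start the next codeword
        assert buf == "", "bit stream ended in the middle of a codeword"
        return "".join(out)

    print(decode_instantaneous("00" "010" "00" "100" "011" "00" "010"))   # ABADCAB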

Let us have a look at Example 5.1 again. Now we consider another VLC for the first 8 letters of the English alphabet:

Variable Length Code 2
Letter  Codeword    Letter  Codeword
A       0           E       10
B       1           F       11
C       00          G       000
D       01          H       111

This second variable length code appears to be even more efficient than the first one in terms of representation of the letters:

Variable Length Code 1:   00 010 00 100 011 00 010    (total bits = 18)
Variable Length Code 2:   0 1 0 01 00 0 1             (total bits = 9)

However, the 9-bit stream can no longer be decoded unambiguously:

Original:  0 1 0 01 00 0 1        →  A BAD CAB
Now:       0 10 0 1 0 0 01        →  A EAB AAD
Or:        0 1 0 0 1 0 0 0 1      →  A BAAB AAAB
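The ambiguity above arises because Variable Length Code 2 violates the prefix condition. A quick check (added here, not on the slides):

    def is_prefix_free(code: dict) -> bool:
        """True if no codeword is a prefix of any other codeword."""
        words = list(code.values())
        return not any(a != b and b.startswith(a) for a in words for b in words)

    VLC1 = {"A": "00", "B": "010", "C": "011", "D": "100",
            "E": "101", "F": "110", "G": "1110", "H": "1111"}
    VLC2 = {"A": "0", "B": "1", "C": "00", "D": "01",
            "E": "10", "F": "11", "G": "000", "H": "111"}

    print(is_prefix_free(VLC1))   # True  -> instantaneous, uniquely decodable
    print(is_prefix_free(VLC2))   # False -> e.g. "0" (A) is a prefix of "00" (C)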

Definition 5.1  A Prefix Code is one in which no codeword forms the prefix of any other codeword. Such codes are also called Uniquely Decodable or Instantaneous Codes.

The optimal code
(1) It is a uniquely decodable code.
(2) Its average code length is no greater than that of any other uniquely decodable code.

5.2 Lossless source coding

5.2.1 Fixed length coding theorem

For a discrete, memoryless, stationary, ergodic source S = (s_1, s_2, ..., s_q), we can use r different code symbols X = (x_1, x_2, ..., x_r) to perform fixed length coding. For any ε > 0 and δ > 0, if the N-th extension of the source is coded with code words of length l satisfying

$\frac{l}{N}\log r \ge H(S) + \varepsilon \qquad (\varepsilon > 0)$

then, when N is big enough, the probability of decoding error can be made smaller than δ. The fixed length coding theorem thus gives the theoretical limit of the code length used for fixed length coding.

If the equal length code is required to be uniquely decodable, it must satisfy

$l \log r \ge N \log q, \qquad \text{i.e.} \qquad \frac{l}{N} \ge \frac{\log q}{\log r}$

If N = 1, then $l \ge \frac{\log q}{\log r}$.

Conclusion: for unique decoding, each source symbol needs at least $\log q / \log r$ code symbols to represent it.

When binary code symbols are used (r = 2), then $l \ge \log_2 q$ (bit).

Conclusion: when an equal length binary code is used, the limit code length for each source symbol is $\log_2 q$ bits.
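A small numerical sketch (added here) of why the entropy bound matters: when the source is not equiprobable, $H(S)$ can be much smaller than $\log q$, so fixed length coding of long blocks needs fewer code symbols per source symbol than the $\log q / \log r$ required for N = 1. The probabilities below are the ones that reappear in Eg. 5.7.

    import math

    def entropy(p):
        """Shannon entropy H(S) in bits."""
        return -sum(x * math.log2(x) for x in p if x > 0)

    p = [1/2, 1/4, 1/8, 1/8]          # a skewed 4-symbol source (cf. Eg. 5.7)
    q, r = len(p), 2

    print(math.log(q) / math.log(r))  # 2.0  bits/symbol needed when N = 1
    print(entropy(p) / math.log2(r))  # 1.75 bits/symbol suffice (plus epsilon) for large N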

5.2.2 Unfixed length (variable length) source coding

Several concepts of code type (cf. Eg. 2.4.2):
- Non-singular code and singular code
- Uniquely decodable code
- Prefix code (instantaneous code, non-delay code)
- Kraft theorem
- Unfixed length coding theorem (Shannon First Theorem)

Kraft theorem
Question:     find an instantaneous (real-time), uniquely decodable code.
Method:       study the condition under which the codewords of such a code can be partitioned.
Introduction: the concept of the "code tree".
Conclusion:   state the condition for an instantaneous, uniquely decodable (prefix) code to exist, that is, the Kraft theorem.

Code tree
The correspondence between a VLC and its code tree:
(1) Tree root ↔ starting point of a code word
(2) Number of branches from a tree node ↔ code dimension (size of the code symbol set)
(3) Node ↔ part of a code word
(4) Terminal node ↔ end of a code word
(5) Number of branches from the root to a terminal node ↔ code length
(6) Non-full branch tree ↔ variable length code
(7) Full branch tree ↔ fixed length code

Theorem 5.1 (Kraft Inequality)  A necessary and sufficient condition for the existence of a binary prefix code whose code words have lengths $n_1 \le n_2 \le \dots \le n_L$ is

$\sum_{k=1}^{L} 2^{-n_k} \le 1$

(The proof uses the code tree map.)
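A small check (added here, not on the slides) that evaluates the Kraft sum for the two variable length codes of E.g. 5.1, using the codeword lengths from the tables above:

    def kraft_sum(lengths, r: int = 2) -> float:
        """Sum of r**(-n_k) over the codeword lengths n_k."""
        return sum(r ** (-n) for n in lengths)

    # Variable Length Code 1 (a prefix code): lengths 2,3,3,3,3,3,4,4
    print(kraft_sum([2, 3, 3, 3, 3, 3, 4, 4]))   # 1.0  -> inequality satisfied
    # Variable Length Code 2 (not a prefix code): lengths 1,1,2,2,2,2,3,3
    print(kraft_sum([1, 1, 2, 2, 2, 2, 3, 3]))   # 2.25 -> inequality violated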

5.3.3 Lossless unfixed length (variable length) source coding theorem

Average length of the prefix code. Consider a discrete memoryless stationary source

$\begin{bmatrix} S \\ P \end{bmatrix} = \begin{bmatrix} s_1, & \dots, & s_i, & \dots, & s_q \\ p_1, & \dots, & p_i, & \dots, & p_q \end{bmatrix}$

Its limit source entropy is $H(S)$ and the size of the input symbol set is q. If the code word of $s_i$ has length $l_i$, $i = 1, 2, \dots, q$, then the average code length of the prefix code is

$\bar{l} = \sum_{i=1}^{q} l_i\, p_i$

Lossless unfixed length source coding theorem (Shannon First Theorem): For a discrete memoryless stationary source S with entropy $H(S)$ and code symbols $X = \{x_1, \dots, x_r\}$, we can always find a coding method that constructs a uniquely decodable code whose average code length per source symbol satisfies

$\frac{H(S)}{\log r} \le \bar{l} < \frac{H(S)}{\log r} + 1$

Theorem 5.3  Let X be the set of letters from a DMS with finite entropy H(X), with output symbols $x_k$, $k = 1, 2, \dots, L$, occurring with probabilities $P(x_k)$. Given these parameters, it is possible to construct a code that satisfies the prefix condition and has an average length R satisfying

$H(X) \le R < H(X) + 1$
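A small helper sketch (added here) for the quantities in these theorems: the entropy, the average length $\bar{l} = \sum_i l_i p_i$, and the bound check. The probabilities and codeword lengths below are those of Eg. 5.7 (1/2, 1/4, 1/8, 1/8, with a prefix code of lengths 1, 2, 3, 3, the lengths Huffman coding assigns in section 5.4.1).

    import math

    def entropy(p):
        """Source entropy in bits."""
        return -sum(x * math.log2(x) for x in p if x > 0)

    def average_length(p, lengths):
        """Average code length: sum of l_i * p_i."""
        return sum(l * pi for l, pi in zip(lengths, p))

    p = [1/2, 1/4, 1/8, 1/8]
    lengths = [1, 2, 3, 3]

    H, L_bar = entropy(p), average_length(p, lengths)
    print(H, L_bar)             # 1.75 1.75  -> here R equals H(X)
    print(H <= L_bar < H + 1)   # True, as the theorems guarantee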

Example: a discrete memoryless source

$\begin{bmatrix} S \\ P \end{bmatrix} = \begin{bmatrix} s_1 & s_2 \\ 3/4 & 1/4 \end{bmatrix}$

Its entropy is

$H(S) = \frac{3}{4}\log\frac{4}{3} + \frac{1}{4}\log 4 = 0.811 \ \text{bit/symbol}$

Using binary code symbols (0, 1; r = 2), construct a prefix code: $s_1 \to 0$, $s_2 \to 1$.

The average code length per source symbol is $\bar{l} = 1$, so the code efficiency is

$\eta = \frac{H(S)}{\bar{l}} = 0.811$

In order to improve the code efficiency, a two-dimensional joint code is adopted by considering the 2nd order extension source S². Construct a prefix code for S²:

α_i      P(α_i)    Prefix code
s1s1     9/16      0
s1s2     3/16      10
s2s1     3/16      110
s2s2     1/16      111

Average code length of the extension code:

$\bar{l}_2 = \frac{9}{16}\times 1 + \frac{3}{16}\times 2 + \frac{3}{16}\times 3 + \frac{1}{16}\times 3 = \frac{27}{16}$

Average code length per symbol of the original source S:

$\bar{l} = \frac{\bar{l}_2}{2} = \frac{27}{32}$

Code efficiency:

$\eta_2 = \frac{H(S)}{\bar{l}} = 0.961$

Similarly, for the 3rd and 4th order extensions, $\eta_3 = 0.985$ and $\eta_4 = 0.991$.
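A short numerical check of this extension coding gain (added here; it only reuses the numbers in the table above):

    import math

    H_S = 0.75 * math.log2(4 / 3) + 0.25 * math.log2(4)   # entropy of S, about 0.811

    p2 = [9/16, 3/16, 3/16, 1/16]       # probabilities of s1s1, s1s2, s2s1, s2s2
    lengths2 = [1, 2, 3, 3]             # lengths of the prefix code for S^2

    l2_bar = sum(l * p for l, p in zip(lengths2, p2))   # 27/16 = 1.6875
    l_per_symbol = l2_bar / 2                           # 27/32 = 0.84375
    print(round(H_S / l_per_symbol, 4))                 # 0.9615, the 0.961 on the slide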

Conclusion: using unfixed (variable) length codes, we can achieve quite high coding efficiency even when N is not very large, and we can still realize lossless coding. Moreover, as N increases, the coding efficiency approaches 1.

5.4 Typical examples of lossless source coding

5.4.1 Huffman coding
- Coding method
- Characteristic
- Application

Eg. 5.7 (Huffman coding)

$\begin{bmatrix} S \\ P \end{bmatrix} = \begin{bmatrix} s_1 & s_2 & s_3 & s_4 \\ 1/2 & 1/4 & 1/8 & 1/8 \end{bmatrix}$

Coding steps (a code sketch of this procedure follows the list):
1) Sort the source messages U according to their probabilities in descending order.
2) Start from the two smallest probabilities: the lower branch (the one with the smaller probability) is assigned "1", and the upper branch is assigned "0". If the two probabilities are equal, still assign "1" to the lower branch and "0" to the upper one.
3) Combine the two coded branches into one, then reorder and code again.
4) Repeat step 3) until the sum of the probabilities is 1.
5) Read back from the rightmost end to the left along the tree branches to obtain the corresponding code word, e.g. U3 → "110".
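The following Python sketch (added for illustration, not from the slides) carries out the same bottom-up merging with a heap instead of the manual reorder-and-recode of steps 2) to 4). The "1"/"0" assignment of the two merged branches follows step 2), but ties may be broken differently than in the slide figures, so the codewords (though not their lengths) can differ.

    import heapq
    from itertools import count

    def huffman(probs):
        """Return {symbol: codeword} for a dict mapping symbols to probabilities."""
        tick = count()    # tie-breaker so equal probabilities never compare the dicts
        heap = [(p, next(tick), {s: ""}) for s, p in probs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)    # smallest probability
            p2, _, c2 = heapq.heappop(heap)    # second smallest probability
            # Prepend "1" to the lower (smaller) branch and "0" to the upper branch.
            merged = {s: "1" + w for s, w in c1.items()}
            merged.update({s: "0" + w for s, w in c2.items()})
            heapq.heappush(heap, (p1 + p2, next(tick), merged))
        return heap[0][2]

    # Eg. 5.7: probabilities (1/2, 1/4, 1/8, 1/8) give codeword lengths 1, 2, 3, 3.
    code = huffman({"s1": 1/2, "s2": 1/4, "s3": 1/8, "s4": 1/8})
    print(code)
    print({s: len(w) for s, w in code.items()})   # {'s1': 1, 's2': 2, 's3': 3, 's4': 3}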

E.g. 1 (Huffman coding): [figure on the original slides]

E.g. 2  Eight symbols (A to H) occur with the probabilities given in the table below. Please draw the Huffman coding procedure and compute the coding efficiency.

Symbol   Probability
A        0.10
B        0.18
C        0.40
D        0.05
E        0.06
F        0.10
G        0.07
H        0.04

[Huffman coding procedure: figures on the original slides]

(1) The entropy of this source is

$H = -0.10\log_2 0.10 - 0.18\log_2 0.18 - 0.40\log_2 0.40 - 0.05\log_2 0.05 - 0.06\log_2 0.06 - 0.10\log_2 0.10 - 0.07\log_2 0.07 - 0.04\log_2 0.04 = 2.5524 \ \text{bits/symbol}$

(2) The average length of codeword is

$\bar{l} = 0.10\times 3 + 0.18\times 3 + 0.40\times 1 + \cdots$
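A short sketch (added here) that reproduces the entropy value; once the Huffman codeword lengths are read off the coding tree and the average length $\bar{l}$ is completed as above, the coding efficiency follows as $\eta = H / \bar{l}$.

    import math

    # Probabilities of the eight symbols A to H in E.g. 2.
    p = {"A": 0.10, "B": 0.18, "C": 0.40, "D": 0.05,
         "E": 0.06, "F": 0.10, "G": 0.07, "H": 0.04}

    H = -sum(pi * math.log2(pi) for pi in p.values())
    print(round(H, 4))   # 2.5524 bits/symbol, matching the value above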
