Chapter 9 Fundamental Limits in Information Theory
Problems (pp. 618-625): 9.3, 9.5, 9.10, 9.11
Chapter 9 Fundamental Limits in Information Theory
9.1 Introduction
9.2 Uncertainty, Information, and Entropy
9.3 Source-Coding Theorem
9.4 Data Compaction
9.5 Discrete Memoryless Channels
9.6 Mutual Information
9.7 Channel Capacity
9.8 Channel-Coding Theorem
9.9 Differential Entropy and Mutual Information for Continuous Ensembles
9.10 Information Capacity Theorem
9.11 Implications of the Information Capacity Theorem
9.12 Information Capacity of Colored Noise Channel
9.13 Rate Distortion Theory
9.14 Data Compression
9.15 Summary and Discussion

Main Topics:
- Entropy: the basic measure of information
- Source coding and data compaction
- Mutual information and channel capacity
- Channel coding
- Information capacity theorem
- Rate-distortion theory (source coding)
9.1 Introduction
Purpose of a communication system: to carry information-bearing baseband signals from one place to another over a communication channel.
Requirements of a communication system:
- Efficient: source coding
- Reliable: error-control coding
Questions:
1. What is the irreducible complexity below which a signal cannot be compressed?
2. What is the ultimate transmission rate for reliable communication over a noisy channel?
To answer them we invoke information theory (Shannon, 1948): the mathematical modeling and analysis of communication systems.
Answers:
1. The entropy of a source.
2. The capacity of a channel.
A remarkable result: if (the entropy of the source) < (the capacity of the channel), then error-free communication over the channel can be achieved.
9.2 Uncertainty, Information, and Entropy
Uncertainty
A discrete memoryless source is modeled as a discrete random variable S whose successive symbols are statistically independent, with
P(S = s_k) = p_k,  k = 0, 1, ..., K-1        (9.1)
Σ_{k=0}^{K-1} p_k = 1        (9.2)
Before the event S = s_k occurs there is an amount of uncertainty; when it occurs there is an amount of surprise; after it occurs there is a gain in information (the resolution of uncertainty). As the probability of an event increases, the surprise and the information it carries decrease.
e.g.: an event with p_k = 1 brings no surprise and no information, and information(s_k) > information(s_i) whenever p_k < p_i.
So the amount of information is related to the inverse of the probability of occurrence:
I(s_k) = log(1/p_k)        (9.3)
Amount of information -- properties:
- I(s_k) = 0 for p_k = 1 (a certain event carries no information)
- I(s_k) ≥ 0 for 0 ≤ p_k ≤ 1
- I(s_k) > I(s_i) for p_k < p_i
For base 2, the unit is called the bit:
I(s_k) = log2(1/p_k) = -log2(p_k)  bits        (9.4)
Entropy -- the mean of I(s_k)
It is a measure of the average information content per source symbol.
Definition:
H(S) = E[I(s_k)] = Σ_{k=0}^{K-1} p_k log2(1/p_k)        (9.9)
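As a quick numerical illustration, here is a minimal Python sketch of Equation (9.9); the source probabilities are assumed example values.

```python
import math

def entropy(probs):
    """H(S) = sum_k p_k * log2(1/p_k) in bits, skipping zero-probability terms."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits per symbol (assumed source)
print(entropy([0.25] * 4))                  # equiprobable: log2(4) = 2 bits
print(entropy([1.0, 0.0, 0.0, 0.0]))        # deterministic: 0 bits
```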
Some Properties of Entropy
Bounds:
0 ≤ H(S) ≤ log2(K)        (9.10)
Lower bound: H(S) = 0 if and only if p_k = 1 for some k (and the remaining probabilities are all zero) -- no uncertainty.
Upper bound: H(S) = log2(K) if and only if p_k = 1/K for all k (equiprobable symbols); this can be proved with the method of Lagrange multipliers.
Proof:
1. Lower bound: every term p_k log2(1/p_k) is nonnegative and equals zero only when p_k = 0 or p_k = 1.
2. Upper bound: compare the given distribution with the equiprobable one and use the inequality log(x) ≤ x - 1 (Figure 9.1) to show H(S) - log2(K) ≤ 0.
Figure 9.1 Graphs of the functions (x - 1) and log(x) versus x.

Example 9.1
Entropy of Binary Memoryless Source
Symbol 0 occurs with probability p_0 and symbol 1 with probability p_1 = 1 - p_0.
Entropy of the source:
H(S) = -p_0 log2(p_0) - (1 - p_0) log2(1 - p_0)  bits        (9.15)
Entropy function (Figure 9.2):
H(p_0) = -p_0 log2(p_0) - (1 - p_0) log2(1 - p_0)        (9.16)

Figure 9.2 Entropy function H(p_0).
The H(S) of Equation (9.15) gives the entropy of a discrete memoryless source with a particular source alphabet. The entropy function of Equation (9.16) is a function of the prior probability p_0 defined on the interval [0, 1].

Extension of a discrete memoryless source
Extended source S^n: blocks consisting of n successive source symbols; the extended source alphabet has K^n distinct blocks. Because the source is discrete and memoryless, successive symbols are statistically independent, and the entropy satisfies
H(S^n) = n H(S)        (9.17)

Example 9.2 Entropy of extended source
Given the source alphabet and the symbol probabilities, the entropy of the source and the entropy of the extended source are computed; the result confirms Equation (9.17).
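A hedged sketch that checks H(S^2) = 2 H(S); the three-symbol probabilities below are illustrative assumptions, not necessarily those of the textbook example.

```python
import math
from itertools import product

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

p = [0.25, 0.25, 0.5]                                    # assumed source statistics
blocks = [math.prod(c) for c in product(p, repeat=2)]    # all K^2 block probabilities
print(entropy(p))        # H(S)   = 1.5 bits
print(entropy(blocks))   # H(S^2) = 3.0 bits = 2 * H(S), as in Equation (9.17)
```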
9.3 Source-Coding Theorem
1. Why? Efficiency.
2. Need: knowledge of the statistics of the source.
3. Example: variable-length code -- short code words for frequent source symbols, long code words for rare source symbols.
4. Requirements of an efficient source encoder:
- The code words are in binary form.
- The source code is uniquely decodable.
5. Figure 9.3 shows a source encoding scheme.

Figure 9.3 Source encoding: the encoder maps each source symbol into a block of 0s and 1s.
Assume:
- alphabet: K different symbols
- probability of the k-th symbol s_k: p_k, k = 0, 1, ..., K-1
- binary code-word length assigned to symbol s_k: l_k
Average code-word length (the average number of bits per source symbol):
L_bar = Σ_{k=0}^{K-1} p_k l_k        (9.18)
Coding efficiency:
η = L_min / L_bar        (9.19)
where L_min is the minimum possible value of L_bar. The encoder is efficient when η approaches 1.
How is the minimum value L_min determined?
Answer: Shannon's first theorem -- the source-coding theorem.
Given a discrete memoryless source of entropy H(S), the average code-word length L_bar for any distortionless source-encoding scheme is bounded as
L_bar ≥ H(S)        (9.20)
Hence L_min = H(S), and the coding efficiency may be written as
η = H(S) / L_bar        (9.21)
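A small illustration of Equations (9.18)-(9.21), under assumed symbol probabilities and assumed code-word lengths (the values are examples, not taken from the slides):

```python
import math

p = [0.5, 0.25, 0.125, 0.125]          # assumed symbol probabilities
l = [1, 2, 3, 3]                       # assumed lengths, e.g. code words 0, 10, 110, 111

L_bar = sum(pk * lk for pk, lk in zip(p, l))            # Equation (9.18)
H = sum(pk * math.log2(1 / pk) for pk in p if pk > 0)   # source entropy H(S)
eta = H / L_bar                                         # Equation (9.21)
print(L_bar, H, eta)   # 1.75, 1.75, 1.0 -> the bound L_bar >= H(S) is met with equality
```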
9.4 Data Compaction
Why data compaction? Signals generated by physical sources contain a significant amount of redundant information, so transmitting them directly is not efficient.
Requirement of data compaction: not only efficient in terms of the average number of bits per symbol, but also exact in the sense that the original data can be reconstructed with no loss of information -- lossless data compression.
Examples: prefix coding, Huffman coding, Lempel-Ziv coding.

9.4.1 Prefix Coding
Discrete memoryless source:
- alphabet: {s_0, s_1, ..., s_{K-1}}
- statistics: {p_0, p_1, ..., p_{K-1}}
Requirement: the code must be uniquely decodable.
Definition: a prefix code is a code in which no code word is the prefix of any other code word.
The code word of symbol s_k is written (m_{k1}, m_{k2}, ..., m_{kn}), where each m_{ki} ∈ {0, 1} and n is the code-word length; the initial part (m_{k1}, ..., m_{ki}), i ≤ n, is called a prefix of the code word.

Table 9.2
Code I and Code III are not prefix codes; Code II is a prefix code.
Decoding: use a decision tree (Figure 9.4).
Procedure:
1. Start at the initial state.
2. Check the received bit. If it is 1, the decoder moves to the next decision point and repeats step 2. If it is 0, the decoder moves to a terminal state, emits the corresponding symbol, and goes back to step 1.

Figure 9.4 Decision tree for code II of Table 9.2.
e.g.: 1011111000... → s1 s3 s2 s0 s0 ...
Properties of a prefix code:
1. Uniquely decodable.
2. Satisfies the Kraft-McMillan inequality
Σ_{k=0}^{K-1} 2^(-l_k) ≤ 1        (9.22)
where l_k is the code-word length.
3. Instantaneous: the end of a code word is always recognizable.
Note: properties 1 and 2 are only necessary conditions for a prefix code. (e.g., Code II and Code III both satisfy properties 1 and 2, but only Code II is a prefix code.)
4. Given a discrete memoryless source of entropy H(S), a prefix code can be constructed with an average code-word length L_bar that is bounded as
H(S) ≤ L_bar < H(S) + 1        (9.23)

Special case:
The prefix code is matched to the source in that L_bar = H(S), under the condition p_k = 2^(-l_k) for all k.
Proof: substitute l_k = log2(1/p_k) into Equation (9.18); the sum becomes the entropy H(S).
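A brief check of the Kraft-McMillan inequality (9.22) and of the matched special case; the code-word lengths below are an assumed example.

```python
import math

lengths = [1, 2, 3, 3]                       # assumed prefix-code lengths (e.g. 0, 10, 110, 111)
print(sum(2.0 ** (-l) for l in lengths))     # 1.0 <= 1 -> Kraft-McMillan satisfied

# Matched source: p_k = 2**(-l_k), so the average length equals the entropy H(S).
p = [2.0 ** (-l) for l in lengths]
L_bar = sum(pk * lk for pk, lk in zip(p, lengths))
H = sum(pk * math.log2(1 / pk) for pk in p)
print(L_bar, H)                              # both 1.75 bits
```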
Extended prefix code:
The code can be matched to an arbitrary discrete memoryless source by using a high-order extension of the prefix code (at the cost of increased decoding complexity).
For the n-th order extension,
H(S) ≤ L_bar_n / n < H(S) + 1/n
where L_bar_n is the average code-word length of the extended prefix code; as n grows, L_bar_n / n approaches the entropy H(S).
9.4.2 Huffman Coding
An important class of prefix codes.
Basic idea: a sequence of bits roughly equal in length to the amount of information conveyed by a symbol is assigned to that symbol, so the average code-word length approaches the entropy.
Essence of the algorithm: replace the prescribed set of source statistics with a simpler one, step by step.
Encoding algorithm:
1. Splitting stage:
(i) The source symbols are listed in order of decreasing probability.
(ii) The two symbols of lowest probability are assigned a 0 and a 1.
2. Combine these two symbols into a new symbol whose probability is the sum of the two, and re-sort the list as in step 1.
3. Repeat step 2 until only two symbols remain. The code for each (original) source symbol is then found by working backward and tracing the sequence of 0s and 1s assigned to that symbol and its successors.
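A compact sketch of this procedure, using a binary heap instead of explicit re-sorting. The probabilities are assumed values chosen to reproduce the figures quoted in Example 9.3 below; the "as high as possible" tie-breaking is not reproduced exactly, so individual code words may differ from Figure 9.5 while the average length stays optimal.

```python
import heapq, math

def huffman_code(probs):
    """Return a dict: symbol -> binary code word, built by repeated combining."""
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)          # two least probable entries
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"s0": 0.4, "s1": 0.2, "s2": 0.2, "s3": 0.1, "s4": 0.1}   # assumed
code = huffman_code(probs)
L_bar = sum(probs[s] * len(w) for s, w in code.items())
H = sum(p * math.log2(1 / p) for p in probs.values())
print(code)
print(L_bar, H)    # about 2.2 and 2.12 bits, matching the numbers quoted below
```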
Example 9.3 Huffman Tree
Figure 9.5 (a) Example of the Huffman encoding algorithm, with each combined symbol placed as high as possible. (b) The resulting source code.
The average code-word length is L_bar = 2.2 bits, and the entropy is H(S) = 2.12193 bits.
Two observations:
1. The average code-word length exceeds the entropy by only 3.67 percent.
2. The average code-word length does indeed satisfy Equation (9.23).

Notes:
1. The encoding process is not unique.
(i) Arbitrary assignments of 0 and 1 to the last two source symbols → trivial differences.
(ii) Ambiguous placement of a combined symbol when its probability equals another probability (place it as high or as low as possible?) → noticeable differences.
Answer: placing the combined symbol as high as possible gives a smaller variance of the code-word lengths; placing it as low as possible gives a larger variance.
2. Huffman coding requires a probabilistic model of the source (a drawback).

9.4.3 Lempel-Ziv Coding
Problems of the Huffman code:
1. It requires knowledge of a probabilistic model of the source; in practice, source statistics are not always known a priori.
2. Storage requirements prevent it from capturing the higher-order relationships between words and phrases in modeling text, which lowers the efficiency of the code.
Advantage of Lempel-Ziv coding:
Intrinsically adaptive and simpler to implement than Huffman coding.

Basic idea of the Lempel-Ziv code:
Encoding in the Lempel-Ziv algorithm is accomplished by parsing the source data stream into segments that are the shortest subsequences not encountered previously.
For example (pp. 580):
input sequence: 000101110010100101...
Assume the subsequences 0 and 1 are already stored; the data to be parsed is 000101110010100101...
Result: the codebook in Figure 9.6.
Figure 9.6 Illustrating the encoding process performed by the Lempel-Ziv algorithm on the binary sequence 000101110010100101...
Numerical positions:          1   2   3     4     5     6     7     8     9
Subsequences:                 0   1   00    01    011   10    010   100   101
Numerical representations:            11    12    42    21    41    61    62
Binary encoded blocks:                 0010  0011  1001  0100  1000  1100  1101
Binary encoded representation of a subsequence = (binary pointer to the stored subsequence) + (innovation symbol).

The decoder is just as simple as the encoder.
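A minimal sketch of the parsing step only (not the full binary encoding), reproducing the subsequence row of Figure 9.6:

```python
def lz_parse(bits, seed=("0", "1")):
    """Split the stream into the shortest subsequences not encountered previously."""
    book = list(seed)      # initially stored subsequences
    parsed, phrase = [], ""
    for b in bits:
        phrase += b
        if phrase not in book:
            book.append(phrase)
            parsed.append(phrase)
            phrase = ""
    return parsed

print(lz_parse("000101110010100101"))
# ['00', '01', '011', '10', '010', '100', '101'] -- positions 3 to 9 in Figure 9.6
```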
Basic concept: fixed-length codes are used to represent a variable number of source symbols, which makes the code suitable for synchronous transmission.
In practice:
1. Fixed blocks of 12 bits are used, giving a codebook of 4096 entries.
2. Lempel-Ziv coding is the standard algorithm for file compression; it achieves a compaction of approximately 55% for English text.
9.5 Discrete Memoryless Channels
A discrete memoryless channel is a statistical model with an input X and an output Y that is a noisy version of X; both X and Y are random variables (see Figure 9.7).
Definition: the channel is described by an input alphabet X = {x_0, x_1, ..., x_{J-1}}, an output alphabet Y = {y_0, y_1, ..., y_{K-1}}, and a set of transition probabilities p(y_k | x_j) = P(Y = y_k | X = x_j), with 0 ≤ p(y_k | x_j) ≤ 1 for all j and k (Equations (9.31)-(9.32)).

Figure 9.7 Discrete memoryless channel.
Discrete -- both of the alphabets X and Y have finite sizes.
Memoryless -- the current output symbol depends only on the current input symbol and not on any of the previous ones.

Channel matrix (or transition matrix):
P = [p(y_k | x_j)], a J-by-K matrix whose row j corresponds to a fixed channel input and whose column k corresponds to a fixed channel output        (9.35)
Note: Σ_{k=0}^{K-1} p(y_k | x_j) = 1 for all j (each row sums to one).
Related distributions: the joint probability distribution p(x_j, y_k) = p(y_k | x_j) p(x_j), the marginal probability distribution p(y_k) = Σ_j p(y_k | x_j) p(x_j), and the input probability distribution p(x_j) = P(X = x_j).

Example 9.4 Binary symmetric channel
Figure 9.8 Transition probability diagram of the binary symmetric channel.
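An illustrative sketch of these definitions for the binary symmetric channel of Example 9.4; the crossover probability and the input distribution are assumed values:

```python
import numpy as np

p = 0.1                                   # assumed crossover probability
P = np.array([[1 - p, p],                 # row j: fixed input x_j (rows sum to 1)
              [p, 1 - p]])                # column k: fixed output y_k
p_x = np.array([0.5, 0.5])                # assumed input distribution p(x_j)

joint = p_x[:, None] * P                  # p(x_j, y_k) = p(x_j) * p(y_k | x_j)
p_y = joint.sum(axis=0)                   # marginal distribution p(y_k)
print(P.sum(axis=1))                      # [1. 1.]
print(joint)
print(p_y)                                # [0.5 0.5]
```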
9.6 Mutual Information
How can we measure the uncertainty about X after observing Y?
Answer: the conditional entropy
H(X | Y = y_k) = Σ_{j=0}^{J-1} p(x_j | y_k) log2[1 / p(x_j | y_k)]        (9.40)
and its mean over the output alphabet
H(X | Y) = Σ_{k=0}^{K-1} H(X | Y = y_k) p(y_k)        (9.41)
The conditional entropy H(X|Y) is the amount of uncertainty remaining about the channel input after the channel output has been observed.

Mutual information
H(X) -- the uncertainty about the channel input before observing the output.
H(X|Y) -- the uncertainty about the channel input after observing the output.
H(X) - H(X|Y) -- the uncertainty about the channel input that is resolved by observing the channel output; this difference is the mutual information:
I(X; Y) = H(X) - H(X|Y)        (9.43)
I(Y; X) = H(Y) - H(Y|X)        (9.44)

9.6.1 Properties of Mutual Information
Property 1 -- symmetric: I(X; Y) = I(Y; X)        (9.45)
Property 2 -- nonnegative: I(X; Y) ≥ 0        (9.50)
Property 3 -- related to the joint entropy of the channel input and channel output by
I(X; Y) = H(X) + H(Y) - H(X, Y)        (9.54)

Figure 9.9 Illustrating the relations among the various channel entropies.
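A numeric sketch of these quantities for a binary symmetric channel with an assumed crossover probability of 0.1 and equiprobable inputs:

```python
import numpy as np

def H(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

p_err = 0.1
joint = 0.5 * np.array([[1 - p_err, p_err],
                        [p_err, 1 - p_err]])      # joint distribution p(x, y)
p_x, p_y = joint.sum(axis=1), joint.sum(axis=0)

H_X = H(p_x)
H_X_given_Y = H(joint.flatten()) - H(p_y)         # H(X|Y) = H(X,Y) - H(Y)
I = H_X - H_X_given_Y                             # Equation (9.43)
print(H_X, H_X_given_Y, I)                        # I(X;Y) is about 0.531 bits
```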
9.7 Channel Capacity
For a discrete memoryless channel, the mutual information I(X; Y) is determined by the input probability distribution {p(x_j)} together with the channel transition probabilities {p(y_k | x_j)} (Equation (9.49)). The mutual information of a channel therefore depends not only on the channel but also on the way in which the channel is used.

Definition:
We define the channel capacity of a discrete memoryless channel as the maximum mutual information I(X; Y) in any single use of the channel (i.e., per signaling interval), where the maximization is over all possible input probability distributions {p(x_j)} on X:
C = max over {p(x_j)} of I(X; Y)        (9.59)
subject to p(x_j) ≥ 0 for all j and Σ_j p(x_j) = 1.

Note:
1. C is measured in bits per channel use, or bits per transmission.
2. C is a function only of the transition probabilities, which define the channel.
3. The variational problem of finding the channel capacity C is a challenging task.

Example 9.5 Binary symmetric channel
With transition probability p (see Figure 9.8), the capacity is C = 1 + p log2(p) + (1 - p) log2(1 - p) (see Figure 9.10).
Observations:
1. Noise-free channel: p = 0, C = 1 (the maximum value).
2. Useless channel: p = 1/2, C = 0 (the minimum value).
Figure 9.10 Variation of the channel capacity of a binary symmetric channel with the transition probability p.
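A small sketch of Example 9.5, evaluating C = 1 + p log2(p) + (1 - p) log2(1 - p) over a few transition probabilities, as plotted in Figure 9.10:

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel in bits per channel use."""
    if p in (0.0, 1.0):
        return 1.0
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.25, 0.5):
    print(p, round(bsc_capacity(p), 4))
# 0.0 -> 1.0 (noise free), 0.5 -> 0.0 (useless channel)
```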
9.8 Channel-Coding Theorem
Goal: increase the resistance of a digital communication system to channel noise (noise causes errors).
Figure 9.11 Block diagram of a digital communication system.
Block codes (n, k); code rate: r = k/n.
Question: Does there exist a channel coding scheme such that the probability that a message bit will be in error is less than any positive number ε (i.e., an arbitrarily small probability of error), and yet the channel coding scheme is efficient in that the code rate need not be too small?
Channel coding -- introduce controlled redundancy to improve reliability.
Source coding -- reduce redundancy to improve efficiency.

Answer: Shannon's second theorem (the channel-coding theorem). Let the source of entropy H(S) emit one symbol every Ts seconds and let the channel of capacity C be used once every Tc seconds. Then:
1. If H(S)/Ts ≤ C/Tc, there exists a coding scheme for which the error probability can be made arbitrarily small; C/Tc is called the critical rate.        (9.61)
2. If H(S)/Ts > C/Tc, no such scheme exists.        (9.62)
The theorem specifies the channel capacity C as a fundamental limit on the rate at which the transmission of reliable, error-free messages can take place over a discrete memoryless channel.
In words: (average information rate) ≤ (channel capacity per unit time).

NOTE:
1. The theorem is an existence proof; it does not tell us how to construct a good code.
2. It gives no precise result for the probability of symbol error Pe after decoding the channel output (as the length of the code increases, Pe → 0).
3. Power and bandwidth constraints were hidden in the discussion presented here (they show up in the channel matrix P of the discrete memoryless channel).

Application of the channel-coding theorem to binary symmetric channels:
The source emits equiprobable binary symbols 0 and 1 once every Ts seconds, so the source entropy is 1 bit per symbol and the information rate is 1/Ts bits per second. After encoding at code rate r, the channel is used once every Tc seconds, so the transmission rate is 1/Tc symbols per second. Then, if 1/Ts ≤ C/Tc, the probability of error can be made arbitrarily low by the use of a suitable channel encoding scheme. Since r = Tc/Ts, this condition is equivalent to r ≤ C: for r ≤ C, there exists a code capable of achieving an arbitrarily low probability of error.

Example 9.6 Repetition code
For a binary symmetric channel with C = 0.9192, the channel-coding theorem states that for any ε > 0 and any code rate r ≤ C, there exists a code of length n large enough, together with an appropriate decoding algorithm, such that Pe < ε. See Figure 9.12.
Figure 9.12 Illustrating the significance of the channel-coding theorem.

Example 9.6 Repetition code (continued)
A repetition code of length n = 2m + 1 repeats each message bit n times; if n = 3, then 0 → 000 and 1 → 111.
Decoding: majority rule. A decoding error occurs when m + 1 or more bits are received incorrectly, so the average probability of error is
Pe = Σ_{i=m+1}^{n} C(n, i) p^i (1 - p)^(n-i)
Characteristic: an exchange of code rate for message reliability (Table 9.3): as r decreases, Pe decreases.

9.9 Differential Entropy and Mutual Information for Continuous Ensembles
Let X be a continuous random variable with probability density function f_X(x). We have
h(X) = ∫ f_X(x) log2[1 / f_X(x)] dx        (9.66)
where h(X) is the differential entropy of X.
Note: it is not a measure of the randomness of X, and it is different from the ordinary or absolute entropy.
Assume X is confined to an interval of width Δx with probability f_X(x_k) Δx; in the limit Δx → 0, the ordinary (absolute) entropy of the continuous random variable X contains a term -log2(Δx) that grows without bound, which is why the differential entropy h(X) is used instead.
For a continuous random vector consisting of n random variables X1, X2, ..., Xn with joint probability density function f_X(x),
the differential entropy is
h(X) = ∫ f_X(x) log2[1 / f_X(x)] dx        (9.68)

Example 9.7 Uniform distribution
A random variable X uniformly distributed over the interval (0, a) has the probability density function f_X(x) = 1/a for 0 < x < a and zero elsewhere. Then we get
h(X) = log2(a)        (9.69)
Note: log2(a) < 0 for a < 1. Unlike that of a discrete random variable, the differential entropy of a continuous random variable can be negative.

Example 9.8 Gaussian distribution
Let X and Y be random variables and use the continuous form of the fundamental inequality of Equation (9.12), namely ∫ f_Y(x) log2[f_X(x)/f_Y(x)] dx ≤ 0 (Equations (9.70)-(9.72)).
Assume:
1. X and Y have the same mean μ and the same variance σ².
2. X is Gaussian distributed:
f_X(x) = (1/sqrt(2πσ²)) exp[-(x - μ)²/(2σ²)]        (9.73)
Then the differential entropy of X is
h(X) = (1/2) log2(2πeσ²)        (9.74)
Evaluating the inequality for Y against the Gaussian density and using the common mean and variance gives Equations (9.75) and (9.76); combining (9.75) and (9.76),
h(Y) ≤ h(X)        (9.77)
where equality holds if, and only if, f_Y(x) = f_X(x).
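A numeric check of Examples 9.7 and 9.8 (the parameters a and σ² below are assumed values), comparing a grid approximation of h(X) with the closed forms log2(a) and (1/2) log2(2πeσ²):

```python
import numpy as np

def diff_entropy(pdf, x):
    """Approximate h(X) = -integral f(x) log2 f(x) dx on the grid x."""
    f = pdf(x)
    m = f > 0
    return float(np.trapz(-f[m] * np.log2(f[m]), x[m]))

a, var = 0.5, 2.0                                    # assumed parameters
x = np.linspace(-20, 20, 400001)

uniform = lambda t: np.where((t > 0) & (t < a), 1.0 / a, 0.0)
gauss = lambda t: np.exp(-t**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

print(diff_entropy(uniform, x), np.log2(a))                            # both about -1.0 bit
print(diff_entropy(gauss, x), 0.5 * np.log2(2 * np.pi * np.e * var))   # about 2.55 bits
```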
To summarize, a Gaussian random variable has two entropic properties:
1. For a given finite variance, the Gaussian random variable has the largest differential entropy attainable by any random variable.
2. The differential entropy of a Gaussian random variable X is uniquely determined by the variance of X (i.e., it is independent of the mean of X).

9.9.1 Mutual Information
For a pair of continuous random variables X and Y, the mutual information is
I(X; Y) = ∫∫ f_{X,Y}(x, y) log2[ f_X(x|y) / f_X(x) ] dx dy        (9.78)
Properties (Equations (9.79)-(9.81)): I(X; Y) = I(Y; X), I(X; Y) ≥ 0, and
I(X; Y) = h(X) - h(X|Y) = h(Y) - h(Y|X)        (9.82)
where h(X) and h(Y) are the differential entropies of X and Y, h(X|Y) is the conditional differential entropy of X given Y, and h(Y|X) is the conditional differential entropy of Y given X.

9.10 Information Capacity Theorem
The information capacity theorem applies to band-limited, power-limited Gaussian channels.
Let the signal X(t) be a zero-mean stationary process, band-limited to B hertz, transmitted over a noisy channel for T seconds. The number of samples is
K = 2BT        (9.83)
where X_k, k = 1, 2, ..., K, are the continuous random variables obtained by uniform sampling of the process X(t) at the Nyquist rate of 2B samples per second.
Noise: AWGN with zero mean, power spectral density N0/2, band-limited to B hertz. The noise sample N_k is Gaussian with zero mean and variance
σ² = N0 B        (9.84)
Figure 9.13 Model of the discrete-time, memoryless Gaussian channel.
The samples of the received signal are
Y_k = X_k + N_k,  k = 1, 2, ..., K        (9.85)
The cost assigned to each channel input is the power constraint
E[X_k²] = P        (9.86)
where P is the average transmitted power.
The information capacity of the channel is the maximum of the mutual information between the channel input X_k and the channel output Y_k over all distributions on the input X_k that satisfy the power constraint of Equation (9.86):
C = max I(X_k; Y_k) subject to E[X_k²] = P        (9.87)
From Equations (9.88)-(9.90), I(X_k; Y_k) = h(Y_k) - h(Y_k | X_k), and because X_k and N_k are independent, h(Y_k | X_k) = h(N_k), so
I(X_k; Y_k) = h(Y_k) - h(N_k)        (9.91)
Maximizing I(X_k; Y_k) requires maximizing h(Y_k). For h(Y_k) to be maximum, Y_k has to be a Gaussian random variable; that is, the samples of the received signal represent a noiselike process. Next, since N_k is Gaussian by assumption, the samples of the transmitted signal must be Gaussian too. The maximization specified in Equation (9.87) is therefore attained by choosing the samples of the transmitted signal from a noiselike process of average power P.
Three stages for the evaluation of the information capacity C:
1. The variance of Y_k is P + σ², so h(Y_k) = (1/2) log2[2πe(P + σ²)]        (9.92)
2. The variance of N_k is σ², so h(N_k) = (1/2) log2(2πeσ²)        (9.93)
3. Information capacity:
C = (1/2) log2(1 + P/σ²)  bits per channel use        (9.94)
Equivalent form (K/T times C):
C = B log2(1 + P/(N0 B))  bits per second        (9.95)

Shannon's third theorem, the information capacity theorem:
The information capacity of a continuous channel of bandwidth B hertz, perturbed by additive white Gaussian noise of power spectral density N0/2 and limited in bandwidth to B, is given by Equation (9.95), where P is the average transmitted power.
The channel capacity theorem defines the fundamental limit on the rate of error-free transmission for a power-limited, band-limited Gaussian channel. To approach this limit, the transmitted signal must have statistical properties approximating those of white Gaussian noise.

9.10.1 Sphere Packing
Purpose: to support the information capacity theorem.
An encoding scheme yields a set of code words of length (number of bits) n, with power constraint nP, where P is the average power per bit. The received vector of n bits is Gaussian distributed, with mean equal to the transmitted code word and variance equal to σ², the noise variance.
With high probability, the received vector lies inside a sphere of radius sqrt(nσ²) centered on the transmitted code word. This sphere is itself contained in a larger sphere of radius sqrt(n(P + σ²)), where n(P + σ²) is the average power of the received vector. See Figure 9.14.
Figure 9.14 The sphere-packing problem.

Question: How many decoding spheres can be packed inside the large sphere of received vectors? In other words, how many code words can we in fact choose?
First recognize that the volume of an n-dimensional sphere of radius r may be written as A_n r^n, where A_n is a scaling factor.
Statements:
1. The volume of the sphere of received vectors is A_n [n(P + σ²)]^(n/2).
2. The volume of a decoding sphere is A_n (nσ²)^(n/2).
The maximum number of nonintersecting decoding spheres that can be packed inside the sphere of possible received vectors is therefore
[n(P + σ²)]^(n/2) / (nσ²)^(n/2) = (1 + P/σ²)^(n/2)        (9.96)

Example 9.9 Reconfiguration of a constellation for reduced power: 64-QAM (Figure 9.15)
The constellation of Figure 9.15b has an advantage over that of Figure 9.15a: a smaller transmitted average signal energy per symbol for the same BER on an AWGN channel.

Figure 9.15 (a) Square 64-QAM constellation. (b) The most tightly packed alternative to that of part (a). At high SNR on an AWGN channel, for the same BER, the squared Euclidean distances from the message points to the origin satisfy b < a.

9.11 Implications of the Information Capacity Theorem
Ideal system: Rb = C.
The average transmitted power may be expressed as
P = Eb Rb = Eb C        (9.97)
Accordingly, the ideal system is defined by
C/B = log2(1 + (Eb/N0)(C/B))        (9.98)
Eb/N0 = (2^(C/B) - 1) / (C/B)        (9.99)
where Eb/N0 is the signal energy-per-bit to noise power spectral density ratio. An ideal system is needed to assess the performance of a practical system.

Bandwidth-efficiency diagram: a plot of the bandwidth efficiency Rb/B versus Eb/N0 (Figure 9.16), where the curve labeled "capacity boundary" corresponds to the ideal system for which Rb = C.
Observations:
1. For infinite bandwidth, Eb/N0 approaches the limiting value
(Eb/N0)_∞ = ln 2 = 0.693, i.e. about -1.6 dB        (9.100)
This value is called the Shannon limit for an AWGN channel, assuming a code rate of zero.
Figure 9.16 Bandwidth-efficiency diagram.

2. The capacity boundary is defined by the curve for the critical bit rate Rb = C        (9.101)
For Rb < C, error-free transmission is possible; for Rb > C, error-free transmission is not possible.
3. The diagram highlights potential trade-offs among Eb/N0, Rb/B, and the probability of symbol error Pe.
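A short sketch of Equations (9.95) and (9.99)-(9.100): the AWGN capacity and the Eb/N0 required by the ideal system, tending to the Shannon limit of about -1.6 dB as the bandwidth grows. The bandwidth, noise density, and power values are assumed examples.

```python
import math

def awgn_capacity(P, N0, B):
    """C = B * log2(1 + P / (N0 * B)) bits per second, Equation (9.95)."""
    return B * math.log2(1 + P / (N0 * B))

B, N0, P = 3000.0, 1e-9, 1e-5                  # assumed: 3 kHz bandwidth, noise PSD, power
print(awgn_capacity(P, N0, B))                 # about 6.3 kbit/s

# Ideal system Rb = C: Eb/N0 = (2**(C/B) - 1) / (C/B), Equation (9.99).
for ratio in (2.0, 1.0, 0.1, 0.001):           # ratio = C/B, the spectral efficiency
    ebn0 = (2 ** ratio - 1) / ratio
    print(ratio, round(10 * math.log10(ebn0), 2), "dB")
print(10 * math.log10(math.log(2)))            # Shannon limit: about -1.59 dB
```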
Example 9.10 M-ary PCM
Assumptions:
- The system operates above threshold, so the average probability of error due to channel noise is negligible.
- A code word consists of n code elements, each having one of M possible discrete amplitude levels.
- The noise margin is sufficiently large to maintain a negligible error rate due to channel noise.
Hence there must be a certain separation between these M possible discrete amplitude levels, equal to kσ, where k is a constant, σ² is the noise variance, and B is the channel bandwidth.
The average transmitted power will be least if the amplitude range is symmetrical about zero; the discrete amplitude levels, normalized with respect to the separation kσ, then take the values ±1/2, ±3/2, ..., and the average transmitted power (assuming the levels are equiprobable) is
P = (k²σ²/12)(M² - 1)        (9.102)
Let the message bandwidth be W hertz (highest frequency component), so the signal is sampled at the rate 2W, and let L be the number of representation levels of the quantizer (equally likely). The maximum rate of information transmission is then
Rb = 2W log2(L)  bits per second        (9.103)
For a unique coding process, L = M^n; combining this with Equations (9.102)-(9.103) leads to Equations (9.104)-(9.107), which express the information rate in terms of the transmitted power and the channel bandwidth.
The channel bandwidth B required to transmit a rectangular pulse of duration 1/(2nW) is B = κnW (9.108), where κ is a constant with a value lying between 1 and 2. Using the minimum value κ = 1, the information rate of the PCM system and the capacity of the ideal system are identical if the average transmitted power in the PCM system is increased by the factor k²/12 compared with the ideal system.
Conclusion: power and bandwidth in a PCM system are exchanged, just as in the ideal system.