新祥旭北京大学光华管理学院考研: Statistics Materials (Probability)

Functions of a Random Variable

Suppose X has a discrete distribution with p.f. f, and Y = r(X) is a function of X. The p.f. g of Y is

    g(y) = Pr(Y = y) = Pr(r(X) = y) = ∑_{x: r(x) = y} f(x).

Variable with a Continuous Distribution

Suppose X has a continuous distribution with p.d.f. f, and Y = r(X) is a function of X. The d.f. G of Y can be derived as

    G(y) = Pr(Y ≤ y) = Pr(r(X) ≤ y) = ∫_{x: r(x) ≤ y} f(x) dx.

If Y also has a continuous distribution, its p.d.f. g can be obtained by g(y) = dG(y)/dy at any y where G is differentiable.

Example 3.8.1. Suppose X has a uniform distribution on (-1, 1), so f(x) = 1/2 for -1 < x < 1. What is the p.d.f. of Y = X^2?
Solution: For 0 < y < 1,

    G(y) = Pr(X^2 ≤ y) = Pr(-y^{1/2} ≤ X ≤ y^{1/2}) = y^{1/2},

so g(y) = G'(y) = 1/(2 y^{1/2}) for 0 < y < 1, and g(y) = 0 otherwise.

Direct Derivation of the p.d.f.

Suppose Y = r(X), where r is continuous and X lies in an interval (a, b) over which r(x) is strictly increasing. Then r is a one-to-one function mapping (a, b) onto an interval (α, β), and it has an inverse function X = s(Y). For any y such that α < y < β,

    G(y) = Pr(Y ≤ y) = Pr(X ≤ s(y)) = F(s(y)).

Suppose s is differentiable over (α, β); then g(y) = f(s(y)) s'(y).

Suppose instead that Y = r(X), where r is continuous and X lies in an interval (a, b) over which r(x) is strictly decreasing. Then r is a one-to-one function mapping (a, b) onto an interval (α, β), with inverse X = s(Y). For any y such that α < y < β,

    G(y) = Pr(Y ≤ y) = Pr(X ≥ s(y)) = 1 - F(s(y)).

Suppose s is differentiable over (α, β); then g(y) = -f(s(y)) s'(y).

Theorem 3.8.1. Let X be a random variable for which the p.d.f. is f and Pr(a < X < b) = 1. Let Y = r(X), and suppose that r(x) is continuous and either strictly increasing or strictly decreasing for a < x < b. Suppose also that r(x) maps a < x < b onto α < y < β, and let X = s(Y) be the inverse function for α < Y < β. Then the p.d.f. of Y is

    g(y) = f(s(y)) |ds(y)/dy|   for α < y < β,

and g(y) = 0 otherwise.

Example. Suppose X has a given p.d.f. f concentrated on 0 < X < 1. What is the p.d.f. of Y = r(X)?
Solution: Here Y is a continuous, strictly decreasing function of X for 0 < X < 1, with range 0 < Y < 1. The inverse function X = s(Y) is defined for 0 < Y < 1, so by Theorem 3.8.1, g(y) = f(s(y)) |ds(y)/dy| for 0 < y < 1.

The Probability Integral Transformation

Suppose X has a continuous d.f. F, and let Y = F(X). This transformation is called the probability integral transformation. What is the distribution of Y?
Since 0 ≤ F(x) ≤ 1, Pr(Y < 0) = Pr(Y > 1) = 0. For any given 0 < y < 1, let x1 be the largest number s.t. F(x1) = y. Then

    Pr(Y ≤ y) = Pr(F(X) ≤ y) = Pr(X ≤ x1) = F(x1) = y.

So Y has a uniform distribution on the interval (0, 1).

Functions of Two or More Random Variables

Suppose X1, ..., Xn have a discrete joint distribution with p.f. f, and m functions Y1, ..., Ym of these n random variables are

    Yi = ri(X1, ..., Xn),   i = 1, ..., m.

For any given values y1, ..., ym, let A denote the set of all points (x1, ..., xn) such that ri(x1, ..., xn) = yi for i = 1, ..., m. Then the joint p.f. g of Y1, ..., Ym is

    g(y1, ..., ym) = ∑_{(x1, ..., xn) ∈ A} f(x1, ..., xn).

Variables with a Continuous Joint Distribution

Suppose the joint p.d.f. of X1, ..., Xn is f(x1, ..., xn) and Y = r(X1, ..., Xn). For any given value y, let Ay be the subset of R^n containing all points (x1, ..., xn) such that r(x1, ..., xn) ≤ y. Then

    G(y) = Pr(Y ≤ y) = ∫ ··· ∫_{Ay} f(x1, ..., xn) dx1 ··· dxn.

If the distribution of Y is also continuous, then the p.d.f. of Y can be found by differentiating the d.f. G(y).

The Distribution of Maximum and Minimum Values in a Random Sample

Suppose X1, ..., Xn form a random sample of size n from a distribution with p.d.f. f and d.f. F.
Consider Yn = max{X1, ..., Xn}: its d.f. is Gn(y) = [F(y)]^n, so its p.d.f. is gn(y) = n[F(y)]^{n-1} f(y).
Consider Y1 = min{X1, ..., Xn}: its d.f. is G1(y) = 1 - [1 - F(y)]^n, so its p.d.f. is g1(y) = n[1 - F(y)]^{n-1} f(y).
Suppose we want to find the joint distribution of Y1 and Yn. The joint p.d.f. works out to

    g(y1, yn) = n(n - 1)[F(yn) - F(y1)]^{n-2} f(y1) f(yn)   for y1 < yn,

and 0 otherwise.

Transformation of a Multivariate p.d.f.

Suppose X1, ..., Xn have a continuous joint distribution with joint p.d.f. f, and n new random variables Y1, ..., Yn are defined by

    Yi = ri(X1, ..., Xn),   i = 1, ..., n.

Suppose S is the support of X1, ..., Xn, and the image of S under the transformation is T. Assume that the transformation from S to T is a one-to-one transformation, so we can obtain the inverse transformation

    Xi = si(Y1, ..., Yn),   i = 1, ..., n.

Suppose each partial derivative ∂si/∂yj exists at every point of T. The Jacobian of the transformation is the determinant

    J = det[∂si/∂yj],   i, j = 1, ..., n.

The joint p.d.f. g of Y1, ..., Yn can then be derived as

    g(y1, ..., yn) = f(s1(y1, ..., yn), ..., sn(y1, ..., yn)) |J|   for (y1, ..., yn) ∈ T,

and g = 0 otherwise.
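To make the recipe concrete, here is a short Python sketch that carries out these steps symbolically. The transformation Y1 = X1 + X2, Y2 = X1 - X2 and the uniform joint p.d.f. on the unit square are hypothetical choices used only for illustration; they are not the example treated below.

```python
# Sketch of the multivariate change-of-variables recipe with sympy.
# Assumed (illustrative) setup: X1, X2 uniform on the unit square, Y1 = X1 + X2, Y2 = X1 - X2.
import sympy as sp

y1, y2 = sp.symbols('y1 y2', real=True)

# Inverse transformation X = s(Y): x1 = (y1 + y2)/2, x2 = (y1 - y2)/2
s1 = (y1 + y2) / 2
s2 = (y1 - y2) / 2

# Jacobian J = det[ ds_i/dy_j ]
J = sp.Matrix([[sp.diff(s1, y1), sp.diff(s1, y2)],
               [sp.diff(s2, y1), sp.diff(s2, y2)]]).det()

f = sp.Integer(1)                      # joint p.d.f. of (X1, X2) on the unit square
g = sp.simplify(f * sp.Abs(J))         # g(y1, y2) = f(s1, s2) |J| on the image T
print(g)                               # 1/2: the image T has area 2, so the density is 1/2
```

For a non-linear transformation the same steps apply; only s1, s2, and hence J, change.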
Example 3.9.1. Suppose X1 and X2 have a continuous joint distribution with a given joint p.d.f. f whose support is S = {(x1, x2): 0 < x1 < 1, 0 < x2 < 1}, and let Y1 = X1/X2 and Y2 = X1 X2. What is the joint p.d.f. of Y1 and Y2?
The inverse of the transformation is

    x1 = s1(y1, y2) = (y1 y2)^{1/2},   x2 = s2(y1, y2) = (y2/y1)^{1/2},

with S = {(x1, x2): 0 < x1 < 1, 0 < x2 < 1} and T = {(y1, y2): y1 > 0, y2 > 0, y1 y2 < 1, y2/y1 < 1}. We have the four partial derivatives ∂si/∂yj, the Jacobian is J = 1/(2 y1), and the p.d.f. of Y1 and Y2 is

    g(y1, y2) = f((y1 y2)^{1/2}, (y2/y1)^{1/2}) · 1/(2 y1)   for (y1, y2) ∈ T,

and 0 otherwise.

Linear Transformation

Let A be an n×n matrix, and suppose Y1, ..., Yn are defined by Y = AX, where X = (X1, ..., Xn)' and Y = (Y1, ..., Yn)'. If A is nonsingular, then the matrix A^{-1} exists, and the inverse transformation is X = A^{-1} Y. The Jacobian J is det(A^{-1}) = 1/det(A). The p.d.f. of Y1, ..., Yn is therefore

    g(y) = f(A^{-1} y) / |det A|   for y ∈ R^n.

The Sum of Two Random Variables

Suppose X1 and X2 have a given joint p.d.f. f, and we want to find the p.d.f. for Y = X1 + X2. Let Z = X2; then the transformation from (X1, X2) to (Y, Z) is a one-to-one linear transformation. The inverse transformation is X1 = Y - Z, X2 = Z. The matrix of coefficients of this transformation has determinant 1, so the joint p.d.f. of Y and Z is

    g0(y, z) = f(y - z, z).

The marginal p.d.f. g of Y can be obtained by

    g(y) = ∫ f(y - z, z) dz.

Similarly, if we let Z = X1, we obtain g(y) = ∫ f(z, y - z) dz. If X1 and X2 are independent random variables with marginal p.d.f.'s f1 and f2, we have the convolution

    g(y) = ∫ f1(y - z) f2(z) dz = ∫ f1(z) f2(y - z) dz.

Example 3.9.3. Suppose that X1 and X2 are i.i.d. random variables whose common p.d.f. f is concentrated on (0, ∞). What is the p.d.f. g of Y = X1 + X2?
Solution: For y > 0, g(y) = ∫_0^y f(y - z) f(z) dz; g(y) = 0 for y < 0.

The Range

Suppose X1, ..., Xn form a random sample of size n from a distribution with p.d.f. f and d.f. F, and let Y1 = min{X1, ..., Xn} and Yn = max{X1, ..., Xn}. W = Yn - Y1 is called the range of the sample. What is the p.d.f. of W?
Solution: We already derived the joint p.d.f. g(y1, yn) of Y1 and Yn. If we let Z = Y1, then the transformation from (Y1, Yn) to (W, Z) is a one-to-one linear transformation. The inverse transformation is Y1 = Z, Yn = W + Z, with |J| = 1. The joint p.d.f. of W and Z, for w > 0, is

    h(w, z) = g(z, w + z) = n(n - 1)[F(w + z) - F(z)]^{n-2} f(z) f(w + z).

The marginal p.d.f. of W is h1(w) = ∫ h(w, z) dz.

Example 3.9.5. Suppose that n variables X1, ..., Xn form a random sample from a uniform distribution on the interval [0, 1]. What is the p.d.f. of the range of the sample?
Solution: f(x) = 1 and F(x) = x for 0 < x < 1. So h(w, z) = n(n - 1) w^{n-2} for 0 < z < 1 - w (with 0 < w < 1), and integrating over z gives

    h1(w) = n(n - 1) w^{n-2} (1 - w)   for 0 < w < 1.

Moments

For any positive integer k, the expectation E(X^k) is called the kth moment of X. The kth moment exists if and only if E(|X|^k) < ∞. The mean E(X) is the first moment of X. If X is bounded, then all moments of X must exist.

Theorem 4.4.1. If E(|X|^k) < ∞ for some positive integer k, then E(|X|^j) < ∞ for any positive integer j such that j < k.
Proof. Assume that X has a continuous distribution with p.d.f. f. Then

    E(|X|^j) = ∫_{|x| ≤ 1} |x|^j f(x) dx + ∫_{|x| > 1} |x|^j f(x) dx ≤ Pr(|X| ≤ 1) + E(|X|^k) < ∞.

Central Moments

Suppose E(X) = μ. The expectation E[(X - μ)^k] is called the kth central moment of X. The first central moment must be 0. The variance is the second central moment. If the distribution of X is symmetric with respect to its mean μ, and if the central moment E[(X - μ)^k] exists for a given odd number k, then its value must be 0.

Example 4.4.1. Suppose X has a continuous distribution whose p.d.f. f is symmetric with respect to x = 3 and whose tails decay fast enough that E(|X|^k) < ∞ for every positive integer k. So all the moments of X exist. Since f(x) is symmetric with respect to x = 3, E(X) = 3. It also follows that E[(X - 3)^k] = 0 for every odd integer k.

Moment Generating Functions

The moment generating function (m.g.f.) of X is ψ(t) = E(e^{tX}). If X is bounded, then the expectation E(e^{tX}) exists for all values of t, so the m.g.f. of X exists for all values of t. If X is not bounded, then the m.g.f. might exist for some values of t and might not exist for others. The m.g.f. must exist at the point t = 0, and its value at this point must be ψ(0) = E(e^{0·X}) = 1.
Suppose the m.g.f. exists for all values of t in some interval around the point t = 0. Then the derivative of the m.g.f. exists at t = 0, and ψ'(0) = E(X). In general, the nth derivative exists at t = 0, and ψ^(n)(0) = E(X^n).
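The relation ψ^(n)(0) = E(X^n) is easy to check numerically. The sketch below does so in Python for a Uniform(0, 1) random variable, a distribution chosen here purely for illustration.

```python
# Check psi^(k)(0) = E(X^k) for X ~ Uniform(0, 1) (illustrative choice).
import sympy as sp

t, x = sp.symbols('t x', real=True)
psi = (sp.exp(t) - 1) / t                  # m.g.f. of Uniform(0,1): E(e^{tX}) = (e^t - 1)/t

for k in range(1, 4):
    from_mgf = sp.limit(sp.diff(psi, t, k), t, 0)   # k-th derivative of psi at t = 0
    direct   = sp.integrate(x**k, (x, 0, 1))        # E(X^k) = 1/(k + 1)
    print(k, from_mgf, direct)                      # both give 1/2, 1/3, 1/4
```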
Example 4.4.2. Suppose X has a continuous distribution with a given p.d.f. What is the m.g.f. of X? What is Var(X)?
For any real number t, ψ(t) = E(e^{tX}) is obtained by integrating e^{tx} against the p.d.f.; here the integral is finite for every t, so the m.g.f. exists for all t and all moments of X exist. So E(X) = ψ'(0), E(X^2) = ψ''(0), and Var(X) = ψ''(0) - [ψ'(0)]^2.

Properties of Moment Generating Functions

Theorem 4.4.2. Let X be a random variable with m.g.f. ψ_X, let Y = aX + b, and let ψ_Y denote the m.g.f. of Y. Then for any value of t such that ψ_X(at) exists,

    ψ_Y(t) = e^{bt} ψ_X(at).

Proof. ψ_Y(t) = E(e^{tY}) = E(e^{t(aX+b)}) = e^{bt} E(e^{atX}) = e^{bt} ψ_X(at).

Example 4.4.3. Suppose X has a continuous distribution with p.d.f. f(x) = e^{-x} for x > 0 (and 0 otherwise). Then, for t < 1, the m.g.f. of X is

    ψ(t) = ∫_0^∞ e^{tx} e^{-x} dx = 1/(1 - t).

If Y = 3 - 2X, then the m.g.f. of Y will exist for t > -1/2 and will have the value

    ψ_Y(t) = e^{3t} ψ(-2t) = e^{3t}/(1 + 2t).

Theorem 4.4.3. Suppose X1, ..., Xn are n independent random variables, and for i = 1, ..., n, let ψ_i denote the m.g.f. of Xi. Let Y = X1 + ... + Xn, and let the m.g.f. of Y be ψ. Then for any value of t such that ψ_i(t) exists for i = 1, ..., n,

    ψ(t) = ∏_{i=1}^n ψ_i(t).

Proof. ψ(t) = E(e^{tY}) = E(∏_i e^{tXi}) = ∏_i E(e^{tXi}) = ∏_i ψ_i(t), using the independence of X1, ..., Xn.

The m.g.f. for the Binomial Distribution

Suppose X has a binomial distribution with parameters n and p. X can be represented by the sum of n independent random variables X1, ..., Xn, where the distribution of each Xi is Pr(Xi = 1) = p, Pr(Xi = 0) = q = 1 - p. We shall determine the m.g.f. of X: ψ_i(t) = E(e^{tXi}) = p e^t + q, so by Theorem 4.4.3, ψ(t) = (p e^t + q)^n.

Uniqueness of the m.g.f.

Theorem 4.4.4. If the m.g.f.'s of two random variables X1 and X2 are identical for all values of t in an interval around the point t = 0, then the probability distributions of X1 and X2 must be identical.

The Additive Property of the Binomial Distribution

Suppose X1 and X2 are independent random variables, X1 has a binomial distribution with parameters n1 and p, and X2 has a binomial distribution with parameters n2 and p. We shall determine the distribution of X1 + X2. Let ψ_i denote the m.g.f. of Xi for i = 1, 2, and let ψ denote the m.g.f. of X1 + X2. Then

    ψ(t) = ψ_1(t) ψ_2(t) = (p e^t + q)^{n1} (p e^t + q)^{n2} = (p e^t + q)^{n1+n2}.

This function is the m.g.f. of a binomial distribution with parameters n1 + n2 and p. So X1 + X2 has a binomial distribution with parameters n1 + n2 and p.

The Mean and the Median

The mean of a distribution can be regarded as the center of the distribution. Intuitively, the median is a point which divides the total probability into two equal parts: the probability to the left of the median is 1/2, and the probability to the right of the median is 1/2.
(Formal) definition: For any random variable X, a median of the distribution of X is defined to be a point m such that Pr(X ≤ m) ≥ 1/2 and Pr(X ≥ m) ≥ 1/2.

Example 4.5.1. Suppose that X has the following discrete distribution: Pr(X = 1) = 0.1, Pr(X = 2) = 0.2, Pr(X = 3) = 0.3, Pr(X = 4) = 0.4. The median for this distribution is 3, since Pr(X ≤ 3) = 0.6 ≥ 1/2 and Pr(X ≥ 3) = 0.7 ≥ 1/2; 3 is the unique median of this distribution.

Example 4.5.2. Suppose that X has the following discrete distribution: Pr(X = 1) = 0.1, Pr(X = 2) = 0.4, Pr(X = 3) = 0.3, Pr(X = 4) = 0.2. Any value in the interval [2, 3] is a median for this distribution, since Pr(X ≤ m) ≥ 1/2 and Pr(X ≥ m) ≥ 1/2 for every such m.

Example 4.5.3. Suppose that X has a continuous distribution with a given p.d.f. f. The unique median m is the number s.t. ∫_{-∞}^m f(x) dx = 1/2.

Example 4.5.4. Suppose that X has a continuous distribution whose p.d.f. is 0 on the interval (1, 2.5) and which places probability 1/2 on each side of that interval. For any value m in the interval [1, 2.5], Pr(X ≤ m) = Pr(X ≥ m) = 1/2, so m is a median of this distribution.

Comparing the Mean and the Median

Both the mean and the median can represent the "average" value of a variable. In many cases the median is a more useful measure of the average than the mean, since a median may be unaffected by removing a small probability from some part of the distribution and assigning this amount to an extreme value (e.g., "average" income).

Minimizing Mean Squared Error

Suppose X is a random variable with mean μ and variance σ². Suppose we want to predict the value of X before observing it, using some number d for which the mean squared error E[(X - d)^2] will be a minimum. For any value of d,

    E[(X - d)^2] = E[(X - μ)^2] + (μ - d)^2 = σ² + (μ - d)^2.

It is minimized at d = μ, and the corresponding M.S.E. is σ².

Minimizing Mean Absolute Error

Suppose we instead want to use some number d for which the mean absolute error E(|X - d|) will be a minimum.

Theorem 4.5.1. Let m be a median of the distribution of X and let d be any other number. Then E(|X - m|) ≤ E(|X - d|). The equality will hold if and only if d is also a median of the distribution of X.
Proof (sketch): Suppose X has a continuous distribution with p.d.f. f, and suppose further that d > m. For x ≤ m, |x - d| - |x - m| = d - m, while for x > m, |x - d| - |x - m| ≥ -(d - m); hence

    E(|X - d|) - E(|X - m|) ≥ (d - m)[Pr(X ≤ m) - Pr(X > m)] ≥ 0,

since Pr(X ≤ m) ≥ 1/2 ≥ Pr(X > m).

Example 4.5.5. Suppose that there is probability 1/6 that a random variable X will take each of the six values 0, 1, 2, 3, 5, 7. What is the prediction for which the M.S.E. is minimum? What is the prediction for which the M.A.E. is a minimum?
E(X) = (1/6)(0 + 1 + 2 + 3 + 5 + 7) = 3, so the M.S.E. will be minimized by the unique value d = 3. Any number m in [2, 3] is a median of the given distribution, so the M.A.E. will be minimized by any value of d in [2, 3].
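A quick numerical check of Example 4.5.5 in Python, using a grid search over candidate predictions d:

```python
# Example 4.5.5: the M.S.E. is minimized at the mean, the M.A.E. at any median.
import numpy as np

values = np.array([0, 1, 2, 3, 5, 7])                 # each value has probability 1/6
probs = np.full(6, 1 / 6)

d_grid = np.linspace(0, 7, 701)
mse = np.array([np.sum(probs * (values - d) ** 2) for d in d_grid])
mae = np.array([np.sum(probs * np.abs(values - d)) for d in d_grid])

print("d minimizing M.S.E.:", d_grid[np.argmin(mse)])  # 3.0, the mean
print("d minimizing M.A.E.:", d_grid[np.argmin(mae)])  # a point in the median interval [2, 3]
```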
Covariance and Correlation

Covariance measures the association between two random variables.

Let X and Y be random variables having a specified joint distribution, and let E(X) = μ_X, E(Y) = μ_Y, Var(X) = σ_X², Var(Y) = σ_Y². The covariance of X and Y, Cov(X, Y), is defined as

    Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)].

If σ_X² < ∞ and σ_Y² < ∞, then Cov(X, Y) will be finite. Cov(X, Y) can be positive, negative, or zero.
If 0 < σ_X² < ∞ and 0 < σ_Y² < ∞, the correlation of X and Y, ρ(X, Y), is defined as

    ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y).

The range of possible values of the correlation is -1 ≤ ρ(X, Y) ≤ 1.

Theorem 4.6.1 (Schwarz inequality). For any random variables U and V, [E(UV)]² ≤ E(U²) E(V²).
Proof. (a) If E(U²) = 0, then Pr(U = 0) = 1. It follows that Pr(UV = 0) = 1, so E(UV) = 0 and the relation is satisfied. If E(V²) = 0, the relation is likewise satisfied.
(b) If either E(U²) or E(V²) is infinite, the relation is clearly satisfied.
(c) If 0 < E(U²) < ∞ and 0 < E(V²) < ∞, let a = [E(V²)]^{1/2} and b = [E(U²)]^{1/2}; then 0 ≤ E[(aU ± bV)²] = a² E(U²) ± 2ab E(UV) + b² E(V²), which gives |E(UV)| ≤ [E(U²) E(V²)]^{1/2}.
Applying the Schwarz inequality to U = X - μ_X and V = Y - μ_Y shows that [Cov(X, Y)]² ≤ σ_X² σ_Y², which is why -1 ≤ ρ(X, Y) ≤ 1.

X and Y are positively correlated if ρ(X, Y) > 0; X and Y are negatively correlated if ρ(X, Y) < 0; X and Y are uncorrelated if ρ(X, Y) = 0.

Properties of Covariance and Correlation

Theorem 4.6.2. For any random variables X and Y such that σ_X² < ∞ and σ_Y² < ∞,

    Cov(X, Y) = E(XY) - E(X) E(Y).

Proof. Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)] = E(XY) - μ_X E(Y) - μ_Y E(X) + μ_X μ_Y = E(XY) - E(X) E(Y).

Theorem 4.6.3. If X and Y are independent random variables with σ_X² < ∞ and σ_Y² < ∞, then Cov(X, Y) = ρ(X, Y) = 0.
Proof. If X and Y are independent, then E(XY) = E(X) E(Y). Therefore Cov(X, Y) = E(XY) - E(X) E(Y) = 0. It follows that ρ(X, Y) = 0.
Remark: two uncorrelated random variables can be dependent.

Example 4.6.3. Suppose that X can take only the three values -1, 0, and 1, and that each of these three values has the same probability. Let Y be defined by Y = X². We shall show that X and Y are dependent but uncorrelated.
Proof. Y is a function of X, and, for instance, Pr(Y = 0 | X = 0) = 1 while Pr(Y = 0) = 1/3, so X and Y are clearly dependent. On the other hand, E(X) = 0 and E(XY) = E(X³) = 0, so Cov(X, Y) = E(XY) - E(X) E(Y) = 0: X and Y are uncorrelated.

Theorem 4.6.4. Suppose X is a random variable with 0 < σ_X² < ∞, and suppose that Y = aX + b where a ≠ 0. If a > 0, then ρ(X, Y) = 1. If a < 0, then ρ(X, Y) = -1.
Proof. If Y = aX + b, then μ_Y = a μ_X + b and Y - μ_Y = a(X - μ_X). Therefore Cov(X, Y) = a E[(X - μ_X)²] = a σ_X² and σ_Y = |a| σ_X, so ρ(X, Y) = a σ_X² / (σ_X |a| σ_X) = a/|a|.

Theorem 4.6.5. If X and Y are random variables such that σ_X² < ∞ and σ_Y² < ∞, then

    Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).

Proof. Since E(X + Y) = μ_X + μ_Y,

    Var(X + Y) = E[(X - μ_X + Y - μ_Y)²] = E[(X - μ_X)²] + E[(Y - μ_Y)²] + 2 E[(X - μ_X)(Y - μ_Y)] = Var(X) + Var(Y) + 2 Cov(X, Y).

Remark. For any constants a and b, we can show that Cov(aX, bY) = ab Cov(X, Y). It follows that

    Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y).
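Both the remark and Theorem 4.6.5 are easy to sanity-check by simulation. In the Python sketch below, the correlated pair (X, Y) is an arbitrary construction chosen only for illustration.

```python
# Monte Carlo check of Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y) and Cov(aX, bY) = ab Cov(X,Y).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)
y = 0.5 * x + rng.normal(size=1_000_000)        # Y is correlated with X by construction

a, b = 2.0, -3.0
cov_xy = np.cov(x, y)[0, 1]

print(np.cov(a * x, b * y)[0, 1], a * b * cov_xy)            # Cov(aX, bY) vs ab Cov(X, Y)
print(np.var(x + y, ddof=1),
      np.var(x, ddof=1) + np.var(y, ddof=1) + 2 * cov_xy)    # the two sides of Theorem 4.6.5
```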

Theorem 4.6.6. If X1, ..., Xn are random variables such that Var(Xi) < ∞ for i = 1, ..., n, then

    Var(X1 + ... + Xn) = ∑_{i=1}^n Var(Xi) + 2 ∑_{i<j} Cov(Xi, Xj).

Proof. For any random variable Y, Cov(Y, Y) = Var(Y); the result follows by expanding Cov(∑_i Xi, ∑_j Xj) using the bilinearity of covariance.
Remark. If X1, ..., Xn are uncorrelated random variables, then Var(X1 + ... + Xn) = ∑_{i=1}^n Var(Xi).

Conditional Expectation

Suppose X and Y are random variables with joint p.f. or p.d.f. f(x, y). Let f1(x) denote the marginal p.f. or p.d.f. of X. For any value of x such that f1(x) > 0, let g(y|x) denote the conditional p.f. or p.d.f. of Y given X = x. The conditional expectation of Y given X, denoted by E(Y|X), is specified as a function of X: when X = x, its value is E(Y|x) = ∑_y y g(y|x) in the discrete case, or ∫ y g(y|x) dy in the continuous case.

The Conditional Expectation Is a Random Variable

E(Y|X) is a function of the random variable X, so it is itself a random variable with its own distribution. The distribution of E(Y|X) can be derived from the distribution of X, and we can find the mean and the variance of E(Y|X).

Theorem 4.7.1. For any random variables X and Y, E[E(Y|X)] = E(Y).
Proof. Assume for convenience that X and Y have a continuous joint distribution. Then

    E[E(Y|X)] = ∫ E(Y|x) f1(x) dx = ∫∫ y g(y|x) f1(x) dy dx = ∫∫ y f(x, y) dy dx = E(Y).

Example 4.7.3. Suppose a point X is chosen randomly from the interval (0, 1). Suppose that after the value X = x has been observed, a point Y is then chosen randomly from the interval (x, 1). What is the value of E(Y)?
Solution. Given X = x, Y is uniform on (x, 1), so E(Y|X) = (X + 1)/2. Therefore E(Y) = E[E(Y|X)] = E[(X + 1)/2] = (1/2)(1/2 + 1) = 3/4.

Remark. Generally, suppose X and Y have a continuous joint distribution and that r(X, Y) is any function of X and Y. Then the conditional expectation E[r(X, Y)|X] is defined as a function of X: when X = x, its value is ∫ r(x, y) g(y|x) dy. It can be shown that E{E[r(X, Y)|X]} = E[r(X, Y)].

Example 4.7.4. Linear Conditional Expectation. Suppose that E(Y|X) = aX + b for some constants a and b. Then E(Y) = E[E(Y|X)] = a E(X) + b.

Conditional Variance

Let Var(Y|x) denote the variance of the conditional distribution of Y given X = x, i.e., Var(Y|x) = E{[Y - E(Y|x)]² | x}.

Variance and Conditional Variance (Eve's Law)

    Var(Y) = E[Var(Y|X)] + Var[E(Y|X)].

Proof. E[Var(Y|X)] = E[E(Y²|X)] - E{[E(Y|X)]²} = E(Y²) - E{[E(Y|X)]²}, and Var[E(Y|X)] = E{[E(Y|X)]²} - [E(Y)]². Adding the two expressions gives E(Y²) - [E(Y)]² = Var(Y).
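Example 4.7.3 and Eve's law can both be checked by simulation; the Python sketch below uses the two-stage sampling scheme of the example.

```python
# Monte Carlo check of E(Y) = 3/4 (Example 4.7.3) and of Eve's law.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.uniform(0.0, 1.0, size=n)          # X ~ Uniform(0, 1)
y = rng.uniform(x, 1.0)                    # given X = x, Y ~ Uniform(x, 1)

print(y.mean())                            # about 0.75 = E[E(Y|X)] = E[(X + 1)/2]

cond_mean = (x + 1) / 2                    # E(Y|X)
cond_var = (1 - x) ** 2 / 12               # Var(Y|X), the variance of a Uniform(x, 1)
print(y.var(), cond_var.mean() + cond_mean.var())   # the two sides of Eve's law agree
```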
Prediction

Suppose X and Y have a specified joint distribution, and that after the value of X has been observed, the value of Y must be predicted. Assume the predicted value d(X) must be chosen so as to minimize the mean squared error E{[Y - d(X)]²}.

Theorem 4.7.2. The prediction d(X) that minimizes E{[Y - d(X)]²} is d(X) = E(Y|X).
Proof. d(X) minimizes the M.S.E. if, for every x, d(x) minimizes the conditional M.S.E. E{[Y - d(x)]² | x}. If X = x has been observed, the (conditional) M.S.E. is minimized by using the (conditional) expectation of Y, i.e., d(x) = E(Y|x). It follows that the function d(X) for which the M.S.E. is minimized is d(X) = E(Y|X).
Note: the conditional M.S.E. using d(x) is Var(Y|x), and the overall M.S.E. using d(X) is E[Var(Y|X)]. If the value of Y must be predicted without any information about the value of X, then the best prediction that minimizes the M.S.E. is E(Y), and the M.S.E. is Var(Y). So the reduction in the M.S.E. that can be achieved by using the observation X is

    Var(Y) - E[Var(Y|X)] = Var[E(Y|X)].

Example. Suppose that an observation Y is equally likely to be taken from either of two populations, population 1 or population 2. If the observation is taken from population 1, then the p.d.f. of Y is g1(y); if the observation is taken from population 2, then the p.d.f. of Y is g2(y). If the population from which Y is to be taken is not known, then the predicted value of Y with the smallest M.S.E. will be E(Y), and the M.S.E. of this predicted value will be Var(Y). The marginal p.d.f. of Y is g(y) = (1/2)[g1(y) + g2(y)].
Suppose it is possible to learn from which of the two populations Y is to be taken before predicting Y. Define X = 1 if Y is to be taken from population 1, and X = 2 if Y is to be taken from population 2. Thus, for i = 1, 2, if Y is to be taken from population i, then the predicted value of Y with the smallest M.S.E. will be E(Y|X = i), and the M.S.E. of this prediction is Var(Y|X = i). The overall M.S.E. is E[Var(Y|X)] = (1/2)[Var(Y|X = 1) + Var(Y|X = 2)]. The reduction in M.S.E. achieved by knowing X is Var(Y) - E[Var(Y|X)] = Var[E(Y|X)].

The Sample Mean

Theorem 4.8.1 (Markov inequality). Suppose that X is a random variable such that Pr(X ≥ 0) = 1. Then for any given number t > 0, Pr(X ≥ t) ≤ E(X)/t.
Proof. Assume for convenience that X has a discrete distribution with p.f. f. Then

    E(X) = ∑_x x f(x) ≥ ∑_{x ≥ t} x f(x) ≥ t ∑_{x ≥ t} f(x) = t Pr(X ≥ t).

Remark. The Markov inequality is primarily of interest for large values of t. For example, for any nonnegative random variable X whose mean is 1, the maximum possible value of Pr(X ≥ 100) is 0.01.

Theorem 4.8.2 (Chebyshev inequality). Let X be a random variable for which Var(X) exists. Then for any given number t > 0, Pr(|X - E(X)| ≥ t) ≤ Var(X)/t².
Proof. Let Y = [X - E(X)]². Then Pr(Y ≥ 0) = 1 and E(Y) = Var(X). By applying the Markov inequality to Y,

    Pr(|X - E(X)| ≥ t) = Pr(Y ≥ t²) ≤ Var(X)/t².

Remark. Suppose Var(X) = σ². Then, for example, Pr(|X - E(X)| ≥ 3σ) ≤ 1/9.

Properties of the Sample Mean

Suppose that X1, ..., Xn form a random sample of size n from some distribution for which the mean is μ and the variance is σ². Let X̄_n = (1/n) ∑_{i=1}^n Xi. This random variable is called the sample mean. Then E(X̄_n) = μ and Var(X̄_n) = σ²/n.

Example 4.8.1. Suppose a random sample is to be taken from a distribution for which the mean μ is unknown but the standard deviation is known to be 2. How large a sample must be taken in order to make the probability at least 0.99 that |X̄_n - μ| will be less than 1 unit? By the Chebyshev inequality, Pr(|X̄_n - μ| ≥ 1) ≤ Var(X̄_n) = 4/n, so it suffices that 4/n ≤ 0.01, i.e., n ≥ 400.

Example 4.8.3. Suppose a fair coin is to be tossed n times. For i = 1, ..., n, let Xi = 1 if a head is obtained on the ith toss and Xi = 0 if a tail is obtained on the ith toss. Then the sample mean X̄_n is the proportion of heads obtained on the n tosses. How many times must the coin be tossed in order to make Pr(0.4 ≤ X̄_n ≤ 0.6) ≥ 0.7?
Solution: Let T = ∑_{i=1}^n Xi denote the total number of heads obtained on the n tosses, so X̄_n = T/n. T has a binomial distribution with parameters n and p = 1/2, so E(T) = n/2 and Var(T) = n/4.
1. Using the Chebyshev inequality: Pr(|X̄_n - 1/2| ≥ 0.1) ≤ Var(X̄_n)/(0.1)² = 25/n, so the requirement is met whenever 25/n ≤ 0.3, i.e., n ≥ 84.
2. Using the binomial distribution directly: for n = 15, Pr(0.4 ≤ X̄_n ≤ 0.6) = Pr(6 ≤ T ≤ 9) ≈ 0.70. So 15 tosses is already sufficient.
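The binomial part of Example 4.8.3 can be reproduced exactly with a few lines of Python, and the comparison with the Chebyshev bound shows how conservative that inequality is:

```python
# Example 4.8.3: exact binomial probability at n = 15 versus the Chebyshev requirement.
from math import comb, ceil

n = 15
# 0.4 <= T/n <= 0.6 with n = 15 means 6 <= T <= 9, where T is the number of heads.
p_exact = sum(comb(n, t) for t in range(6, 10)) / 2 ** n
print(p_exact)                         # about 0.698, i.e. roughly the 0.7 target

print(ceil(25 / 0.3))                  # 84: the sample size the Chebyshev bound would demand
```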
The Law of Large Numbers

Convergence in probability. Suppose Z1, Z2, ... is a sequence of random variables. It is said that this sequence converges to a given number b in probability if, for any given number ε > 0,

    lim_{n→∞} Pr(|Zn - b| < ε) = 1.

This is represented by Zn →p b, or plim_{n→∞} Zn = b.

Law of large numbers. Suppose that X1, ..., Xn form a random sample from a distribution for which the mean is μ, and let X̄_n denote the sample mean. Then X̄_n →p μ.
Proof. Assume that the distribution from which the random sample is taken has a finite variance σ². From the Chebyshev inequality, for any ε > 0,

    Pr(|X̄_n - μ| ≥ ε) ≤ σ²/(n ε²) → 0   as n → ∞.

Remark. If a large random sample is taken from a distribution for which the mean is unknown, then the arithmetic average of the values in the sample will usually be a close estimate of the unknown mean.

Continuous Functions of Random Variables

If Zn →p b and g(z) is a function that is continuous at z = b, then g(Zn) →p g(b). If Zn →p b and Yn →p c, and g(z, y) is continuous at (z, y) = (b, c), then g(Zn, Yn) →p g(b, c).

Weak Laws and Strong Laws

Strong convergence: a sequence of random variables Z1, Z2, ... converges to a constant b with probability 1 if Pr(lim_{n→∞} Zn = b) = 1. Weak convergence is convergence in probability.
Strong law of large numbers. If X̄_n is the sample mean of a random sample of size n from a distribution with mean μ, then Pr(lim_{n→∞} X̄_n = μ) = 1.

The Central Limit Theorem

The Central Limit Theorem (Lindeberg and Levy) for the sample mean.
Theorem 5.7.1. Suppose the random variables X1, ..., Xn form a random sample of size n from a given distribution with mean μ and variance σ² (0 < σ² < ∞). Let

    Yn = n^{1/2} (X̄_n - μ)/σ.

Then for any fixed number x, lim_{n→∞} Pr(Yn ≤ x) = Φ(x), where Φ denotes the d.f. of the standard normal distribution.
If a large random sample is taken from any distribution with mean μ and variance σ², regardless of the distributional form, the distribution of X̄_n will be approximately a normal distribution with mean μ and variance σ²/n, and the distribution of the sum ∑_{i=1}^n Xi will be approximately a normal distribution with mean nμ and variance nσ².

Example 5.7.1: Tossing a Coin. Suppose a fair coin is tossed 900 times. What is the probability of obtaining more than 495 heads? For i = 1, ..., 900, let Xi = 1 if a head is obtained on the ith toss and Xi = 0 otherwise. Then E(Xi) = 1/2 and Var(Xi) = 1/4. From the central limit theorem, the total number of heads H = ∑ Xi will have approximately a normal distribution with mean (900)(1/2) = 450, variance (900)(1/4) = 225, and standard deviation 15. The variable Z = (H - 450)/15 will have approximately a standard normal distribution. Thus Pr(H > 495) ≈ Pr(Z > 3) = 1 - Φ(3) ≈ 0.0013.

Example 5.7.2: Sampling from a Uniform Distribution. Suppose that a random sample of size n = 12 is taken from the uniform distribution on the interval (0, 1). What is the value of Pr(|X̄_n - 1/2| ≤ 0.1)? The mean of the uniform distribution on (0, 1) is 1/2, and the variance is 1/12. From the central limit theorem, the distribution of X̄_n will be approximately a normal distribution with mean 1/2, variance 1/144, and standard deviation 1/12. The distribution of Z = (X̄_n - 1/2)/(1/12) will be approximately a standard normal distribution, so Pr(|X̄_n - 1/2| ≤ 0.1) ≈ Pr(|Z| ≤ 1.2) = 2Φ(1.2) - 1 ≈ 0.77.

The Central Limit Theorem (Liapounov) for the Sum of Independent Random Variables

Theorem 5.7.2. Suppose that the random variables X1, X2, ... are independent, with E(Xi) = μ_i and Var(Xi) = σ_i² for i = 1, 2, .... Also, suppose that E(|Xi - μ_i|³) < ∞ for every i and that

    lim_{n→∞} [∑_{i=1}^n E(|Xi - μ_i|³)]^{1/3} / [∑_{i=1}^n σ_i²]^{1/2} = 0.

Let

    Yn = [∑_{i=1}^n Xi - ∑_{i=1}^n μ_i] / [∑_{i=1}^n σ_i²]^{1/2}.

Then, for any fixed number x, lim_{n→∞} Pr(Yn ≤ x) = Φ(x).
If the relevant conditions are satisfied, then for any large value of n the distribution of ∑_{i=1}^n Xi will be approximately a normal distribution with mean ∑_{i=1}^n μ_i and variance ∑_{i=1}^n σ_i².
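The two normal approximations in Examples 5.7.1 and 5.7.2 above are easy to confront with simulation:

```python
# Monte Carlo check of the CLT approximations in Examples 5.7.1 and 5.7.2.
import numpy as np

rng = np.random.default_rng(2)
reps = 200_000

# Example 5.7.1: Pr(more than 495 heads in 900 fair tosses); the CLT gives 1 - Phi(3) ~ 0.0013.
heads = rng.binomial(n=900, p=0.5, size=reps)
print((heads > 495).mean())

# Example 5.7.2: sample mean of 12 Uniform(0,1) draws; the CLT gives Pr(|Xbar - 0.5| <= 0.1) ~ 0.77.
xbar = rng.uniform(0.0, 1.0, size=(reps, 12)).mean(axis=1)
print((np.abs(xbar - 0.5) <= 0.1).mean())
```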
The Central Limit Theorem for Bernoulli Random Variables

Theorem 5.7.3. Suppose that the random variables X1, X2, ... are independent and that Xi has a Bernoulli distribution with parameter p_i (i = 1, 2, ...). Suppose also that the infinite series ∑_{i=1}^∞ p_i q_i is divergent, where q_i = 1 - p_i. Let

    Yn = [∑_{i=1}^n Xi - ∑_{i=1}^n p_i] / [∑_{i=1}^n p_i q_i]^{1/2}.

Then for any fixed number x, lim_{n→∞} Pr(Yn ≤ x) = Φ(x).
The distribution of the sum of a large number of independent Bernoulli random variables can therefore always be approximated by a normal distribution with mean ∑_{i=1}^n p_i and variance ∑_{i=1}^n p_i q_i. The normal distribution provides a good approximation when the value of ∑_{i=1}^n p_i q_i is large. The approximation will be best when n is large and the values of p1, ..., pn are close to 1/2.

Example 5.7.6: Examination Questions. Suppose an examination contains 99 questions arranged in a sequence from the easiest to the most difficult. Suppose the probability that a particular student will answer the ith question correctly is 1 - i/100 for i = 1, ..., 99. Assume that all questions will be answered independently and that the student must answer at least 60 questions correctly to pass the examination. What is the probability that the student will pass?
Let Xi = 1 if the ith question is answered correctly and Xi = 0 otherwise. Then E(Xi) = p_i = 1 - i/100 and Var(Xi) = p_i q_i = (i/100)[1 - (i/100)]. From the central limit theorem, the distribution of the total number of correct answers T = ∑_{i=1}^{99} Xi will be approximately a normal distribution with mean ∑ p_i = 49.5 and standard deviation [∑ p_i q_i]^{1/2} ≈ 4.08. So the variable Z = (T - 49.5)/4.08 will have approximately a standard normal distribution, and Pr(T ≥ 60) ≈ Pr(Z ≥ 2.57) = 1 - Φ(2.57) ≈ 0.005.

Convergence in Distribution

Let X1, X2, ... be a sequence of random variables, and for n = 1, 2, ..., let Fn denote the d.f. of Xn. Let X* denote another random variable with d.f. F*. Assume that F* is a continuous function over the entire real line. Then the sequence X1, X2, ... converges in distribution to the random variable X* if lim_{n→∞} Fn(x) = F*(x) for every x. Sometimes we simply say that Xn converges in distribution to X*, and the distribution of X* is called the asymptotic distribution of Xn. In the central limit theorems, the asymptotic distribution of Yn is a standard normal distribution.

Convergence of the Moment Generating Functions

Theorem 5.7.4. Let X1, X2, ... be a sequence of random variables, and for n = 1, 2, ..., let Fn denote the d.f. of Xn and let ψ_n denote the m.g.f. of Xn. Also, let X* denote another random variable with d.f. F* and m.g.f. ψ*. Suppose that the m.g.f.'s ψ_n and ψ* exist (n = 1, 2, ...). If lim_{n→∞} ψ_n(t) = ψ*(t) for all values of t in some interval around t = 0, then the sequence X1, X2, ... converges in distribution to X*.

The Gamma Distribution

The gamma function: for any positive number a, let the value Γ(a) be defined by the integral

    Γ(a) = ∫_0^∞ x^{a-1} e^{-x} dx.

The function Γ is called the gamma function.

Properties of the Gamma Function

Theorem 5.9.1. If a > 1, then Γ(a) = (a - 1) Γ(a - 1).
Proof. By the method of integration by parts,

    Γ(a) = ∫_0^∞ x^{a-1} e^{-x} dx = [-x^{a-1} e^{-x}]_0^∞ + (a - 1) ∫_0^∞ x^{a-2} e^{-x} dx = (a - 1) Γ(a - 1).

Theorem 5.9.2. For any positive integer n, Γ(n) = (n - 1)!.
Proof. It follows from Theorem 5.9.1 that for any integer n ≥ 2, Γ(n) = (n - 1) Γ(n - 1) = (n - 1)(n - 2) Γ(n - 2) = ... = (n - 1)(n - 2) ··· 1 · Γ(1). Furthermore, Γ(1) = ∫_0^∞ e^{-x} dx = 1. Therefore Γ(n) = (n - 1)! for n = 2, 3, .... Moreover, Γ(1) = 1 = 0!, so Γ(n) = (n - 1)! for n = 1, 2, ....

Let x = y²/2; then dx = y dy and

    Γ(1/2) = ∫_0^∞ x^{-1/2} e^{-x} dx = 2^{1/2} ∫_0^∞ e^{-y²/2} dy = π^{1/2}.

For any positive integer n, Γ(n + 1/2) = (n - 1/2)(n - 3/2) ··· (1/2) π^{1/2}.

The Gamma Distribution

A random variable X has a gamma distribution with parameters a and b (a > 0 and b > 0) if X has a continuous distribution with p.d.f.

    f(x) = [b^a / Γ(a)] x^{a-1} e^{-bx}   for x > 0,

and f(x) = 0 for x ≤ 0. When b = 1, the p.d.f. of X is f(x) = [1/Γ(a)] x^{a-1} e^{-x} for x > 0. Let Y = X/b; the p.d.f. of Y can be derived as

    g(y) = b f(by) = [b^a / Γ(a)] y^{a-1} e^{-by}   for y > 0.

So Y follows a gamma distribution with parameters a and b.

Moments

If X has a gamma distribution with parameters a and b, then for k = 1, 2, ...,

    E(X^k) = Γ(a + k) / [b^k Γ(a)] = a(a + 1) ··· (a + k - 1) / b^k.

In particular, E(X) = a/b and Var(X) = a/b².

Moment Generating Function

The m.g.f. ψ of X can be obtained as

    ψ(t) = E(e^{tX}) = ∫_0^∞ [b^a / Γ(a)] x^{a-1} e^{-(b-t)x} dx = [b/(b - t)]^a   for t < b.

Theorem 5.9.3. If the random variables X1, ..., Xk are independent and Xi has a gamma distribution with parameters a_i and b (i = 1, ..., k), then the sum X1 + ... + Xk has a gamma distribution with parameters a1 + ... + ak and b.
Proof. If ψ_i denotes the m.g.f. of Xi, then for i = 1, ..., k, ψ_i(t) = [b/(b - t)]^{a_i} for t < b. If ψ denotes the m.g.f. of the sum X1 + ... + Xk, then

    ψ(t) = ∏_{i=1}^k ψ_i(t) = [b/(b - t)]^{a1 + ... + ak}   for t < b,

which is the m.g.f. of a gamma distribution with parameters a1 + ... + ak and b.

The Exponential Distribution

A random variable X has an exponential distribution with parameter b (b > 0) if X has a continuous distribution with p.d.f.

    f(x) = b e^{-bx}   for x > 0,

and f(x) = 0 for x ≤ 0. The exponential distribution is a special case of the gamma distribution with a = 1.

Memoryless Property

If X has an exponential distribution with parameter b, then for any numbers t > 0 and h > 0,

    Pr(X ≥ t + h | X ≥ t) = Pr(X ≥ h).
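The memoryless property is easy to see in simulation. In the Python sketch below, the rate parameter and the cut-offs t and h are arbitrary illustrative choices.

```python
# Monte Carlo check of Pr(X >= t + h | X >= t) = Pr(X >= h) for an exponential X.
import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(scale=1.0, size=2_000_000)   # exponential with b = 1 (scale = 1/b)
t, h = 0.7, 1.3

lhs = (x >= t + h).sum() / (x >= t).sum()        # conditional relative frequency
rhs = (x >= h).mean()
print(lhs, rhs)                                   # the two agree up to Monte Carlo error
```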
