




Pattern Recognition
Nanyang Technological University
Dr. Shi, Daming
Harbin Engineering University

What is Pattern Recognition?
- Classify raw data into the 'category' of the pattern.
- A branch of artificial intelligence concerned with the identification of visual or audio patterns by computers.
- For example: character recognition, speech recognition, face recognition, etc.
- Two categories: syntactic (or structural) pattern recognition and statistical pattern recognition.
- Pattern Recognition = Pattern Classification.

[Figure: the two phases of a pattern recognition system. Training phase: training data -> feature extraction -> learning (feature selection, clustering, discriminant function generation, grammar parsing) -> knowledge. Recognition phase: unknown data -> feature extraction -> recognition (statistical, structural) -> results.]

Categorisation
- Based on application areas: face recognition, speech recognition, character recognition, etc.
- Based on decision-making approaches: syntactic pattern recognition and statistical pattern recognition.

Syntactic Pattern Recognition
The problem is described with a formal language, and the solution is obtained through grammatical parsing. In memory of Prof. FU, King-Sun and Prof. Shu Wenhao.

Statistical Pattern Recognition
In the statistical approach, each pattern is viewed as a point in a multi-dimensional space. The decision boundaries are determined by the probability distributions of the patterns belonging to each class, which must either be specified or learned.

Scope of the Seminar
- Module 1: Distance-Based Classification
- Module 2: Probabilistic Classification
- Module 3: Linear Discriminant Analysis
- Module 4: Neural Networks for P.R.
- Module 5: Clustering
- Module 6: Feature Selection

Module 1: Distance-Based Classification

Overview
- Distance-based classification is the most common type of pattern recognition technique.
- Its concepts are a basis for other classification techniques.
- First, a prototype is chosen through training to represent a class.
- Then, the distance from an unknown datum to the class is calculated using the prototype.
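The overview above can be sketched concretely. This is a minimal sketch of a minimum-distance classifier using the sample mean as the class prototype; the fish labels and feature values are hypothetical:

```python
import math

def euclidean(x, z):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((xi - zi) ** 2 for xi, zi in zip(x, z)))

def class_means(samples):
    """Sample-mean prototype per class: {label: [vectors]} -> {label: mean vector}."""
    return {
        label: [sum(col) / len(vectors) for col in zip(*vectors)]
        for label, vectors in samples.items()
    }

def classify(x, prototypes):
    """Assign x to the class whose prototype is nearest."""
    return min(prototypes, key=lambda label: euclidean(x, prototypes[label]))

# Hypothetical training data in a 2-D (lightness, length) feature space.
training = {
    "sea bass": [(4.0, 10.0), (5.0, 11.0)],
    "salmon":   [(8.0, 4.0), (9.0, 5.0)],
}
prototypes = class_means(training)
print(classify((8.5, 4.5), prototypes))  # -> salmon
```

Swapping `euclidean` for another metric changes how 'near' is defined without touching the rest of the classifier.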
Classification by Distance
- Objects can be represented by vectors in a space.
- In training, we have labelled samples for each class.
- In recognition, an unknown datum is classified by its distance to each class.
- How do we represent classes?

Prototype
To find the pattern-to-class distance, we need to use a class prototype (pattern):
(1) Sample Mean. For class c_i, the prototype is z_i = (1/N_i) * sum of the training samples x in c_i.
(2) Most Typical Sample. Choose the sample z in c_i such that its total distance to the other samples of c_i is minimized.

Prototype - Nearest Neighbour
(3) Nearest Neighbour. Choose the training sample z such that its distance to the unknown pattern is minimized. Nearest-neighbour prototypes are sensitive to noise and outliers in the training set.

Prototype - k-NN
(4) k-Nearest Neighbours. k-NN is more robust against noise, but is more computationally expensive. The pattern y is classified into the class of the majority of its k nearest neighbours from the training samples. The chosen distance determines how 'near' is defined.

Distance Measures
- The most familiar distance metric is the Euclidean distance: d_E(x, y) = sqrt( sum_i (x_i - y_i)^2 ).
- Another example is the Manhattan distance: d_M(x, y) = sum_i |x_i - y_i|.
- Many other distance measures exist.

Minimum Euclidean Distance (MED) Classifier
Assign x to the class c_k whose prototype z_k minimizes ||x - z_k||. Equivalently, since ||x - z_k||^2 = x'x - 2 z_k'x + z_k'z_k and x'x is common to all classes, assign x to the class that maximizes the linear discriminant g_k(x) = z_k'x - (1/2) z_k'z_k.

Decision Boundary
Given a prototype and a distance metric, it is possible to find the decision boundary between classes. The boundary may be linear or nonlinear. Decision Boundary = Discriminant Function.

Example
Any fish is a vector in the 2-dimensional space of width and lightness.
[Figure: fish samples plotted in the lightness-length plane, separated by a decision boundary.]

Summary (Module 1)
- Classification by the distance from an unknown datum to class prototypes.
- Choosing a prototype: sample mean, most typical sample, nearest neighbour, k-nearest neighbours.
- Decision Boundary = Discriminant Function.

Module 2: Probabilistic Classification

Review and Extend

Maximum A Posteriori (MAP) Classifier
Ideally, we want to favour the class with the highest probability for the given pattern: choose the class C_i that maximizes P(C_i|x), where P(C_i|x) is the a posteriori probability of class C_i given x.

Bayesian Classification
Bayes' Theorem: P(C_i|x) = P(x|C_i) P(C_i) / P(x), where P(x|C_i) is the class-conditional probability density (p.d.f.), which needs to be estimated from the available samples or otherwise assumed, and P(C_i) is the a priori probability of class C_i.

MAP Classifier
The Bayesian classifier, also known as the MAP classifier, assigns the pattern x to the class with the maximum weighted p.d.f., i.e. the class maximizing P(x|C_i) P(C_i).

Accuracy vs. Risk
However, in the real world, life is not just about accuracy. In some cases, a small misclassification may result in a big disaster, for example in medical diagnosis or fraud detection. The MAP classifier is biased towards the most likely class - maximum likelihood classification.

Loss Function
On the other hand, in the case of P(C1) >> P(C2), the lowest error rate can be attained by always classifying as C1. This is also known as the problem of imbalanced training data. A solution is to assign a loss to misclassification, which leads to the notion of conditional risk.

Conditional Risk
Instead of using the posterior P(C_i|x) alone, we use the conditional risk R(a_i|x) = sum_j L(a_i|C_j) P(C_j|x), where L(a_i|C_j) is the cost of taking action a_i given class C_j.
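The conditional-risk computation can be sketched as follows; the action names, the posteriors, and the loss values are illustrative assumptions, not part of the original slides:

```python
def conditional_risk(action, posteriors, loss):
    """R(a|x) = sum_j loss[a][Cj] * P(Cj|x)."""
    return sum(loss[action][c] * p for c, p in posteriors.items())

def min_risk_action(posteriors, loss):
    """Choose the action with the lowest conditional risk for the pattern."""
    return min(loss, key=lambda a: conditional_risk(a, posteriors, loss))

# Hypothetical posteriors for one transaction, and a loss matrix where a
# missed fraud costs 50 times more than a false alarm.
posteriors = {"fraud": 0.05, "no fraud": 0.95}
loss = {
    "flag":  {"fraud": 0.0,  "no fraud": 1.0},   # false-alarm cost
    "clear": {"fraud": 50.0, "no fraud": 0.0},   # missed-fraud cost
}
print(min_risk_action(posteriors, loss))  # -> flag
```

Although 'no fraud' is 19 times more probable here, the asymmetric loss makes flagging the transaction the lower-risk action: R(flag|x) = 0.95 versus R(clear|x) = 2.5.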
To minimize the overall risk, choose the action with the lowest conditional risk for the pattern: a* = argmin_i R(a_i|x).

Example
Assume that the amount of fraudulent activity is about 1% of the total credit card activity:
- C1 = Fraud, P(C1) = 0.01
- C2 = No fraud, P(C2) = 0.99
If losses are equal for the two misclassifications, we simply compare P(x|C1) P(C1) with P(x|C2) P(C2), and the overwhelming prior P(C2) means almost every transaction is classified as 'no fraud'.

Example
However, losses are probably not the same. Classifying a fraudulent transaction as legitimate leads to direct dollar losses as well as intangible losses (e.g. reputation, hassles for consumers). Classifying a legitimate transaction as fraudulent inconveniences consumers, as their purchases are denied, which could lead to loss of future business. Let us assume that the ratio of loss for 'no fraud' to 'fraud' is 1 to 50, i.e. a missed fraud is 50 times more expensive than accidentally freezing a card due to legitimate use.

Example
By including the loss function, the decision boundaries change significantly. Instead of comparing P(x|C1) P(C1) with P(x|C2) P(C2), we compare the loss-weighted quantities 50 * P(x|C1) P(C1) with 1 * P(x|C2) P(C2).

Probability Density Function
Relatively speaking, it is much easier to estimate the a priori probability, e.g. simply take P(C_i) = N_i / N. To estimate the p.d.f., we can:
(1) assume a known form of p.d.f. and estimate its parameters, or
(2) estimate a non-parametric p.d.f. from the training samples.

Maximum Likelihood Parameter Estimation
Without loss of generality, we consider a Gaussian density: P(x|C_i) = N(mu_i, Sigma_i), with the training examples for class C_i given and the parameter values mu_i, Sigma_i to be identified. We look for the parameters that maximize the likelihood of the training examples, which yields the sample mean mu_i = (1/N) sum_k x_k and the sample covariance matrix Sigma_i = (1/N) sum_k (x_k - mu_i)(x_k - mu_i)'.

Density Estimation
If we do not know the specific form of the p.d.f., then we need a different density estimation approach: a non-parametric technique that uses variations of histogram approximation.
(1) The simplest density estimation is to use "bins". E.g., in the 1-D case, take the x-axis and divide it into bins of length h, and estimate the probability of a sample falling in each bin as k_N / N, where k_N is the number of samples in the bin.
(2) Alternatively, we can take windows of unit volume and apply these windows to each sample. The overlap of the windows defines the estimated p.d.f. This technique is known as Parzen windows or kernels.

Summary (Module 2)
- Bayes' Theorem
- Maximum A Posteriori classifier (= maximum likelihood classifier when priors are equal)
- Density estimation

Module 3: Linear Discriminant Analysis

Linear Classifier (1)
A linear classifier implements a discriminant function, or a decision boundary, represented by a straight line in the multidimensional space. Given an input x = (x1 ... xm)', the decision boundary of a linear classifier is given by the discriminant function f(x) = w'x + b, with weight vector w = (w1 ... wm)'.

Linear Classifier (2)
The output of the function f(x) for any input depends upon the values of the weight vector and the input vector. For example, the following class definition may be employed: if f(x) > 0, then x is a ballet dancer; if f(x) <= 0, then x is a rugby player.

Linear Classifier (3)
[Figure: the boundary f(x) = 0 in the (x1, x2) plane, separating the regions f(x) > 0 and f(x) < 0, with the weight vector w normal to the boundary.]
The boundary is always orthogonal to the weight vector w.
The inner product w'x of the input vector and the weight vector is the same for all points on the boundary, namely w'x = -b.

Perceptron
[Figure: a perceptron with inputs x1, x2, weights w1, w2, bias b, a linear combiner, and an activation function producing the output y.]

Multi-class Problem
[Figure]

Limitation of Perceptron
A single-layer perceptron can perform pattern classification only on linearly separable patterns.
[Figure: (a) linearly separable patterns; (b) non-linearly separable patterns.]

Generalized Linear Discriminant Functions
- Decision boundaries which separate between classes may not always be linear.
- The complexity of the boundaries may sometimes require the use of highly non-linear surfaces.
- A popular approach to generalize the concept of linear decision functions is to consider a generalized decision function g(x) = w' phi(x), where phi is a nonlinear mapping function.

Summary (Module 3)
- Linear classifier; vector analysis
- Perceptron
- The perceptron cannot classify linearly non-separable patterns; hence MLP, RBF, SVM.

Module 4: Neural Networks for Pattern Recognition
Details in another seminar: Neural Networks.

Module 5: Clustering

Supervised Learning vs. Unsupervised Learning
- Supervised learning (the target output is known): for each training input pattern, the network is presented with the correct target answer (the desired output) by a teacher.
- Unsupervised learning (the target output is unknown): for each training input pattern, the network adjusts its weights without knowing the correct target. In unsupervised training, the network self-organizes to classify similar input patterns into clusters.

Clustering
A cluster is a set of patterns that are more similar to each other than to patterns not in the cluster. Given unlabelled samples, and having no information about the classes, we want to discover whether there are any naturally occurring clusters in the data. Two approaches:
- Clustering by distance measure
- Clustering by density estimation

Clustering by Distance
Two issues:
- How to measure the similarity between samples?
- How to evaluate a partitioning of a set into clusters?
Typical distance metrics include the Euclidean distance, the Hamming distance, etc.

Goodness of Partitioning
We can use a measure of the scatter of each cluster to gauge how good the overall clustering is. In general, we would like compact clusters with a lot of space between them. We can use the measure of goodness to iteratively move samples from one cluster to another to optimize the grouping.

Criterion: Sum of Squared Error
J_e = sum over clusters i of ( sum over x in c_i of ||x - m_i||^2 ). This criterion defines clusters by their mean vectors m_i, in the sense that it minimizes the sum of the squared lengths of the errors x - m_i. The optimal partition is defined as one that minimizes J_e, also called the minimum variance partition. It works fine when the clusters form well-separated compact clouds, less so when there are great differences in the number of samples in different clusters.

Criterion: Scatter
Scatter matrices as used in multiple discriminant analysis, i.e. the within-cluster scatter matrix S_W and the between-cluster scatter matrix S_B, satisfy S_T = S_B + S_W, where the total scatter S_T depends only on the set of samples (not on the partitioning). The criterion can be to minimize the within-cluster scatter or to maximize the between-cluster scatter. The trace (sum of diagonal elements) is the simplest scalar measure of a scatter matrix, as it is proportional to the sum of the variances in the coordinate directions.

Iterative Optimization
Once a criterion function has been selected, clustering becomes a problem of discrete optimization. As the sample set is finite, there is a finite number of possible partitions, and the optimal one can always be found by exhaustive search. Most frequently, an iterative optimization procedure is adopted to select the optimal partition. The basic idea lies in starting from a reasonable initial partition and "moving" samples from one cluster to another, trying to minimize the criterion function. In general, this kind of approach guarantees local, not global, optimization.

K-Means Clustering
The k-means clustering algorithm:
1) Initialization. t = 0. Choose random values for the initial centres c_k(t), k = 1, ..., K.
2) Sampling. Draw a sample x from the training sample set.
3) Similarity matching. Let k(x) denote the index of the best matching centre: k(x) = argmin_k ||x - c_k(t)||.
4) Updating. For every k = 1, ..., K: c_k(t+1) = c_k(t) + eta * (x - c_k(t)) if k = k(x), and c_k(t+1) = c_k(t) otherwise.
5) Continuation. t = t + 1; go back to step (2) until no noticeable changes are observed.
[Figures: the centres c1, c2 (and later c3) migrating towards the cluster centroids over successive iterations.]

Clustering by Density Estimation
Example: finding the nucleus and cytoplasm pixels in white blood cells from the image grey-level histogram. Set beta = valley (local minimum) of the histogram; if a pixel's value > beta, the pixel is cytoplasm; if its value < beta, the pixel is nucleus. This is clustering based on density estimation: peaks = cluster centres, valleys = cluster boundaries.

Parameterized Density Estimation
We shall begin with a parameterized p.d.f., in which the only thing that must be learned is the value of an unknown parameter vector theta.
We make the following assumptions:
- The samples come from a known number c of classes.
- The prior probabilities P(omega_j) for each class are known.
- The forms of the class-conditional densities P(x|omega_j, theta_j), j = 1, ..., c, are known.
- The values of the c parameter vectors theta_1, ..., theta_c are unknown.

Mixture Density
The category labels are unknown, and this density function is called a mixture density: p(x|theta) = sum_j P(x|omega_j, theta_j) P(omega_j). Our goal will be to use samples drawn from this mixture density to estimate the unknown parameter vector theta. Once theta is known, we can decompose the mixture into its components and use a MAP classifier on the derived densities.

Chinese Ying-Yang Philosophy
Everything in the universe can be viewed as a product of a constant conflict between the opposites, Ying and Yang. Ying: negative, female, invisible. Yang: positive, male, visible. The optimal status is reached if Ying and Yang achieve harmony.

Bayesian Ying-Yang Clustering
To find clusters y to partition the input data x: x is visible but y is invisible; x decides y in training, but y decides x in running. The joint density can be factored in two ways: p(x, y) = p(y|x) p(x) (the Yang machine) and p(x, y) = p(x|y) p(y) (the Ying machine).

Bayesian Ying-Yang Harmony Learning
(1) Minimise the difference between the Ying-Yang pair, and select the optimal model (cluster number) accordingly.
(2) Parameter learning uses the EM algorithm, alternating an E-step and an M-step.

Summary (Module 5)
- Clustering by distance: goodness of partitioning; k-means
- Clustering by density estimation: BYY

Module 6: Feature Selection

Motivation
Classifier performance depends on a combination of the number of samples, the number of features, and the complexity of the classifier.
- Q1: The more samples, the better?
- Q2: The more features, the better?
- Q3: The more complex, the better?
However, the number of samples is fixed when training. Both considerations require reducing the number of features.
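Looking back at the k-means procedure of Module 5, it can be sketched as a minimal batch variant (the slides describe a sequential, per-sample update; this sketch uses the common batch mean update instead, and the 1-D data points and K = 2 are hypothetical):

```python
import random

def kmeans(samples, k, iterations=100, seed=0):
    """Batch k-means: alternate nearest-centre assignment and centre re-estimation."""
    rng = random.Random(seed)
    centres = rng.sample(samples, k)
    for _ in range(iterations):
        # Similarity matching: assign each sample to its best matching centre.
        clusters = [[] for _ in range(k)]
        for x in samples:
            best = min(range(k), key=lambda j: abs(x - centres[j]))
            clusters[best].append(x)
        # Updating: move each centre to the mean of its cluster.
        new_centres = [sum(c) / len(c) if c else centres[j]
                       for j, c in enumerate(clusters)]
        if new_centres == centres:  # Continuation: stop when nothing changes.
            break
        centres = new_centres
    return sorted(centres)

# Two hypothetical 1-D clusters around 0 and 10.
data = [0.0, 0.5, 1.0, 9.0, 9.5, 10.0]
print(kmeans(data, k=2))  # -> [0.5, 9.5]
```

As noted under Iterative Optimization, this procedure converges to a local, not necessarily global, minimum of the sum-of-squared-error criterion, so the random initialization of the centres matters.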