Lecture 10 / Chapter 10: Introduction to Probabilistic Graphical Models
Weike Pan and Congfu Xu (panweike, xucongfu)
Institute of Artificial Intelligence, College of Computer Science, Zhejiang University
October 12, 2006
Lecture notes for the Introduction to Artificial Intelligence course, College of Computer Science, Zhejiang University

Reference: An Introduction to Probabilistic Graphical Models. Michael I. Jordan.

Outline
- Preparations
- Probabilistic Graphical Models (PGM): Directed PGM, Undirected PGM
- Insights of PGM

Preparations
- PGM "is" a universal model
- Different thoughts of machine learning
- Different training approaches
- Different data types
- Bayesian framework
- Chain rule of probability theory
- Conditional independence
Different thoughts of machine learning
- Statistics (modeling uncertainty, detailed information) vs. Logics (modeling complexity, high-level information)
- Unifying Logical and Statistical AI. Pedro Domingos, University of Washington. AAAI 2006.
- Speech: statistical information (acoustic model + language model + affect model) + high-level information (expert/logics)

Different training approaches
- Maximum Likelihood Training: MAP (Maximum a Posteriori) vs. Discriminative Training: Maximum Margin (SVM); the criteria are written out after this list.
- Speech: the classical combination of Maximum Likelihood and Discriminative Training
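As a reminder, the criteria contrasted above can be written as follows; the notation (parameters \theta, data D, and the linear SVM primal for the max-margin case) is ours and not taken from the slides.

  \hat{\theta}_{\mathrm{ML}}  = \arg\max_{\theta} \; p(D \mid \theta)
  \hat{\theta}_{\mathrm{MAP}} = \arg\max_{\theta} \; p(D \mid \theta)\, p(\theta)
  Maximum margin (linear SVM):  \min_{w,b} \tfrac{1}{2}\lVert w \rVert^{2} \ \ \text{s.t.}\ \ y_n (w^{\top} x_n + b) \ge 1 \ \ \forall n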
Different data types
- Directed acyclic graph (Bayesian Networks, BN): modeling asymmetric effects and dependencies, i.e. causal/temporal dependence (e.g. speech analysis, DNA sequence analysis)
- Undirected graph (Markov Random Fields, MRF): modeling symmetric effects and dependencies, i.e. spatial dependence (e.g. image analysis)

PGM "is" a universal model
- It models both temporal and spatial data, by unifying
  - thoughts: Statistics + Logics
  - approaches: Maximum Likelihood Training + Discriminative Training
- Furthermore, the directed and undirected models together provide modeling power beyond that which could be provided by either alone.

Bayesian framework
- Problem description: Observation -> Conclusion (classification or prediction)
- What we care about is the conditional probability, and it is a ratio of two marginal probabilities: the a posteriori probability of class i equals the likelihood times the prior probability, divided by a normalization factor computed from the observation.
- Bayes rule, the chain rule of probability theory, and conditional independence are the basic tools; they are written out below.
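The three tools named above, written out explicitly (the observation is denoted x and the class label C_i; these symbols are ours, not the slides'):

  Bayes rule:  p(C_i \mid x) = \frac{p(x \mid C_i)\, p(C_i)}{p(x)}, \qquad p(x) = \sum_j p(x \mid C_j)\, p(C_j)
  Chain rule:  p(x_1, \dots, x_n) = \prod_{i=1}^{n} p(x_i \mid x_1, \dots, x_{i-1})
  Conditional independence:  X \perp Z \mid Y \iff p(x, z \mid y) = p(x \mid y)\, p(z \mid y), equivalently p(x \mid y, z) = p(x \mid y)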
PGM
- Nodes represent random variables/states.
- The missing arcs represent conditional independence assumptions.
- The graph structure implies a factorization of the joint probability distribution.

Directed PGM (BN)
- Representation, Conditional Independence, Probability Distribution, Queries, Implementation, Interpretation

Probability Distribution
- Definition of the joint probability distribution over the graph (see the factorization below); check that it is properly normalized.

Representation
- Graphical models represent joint probability distributions more economically, using a set of "local" relationships among variables.
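The definition referred to above, in the usual BN notation (\pi_i denotes the set of parents of node i; the notation follows Jordan's book rather than the slides):

  p(x_1, \dots, x_n) = \prod_{i=1}^{n} p(x_i \mid x_{\pi_i})

Check: summing out the variables in reverse topological order, each local conditional sums to one, so the product is a properly normalized joint distribution. Example for the chain X -> Y -> Z: p(x, y, z) = p(x)\, p(y \mid x)\, p(z \mid y).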
Conditional Independence (basic)
- Assert the conditional independence of a node from its ancestors, conditional on its parents (written as a formula below).
- Interpret missing edges in terms of conditional independence.
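A compact way to write the basic assertion above (\pi_i are the parents of node i and \nu_i its ancestors other than the parents; the symbols are ours):

  X_i \perp X_{\nu_i} \mid X_{\pi_i}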
Conditional Independence (3 canonical graphs)
- Classical Markov chain: "past", "present", "future".
- Common cause: Y "explains" all the dependencies between X and Z.
- Common effect: marginal independence of X and Z, but multiple, competing explanations become coupled once the common effect is observed.
The three structures and the independences they assert are written out below.
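The three structures and the independence statements they encode (X, Y, Z as in the bullets above):

  Markov chain   X \to Y \to Z:          X \perp Z \mid Y
  Common cause   X \leftarrow Y \to Z:   X \perp Z \mid Y
  Common effect  X \to Y \leftarrow Z:   X \perp Z marginally, but in general X \not\perp Z \mid Y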
Conditional Independence (check)
- One incoming arrow and one outgoing arrow
- Two outgoing arrows
- Two incoming arrows
- Check through reachability: the Bayes ball algorithm (rules); a sketch of an equivalent reachability test is given after this list.
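The slides invoke the Bayes ball rules; as a minimal sketch of the same "check through reachability" idea, the Python fragment below tests d-separation with the equivalent moralized-ancestral-graph criterion rather than the Bayes ball rules. The graph encoding (a dict of parent lists) and all names are our own illustration, not from the slides.

from collections import deque

def d_separated(parents, xs, zs, given):
    """True if every node in xs is d-separated from every node in zs given `given`.

    parents: dict mapping each node to the list of its parents (a DAG).
    """
    xs, zs, given = set(xs), set(zs), set(given)

    # 1. Restrict to the ancestral subgraph of xs | zs | given.
    relevant = set()
    frontier = deque(xs | zs | given)
    while frontier:
        node = frontier.popleft()
        if node in relevant:
            continue
        relevant.add(node)
        frontier.extend(parents.get(node, []))

    # 2. Moralize: connect co-parents of each node, then drop edge directions.
    neighbours = {n: set() for n in relevant}
    for child in relevant:
        ps = [p for p in parents.get(child, []) if p in relevant]
        for p in ps:
            neighbours[child].add(p)
            neighbours[p].add(child)
        for i in range(len(ps)):
            for j in range(i + 1, len(ps)):
                neighbours[ps[i]].add(ps[j])
                neighbours[ps[j]].add(ps[i])

    # 3. Remove the conditioning nodes and test reachability from xs to zs.
    frontier = deque(xs - given)
    seen = set(frontier)
    while frontier:
        node = frontier.popleft()
        if node in zs:
            return False          # a path survives, so not d-separated
        for nb in neighbours[node]:
            if nb not in seen and nb not in given:
                seen.add(nb)
                frontier.append(nb)
    return True

# Example: the common-effect graph X -> Y <- Z.
dag = {"X": [], "Z": [], "Y": ["X", "Z"]}
print(d_separated(dag, {"X"}, {"Z"}, set()))    # True: marginally independent
print(d_separated(dag, {"X"}, {"Z"}, {"Y"}))    # False: conditioning on Y couples them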
Undirected PGM (MRF)
- Representation, Conditional Independence, Probability Distribution, Queries, Implementation, Interpretation

Probability Distribution (1)
- Clique: a clique of a graph is a fully connected subset of nodes.
- Local functions should not be defined on domains of nodes that extend beyond the boundaries of cliques.
- Maximal cliques: the maximal cliques of a graph are the cliques that cannot be extended to include additional nodes without losing the property of being fully connected.
- We restrict ourselves to maximal cliques without loss of generality, as they capture all possible dependencies.
- Potential function (local parameterization): a potential function on the possible realizations of the maximal clique.
Probability Distribution (2)
- Maximal cliques

Probability Distribution (3)
- Joint probability distribution, normalization factor, Boltzmann distribution (written out below)
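The quantities named above in the usual MRF notation (C ranges over the maximal cliques, \psi_C is the potential on the clique realization x_C, and Z is the normalization factor; the symbols follow common usage and are not taken verbatim from the slides):

  p(x) = \frac{1}{Z} \prod_{C} \psi_C(x_C), \qquad Z = \sum_{x} \prod_{C} \psi_C(x_C)

Boltzmann distribution: writing \psi_C(x_C) = \exp\{-E_C(x_C)\} gives p(x) = \frac{1}{Z} \exp\{-\sum_C E_C(x_C)\}, the energy-based form.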
Conditional Independence
- It is a "reachability" problem in graph theory.

Representation

Insights of PGM (Michael I. Jordan)
"Probabilistic Graphical Models are a marriage between probability theory and graph theory. A graphical model can be thought of as a probabilistic database, a machine that can answer 'queries' regarding the values of sets of random variables. We build up the database in pieces, using probability theory to ensure that the pieces have a consistent overall interpretation. Probability theory also justifies the inferential machinery that allows the pieces to be put together 'on the fly' to answer the queries."