Bayesian Networks - an introduction
Henrik Bengtsson
Computer Science and Technology, Lund Institute of Technology
Master's Thesis Project, 1999

Outline
- Introduction
- A simple Bayesian Network
- Graph Theory
- The Junction Tree Algorithm
- hbBN - a simple Bayesian Network tool
- The Twin-Model Example
- Summary

Introduction
"The Year 2000 problem"

History
- 1960s: The first expert systems, built on IF-THEN rules.
- 1968: Attempts to use probabilities in expert systems (Gorry & Barnett).
- 1973: Given up - the calculations were too heavy! (Gorry)
- 1976: MYCIN, a medical predicate-logic expert system with certainty factors (Shortliffe).
- 1976: PROSPECTOR, which predicts the likely location of mineral deposits using Bayes' rule (Duda et al.).

Summary of the time up until the mid 80s: "Pure logic will solve the AI problems!" and "Probability theory is intractable to use and too complicated for complex models."

Example (PROSPECTOR):
P(d | a) = P(d | a, b) P(b | a) + P(d | a, ¬b) P(¬b | a)

Certainty factor (MYCIN): a real value in (-1, +1), where -1 means the expression is known to be false, 0 means no belief either way, and +1 means the expression is known to be true.
Example:
rule 34: a ∧ b ∧ c → q (+0.7)
rule 35: d ∧ q ∧ r → s (-0.9)

But...
- 1986: Bayesian networks were revived and reintroduced to expert systems (Pearl).
- 1988: Breakthrough for efficient calculation algorithms (Lauritzen & Spiegelhalter) - tractable calculations on Bayesian networks.
- 1995: Used in Windows 95 for printer troubleshooting and for the Office assistant ("the paper clip").
- 1999: Bayesian networks are used more and more, e.g. in gene expression analysis and business strategy.
- 2000: Widely used - a BN tool will be shipped with every Windows Commercial Server.
Bayesian networks are the future!

What is a Bayesian Network?
A Bayesian network has two parts:
1) A qualitative part - the structure: a directed acyclic graph (DAG) in which vertices represent variables and edges represent relations between variables.
2) A quantitative part - the strength of the relations between variables, given as conditional probability functions.
Example (DAG with edges a → b, a → d, b → c, d → c): the network is specified by P(X_a), P(X_b | X_a), P(X_d | X_a) and P(X_c | X_b, X_d), and the joint distribution is their product,
P(X_a, X_b, X_c, X_d) = P(X_a) P(X_b | X_a) P(X_d | X_a) P(X_c | X_b, X_d).
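A minimal sketch of what these two parts look like in code (my own illustration, not from the slides): the structure is the set of parent relations, and the quantitative part is one conditional probability table per variable. All numbers below are invented; only the DAG follows the slide, and every variable is assumed binary.

    from itertools import product

    # Hypothetical CPTs for the DAG a -> b, a -> d, (b, d) -> c; True/False states.
    P_a = {True: 0.6, False: 0.4}
    P_b_given_a = {True: {True: 0.7, False: 0.3},    # P(X_b | X_a = True)
                   False: {True: 0.2, False: 0.8}}   # P(X_b | X_a = False)
    P_d_given_a = {True: {True: 0.5, False: 0.5},
                   False: {True: 0.1, False: 0.9}}
    P_c_given_bd = {(True, True):   {True: 0.9,  False: 0.1},
                    (True, False):  {True: 0.6,  False: 0.4},
                    (False, True):  {True: 0.4,  False: 0.6},
                    (False, False): {True: 0.05, False: 0.95}}

    def joint(a, b, c, d):
        """Joint probability from the chain-rule factorization over the DAG."""
        return (P_a[a] * P_b_given_a[a][b] * P_d_given_a[a][d]
                * P_c_given_bd[(b, d)][c])

    # Sanity check: the factorized joint distribution sums to one.
    total = sum(joint(a, b, c, d) for a, b, c, d in product([True, False], repeat=4))
    assert abs(total - 1.0) < 1e-12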

Why Bayesian Networks?
- They model probabilities.
- They give posterior beliefs given some observations.
- They can be used as classifiers.
- They can explain why.
- They can find the variables with the most impact.
- Algorithms exist.
- They can model expert (subjective) knowledge.
- They can be learned automatically from raw data.
- Simple - my grandmother can use it!

A simple Bayesian Network: "Icy Roads"
The variable Icy (are the roads icy?) is a parent of both Holmes and Watson (has each of them had an accident?).

P(X_Icy): yes 0.7, no 0.3

P(X_Holmes | X_Icy):
                   X_Holmes = yes   X_Holmes = no
    X_Icy = yes         0.8              0.2
    X_Icy = no          0.1              0.9

P(X_Watson | X_Icy):
                   X_Watson = yes   X_Watson = no
    X_Icy = yes         0.8              0.2
    X_Icy = no          0.1              0.9

"Watson has had an accident!" - enter P(X_Watson = yes) = 1.
By Bayes' rule, P(X_Icy | X_Watson = yes) = (0.95, 0.05), compared with the a priori (0.70, 0.30).
By the joint probability plus marginalization, P(X_Holmes | X_Watson = yes) = (0.76, 0.24), compared with the a priori (0.59, 0.41).

"No, the roads are not icy." - enter P(X_Icy = no) = 1.
When X_Icy is instantiated, X_Holmes becomes independent of X_Watson: X_Holmes ⊥ X_Watson | X_Icy.
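The two posteriors quoted above can be checked by brute force, simply enumerating the joint distribution defined by the three tables and normalizing. This is my own sketch, not the algorithm the talk builds up to, but it uses exactly the "Icy Roads" numbers:

    from itertools import product

    # CPTs from the "Icy Roads" slide; every variable has states "yes"/"no".
    P_icy = {"yes": 0.7, "no": 0.3}
    P_holmes = {"yes": {"yes": 0.8, "no": 0.2}, "no": {"yes": 0.1, "no": 0.9}}  # P(H | Icy)
    P_watson = {"yes": {"yes": 0.8, "no": 0.2}, "no": {"yes": 0.1, "no": 0.9}}  # P(W | Icy)

    def joint(icy, holmes, watson):
        return P_icy[icy] * P_holmes[icy][holmes] * P_watson[icy][watson]

    def posterior(target, evidence):
        """P(target | evidence) by summing the joint and normalizing."""
        dist = {}
        for icy, holmes, watson in product(["yes", "no"], repeat=3):
            x = {"Icy": icy, "Holmes": holmes, "Watson": watson}
            if all(x[var] == val for var, val in evidence.items()):
                dist[x[target]] = dist.get(x[target], 0.0) + joint(icy, holmes, watson)
        z = sum(dist.values())
        return {state: p / z for state, p in dist.items()}

    print(posterior("Icy", {"Watson": "yes"}))     # ~ {'yes': 0.95, 'no': 0.05}
    print(posterior("Holmes", {"Watson": "yes"}))  # ~ {'yes': 0.76, 'no': 0.24}
    print(posterior("Holmes", {}))                 # a priori ~ {'yes': 0.59, 'no': 0.41}

Enumeration works for three variables, but the number of terms grows exponentially with the size of the network, which is why the rest of the talk is about a smarter propagation scheme.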

A naive message-passing scheme on the original graph runs into trouble: when the variables a, b, c, d pass updates directly to each other around a loop, conflicting messages arrive and the updating never settles. This naive approach of updating the network suffers from oscillation problems!

Idea behind the Junction Tree Algorithm
(Figure: a "clever algorithm" transforms the Bayesian network over a, b, c, d into a secondary tree structure on which propagation is safe.)

Secondary Structure / Junction Tree
- Bayesian network: one-dimensional random variables, conditional probabilities.
- Junction tree (the secondary structure): multi-dimensional random variables (clusters of variables), joint probabilities (potentials).
(Figure: the eight-variable network a, ..., h and its junction tree with cliques abd, ade, ace, ceg, egh, def connected through the sepsets ad, ae, ce, de, eg.)

Graph Theory
A graph G = (V, E) consists of a set of vertices V and a set of edges E.
Ex. V = {a, b, c, d}, E = {ab, ad, bc, dc}.
For a directed edge e = uv, u is a parent of v and v is a child of u.
pa(u) is the parent set of u and ch(u) is the child set of u.
Ex. pa(b) = pa(d) = {a}, ch(a) = {b, d}, ch(b) = ch(d) = {c}.
The family of u is fa(u) = {u} ∪ pa(u).
Ex. fa(a) = {a}, fa(b) = {a, b}, fa(c) = {b, c, d}, fa(d) = {a, d}.
The neighbours of u are ne(u) = {u} ∪ pa(u) ∪ ch(u).
Ex. ne(a) = {a, b, d}, ne(b) = {a, b, c}, ne(c) = {b, c, d}, ne(d) = {a, c, d}.
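These set functions translate directly into a few lines of code. A small sketch (mine, not the lecturer's) for the example graph above, with the neighbour set defined exactly as on the slide, i.e. including u itself:

    # Example graph from the slide: V = {a, b, c, d}, directed edges a->b, a->d, b->c, d->c.
    V = {"a", "b", "c", "d"}
    E = [("a", "b"), ("a", "d"), ("b", "c"), ("d", "c")]

    def pa(u):                      # parent set of u
        return {x for (x, y) in E if y == u}

    def ch(u):                      # child set of u
        return {y for (x, y) in E if x == u}

    def fa(u):                      # family of u: u together with its parents
        return {u} | pa(u)

    def ne(u):                      # neighbours of u as defined above: u, its parents and its children
        return {u} | pa(u) | ch(u)

    print(pa("c"), fa("c"), ne("a"))   # {'b', 'd'} {'b', 'c', 'd'} {'a', 'b', 'd'}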

A complete graph is a graph in which all vertices are pairwise connected.
Ex. the triangle on {a, b, c} is complete; the square on {a, b, c, d} without diagonals is not.
A clique is a maximal complete subgraph.
Ex. the graph on {a, b, c, d} in the figure contains two cliques, C1 = {a, b, d} and C2 = {b, c, d}.

Junction Tree Algorithm
1. Transformation (of the structure - the qualitative part):
   a) Moralize.
   b) Triangulate.
   c) Build the junction tree.
2. Initialization (of the values - the quantitative part):
   a) Set up potentials.
   b) Propagate potentials.
3. Update beliefs:
   a) Insert evidence into the junction tree.
   b) Propagate potentials.
Steps 1 and 2 are performed only once.

Step 1a: Moralization
1. For all w ∈ V: for all u, v ∈ pa(w), add an edge e = u-v (the parents are "married").
2. Undirect all edges.
This turns G = (V, E) into the moral graph G^M; a sketch of the procedure follows below.
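A direct transcription of those two steps (a sketch under my own representation of the graph as a vertex set and a list of directed edges, not the hbBN implementation):

    from itertools import combinations

    def moralize(V, E):
        """Return the undirected edge set of the moral graph G^M.
        V: set of vertices; E: list of directed edges (u, v) meaning u -> v."""
        undirected = {frozenset(e) for e in E}            # step 2: drop the directions
        for w in V:                                       # step 1: marry the parents of every vertex
            parents = {u for (u, v) in E if v == w}
            for u, v in combinations(sorted(parents), 2):
                undirected.add(frozenset((u, v)))
        return undirected

    V = {"a", "b", "c", "d"}
    E = [("a", "b"), ("a", "d"), ("b", "c"), ("d", "c")]
    print(sorted(tuple(sorted(e)) for e in moralize(V, E)))
    # [('a', 'b'), ('a', 'd'), ('b', 'c'), ('b', 'd'), ('c', 'd')] - b-d is the added "marriage" edge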

Step 1b: Triangulation
Add edges to G^M such that there is no cycle of length ≥ 4 that does not contain a chord. The result is the triangulated graph G^T.
(Figure: a chordless four-cycle is not allowed; adding a chord makes the graph triangulated.)

Elimination of a vertex:
1. Connect all of its neighbours pairwise.
2. Remove the vertex and the edges connected to it.

Triangulation can be carried out by eliminating the vertices of G^M one by one; the edges added along the way are exactly the fill-in edges of G^T. Heuristic: eliminate the vertex that requires the least number of edges to be added. For the eight-variable example:

    Step   Vertex removed   Induced clique   Added edges
     1     h                {e, g, h}        -
     2     g                {c, e, g}        -
     3     f                {d, e, f}        -
     4     c                {a, c, e}        a-e
     5     b                {a, b, d}        a-d
     6     d                {a, d, e}        -
     7     e                {a, e}           -
     8     a                {a}              -
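The elimination procedure with the "fewest added edges" heuristic can be sketched as follows (a simplified illustration of the idea, not the course's implementation); it records the induced cliques and the fill-in edges, like the table above:

    from itertools import combinations

    def triangulate(V, undirected_edges):
        """Greedy triangulation by vertex elimination (min-fill heuristic).
        Returns the induced cliques and the set of fill-in edges."""
        adj = {v: set() for v in V}
        for u, v in undirected_edges:
            adj[u].add(v)
            adj[v].add(u)

        def fill_in(v):
            """Edges that eliminating v right now would add between its neighbours."""
            return [(x, y) for x, y in combinations(sorted(adj[v]), 2) if y not in adj[x]]

        remaining, cliques, added = set(V), [], set()
        while remaining:
            v = min(remaining, key=lambda u: len(fill_in(u)))   # cheapest vertex first
            cliques.append({v} | adj[v])                        # the induced clique
            for x, y in fill_in(v):                             # connect its neighbours pairwise
                adj[x].add(y); adj[y].add(x); added.add((x, y))
            for n in adj[v]:                                    # remove v and its edges
                adj[n].discard(v)
            remaining.remove(v)
            adj[v] = set()
        return cliques, added

On the moralized a-b-c-d graph from the previous sketch this adds no edges, since that graph is already triangulated; a procedure of this kind is what produces the fill-in edges a-e and a-d listed in the table for the eight-variable example.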

(Figure: the Bayesian network G = (V, E) on the vertices a, ..., h, its moral graph G^M and its triangulated graph G^T, from which the cliques abd, ace, def, ade, egh and ceg are read off.)

Step 1c: Building the junction tree
In the junction tree the cliques become the vertices; two neighbouring cliques are joined through a separator set (sepset) that holds their intersection.
Cliques: abd, ade, ace, ceg, egh, def. Sepsets: ad, ae, ce, de, eg.
Ex. {c, e, g} ∩ {e, g, h} = {e, g}.
The result is the junction tree G^JT.

Junction Tree Property
All clique vertices C and all sepsets S along the path between any two clique vertices A and B contain the intersection A ∩ B.
Ex. A = {a, b, d} and B = {a, c, e} give A ∩ B = {a}; on the path between them, the clique C = {a, d, e} and the sepsets S1 = {a, d} and S2 = {a, e} all contain a.
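The slides do not spell out how the tree itself is chosen, so the following is a generic sketch of one standard construction: connect the cliques by a maximum-weight spanning tree, where the weight of a candidate edge is the size of the intersection of the two cliques, and that intersection becomes the sepset.

    from itertools import combinations

    def build_junction_tree(cliques):
        """Kruskal-style maximum-weight spanning tree over the cliques."""
        cliques = [frozenset(c) for c in cliques]
        candidates = sorted(((a, b, a & b) for a, b in combinations(cliques, 2)),
                            key=lambda edge: len(edge[2]), reverse=True)
        parent = {c: c for c in cliques}                  # union-find to avoid cycles
        def find(c):
            while parent[c] != c:
                c = parent[c]
            return c
        tree = []
        for a, b, sepset in candidates:
            ra, rb = find(a), find(b)
            if ra != rb and sepset:                       # join two components via a non-empty sepset
                parent[ra] = rb
                tree.append((set(a), set(sepset), set(b)))
        return tree

    cliques = [{"a","b","d"}, {"a","d","e"}, {"a","c","e"},
               {"c","e","g"}, {"e","g","h"}, {"d","e","f"}]
    for a, s, b in build_junction_tree(cliques):
        print(sorted(a), "--", sorted(s), "--", sorted(b))

For the six cliques of the example this yields exactly the sepsets ad, ae, ce, de and eg from the slide.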

Junction Tree Algorithm (recap): the transformation of the structure is done; next comes step 2, initialization of the values - a) set up potentials, b) propagate potentials.

Potentials
DEFINITION: A potential φ_A over a set of variables X_A is a function that maps each instantiation x_A to a non-negative real number. The number that φ_A assigns to x_A is written φ_A(x_A).
Ex. a potential φ_abc over the set of vertices {a, b, c}, where X_a has four states and X_b and X_c have three states each, is a 4 × 3 × 3 table of non-negative numbers.
A joint probability is the special case of a potential in which Σ_{x_A} φ_A(x_A) = 1.

Decomposable Distribution
DEFINITION: A probability distribution is said to be decomposable with respect to a graph G = (V, E) if G is triangulated and, for any clusters A and B with separator C, it holds that X_A ⊥ X_B | X_C.
Steps 1 & 2 of the junction tree algorithm guarantee this property!

Factorization of potentials
THEOREM: A decomposable probability distribution P(X_V) on the graph G = (V, E) can be written as the product of the potentials of all cliques divided by the product of the potentials of all sepsets:

    P(X_V) = Π_{cliques C} φ_C / Π_{sepsets S} φ_S
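In code, a potential over a handful of discrete variables is just a table indexed by instantiations. Below is a minimal sketch of such a table (my own illustration, not hbBN code) with the two operations the rest of the algorithm needs: marginalization and pointwise multiplication or division.

    from itertools import product

    class Potential:
        """A table phi over a tuple of named discrete variables."""
        def __init__(self, variables, states, table=None):
            self.variables = tuple(variables)             # e.g. ("a", "b")
            self.states = dict(states)                    # e.g. {"a": ["yes", "no"], ...}
            keys = product(*(self.states[v] for v in self.variables))
            self.table = dict(table) if table is not None else {k: 1.0 for k in keys}

        def __call__(self, assignment):
            """phi(x_A): look up the value of a full assignment {variable: state}."""
            return self.table[tuple(assignment[v] for v in self.variables)]

        def marginalize(self, keep):
            """Sum out every variable that is not in `keep`."""
            kept = [v for v in self.variables if v in keep]
            summed = {}
            for key, value in self.table.items():
                full = dict(zip(self.variables, key))
                sub = tuple(full[v] for v in kept)
                summed[sub] = summed.get(sub, 0.0) + value
            return Potential(kept, self.states, table=summed)

        def combine(self, other, divide=False):
            """In place: phi := phi * other (or phi / other, with 0/0 treated as 0).
            `other` must range over a subset of this potential's variables."""
            for key in self.table:
                full = dict(zip(self.variables, key))
                val = other(full)
                if divide:
                    self.table[key] = 0.0 if val == 0.0 else self.table[key] / val
                else:
                    self.table[key] *= val

A clique or sepset potential initialized to 1, as in step 2a below, is simply Potential(variables, states) with its default table.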

Step 2a: Setting up potentials
1. For each cluster (clique) C and each sepset S, set φ_C ← 1 and φ_S ← 1.
2. For each vertex u in the Bayesian network, select one parent cluster C such that C ⊇ fa(u), and multiply the conditional probability P(X_u | X_pa(u)) into it: φ_C ← φ_C · P(X_u | X_pa(u)).
"PROOF": every conditional probability is multiplied into exactly one clique and every sepset potential is still 1, so

    Π_{cliques C} φ_C / Π_{sepsets S} φ_S = Π_{vertices u} P(X_u | X_pa(u)) / 1 = P(X_V),

i.e. the clique and sepset potentials already represent P(X_V) in the factorized form of the theorem above.

After this initialization, however, the potentials in the junction tree are not consistent with each other: if we use marginalization to get the probability distribution of a variable X_u, we get different results depending on which clique we use.
Ex. P(X_a) = Σ_{d,e} φ_ade = (0.02, 0.43, 0.31, 0.12) but P(X_a) = Σ_{c,e} φ_ace = (0.12, 0.33, 0.11, 0.03).
The potentials might not even sum to one, i.e. they are not joint probability distributions.

Step 2b: Propagating potentials
Message passing from clique A to clique B over the sepset S_AB:
1. Projection - project the potential of A onto S_AB: φ_S_AB^new = Σ_{A \ S_AB} φ_A.
2. Absorption - absorb the updated sepset into B: φ_B ← φ_B · φ_S_AB^new / φ_S_AB^old.

Global Propagation
1. COLLECT-EVIDENCE: messages 1-5 are sent inwards towards a chosen root clique ("start here").
2. DISTRIBUTE-EVIDENCE: messages 6-10 are sent back out from the root.
Both methods are recursive.
(Figure: the junction tree with the ten messages numbered in the order they are sent.)
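As a concrete, self-contained illustration (my own sketch, not taken from the slides), the "Icy Roads" network from earlier has a two-clique junction tree, {Icy, Holmes} - {Icy} - {Icy, Watson}, and a single collect message followed by a single distribute message already makes it consistent:

    from itertools import product

    states = ["yes", "no"]
    P_icy = {"yes": 0.7, "no": 0.3}
    P_h = {"yes": {"yes": 0.8, "no": 0.2}, "no": {"yes": 0.1, "no": 0.9}}   # P(Holmes | Icy)
    P_w = {"yes": {"yes": 0.8, "no": 0.2}, "no": {"yes": 0.1, "no": 0.9}}   # P(Watson | Icy)

    # Step 2a: clique {Icy,Holmes} gets P(Icy)*P(Holmes|Icy), clique {Icy,Watson} gets
    # P(Watson|Icy), and the sepset {Icy} starts at 1.
    phi_IH = {(i, h): P_icy[i] * P_h[i][h] for i, h in product(states, states)}
    phi_IW = {(i, w): P_w[i][w] for i, w in product(states, states)}
    phi_S = {i: 1.0 for i in states}

    def project(phi):                       # marginalize a clique table onto the sepset {Icy}
        out = {i: 0.0 for i in states}
        for (i, _), v in phi.items():
            out[i] += v
        return out

    def absorb(phi, new_sep, old_sep):      # phi := phi * new_sep / old_sep
        return {(i, o): v * new_sep[i] / old_sep[i] for (i, o), v in phi.items()}

    # COLLECT-EVIDENCE: message from {Icy,Watson} to {Icy,Holmes} ...
    new_S = project(phi_IW); phi_IH = absorb(phi_IH, new_S, phi_S); phi_S = new_S
    # ... DISTRIBUTE-EVIDENCE: message back from {Icy,Holmes} to {Icy,Watson}.
    new_S = project(phi_IH); phi_IW = absorb(phi_IW, new_S, phi_S); phi_S = new_S

    # The cliques now agree on the shared variable: both give P(Icy) = (0.7, 0.3),
    # and marginalizing the Watson clique gives P(Watson) = (0.59, 0.41).
    print(project(phi_IH), project(phi_IW))
    print({w: sum(phi_IW[(i, w)] for i in states) for w in states})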

A priori distribution
After a global propagation the potentials are consistent, and marginalization now gives the (a priori) probability distributions of the individual variables.

Summary of steps 1 and 2
Moralize and triangulate the DAG, read off the cliques and build the junction tree; then set every clique and sepset potential to 1, multiply each conditional probability P(X_u | X_pa(u)) into one clique containing fa(u), and make everything consistent with one global propagation.

Junction Tree Algorithm (recap): what remains is step 3, updating beliefs - a) insert evidence into the junction tree, b) propagate potentials.

Evidence
Evidence is new information about a random variable that changes our belief about its distribution.
Ex. before receiving any evidence, P(X_u) = (0.14, 0.43, 0.31, 0.12).
Hard evidence: the random variable is instantiated (observed). If X_u is observed in its third state, P(X_u) := (0, 0, 1, 0).
Soft evidence: everything else.
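Inserting hard evidence into the junction tree amounts to zeroing out, in one clique containing the variable, every entry that disagrees with the observation, and then propagating again. Continuing the Icy Roads illustration (again my own sketch), entering "Watson has had an accident" reproduces the posteriors quoted earlier:

    from itertools import product

    states = ["yes", "no"]
    P_icy = {"yes": 0.7, "no": 0.3}
    P_h = {"yes": {"yes": 0.8, "no": 0.2}, "no": {"yes": 0.1, "no": 0.9}}
    P_w = {"yes": {"yes": 0.8, "no": 0.2}, "no": {"yes": 0.1, "no": 0.9}}

    # Step 2a potentials for the cliques {Icy,Holmes}, {Icy,Watson} and the sepset {Icy}.
    phi_IH = {(i, h): P_icy[i] * P_h[i][h] for i, h in product(states, states)}
    phi_IW = {(i, w): P_w[i][w] for i, w in product(states, states)}
    phi_S = {i: 1.0 for i in states}

    # Step 3a: hard evidence "Watson = yes" - zero out every Watson = no entry.
    phi_IW = {(i, w): (v if w == "yes" else 0.0) for (i, w), v in phi_IW.items()}

    def project(phi):                       # marginalize a clique table onto {Icy}
        out = {i: 0.0 for i in states}
        for (i, _), v in phi.items():
            out[i] += v
        return out

    def absorb(phi, new_sep, old_sep):      # phi := phi * new_sep / old_sep
        return {(i, o): v * new_sep[i] / old_sep[i] for (i, o), v in phi.items()}

    # Step 3b: propagate - collect into {Icy,Holmes}, then distribute back.
    new_S = project(phi_IW); phi_IH = absorb(phi_IH, new_S, phi_S); phi_S = new_S
    new_S = project(phi_IH); phi_IW = absorb(phi_IW, new_S, phi_S); phi_S = new_S

    def marginal(phi, index):               # normalized marginal of one clique variable
        out = {}
        for key, v in phi.items():
            out[key[index]] = out.get(key[index], 0.0) + v
        z = sum(out.values())
        return {k: v / z for k, v in out.items()}

    print(marginal(phi_IH, 0))   # P(Icy    | Watson = yes) ~ (0.95, 0.05)
    print(marginal(phi_IH, 1))   # P(Holmes | Watson = yes) ~ (0.76, 0.24)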

hbBN - a simple Bayesian Network tool
- Run from the command line: > java hb.math.bn.HBBN IcyRoads.xbs
- Can be called from Matlab or any other environment.
- Generates figures to be included in LaTeX.
- Only discrete variables.

Example of the hbBN script language (.xbs): "Icy Roads"
The script loads the IcyRoads network, builds its join tree (IcyRoadsJoinTree) and then enters evidence step by step. Its comments explain the run:
* "Icy Roads. This example is taken from An Introduction to Bayesian Networks, Finn V. Jensen, 1996."
* "The information that Watson has crashed updates the probability that the roads are icy and that Holmes also has crashed." (the Watson node, IcyRoadsW, is set to yes)
* "At last, when Inspector is convinced that the roads are not icy, then P(H | I=n) = (0.1, 0.9)."

Icy Roads model in the XML Belief Network (XBN) file format
The model file declares three discrete variables, each with the states yes and no - Watson, Holmes and Icy - together with the probability tables P(Icy) = (0.7, 0.3), P(Holmes | Icy) = ((0.8, 0.2), (0.1, 0.9)) and P(Watson | Icy) = ((0.8, 0.2), (0.1, 0.9)).

The Twin-Model Example
"The execution of a prisoner - or not." The network connects CourtOrder → Captain, Captain → RiflemanA and Captain → RiflemanB, and then RiflemanA, RiflemanB and the prisoner ...
