CSE443 Reasoning Under Uncertainty

Exercise Sheet 3: Inference in Bayesian Networks

www.csse.monash.edu.au/~annn/443/Ex3.html

  1. Message Passing

    Consider (again - see Exercise sheet 2) the belief network for another medical diagnosis example, where B=Bronchitis, S=Smoker, C=Cough, X=Positive X-ray and L=Lung cancer. Suppose that the prior for a patient being a smoker is 0.25, and the prior for the patient having bronchitis (during winter in Melbourne!) is 0.05.
     

    [Figure: Health BN]

    Suppose that evidence is obtained that a patient has a positive X-ray result, and that a polytree message passing algorithm is to be used to perform belief updating.

    (a) Write down the pi and lambda values for the following nodes: smoker, bronchitis, cough and positiveXray.

    (b) Show the three stages of data propagation by the message passing algorithm in terms of the pi and lambda messages. (A small numerical sketch of pi/lambda updating follows this question.)

    (c) Suppose that a doctor considered that smoking was a contributing factor towards getting bronchitis as well as cancer. The new network structure reflecting this model is as follows.
     

    [Figure: Health BN 2]

    Why is the polytree message passing algorithm unsuitable for performing belief updating on this network?
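
    The following is a minimal Python sketch (not part of the original exercise) of the pi/lambda bookkeeping on a two-node chain S -> X with evidence entered at X. Only the smoker prior P(S=T) = 0.25 comes from the sheet; the conditional table P(X|S) is invented purely for illustration, and the exercise's full five-node network is not reproduced here.

        # Pearl's pi/lambda bookkeeping on a two-node chain S -> X.
        # Only P(S=T) = 0.25 comes from the exercise sheet; P(X|S) is made up.

        prior_S = {True: 0.25, False: 0.75}        # pi(S): the prior, since S has no parents
        cpt_X_given_S = {True: 0.60, False: 0.02}  # hypothetical P(X=T | S)

        # Evidence: X is observed to be True, so lambda(X) is an indicator vector.
        lambda_X = {True: 1.0, False: 0.0}

        # Lambda message sent from X up to its parent S:
        #   lambda_{X->S}(s) = sum_x P(x | s) * lambda(x)
        lambda_msg_to_S = {
            s: sum(
                (cpt_X_given_S[s] if x else 1.0 - cpt_X_given_S[s]) * lambda_X[x]
                for x in (True, False)
            )
            for s in (True, False)
        }

        # Belief at S combines its pi value (here just the prior) with the
        # incoming lambda message: Bel(s) is proportional to pi(s) * lambda(s).
        unnorm = {s: prior_S[s] * lambda_msg_to_S[s] for s in (True, False)}
        z = sum(unnorm.values())
        bel_S = {s: v / z for s, v in unnorm.items()}

        print("lambda message X -> S:", lambda_msg_to_S)
        print("Bel(S | X=T):", bel_S)

    The same pattern carries over to the full polytree: each node multiplies the pi contributions arriving from its parents by the lambda contributions arriving from its children, and normalising that product gives its updated belief.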

  2. Clustering and Conditioning

    Consider the BN for the "Asia" problem described in lectures (Lauritzen & Spiegelhalter, 1988).

    [Figure: Asia BN]

    (a) Show how the Jensen join-tree algorithm would construct a join-tree from this network. Draw the resultant join-tree. (A sketch of the moralisation step follows this question.)

    (b) Can you perform clustering that results in a tree with fewer clustered nodes? Draw the resultant cluster tree.

    (c) Identify a cutset of nodes for this BN that splits the network into separate polytrees. Draw the resultant polytrees.
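
    As a starting point for part (a), here is a small Python sketch (an illustration, not part of the original sheet) of the first step of the Jensen join-tree construction: moralising the Asia network. The node names are informal labels chosen here; triangulating the moral graph and identifying its cliques would follow.

        from itertools import combinations

        # Parent lists for the "Asia" network (Lauritzen & Spiegelhalter, 1988).
        # The node names are informal labels used only for this sketch.
        parents = {
            "Asia": [],
            "Smoking": [],
            "TB": ["Asia"],
            "LungCancer": ["Smoking"],
            "Bronchitis": ["Smoking"],
            "TBorCancer": ["TB", "LungCancer"],
            "XRay": ["TBorCancer"],
            "Dyspnoea": ["TBorCancer", "Bronchitis"],
        }

        def moralise(parents):
            """Moral graph: marry all parents of each node, then drop edge directions."""
            edges = set()
            for child, ps in parents.items():
                for p in ps:                          # undirected parent-child edges
                    edges.add(frozenset((p, child)))
                for p1, p2 in combinations(ps, 2):    # "marry" co-parents of the same child
                    edges.add(frozenset((p1, p2)))
            return edges

        for edge in sorted(tuple(sorted(e)) for e in moralise(parents)):
            print(edge)

    Running this adds the marriage edges TB-LungCancer and TBorCancer-Bronchitis and drops all arc directions; the resulting moral graph is then triangulated and its cliques organised into the join-tree asked for in part (a).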

  3. Approximate Inference

    (a) Why is Logic Sampling a very inefficient approximate inference algorithm? How does likelihood weighting overcome this problem?
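
    To make the contrast concrete, here is a small Python sketch (an illustration, not part of the sheet) on a made-up two-node network A -> E in which the evidence E=T is deliberately unlikely. Logic sampling throws away every sample that disagrees with the evidence, while likelihood weighting clamps the evidence and weights each sample by its likelihood.

        import random

        random.seed(0)

        # Made-up two-node network A -> E; the evidence E=T is fairly rare.
        P_A = 0.3                                  # P(A = True)
        P_E_given_A = {True: 0.10, False: 0.01}    # P(E = True | A)
        N = 100_000

        # Logic (rejection) sampling: sample every node, then discard any
        # sample that is inconsistent with the evidence E = True.
        kept, a_true = 0, 0
        for _ in range(N):
            a = random.random() < P_A
            e = random.random() < P_E_given_A[a]
            if e:
                kept += 1
                a_true += a
        print(f"logic sampling: kept {kept}/{N} samples, "
              f"P(A=T | E=T) ~ {a_true / kept:.3f}")

        # Likelihood weighting: clamp E = True and weight each sample by the
        # likelihood of that evidence, P(E=T | a); no samples are discarded.
        w_sum = w_a_true = 0.0
        for _ in range(N):
            a = random.random() < P_A
            w = P_E_given_A[a]
            w_sum += w
            w_a_true += w * a
        print(f"likelihood weighting: all {N} samples used, "
              f"P(A=T | E=T) ~ {w_a_true / w_sum:.3f}")

    With these numbers P(E=T) is about 0.037, so only roughly 4% of the logic-sampling runs survive rejection, whereas every likelihood-weighted sample contributes to the estimate.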

    (b) The Kullback-Leibler distance (see Lecture 3, slide 28) can be used to measure the error in the beliefs obtained using an approximate inference algorithm.

    Consider the network structure for the Lecturer's Life example.
     

    [Figure: Lecturer's Life BN]

    Suppose that you observe that the Lecturer is logged on, and you run both an exact and an approximate inference algorithm to update the network's beliefs.
     
     
                          Exact Inference (%)    Approximate Inference (%)
    Bel(in-office=T)             92.3                      90.8
    Bel(in-office=F)              7.7                       9.2
    Bel(lights-on=T)             46.5                      47.4
    Bel(lights-on=F)             53.5                      52.6

     
    What is the total KL error for the network? What is the average KL error per node?
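
    One possible way to set the calculation up (an illustrative sketch, not the official solution) is below. It treats the table entries as probabilities and uses the common form KL(P || Q) = sum_x P(x) ln(P(x)/Q(x)), with the exact beliefs as P and the approximate beliefs as Q; check the log base and the direction of the comparison against the definition on the lecture slide, which may differ.

        import math

        # Beliefs from the table above, converted from percentages to probabilities.
        exact = {
            "in-office": {"T": 0.923, "F": 0.077},
            "lights-on": {"T": 0.465, "F": 0.535},
        }
        approx = {
            "in-office": {"T": 0.908, "F": 0.092},
            "lights-on": {"T": 0.474, "F": 0.526},
        }

        def kl(p, q):
            """KL distance KL(p || q) = sum over states of p * ln(p / q)."""
            return sum(p[x] * math.log(p[x] / q[x]) for x in p)

        per_node = {node: kl(exact[node], approx[node]) for node in exact}
        total = sum(per_node.values())

        for node, d in per_node.items():
            print(f"KL({node}) = {d:.6f}")
        print(f"total KL error    = {total:.6f}")
        print(f"average per node  = {total / len(per_node):.6f}")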