Exam SS19: Question 1.2/Soundness and Completeness of d-separation/Bonus Challenge


The exam from the summer semester 2019 contains the following question:

The correct answer for the first question is A and B. In one of the tutorials, some of you asked for a proof that A and B are indeed not conditionally independent.

The short answer is:

Strictly speaking, they are not guaranteed to be dependent. Every independence you read off a Bayesian network (BN) is guaranteed to hold, but there may be additional independencies not displayed by the BN.

The long answer is:

Bayesian networks (BNs) represent probability distributions as graphs. As such, they are a formal language for talking about probability distributions; they are not probability distributions themselves. As you know from AI I, two important properties of formal languages with respect to their models (i.e. what they talk about) are soundness and completeness.

BNs talk about independencies in joint probability distributions in terms of the graph property of d-separation. We have only talked about d-separation implicitly in the lecture, but you have been using it all along: the intuition is easy, the definition is complicated.

First let’s define three helpful notions:

A v-structure in a BN is any combination of a node (call it the “v-node”) with two incoming edges.
A trail is any path in the underlying undirected (no arrow heads) version of the BN.
Let E be the (possibly empty) set of observed nodes in a BN. A trail is active given E if (1) for every v-structure (A → C ← B) on the trail, C or a descendant of C is in E, and (2) all other nodes on the trail are not in E.

Then A and B are d-separated given E if there is no trail between them that is active given E.
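
To make the definition concrete, here is a minimal sketch in Python of a d-separation check on a small assumed toy DAG (A → C ← B, C → D; this graph is an illustration, not necessarily the network from the exam sheet). It enumerates all trails between two nodes and tests whether any of them is active given the observed set E.

```python
from itertools import chain

# Assumed toy DAG, given as node -> set of parents (an illustration,
# not necessarily the network from the exam sheet): A -> C <- B, C -> D.
PARENTS = {"A": set(), "B": set(), "C": {"A", "B"}, "D": {"C"}}

def children(g, x):
    return {v for v, ps in g.items() if x in ps}

def descendants(g, x):
    """All nodes reachable from x along directed edges (excluding x)."""
    seen, frontier = set(), {x}
    while frontier:
        frontier = set(chain.from_iterable(children(g, y) for y in frontier)) - seen
        seen |= frontier
    return seen

def trails(g, src, dst, path=None):
    """All simple paths between src and dst in the undirected skeleton."""
    path = [src] if path is None else path
    if src == dst:
        yield path
        return
    for n in (g[src] | children(g, src)) - set(path):
        yield from trails(g, n, dst, path + [n])

def active(g, trail, E):
    """A trail is active given E iff every collider on it is in E or has a
    descendant in E, and every other node on it is outside E."""
    for prev, node, nxt in zip(trail, trail[1:], trail[2:]):
        collider = prev in g[node] and nxt in g[node]  # prev -> node <- nxt
        if collider:
            if node not in E and not (descendants(g, node) & E):
                return False
        elif node in E:
            return False
    return True

def d_separated(g, x, y, E):
    return not any(active(g, t, E) for t in trails(g, x, y))

print(d_separated(PARENTS, "A", "B", set()))   # True: the collider C blocks the only trail
print(d_separated(PARENTS, "A", "B", {"C"}))   # False: observing C activates A -> C <- B
print(d_separated(PARENTS, "A", "B", {"D"}))   # False: D is a descendant of the collider C
```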

We have the following relationship between d-separation and independence:

Soundness: If A and B are d-separated given E, then A and B are independent given E.
“Almost” Completeness: If A and B are independent given E, then they are “almost always” d-separated given E.

“Almost” here means “for all probability distributions that can be represented by the BN in question, except for a set of measure zero”. (Note that a BN without concrete values corresponds to many probability distributions over its nodes/variables.)

So BNs are not strictly a complete language for talking about independencies in joint probability distributions. It may be the case that independencies exist that are not represented in the BN. Therefore, in general, you cannot prove that two variables are dependent based only on the BN representation.
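
As a concrete (hand-crafted, measure-zero) illustration, here is a small numeric sketch assuming a two-node BN A → B (again an illustration, not the exam network): the graph does not d-separate A and B, yet the parameters are chosen so that A and B are nevertheless independent.

```python
import itertools

# Assumed two-node BN A -> B (an illustration, not the exam network).
# The graph does NOT d-separate A and B, yet these hand-picked parameters
# make A and B independent, because P(B | A) happens not to depend on A.
# Such parameter choices form the "measure zero" exception mentioned above.
P_A = {0: 0.3, 1: 0.7}
P_B_given_A = {0: {0: 0.6, 1: 0.4},   # P(B | A=0)
               1: {0: 0.6, 1: 0.4}}   # P(B | A=1), identical on purpose

joint = {(a, b): P_A[a] * P_B_given_A[a][b]
         for a, b in itertools.product([0, 1], repeat=2)}
P_B = {b: sum(joint[(a, b)] for a in [0, 1]) for b in [0, 1]}

# A and B are independent iff the joint factorises into the product of marginals.
print(all(abs(joint[(a, b)] - P_A[a] * P_B[b]) < 1e-12
          for a, b in itertools.product([0, 1], repeat=2)))   # True
```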

Bonus Challenge

That being said: we are currently not sure whether A and B can be conditionally independent given C in the above network, under the condition that A and C as well as B and C are actually dependent. If you can prove that it is impossible, or provide an example proving that it is possible, 50 bonus points to you.

You can find some initial considerations here: https://gl.kwarc.info/teaching/AI/-/blob/master/SS20/Bonus_Challenge.pdf

Rules: Please send submissions to my email address. You can hand in alone or as a group. In the latter case the points will be split between the group members. Likewise, if you hand in the same or a very similar solution as somebody else, the points will be split too. The first five submissions will be considered. The deadline is July 26th.

Best of luck!