Conditional Independence - Basic Definitions and Properties #
This file defines conditional independence for random variables and establishes basic properties. The definition uses indicator test functions of Borel sets, whose products correspond to measurable rectangles.
Main definitions #
CondIndep Y Z W μ: Y and Z are conditionally independent given W under measure μ, denoted Y ⊥⊥_W Z, defined via indicator test functions on Borel sets.
Main results #
condIndep_symm: Conditional independence is symmetric (Y ⊥⊥_W Z ↔ Z ⊥⊥_W Y)
condExp_const_of_indepFun: Independence implies a constant conditional expectation
Implementation notes #
We use an indicator-based characterization rather than σ-algebra formalism to avoid requiring a full conditional distribution API. The definition states that for all Borel sets A, B:
E[1_A(Y) · 1_B(Z) | σ(W)] = E[1_A(Y) | σ(W)] · E[1_B(Z) | σ(W)] a.e.
This is equivalent to the standard σ-algebra definition but more elementary to work with.
References #
- Kallenberg (2005), Probabilistic Symmetries and Invariance Principles, Section 6.1
- Kallenberg (2002), Foundations of Modern Probability, Chapter 6
Definition of conditional independence #
Conditional independence via indicator test functions.
Random variables Y and Z are conditionally independent given W under measure μ, denoted Y ⊥⊥_W Z, if for all Borel sets A and B:
E[1_A(Y) · 1_B(Z) | σ(W)] = E[1_A(Y) | σ(W)] · E[1_B(Z) | σ(W)] a.e.
Mathematical content: This says that knowing W, the events {Y ∈ A} and {Z ∈ B} are independent: P(Y ∈ A, Z ∈ B | W) = P(Y ∈ A | W) · P(Z ∈ B | W).
Why indicators suffice: By linearity and approximation, this extends to all bounded measurable functions. The key is that indicators generate the bounded measurable functions via monotone class arguments.
Relation to σ-algebra definition: This is equivalent to σ(Y) ⊥⊥_σ(W) σ(Z), but stated more elementarily without requiring full conditional probability machinery.
Implementation: We use Set.indicator for the characteristic function 1_A.
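The definition can be sketched in Lean as follows. This is a sketch only: the binder names, the modeling of σ(W) as `MeasurableSpace.comap W`, and the exact form of the statement are assumptions about this file, not its literal code.

```lean
open MeasureTheory

/-- Sketch: `Y` and `Z` are conditionally independent given `W` under `μ`
if the indicator factorization holds for all Borel sets `A`, `B`.
Here `σ(W)` is modeled as `MeasurableSpace.comap W inferInstance`. -/
def CondIndep {Ω : Type*} [MeasurableSpace Ω]
    (Y Z W : Ω → ℝ) (μ : Measure Ω) : Prop :=
  ∀ A B : Set ℝ, MeasurableSet A → MeasurableSet B →
    -- E[1_A(Y) · 1_B(Z) | σ(W)] = E[1_A(Y) | σ(W)] · E[1_B(Z) | σ(W)] a.e.
    μ[fun ω => A.indicator 1 (Y ω) * B.indicator 1 (Z ω) |
        MeasurableSpace.comap W inferInstance]
      =ᵐ[μ]
    μ[fun ω => A.indicator 1 (Y ω) | MeasurableSpace.comap W inferInstance] *
    μ[fun ω => B.indicator 1 (Z ω) | MeasurableSpace.comap W inferInstance]
```

Here `A.indicator 1` is `Set.indicator` applied to the constant function `1`, so `A.indicator 1 (Y ω)` is the characteristic function 1_A evaluated at `Y ω`.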
Basic properties #
Symmetry of conditional independence.
If Y ⊥⊥_W Z, then Z ⊥⊥_W Y. This follows immediately from commutativity of multiplication.
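With the indicator characterization, symmetry is essentially a one-line proof. The sketch below assumes a `CondIndep` definition that takes the two Borel sets and their measurability proofs as arguments:

```lean
theorem condIndep_symm {Ω : Type*} [MeasurableSpace Ω]
    {Y Z W : Ω → ℝ} {μ : Measure Ω}
    (h : CondIndep Y Z W μ) : CondIndep Z Y W μ := by
  intro A B hA hB
  -- Apply the hypothesis with the roles of `A` and `B` swapped,
  -- then commute the products on both sides.
  simpa [mul_comm] using h B A hB hA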
Helper lemmas for independence and conditional expectation #
Conditional expectation against an independent σ-algebra is constant.
If X is integrable and measurable with respect to a σ-algebra independent of σ(W), then E[X | σ(W)] = E[X] almost everywhere.
This is the key property that makes independence "pass through" conditioning: when X ⊥⊥ W, knowing W provides no information about X.
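A hedged Lean sketch of the statement; the name `condExp_const_of_indepFun` comes from the results list above, but the hypotheses and the proof are assumptions (proof omitted):

```lean
theorem condExp_const_of_indepFun {Ω : Type*} [MeasurableSpace Ω]
    {X W : Ω → ℝ} {μ : Measure Ω} [IsProbabilityMeasure μ]
    (hindep : ProbabilityTheory.IndepFun X W μ)
    (hX : Integrable X μ) :
    -- E[X | σ(W)] is a.e. the constant E[X]
    μ[X | MeasurableSpace.comap W inferInstance]
      =ᵐ[μ] fun _ => ∫ ω, X ω ∂μ := by
  sorry -- sketch: verify the defining property of condExp on σ(W)-sets,
        -- where independence factorizes the integrals
```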
Extract independence of first component from pair independence.
Extract independence of second component from pair independence.
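Both extraction lemmas follow by composing the pair with a measurable projection. A sketch for the first component, assuming Mathlib's `ProbabilityTheory.IndepFun.comp` (the second component is analogous with `measurable_snd`):

```lean
theorem indepFun_fst_of_pair {Ω : Type*} [MeasurableSpace Ω]
    {Y Z W : Ω → ℝ} {μ : Measure Ω}
    (h : ProbabilityTheory.IndepFun (fun ω => (Y ω, Z ω)) W μ) :
    ProbabilityTheory.IndepFun Y W μ :=
  -- `Y = Prod.fst ∘ (fun ω => (Y ω, Z ω))`, and composing with a
  -- measurable map preserves independence.
  h.comp measurable_fst measurable_id
```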