Exchangeability.Probability.CondIndep.Basic

Conditional Independence - Basic Definitions and Properties #

This file defines conditional independence for random variables and establishes basic properties. The definition uses indicator functions on measurable rectangles.

Main definitions #

  • CondIndep: conditional independence of random variables, stated via indicator test functions on measurable rectangles.

Main results #

  • condIndep_symm: conditional independence is symmetric in its first two arguments.

Implementation notes #

We use an indicator-based characterization rather than σ-algebra formalism to avoid requiring a full conditional distribution API. The definition states that for all Borel sets A, B:

E[1_A(Y) · 1_B(Z) | σ(W)] = E[1_A(Y) | σ(W)] · E[1_B(Z) | σ(W)] a.e.

This is equivalent to the standard σ-algebra definition but more elementary to work with.
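As a sketch, assuming Mathlib's `μ[·|m]` conditional-expectation notation and `Set.indicator`, the definition plausibly takes the following shape. Everything here other than `Set.indicator` and the `μ[·|m]` notation is illustrative (including the name `CondIndepSketch`), not the rendered source:

```lean
import Mathlib

open MeasureTheory

variable {Ω α β γ : Type*} [MeasurableSpace Ω] [MeasurableSpace α]
  [MeasurableSpace β] [MeasurableSpace γ]

/-- Illustrative sketch of the indicator-based definition: for all
measurable `A`, `B`, the conditional expectation of the product of
indicators factors a.e. into the product of conditional expectations. -/
def CondIndepSketch (μ : Measure Ω) (Y : Ω → α) (Z : Ω → β) (W : Ω → γ) : Prop :=
  ∀ (A : Set α) (B : Set β), MeasurableSet A → MeasurableSet B →
    μ[fun ω => A.indicator (fun _ => (1 : ℝ)) (Y ω) *
        B.indicator (fun _ => (1 : ℝ)) (Z ω) | MeasurableSpace.comap W inferInstance]
      =ᵐ[μ]
    μ[fun ω => A.indicator (fun _ => (1 : ℝ)) (Y ω) | MeasurableSpace.comap W inferInstance] *
    μ[fun ω => B.indicator (fun _ => (1 : ℝ)) (Z ω) | MeasurableSpace.comap W inferInstance]
```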

References #

Definition of conditional independence #

def CondIndep {Ω : Type u_5} {α : Type u_6} {β : Type u_7} {γ : Type u_8} [MeasurableSpace Ω] [MeasurableSpace α] [MeasurableSpace β] [MeasurableSpace γ] (μ : MeasureTheory.Measure Ω) (Y : Ω → α) (Z : Ω → β) (W : Ω → γ) : Prop

Conditional independence via indicator test functions.

Random variables Y and Z are conditionally independent given W under measure μ, denoted Y ⊥⊥_W Z, if for all Borel sets A and B:

E[1_A(Y) · 1_B(Z) | σ(W)] = E[1_A(Y) | σ(W)] · E[1_B(Z) | σ(W)] a.e.

Mathematical content: This says that knowing W, the events {Y ∈ A} and {Z ∈ B} are independent: P(Y ∈ A, Z ∈ B | W) = P(Y ∈ A | W) · P(Z ∈ B | W).

Why indicators suffice: By linearity and approximation, this extends to all bounded measurable functions. The key is that indicators generate the bounded measurable functions via monotone class arguments.

Relation to σ-algebra definition: This is equivalent to σ(Y) ⊥⊥_σ(W) σ(Z), but stated more elementarily without requiring full conditional probability machinery.

Implementation: We use Set.indicator for the characteristic function 1_A.


Basic properties #

theorem condIndep_symm {Ω : Type u_1} {α : Type u_2} {β : Type u_3} {γ : Type u_4} [MeasurableSpace Ω] [MeasurableSpace α] [MeasurableSpace β] [MeasurableSpace γ] (μ : MeasureTheory.Measure Ω) [MeasureTheory.IsProbabilityMeasure μ] (Y : Ω → α) (Z : Ω → β) (W : Ω → γ) :
CondIndep μ Y Z W ↔ CondIndep μ Z Y W

Symmetry of conditional independence.

If Y ⊥⊥_W Z, then Z ⊥⊥_W Y. This follows immediately from the commutativity of multiplication.
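A proof sketch under the indicator characterization, assuming the ambient variables of the file and that `CondIndep` unfolds to the a.e. product identity stated in the implementation notes (the tactic script is illustrative):

```lean
-- Sketch: swap the roles of A and B, then commute the product
-- pointwise inside the a.e. equality.
example (h : CondIndep μ Y Z W) : CondIndep μ Z Y W := by
  intro B A hB hA
  filter_upwards [h A B hA hB] with ω hω
  -- 1_B(Z ω) · 1_A(Y ω) = 1_A(Y ω) · 1_B(Z ω), and likewise
  -- for the product of the two conditional expectations.
  simpa [mul_comm] using hω
```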

Helper lemmas for independence and conditional expectation #

theorem condExp_const_of_indepFun {Ω : Type u_1} {γ : Type u_4} [MeasurableSpace Ω] [MeasurableSpace γ] (μ : MeasureTheory.Measure Ω) [MeasureTheory.IsProbabilityMeasure μ] {X : Ω → ℝ} {W : Ω → γ} (hX : Measurable X) (hW : Measurable W) (h_indep : ProbabilityTheory.IndepFun X W μ) (_hX_int : MeasureTheory.Integrable X μ) :
μ[X|MeasurableSpace.comap W inferInstance] =ᵐ[μ] fun (x : Ω) => ∫ (x : Ω), X x ∂μ

Conditional expectation against an independent σ-algebra is constant.

If X is integrable and independent of W, then E[X | σ(W)] = E[X] almost everywhere.

This is the key property that makes independence "pass through" conditioning: knowing W provides no information about X when X ⊥ W.
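In this file the natural instantiation is X := 1_A ∘ Y for a variable Y independent of W. The sketch below shows the intended specialization; the hypothesis names are assumptions, and the integrability of the bounded indicator is elided with `sorry`:

```lean
-- Sketch: with X := 1_A ∘ Y and Y ⊥ W, E[1_A(Y) | σ(W)] collapses to
-- the constant ∫ 1_A(Y) dμ = μ(Y ⁻¹' A).  The outline defers to
-- `condExp_const_of_indepFun`.
example (hY : Measurable Y) (hW : Measurable W) {A : Set α}
    (hA : MeasurableSet A) (h_indep : ProbabilityTheory.IndepFun Y W μ) :
    μ[fun ω => A.indicator (fun _ => (1 : ℝ)) (Y ω) |
        MeasurableSpace.comap W inferInstance]
      =ᵐ[μ] fun _ => ∫ ω, A.indicator (fun _ => (1 : ℝ)) (Y ω) ∂μ := by
  sorry
```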

theorem IndepFun.of_comp_left_fst {Ω : Type u_1} {α : Type u_2} {β : Type u_3} {γ : Type u_4} [MeasurableSpace Ω] [MeasurableSpace α] [MeasurableSpace β] [MeasurableSpace γ] {μ : MeasureTheory.Measure Ω} {Y : Ω → α} {Z : Ω → β} {W : Ω → γ} (h : ProbabilityTheory.IndepFun (fun (ω : Ω) => (Y ω, Z ω)) W μ) :
ProbabilityTheory.IndepFun Y W μ

Extract independence of the first component from pair independence.

theorem IndepFun.of_comp_left_snd {Ω : Type u_1} {α : Type u_2} {β : Type u_3} {γ : Type u_4} [MeasurableSpace Ω] [MeasurableSpace α] [MeasurableSpace β] [MeasurableSpace γ] {μ : MeasureTheory.Measure Ω} {Y : Ω → α} {Z : Ω → β} {W : Ω → γ} (h : ProbabilityTheory.IndepFun (fun (ω : Ω) => (Y ω, Z ω)) W μ) :
ProbabilityTheory.IndepFun Z W μ

Extract independence of the second component from pair independence.
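Both extraction lemmas are instances of the fact that independence is preserved by measurable post-composition; a sketch via Mathlib's `ProbabilityTheory.IndepFun.comp` (assuming its usual signature, `IndepFun f g μ → Measurable φ → Measurable ψ → IndepFun (φ ∘ f) (ψ ∘ g) μ`, and the ambient variables of the file):

```lean
-- Sketch: Y is definitionally Prod.fst ∘ (fun ω => (Y ω, Z ω)), and
-- Prod.fst is measurable, so pair independence transfers componentwise.
example (h : ProbabilityTheory.IndepFun (fun ω => (Y ω, Z ω)) W μ) :
    ProbabilityTheory.IndepFun Y W μ :=
  h.comp measurable_fst measurable_id

-- The second component is symmetric, via Prod.snd.
example (h : ProbabilityTheory.IndepFun (fun ω => (Y ω, Z ω)) W μ) :
    ProbabilityTheory.IndepFun Z W μ :=
  h.comp measurable_snd measurable_id
```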