
Question
In class, we saw that information measures appear in diverse problems in data science,
statistical inference, and machine learning. In the next problem, you are asked to prove
a useful identity, called the golden formula, which connects the mutual information with
KL divergences and can be used to upper-bound the mutual information. Prove the following
statement:

For every distribution Q_Y such that D(P_Y || Q_Y) < ∞,

I(X;Y) = E_{P_X}[ D(P_{Y|X} || Q_Y) ] − D(P_Y || Q_Y).    (2)

This identity holds for any distribution Q_Y with D(P_Y || Q_Y) < ∞. Since
D(P_Y || Q_Y) ≥ 0, a suitable choice of Q_Y always yields the upper bound

I(X;Y) ≤ E_{P_X}[ D(P_{Y|X} || Q_Y) ].
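As a sanity check (not part of the requested proof), the identity and the resulting bound can be verified numerically on a small discrete example. The joint distribution P_XY and the choice of Q_Y below are arbitrary illustrative values:

```python
import numpy as np

# Arbitrary joint distribution P_XY over X in {0,1}, Y in {0,1,2}.
P_XY = np.array([[0.10, 0.25, 0.15],
                 [0.20, 0.05, 0.25]])
P_X = P_XY.sum(axis=1)              # marginal of X: [0.5, 0.5]
P_Y = P_XY.sum(axis=0)              # marginal of Y
P_Y_given_X = P_XY / P_X[:, None]   # row x is the conditional P_{Y|X=x}

def kl(p, q):
    """KL divergence D(p||q) in nats; assumes supp(p) is contained in supp(q)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Mutual information directly: I(X;Y) = D(P_XY || P_X x P_Y).
I = kl(P_XY.ravel(), np.outer(P_X, P_Y).ravel())

# Right-hand side of the golden formula for an arbitrary full-support Q_Y.
Q_Y = np.array([0.5, 0.3, 0.2])
avg_cond_kl = sum(P_X[x] * kl(P_Y_given_X[x], Q_Y) for x in range(2))
golden = avg_cond_kl - kl(P_Y, Q_Y)

assert np.isclose(I, golden)   # the identity (2)
assert I <= avg_cond_kl        # the upper bound, since D(P_Y||Q_Y) >= 0
```

Any other full-support Q_Y gives the same agreement, which is exactly what the "for every Q_Y" quantifier asserts.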
Proof. Use

I(X;Y) = E_{P_{XY}}[ log ( P_{Y|X} Q_Y / (P_Y Q_Y) ) ]    (3)

and group the ratios P_{Y|X} / Q_Y and Q_Y / P_Y.
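For reference, the grouping step in the hint amounts to the following manipulation (a sketch, using the standard definition I(X;Y) = E_{P_{XY}}[log P_{Y|X}/P_Y] and the fact that the second term depends on Y alone):

```latex
I(X;Y)
= \mathbb{E}_{P_{XY}}\!\left[\log \frac{P_{Y|X}\,Q_Y}{P_Y\,Q_Y}\right]
= \mathbb{E}_{P_{XY}}\!\left[\log \frac{P_{Y|X}}{Q_Y}\right]
  + \mathbb{E}_{P_Y}\!\left[\log \frac{Q_Y}{P_Y}\right]
= \mathbb{E}_{P_X}\!\left[D(P_{Y|X}\,\|\,Q_Y)\right] - D(P_Y\,\|\,Q_Y).
```

The splitting of the expectation is valid because D(P_Y || Q_Y) < ∞ ensures both terms are well defined.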