diff --git a/exercises-pdf/ic_information_theory_1.pdf b/exercises-pdf/ic_information_theory_1.pdf
index 7cf2af9b..75a5744a 100644
Binary files a/exercises-pdf/ic_information_theory_1.pdf and b/exercises-pdf/ic_information_theory_1.pdf differ
diff --git a/exercises/information-theory/ex_rnw/ex_kl_divergence_misspecification.Rnw b/exercises/information-theory/ex_rnw/ex_kl_divergence_misspecification.Rnw
index f435af06..97a19c40 100644
--- a/exercises/information-theory/ex_rnw/ex_kl_divergence_misspecification.Rnw
+++ b/exercises/information-theory/ex_rnw/ex_kl_divergence_misspecification.Rnw
@@ -1,4 +1,4 @@
-Consider a double-exponential distributed random variable $X$ with unknown parameters $\mu_0 \in \R$ and $\sigma_0 > 0$. In other words: $X\sim\text{DE}(\mu_0,\sigma_0)$ with the following density function:
+Consider a Laplace-distributed random variable $X$ with unknown parameters $\mu_0 \in \R$ and $\sigma_0 > 0$. In other words, $X\sim\text{LP}(\mu_0,\sigma_0)$ with the following density function:
 $$ g(x) = \frac{1}{2\sigma_0}\,\exp\left(-\frac{|x-\mu_0|}{\sigma_0}\right) $$
@@ -6,12 +6,12 @@ Unfortunately, the model is misspecified and $X$ is assumed to be normally distr
 $$ f_\theta(x) = \frac{1}{\sigma \sqrt{2\pi}} \exp \left(-\frac{1}{2} \left( \frac{x-\mu}{\sigma} \right)^2 \right) $$
-\begin{enumerate}
-  \item
-  Calculate the set of parameters $\theta$ that minimizes the Kullback-Leibler Divergence $D_{KL}(g \| f_\theta)$
+%\begin{enumerate}
+%  \item
+  Calculate the set of parameters $\theta$ that minimizes the Kullback-Leibler Divergence $D_{KL}(g \| f_\theta)$.
 
-  \emph{Hint}: Use the fact that for $X \sim \text{DE}(\mu_0,\sigma_0)$, the following properties apply: $\E(X )=\mu_0$ and $\var(X)=2 \sigma_0^2$.
+  \emph{Hint}: Use the fact that for $X \sim \text{LP}(\mu_0,\sigma_0)$, the following properties apply: $\E(X)=\mu_0$ and $\var(X)=2 \sigma_0^2$.
-\end{enumerate}
+%\end{enumerate}
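As a side note on the exercise being patched: minimizing $D_{KL}(g \| f_\theta)$ over $\theta = (\mu, \sigma)$ is equivalent to minimizing the cross-entropy $\E_g[-\log f_\theta(X)]$, since $\E_g[\log g(X)]$ does not depend on $\theta$. With the hint's moments, $\E_g[(X-\mu)^2] = 2\sigma_0^2 + (\mu_0 - \mu)^2$, which yields the closed-form objective below and the minimizer $\mu^* = \mu_0$, $\sigma^{*2} = 2\sigma_0^2$. The following sketch (not part of the patch; the parameter values are arbitrary assumptions) checks numerically that this candidate beats nearby parameter pairs:

```python
# Sanity check for the exercise's answer: the cross-entropy
#   E_g[-log f_theta(X)] = log(sigma * sqrt(2*pi))
#                          + (2*sigma0^2 + (mu0 - mu)^2) / (2*sigma^2)
# should be minimized at mu = mu0, sigma = sqrt(2) * sigma0.
import math

mu0, sigma0 = 1.5, 0.7  # arbitrary "true" Laplace parameters (assumed values)

def cross_entropy(mu, sigma):
    """E_g[-log f_theta(X)] for f_theta = N(mu, sigma^2), g = LP(mu0, sigma0)."""
    return math.log(sigma * math.sqrt(2 * math.pi)) + \
        (2 * sigma0 ** 2 + (mu0 - mu) ** 2) / (2 * sigma ** 2)

# Candidate from the derivation: mu* = mu0, sigma*^2 = Var(X) = 2 * sigma0^2.
mu_star, sigma_star = mu0, math.sqrt(2) * sigma0
best = cross_entropy(mu_star, sigma_star)

# Crude grid check: every perturbed parameter pair does strictly worse.
worse = min(
    cross_entropy(mu_star + dm, sigma_star + ds)
    for dm in (-0.5, -0.1, 0.1, 0.5)
    for ds in (-0.3, -0.1, 0.1, 0.3)
)
print(best < worse)  # prints True
```

Setting the partial derivatives of the closed-form objective to zero gives the same answer analytically: $\partial/\partial\mu$ vanishes at $\mu = \mu_0$, and $\partial/\partial\sigma$ of $\log\sigma + v/(2\sigma^2)$ vanishes at $\sigma^2 = v = 2\sigma_0^2$.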