Commit
fix typo
ludwigbothmann committed Nov 18, 2024
1 parent 07d9ed9 commit cc7865e
Showing 3 changed files with 1 addition and 1 deletion.
Binary file modified: slides-pdf/lecture_sl.pdf
Binary file modified: slides-pdf/slides-info-cross-entropy-kld.pdf
@@ -99,7 +99,7 @@
The KL divergence (which is non-negative) between $f(x)$ and $g(x)$ is:
\begin{equation}
\begin{aligned}
-0 \leq D_{KL}(f \| g) & = -h(f) + H(p \| q) \\
+0 \leq D_{KL}(f \| g) & = -h(f) + H(f \| g) \\
& =-h(f)-\int_{-\infty}^{\infty} f(x) \log (g(x)) dx
\end{aligned}
\end{equation}
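The corrected identity states that the KL divergence equals the cross-entropy minus the differential entropy of $f$, i.e. $D_{KL}(f \| g) = H(f \| g) - h(f)$. As a minimal numerical sketch (not part of the commit; the Gaussian densities and grid are chosen purely for illustration), one can verify this relation against the closed-form KL divergence between two Gaussians:

```python
import numpy as np

# Illustrative check of D_KL(f || g) = -h(f) + H(f || g)
# for two Gaussian densities, via Riemann sums on a fine grid.
mu1, s1 = 0.0, 1.0   # f = N(0, 1)
mu2, s2 = 1.0, 2.0   # g = N(1, 4)

x = np.linspace(-20.0, 20.0, 200_001)
dx = x[1] - x[0]
f = np.exp(-(x - mu1) ** 2 / (2 * s1**2)) / (s1 * np.sqrt(2 * np.pi))
g = np.exp(-(x - mu2) ** 2 / (2 * s2**2)) / (s2 * np.sqrt(2 * np.pi))

h_f = -np.sum(f * np.log(f)) * dx    # differential entropy h(f)
H_fg = -np.sum(f * np.log(g)) * dx   # cross-entropy H(f || g)
kl_numeric = -h_f + H_fg             # the identity from the slide

# Closed form for Gaussians:
# log(s2/s1) + (s1^2 + (mu1 - mu2)^2) / (2 s2^2) - 1/2
kl_exact = np.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2 * s2**2) - 0.5

print(kl_numeric, kl_exact)  # both ≈ 0.4431
```

The two values agree to numerical precision, and the result is non-negative, consistent with $0 \leq D_{KL}(f \| g)$.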
