Commit cbc6cef1 authored by Nando Farchmin

Fix typo

parent 8a855bd1
1 merge request: !1 Update math to conform with gitlab markdown
@@ -119,11 +119,6 @@ Let $`x^{(1)},\dots,x^{(N)}\sim\pi`$ be independent (random) samples and assume ...
The empirical regression problem then reads
```math
\text{Find}\qquad \Psi_\vartheta
= \operatorname*{arg\, min}_{\Psi_\theta\in\mathcal{M}_{d,\varphi}} \frac{1}{N} \sum_{i=1}^N \bigl(f^{(i)} - \Psi_\theta(x^{(i)})\bigr)^2
=: \operatorname*{arg\, min}_{\Psi_\theta\in\mathcal{M}_{d,\varphi}} \mathcal{L}_N(\Psi_\theta)
```
> **Definition** (loss function):
> A _loss function_ is any function that measures how well a neural network approximates the target values.
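
To make the empirical regression problem and the loss $`\mathcal{L}_N`$ concrete, here is a minimal Python sketch. It is not part of this repository: the names `empirical_loss` and `psi` are assumptions, and a cubic polynomial least-squares fit stands in for the network class $`\mathcal{M}_{d,\varphi}`$ (the $`1/N`$ factor does not change the minimizer).

```python
import numpy as np

def empirical_loss(psi, x_samples, f_samples):
    """Evaluate L_N(psi) = (1/N) * sum_i (f^(i) - psi(x^(i)))^2."""
    predictions = np.array([psi(x) for x in x_samples])
    return np.mean((np.asarray(f_samples) - predictions) ** 2)

# Toy data: N samples x^(i) ~ pi = U(0, 1) with targets f^(i) = sin(2*pi*x^(i)).
rng = np.random.default_rng(0)
x_samples = rng.uniform(size=100)
f_samples = np.sin(2 * np.pi * x_samples)

# Cubic polynomial fit as a stand-in model class; np.polyfit minimizes
# the same sum of squared residuals that defines the empirical loss.
psi = np.poly1d(np.polyfit(x_samples, f_samples, deg=3))

print(empirical_loss(psi, x_samples, f_samples))
```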
...