ludwig10_masters_thesis / GUM-compliant neural network robustness verification - a Masters thesis / Commits

Commit 363a513f (Verified)
Authored 2 years ago by Björn Ludwig
wip(thesis): introduce section outline and robustness verification task
Parent: 1321f0d2
Changes: 1 changed file

src/thesis/Thesis_408230.tex (+23 additions, −17 deletions)
@@ -192,23 +192,29 @@
 \section{Objective}
 \label{sec:objective}
-We first want to provide a mathematical foundation for propagating the uncertainty
-in neural network inputs through the networks in a GUM-compliant way.
-The task we eventually aim to solve is:
-\begin{task}
-  Let \(X = (X_j)_{j = 1, \hdots, N}\) be the input quantities of a
-  measurement function \(f \colon \mathbb{R}^N \to \mathbb{R} \colon x \mapsto f(x) = y\),
-  which takes the form of an already trained, deep neural network.
-  Let \((x_j)_{j = 1, \hdots, N} \in \mathbb{R}^N\) be the best estimates representing
-  the input quantities and \(u(x_j), j = 1, \ldots, N\) the associated standard
-  uncertainties (\cite[paragraph 3.18]{jcgm_evaluation_2008}).
-  Propagate the \(x_j\) and \(u(x_j), j = 1, \ldots, N\) through \(f\) to form an
-  estimate \(y\) of the output quantity \(Y\) and the
-  associated standard uncertainty \(u(y)\) as suggested
-  in~\cite[paragraph 7.2]{jcgm_evaluation_2009}.
-\end{task}
-Afterwards we will apply an existing robustness verification method to our network
-and measure its performance.
+We first want to provide a mathematical foundation for propagating the uncertainty
+in neural network inputs through the networks in a GUM-compliant way.
+The task we eventually aim to solve is:
+\begin{task}[Uncertainty propagation]
+  Let \(X = (X_j)_{j = 1, \hdots, N}\) be the input quantities of a measurement function
+  \(f \colon \mathbb{R}^N \to \mathbb{R} \colon x \mapsto f(x) = y\), which takes the
+  form of an already trained, deep neural network.
+  Let \((x_j)_{j = 1, \hdots, N} \in \mathbb{R}^N\) be the best estimates representing the
+  input quantities and \(u(x_j), j = 1, \ldots, N\) the associated standard
+  uncertainties (\cite[paragraph 3.18]{jcgm_evaluation_2008}).
+  Propagate the \(x_j\) and \(u(x_j), j = 1, \ldots, N\) through \(f\) to form an
+  estimate \(y\) of the output quantity \(Y\) and the associated standard uncertainty
+  \(u(y)\) as suggested in~\cite[paragraph 7.2]{jcgm_evaluation_2009}.
+\end{task}
+Afterwards we will apply an existing robustness verification method to our network
+and measure its performance.
+\begin{task}[Robustness verification]
+  Given a classification deep neural network (CDNN) \(f\) with an input region
+  \(\Theta\) comprising a set of uncertain inputs, does the robustness property hold?
+\end{task}
+\section{Outline}
+\label{sec:outline}
+\include{preliminaries}
...
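For uncorrelated input quantities, the GUM-compliant propagation that the first task asks for reduces, to first order, to the law of propagation of uncertainty: linearise \(f\) about the best estimates and combine the input uncertainties through the sensitivity coefficients,

\[
u^2(y) = \sum_{j=1}^{N} \left( \frac{\partial f}{\partial x_j} \right)^2 u^2(x_j),
\]

with \(\partial f / \partial x_j\) evaluated at \((x_1, \hdots, x_N)\). Correlated inputs add cross terms in the covariances \(u(x_i, x_j)\), and since a deep network is usually far from linear, the first-order result should be treated as an approximation.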
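A minimal numerical sketch of that linearised propagation, assuming the trained network is exposed as a plain function from R^N to R: the two-layer model below is a hypothetical stand-in for the thesis's network, and the sensitivities are approximated by central differences rather than backpropagation.

import numpy as np

def propagate_lpu(f, x, u_x, eps=1e-6):
    # Law of propagation of uncertainty for uncorrelated inputs:
    # u(y)^2 = sum_j (df/dx_j)^2 * u(x_j)^2, with the sensitivities
    # df/dx_j estimated by central differences at the best estimates.
    x, u_x = np.asarray(x, float), np.asarray(u_x, float)
    c = np.empty_like(x)
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps * max(1.0, abs(x[j]))
        c[j] = (f(x + step) - f(x - step)) / (2.0 * step[j])
    return f(x), np.sqrt(np.sum((c * u_x) ** 2))

# Hypothetical stand-in for the trained measurement function f: R^3 -> R.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=4), 0.1
f = lambda x: float(W2 @ np.tanh(W1 @ x + b1) + b2)

y, u_y = propagate_lpu(f, x=[1.0, 2.0, 0.5], u_x=[0.1, 0.05, 0.2])
print(f"y = {y:.4f}, u(y) = {u_y:.4f}")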
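Because linearisation can be poor for deep networks, a sampling alternative in the spirit of GUM Supplement 1 (JCGM 101:2008) propagates whole input distributions instead. A sketch assuming independent Gaussian inputs \(N(x_j, u^2(x_j))\), reusing the stand-in f from above; the estimate and standard uncertainty are read off as the mean and standard deviation of the simulated outputs.

import numpy as np

def propagate_mc(f, x, u_x, trials=10_000, seed=1):
    # Draw inputs from the assumed independent Gaussians, push every
    # draw through f, and summarise the output distribution.
    rng = np.random.default_rng(seed)
    draws = rng.normal(loc=x, scale=u_x, size=(trials, len(x)))
    ys = np.array([f(xi) for xi in draws])
    return ys.mean(), ys.std(ddof=1)

y_mc, u_y_mc = propagate_mc(f, x=[1.0, 2.0, 0.5], u_x=[0.1, 0.05, 0.2])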
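The second task leaves the verification method open. One common sound-but-incomplete choice is interval bound propagation: take \(\Theta\) to be the axis-aligned box \(x \pm k\,u(x)\), push it through the layers with interval arithmetic, and certify robustness when the class predicted at \(x\) dominates every other class over the whole box. The ReLU classifier below (weight list Ws, bias list bs) is a hypothetical stand-in for the CDNN; a True result certifies robustness on \(\Theta\), while False only means the bounds were too loose to decide.

import numpy as np

def ibp_bounds(Ws, bs, lo, hi):
    # Propagate the box [lo, hi] through affine layers and ReLUs with
    # interval arithmetic (sound outer bounds on every reachable logit).
    for k, (W, b) in enumerate(zip(Ws, bs)):
        center, radius = (lo + hi) / 2.0, (hi - lo) / 2.0
        center, radius = W @ center + b, np.abs(W) @ radius
        lo, hi = center - radius, center + radius
        if k < len(Ws) - 1:  # ReLU on all hidden layers
            lo, hi = np.maximum(lo, 0.0), np.maximum(hi, 0.0)
    return lo, hi

def forward(Ws, bs, x):
    for W, b in zip(Ws[:-1], bs[:-1]):
        x = np.maximum(W @ x + b, 0.0)
    return Ws[-1] @ x + bs[-1]

def is_robust(Ws, bs, x, u_x, k=2.0):
    x, u_x = np.asarray(x, float), np.asarray(u_x, float)
    c = int(np.argmax(forward(Ws, bs, x)))  # class predicted at the estimate
    lo, hi = ibp_bounds(Ws, bs, x - k * u_x, x + k * u_x)
    return all(lo[c] > hi[i] for i in range(hi.size) if i != c)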