GUM-compliant neural network robustness verification - a Masters thesis
ludwig10_masters_thesis
Graph
36b1405a4a89647ad6b2d47cbea23541b170d067
Branch: main (default)
[Repository graph rendering omitted: date-axis ticks spanning roughly Jun of one year through 28 Dec of the following year]
refactor(thesis): improve formulation about the composition being applied element-wise
refactor(thesis): introduce two more non-breaking spaces
fix(thesis): improve definition of i^{(i)}
refactor(thesis): update one reference after doing a data_validation with biber and updating the bib
refactor(thesis references): update bib after doing a data_validation with biber
refactor(thesis): remove remarks on GUM section 4.3.7 about upper and lower bounds for uncertainties
refactor(thesis): introduce parametric when talking about quadlu
refactor(thesis): introduce non-breaking spaces all over
refactor(thesis): correct one formulation as Sascha suggested
refactor(thesis): introduce parameter into quadlu operator
refactor(thesis): make softplus operator more robust
refactor(thesis): streamline formulation of linear inclusion first order approximation
refactor(thesis): improve formulation of optimization problems
refactor(thesis): introduce one more proof for proposition
refactor(thesis): add forgotten 0 for theta
refactor(thesis): change definition of varrho
fix(thesis): correct simulation ideas
refactor(thesis): introduce some equation numbers
refactor(thesis): introduce reasoning about QuadLU proof idea
feat(thesis): include reference about softplus outperforming ReLU in SAT
refactor(thesis): write every theorem, definition, proposition with capital letter
feat(thesis): introduce long headings into toc
style(differentiation): blacken notebook
fix(thesis): repair formula syntax error
fix(thesis): replace a wrong reference
refactor(thesis): correct another typo
refactor(thesis): improve one formulation after discussing it with Barbara
refactor(thesis): correct a definition sign
refactor(thesis): express the first interval extension clearer by reordering terms
refactor(thesis): correct a minor typo
refactor(thesis): introduce some whitespace to silence some IDE warning
refactor(thesis): simplify definition of uncertain input region after discussing it with Barbara
refactor(thesis): rename optimum of non-linear optimization problem to \Delta x_{non}
refactor(thesis): introduce a couple of todos after discussing the draft with Barbara
refactor(thesis): correct another minor typo
fix(thesis): correct proof of QuadLU LPU
wip(thesis): correct a minor typo
fix(thesis): fix alphas domain for QuadLU
wip(thesis): introduce first ideas about what to do in the simulation sections
wip(thesis): improve labels of background chapters
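Several of the commits above touch the numerical robustness of the softplus operator. A common pitfall is that the naive form log(1 + exp(x)) overflows for large positive x. The following is a minimal sketch of the standard overflow-safe reformulation, shown for illustration only; the function name and the exact form used in the thesis are assumptions, not taken from the repository:

```python
import math


def softplus(x: float) -> float:
    """Numerically stable softplus, log(1 + exp(x)).

    The naive expression overflows once exp(x) exceeds the float range.
    Rewriting it as max(x, 0) + log1p(exp(-|x|)) keeps the argument of
    exp non-positive, so no overflow can occur, while log1p stays
    accurate for small arguments.
    """
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))
```

For large |x| the function correctly degenerates to max(x, 0): softplus(1000.0) returns 1000.0 and softplus(-1000.0) returns 0.0, where the naive formula would overflow or lose all precision.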