BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:America/Denver
X-LIC-LOCATION:America/Denver
BEGIN:DAYLIGHT
TZOFFSETFROM:-0700
TZOFFSETTO:-0600
TZNAME:MDT
DTSTART:19700308T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0600
TZOFFSETTO:-0700
TZNAME:MST
DTSTART:19701101T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20200129T163559Z
LOCATION:502-503-504
DTSTART;TZID=America/Denver:20191118T162000
DTEND;TZID=America/Denver:20191118T165000
UID:submissions.supercomputing.org_SC19_sess115_ws_mlhpce111@linklings.com
SUMMARY:GradVis: Visualization and Second Order Analysis of Optimization S
urfaces During the Training of Deep Neural Networks
DESCRIPTION:Workshop\n\nGradVis: Visualization and Second Order Analysis o
f Optimization Surfaces During the Training of Deep Neural Networks\n\nCha
tzimichailidis, Keuper, Pfreundt, Gauger\n\nCurrent training methods for d
eep neural networks boil down to very high dimensional and non-convex opti
mization problems which are usually solved by a wide range of stochastic g
radient descent methods. While these approaches tend to work in practice,
there are still many gaps in the theoretical understanding of key aspects
like convergence and generalization guarantees, which are induced by the p
roperties of the optimization surface (loss landscape). In order to gain d
eeper insights, a number of recent publications proposed methods to visual
 ize and analyze the optimization surfaces. However, the computational cos
 t of these methods is very high, making it hardly possible to use them
  on larger networks.\n\nIn this paper, we present the GradVis Toolbox, an open
source library for efficient and scalable visualization and analysis of de
 ep neural network loss landscapes in TensorFlow and PyTorch. Introducing
  more efficient mathematical formulations and a novel parallelization sc
 heme, GradVis makes it possible to plot 2d and 3d projections of optimiz
 ation surfaces and trajectories, as well as high-resolution second-order
  gradient informatio
n for large networks.\n\nTag: Workshop Reg Pass, Machine Learning\n\nRegis
tration Category: Workshop Reg Pass, Machine Learning
URL:https://sc19.supercomputing.org/presentation/?id=ws_mlhpce111&sess=ses
s115
END:VEVENT
END:VCALENDAR