## Course information

The information on this website is preliminary and will be updated continuously. Further information will soon be publicly available in the LSF information system of the university.

## Description

This seminar is aimed at master's students of Scientific Computing and Mathematics. We will view Machine Learning from a mathematical point of view and discuss how it relates back to well-studied mathematical concepts. This semester the focus topic will be "Different Deep Neural Network Architectures": we explore various Deep Neural Networks and their relation to Ordinary Differential Equations. The topics and their corresponding resources are:

- Introduction to Machine Learning I (chap. 5 of GBC2016)
- Introduction to Machine Learning II (chap. 5 of GBC2016)
- Feedforward Network (chap. 6 of GBC2016, chap. 7.2 of HF2018)
- ResNet (HZRS2016, 130k citations)
- Convolutional Network (chap. 9 of GBC2016, chap. 7.2.3 of HF2018)
- Recurrent Neural Network (chap. 10 of GBC2016, chap. 8.3.1 of HF2018)
- Long Short-Term Memory Network (chap. 10.10 of GBC2016; HS1997, 70k citations)
- Autoencoder (chap. 14 of GBC2016)
- Fractional DNN (AKLV2020, AEOV2021, ADH2022)
- Physics-Informed Neural Network (RPK2018, 3.5k citations; RPK2017I, RPK2017II)
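A recurring theme of the seminar is the link between deep networks and ODEs. As a minimal illustrative sketch (not part of the course material; the function names and the fixed weight matrix are hypothetical), a ResNet block of the form x_{k+1} = x_k + h·f(x_k) can be read as one forward-Euler step of the ODE x'(t) = f(x(t)), so stacking many blocks amounts to numerically integrating that ODE:

```python
import numpy as np

def residual_block(x, weight, step=0.1):
    # One ResNet-style block: identity plus a small update,
    # algebraically x_{k+1} = x_k + h * f(x_k) with f(x) = tanh(W x).
    return x + step * np.tanh(weight @ x)

def resnet_forward(x0, weight, step=0.1, layers=10):
    # Stacking `layers` residual blocks = forward-Euler integration
    # of the ODE x'(t) = tanh(W x(t)) over `layers` steps of size `step`.
    x = x0
    for _ in range(layers):
        x = residual_block(x, weight, step)
    return x

# Toy example: 2-d state, fixed (untrained) rotation-like weights.
W = np.array([[0.0, -1.0], [1.0, 0.0]])
x0 = np.array([1.0, 0.0])
out = resnet_forward(x0, W, step=0.1, layers=10)
```

Shrinking the step size while adding layers recovers the continuous-depth ("neural ODE") viewpoint that several of the listed topics, e.g. Fractional-DNN, build on.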

## Exam / Presentations

For successful completion of the seminar, the following are mandatory:

- a 45 minute presentation (50%),
- a written documentation of your presentation (30%), and
- participation in class (20%).

Your final grade will be the weighted sum of these components, with the weights given above.
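In symbols (an illustration of the stated weights, with $P$, $D$, $C$ denoting the presentation, documentation, and participation grades):

```latex
\text{final grade} = 0.5\,P + 0.3\,D + 0.2\,C
```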

## Dates and Timeline

- Organizational meeting during the first week of lectures on **Thursday, 20th October 2022** at 6pm, via https://heiconf.uni-heidelberg.de/wu26-jm4d-h7fk-xeyw. During this meeting the topics will be distributed; participation in this meeting is essential for attending the seminar.
- Based on the provided material you will research your topic and prepare a 45 minute presentation with a written documentation of maximum 15 pages. The documentation should NOT be an exact replication of the presentation slides. If you have questions, please don’t hesitate to contact us.
- It is mandatory to schedule a feedback appointment at least one week before your presentation. For this meeting, please prepare your presentation and your written documentation so we can give you feedback.
- It is also mandatory to submit your written documentation at least 24 hours before your talk. It will be made available to the other participants.
- We will meet 5 times for 2 presentations each, in the order of the topics list.
- First presentation meeting is scheduled for the second week of January (9th - 13th January).
- Weekday and time: to be discussed.

## (Pre-)registration

If you are interested in participating in this seminar, please pre-register via Müsli, so we know how many participants to expect. You will register for the class once you have selected a suitable topic.

## Resources

- Goodfellow, Bengio, Courville: Deep Learning (GBC2016)
- Hoogendoorn, Funk: Machine Learning for the Quantified Self (HF2018)
- He, Zhang, Ren, Sun: Deep Residual Learning for Image Recognition (HZRS2016)
- Hochreiter, Schmidhuber: Long Short-Term Memory (HS1997)
- Antil, Khatri, Löhner, Verma: Fractional Deep Neural Network via Constrained Optimization (AKLV2020)
- Antil, Elman, Onwunta, Verma: Novel Deep Neural Networks for Solving Bayesian Statistical Inverse Problems (AEOV2021)
- Antil, Díaz, Herberg: An Optimal Time Variable Learning Framework for Deep Neural Networks (ADH2022)
- Raissi, Perdikaris, Karniadakis: Physics-informed neural networks (RPK2018)
- Raissi, Perdikaris, Karniadakis: Physics Informed Deep Learning (Part I) (RPK2017I)
- Raissi, Perdikaris, Karniadakis: Physics Informed Deep Learning (Part II) (RPK2017II)
- LaTeX template for the article and corresponding PDF.
- LaTeX template for the presentation and corresponding PDF.