MIT Introduction to Deep Learning 6.S191: Lecture 1
Foundations of Deep Learning
Lecturer: Alexander Amini
January 2020
For all lectures, slides, and lab materials: http://introtodeeplearning.com
Lecture Outline
0:00 - Introduction
4:14 - Course information
8:10 - Why deep learning?
11:01 - The perceptron
13:07 - Activation functions
15:32 - Perceptron example
18:54 - From perceptrons to neural networks
25:23 - Applying neural networks
28:16 - Loss functions
31:14 - Training and gradient descent
35:13 - Backpropagation
39:25 - Setting the learning rate
43:43 - Batched gradient descent
46:46 - Regularization: dropout and early stopping
51:58 - Summary
Subscribe to stay up to date with new deep learning lectures at MIT, or follow us on @MITDeepLearning on Twitter and Instagram to stay fully-connected!