Neural Networks For Your Dog - 3.3 Gradient Descent

Published: July 21, 2021
on channel: GormAnalysis

In this video, we'll see how neural networks learn optimal weights via gradient descent, using backpropagation to compute the gradients. Then we'll build a neural network with logistic activation functions and a log loss objective function.
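
Below is a minimal, single-neuron sketch of the idea (not the network built in the video or the code in the linked repo): predictions are sigmoid(X @ w), the objective is log loss, and the weights are updated by repeatedly stepping against the gradient. The names sigmoid, log_loss, and fit are illustrative, not from the video.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(y, p):
    # Mean negative log likelihood of the observed labels
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def fit(X, y, lr=0.1, steps=1000):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)              # forward pass
        grad = X.T @ (p - y) / len(y)   # gradient of log loss w.r.t. w
        w -= lr * grad                  # gradient descent step
    return w

# Toy data: label is 1 when the second feature is positive
X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = fit(X, y)
print(w, log_loss(y, sigmoid(X @ w)))
```

With a small learning rate and enough steps the loss keeps decreasing toward a minimum; the video generalizes this idea to multi-layer networks, where backpropagation applies the chain rule layer by layer.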

0:00 - introduction
0:08 - idea for weight optimization
0:55 - gradient descent
4:35 - current model issues
5:52 - logistic activation function
6:54 - log loss objective function
10:11 - reformulating our goal
11:46 - backpropagation
18:41 - recap
19:53 - helper functions
22:43 - gradient checker (sketched after this list)
24:19 - challenge
24:30 - solution / backprop implementation
26:05 - code demo
27:12 - next steps / generalizations
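
The chapter list mentions a gradient checker. Here is a self-contained sketch of that technique (illustrative, not the repo's helper): compare the analytic gradient of the same single-neuron log loss against a central finite-difference estimate; if the two agree to within numerical precision, the analytic gradient is very likely correct.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w, X, y):
    p = sigmoid(X @ w)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def analytic_gradient(w, X, y):
    return X.T @ (sigmoid(X @ w) - y) / len(y)

def numeric_gradient(f, w, eps=1e-6):
    # Perturb one weight at a time and use a central difference
    g = np.zeros_like(w)
    for i in range(len(w)):
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[i] += eps
        w_minus[i] -= eps
        g[i] = (f(w_plus) - f(w_minus)) / (2 * eps)
    return g

X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = np.random.default_rng(0).normal(size=2)

diff = np.max(np.abs(analytic_gradient(w, X, y)
                     - numeric_gradient(lambda v: loss(v, X, y), w)))
print(diff)  # should be tiny (~1e-9 or smaller) if the analytic gradient is right
```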

-- Code -----------------------
https://github.com/ben519/nnets-for-y...

-- Vids & Playlists ---------------------------------
Google Colab - Introduction to Google Colab
NumPy - Python NumPy For Your Grandma
Pandas - Python Pandas For Your Grandpa
Neural Networks - Neural Networks For Your Dog

-- Subscribe To Mailing List ---------------------------------
https://eepurl.com/hC1Pmj

-- Music ---------------------------------
Elite Estate by The Galaxy News
Amistades by Hola Hola
Groove Protocol by Nu Alkemi$t

-- Support -----------------------
https://merchonate.com/gormanalysis