
# deep learning

Hands-on deep learning practice in PyTorch, working through core concepts from shallow networks to modern architectures. Notebooks follow the progression in Deep Learning Illustrated (Krohn, Beyleveld, Bassens) with extensions into transformers and beyond.

## Stack

- Python, PyTorch, torchvision, matplotlib
- Dataset: MNIST (digit classification throughout fundamentals)

## Notebook Index

### Foundations (Deep Learning Illustrated)

- `1_shallow_net.ipynb`: Shallow neural network — forward pass, weights, biases
- `2_activation_functions.ipynb`: Sigmoid, tanh, ReLU — comparison and intuition
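The kind of shallow network covered in `1_shallow_net.ipynb` can be sketched as follows (layer sizes here are illustrative assumptions, not taken from the notebook): one dense hidden layer with a sigmoid activation, mapping flattened 28x28 MNIST images to 10 digit classes.

```python
import torch
import torch.nn as nn

# A shallow net: one hidden dense layer, then an output layer.
model = nn.Sequential(
    nn.Flatten(),        # (batch, 1, 28, 28) -> (batch, 784)
    nn.Linear(784, 64),  # weights w and biases b of the hidden layer
    nn.Sigmoid(),        # activation a = sigmoid(w·x + b)
    nn.Linear(64, 10),   # output layer: one logit per digit class
)

# Forward pass on a dummy batch of 32 images.
x = torch.randn(32, 1, 28, 28)
logits = model(x)
print(logits.shape)  # torch.Size([32, 10])
```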

In progress:

- Backpropagation & gradient descent
- Convolutional Neural Networks (CNNs)
- Recurrent Neural Networks (RNNs)
- LSTMs
- Transformers

Concepts covered:

- Parameters: weight `w`, bias `b`, activation `a`
- Artificial neurons: sigmoid, tanh, ReLU
- Layers: input layer, hidden layer, output layer
- Layer types: dense/fully connected, softmax
- Forward propagation
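The activations and the softmax output listed above can be compared directly at the same pre-activation values, a minimal sketch:

```python
import torch

z = torch.tensor([-2.0, 0.0, 2.0])

# The three classic neuron activations, evaluated at the same inputs z.
print(torch.sigmoid(z))  # squashes to (0, 1); sigmoid(0) = 0.5
print(torch.tanh(z))     # squashes to (-1, 1), zero-centred
print(torch.relu(z))     # clips negatives to 0, identity for z > 0

# Softmax turns a vector of output-layer logits into a probability
# distribution over classes: entries are positive and sum to 1.
p = torch.softmax(z, dim=0)
print(p.sum())  # ≈ 1
```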

