diff --git a/.gitignore b/.gitignore
new file mode 100644
index 0000000..5509140
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1 @@
+*.DS_Store
diff --git a/_config.yml b/_config.yml
index 3e1fd7b..e66a707 100644
--- a/_config.yml
+++ b/_config.yml
@@ -1,5 +1,5 @@
 # Site settings
-title: Machine Learning
+title: "MIPT: Machine Learning, part 2"
 email: ml.course.mipt@gmail.com
 description: "Moscow Institute of Physics and Technology"
 baseurl: ""
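A note on the _config.yml hunk above: YAML parses an unquoted value containing ": " as the start of a nested mapping, so a title with a colon in it must be quoted or Jekyll will refuse to load the config with a "mapping values are not allowed" error. A minimal sketch of the affected settings, using only the keys already present in the hunk:

    # _config.yml
    # The value itself contains ": ", so it must be quoted to stay a single string.
    title: "MIPT: Machine Learning, part 2"
    email: ml.course.mipt@gmail.com
    description: "Moscow Institute of Physics and Technology"
    baseurl: ""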
diff --git a/index.html b/index.html
index 6f0afb3..02b3004 100644
--- a/index.html
+++ b/index.html
@@ -4,87 +4,15 @@
- These notes accompany the Stanford CS class CS231n: Convolutional Neural Networks for Visual Recognition.
+ These notes accompany the class MIPT: Machine Learning, part 2.
- For questions/concerns/bug reports contact Justin Johnson regarding the assignments, or contact Andrej Karpathy regarding the course notes. You can also submit a pull request directly to our git repo.
+ rusles
- We encourage the use of the hypothes.is extension to annotate comments and discuss these notes inline.
 Winter 2016 Assignments
 Module 0: Preparation
 Module 1: Neural Networks
@@ -96,94 +24,5 @@
- Linear classification: Support Vector Machine, Softmax
- parametric approach, bias trick, hinge loss, cross-entropy loss, L2 regularization, web demo
- Optimization: Stochastic Gradient Descent
- optimization landscapes, local search, learning rate, analytic/numerical gradient
- Backpropagation, Intuitions
- chain rule interpretation, real-valued circuits, patterns in gradient flow
- Neural Networks Part 1: Setting up the Architecture
- model of a biological neuron, activation functions, neural net architecture, representational power
- Neural Networks Part 2: Setting up the Data and the Loss
- preprocessing, weight initialization, batch normalization, regularization (L2/dropout), loss functions
- Neural Networks Part 3: Learning and Evaluation
- gradient checks, sanity checks, babysitting the learning process, momentum (+nesterov), second-order methods, Adagrad/RMSprop, hyperparameter optimization, model ensembles
- Putting it together: Minimal Neural Network Case Study
- minimal 2D toy data example
 Module 2: Convolutional Neural Networks
- Convolutional Neural Networks: Architectures, Convolution / Pooling Layers
- layers, spatial arrangement, layer patterns, layer sizing patterns, AlexNet/ZFNet/VGGNet case studies, computational considerations
- Understanding and Visualizing Convolutional Neural Networks
- tSNE embeddings, deconvnets, data gradients, fooling ConvNets, human comparisons
- Transfer Learning and Fine-tuning Convolutional Neural Networks