Commit: init
senya-ashukha committed Aug 31, 2016
1 parent 161d0a7 commit a0a74e4
Showing 3 changed files with 4 additions and 164 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -0,0 +1 @@
+*.DS_Store
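The one-line ignore rule above can be sanity-checked against a throwaway repository; a minimal sketch using `git check-ignore` from the standard library (assumes `git` is on `PATH`; the temp-repo setup is illustrative, not part of the commit):

```python
# Sketch: confirm that the pattern "*.DS_Store" ignores macOS Finder
# metadata files at any depth of the working tree (assumes git is installed).
import os
import subprocess
import tempfile

repo = tempfile.mkdtemp()
subprocess.run(["git", "init", "-q", repo], check=True)
with open(os.path.join(repo, ".gitignore"), "w") as f:
    f.write("*.DS_Store\n")

for path in [".DS_Store", "assets/.DS_Store"]:
    # check-ignore exits 0 when the path would be ignored
    r = subprocess.run(["git", "-C", repo, "check-ignore", "-q", path])
    print(path, "->", "ignored" if r.returncode == 0 else "not ignored")
```

Because the leading `*` may match the empty string, the pattern also covers a bare `.DS_Store` at the repository root, not just files with a prefix before the extension.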
2 changes: 1 addition & 1 deletion _config.yml
@@ -1,5 +1,5 @@
 # Site settings
-title: Maсhine Learning
+title: MIPT: Maсhine Learning, part 2
 email: [email protected]
 description: "Moscow Institute of Physics and Technology"
 baseurl: ""
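One detail in the new config value is worth flagging: a `key: value` line whose value itself contains an unquoted `": "` is a YAML syntax error in most parsers, so `title: MIPT: Maсhine Learning, part 2` may fail to load unless the value is quoted. A minimal check (assumes the third-party PyYAML package; Jekyll's Ruby parser rejects the same input):

```python
# Sketch: a second unquoted ": " inside a YAML scalar is rejected by the
# parser; quoting the whole value fixes it (assumes PyYAML is installed).
import yaml

broken = "title: MIPT: Machine Learning, part 2"
try:
    yaml.safe_load(broken)
    print("parsed")
except yaml.YAMLError:
    print("parse error")  # the bare colon makes the line ambiguous

fixed = 'title: "MIPT: Machine Learning, part 2"'
print(yaml.safe_load(fixed)["title"])
```

Quoting the value in `_config.yml` would make the new title unambiguous without changing what the site displays.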
165 changes: 2 additions & 163 deletions index.html
@@ -4,87 +4,15 @@

 <div>

-These notes accompany the Stanford CS class <a href="http://cs231n.stanford.edu/">CS231n: Convolutional Neural Networks for Visual Recognition</a>.
+These notes accompany for the class <a href="https://ml-mipt.github.io/">MIPT: Maсhine Learning, part 2</a>.
 <br>
-For questions/concerns/bug reports regarding contact <a href="http://cs.stanford.edu/people/jcjohns/">Justin Johnson</a> regarding the assignments, or contact <a href="cs.stanford.edu/people/karpathy/">Andrej Karpathy</a> regarding the course notes. You can also submit a pull request directly to our <a href="https://github.com/cs231n/cs231n.github.io">git repo</a>.
+rusles
 <br>
 We encourage the use of the <a href="https://hypothes.is/">hypothes.is</a> extension to annote comments and discuss these notes inline.
 </div>

 <div class="home">
 <div class="materials-wrap">

-<div class="module-header">Winter 2016 Assignments</div>
-
-<div class="materials-item">
-<a href="assignments2016/assignment1/">
-Assignment #1: Image Classification, kNN, SVM, Softmax, Neural Network
-</a>
-</div>
-
-<div class="materials-item">
-<a href="assignments2016/assignment2/">
-Assignment #2: Fully-Connected Nets, Batch Normalization, Dropout,
-Convolutional Nets
-</a>
-</div>
-
-<div class="materials-item">
-<a href="assignments2016/assignment3/">
-Assignment #3: Recurrent Neural Networks, Image Captioning,
-Image Gradients, DeepDream
-</a>
-</div>
-
-<!--
-<div class="module-header">Winter 2015 Assignments</div>
-<div class="materials-item">
-<a href="assignment1/">
-Assignment #1: Image Classification, kNN, SVM, Softmax
-</a>
-</div>
-<div class="materials-item">
-<a href="assignment2/">
-Assignment #2: Neural Networks, ConvNets I
-</a>
-</div>
-<div class="materials-item">
-<a href="assignment3/">
-Assignment #3: ConvNets II, Transfer Learning, Visualization
-</a>
-</div>
--->
-
-<div class="module-header">Module 0: Preparation</div>
-
-<div class="materials-item">
-<a href="python-numpy-tutorial/">
-Python / Numpy Tutorial
-</a>
-</div>
-
-<div class="materials-item">
-<a href="ipython-tutorial/">
-IPython Notebook Tutorial
-</a>
-</div>
-
-<div class="materials-item">
-<a href="terminal-tutorial/">
-Terminal.com Tutorial
-</a>
-</div>
-
-<div class="materials-item">
-<a href="aws-tutorial/">
-AWS Tutorial
-</a>
-</div>
-
-<!-- hardcoding items here to force a specific order -->
-<div class="module-header">Module 1: Neural Networks</div>
-
-<div class="materials-item">
@@ -96,94 +24,5 @@
 </div>
 </div>

-<div class="materials-item">
-<a href="linear-classify/">
-Linear classification: Support Vector Machine, Softmax
-</a>
-<div class="kw">
-parameteric approach, bias trick, hinge loss, cross-entropy loss, L2 regularization, web demo
-</div>
-</div>
-
-<div class="materials-item">
-<a href="optimization-1/">
-Optimization: Stochastic Gradient Descent
-</a>
-<div class="kw">
-optimization landscapes, local search, learning rate, analytic/numerical gradient
-</div>
-</div>
-
-<div class="materials-item">
-<a href="optimization-2/">
-Backpropagation, Intuitions
-</a>
-<div class="kw">
-chain rule interpretation, real-valued circuits, patterns in gradient flow
-</div>
-</div>
-
-<div class="materials-item">
-<a href="neural-networks-1/">
-Neural Networks Part 1: Setting up the Architecture
-</a>
-<div class="kw">
-model of a biological neuron, activation functions, neural net architecture, representational power
-</div>
-</div>
-
-<div class="materials-item">
-<a href="neural-networks-2/">
-Neural Networks Part 2: Setting up the Data and the Loss
-</a>
-<div class="kw">
-preprocessing, weight initialization, batch normalization, regularization (L2/dropout), loss functions
-</div>
-</div>
-
-<div class="materials-item">
-<a href="neural-networks-3/">
-Neural Networks Part 3: Learning and Evaluation
-</a>
-<div class="kw">
-gradient checks, sanity checks, babysitting the learning process, momentum (+nesterov), second-order methods, Adagrad/RMSprop, hyperparameter optimization, model ensembles
-</div>
-</div>
-
-<div class="materials-item">
-<a href="neural-networks-case-study/">
-Putting it together: Minimal Neural Network Case Study
-</a>
-<div class="kw">
-minimal 2D toy data example
-</div>
-</div>
-
-<div class="module-header">Module 2: Convolutional Neural Networks</div>
-
-<div class="materials-item">
-<a href="convolutional-networks/">
-Convolutional Neural Networks: Architectures, Convolution / Pooling Layers
-</a>
-<div class="kw">
-layers, spatial arrangement, layer patterns, layer sizing patterns, AlexNet/ZFNet/VGGNet case studies, computational considerations
-</div>
-</div>
-
-<div class="materials-item">
-<a href="understanding-cnn/">
-Understanding and Visualizing Convolutional Neural Networks
-</a>
-<div class="kw">
-tSNE embeddings, deconvnets, data gradients, fooling ConvNets, human comparisons
-</div>
-</div>
-
-<div class="materials-item">
-<a href="transfer-learning/">
-Transfer Learning and Fine-tuning Convolutional Neural Networks
-</a>
-</div>
-
 </div>
 </div>
