diff --git a/week2/week2-NER.ipynb b/week2/week2-NER.ipynb
index b73c4101..f6013ec1 100644
--- a/week2/week2-NER.ipynb
+++ b/week2/week2-NER.ipynb
@@ -411,7 +411,7 @@
     "Now, let us specify the layers of the neural network. First, we need to perform some preparatory steps: \n",
     " \n",
     "- Create embeddings matrix with [tf.Variable](https://www.tensorflow.org/api_docs/python/tf/Variable). Specify its name (*embeddings_matrix*), type (*tf.float32*), and initialize with random values.\n",
-    "- Create forward and backward LSTM cells. TensorFlow provides a number of [RNN cells](https://www.tensorflow.org/api_guides/python/contrib.rnn#Core_RNN_Cells_for_use_with_TensorFlow_s_core_RNN_methods) ready for you. We suggest that you use *BasicLSTMCell*, but you can also experiment with other types, e.g. GRU cells. [This](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) blogpost could be interesting if you want to learn more about the differences.\n",
+    "- Create forward and backward LSTM cells. TensorFlow provides a number of RNN cells ready for you. We suggest that you use *BasicLSTMCell*, but you can also experiment with other types, e.g. GRU cells. [This](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) blogpost could be interesting if you want to learn more about the differences.\n",
    "- Wrap your cells with [DropoutWrapper](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/DropoutWrapper). Dropout is an important regularization technique for neural networks. Specify all keep probabilities using the dropout placeholder that we created before.\n",
     " \n",
     "After that, you can build the computation graph that transforms an input_batch:\n",
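
For context on the hunk above, here is a minimal TF 1.x sketch of the three preparatory steps the notebook cell describes (the *embeddings_matrix* variable, forward/backward *BasicLSTMCell*s, and *DropoutWrapper*), followed by the embedding lookup that begins the computation graph over *input_batch*. Only *embeddings_matrix* and *input_batch* are named in the diff; the other identifiers and sizes (`dropout_ph`, `lengths`, `vocabulary_size`, `embedding_dim`, `n_hidden_rnn`) are assumptions chosen for illustration, not taken from the assignment.

```python
import numpy as np
import tensorflow as tf  # TF 1.x-era API, matching the links in the notebook

# Hypothetical sizes, chosen only so the sketch runs end to end.
vocabulary_size, embedding_dim, n_hidden_rnn = 20000, 200, 200

# Assumed placeholders: token ids, true sequence lengths, and the scalar
# keep probability ("the dropout placeholder that we created before").
input_batch = tf.placeholder(tf.int32, shape=[None, None], name='input_batch')
lengths = tf.placeholder(tf.int32, shape=[None], name='lengths')
dropout_ph = tf.placeholder_with_default(1.0, shape=[], name='dropout_ph')

# Step 1: embeddings matrix as a tf.Variable named *embeddings_matrix*,
# dtype tf.float32, initialized with random values.
initial_embeddings = np.random.randn(vocabulary_size, embedding_dim) / np.sqrt(embedding_dim)
embeddings_matrix = tf.Variable(initial_embeddings, name='embeddings_matrix', dtype=tf.float32)

# Steps 2-3: forward and backward BasicLSTMCells, each wrapped in a
# DropoutWrapper with every keep probability tied to the dropout placeholder.
def make_cell():
    return tf.nn.rnn_cell.DropoutWrapper(
        tf.nn.rnn_cell.BasicLSTMCell(n_hidden_rnn),
        input_keep_prob=dropout_ph,
        output_keep_prob=dropout_ph,
        state_keep_prob=dropout_ph)

forward_cell, backward_cell = make_cell(), make_cell()

# Start of the computation graph over input_batch: look up embeddings,
# then run both cells over the sequence with a bidirectional dynamic RNN.
embeddings = tf.nn.embedding_lookup(embeddings_matrix, input_batch)
(output_fw, output_bw), _ = tf.nn.bidirectional_dynamic_rnn(
    forward_cell, backward_cell, embeddings,
    sequence_length=lengths, dtype=tf.float32)
```

Driving `input_keep_prob`, `output_keep_prob`, and `state_keep_prob` from the single `dropout_ph` placeholder is one way to read "specify all keep probabilities using the dropout placeholder"; feeding `dropout_ph` a value below 1.0 during training and leaving the default 1.0 at inference switches dropout on and off. The later parts of the cell's graph (combining the two RNN outputs and projecting to tag scores) fall outside this hunk and are omitted.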