From abb6b7e39a938fc8b29defe963f714b9bf19414e Mon Sep 17 00:00:00 2001
From: Roman Derbanosov
Date: Fri, 2 Nov 2018 23:31:19 +0300
Subject: [PATCH] delete broken link

---
 week2/week2-NER.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/week2/week2-NER.ipynb b/week2/week2-NER.ipynb
index b73c4101..f6013ec1 100644
--- a/week2/week2-NER.ipynb
+++ b/week2/week2-NER.ipynb
@@ -411,7 +411,7 @@
     "Now, let us specify the layers of the neural network. First, we need to perform some preparatory steps: \n",
     " \n",
     "- Create embeddings matrix with [tf.Variable](https://www.tensorflow.org/api_docs/python/tf/Variable). Specify its name (*embeddings_matrix*), type (*tf.float32*), and initialize with random values.\n",
-    "- Create forward and backward LSTM cells. TensorFlow provides a number of [RNN cells](https://www.tensorflow.org/api_guides/python/contrib.rnn#Core_RNN_Cells_for_use_with_TensorFlow_s_core_RNN_methods) ready for you. We suggest that you use *BasicLSTMCell*, but you can also experiment with other types, e.g. GRU cells. [This](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) blogpost could be interesting if you want to learn more about the differences.\n",
+    "- Create forward and backward LSTM cells. TensorFlow provides a number of RNN cells ready for you. We suggest that you use *BasicLSTMCell*, but you can also experiment with other types, e.g. GRU cells. [This](http://colah.github.io/posts/2015-08-Understanding-LSTMs/) blogpost could be interesting if you want to learn more about the differences.\n",
     "- Wrap your cells with [DropoutWrapper](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/DropoutWrapper). Dropout is an important regularization technique for neural networks. Specify all keep probabilities using the dropout placeholder that we created before.\n",
     " \n",
     "After that, you can build the computation graph that transforms an input_batch:\n",
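
The notebook cell touched by this patch walks through the preparatory steps for a bidirectional LSTM tagger: an embeddings matrix created with `tf.Variable`, forward and backward `BasicLSTMCell` cells wrapped in `DropoutWrapper`, and a computation graph over `input_batch`. Below is a minimal TF 1.x sketch of those steps, assuming placeholders named `input_batch`, `lengths`, and `dropout_ph` were defined earlier in the notebook; the vocabulary size, embedding dimension, and hidden size are illustrative values, not values taken from the assignment.

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x API

# Placeholders assumed to exist from earlier cells (names are assumptions).
input_batch = tf.placeholder(tf.int32, shape=[None, None], name='input_batch')
lengths = tf.placeholder(tf.int32, shape=[None], name='lengths')
dropout_ph = tf.placeholder(tf.float32, shape=[], name='dropout_ph')

vocabulary_size = 20000  # illustrative
embedding_dim = 200      # illustrative
n_hidden_rnn = 200       # illustrative

# Embeddings matrix: named, float32, initialized with random values.
initial_embedding_matrix = np.random.randn(vocabulary_size, embedding_dim) / np.sqrt(embedding_dim)
embedding_matrix_variable = tf.Variable(initial_embedding_matrix,
                                        name='embeddings_matrix',
                                        dtype=tf.float32)

# Forward and backward LSTM cells, each wrapped with DropoutWrapper;
# all keep probabilities come from the dropout placeholder.
forward_cell = tf.nn.rnn_cell.DropoutWrapper(
    tf.nn.rnn_cell.BasicLSTMCell(n_hidden_rnn),
    input_keep_prob=dropout_ph,
    output_keep_prob=dropout_ph,
    state_keep_prob=dropout_ph)
backward_cell = tf.nn.rnn_cell.DropoutWrapper(
    tf.nn.rnn_cell.BasicLSTMCell(n_hidden_rnn),
    input_keep_prob=dropout_ph,
    output_keep_prob=dropout_ph,
    state_keep_prob=dropout_ph)

# Computation graph over input_batch: embedding lookup, then a
# bidirectional dynamic RNN whose outputs are concatenated.
embeddings = tf.nn.embedding_lookup(embedding_matrix_variable, input_batch)
(rnn_output_fw, rnn_output_bw), _ = tf.nn.bidirectional_dynamic_rnn(
    cell_fw=forward_cell,
    cell_bw=backward_cell,
    inputs=embeddings,
    sequence_length=lengths,
    dtype=tf.float32)
rnn_output = tf.concat([rnn_output_fw, rnn_output_bw], axis=2)
```

The sketch uses `tf.nn.rnn_cell` rather than the `tf.contrib.rnn` path linked in the cell; in TF 1.x the two expose the same `BasicLSTMCell` and `DropoutWrapper`, and the `tf.nn` path avoids the since-removed contrib namespace.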