Built using the fast.ai library.
I used a model pretrained on a bigger dataset (a cleaned subset of Wikipedia called wikitext-103). That model has been trained to guess the next word, its input being all the previous words. It has a recurrent structure and a hidden state that is updated each time it sees a new word. This hidden state thus contains information about the sentence up to that point. This is where unlabelled data becomes useful to us, as we use it to fine-tune the model. The fine-tuned model predicts the next words, as shown in the screenshot above.
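As a minimal sketch of this workflow (the data directory, CSV filename, column name, and training schedule below are assumptions for illustration, not taken from this repo), fastai can load the wikitext-103 pretrained AWD_LSTM, fine-tune it on your own unlabelled text, and then sample next words:

```python
from fastai.text.all import *

# Hypothetical data location: a CSV with one column of raw text.
# For language modelling (is_lm=True) no labels are needed; the
# targets are simply the next words of each sequence.
path = Path('data')
dls_lm = TextDataLoaders.from_csv(path, 'texts.csv', text_col='text', is_lm=True)

# language_model_learner fetches the AWD_LSTM weights pretrained
# on wikitext-103 and attaches a new head for our vocabulary.
learn = language_model_learner(dls_lm, AWD_LSTM, drop_mult=0.3, metrics=accuracy)

# Fine-tune: train the new head first, then unfreeze and train
# the whole recurrent model at a lower learning rate.
learn.fit_one_cycle(1, 2e-2)
learn.unfreeze()
learn.fit_one_cycle(3, 2e-3)

# Generate the next words from a prompt, as in the screenshot above.
print(learn.predict("I liked this movie because", n_words=40))
```

The `drop_mult` argument scales the dropout used throughout the AWD_LSTM; lower values regularize less and suit larger fine-tuning corpora.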