Neural networks are a set of algorithms inspired by the functioning of the human brain. Recurrent Neural Networks (RNNs) were among the first state-of-the-art algorithms able to remember previous inputs when processing a large amount of sequential data.
Long Short-Term Memory (LSTM) is an artificial recurrent neural network (RNN) architecture used in deep learning. LSTMs are a special kind of RNN capable of learning long-term dependencies. They were introduced by Hochreiter & Schmidhuber (1997) and were refined and popularized by many people in later work. They work remarkably well on a large variety of problems and are now widely used. LSTMs are explicitly designed to avoid the long-term dependency problem: remembering information for long periods of time is practically their default behaviour, not something they struggle to learn. A minimal model sketch is shown below.
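As a rough illustration of the idea, here is a minimal sketch of an LSTM regressor in Keras. The layer size, the 60-step look-back window, and the dummy data are assumptions for demonstration only, not values taken from this dataset.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Assumed look-back window and number of features (illustrative values)
n_timesteps, n_features = 60, 1

model = Sequential([
    LSTM(50, input_shape=(n_timesteps, n_features)),  # LSTM cells carry state across the sequence
    Dense(1),                                         # single regression output
])
model.compile(optimizer="adam", loss="mse")

# Dummy arrays just to show the expected input shape: (samples, timesteps, features)
X = np.random.rand(100, n_timesteps, n_features)
y = np.random.rand(100, 1)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```

The key point is the 3-D input shape: each sample is a window of consecutive time steps, which is what lets the LSTM learn dependencies across the window.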
Dataset: Household Electric Power Consumption
The description of data can be found here: http://archive.ics.uci.edu/ml/datasets/Individual+household+electric+power+consumption
1. This archive contains 1,485,328 measurements gathered in a house located in Sceaux (7 km from Paris, France) between December 2006 and November 2010 (47 months).
2. The dataset contains some missing values in the measurements (nearly 1.25% of the rows). A sketch of loading this file with pandas is given after the attribute summary below.
Data Set Characteristics: Multivariate, Time-Series
Number of Instances: 1,485,328
Attribute Characteristics: Real
Number of Attributes: 9
Associated Tasks: Regression, Clustering, Analysis
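A sketch of reading the raw UCI file with pandas is given below. The local file name is an assumption; the ';' separator, the '?' marker for missing measurements, and the dd/mm/yyyy date format follow the dataset description linked above.

```python
import pandas as pd

df = pd.read_csv(
    "household_power_consumption.txt",  # assumed local path to the UCI file
    sep=";",
    na_values="?",        # missing measurements are marked with '?'
    low_memory=False,
)

# Combine the Date (dd/mm/yyyy) and Time (hh:mm:ss) columns into a datetime index
df["datetime"] = pd.to_datetime(df["Date"] + " " + df["Time"], format="%d/%m/%Y %H:%M:%S")
df = df.drop(columns=["Date", "Time"]).set_index("datetime")

print(df.shape)          # (rows, 7 measurement columns)
print(df.isna().mean())  # fraction of missing values per column
```

With the datetime index in place, the measurements can be resampled (e.g. to hourly or daily means) before building the supervised look-back windows for the LSTM.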