vivekguthikonda/HAR_DeepLearning
Human Activity Recognition

This case study builds a model that predicts human activities: Walking, Walking_Upstairs, Walking_Downstairs, Sitting, Standing, or Laying. The dataset was collected from 30 persons (referred to as subjects in this dataset), each performing these activities while wearing a smartphone at the waist. The data were recorded by the smartphone's sensors (accelerometer and gyroscope), and the experiment was video-recorded so the data could be labeled manually.

Dataset

Using the smartphone's sensors (gyroscope and accelerometer), the authors captured '3-axial linear acceleration' (tAcc-XYZ) from the accelerometer and '3-axial angular velocity' (tGyro-XYZ) from the gyroscope, with several variations.

  • The prefix 't' in these names denotes a time-domain signal.
  • The suffix 'XYZ' represents 3-axial signals along the X, Y, and Z directions.
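As a rough illustration, the raw signals above can be loaded into a single array with one channel per signal. This is a hypothetical sketch, assuming the standard UCI HAR file layout (an `Inertial Signals/` folder where each signal is stored as a matrix of fixed-width 128-sample windows); the repository's actual loading code may differ.

```python
import numpy as np

# The nine raw signal names: body acceleration, angular velocity (gyro),
# and total acceleration, each along X, Y, Z.
SIGNALS = [
    "body_acc_x", "body_acc_y", "body_acc_z",
    "body_gyro_x", "body_gyro_y", "body_gyro_z",
    "total_acc_x", "total_acc_y", "total_acc_z",
]

def load_signals(split="train", root="UCI HAR Dataset"):
    """Load all nine raw signals for one split and stack them into a
    (num_windows, timesteps, channels) array suitable for 1D conv layers.
    Assumes each file holds a (num_windows, 128) matrix."""
    arrays = [
        np.loadtxt(f"{root}/{split}/Inertial Signals/{name}_{split}.txt")
        for name in SIGNALS
    ]
    # Stack along the last axis: one channel per signal.
    return np.stack(arrays, axis=-1)
```

With this layout, each training example is a window of 128 timesteps with 9 channels, which is the input shape the models below expect.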

Y_Labels (Encoded):
In the dataset, the Y labels are encoded as integers from 1 to 6:

  • WALKING as 1
  • WALKING_UPSTAIRS as 2
  • WALKING_DOWNSTAIRS as 3
  • SITTING as 4
  • STANDING as 5
  • LAYING as 6
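The encoding above can be captured in a small helper for turning the model's integer predictions back into activity names (a hypothetical convenience function, not part of the repository):

```python
# Mapping from the dataset's encoded labels (1-6) to activity names.
ACTIVITY_LABELS = {
    1: "WALKING",
    2: "WALKING_UPSTAIRS",
    3: "WALKING_DOWNSTAIRS",
    4: "SITTING",
    5: "STANDING",
    6: "LAYING",
}

def decode_labels(encoded):
    """Convert a sequence of integer labels (1-6) to activity names."""
    return [ACTIVITY_LABELS[y] for y in encoded]

print(decode_labels([1, 6, 4]))  # ['WALKING', 'LAYING', 'SITTING']
```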

Deep Learning Models:

Three models were trained:

  • One to classify whether an activity is dynamic or static.
  • One for the dynamic activities (Walking, Walking_Upstairs, Walking_Downstairs).
  • One for the static activities (Sitting, Standing, Laying).

The architectures used:

  • A 1D CNN followed by a 1-layer LSTM for the activity-type (dynamic vs. static) classifier.
  • A 1D CNN followed by a 1-layer LSTM for the dynamic activities.
  • Three stacked (1D CNN + MaxPooling) blocks for the static activities.
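A minimal Keras sketch of the first architecture (1D CNN followed by a single LSTM layer) might look like the following. This is an illustration under assumed hyperparameters (filter counts, kernel size, LSTM width), not the repository's exact code; the input shape assumes 128-timestep windows with 9 sensor channels.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn_lstm(num_classes, timesteps=128, channels=9):
    """1D CNN + 1-layer LSTM classifier over fixed-length sensor windows."""
    model = models.Sequential([
        layers.Input(shape=(timesteps, channels)),
        # Convolution extracts short-range temporal features per window.
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        # A single LSTM layer summarizes the feature sequence.
        layers.LSTM(64),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# e.g. build_cnn_lstm(2) for the dynamic-vs-static classifier,
# or build_cnn_lstm(3) for the three dynamic (or three static) activities.
```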

Results:

We achieved:

  • Train accuracy: 0.9880
  • Test accuracy: 0.9681
