
Deep Learning Specialization

Instructor of the specialization: Andrew Ng

Table of Contents

  1. My Learnings from the Specialization
  2. Instructions to use the repository
  3. Weekly Learning Objective
  4. Results
  5. Disclaimer

My Learnings from the Specialization

In this five-course series, I learned the foundations of deep learning by implementing vectorized neural networks (MLP, CNN, RNN, LSTM) and optimization algorithms (SGD, RMSprop, Adam) from scratch in Python, building and training deep neural networks in TensorFlow and Keras, and identifying the key parameters in a network architecture for hyperparameter tuning.
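For flavour, here is a minimal sketch of the kind of vectorized forward pass implemented from scratch in the first course. It is illustrative only, not the assignment code: the layer sizes, parameter names, and the forward_pass helper are my own.

```python
import numpy as np

def forward_pass(X, params):
    """Vectorized forward pass of a 2-layer MLP: ReLU hidden layer, sigmoid output.

    X has shape (n_features, m_examples); params holds W1, b1, W2, b2.
    """
    Z1 = params["W1"] @ X + params["b1"]   # linear step, shape (n_hidden, m)
    A1 = np.maximum(0, Z1)                 # ReLU activation
    Z2 = params["W2"] @ A1 + params["b2"]  # output pre-activation, shape (1, m)
    A2 = 1.0 / (1.0 + np.exp(-Z2))         # sigmoid activation
    return A2

rng = np.random.default_rng(0)
params = {
    "W1": rng.standard_normal((4, 3)) * 0.01, "b1": np.zeros((4, 1)),
    "W2": rng.standard_normal((1, 4)) * 0.01, "b2": np.zeros((1, 1)),
}
X = rng.standard_normal((3, 5))        # 3 features, 5 examples
print(forward_pass(X, params).shape)   # (1, 5)
```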

I learned best practices for setting up train/dev/test sets and analyzing bias/variance when building DL applications, diagnosed errors in ML systems and applied strategies to reduce them, reasoned about complex ML settings, and used transfer learning for image classification tasks.

I learned to build and train CNN models (YOLO for object detection, U-Net for image segmentation, FaceNet for face verification and recognition) for visual detection and recognition tasks, and to generate artwork through neural style transfer using a pre-trained VGG-19 model. I learned about RNNs, GRUs, LSTMs, and Transformers and applied them to various NLP/sequence tasks: I used RNNs to build a character-level language model that generates dinosaur names, and LSTMs to build a seq2seq model for neural machine translation with attention and a trigger word detection model. I also used pre-trained Transformer models for question-answering and named-entity-recognition tasks.
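The Transformer assignments use the Hugging Face transformers library. The snippet below is a minimal sketch of the two tasks mentioned above; it is not the assignment code, and the default pipeline models (downloaded on first use) are stand-ins for the ones used in the course.

```python
from transformers import pipeline

# Question answering: extract an answer span from a context paragraph.
qa = pipeline("question-answering")
print(qa(question="Who teaches the specialization?",
         context="The Deep Learning Specialization is taught by Andrew Ng."))

# Named entity recognition: tag tokens with entity labels and group them.
ner = pipeline("token-classification", aggregation_strategy="simple")
print(ner("Andrew Ng founded deeplearning.ai in Palo Alto."))
```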

Instructions to use the repository

Using this repository is straightforward: clone it and open the notebooks. It contains all my work for this specialization. Unless specified otherwise, all of the code, quiz questions, screenshots, and images are taken from the Deep Learning Specialization on Coursera.

Weekly Learning Objective

  1. Course 1 - Neural Networks and Deep Learning
  • Course Objective: This course focuses on vectorized implementation of neural networks in Python.

    • Week 1: Introduction to deep learning

      • Be able to explain the major trends driving the rise of deep learning, and understand where and how it is applied today.
    • Week 2: Neural Networks Basics

      • Python Basics with Numpy and Logistic Regression with a Neural Network mindset.
    • Week 3: Shallow neural networks

      • Understand the key parameters in a neural network's architecture. Planar data classification with a hidden layer
    • Week 4: Deep Neural Networks

      • Understand the key computations underlying deep learning, use them to build and train deep neural networks, and apply them to computer vision.
  2. Course 2 - Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
  • Course Objective: This course teaches the "magic" of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance, and be able to more systematically get good results.

    • Week 1: Practical aspects of Deep Learning

      • Understand industry best practices for building deep learning applications. Be able to effectively use common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking, and implement them.
    • Week 2: Optimization algorithms

      • Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, momentum, RMSprop, and Adam, and check their convergence (see the Adam sketch after this list).
    • Week 3: Hyperparameter tuning, Batch Normalization and Programming Frameworks

      • Understand best practices in the deep learning era for setting up train/dev/test sets and analyzing bias/variance.
      • Implement a neural network in TensorFlow (see the Keras sketch after this list).
  3. Course 3 - Structuring Machine Learning Projects
  • Course Objective: This course focuses on diagnosing errors in a machine learning system, prioritizing the most promising directions for reducing error, understanding complex ML settings (such as mismatched training/test sets and comparing to, or surpassing, human-level performance), and applying end-to-end learning, transfer learning, and multi-task learning.

    • There is no programming assignment for this course, but it comes with very interesting case-study quizzes.
  4. Course 4 - Convolutional Neural Networks
  • Course Objective: This course focuses on how to build a convolutional neural network (including recent variations such as residual networks), how to apply convolutional networks to visual detection and recognition tasks, and how to use neural style transfer to generate art.

    • Week 1 - Foundations of Convolutional Neural Networks

      • Build a convolutional model in Python from scratch.
    • Week 2 - Deep convolutional models: case studies

      • Build Residual Network in Keras.
      • Transfer Learning with MobileNet.
    • Week 3 - Object detection

      • Learn how to apply your knowledge of CNNs to one of the toughest but hottest fields of computer vision: object detection. Autonomous driving application - car detection.
      • Image segmentation with U-Net.
    • Week 4 - Special applications: Face recognition & Neural style transfer

      • Discover how CNNs can be applied to multiple fields, including art generation and face recognition.
      • Build a face recognition model for the Happy House. Implement art generation with neural style transfer.
  5. Course 5 - Sequence Models
  • Course Objective: This course focuses on how to build and train Recurrent Neural Networks (RNNs) and commonly used variants such as GRUs and LSTMs, and on applying sequence models to natural language problems (including text synthesis) and to audio applications (including speech recognition and music synthesis).

    • Week 1 - Recurrent Neural Networks

      • Build a Recurrent Neural Network in Python from scratch. Implement character-level language modeling to generate dinosaur names. Improvise a jazz solo with an LSTM network (music generation).
    • Week 2 - Natural Language Processing & Word Embeddings

      • Using word vector representations and embedding layers, you can train recurrent neural networks that perform well on a wide variety of tasks, such as sentiment analysis, named entity recognition, and machine translation.
    • Week 3 - Sequence models & Attention mechanism

      • Sequence models can be augmented using an attention mechanism. This algorithm will help your model understand where it should focus its attention given a sequence of inputs.
      • Implement Neural machine translation with attention and Trigger word detection.
    • Week 4 - Transformer Network

      • Use HuggingFace tokenizers and transformers to perform Named Entity Recognition and Question Answering
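For the Course 2, Week 2 optimizers, the updates are implemented by hand in NumPy. Below is a minimal sketch of a single Adam step under the standard formulation; the function name, hyperparameter defaults, and the toy quadratic example are illustrative, not taken from the assignment.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a parameter array w, given its gradient at step t (t >= 1)."""
    m = beta1 * m + (1 - beta1) * grad            # exponentially weighted mean of gradients
    v = beta2 * v + (1 - beta2) * grad**2         # exponentially weighted mean of squared gradients
    m_hat = m / (1 - beta1**t)                    # bias-corrected first moment
    v_hat = v / (1 - beta2**t)                    # bias-corrected second moment
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)   # parameter update
    return w, m, v

# Toy example: minimize L(w) = ||w||^2 / 2, whose gradient is simply w.
w = np.array([1.0, -2.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 101):
    w, m, v = adam_step(w, w, m, v, t, lr=0.1)
print(w)  # converges toward the minimum at [0, 0]
```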
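For Course 2, Week 3, the TensorFlow assignment builds and trains a small network with the framework doing the differentiation. Here is a minimal Keras sketch of the same idea; the architecture, hyperparameters, and synthetic data are placeholders, not the assignment's.

```python
import numpy as np
import tensorflow as tf

# Synthetic binary-classification data: 1000 examples, 20 features.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 20)).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("float32")

# A small fully connected classifier.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
print(model.evaluate(X, y, verbose=0))  # [loss, accuracy]
```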

Results

Some results from the programming assignments of this specialization:

  • Neural style transfer

  • Image classification using Logistic Regression from scratch in Python

  • Accuracy vs number of hidden layers in MLP for the planar data set

Disclaimer

The solutions uploaded in this repository are for reference only, for when you get stuck somewhere. Please don't use them simply to pass the programming assignments.

About

Implementation of Logistic Regression, MLP, CNN, RNN & LSTM from scratch in Python. Training of deep learning models for image classification, object detection, and sequence processing (including a Transformer implementation) in TensorFlow.
