The Deep Learning Specialization is a foundational program that will help us understand the capabilities, challenges, and consequences of deep learning and prepare us to participate in the development of leading-edge AI technology.
In this Specialization, we will build and train neural network architectures such as Convolutional Neural Networks, Recurrent Neural Networks, LSTMs, and Transformers, and learn how to make them better with strategies such as Dropout, BatchNorm, Xavier/He initialization, and more. We will master theoretical concepts and their industry applications using Python and TensorFlow, and tackle real-world cases such as speech recognition, music synthesis, chatbots, machine translation, natural language processing, and more.
Specialization certificate: https://www.coursera.org/account/accomplishments/specialization/6A49H5M2LRWW
Course 1: Neural Networks and Deep Learning
In the first course of the Deep Learning Specialization, we will study the foundational concepts of neural networks and deep learning.
- Week 1 - Introduction to Deep Learning: Understand the significant technological trends driving deep learning development and where and how it’s applied.
- Week 2 - Neural Networks Basics: Set up a machine learning problem with a neural network mindset and use vectorization to speed up your models.
- Week 3 - Shallow Neural Networks: Build a neural network with one hidden layer using forward propagation and backpropagation (a minimal NumPy sketch follows this list).
- Week 4 - Deep Neural Networks: Understand the key computations underlying deep learning, use them to build and train deep neural networks, and apply them to computer vision.
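As a companion to Weeks 2-4, below is a minimal NumPy sketch of a one-hidden-layer network trained with vectorized forward propagation and backpropagation. The toy data, layer sizes, and learning rate are illustrative assumptions, not the course's assignment setup.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def init_params(n_x, n_h, n_y):
    # Small random weights, zero biases (better scaling rules are covered in Course 2)
    return {
        "W1": np.random.randn(n_h, n_x) * 0.01, "b1": np.zeros((n_h, 1)),
        "W2": np.random.randn(n_y, n_h) * 0.01, "b2": np.zeros((n_y, 1)),
    }

def forward(X, p):
    # Vectorized forward propagation over all m examples at once
    Z1 = p["W1"] @ X + p["b1"]
    A1 = np.tanh(Z1)
    Z2 = p["W2"] @ A1 + p["b2"]
    A2 = sigmoid(Z2)
    return A1, A2

def backward(X, Y, A1, A2, p):
    # Vectorized backpropagation for the binary cross-entropy loss
    m = X.shape[1]
    dZ2 = A2 - Y
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.sum(axis=1, keepdims=True) / m
    dZ1 = (p["W2"].T @ dZ2) * (1 - A1 ** 2)  # tanh'(Z1) = 1 - A1^2
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.sum(axis=1, keepdims=True) / m
    return {"W1": dW1, "b1": db1, "W2": dW2, "b2": db2}

# Toy data: 2 features, 200 examples, labels shaped (1, m)
X = np.random.randn(2, 200)
Y = (X[0:1] * X[1:2] > 0).astype(float)
p = init_params(n_x=2, n_h=4, n_y=1)
for i in range(1000):
    A1, A2 = forward(X, p)
    grads = backward(X, Y, A1, A2, p)
    for k in p:
        p[k] -= 0.5 * grads[k]  # plain gradient descent step
```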
Course 2: Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
In the second course of the Deep Learning Specialization, we will open the deep learning black box to understand the processes that drive performance and generate good results systematically.
- Week 1 - Practical Aspects of Deep Learning: Discover and experiment with various initialization methods, apply L2 regularization and dropout to avoid model overfitting, and use gradient checking to identify errors in a fraud detection model (several of these techniques appear in the sketch after this list).
- Week 2 - Optimization Algorithms: Develop your deep learning toolbox by adding more advanced optimizations, random mini-batching, and learning rate decay scheduling to speed up your models.
- Week 3 - Hyperparameter Tuning, Batch Normalization and Programming Frameworks: Explore TensorFlow, a deep learning framework that allows you to build neural networks quickly and easily and train a neural network on a TensorFlow dataset.
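The sketch below, written with TensorFlow/Keras, touches several Course 2 topics in one place: He initialization, L2 regularization, dropout, batch normalization, Adam with learning-rate decay, and mini-batch training. The toy data, layer sizes, and hyperparameters are placeholders rather than the assignment configuration.

```python
import numpy as np
import tensorflow as tf

# Toy data standing in for the course's datasets
X = np.random.randn(1000, 20).astype("float32")
y = (X[:, 0] > 0).astype("int32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20,)),
    tf.keras.layers.Dense(
        64, activation="relu",
        kernel_initializer="he_normal",                      # He initialization (Week 1)
        kernel_regularizer=tf.keras.regularizers.l2(1e-3)),  # L2 regularization (Week 1)
    tf.keras.layers.BatchNormalization(),                    # batch norm (Week 3)
    tf.keras.layers.Dropout(0.2),                            # dropout (Week 1)
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Adam optimizer with exponential learning-rate decay (Week 2)
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=100, decay_rate=0.96)
model.compile(optimizer=tf.keras.optimizers.Adam(lr_schedule),
              loss="binary_crossentropy", metrics=["accuracy"])

# Random mini-batching (Week 2): batch_size sets the mini-batch size
model.fit(X, y, batch_size=64, epochs=5, validation_split=0.2)
```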
Course 3: Structuring Machine Learning Projects
In the third course of the Deep Learning Specialization, we will learn how to build a successful machine learning project and get to practice decision-making as a machine learning project leader.
- ML Strategy [1] - Strategic Guidelines and Human-level Performance: Streamline and optimize your ML production workflow by implementing strategic guidelines for goal-setting and by using human-level performance to help define key priorities.
- ML Strategy [2] - Error Analysis & End-to-end Deep Learning: Develop time-saving error analysis procedures to evaluate the most worthwhile options to pursue, and gain intuition for how to split your data and when to use multi-task, transfer, and end-to-end deep learning (a toy error-analysis tally follows this list).
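As a rough illustration of the error-analysis procedure from ML Strategy [2], the snippet below tallies hand-labeled error categories from a hypothetical dev set; the categories and counts are invented, and the point is only that the largest shares bound how much accuracy each fix could recover.

```python
from collections import Counter

# Hypothetical tags assigned while manually reviewing misclassified dev-set examples
error_tags = [
    ["blurry"], ["mislabeled"], ["blurry", "dark"], ["dark"],
    ["mislabeled"], ["blurry"], ["other"],
]

counts = Counter(tag for tags in error_tags for tag in tags)
total = len(error_tags)
for tag, n in counts.most_common():
    # The share of errors in a category is a ceiling on the gain from fixing it
    print(f"{tag:<12} {n / total:6.1%} of errors")
```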
Course 4: Convolutional Neural Networks
In the fourth course of the Deep Learning Specialization, we will understand how computer vision has evolved and become familiar with its exciting applications such as autonomous driving, face recognition, reading radiology images, and more.
- Week 1 - Foundations of Convolutional Neural Networks: Implement the foundational layers of CNNs (pooling, convolutions) and stack them properly in a deep network to solve multi-class image classification problems (see the sketch after this list).
- Week 2 - Deep Convolutional Models - Case Studies: Discover practical techniques and methods used in research papers to apply transfer learning to your own deep CNN.
- Week 3 - Object Detection: Apply your knowledge of CNNs to two computer vision tasks, object detection and semantic segmentation, using self-driving car datasets.
- Week 4 - Special Applications: Face recognition & Neural Style Transfer: Discover how CNNs can be applied to multiple fields, including art generation and face recognition, and implement your own algorithm to generate art and recognize faces.
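In the spirit of Weeks 1-2, here is a small Keras sketch: a conv/pool stack for multi-class image classification, followed by a transfer-learning head on a frozen pretrained backbone. The input shapes, number of classes, and the choice of MobileNetV2 are illustrative assumptions, not the assignment models.

```python
import tensorflow as tf

num_classes = 6  # placeholder class count

# Week 1: stack convolution and pooling layers for multi-class classification
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(16, kernel_size=3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Conv2D(32, kernel_size=3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Week 2: transfer learning - freeze a pretrained backbone and train only a new head
base = tf.keras.applications.MobileNetV2(input_shape=(96, 96, 3),
                                         include_top=False, weights="imagenet")
base.trainable = False
transfer_model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
```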
Course 5: Sequence Models
In the fifth course of the Deep Learning Specialization, we will become familiar with sequence models and their exciting applications such as speech recognition, music synthesis, chatbots, machine translation, natural language processing (NLP), and more.
- Week 1 - Recurrent Neural Networks: Discover recurrent neural networks (RNNs) and several of their variants, including LSTMs, GRUs and Bidirectional RNNs, all models that perform exceptionally well on temporal data.
- Week 2 - Natural Language Processing and Word Embeddings: Use word vector representations and embedding layers to train recurrent neural networks with outstanding performance across a wide variety of applications, including sentiment analysis, named entity recognition, and neural machine translation (a minimal sketch follows this list).
- Week 3 - Sequence Models and the Attention Mechanism: Augment your sequence models with an attention mechanism, an algorithm that helps your model decide where to focus given a sequence of inputs, and explore speech recognition and how to work with audio data.
- Week 4 - Transformers: Build the Transformer architecture and apply it to natural language processing (NLP) tasks such as named entity recognition (NER) and question answering (QA).
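To make Weeks 1-2 concrete, here is a minimal Keras sketch of an embedding layer feeding a bidirectional LSTM for a sentiment-style binary classifier; the vocabulary size, sequence length, and task are assumptions made for illustration.

```python
import tensorflow as tf

vocab_size, seq_len = 10_000, 50  # placeholder vocabulary and sequence length

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(seq_len,), dtype="int32"),          # token ids
    tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=64),  # word embeddings (Week 2)
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),         # bidirectional LSTM (Week 1)
    tf.keras.layers.Dense(1, activation="sigmoid"),                  # e.g. sentiment score
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```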
The solutions to the assignments uploaded here are only for reference.
- In the Convolutional Neural Networks course (Course 4), some datasets and pretrained models were removed from the assignments to keep the downloads smaller.