The application of Artificial Intelligence is now visible across healthcare systems, personalized education, precision agriculture, and even self-driving cars. This course will help you learn about the technologies behind these applications.

This highly sought-after course, popular in many countries, specializes in neural networks and deep learning, hyperparameter tuning, regularization and optimization, structuring machine learning projects, convolutional neural networks, and natural language processing, so you can build amazing things.

**Assessment**

This course does not involve any written exams. To complete the course, students must answer 5 assignment questions; answers are submitted as written work in PDF or Word format. Students can write the answers in their own time. Each answer should be around 200 words (1 page). Once the answers are submitted, the tutor will check and assess the work.

**Certification**

Edukite courses are free to study. To successfully complete a course, you must submit all of the course's assignments as part of the assessment. Upon successful completion of a course, you can choose to make your achievement formal by obtaining your Certificate at a cost of £49.

Having an Official Edukite Certification is a great way to celebrate and share your success. You can:

- Add the certificate to your CV or resume and brighten up your career
- Show it as proof of your success

Course Credit: Deep Learning AI

### Course Curriculum

**Module 01**

| Lesson | Duration |
| --- | --- |
| Neural Networks – Welcome | 00:06:00 |
| Neural Networks – Supervised Learning with Neural Networks | 00:00:00 |
| Neural Networks – Why is Deep Learning taking off | 00:10:00 |
| Neural Networks – About this Course | 00:02:00 |
| Neural Networks – Geoffrey Hinton interview | 00:40:00 |
| Neural Networks – Binary Classification | 00:08:00 |
| Neural Networks – Logistic Regression | 00:06:00 |
| Neural Networks – Logistic Regression Cost Function | 00:08:00 |
| Neural Networks – Gradient Descent | 00:11:00 |
| Neural Networks – Derivatives | 00:07:00 |
| Neural Networks – More Derivative Examples | 00:10:00 |
| Neural Networks – Computation graph | 00:00:00 |
| Neural Networks – Derivatives with a Computation Graph | 00:15:00 |

**Module 02**

| Lesson | Duration |
| --- | --- |
| Neural Networks – Logistic Regression Gradient Descent | 00:07:00 |
| Neural Networks – Gradient Descent on m Examples | 00:08:00 |
| Neural Networks – Vectorization | 00:08:00 |
| Neural Networks – More Vectorization Examples | 00:06:00 |
| Neural Networks – Vectorizing Logistic Regression | 00:08:00 |
| Neural Networks – Vectorizing Logistic Regression's Gradient Output | 00:10:00 |
| Neural Networks – Broadcasting in Python | 00:11:00 |
| Neural Networks – A note on python/numpy vectors | 00:07:00 |
| Neural Networks – Quick tour of Jupyter/iPython Notebooks | 00:04:00 |
| Neural Networks – Explanation of logistic regression cost function | 00:07:00 |
| Neural Networks – Pieter Abbeel interview | 00:16:00 |
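As an illustrative sketch (not part of the course material), the vectorization and broadcasting topics in Module 02 boil down to computing predictions for all m training examples with a single matrix product instead of a Python loop. The variable names below (`w`, `b`, `X`) follow the common convention of weights, bias, and an (n, m) feature matrix:

```python
import numpy as np

def sigmoid(z):
    # Logistic function applied elementwise.
    return 1.0 / (1.0 + np.exp(-z))

def forward(w, b, X):
    """Vectorized logistic regression over m examples at once.
    w: (n, 1) weight vector, b: scalar bias, X: (n, m) feature matrix."""
    Z = w.T @ X + b      # broadcasting adds b to every column of w.T @ X
    return sigmoid(Z)    # (1, m) row of predictions in (0, 1)

# Toy example: 2 features, 3 examples.
w = np.array([[0.5], [-0.25]])
X = np.array([[1.0, 2.0, 0.0],
              [0.0, 4.0, 2.0]])
A = forward(w, 0.1, X)
print(A.shape)  # (1, 3)
```

The scalar `b` is stretched across the whole `(1, m)` row by NumPy's broadcasting rules, which is exactly the behavior the "Broadcasting in Python" lesson covers.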

**Module 03**

| Lesson | Duration |
| --- | --- |
| Neural Networks – What is a neural network | 00:07:00 |
| Neural Networks – Overview | 00:04:00 |
| Neural Networks – Neural Network Representation | 00:05:00 |
| Neural Networks – Computing a Neural Network's Output | 00:10:00 |
| Neural Networks – Vectorizing across multiple examples | 00:09:00 |
| Neural Networks – Explanation for Vectorized Implementation | 00:08:00 |
| Neural Networks – Activation functions | 00:11:00 |
| Neural Networks – Why do you need non-linear activation functions | 00:06:00 |
| Neural Networks – Derivatives of activation functions | 00:08:00 |
| Neural Networks – Gradient descent for Neural Networks | 00:10:00 |
| Neural Networks – Backpropagation intuition | 00:16:00 |
| Neural Networks – Random Initialization | 00:08:00 |
| Neural Networks – Ian Goodfellow interview | 00:15:00 |

**Module 04**

| Lesson | Duration |
| --- | --- |
| Neural Networks – Deep L-layer neural network | 00:06:00 |
| Neural Networks – Forward Propagation in a Deep Network | 00:07:00 |
| Neural Networks – Getting your matrix dimensions right | 00:11:00 |
| Neural Networks – Why deep representations | 00:11:00 |
| Neural Networks – Building blocks of deep neural networks | 00:09:00 |
| Neural Networks – Forward and Backward Propagation | 00:11:00 |
| Neural Networks – Parameters vs Hyperparameters | 00:07:00 |
| Neural Networks – What does this have to do with the brain | 00:03:00 |
| Neural Networks – Train/Dev/Test sets | 00:12:00 |
| Neural Networks – Bias/Variance | 00:09:00 |
| Neural Networks – Basic Recipe for Machine Learning | 00:06:00 |
| Neural Networks – Regularization | 00:10:00 |
| Neural Networks – Why regularization reduces overfitting | 00:07:00 |
| Neural Networks – Dropout Regularization | 00:09:00 |

**Module 05**

| Lesson | Duration |
| --- | --- |
| Neural Networks – Understanding Dropout | 00:07:00 |
| Neural Networks – Other regularization methods | 00:08:00 |
| Neural Networks – Normalizing inputs | 00:06:00 |
| Neural Networks – Vanishing/Exploding gradients | 00:06:00 |
| Neural Networks – Weight Initialization for Deep Networks | 00:06:00 |
| Neural Networks – Numerical approximation of gradients | 00:07:00 |
| Neural Networks – Gradient checking | 00:07:00 |
| Neural Networks – Gradient Checking Implementation Notes | 00:05:00 |
| Neural Networks – Yoshua Bengio interview | 00:26:00 |
| Neural Networks – Mini-batch gradient descent | 00:11:00 |
| Neural Networks – Understanding mini-batch gradient descent | 00:11:00 |
| Neural Networks – Exponentially weighted averages | 00:06:00 |
| Neural Networks – Understanding exponentially weighted averages | 00:10:00 |
| Neural Networks – Bias correction in exponentially weighted averages | 00:04:00 |
| Neural Networks – Gradient descent with momentum | 00:09:00 |
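As an illustrative sketch of Module 05's gradient-checking lessons (names and the example cost function below are chosen for illustration, not taken from the course), an analytic gradient can be compared against a centered-difference numerical estimate:

```python
import numpy as np

def grad_check(f, grad_f, theta, eps=1e-7):
    """Compare an analytic gradient grad_f against a centered-difference
    approximation of f's gradient at theta; return the relative difference."""
    num = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += eps
        minus[i] -= eps
        num[i] = (f(plus) - f(minus)) / (2 * eps)
    ana = grad_f(theta)
    # Values around 1e-7 or below usually indicate a correct gradient.
    return np.linalg.norm(num - ana) / (np.linalg.norm(num) + np.linalg.norm(ana))

# Check J(theta) = sum(theta**2), whose true gradient is 2 * theta.
theta = np.array([1.0, -2.0, 3.0])
diff = grad_check(lambda t: np.sum(t**2), lambda t: 2 * t, theta)
print(diff < 1e-7)  # True
```

This is typically run once on a small network to debug backpropagation, then switched off, since the loop over parameters is far too slow for training.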

**Module 06**

| Lesson | Duration |
| --- | --- |
| Neural Networks – RMSprop | 00:08:00 |
| Neural Networks – Adam optimization algorithm | 00:07:00 |
| Neural Networks – Learning rate decay | 00:07:00 |
| Neural Networks – The problem of local optima | 00:05:00 |
| Neural Networks – Yuanqing Lin interview | 00:14:00 |
| Neural Networks – Tuning process | 00:07:00 |
| Neural Networks – Using an appropriate scale to pick hyperparameters | 00:09:00 |
| Neural Networks – Hyperparameters tuning in practice | 00:07:00 |
| Neural Networks – Normalizing activations in a network | 00:09:00 |
| Neural Networks – Fitting Batch Norm into a neural network | 00:13:00 |
| Neural Networks – Why does Batch Norm work | 00:12:00 |
| Neural Networks – Batch Norm at test time | 00:06:00 |
| Neural Networks – Softmax Regression | 00:12:00 |
| Neural Networks – Training a softmax classifier | 00:10:00 |
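As an illustrative sketch of Module 06's softmax lessons (the function below is a generic implementation, not course material), softmax turns a column of raw scores into a probability distribution; subtracting the per-column maximum before exponentiating is the standard numerical-stability trick:

```python
import numpy as np

def softmax(Z):
    """Column-wise softmax over an (n_classes, m) score matrix.
    Subtracting the column max leaves the result unchanged but avoids
    overflow in np.exp for large scores."""
    e = np.exp(Z - Z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

# Two examples, three classes each.
Z = np.array([[2.0, 1.0],
              [1.0, 3.0],
              [0.0, 0.2]])
A = softmax(Z)
print(A.sum(axis=0))  # each column sums to 1
```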

**Module 07**

| Lesson | Duration |
| --- | --- |
| Neural Networks – Deep learning frameworks | 00:04:00 |
| Neural Networks – TensorFlow | 00:16:00 |
| Neural Networks – Why ML Strategy | 00:03:00 |
| Neural Networks – Orthogonalization | 00:11:00 |
| Neural Networks – Single number evaluation metric | 00:07:00 |
| Neural Networks – Satisficing and Optimizing metric | 00:06:00 |
| Neural Networks – Train/dev/test distributions | 00:07:00 |
| Neural Networks – Size of the dev and test sets | 00:06:00 |
| Neural Networks – When to change dev/test sets and metrics | 00:11:00 |
| Neural Networks – Why human level performance | 00:06:00 |
| Neural Networks – Avoidable bias | 00:07:00 |
| Neural Networks – Understanding human level performance | 00:11:00 |
| Neural Networks – Surpassing human level performance | 00:06:00 |
| Neural Networks – Andrej Karpathy interview | 00:15:00 |
| Neural Networks – Improving your model performance | 00:06:00 |

**Module 08**

| Lesson | Duration |
| --- | --- |
| Neural Networks – ML Strategy 2 | 00:11:00 |
| Neural Networks – Cleaning up incorrectly labeled data | 00:13:00 |
| Neural Networks – Training and testing on different distributions | 00:11:00 |
| Neural Networks – Bias and Variance with mismatched data distributions | 00:18:00 |
| Neural Networks – Addressing data mismatch | 00:10:00 |
| Neural Networks – Transfer learning | 00:11:00 |
| Neural Networks – Multi-task learning | 00:13:00 |
| Neural Networks – Whether to use end-to-end deep learning | 00:10:00 |
| Neural Networks – Ruslan Salakhutdinov interview | 00:17:00 |
| Neural Networks – Computer Vision | 00:06:00 |
| Neural Networks – Edge Detection Example | 00:12:00 |
| Neural Networks – More Edge Detection | 00:08:00 |
| Neural Networks – Padding | 00:10:00 |
| Neural Networks – Strided Convolutions | 00:09:00 |
| Neural Networks – Convolutions Over Volume | 00:11:00 |
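As an illustrative sketch of the edge-detection, padding, and strided-convolution lessons above (a naive loop implementation for clarity, not the course's code), a 2D convolution slides a small kernel over an image and sums elementwise products at each position:

```python
import numpy as np

def conv2d(image, kernel, stride=1, pad=0):
    """Naive 2D cross-correlation with optional zero padding and stride.
    Output size per dimension: (size + 2*pad - kernel_size) // stride + 1."""
    if pad:
        image = np.pad(image, pad)
    H, W = image.shape
    kh, kw = kernel.shape
    out_h = (H - kh) // stride + 1
    out_w = (W - kw) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i * stride:i * stride + kh, j * stride:j * stride + kw]
            out[i, j] = np.sum(patch * kernel)
    return out

# A vertical-edge detector responds strongly at the bright-to-dark boundary.
img = np.array([[10, 10, 0, 0]] * 4, dtype=float)   # 4x4, bright left half
edge_kernel = np.array([[1, 0, -1]] * 3, dtype=float)  # 3x3 vertical-edge filter
out = conv2d(img, edge_kernel)
print(out.shape)  # (2, 2)
```

Real frameworks replace these Python loops with highly optimized matrix operations, but the arithmetic is the same.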

**Module 09**

| Lesson | Duration |
| --- | --- |
| Neural Networks – One Layer of a Convolutional Network | 00:16:00 |
| Neural Networks – Simple Convolutional Network Example | 00:09:00 |
| Neural Networks – Pooling Layers | 00:10:00 |
| Neural Networks – CNN Example | 00:13:00 |
| Neural Networks – Why Convolutions | 00:10:00 |
| Neural Networks – Why look at case studies | 00:03:00 |
| Neural Networks – Classic Networks | 00:18:00 |
| Neural Networks – ResNets | 00:07:00 |
| Neural Networks – Why ResNets Work | 00:09:00 |
| Neural Networks – Networks in Networks and 1×1 Convolutions | 00:07:00 |
| Neural Networks – Inception Network Motivation | 00:10:00 |
| Neural Networks – Inception Network | 00:09:00 |
| Neural Networks – Using Open Source Implementation | 00:06:00 |
| Neural Networks – Transfer learning | 00:11:00 |
| Neural Networks – Data Augmentation | 00:10:00 |
| Neural Networks – State of Computer Vision | 00:13:00 |
| Neural Networks – Object Localization | 00:12:00 |
| Neural Networks – Landmark Detection | 00:06:00 |
| Neural Networks – Object Detection | 00:06:00 |

**Module 10**

| Lesson | Duration |
| --- | --- |
| Neural Networks – Convolutional Implementation of Sliding Windows | 00:11:00 |
| Neural Networks – Bounding Box Predictions | 00:15:00 |
| Neural Networks – Intersection Over Union | 00:04:00 |
| Neural Networks – Non-max Suppression | 00:08:00 |
| Neural Networks – Anchor Boxes | 00:10:00 |
| Neural Networks – YOLO Algorithm | 00:07:00 |
| Neural Networks – Region Proposals | 00:06:00 |
| Neural Networks – What is face recognition | 00:05:00 |
| Neural Networks – One Shot Learning | 00:05:00 |
| Neural Networks – Siamese Network | 00:05:00 |
| Neural Networks – Triplet Loss | 00:16:00 |
| Neural Networks – Face Verification and Binary Classification | 00:06:00 |
| Neural Networks – What is neural style transfer | 00:02:00 |
| Neural Networks – What are deep ConvNets learning | 00:08:00 |
| Neural Networks – Cost Function | 00:00:00 |
| Neural Networks – Content Cost Function | 00:04:00 |
| Neural Networks – Style Cost Function | 00:13:00 |
| Neural Networks – 1D and 3D Generalizations | 00:09:00 |
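As an illustrative sketch of the Intersection Over Union and Non-max Suppression lessons in Module 10 (a generic implementation with boxes given as `(x1, y1, x2, y2)` corners, not the course's code), IoU measures box overlap, and NMS greedily keeps the highest-scoring detection while discarding boxes that overlap it too much:

```python
def iou(a, b):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def non_max_suppression(boxes, scores, threshold=0.5):
    """Keep the best-scoring box, drop boxes whose IoU with it exceeds
    the threshold, then repeat on the survivors; return kept indices."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) <= threshold]
    return keep

# Two heavily overlapping boxes and one separate box.
boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)]
scores = [0.9, 0.8, 0.7]
keep = non_max_suppression(boxes, scores)
print(keep)  # [0, 2] — the second box overlaps the first and is suppressed
```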

**Assessment**

| Item | Duration |
| --- | --- |
| Submit Your Assignment | 00:00:00 |
| Certification | 00:00:00 |


**497 STUDENTS ENROLLED**