
Neural networks are everywhere and will shape the world of tomorrow, so it is worth familiarizing ourselves with the topic: doing so creates tremendous opportunities and lays the foundation for a bright future. In this course you will learn about artificial neural networks and how they are used for machine learning, as applied to speech and object recognition, image segmentation, modeling language and human motion, etc.
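
As a taster of the early lectures on simple models of neurons and perceptron learning, here is a minimal, purely illustrative Python sketch; it is not course material, and the toy AND task, learning rate, and epoch count are arbitrary choices made for this example:

    import random

    def step(z):
        # Threshold activation: the unit fires (1) if its total input is non-negative.
        return 1 if z >= 0 else 0

    def train_perceptron(samples, epochs=50, lr=0.1):
        # Learn weights for 2-input binary examples with the perceptron learning rule.
        w = [random.uniform(-1, 1), random.uniform(-1, 1)]
        b = 0.0
        for _ in range(epochs):
            for (x1, x2), target in samples:
                pred = step(w[0] * x1 + w[1] * x2 + b)
                error = target - pred
                # Nudge the weights toward the target whenever the prediction is wrong.
                w[0] += lr * error * x1
                w[1] += lr * error * x2
                b += lr * error
        return w, b

    # Logical AND is linearly separable, so the perceptron can learn it.
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    weights, bias = train_perceptron(data)
    print("learned weights:", weights, "bias:", bias)

Because AND is linearly separable, the perceptron learning rule is guaranteed to find a separating boundary; the lectures return to this point when discussing what perceptrons can't do.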

Assessment

This course does not involve any written exams. To complete the course, students need to answer five assignment questions; the answers are submitted as written work in PDF or Word format. Students can write the answers in their own time. Each answer needs to be around 200 words (one page). Once the answers are submitted, the tutor will check and assess the work.

Certification

Edukite courses are free to study. To successfully complete a course, you must submit all of its assignments as part of the assessment. Upon successful completion of a course, you can choose to make your achievement formal by obtaining your Certificate at a cost of £49.

Having an Official Edukite Certification is a great way to celebrate and share your success. You can:

  • Add the certificate to your CV or resume to boost your career
  • Share it as proof of your success


Course Credit: Open Culture

Course Curriculum

Module: 01
Bayesian optimization of hyper-parameters [Neural Networks for Machine Learning] 00:13:00
The fog of progress [Neural Networks for Machine Learning] 00:02:00
Why do we need machine learning [Neural Networks for Machine Learning] 00:13:00
What are neural networks [Neural Networks for Machine Learning] 00:08:00
Some simple models of neurons [Neural Networks for Machine Learning] 00:08:00
A simple example of learning [Neural Networks for Machine Learning] 00:06:00
Three types of learning [Neural Networks for Machine Learning] 00:08:00
Types of neural network architectures [Neural Networks for Machine Learning] 00:07:00
Perceptrons: first-generation neural networks [Neural Networks for Machine Learning] 00:08:00
A geometrical view of perceptrons [Neural Networks for Machine Learning] 00:06:00
Why the learning works [Neural Networks for Machine Learning] 00:05:00
Module: 02
What perceptrons can’t do [Neural Networks for Machine Learning] 00:15:00
Learning the weights of a linear neuron [Neural Networks for Machine Learning] 00:12:00
The error surface for a linear neuron [Neural Networks for Machine Learning] 00:05:00
Learning the weights of a logistic output neuron [Neural Networks for Machine Learning] 00:04:00
The backpropagation algorithm [Neural Networks for Machine Learning] 00:12:00
Using the derivatives from backpropagation [Neural Networks for Machine Learning] 00:10:00
Learning to predict the next word [Neural Networks for Machine Learning] 00:13:00
A brief diversion into cognitive science [Neural Networks for Machine Learning] 00:04:00
The softmax output function [Neural Networks for Machine Learning] 00:07:00
Neuro-probabilistic language models [Neural Networks for Machine Learning] 00:08:00
Dealing with many possible outputs [Neural Networks for Machine Learning] 00:12:00
Module: 03
Why object recognition is difficult [Neural Networks for Machine Learning] 00:05:00
Achieving viewpoint invariance [Neural Networks for Machine Learning] 00:06:00
Convolutional nets for digit recognition [Neural Networks for Machine Learning] 00:16:00
Convolutional nets for object recognition [Neural Networks for Machine Learning] 00:18:00
Overview of mini-batch gradient descent [Neural Networks for Machine Learning] 00:08:00
A bag of tricks for mini-batch gradient descent [Neural Networks for Machine Learning] 00:13:00
The momentum method [Neural Networks for Machine Learning] 00:09:00
Adaptive learning rates for each connection [Neural Networks for Machine Learning] 00:06:00
Rmsprop: normalize the gradient [Neural Networks for Machine Learning] 00:12:00
Modeling sequences: a brief overview [Neural Networks for Machine Learning] 00:17:00
Training RNNs with backpropagation [Neural Networks for Machine Learning] 00:06:00
Module: 04
A toy example of training an RNN [Neural Networks for Machine Learning] 00:06:00
Why it is difficult to train an RNN [Neural Networks for Machine Learning] 00:08:00
Long short-term memory [Neural Networks for Machine Learning] 00:09:00
A brief overview of Hessian-free optimization [Neural Networks for Machine Learning] 00:04:00
Modeling character strings [Neural Networks for Machine Learning] 00:15:00
Predicting the next character using HF [Neural Networks for Machine Learning] 00:12:00
Echo State Networks [Neural Networks for Machine Learning] 00:10:00
Overview of ways to improve generalization [Neural Networks for Machine Learning] 00:12:00
Limiting the size of the weights [Neural Networks for Machine Learning] 00:06:00
Using noise as a regularizer [Neural Networks for Machine Learning] 00:07:00
Introduction to the full Bayesian approach [Neural Networks for Machine Learning] 00:11:00
Module: 05
The Bayesian interpretation of weight decay [Neural Networks for Machine Learning] 00:11:00
MacKay's quick and dirty method [Neural Networks for Machine Learning] 00:01:00
Why it helps to combine models [Neural Networks for Machine Learning] 00:13:00
Mixtures of Experts [Neural Networks for Machine Learning] 00:13:00
The idea of full Bayesian learning [Neural Networks for Machine Learning] 00:07:00
Making full Bayesian learning practical [Neural Networks for Machine Learning] 00:07:00
Dropout [Neural Networks for Machine Learning] 00:09:00
Hopfield Nets [Neural Networks for Machine Learning] 00:13:00
Dealing with spurious minima [Neural Networks for Machine Learning] 00:11:00
Hopfield nets with hidden units [Neural Networks for Machine Learning] 00:10:00
Using stochastic units to improve search [Neural Networks for Machine Learning] 00:10:00
Module: 06
How a Boltzmann machine models data [Neural Networks for Machine Learning] 00:12:00
Boltzmann machine learning [Neural Networks for Machine Learning] 00:12:00
More efficient ways to get the statistics [Neural Networks for Machine Learning] 00:12:00
Restricted Boltzmann Machines [Neural Networks for Machine Learning] 00:11:00
An example of RBM learning [Neural Networks for Machine Learning] 00:07:00
RBMs for collaborative filtering [Neural Networks for Machine Learning] 00:08:00
The ups and downs of backpropagation [Neural Networks for Machine Learning] 00:10:00
Belief Nets [Neural Networks for Machine Learning] 00:13:00
Learning sigmoid belief nets [Neural Networks for Machine Learning] 00:12:00
The wake sleep algorithm [Neural Networks for Machine Learning] 00:13:00
Learning layers of features by stacking RBMs [Neural Networks for Machine Learning] 00:18:00
Module: 07
Discriminative learning for DBNs [Neural Networks for Machine Learning] 00:10:00
Discriminative fine-tuning [Neural Networks for Machine Learning] 00:09:00
Modeling real valued data with an RBM [Neural Networks for Machine Learning] 00:10:00
RBMs are infinite sigmoid belief nets [Neural Networks for Machine Learning] 00:17:00
From PCA to autoencoders [Neural Networks for Machine Learning] 00:08:00
Deep autoencoders [Neural Networks for Machine Learning] 00:04:00
Deep autoencoders for document retrieval [Neural Networks for Machine Learning] 00:08:00
Semantic Hashing [Neural Networks for Machine Learning] 00:09:00
Learning binary codes for image retrieval [Neural Networks for Machine Learning] 00:10:00
Shallow autoencoders for pre-training [Neural Networks for Machine Learning] 00:07:00
Learning a joint model of images and captions [Neural Networks for Machine Learning] 00:09:00
Hierarchical Coordinate Frames [Neural Networks for Machine Learning] 00:10:00
Assessment
Submit Your Assignment 00:00:00
Certification 00:00:00
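
Module 02's lectures centre on the backpropagation algorithm. As a purely illustrative taster (not course material), the following Python sketch trains a tiny one-hidden-layer network of logistic units on XOR using hand-written backpropagation; the hidden-layer size, learning rate, seed, and epoch count are arbitrary choices made for this example:

    import math
    import random

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    # A tiny 2-H-1 network trained on XOR. Very small nets can occasionally get
    # stuck in a poor local minimum of the error surface; re-seed if that happens.
    random.seed(1)
    H = 3  # number of hidden units
    W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
    b1 = [0.0] * H
    W2 = [random.uniform(-1, 1) for _ in range(H)]
    b2 = 0.0
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    lr = 0.5

    for _ in range(5000):
        for (x1, x2), t in data:
            # Forward pass through one hidden layer of logistic units.
            h = [sigmoid(W1[j][0] * x1 + W1[j][1] * x2 + b1[j]) for j in range(H)]
            y = sigmoid(sum(W2[j] * h[j] for j in range(H)) + b2)
            # Backward pass: chain rule through the logistic units (squared error).
            dy = (y - t) * y * (1 - y)
            dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(H)]
            # Gradient-descent weight updates using the backpropagated derivatives.
            for j in range(H):
                W2[j] -= lr * dy * h[j]
                W1[j][0] -= lr * dh[j] * x1
                W1[j][1] -= lr * dh[j] * x2
                b1[j] -= lr * dh[j]
            b2 -= lr * dy

    # Check the learned mapping on all four XOR cases.
    for (x1, x2), t in data:
        h = [sigmoid(W1[j][0] * x1 + W1[j][1] * x2 + b1[j]) for j in range(H)]
        y = sigmoid(sum(W2[j] * h[j] for j in range(H)) + b2)
        print((x1, x2), "->", round(y, 2), "(target", t, ")")

XOR is exactly the kind of task a single perceptron cannot solve, which is why the hidden layer, and hence backpropagation, is needed here.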

Course Reviews

4.7 average rating (9 ratings)

No reviews found for this course.

5 STUDENTS ENROLLED
©2021 Edukite. All Rights Reserved.
Edukite is a part of Ebrahim College (Charity Commission Reg No 110841).