This course is for upper-level graduate students who are planning careers in computational neuroscience. It focuses on the problem of supervised learning from the perspective of modern statistical learning theory, starting with the theory of multivariate function approximation from sparse data. It develops tools such as regularization, including Support Vector Machines for regression and classification, and derives generalization bounds using both stability and VC theory.
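As a flavour of the tools named above, here is a minimal sketch (not course material) of Support Vector Machines for classification and for regression, using scikit-learn's `SVC` and `SVR` on toy data — the library choice and toy problems are illustrative assumptions, not part of the course:

```python
# Illustrative sketch only: SVMs for classification and regression
# on synthetic data, using scikit-learn (an assumed dependency).
import numpy as np
from sklearn.svm import SVC, SVR

rng = np.random.default_rng(0)

# Classification: two well-separated Gaussian clusters in 2D.
X_cls = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y_cls = np.array([0] * 20 + [1] * 20)
clf = SVC(kernel="rbf", C=1.0).fit(X_cls, y_cls)

# Regression: a noisy sine curve, fit with epsilon-insensitive loss.
X_reg = np.linspace(0, 2 * np.pi, 50).reshape(-1, 1)
y_reg = np.sin(X_reg).ravel() + rng.normal(0, 0.1, 50)
reg = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X_reg, y_reg)

print(clf.score(X_cls, y_cls))  # training accuracy on the clusters
print(reg.score(X_reg, y_reg))  # training R^2 on the sine fit
```

Both estimators solve a regularized problem of the kind the course analyzes: the `C` parameter trades off training error against the norm of the function in a reproducing kernel Hilbert space.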
This course does not involve any written exams. To complete the course, students must answer five assignment questions; the answers take the form of written work submitted as a PDF or Word document. Students can write the answers in their own time. Each answer should be around 200 words (one page). Once the answers are submitted, the tutor will check and assess the work.
Edukite courses are free to study. To successfully complete a course, you must submit all the assignments of the course as part of the assessment. Upon successful completion of a course, you can choose to make your achievement formal by obtaining your Certificate at a cost of £49.
Having an Official Edukite Certification is a great way to celebrate and share your success. You can:
- Add the certificate to your CV or resume and brighten up your career
- Show it to prove your success
Course Credit: MIT
| Course Unit | Duration |
|---|---|
| The Course at a Glance | 02:00:00 |
| The Learning Problem in Perspective | 01:15:00 |
| Reproducing Kernel Hilbert Spaces | 01:15:00 |
| Regression and Least-Squares Classification | 00:45:00 |
| Support Vector Machines for Classification | 01:15:00 |
| Unsupervised Learning Techniques | 01:00:00 |
| Boosting and Bagging | 00:30:00 |
| Stability of Tikhonov Regularization | 00:45:00 |
| Uniform Convergence Over Function Classes | 01:00:00 |
| Uniform Convergence for Classification | 00:45:00 |
| Math Camp 1: Functional Analysis | 01:30:00 |
| Math Camp 2: Probability Theory | 00:15:00 |
| Submit Your Assignment | 00:00:00 |