




This course provides an undergraduate-level introduction to Statistical Learning. It addresses problems such as classification and detection, parameter and model estimation, and clustering, which are common in signal processing, communications, image processing, computer vision, artificial intelligence, speech analysis and recognition, data mining, computational biology, bioinformatics, etc. 

Instructor:  Nuno Vasconcelos  
nuno@ece.ucsd.edu, EBU1-5602  
office hours: Friday, 9:30a–10:30a  
Text:  Introduction to Machine Learning  
Ethem Alpaydin, MIT Press  
Syllabus:  [pdf]  
Homework:  Problem set 1 [pdf] Not due  
Problem set 2 [pdf, data] Due: Lecture 6  
Problem set 3 [pdf] Due: Lecture 8  
Problem set 4 [pdf, data] Due: Lecture 14  
Problem set 5 [pdf] Due: Lecture 16  
Problem set 6 [pdf] Due: Lecture 18  
Problem set 7 [pdf, libsvm, instructions, example] Due: Lecture 20  
Topics:  Lecture 1: introduction [slides]  
Lecture 2: review of linear algebra [slides]  
Lecture 3: review of linear algebra (continued)  
Lecture 4: review of probability [slides]  
Lecture 5: metrics, whitening, nearest neighbors [slides]  
Lecture 6: Bayes decision rule [slides]  
Lecture 7: Bayes decision rule [slides]  
Lecture 8: Bayes decision rule (continued)  
Lecture 9: midterm review [problems, solutions]  
Lecture 10: midterm  
Lecture 11: Maximum Likelihood Estimation [slides]  
Lecture 12: MLE & Regression [slides]  
Lecture 13: MLE & Regression (continued)  
Lecture 14: Least Squares [slides]  
Lecture 15: Clustering, k-means [slides]  
Lecture 16: Clustering, EM [slides]  
Lecture 17: Principal component analysis [slides]  
Lecture 18: Kernels [slides]  
Lecture 19: Support Vector Machine [slides]  
Lecture 20: Support Vector Machine [slides]  