
CS178: Machine Learning and Data Mining

CLOSED: 2011 OFFERING

Assignments and Exams:

  • HW1 (Data): 1/12/11, Soln
  • HW2 (Suppl): 1/20/11, Soln
  • HW3 (Suppl): 2/02/11, Soln
  • HW4 (Suppl): 2/24/11, Soln
  • HW5 (Suppl): 3/11/11, Soln
  • HW6: not graded, Soln

  • Midterm: 2/03/11, 11:00-12:30, Soln
  • Final: 3/15/11, 10:30-12:30
Student Comment Page

Lecture: Parkview Classroom Bldg (PCB) 1300, TR 11am-12:30pm

Discussion: Parkview Classroom Bldg (PCB) 1300, W 4-5pm

Instructor: Prof. Alex Ihler (ihler@ics.uci.edu), Office Bren Hall 4066

  • Office Hours: Monday 2-3pm

TA: Yifei Chen (yifeic@uci.edu), Office Bren Hall 4089

  • Office Hours: Tuesday 2:00-3:00pm

Course Notes (in development)


Introduction to machine learning and data mining

How can a machine learn from experience to become better at a given task? How can we automatically extract knowledge from, or make sense of, massive quantities of data? These are the fundamental questions of machine learning. Machine learning and data mining algorithms use techniques from statistics, optimization, and computer science to create automated systems that can sift through large volumes of data at high speed and make predictions or decisions without human intervention.

Machine learning as a field is now incredibly pervasive, with applications ranging from the web (search, advertisements, and suggestions) to national security, and from biochemical interaction analysis to traffic, emissions, and astrophysics. Perhaps most famously, the $1M Netflix prize stirred up interest in learning algorithms among professionals, students, and hobbyists alike.

This class will familiarize you with a broad cross-section of models and algorithms for machine learning, and prepare you for research or industry application of machine learning techniques.

Background

We will assume basic familiarity with the concepts of probability and linear algebra. Some programming will be required; we will primarily use Matlab, but no prior experience with Matlab will be assumed.

Textbook and Reading

There is no required textbook for the class. However, useful books for supplementary reading include Bishop, "Pattern Recognition and Machine Learning"; Duda, Hart, and Stork, "Pattern Classification"; and Hastie, Tibshirani, and Friedman, "The Elements of Statistical Learning".

Matlab

We will often write code for the course in the Matlab environment. Matlab is accessible through NACS computers at several campus locations (e.g., MSTB-A, MSTB-B, and the ICS lab), and if you want a copy for yourself, student licenses are fairly inexpensive ($100). Personally, I do not recommend the open-source Octave program as a replacement, since its syntax is not 100% compatible and may cause problems (for me or for you).

If you are not familiar with Matlab, there are a number of tutorials on the web.

You may want to start with one of the very short tutorials, then use the longer ones as a reference during the rest of the term.
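
If you want to test your setup before the first discussion, here is a minimal, self-contained warm-up in Matlab; the example and all variable names are purely illustrative, not taken from any course material:

    % Warm-up: build a matrix, compute a statistic, and make a plot.
    X  = rand(100, 2);                 % 100 random points in the unit square
    mu = mean(X, 1);                   % 1x2 row vector of column means
    d  = sqrt(sum((X - repmat(mu, 100, 1)).^2, 2));   % distance of each point to the mean
    figure;
    scatter(X(:,1), X(:,2), 20, d, 'filled');         % color each point by its distance
    colorbar; xlabel('x_1'); ylabel('x_2');
    title('Points colored by distance to the sample mean');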


Interesting stuff for students


Syllabus and Schedule (may be updated)

  • 01 PDF, Lecture : Introduction: what is ML; problems, data, and tools; visualization
    • D01 PDF : Introduction to Matlab
  • 02 PDF, Lecture : Linear regression; SSE; gradient descent; closed form; normal equations; features (a Matlab sketch appears after this schedule)
  • 03 PDF, Lecture : Overfitting and complexity; training, validation, and test data
    • D02 ZIP, PDF : Introduction to Matlab II
  • 04 PDF, Lecture : Classification problems; decision boundaries; nearest-neighbor methods
  • 05 PDF, Lecture : Probability and classification; Bayes optimal decisions
    • D03 PDF : Discussion of HW2
  • 06 PDF, Lecture : Naive Bayes and Gaussian class-conditional distributions
  • 07 PDF, Lecture : Linear classifiers
    • D04 PDF : Bayes' rule and the naive Bayes model
  • 08 PDF, Lecture : Logistic regression; online gradient descent; neural networks
  • 09 : Review; decision trees
    • D05 PDF : Discussion of HW3 and feedback on HW1 and HW2
  • 10 : Midterm exam
  • 11 PDF, Lecture : Ensemble methods: bagging, random forests, boosting
    • D06 PDF : Details on decision trees and boosting
  • 12 : (no class today)
  • 13 [see next lecture] : Unsupervised learning: clustering, k-means, hierarchical agglomeration (a k-means sketch appears after this schedule)
    • D07 PDF : Details on k-means clustering; feedback on HW2 and HW3
  • 14 PDF, Lecture : Clustering continued; EM
  • 15 PDF, Lecture : Latent space methods; PCA; Netflix
    • D08 PDF : Details on EM (for the Gaussian mixture model) and PCA
  • 16 PDF, Lecture : Text representations; naive Bayes and multinomial models; clustering and latent space models
  • 17 PDF, Lecture : VC dimension; structural risk minimization; margin methods and support vector machines
    • D09 PDF, Suppl : Discussion of HW4; EM clustering demo; PCA demo
  • 18 PDF (awm), Lecture : Support vector machines and large-margin classifiers
  • 19 PDF (awm), Lecture : Time series; Markov models; autoregressive models
    • D10 PDF : Discussion of HW5 and HW6, including examples of LDA, VC dimension, and SVM
  • 20 : Review
  • --- : Final exam, Tuesday March 15, 10:30am-12:30pm
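
To give a concrete flavor of the regression material (lecture 02), here is a minimal Matlab sketch of least-squares linear regression fit via the normal equations. The synthetic data and all names are illustrative assumptions, not course code:

    % Least-squares linear regression via the normal equations (lecture 02).
    % Model: y = theta(1) + theta(2)*x + noise.
    m   = 50;                            % number of training examples
    x   = linspace(0, 1, m)';            % inputs in [0,1]
    y   = 2 + 3*x + 0.1*randn(m, 1);     % noisy targets (true intercept 2, slope 3)
    Phi = [ones(m, 1), x];               % design matrix with a constant feature
    theta = (Phi' * Phi) \ (Phi' * y);   % solve Phi' * Phi * theta = Phi' * y
    sse = sum((Phi * theta - y).^2);     % sum of squared errors (SSE) on training data
    fprintf('theta = [%.3f, %.3f], SSE = %.4f\n', theta(1), theta(2), sse);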
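Similarly, for the clustering material (lecture 13), here is a bare-bones k-means loop on synthetic data. Matlab's kmeans() in the Statistics Toolbox is the practical tool; this hand-rolled version only illustrates the two alternating steps, and all names and data are assumptions for the example:

    % Bare-bones k-means: alternate assignment and update steps.
    X = [randn(50, 2); randn(50, 2) + 4];  % synthetic data: two clusters in 2D
    k = 2;
    n = size(X, 1);
    perm = randperm(n);
    mu = X(perm(1:k), :);                  % initialize centers at random data points
    for iter = 1:20
        % Assignment step: send each point to its nearest center.
        D = zeros(n, k);
        for j = 1:k
            D(:, j) = sum((X - repmat(mu(j,:), n, 1)).^2, 2);
        end
        [~, z] = min(D, [], 2);
        % Update step: move each center to the mean of its assigned points.
        for j = 1:k
            if any(z == j)
                mu(j, :) = mean(X(z == j, :), 1);
            end
        end
    end
    disp(mu);                              % estimated cluster centers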

Last year's lectures are also available.
