Revised June 2016

CS 480: Introduction to Machine Learning

Watch a video introduction to this course on YouTube.

General description

The course introduces students to the design of algorithms that enable machines to "learn". In contrast to the classic paradigm, in which a machine is programmed with a set of instructions that dictates exactly what it should do, machines are instead presented with examples from which they learn what to do. This is especially useful for complex tasks such as natural language processing, information retrieval, data mining, computer vision and robotics, where it is impractical for a programmer to enumerate all possible situations and specify suitable instructions for each. Instead, a machine is fed large datasets of examples from which it automatically learns suitable rules to follow. The course introduces the basics of machine learning and data analysis.



Typical audience

  • CS major students, usually in fourth year. Beneficial for students interested in computer applications that solve sophisticated problems.

Normally available

  • Fall, Winter, and Spring

Related courses

  • Pre-requisites: CM 339/CS 341 or SE 240; Computer Science students only
  • Co-requisites: STAT 206 or 231 or 241

For official details, see the UW calendar.

Typical reference(s)

  • Hal Daumé III, A Course in Machine Learning (in preparation)
  • Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning (in preparation)
  • Christopher Bishop, Pattern Recognition and Machine Learning (2006)
  • Kevin Murphy, Machine Learning, A Probabilistic Perspective (2012)

Required preparation

At the start of the course, students should be able to

  • Use basic algebra, calculus, and probability
  • Write efficient, readable programs from scratch
  • Read and write technical documents

Learning objectives

At the end of the course, students should be able to

  • Formalize a task as a machine learning problem
  • Identify suitable algorithms to tackle different machine learning problems
  • Apply machine learning algorithms to datasets

Typical syllabus


Fundamentals

  • Generalization
  • Underfitting, overfitting
  • Cross-validation
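
To make the cross-validation topic concrete, here is a minimal sketch of k-fold splitting in plain Python; the function name `k_fold_splits` and the index-based interface are illustrative choices, not part of the course materials.

```python
import random

def k_fold_splits(n, k, seed=0):
    """Yield (train, validation) index lists for k-fold cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        # Hold out fold i for validation; train on the remaining k-1 folds.
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, folds[i]

# 10 examples, 5 folds: each round validates on 2 held-out examples.
splits = list(k_fold_splits(10, 5))
```

Averaging a model's validation error over the k rounds gives a less noisy estimate of generalization error than a single train/validation split.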

Linear models

  • Linear regression
  • Classification with linear separators (mixtures of Gaussians, logistic regression, perceptron, support vector machines)
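
As a sketch of the simplest linear model above, ordinary least squares with a single input feature has a closed-form solution; the helper name `fit_line` and the toy data are illustrative assumptions.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y ≈ w*x + b with a single input feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    var = sum((x - mx) ** 2 for x in xs)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    w = cov / var                  # slope minimizing squared error
    b = my - w * mx                # intercept passing through the means
    return w, b

# Noise-free points on the line y = 2x + 1 recover w = 2.0, b = 1.0 exactly.
w, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```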

Non-linear models

  • Non-linear basis functions
  • Kernel methods
  • Deep neural networks
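
To illustrate the kernel-methods topic, here is a sketch of a kernel perceptron: a linear separator run implicitly in an RBF feature space, which lets it learn the XOR problem that no linear separator in input space can solve. The function names and the choice of gamma are illustrative assumptions.

```python
from math import exp

def rbf(x, z, gamma=1.0):
    """Gaussian (RBF) kernel: similarity decaying with squared distance."""
    return exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def kernel_perceptron(X, y, epochs=10):
    """Perceptron in the implicit RBF feature space (dual/kernel form)."""
    alpha = [0.0] * len(X)                    # one dual weight per example
    for _ in range(epochs):
        for i, (xi, yi) in enumerate(zip(X, y)):
            f = sum(a * yj * rbf(xj, xi) for a, yj, xj in zip(alpha, y, X))
            if yi * f <= 0:                   # mistake: upweight this example
                alpha[i] += 1.0
    return alpha

# XOR is not linearly separable in input space, but is with an RBF kernel.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [-1, 1, 1, -1]
alpha = kernel_perceptron(X, y)

def predict(x):
    return sum(a * yj * rbf(xj, x) for a, yj, xj in zip(alpha, y, X))
```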

Unsupervised learning

  • Clustering
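
As a sketch of the clustering topic, Lloyd's algorithm for k-means alternates between assigning points to their nearest center and moving each center to its cluster mean; the function name and the random-restart-free initialization are illustrative simplifications.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Lloyd's algorithm: alternate nearest-center assignment and mean update."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # initialize with k distinct points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign p to its nearest center (squared Euclidean distance).
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # Move each center to the mean of its cluster (keep it if empty).
        centers = [tuple(sum(d) / len(cl) for d in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers

# Two well-separated blobs: the centers converge to the two blob means.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers = kmeans(pts, 2)
```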

Sequence learning

  • Hidden Markov models
  • Recurrent and recursive neural networks
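
To make the hidden Markov model topic concrete, here is a sketch of the forward algorithm, which computes the probability of an observation sequence by summing over all hidden state paths; the list-based parameter encoding is an illustrative choice.

```python
def forward(obs, pi, A, B):
    """Forward algorithm: P(observation sequence) under an HMM.

    pi[i]   - initial probability of hidden state i
    A[i][j] - transition probability from state i to state j
    B[i][o] - probability of emitting symbol o in state i
    """
    # Base case: probability of starting in each state and emitting obs[0].
    alpha = [pi[i] * B[i][obs[0]] for i in range(len(pi))]
    for o in obs[1:]:
        # Recurse: sum over predecessor states, then emit the next symbol.
        alpha = [sum(alpha[i] * A[i][j] for i in range(len(pi))) * B[j][o]
                 for j in range(len(pi))]
    return sum(alpha)
```

Summing `forward` over every possible observation sequence of a fixed length yields 1, a useful sanity check on the recursion.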

Ensemble learning

  • Bagging
  • Boosting
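
As a sketch of bagging, the following trains decision stumps on bootstrap resamples and predicts by majority vote. The stump learner, the redraw-when-one-class rule, and all names are illustrative simplifications, not the course's prescribed method.

```python
import random

def fit_stump(xs, ys):
    """Best single-threshold classifier: predict sign above t, -sign below."""
    best = None
    for t in xs:
        for sign in (1, -1):
            preds = [sign if x > t else -sign for x in xs]
            err = sum(p != y for p, y in zip(preds, ys))
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: sign if x > t else -sign

def bagged(xs, ys, n_models=11, seed=0):
    """Bagging: train stumps on bootstrap resamples, predict by majority vote."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        # Bootstrap: sample with replacement; redraw if only one class appears,
        # so every stump sees a two-class sample (an illustrative shortcut).
        while True:
            idx = [rng.randrange(len(xs)) for _ in xs]
            if len({ys[i] for i in idx}) == 2:
                break
        models.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    def vote(x):
        return 1 if sum(m(x) for m in models) > 0 else -1
    return vote
```

Averaging many models trained on resampled data reduces variance relative to any single model, which is the point of bagging.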

Large-scale learning

  • Distributed learning
  • Stream learning
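
To illustrate stream learning, here is a sketch of online linear regression: one stochastic gradient step per arriving example, using constant memory regardless of stream length. The function name, learning rate, and synthetic stream are illustrative assumptions.

```python
def sgd_stream(stream, lr=0.01):
    """Online linear regression: one SGD step per example, constant memory."""
    w, b = 0.0, 0.0
    for x, y in stream:
        err = (w * x + b) - y          # prediction error on this one example
        w -= lr * err * x              # gradient step on the squared loss
        b -= lr * err
    return w, b

# Stream of (x, y) pairs from the noiseless line y = 2x + 1.
stream = ((x, 2 * x + 1) for _ in range(5000) for x in (0.0, 1.0, 2.0, 3.0))
w, b = sgd_stream(stream)              # approaches w = 2, b = 1
```

Because each example is seen once and discarded, the same loop works whether the stream holds forty examples or forty billion.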

End-user issues of Machine Learning

Real-world applications of Machine Learning

Topics in Machine Learning