Washington University in St. Louis
Department of Computer Science and Engineering


CSE 417T: Introduction to Machine Learning

Fall 2017

OVERVIEW

This course is an introduction to machine learning, focusing on supervised learning. We will cover the mathematical foundations of learning as well as a number of important techniques for classification and regression, including linear and logistic regression, neural networks, nearest neighbor techniques, kernel methods, decision trees, and ensemble methods. Note that the material in this course is a prerequisite for CSE 517A, the graduate level machine learning class. The overlap with CSE 511A (Artificial Intelligence) is minimal.

STAFF

Instructors:
Sanmay Das Jolley 512 sanmay at wustl dot edu
Chien-Ju Ho Jolley 532 chienju.ho at wustl dot edu

Graduate Assistants and TAs:
There are several graduate assistants and undergraduate TAs for the class. All assistants will hold regular office hours, answer questions on Piazza, and grade homeworks. The graduate assistants will also hold occasional recitation or review sessions.
Mingquan Yuan (Graduate Assistant to the Instructors) 7.yuan at wustl dot edu
Liang Zhou (Graduate Assistant to the Instructors) liang.zhou at wustl dot edu
David Flasterstein davidflasterstein at wustl dot edu
Aaron Gordon aarongordon at wustl dot edu
Trevor Larsen trevorlarsen at wustl dot edu
Gwyneth Pearson gpearson at wustl dot edu

Office Hours
Office hours are listed below, organized by day of the week. The instructors hold office hours in their offices unless otherwise announced in class. Please check for updated locations.
Mondays: 3-4pm (Sanmay); 5:45-7:45pm (David, Eads 103)
Tuesdays: 4-6pm (Gwyneth, Duncker 101); 6-8pm (Liang, Duncker 101)
Wednesdays: 2-4pm (Aaron, January Hall 110)
Thursdays: 12-2pm (Trevor, Rudolph 203); 4-5pm (Chien-Ju)
Fridays: 9:30-11:30am (Mingquan, Louderman 461)

POLICIES

Detailed policies are in the official syllabus. A few points to highlight: please read and understand the collaboration policy and the late day policy. There will be two exams, each covering approximately half the course material, and no separate final exam.

TEXTBOOKS

The course textbooks include Learning From Data by Abu-Mostafa, Magdon-Ismail, and Lin, referred to as "AML" in the readings below.

PREREQUISITES

CSE 247, ESE 326 (or Math 320), Math 233, and Math 309 (which can be taken concurrently), or equivalents. If you do not have a solid background in calculus, probability, and computer science through a class in data structures and algorithms, you may have a hard time in this class. Matrix algebra is fundamental to modern machine learning and will be used throughout, but it is fine to take Math 309 concurrently.

SCHEDULE, READING, AND ASSIGNMENTS

Date | Instructor | Topics | Readings | Assignments
Aug 28/29 | | Introduction. Course policies. Course overview. | Lecture notes (Ho). Lecture notes (Das). AML 1.1, 1.2. |
Aug 30/31 | | The perceptron learning algorithm. Generalizing outside the training set and Hoeffding's inequality. | AML 1.1.2, 1.3.1, 1.3.2, and Problem 1.3 |
Sep 5/6 | | Matlab session. | Accessing Matlab (by Marion Neumann) |
Sep 7 | Sanmay | Generalization, Hoeffding's inequality, multiple hypotheses, different costs, label noise. | AML: rest of Chapter 1 | HW1 (see Submission Instructions)
Sep 11 | Chien-Ju | Hoeffding's inequality, multiple hypotheses, different costs, label noise, growth function. | AML: rest of Chapter 1; AML 2.1.1 |
Sep 12 | Sanmay | Infinite hypothesis spaces, growth function, and VC dimension. | AML 2.1.1-2.1.3 |
Sep 13 | Chien-Ju | Infinite hypothesis spaces, growth function, and VC dimension. | AML 2.1.1-2.1.3 |
Sep 14 | Sanmay | VC generalization bound. Real-valued targets. | AML 2.1.4-2.2 |
Sep 18 | Chien-Ju | VC generalization bound. Test set. The bias-variance trade-off. | AML 2.1.4-2.3 |
Sep 19 | Sanmay | The bias-variance trade-off; the pocket algorithm. | AML 2.3-3.1 | HW2
Sep 20 | Chien-Ju | The bias-variance trade-off, the pocket algorithm, and linear regression. | AML 2.3-3.2 |