Course Outline, Grading etc.




Course TAs
The TAs are assigned as below. They will be grading your assignments. Please contact them for demo schedules. They are also there to help you with any difficulties. Of course, you can always contact me as well.

TA Name            TA email     Roll numbers
Harsh Bhandari     harshbh@     Y11 and earlier + till 12063
Jaimita Bansal     jmbansal@    12066-12379
Venugopal Reddy    gvgreddy@    12383-12694
Zahira Nasrin      zahira@      12717-12833, 13001-13145
Ishan Darolia      idarolia@    13157-13369
Milind Solanki     milind@      13399-13617
Chetan Gupta       gchetan@     13618-13819, 15111001-15111004
Vivek Gupta        vgupta@      15111007-15211263




Course Outline
  1. Course description



Evaluation

Item           Weight
Midsem         25%
Endsem         30%
Assignments    15%
Project        30%


The project is to be done in groups of 4. Groups will be created by semi-random methods. The 30% project credit is divided as:
  10%  project proposal (due 20th Jan. 2016)
  65%  demo + presentation (at semester end; schedule will be put up)
  25%  project report (submitted after the demo + presentation)
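To see how the project components translate into course-grade percentages, here is a minimal sketch, assuming (as the breakdown above suggests, since 10 + 65 + 25 = 100) that each listed percentage is a share of the project's 30% credit rather than of the total grade:

```python
# Effective course-grade weight of each project component,
# assuming the listed percentages are fractions of the 30% project credit.
PROJECT_WEIGHT = 0.30

components = {
    "proposal": 0.10,
    "demo+presentation": 0.65,
    "report": 0.25,
}

effective = {name: PROJECT_WEIGHT * share for name, share in components.items()}

# The component weights must add back up to the full project credit.
assert abs(sum(effective.values()) - PROJECT_WEIGHT) < 1e-9
```

Under this reading, the proposal is worth 3%, the demo + presentation 19.5%, and the report 7.5% of the overall course grade.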


Lecture summaries and other material
(Access to the material requires login/passwd for the ftp site.)

  1. Machine learning: introduction
    Source:
    Slides. Notes.
    Other references:
    Look at introductory youtube videos:
    Yaser Abu-Mostafa introductory lecture at Caltech.
    Andrew Ng's first lecture at Stanford
  2. Bayes decision rule
    Source:
    Bayes decision rule (notes)
    Generative-discriminative, discriminant functions (slides)
    Other references:
    Chp 2 of Duda, Hart and Stork.
  3. Naive Bayes algorithm
    Source:
    Naive Bayes
    Other references:
  4. Nearest neighbour algorithms
    Source:
  5. Linear discriminant functions
    Source:
    Linear discriminant functions
    Separation boundary examples (Duda, Hart, Stork)
    Perceptrons
    Other references:
    Look at parts of sec. 5.8 Duda, Hart, Stork
  6. Performance evaluation
    Source:
    Performance evaluation
    Other references:
    See the ICML2012 tutorial slides by Nathalie Japkowicz and Mohak Shah (ftp site). Also available on homepage.
    Chp 5 in Witten, Frank, Hall, Data Mining, 3rd Ed. Morgan Kaufmann, 2011.
  7. SVMs - large margin classifiers
    Source:
    SVMs
    Other references:
  8. Kernels
    Source:
    Kernels
    Other references:
  9. Regression
    Source:
    Regression
    Other references:
    Hypothesis testing tutorial
  10. Dimension reduction
    Source:
    Dimension Reduction
    Other references:
    Dimension Reduction
    Kernel PCA
  11. Decision trees
    Source:
    Decision trees
    Other references:
    Also see sections 8.1 to 8.4 of Duda, Hart, Stork
  12. Ensembles
    Source:
    Ensembles: bagging, boosting
    Other references:
    Section 9.5 of Duda, Hart, Stork.
    Tutorial introduction to Adaboost, Rojas (ftp).
  13. Neural networks
    Source:
    Neural networks
    Neural networks:practical aspects
    Other references:
    Sections 6.1-6.8 Duda, Hart, Stork.
  14. Deep networks
    Source:
    Deep networks
    Other references:
    Bengio et al, Learning representations
    Hinton's 2006 paper
  15. Estimating parameters
    Source:
    Parameter estimation
    Other references:
    Look at sec 3.2, 3.9 of Duda, Hart, Stork.
  16. Learning theory
    Source:
    Learning theory slides
    See chps 3-6 of Shalev-Shwartz, David, Understanding ML (ftp colt.pdf).
  17. Introduction to Bayesian decision theory
    Source:
    Basic Bayes decision theory
    Other references:
    Sections 2.1-2.3 from Pattern Classification, Duda, Hart, Stork.
  18. Decision trees and random forests
    Source:
    Will be put up soon.
    Other references:
    Sections 8.1-8.3 of Pattern Classification, Duda, Hart, Stork.
    See S.R. Safavian, D. Landgrebe, A survey of decision tree classifier methodology, IEEE-SMC, 21(3), 660-674, 1991 for a comprehensive survey of splitting techniques. (available through ieeexplore)
    See Leo Breiman, Random forests, Machine Learning, 45, 5-32, 2001 for a complete discussion on the random forest classifier. (available online through the journal page)
  19. Linear discriminators
    Source:
    Linear discriminant analysis
    Other references:
    Sections 5.1-5.5, 5.7, 5.8.1, 5.8.4, 5.8.5, 5.12 Pattern Classification, Duda, Hart, Stork
    Bias-variance trade-off Section 9.3 Pattern Classification, Duda, Hart, Stork.
    SVM - see Andrew Ng's Stanford notes (on ftp site).
  20. Kernels
    Source:
    Class notes will be put up soon.
    Other references:
    chp 2 from Learning with kernels, Scholkopf and Smola
    chp 3 from Intro to SVMs and other kernel based learning methods, Cristianini, Shawe-Taylor.
    chp 2 from Kernel methods in pattern recognition, Shawe-Taylor, Cristianini
  21. Neural networks
    Source:
    Sections 6.1-6.8 Duda, Hart, Stork.
  22. Parameter estimation - MLE, Bayesian, EM
    Source:
    Sections 3.1 to 3.4, 3.8 Duda, Hart, Stork.
  23. Non-parametric estimation, k-NN
    Source:
    Sections 4.1 to 4.3.4, 4.3.6, 4.4, 4.5 Duda, Hart, Stork.
  24. PCA and SVD
    Source:
    See the book by Jolliffe chps 1-3.
  25. Classifier ensembles
    Source:
    Adaboost tutorial by Freund, Schapire