Introduction to machine learning: What is learning, learning objectives, data needed.
Bayesian inference and learning: Inference, naïve Bayes.
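As a preview of the naïve Bayes topic, here is a minimal Gaussian naïve Bayes sketch: each feature is modeled as an independent Gaussian per class, and prediction maximizes the log posterior. The data, variable names, and variance floor are illustrative assumptions, not part of the course material.

```python
import numpy as np

# Toy Gaussian naive Bayes: each feature is an independent Gaussian per class;
# prediction picks the class with the largest log posterior.
# (Illustrative sketch only; the data points below are made up.)

X = np.array([[1.0, 2.0], [1.2, 1.8], [5.0, 6.0], [5.2, 5.8]])
y = np.array([0, 0, 1, 1])

classes = np.unique(y)
priors = {c: np.mean(y == c) for c in classes}
means = {c: X[y == c].mean(axis=0) for c in classes}
vars_ = {c: X[y == c].var(axis=0) + 1e-6 for c in classes}  # variance floor

def log_posterior(x, c):
    # log p(c) + sum_j log N(x_j | mu_cj, var_cj), dropping constants shared by all classes
    return np.log(priors[c]) - 0.5 * np.sum(
        np.log(vars_[c]) + (x - means[c]) ** 2 / vars_[c]
    )

def predict(x):
    return max(classes, key=lambda c: log_posterior(x, c))

print(predict(np.array([1.1, 2.1])))
```

The "naïve" assumption is the per-class independence of features, which turns a joint density estimate into a product of one-dimensional ones.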
Basic objective of learning: Assumption of nearness and contiguity in input spaces, accuracy, Bayesian risk and casting of learning as Bayesian inference, Risk matrix, Other cost measures
Other issues in learning: Generalization and model complexity, accuracy, empirical risk and training/validation/testing, structural risk, number of free parameters vs. VC dimension, bias-variance tradeoff, curse of dimensionality, training sample size requirement, convergence and training time, memory requirement, introduction to online/incremental learning; objective functions for classification, regression, and ranking
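To preview the empirical-risk and train/validation themes, a small sketch: empirical risk is the average loss on a finite sample, and comparing it between a training and a validation split exposes overfitting. The polynomial degrees, noise level, and split sizes below are arbitrary choices for illustration.

```python
import numpy as np

# Empirical risk = average loss on a finite sample. Fitting polynomials of
# increasing degree drives training risk down, while validation risk reveals
# under- and overfitting. (Data and degrees are made up for this sketch.)

rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, 40)
y = np.sin(3 * x) + rng.normal(0, 0.1, 40)
x_tr, y_tr, x_va, y_va = x[:30], y[:30], x[30:], y[30:]

def empirical_risk(deg):
    coeffs = np.polyfit(x_tr, y_tr, deg)   # least-squares fit on the training split
    mse = lambda xs, ys: np.mean((np.polyval(coeffs, xs) - ys) ** 2)
    return mse(x_tr, y_tr), mse(x_va, y_va)

for deg in (1, 5, 9):
    tr, va = empirical_risk(deg)
    print(deg, round(tr, 3), round(va, 3))
```

Training risk is non-increasing in the degree (the models are nested), which is exactly why it cannot be used alone to choose model complexity.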
Some supervised learning formulations: Linear regression and LMS algorithm, Perceptron and logistic regression, Cybenko’s theorem for nonlinear function estimation, MLP and backpropagation, introduction to momentum and quasi-Newton, L1-norm penalty and sparsity, SVM, support vector regression
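As a preview of the LMS topic, a minimal sketch of the Widrow-Hoff update, w ← w + η (yᵢ − w·xᵢ) xᵢ, applied one sample at a time. The synthetic target weights, learning rate, and pass count are assumptions chosen for the demonstration.

```python
import numpy as np

# LMS (Widrow-Hoff) stochastic update for linear regression:
#   w <- w + eta * (y_i - w . x_i) * x_i
# Sketch on noiseless synthetic data; the target weights are made up.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
Xb = np.hstack([X, np.ones((200, 1))])        # append a bias column
w_true = np.array([2.0, -1.0, 0.5])
y = Xb @ w_true

w = np.zeros(3)
eta = 0.05                                     # small enough for stability here
for _ in range(20):                            # a few passes over the data
    for xi, yi in zip(Xb, y):
        w += eta * (yi - w @ xi) * xi          # per-sample LMS update

print(np.round(w, 2))
```

With noiseless data and a sufficiently small step size, the iterates converge to the least-squares solution; the course material covers the step-size conditions in detail.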
Decision trees
Kernelization of linear problems: RBF kernels, increase in dimensionality through simple kernels, kernel definition and Mercer's theorem, kernelized SVM and SVR, other applications of kernelization, matching a kernel to a problem
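To preview the Mercer's theorem topic, a quick numerical sketch: the RBF kernel k(x, z) = exp(−‖x − z‖²/(2σ²)) is a Mercer kernel, so every Gram matrix it produces is symmetric positive semidefinite. The points and σ = 1 below are arbitrary choices.

```python
import numpy as np

# RBF kernel k(x, z) = exp(-||x - z||^2 / (2 sigma^2)) is a Mercer kernel:
# its Gram matrices are symmetric PSD. Quick check on made-up points.

def rbf_kernel(A, B, sigma=1.0):
    # Pairwise squared distances via ||a - b||^2 = ||a||^2 - 2 a.b + ||b||^2
    sq = (A**2).sum(1)[:, None] - 2 * A @ B.T + (B**2).sum(1)[None, :]
    return np.exp(-sq / (2 * sigma**2))

rng = np.random.default_rng(1)
X = rng.normal(size=(10, 3))
K = rbf_kernel(X, X)

eigvals = np.linalg.eigvalsh(K)
print(eigvals.min())   # non-negative up to numerical error
```

Positive semidefiniteness of the Gram matrix is what guarantees the kernel corresponds to an inner product in some feature space, which is the entry point for kernelized SVM and SVR.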
Role of randomization and model combination: Committees and random forests, boosting cascade of classifiers
Some unsupervised learning machines: Clustering criteria, K-means, fuzzy C-means, DBSCAN, PDF estimation, Parzen windows, EM algorithm for mixture of Gaussians
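As a preview of the K-means topic, a minimal sketch of Lloyd's algorithm: alternate between assigning each point to its nearest centroid and recomputing centroids as cluster means. The two-blob data, K = 2, and the initialization scheme are illustrative assumptions.

```python
import numpy as np

# Plain K-means (Lloyd's algorithm) on made-up two-cluster data:
# alternate nearest-centroid assignment and mean-update steps.

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(4, 0.3, (50, 2))])

K = 2
centroids = X[rng.choice(len(X), K, replace=False)]  # init from data points
for _ in range(20):
    # Assignment step: label each point with its nearest centroid.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    # Update step: move each centroid to the mean of its assigned points.
    new = np.array([X[labels == k].mean(axis=0) for k in range(K)])
    if np.allclose(new, centroids):
        break
    centroids = new

print(np.round(centroids, 1))
```

Each iteration can only decrease the within-cluster sum of squares, so the procedure converges, though possibly to a local optimum; the clustering-criteria part of the course makes this objective explicit.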
Optional topics: Manifold learning, Kernel-PCA, semi-supervised learning, introduction to generative and probabilistic graphical models
References
Textbooks: The lectures will follow standard texts on machine learning. For example, the following books provide all the information needed for this course. Whenever we deviate from the main material discussed in these books, soft copies of lecture notes will be provided.
Pattern Recognition and Machine Learning, by Christopher Bishop, Springer, 2006.
Machine Learning: A Probabilistic Perspective, by Kevin P. Murphy, MIT Press, 2012.
The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd ed., by Trevor Hastie, Robert Tibshirani, and Jerome Friedman, Springer, 2009.