Pattern Recognition
HT2011, 10 ECTS 
Overview
Recognizing patterns in nature, numbers, geometry, and the world around us forms the basis of discovery in mathematics and the sciences. From its earliest development, the human brain is continually learning: it sorts and simplifies complex input, builds generalizations from it, and innately seeks structure. Understanding how this process takes place in the mind is itself a difficult task. Humans are naturally adept at integrating and assimilating information and at recognizing patterns intuitively, whether in human behavior, in identifying familiar objects, or in recognizing other people. Yet the process by which we achieve such recognition remains almost entirely hidden from us, and we still know little about how the brain functions and how it is built for pattern recognition tasks. Despite this, important applications arise from our current knowledge and from extending pattern recognition ability to machines, enabling them to mimic how humans perform such tasks. Moreover, although computers lack cognition, they surpass us in deriving detailed features from objects that may not be apparent to a human observer, a fact that can prove vital in achieving correct classification. Understanding how we may program machines to carry out pattern recognition tasks on their own and to assist humans in decision making constitutes the basis of this course.
Course Description
The course is intended to provide the student with an extensive and thorough insight into pattern recognition. The main topics covered are supervised and unsupervised classification methods, regression, density estimation, and dimensionality reduction. The course includes a treatment of the following subjects:
The topics will be presented theoretically through lectures and discussions and practically through lab sessions and project work. 
Learning Outcomes
On completing the course, the student shall be able to:

Course Literature

Prerequisites/Corequisites

Lab Work
The course will include six lab sessions requiring programming in Matlab. Each lab session is generally assigned a two-week deadline. Note that the labs are a compulsory part of the course. You are encouraged to work in groups of two during the labs; however, you may also choose to work individually. You will be provided assistance throughout the lab periods, and discussions among students and groups are highly recommended.
Project
For the project assignment you are encouraged to work in groups of two or three. You must choose your own project in one of the following main topics: unsupervised classification, supervised classification, or regression. You will have to adapt the lab work in order to solve more challenging problems. Some project suggestions will be provided; in any case, the datasets you choose for the project must be approved in advance.
Assessment

Schedule
All lectures will be held at the Centre for Image Analysis (CBA), Polacksbacken 2, Uppsala.
Introduction
Lecture 1: Introduction to Pattern Recognition
Thursday, 27 Oct, 10:15–12:00 room: 2115 lecturer: J. Azar
Main idea, supervised classification, clustering, and regression. Feature space representation, terminology, decision functions & generalization. Multivariate data visualization. (Appendix D, 1.1–1.4)
Lecture 2: Mathematical Background
Friday, 28 Oct, 10:15–12:00 room: 2115 lecturer: J. Azar
Measurements and features in regard to image processing. Revision of basic probability & statistics: joint probabilities, central limit theorem, Bayes’ theorem, covariance, independence and correlation. Lagrange multipliers. (Appendix E)
Lecture 3: Fundamental Concepts
Tuesday, 1 Nov, 10:15–12:00 room: 2115 lecturer: J. Azar
Bayesian approach to pattern recognition. Density estimation: parametric methods, multivariate Gaussian, data whitening. Nonparametric methods (kNN, Parzen), curse of dimensionality, maximum likelihood, cross-validation. (2.2, 3.1–3.3, 3.5)
Lab 1
Thursday, 3 Nov, 13:15–17:00 room: 2315D lecturer: J. Azar
Data visualization, central limit theorem, multivariate normal distribution, data whitening, nonparametric density estimation: Parzen, nearest neighbor.
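The labs themselves are in Matlab; purely as an illustration (a minimal sketch in Python/NumPy, not part of the course material), the Parzen-window idea covered in this lab can be written as follows: each sample contributes a Gaussian kernel bump, and the density estimate is the average of these bumps.

```python
import numpy as np

def parzen_density(x, samples, h):
    """Parzen-window estimate of a 1-D density at points x,
    using a Gaussian kernel of bandwidth h."""
    x = np.asarray(x, dtype=float)[:, None]        # query points, shape (m, 1)
    s = np.asarray(samples, dtype=float)[None, :]  # samples, shape (1, n)
    # Gaussian kernel evaluated at every (query, sample) pair
    k = np.exp(-0.5 * ((x - s) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return k.mean(axis=1)                          # average over the samples

# Estimate a standard normal density from 1000 samples;
# the true density at 0 is 1/sqrt(2*pi), about 0.399.
rng = np.random.default_rng(0)
data = rng.standard_normal(1000)
p = parzen_density([0.0], data, h=0.3)
```

The bandwidth `h` plays the same role as the kernel width discussed in the lectures: too small and the estimate is spiky, too large and it oversmooths.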
Dimensionality Reduction
Lecture 4: Feature Selection
Monday, 7 Nov, 10:15–12:00 room: 2115 lecturer: J. Azar
Search algorithms, branch & bound, scatter matrices, criterion functions. Feature selection by global optimization: (meta)heuristic methods: genetic algorithms, simulated annealing. (9.1–9.2)
Lecture 5: Feature Extraction
Tuesday, 8 Nov, 13:15–15:00 room: 2115 lecturer: J. Azar
Linear feature extraction: PCA, LDA/Fisher mapping. Nonlinear feature extraction: overview, multidimensional scaling, dissimilarity-based classifiers & embedding. (9.3–9.4, Appendix A)
Lab 2
Thursday, 10 Nov, 13:15–17:00 room: 2315D lecturer: J. Azar
Forward selection, backward selection, plus-l take-away-r selection, branch & bound, genetic algorithms. PCA, Fisher mapping, nonlinear feature extraction, multidimensional scaling, dissimilarity representation.
Unsupervised Classification
Lecture 6: Clustering
Monday, 14 Nov, 15:15–17:00 room: 2115 lecturer: J. Azar
Unsupervised learning, hierarchical clustering, k-means, fuzzy c-means, mean shift algorithm. Gaussian mixture model, expectation-maximization algorithm, self-organizing maps. (10.1–10.5, 2.3)
Lecture 7: Cluster Validation
Tuesday, 15 Nov, 13:15–15:00 room: 2115 lecturer: J. Azar
Cluster validation, number of clusters, distortion measures, Davies–Bouldin index, other assessment criteria. Novelty detection, ROC curve. (10.6–10.10, 8.2.3)
Lab 3
Thursday, 17 Nov, 13:15–17:00 room: 2315D lecturer: J. Azar
Hierarchical clustering, k-means, fuzzy c-means, Gaussian mixture model, expectation-maximization, Davies–Bouldin index, self-organizing maps.
Supervised Classification I
Lecture 8: Bayesian Classifiers
Monday, 21 Nov, 10:15–12:00 room: 2115 lecturer: J. Azar
Bayes decision theory, Bayes classifier, Bayes error & risk, logistic classifier. Parzen classifier, kNN classifier, proportional classifier. (3.3, 3.5, 4.2.1, 4.4)
Lecture 9: Bayesian Normal-based Classifiers / Discriminant Analysis
Tuesday, 22 Nov, 10:15–12:00 room: 2115 lecturer: J. Azar
Quadratic discriminant classifier, linear discriminant classifier, nearest-mean classifier. Fisher classifier, classification confidence & rejection. (2.2, 4.2.3, 4.3.9)
Lab 4
Friday, 25 Nov, 13:15–17:00 room: 2315D lecturer: J. Azar
Implementation of Bayesian classifier, Parzen classifier, kNN classifier, logistic classifier, quadratic/linear/nearest-mean classifiers, and Fisher classifier. Curse of dimensionality.
Regression
Lecture 10: Linear Regression
Monday, 28 Nov, 10:15–12:00 room: 2115 lecturer: J. Azar
Bayesian regression. MMSE estimator, MAP estimator, ML estimator. Model evaluation, quality of regression.
Lecture 11: Nonlinear & Multidimensional Regression
Monday, 28 Nov, 15:15–17:00 room: 2115 lecturer: J. Azar
Nonlinear regression, kernel smoothing, local regression, backfitting algorithm.
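As a minimal sketch of the kernel-smoothing idea (again in Python/NumPy for illustration only; the course labs use Matlab), the Nadaraya–Watson estimator, one standard form of kernel smoothing, predicts at a query point by taking a kernel-weighted average of the training targets:

```python
import numpy as np

def nw_regress(xq, x, y, h):
    """Nadaraya-Watson kernel regression: predict at query points xq
    as a Gaussian-weighted average of targets y, bandwidth h."""
    xq = np.asarray(xq, dtype=float)[:, None]          # queries, shape (m, 1)
    w = np.exp(-0.5 * ((xq - np.asarray(x)[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)         # normalized weighted mean

# Noisy samples of the line y = 2x; at an interior point the smoother
# should recover the underlying function closely.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)
y = 2.0 * x + 0.05 * rng.standard_normal(200)
yhat = nw_regress([0.5], x, y, h=0.05)
```

The bandwidth `h` again controls the bias–variance tradeoff of the smoother; local (weighted) regression refines this scheme by fitting a low-order polynomial in each kernel window instead of a constant.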
Multidimensional regression: confidence bounds, model regularization: ridge regression, least absolute shrinkage & selection operator (LASSO).
Lab 5
Wednesday, 30 Nov, 13:15–17:00 room: 2315D lecturer: J. Azar
Linear regression, MMSE, MAP, MLE, quality measures. Nonlinear regression: kernel smoothing / local weighted regression.
Supervised Classification II
Lecture 12: Support Vector Machines
Tuesday, 6 Dec, 13:15–15:00 room: 2115 lecturer: J. Azar
Support vector classifier: (non-)separability, slack variables, (non-)linearity, kernel trick, multiclass problems, control parameters. Classifier complexity, VC dimension. (4.2.5, 5.4, 4.3, 11.6)
Lecture 13: Artificial Neural Networks
Thursday, 8 Dec, 10:15–12:00 room: 2115 lecturer: J. Azar
Classification: perceptron, multilayer perceptron, backpropagation training, decision functions. Autoregressive ANN, radial basis function ANN. Use in regression & feature extraction. (4.2.2, 6.2)
Lecture 14: Combining Classifiers
Friday, 9 Dec, 10:15–12:00 room: 2115 lecturer: J. Azar
Ensemble classification: fixed rules, trained combiners. Improving classifier performance: bootstrap aggregating, adaptive resampling and combining, boosting, and the cloning approach. (8.4)
Lab 6
Monday, 12 Dec, 13:15–17:00 room: 2315D lecturer: J. Azar
SVM, ANN, ensemble classification, complexity: bias–variance tradeoff, improving performance (implement either boosting or cloning).
Lecture 15: Review
Thursday, 15 Dec, 10:15–12:00 room: 2115 lecturer: J. Azar
Course review / philosophical approach to principles of pattern recognition.
Information concerning the final exam.
Examination
Tuesday, 20 Dec, 13:15–17:00 room: 2115
Project Presentation
Thursday, 12 Jan, 13:15–17:00 room: 2115
Responsible for the course and web page: Jimmy Azar, Centre for Image Analysis