
Introduction to Machine Learning

ECTS : 2

Course content:

1. Supervised and unsupervised learning
2. Calibration versus prediction: how to avoid over-fitting
3. Measure of the complexity of a model according to Vapnik-Chervonenkis
4. Vapnik-Chervonenkis’s inequality and the control of the prediction error
5. Maximum margin SVMs and Gap tolerant classifiers
6. C-SVMs and duality
7. SVMs with kernels and Mercer’s theorem
8. The simplex case 
9. ν-SVM, duality and reduced convex hulls
10. Single-class SVMs, anomaly detection and clustering
11. An introduction to Bootstrap, decision trees and random forests
12. Ridge Regression, penalization, and yield curve smoothing
13. The Representer theorem, Lasso, parsimony and duality
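To make the soft-margin SVM topics above (items 6–7) concrete, here is a minimal sketch of a C-SVM with an RBF kernel. It assumes scikit-learn, which is not part of the course material; the dataset, parameters, and API calls are illustrative choices, not the course's own examples.

```python
# Illustrative C-SVM with an RBF kernel (assumes scikit-learn is installed).
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# A small synthetic two-class problem that is not linearly separable.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C controls the soft-margin penalty; the RBF kernel maps the data
# implicitly into a feature space where a maximum-margin separator exists.
clf = SVC(C=1.0, kernel="rbf")
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))
```

Only the support vectors (accessible as `clf.support_vectors_`) determine the decision boundary, which is the geometric counterpart of the dual formulation discussed in the course.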

Skills to acquire:

Understand how to use Support Vector Machines for supervised and unsupervised learning, together with some applications of penalized regression methods.
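The contrast between Ridge (item 12) and Lasso (item 13) penalization can be sketched as follows. This assumes scikit-learn and a synthetic dataset of my own choosing; it is an illustration of the penalization idea, not material from the course.

```python
# Illustrative Ridge (L2) vs Lasso (L1) penalized regression
# (assumes scikit-learn and NumPy are installed).
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
beta = np.zeros(10)
beta[:3] = [2.0, -1.0, 0.5]          # only 3 of 10 features are informative
y = X @ beta + 0.1 * rng.normal(size=100)

ridge = Ridge(alpha=1.0).fit(X, y)   # L2 penalty: shrinks all coefficients
lasso = Lasso(alpha=0.1).fit(X, y)   # L1 penalty: zeroes out irrelevant ones

# Lasso produces a parsimonious model: most coefficients are exactly zero.
print(np.sum(np.abs(lasso.coef_) < 1e-8))
```

Ridge shrinks every coefficient toward zero without eliminating any, whereas the L1 penalty yields the sparsity ("parsimony") discussed in the last item of the syllabus.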

Assessment method:

Exam

Bibliography, recommended reading:

[1] Pierre Brugiere: https://hal.archives-ouvertes.fr/cel-01390383v2
[2] Wolfgang Karl Härdle, Rouslan Moro, Linda Hoffmann : Learning Machines Supporting Bankruptcy Prediction, SFB 649 Discussion Paper 2010-032
[3] Dave DeBarr and Harry Wechsler: Fraud Detection Using Reputation Features, SVMs, and Random Forests
[4] Trevor Hastie, Robert Tibshirani, Jerome Friedman: The Elements of Statistical Learning
[5] Christopher Bishop: Pattern Recognition and Machine Learning

Université Paris Dauphine - PSL - Place du Maréchal de Lattre de Tassigny - 75775 PARIS Cedex 16 - 06/07/2024