University of Minnesota
Institute of Technology
http://www.it.umn.edu
612-624-2006

Electrical and Computer Engineering

Support Vector Machines and Predictive Data Modeling Methodology

Prof. Vladimir Cherkassky
Electrical and Computer Engineering

Duration: Half day

Course Description:
There is growing interest in estimating predictive models from empirical data. The subject of data-driven modeling has been addressed in various disciplines such as statistics, pattern recognition, signal processing, genomics, artificial neural networks, machine learning, and data mining. Since the late 1990s, researchers and practitioners alike have been actively applying the so-called Support Vector Machine (SVM) methods developed under the framework of Vapnik-Chervonenkis (VC) learning theory. This workshop pursues several methodological, practical, and research goals, aimed at providing an in-depth understanding of SVM methodology and the underlying VC-theoretical concepts. The workshop participants will gain an understanding of:
   1. The difference between classical statistical and predictive learning (VC-theoretical) methodologies for estimating models from data
   2. The main ideas, concepts and results of VC learning theory
   3. SVM methodology for different types of learning problems, such as classification, regression and single-class learning (see the sketch after this list)
   4. SVM model selection and other issues arising in practical applications
   5. New advanced learning methodologies based on recent SVM extensions and modifications. These new methodologies include Learning in the Universum environment (U-SVM), transduction (transductive SVM) and Learning with Structured Data (aka SVM+)
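As a rough, non-authoritative illustration of the three problem settings named in item 3, the sketch below uses scikit-learn's SVM estimators on made-up synthetic data; it is not part of the course material, and the parameter values shown are arbitrary.

    # Minimal sketch of the three learning problems from item 3, assuming
    # scikit-learn; synthetic data and parameter values are illustrative only.
    import numpy as np
    from sklearn.svm import SVC, SVR, OneClassSVM

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))

    # Classification: binary labels derived from a simple rule on the inputs.
    y_class = (X[:, 0] + X[:, 1] > 0).astype(int)
    clf = SVC(kernel="rbf", C=1.0).fit(X, y_class)

    # Regression: a continuous target with additive noise.
    y_reg = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
    reg = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X, y_reg)

    # Single-class learning: estimate the support of the data, no labels used.
    occ = OneClassSVM(kernel="rbf", nu=0.05).fit(X)

    print(clf.predict(X[:5]), reg.predict(X[:5]), occ.predict(X[:5]))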

Course outline:
Philosophical and methodological differences between classical statistical and predictive learning approaches to data modeling

Margin-based Methods and Support Vector Machines
 * Motivation for margin-based loss
 * SVM Classification
 * SVM Regression
 * Issues for SVM model selection (see the sketch after this list)
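One common (though not the only) way to approach SVM model selection in practice is a cross-validated search over the regularization parameter C and the RBF kernel width gamma. The sketch below assumes scikit-learn and synthetic data; the parameter grid is purely illustrative, not a recommendation from the course.

    # Minimal sketch of SVM model selection via cross-validated grid search,
    # assuming scikit-learn; data, grid values, and cv=5 are arbitrary choices.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import GridSearchCV
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 5))
    y = (X[:, 0] - X[:, 1] ** 2 > 0).astype(int)

    # Scaling inside the pipeline keeps the cross-validation estimates honest.
    pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": [0.01, 0.1, 1]}
    search = GridSearchCV(pipe, grid, cv=5).fit(X, y)
    print(search.best_params_, search.best_score_)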

Practical issues and SVM Application Studies
 * Practical issues in using SVM software
 * Interpretation of SVM classifiers (see the sketch after this list)
 * Empirical comparison of different learning methods
 * Application study 1: fraud detection
 * Application study 2: prediction of transplant-related mortality
 * Application study 3: real-time prediction of epileptic seizures
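As a small, hedged illustration of what interpreting an SVM classifier can mean at its most basic level, the sketch below (assuming scikit-learn, a linear kernel, and synthetic data) inspects the number of support vectors and the per-feature weights of a trained classifier; it is not drawn from the application studies above.

    # Minimal sketch of basic SVM classifier inspection, assuming scikit-learn;
    # per-feature weights (coef_) are only defined for a linear kernel.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(2)
    X = rng.normal(size=(150, 3))
    y = (2 * X[:, 0] - X[:, 2] > 0).astype(int)

    clf = SVC(kernel="linear", C=1.0).fit(X, y)
    print("support vectors per class:", clf.n_support_)
    print("per-feature weights (linear kernel only):", clf.coef_.ravel())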

Advanced SVM-based learning technologies
 * Non-inductive and alternative learning settings
 * Transduction
 * Universum Learning
 * Learning with Structured Data and Multi-Task Learning

Intended Audience:

Instructor Biography:
Vladimir Cherkassky is Professor of Electrical and Computer Engineering at the University of Minnesota, Twin Cities campus. Dr. Cherkassky is co-author of Learning from Data, now in its second edition, and he has also authored the forthcoming Introduction to Predictive Learning.
He was recently selected as a Fellow of the IEEE for ‘contributions and leadership in statistical learning and neural networks’. He has served on the editorial boards of IEEE Transactions on Neural Networks (TNN), Neural Networks (the official journal of INNS), Natural Computing: An International Journal, and Neural Processing Letters. He was a Guest Editor of the IEEE TNN Special Issue on VC Learning Theory and Its Applications, published in September 1999. Dr. Cherkassky was the organizer and Director of the NATO Advanced Study Institute (ASI) From Statistics to Neural Networks: Theory and Pattern Recognition Applications, held in France in 1993.
He received the IBM Faculty Partnership Award in 1996 and 1997 for his work on learning methods for data mining. In 2008, he received the A. Richard Newton Breakthrough Research Award from Microsoft Research for development of new learning methodologies for predictive learning.