All are invited for the same

*Title:* Tutorial on Structured Output Prediction
*Venue:* KD 101
*Date|Day:* 29 Feb 2016 | Monday
*Time:* 8:00 PM (IST)
*Speaker:* Nitish Gupta

*Abstract:*
In binary or multi-class classification problems, the output variables are
independent, which allows us to accurately learn the dependency between the
input space and the class labels with finitely many functions. Learning
functional dependencies between arbitrary input and output spaces,
especially in problems involving multiple dependent output variables and
structured output spaces, is far harder and cannot be achieved by naively
applying standard supervised learning algorithms for multi-class
classification.
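To see why the naive reduction fails, consider sequence labeling: the joint output space grows exponentially with input length, so one cannot train a separate classifier per output. A small illustration (my own toy numbers, not from the talk):

```python
# Why structured outputs cannot be treated as flat multi-class labels:
# for a sequence of n positions, each taking one of k tags, the number
# of joint labelings is k**n -- exponential in the input length.
n_tags = 5  # e.g. a small tag set (assumed for illustration)
for length in (1, 5, 10, 20):
    print(length, n_tags ** length)
```

Already at length 20 there are ~9.5e13 "classes", so the One vs. All style of training one classifier per label is hopeless without exploiting the output structure.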
In this talk, which will be more of a tutorial, I will start with a brief
introduction to supervised methods for binary classification using linear
classifiers and extend this idea to multi-class classification. The focus
in multi-class classification will be on the One vs. All, All vs. All,
multi-class SVM, and Constraint Classification approaches. I will then
introduce the problem of structured output prediction and present the
various challenges it poses in training and inference. I will conclude the
talk with a brief tutorial on a widely used supervised learning approach
called the Structured SVM.
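For readers unfamiliar with One vs. All (one of the approaches named above), here is a minimal sketch using perceptron-style linear classifiers in NumPy; the toy data, class count, and epoch budget are my own illustrative assumptions, not material from the talk:

```python
# One vs. All sketch: train one binary linear classifier per class,
# then predict by taking the class whose classifier scores highest.
import numpy as np

def train_one_vs_all(X, y, n_classes, epochs=20):
    """Train one perceptron per class (One vs. All relabeling)."""
    W = np.zeros((n_classes, X.shape[1]))
    for c in range(n_classes):
        # Relabel: +1 for class c, -1 for every other class.
        t = np.where(y == c, 1.0, -1.0)
        for _ in range(epochs):
            for xi, ti in zip(X, t):
                if ti * (W[c] @ xi) <= 0:  # misclassified -> perceptron update
                    W[c] += ti * xi
    return W

def predict(W, X):
    """Assign each example to the classifier with the largest score."""
    return np.argmax(X @ W.T, axis=1)

# Toy linearly separable data: three classes in 2D, plus a bias feature.
X = np.array([[1.0, 0.0, 1.0], [0.9, 0.1, 1.0],      # class 0
              [0.0, 1.0, 1.0], [0.1, 0.9, 1.0],      # class 1
              [-1.0, -1.0, 1.0], [-0.9, -1.1, 1.0]]) # class 2
y = np.array([0, 0, 1, 1, 2, 2])
W = train_one_vs_all(X, y, n_classes=3)
print(predict(W, X))  # recovers the training labels on this toy set
```

The Structured SVM generalizes this max-over-scores decision rule from a handful of classes to exponentially large structured output spaces, which is precisely what makes its inference step the central challenge.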

*Bio:*
Nitish Gupta is a Computer Science PhD student at the University of
Illinois at Urbana-Champaign, working under Prof. Dan Roth in the Cognitive
Computation Group. Nitish's research interests lie in Natural Language
Processing and Machine Learning, especially in large-scale machine learning
models for Information Extraction. His research is geared towards
extracting structured knowledge from unstructured text, making the world
knowledge contained in text more accessible and informative. He is
currently working on latent structure models for large-scale Cross-Document
Coreference Resolution (Entity Resolution).