General course name: Statistical Relational Learning
A warm-up quiz to help you estimate whether you want to take this course, and to help me estimate whether you have the necessary background for it.
In this new offering of the course, I will focus on optimization problems (discrete as well as continuous) and search/optimization algorithms (optimal as well as approximate/greedy) as they appear in relational learning. Any inference problem (probabilistic or logical) will be discussed only from the viewpoint of optimization and search. Until last year, significant time was devoted to first-order logic; we will spend much less time on that and present what is needed more intuitively.
The list of topics will include (but will not be restricted to):
Probabilistic and logical representation languages
Relational Learning, Search Spaces and Search Algorithms
Feature and Structure Induction as (Discrete) Optimization problems
MAP inference as Discrete Optimization problem
Duality Theory: For discrete as well as continuous case
Cutting-plane, branch-and-bound, branch-and-cut, active set, A*
Stochastic hill-climbing, stochastic search, randomized search
Submodularity, Mixed Integer Linear Programs
Monotonicity/anti-monotonicity and efficient search algorithms for frequent pattern mining
Parameter Learning as an Optimization Problem
Joint Structure and Parameter Learning
Relational Kernels and Implicit Feature Mapping for Parameter Learning
Max-margin learning in Relational Settings
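To give a flavor of the optimization-and-search viewpoint the course takes, here is a minimal, self-contained sketch (not course material) of stochastic hill-climbing with random restarts on a toy discrete objective. The objective below, with hypothetical unary weights plus a pairwise agreement bonus, loosely mimics a MAP-style score over binary variables; all names and values are illustrative.

```python
import random

def stochastic_hill_climb(score, n_vars, restarts=10, steps=200, seed=0):
    """Maximize score(x) over bit vectors x of length n_vars by
    random-restart hill-climbing: flip a random bit, keep the flip
    if the score does not decrease (ties accepted, so the search
    can walk across plateaus)."""
    rng = random.Random(seed)
    best_x, best_s = None, float("-inf")
    for _ in range(restarts):
        x = [rng.randint(0, 1) for _ in range(n_vars)]
        s = score(x)
        for _ in range(steps):
            i = rng.randrange(n_vars)   # pick a random bit to flip
            x[i] ^= 1
            s_new = score(x)
            if s_new >= s:              # accept non-worsening moves
                s = s_new
            else:
                x[i] ^= 1               # revert the flip
        if s > best_s:
            best_x, best_s = list(x), s
    return best_x, best_s

# Toy MAP-like objective: unary weights plus a chain "smoothness" bonus.
w = [3, -1, 2, -2, 1]                   # hypothetical unary weights

def score(x):
    unary = sum(wi * xi for wi, xi in zip(w, x))
    pairs = sum(1 for a, b in zip(x, x[1:]) if a == b)
    return unary + pairs

x_best, s_best = stochastic_hill_climb(score, n_vars=len(w))
```

For a problem this small one would simply enumerate all 2^5 assignments; the point of the sketch is only the accept/revert loop and the restart structure, which is the shape many of the approximate search algorithms in the course share.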
A growing list of papers relevant to the course can be found here. Each student will need to present some paper(s).
List of applets
Prerequisites
One of the departmental machine learning courses: CS725 or CS419. If you have attended some other machine learning course instead, you need to talk to me.
Important note: The new offering of this course is going to be intensive in optimization and algorithms. Thus, it will really help if you have attended an optimization course in the past. Otherwise, you will need to do some significant reading on optimization (see the reference texts below).
Class Timings, Venue and Grading
Introduction and Grading
Time: Tuesdays and Fridays, 3:30-5:00 PM
Venue: SIC 305, KReSIT
Credit/Audit Requirements
Anyone who does an exceptional course project with the potential to become a publishable paper is eligible for a straight AA grade. Otherwise, the grading breakup will be:
15% Mid-semester exam
30% End semester exam
30% Project: A basic project will take one of the papers we study (or a related paper), implement the algorithms in the paper, conduct a basic performance study, and diagnose the performance. However, I expect most projects to improve upon existing techniques. A more advanced project would take a problem specification for which no solution is publicly available, figure out how to solve it, and implement the solution. (courtesy CS632)
15% Seminar+Assignments (Minimum)
10% Quizzes (Maximum)
Audit students must complete the assignments and the project.
Reference Books
- Introduction to Statistical Relational Learning,
Edited by Lise Getoor and Ben Taskar, Published by The MIT Press.
- Probabilistic Inductive Logic Programming,
Edited by De Raedt, L.; Frasconi, P.; Kersting, K.; Muggleton, S.H., Published by Springer Verlag, Lecture Notes in Artificial Intelligence.
- Numerical Optimization,
Jorge Nocedal and Stephen J. Wright, Springer Verlag.
- A First Course in Combinatorial Optimization,
J. Lee, Cambridge University Press, 2004.
- Relational Data Mining,
Saso Dzeroski and Nada Lavrac, editors, Springer, Berlin, 2001.
- Pattern Recognition and Machine Learning,
Christopher M. Bishop, Springer, Information Science and Statistics Series.
Reference Slides
- Introductory Slides to Probabilistic Logical and Relational Learning by James Cussens and Kristian Kersting, 2004.
- Statistical Relational Learning: A Tutorial by Lise Getoor, 2007.