Learning to Rank
Unlike the NetRank project, here we return to learning to rank over
feature vectors. We are exploring a number of directions toward better
learning-to-rank algorithms.
(KDD 2008 paper)
- We give max-margin learning algorithms that directly optimize for NDCG and MRR,
and for an additive mixture of MAP, NDCG and MRR. Our main technique is the
cutting-plane method of Tsochantaridis et al., similar to its
application in SVMmap.
- In realistic workloads, queries are diverse, and can benefit from
several local ranking models, compared to a single ranking model
applicable to all queries.
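To make the target measures concrete, here is a minimal sketch of NDCG@k and reciprocal rank over relevance grades listed in ranked order (the class and method names are ours, for illustration only; this is not code from the paper):

```java
import java.util.Arrays;

public class RankMetrics {
    // DCG@k: sum over positions i of (2^rel - 1) / log2(i + 2).
    static double dcg(int[] rels, int k) {
        double s = 0;
        for (int i = 0; i < Math.min(k, rels.length); i++)
            s += (Math.pow(2, rels[i]) - 1) / (Math.log(i + 2) / Math.log(2));
        return s;
    }

    // NDCG@k: DCG of the given ranking, normalized by the DCG of the
    // ideal (relevance-sorted) ranking.
    static double ndcg(int[] rels, int k) {
        int[] ideal = rels.clone();
        Arrays.sort(ideal);
        for (int i = 0; i < ideal.length / 2; i++) {  // reverse to descending
            int t = ideal[i];
            ideal[i] = ideal[ideal.length - 1 - i];
            ideal[ideal.length - 1 - i] = t;
        }
        double idcg = dcg(ideal, k);
        return idcg == 0 ? 0 : dcg(rels, k) / idcg;
    }

    // Reciprocal rank: 1 / (position of the first relevant item).
    static double rr(int[] rels) {
        for (int i = 0; i < rels.length; i++)
            if (rels[i] > 0) return 1.0 / (i + 1);
        return 0;
    }

    public static void main(String[] args) {
        int[] ranked = {2, 0, 1};  // graded relevance in ranked order
        System.out.println("NDCG@3 = " + ndcg(ranked, 3));
        System.out.println("RR = " + rr(ranked));
    }
}
```

Both measures are non-smooth in the model scores, which is what makes direct optimization hard and the cutting-plane machinery necessary.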
Only a few algorithms, such as BoltzRank and ListNet, seek to model conditional probabilities of permutations. Here is a third attempt:
Conditional Models for Non-smooth Ranking Loss Functions.
Avinava Dubey, Jinesh Machchhar, Chiru Bhattacharyya
and Soumen Chakrabarti. ICDM 2009.
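As background, ListNet-style models place a Plackett-Luce distribution over permutations given item scores. A minimal sketch of that permutation probability (the class name is ours; this is not code from the paper above):

```java
public class PlackettLuce {
    // Plackett-Luce probability of permutation pi given scores s:
    // P(pi | s) = prod_i exp(s[pi_i]) / sum_{j >= i} exp(s[pi_j]),
    // i.e., items are drawn without replacement, softmax over the remainder.
    static double permutationProb(double[] scores, int[] perm) {
        double p = 1.0;
        for (int i = 0; i < perm.length; i++) {
            double denom = 0;
            for (int j = i; j < perm.length; j++)
                denom += Math.exp(scores[perm[j]]);
            p *= Math.exp(scores[perm[i]]) / denom;
        }
        return p;
    }

    public static void main(String[] args) {
        double[] s = {2.0, 1.0, 0.0};
        // Score-descending order is the most probable permutation.
        System.out.println(permutationProb(s, new int[]{0, 1, 2}));
        System.out.println(permutationProb(s, new int[]{2, 1, 0}));
    }
}
```

Summing such probabilities over permutations consistent with a ranking loss is intractable in general, which motivates the approximations studied in the paper.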
- To get an SVN account, send me email.
Your login ID will be your full email address.
- You will be mailed back a temporary password that you should change after your first login.
- Once you get an account, use
this SVN base
URL to check out and update.
- In case we change our external host name, you will need to svn switch --relocate your working copy to the new URL.
- Our code needs additional libraries to build and run.
We used the LETOR
data set published by Microsoft, and TREC data sets.
A sample path in LETOR is
.../letor/OHSUMED/Data/, with a subdirectory
Foldi for each fold. In the above directory,
place a file called options.properties. A sample file
is shown below.
svmC = 1
clipRank = 10
Note that you have to first invoke the data cleaner class
iitb.StructRank.LETORCleaner after suitable configuration.
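The sample options above can be read with java.util.Properties. A minimal sketch; note that the interpretation of the keys (svmC as the SVM regularization constant C, clipRank as a rank cutoff) is our assumption, not documented here:

```java
import java.io.StringReader;
import java.util.Properties;

public class OptionsDemo {
    // Parse the two sample options. Key meanings are our guess:
    // svmC = SVM regularization constant C, clipRank = rank cutoff.
    static double svmC(Properties p) {
        return Double.parseDouble(p.getProperty("svmC", "1"));
    }

    static int clipRank(Properties p) {
        return Integer.parseInt(p.getProperty("clipRank", "10"));
    }

    // Load the same content as the sample options.properties file.
    static Properties sample() throws Exception {
        Properties p = new Properties();
        p.load(new StringReader("svmC = 1\nclipRank = 10\n"));
        return p;
    }

    public static void main(String[] args) throws Exception {
        Properties opts = sample();
        System.out.println("svmC=" + svmC(opts) + " clipRank=" + clipRank(opts));
    }
}
```

In the actual setup, the Properties object would be loaded from the options.properties file placed in the fold directory, not from an in-memory string.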
- Chiru Bhattacharyya, IISc Bangalore
- Students, past and present
- Rajiv Khanna, Uma Sawant, Avinava Dubey,
Jinesh Machchhar, Somnath Banerjee.
- Partly supported by grants from Microsoft Research.