Learning to Rank

Unlike the NetRank project, here we return to learning to rank feature vectors. We are exploring several directions toward better learning-to-rank algorithms.


StructRank (KDD 2008 paper)
We give max-margin learning algorithms that directly optimize NDCG and MRR, as well as an additive mixture of MAP, NDCG and MRR. Our main technique is the cutting-plane method of Tsochantaridis et al., similar to its application in SVMmap.
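To make the optimization targets concrete, here is a minimal sketch of the NDCG and MRR metrics themselves (standard definitions; function names are illustrative and not from the paper's code):

```python
import math

def dcg_at_k(relevances, k):
    """Discounted cumulative gain of the top-k positions, using the
    gain 2^rel - 1 and a log2 position discount."""
    return sum((2 ** rel - 1) / math.log2(i + 2)
               for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k):
    """NDCG: DCG of the given ranking divided by the best achievable DCG."""
    ideal = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0

def mrr(relevances):
    """Reciprocal rank of the first relevant item (0 if none is relevant)."""
    for i, rel in enumerate(relevances):
        if rel > 0:
            return 1.0 / (i + 1)
    return 0.0
```

Both metrics depend on the sorted order of documents, not smoothly on model scores, which is why a structured max-margin formulation is needed to optimize them directly.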
LocalRank (L2R4IR 2009 paper)
In realistic workloads, queries are diverse and can benefit from several local ranking models rather than a single ranking model applied to all queries.
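One way to realize this idea is to route each query to the nearest of several query clusters and score documents with that cluster's local model. This is only an illustrative sketch under assumed design choices (nearest-centroid routing, linear scorers), not the method from the paper:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def route(query_vec, centroids):
    """Pick the index of the nearest centroid by squared Euclidean distance."""
    def sqdist(c):
        return sum((a - b) ** 2 for a, b in zip(query_vec, c))
    return min(range(len(centroids)), key=lambda i: sqdist(centroids[i]))

def score_docs(query_vec, doc_vecs, centroids, local_models):
    """Score each document with the local linear model chosen for this query."""
    w = local_models[route(query_vec, centroids)]
    return [dot(w, d) for d in doc_vecs]
```

The routing step lets structurally different queries (e.g. navigational vs. informational) use differently weighted feature combinations.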
Only a few algorithms, such as BoltzRank and ListNet, seek to model conditional probabilities of permutations. Here is a third attempt: Conditional Models for Non-smooth Ranking Loss Functions. Avinava Dubey, Jinesh Machchhar, Chiru Bhattacharyya and Soumen Chakrabarti. ICDM 2009.
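For background, ListNet builds on the Plackett-Luce model, which assigns a probability to every permutation given per-item scores. A minimal sketch of that permutation probability (the function name is illustrative):

```python
import math

def permutation_prob(scores, perm):
    """Plackett-Luce probability of observing `perm` (item indices,
    best first) given real-valued model scores: items are drawn without
    replacement, each with probability proportional to exp(score)."""
    prob = 1.0
    remaining = list(perm)
    for item in perm:
        z = sum(math.exp(scores[j]) for j in remaining)
        prob *= math.exp(scores[item]) / z
        remaining.remove(item)
    return prob
```

Because these probabilities sum to one over all permutations, a model can be trained by maximizing the likelihood of observed rankings, or by taking expectations of a ranking loss under the distribution.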

We used the LETOR data set published by Microsoft, and the TREC data sets used in the SVMmap paper.


A sample path in LETOR is .../letor/OHSUMED/Data/, with a subdirectory Foldi for each fold i. In that directory, place a file called options.properties; a sample file is shown below.

svmC = 1
clipRank = 10
Note that you must first invoke the data cleaner class iitb.StructRank.LETORCleaner after suitable customization.
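The LETOR data files use an SVMlight-style row format: a relevance label, a qid:... token, feature:value pairs, and an optional trailing comment after #. As a minimal sketch of reading such a row (this parser is illustrative, not part of the LETORCleaner code):

```python
def parse_letor_line(line):
    """Parse one SVMlight-style LETOR row of the form
    '<rel> qid:<q> <fid>:<val> ... # comment' (format assumed)."""
    body = line.split('#', 1)[0].split()
    rel = int(body[0])                      # graded relevance label
    qid = body[1].split(':', 1)[1]          # query identifier
    feats = {int(f): float(v) for f, v in
             (tok.split(':', 1) for tok in body[2:])}
    return rel, qid, feats
```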


Chiru Bhattacharyya, IISc Bangalore
Students, past and present
Rajiv Khanna, Uma Sawant, Avinava Dubey, Jinesh Machchhar, Somnath Banerjee.
Partly supported by grants from Microsoft Research.