Learning to Rank

Unlike the NetRank project, here we return to learning to rank feature vectors. We are exploring several directions toward better learning-to-rank algorithms.

Papers

StructRank (KDD 2008 paper)
We give max-margin learning algorithms that directly optimize NDCG, MRR, and an additive mixture of MAP, NDCG, and MRR. Our main technique is the cutting-plane method of Tsochantaridis et al., similar to its application in SVMmap.
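For reference, the two target measures can be computed directly. The following is a minimal Java sketch (the class and method names are illustrative, not part of the StructRank code) of NDCG@k with the standard 2^rel - 1 gain and log2 discount, and the per-query reciprocal rank that MRR averages:

```java
import java.util.Arrays;

public class RankMetrics {
    // DCG@k over relevance grades listed in ranked order, gain 2^rel - 1.
    static double dcg(int[] rels, int k) {
        double s = 0;
        for (int i = 0; i < Math.min(k, rels.length); i++) {
            s += (Math.pow(2, rels[i]) - 1) / (Math.log(i + 2) / Math.log(2));
        }
        return s;
    }

    // NDCG@k: DCG of the given ranking divided by DCG of the ideal ranking.
    static double ndcg(int[] rankedRels, int k) {
        int[] ideal = rankedRels.clone();
        Arrays.sort(ideal);
        // Reverse to descending order to obtain the ideal ranking.
        for (int i = 0; i < ideal.length / 2; i++) {
            int t = ideal[i];
            ideal[i] = ideal[ideal.length - 1 - i];
            ideal[ideal.length - 1 - i] = t;
        }
        double idcg = dcg(ideal, k);
        return idcg == 0 ? 0 : dcg(rankedRels, k) / idcg;
    }

    // Per-query reciprocal rank: 1 / (position of first relevant document).
    static double reciprocalRank(int[] rankedRels) {
        for (int i = 0; i < rankedRels.length; i++) {
            if (rankedRels[i] > 0) return 1.0 / (i + 1);
        }
        return 0;
    }
}
```

A ranking already sorted by decreasing relevance gets NDCG@k of 1; MRR is the mean of reciprocalRank over all queries.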
LocalRank (L2R4IR 2009 paper)
In realistic workloads, queries are diverse and can benefit from several local ranking models rather than a single ranking model applied to all queries.
LogRank
Only a few algorithms, such as BoltzRank and ListNet, seek to model conditional probabilities of permutations. Here is a third attempt: Conditional Models for Non-smooth Ranking Loss Functions. Avinava Dubey, Jinesh Machchhar, Chiru Bhattacharyya and Soumen Chakrabarti. ICDM 2009.
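As an illustration of what "probabilities of permutations" means here, the Plackett-Luce model (the distribution underlying ListNet) assigns each permutation a probability by drawing items without replacement in proportion to exponentiated scores. This sketch is for exposition only and is not the model of the ICDM 2009 paper:

```java
public class PlackettLuce {
    // Probability of observing the given ordering of item indices under the
    // Plackett-Luce model with item scores s: at each position, the next item
    // is chosen with probability exp(score) over the exp-scores of the
    // items not yet placed.
    static double permutationProb(double[] s, int[] order) {
        double p = 1.0;
        for (int i = 0; i < order.length; i++) {
            double denom = 0;
            for (int j = i; j < order.length; j++) {
                denom += Math.exp(s[order[j]]);
            }
            p *= Math.exp(s[order[i]]) / denom;
        }
        return p;
    }
}
```

With equal scores, all n! permutations are equally likely; as one item's score grows, orderings placing it first absorb most of the probability mass.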

Presentation material

Software

Data

We used the LETOR data set published by Microsoft and the TREC data sets used in the SVMmap paper.

Configuration

A sample path in LETOR is .../letor/OHSUMED/Data/, with a subdirectory Foldi for each fold i. In that directory, place a file called options.properties; a sample file is shown below.

svmC = 1
clipRank = 10
svmEps = 1e-7
svmLoss = Margin
QPopt = LARank
#QPopt = SVMLightQP
maxIters = 10000
Note that, before training, you must first invoke the data cleaner class iitb.StructRank.LETORCleaner after suitable customization.
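The options file uses standard Java properties syntax, so it can be read with java.util.Properties, which handles both key=value and key = value forms and treats lines starting with # as comments. A minimal sketch (the class name and typed defaults are illustrative; the StructRank code may read these options differently):

```java
import java.io.IOException;
import java.io.Reader;
import java.util.Properties;

public class OptionsLoader {
    // Parse an options.properties stream; "#" lines are comments and
    // whitespace around "=" is ignored by Properties.load.
    static Properties load(Reader in) throws IOException {
        Properties opts = new Properties();
        opts.load(in);
        return opts;
    }

    // Example of pulling out typed values, defaulting to the sample file's
    // settings when a key is absent.
    static double svmC(Properties opts) {
        return Double.parseDouble(opts.getProperty("svmC", "1"));
    }

    static int maxIters(Properties opts) {
        return Integer.parseInt(opts.getProperty("maxIters", "10000"));
    }

    static String qpOpt(Properties opts) {
        return opts.getProperty("QPopt", "LARank");
    }
}
```

In practice the Reader would wrap a FileInputStream on the options.properties file in the fold's data directory.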

Acknowledgment

Collaborators
Chiru Bhattacharyya, IISc Bangalore
Students, past and present
Rajiv Khanna, Uma Sawant, Avinava Dubey, Jinesh Machchhar, Somnath Banerjee.
Support
Partly supported by grants from Microsoft Research.