Talks & Seminars
Title: Towards Massive Scale Deep Learning
Speaker: Dr. Sanjiv Kumar, Google Research, New York
Date & Time: August 1, 2018, 15:30
Venue: Room No. 109, 1st Floor, Department of Computer Science and Engineering, New CSE/CC Building
Abstract:
Recent successes of deep neural networks in a large number of domains have spurred renewed interest in both the theory and applications of these models. However, training and inference in such models at massive scale remain extremely challenging. In this talk, I will highlight a number of challenges related to both speed and quality for problems with large output spaces containing billions of outputs, which drive real-world relevance search and recommendation systems. I will describe advances in fast low-level matrix-vector products via structured matrices, stochastic negative mining for fast optimization, and the design of appropriate loss functions and regularizers, which together make robust, massive-scale learning feasible.
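As a concrete illustration of how a structured matrix can speed up matrix-vector products, the sketch below multiplies a circulant matrix by a vector in O(n log n) time via the FFT and checks the result against a dense product. This is only an illustrative example under that assumption, not necessarily the construction used in the speaker's work; the helper name is hypothetical.

```python
# Illustrative sketch: a circulant matrix as one simple example of a structured
# matrix (assumption for illustration; not necessarily the structure used in the
# speaker's work). A circulant matrix C with first column c satisfies
# C @ x = ifft(fft(c) * fft(x)), giving an O(n log n) matvec instead of O(n^2).
import numpy as np

def circulant_matvec(c, x):
    """Fast matvec for the circulant matrix whose first column is `c`."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

# Sanity check against the explicitly materialized dense circulant matrix.
rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)
x = rng.standard_normal(n)
C = np.column_stack([np.roll(c, k) for k in range(n)])  # column k is c rolled down by k
assert np.allclose(C @ x, circulant_matvec(c, x))
```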
Speaker Profile:
Sanjiv Kumar is a Principal Scientist (Research Director) at Google Research, NY, where he currently leads research on the theory and applications of large-scale machine learning. His research interests include massive-scale deep learning, fast training and inference with large output spaces, distributed and privacy-preserving learning, and learning-based hashing. His work on the convergence properties of Adam received the Best Paper Award at ICLR 2018. He has served as adjunct faculty at Columbia University, where he taught a new course on large-scale machine learning. He currently serves as an Action Editor of JMLR. Sanjiv holds a PhD from the School of Computer Science at Carnegie Mellon University.