[BUSINESS LEADER TRACK] PANEL: Pre-Training and Finetuning: How to Train LLMs in an Efficient, Effective, and Affordable Manner

Saeed Contractor
Saeed Contractor is the Global Head of the Intelligent Automation COE (Tech) at Uber, where he leads the COE's technology, architecture, strategy, and implementation. Recognized as a hands-on leader in software architecture and development, Saeed brings together new technologies, engineering, business processes, and mathematics to deliver innovative and effective solutions to difficult problems. He has a strong customer focus and drives the agile development of secure, scalable, reliable, and highly available products, giving due consideration to negative paths and incorporating feedback from every stage of the product life cycle. Saeed holds a Master of Engineering degree from Princeton University and an MBA from UCF.

Shreesha Jagadeesh
Shreesha Jagadeesh is an Associate Director of Machine Learning at Best Buy. He leads a multinational team of ML scientists and engineers building models that power the online customer journey through personalized recommendations and ads. He draws on his expertise in multi-stage recommender systems, LLMs, embeddings, multi-armed bandits, offline policy evaluation, and A/B testing to help digital teams personalize experiences, deepening customer relationships and driving e-commerce revenue for Best Buy.
Prior to Best Buy, he worked in a variety of corporate and consulting roles, including at Amazon, EY, and Cisco, building data science models across domains spanning HR, tax, legal, and supply chain. Outside of his day job, he advises early-stage startups, reviews pre-publication books and courses, and has published two online data science courses. He lives in Boston with his wife and enjoys travelling to exotic locations, with an Antarctica expedition coming up in December 2024.

Stavros Zervoudakis

Hemant Jain
Hemant works on efficiently fine-tuning and serving LLMs as part of the Platform team at Cohere. Prior to this, he spent three years at NVIDIA developing Triton Inference Server, an open-source solution used to deploy machine learning models into production. He holds a Master's in Data Science from the University of Washington.