Welcome To The ICLR Sponsor Expo!

Expo Schedule


Interpretability with a skeptical and user-centric mind

Tue 4 May 2 p.m. - 3 p.m. PDT
Expo Talk Panel

Interpretable machine learning has been a popular topic of study in the era of machine learning. But are we making progress? Are we heading in the right direction? In this talk, I start with a skeptical look back at the field's past work, before moving on to recent developments in more user-focused methods. The talk will finish with where we are heading and a number of open questions that we should think about.

Join Virtual Talk & Panel Visit Google Booth

Live Panel - Academics@ Presents: Representation Learning at Amazon

Wed 5 May 2 p.m. - 3 p.m. PDT
Expo Talk Panel

How is Amazon using Representation Learning to innovate on behalf of customers? This panel will feature Amazon Scholars and Visiting Academics, including professors from around the world, sharing how they leverage their representation learning expertise and apply their research at Amazon in new, inventive ways. Moderated by Academic Program Manager Lindsey Weil, this panel will provide insight into how Amazon is leveraging representation learning and ways academics can partner with Amazon in this domain. The ICLR community is invited to hear about the panelists’ projects, best practices, and their experiences collaborating with Amazon.

Join Virtual Talk & Panel Visit Amazon Booth

AI Model Efficiency Toolkit talk & demo

Thu 6 May 2 p.m. - 3 p.m. PDT
Expo Talk Panel

Neural network models can be very large and compute-intensive, which makes them challenging to run on edge devices. Model quantization provides significant benefits in power and memory efficiency, as well as latency, but quantizing a 32-bit floating-point model to an 8-bit or 4-bit integer model often results in accuracy loss. In this talk, we present the AI Model Efficiency Toolkit (AIMET), an open-source library that provides advanced quantization and compression techniques for trained neural network models. We also present its latest addition, the AIMET Model Zoo. Together with the models, the AIMET Model Zoo provides recipes for quantizing popular 32-bit floating-point (FP32) models to 8-bit integer (INT8) models with little loss in accuracy. The tested and verified recipes include scripts that optimize TensorFlow or PyTorch models across a broad range of tasks, from image classification, object detection, semantic segmentation, and pose estimation to super resolution and speech recognition.
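To make the FP32-to-INT8 idea concrete, here is a minimal sketch of symmetric per-tensor post-training quantization in plain NumPy. This is a generic illustration of the underlying arithmetic, not AIMET's actual API; the function names `quantize_int8` and `dequantize` are hypothetical.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor quantization: map FP32 values to INT8 codes."""
    # Scale so the largest magnitude maps to 127 (the INT8 positive limit).
    scale = float(np.abs(weights).max()) / 127.0
    q = np.clip(np.round(weights / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate FP32 values from INT8 codes."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.27, 0.003, 1.27], dtype=np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
# Rounding error per element is at most half a quantization step (scale / 2),
# which is the source of the accuracy loss that recipes like AIMET's mitigate.
print(np.abs(w - w_hat).max() <= scale / 2)
```

Storing `q` instead of `w` cuts memory four-fold (1 byte vs. 4 per weight), which is where the power, memory, and latency benefits mentioned above come from.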

Join Virtual Talk & Panel Visit Qualcomm Booth