Machine learning models have grown tremendously in recent years, with some now having hundreds of billions of parameters, and we wonder just how big they can get. How do we deploy ever-larger models, whether in the cloud or on captive infrastructure? Models keep growing, are distilled and annealed, and then grow bigger still. In this episode, Dennis Abts of Groq discusses the scalability of ML models with Stephen Foskett and Chris Grundemann. HPC architecture and concepts are coming to the enterprise, enabling us to work with previously unthinkable amounts of data. At the same time, we are reducing the precision and complexity of models to shrink their size. The result is that businesses will be able to work with ever-larger data sets in the future.
Three Questions
Guests and Hosts
Date: 5/18/2021
Tags: @SFoskett, @ChrisGrundemann, @DennisAbts, @GroqInc