Today’s storage devices (disks and SSDs) already have processors and memory on board, and this is the foundation of computational storage. If drives can process data locally, they can offload communication and processing from the host and reduce the amount of data that ever reaches the CPU or GPU. In this episode, Vladimir Alves and Scott Shadley join Chris Grundemann and Stephen Foskett to discuss the AI implications of computational storage. Modern SSDs already process data, handling tasks like encryption and compression, and they are increasingly taking on applications like machine learning. Just as industrial IoT and edge computing are taking on ML processing, so too are storage devices. Current applications for ML on computational storage include local processing of images and video for recognition, as well as language processing, and these devices may even be able to perform ML training locally, as in federated learning.
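To make the data-movement argument concrete, here is a minimal Python sketch contrasting host-side filtering with in-storage filtering. It is purely illustrative: the Record class, the search functions, and the byte counts are hypothetical stand-ins, not the API of any real computational storage device.

```python
# Illustrative sketch only: no real drive exposes this API.
# The point is how many bytes must cross the storage interface when
# filtering happens on the host versus on the drive itself.

from dataclasses import dataclass


@dataclass
class Record:
    key: str
    payload: bytes  # e.g., an image, log entry, or sensor reading


def host_side_search(records, predicate):
    """Conventional path: every payload is transferred to the host CPU,
    which then applies the filter."""
    transferred = sum(len(r.payload) for r in records)
    matches = [r.key for r in records if predicate(r.payload)]
    return matches, transferred


def on_drive_search(records, predicate):
    """Computational-storage path: the drive's embedded processor applies
    the filter locally and returns only the matching keys."""
    matches = [r.key for r in records if predicate(r.payload)]
    transferred = sum(len(k.encode()) for k in matches)  # results only
    return matches, transferred


if __name__ == "__main__":
    # 1,000 records of a few KiB each; only a handful match the query.
    records = [
        Record(f"rec{i}", (("ERROR" if i % 250 == 0 else "OK") * 512).encode())
        for i in range(1000)
    ]
    wants_error = lambda payload: b"ERROR" in payload

    _, host_bytes = host_side_search(records, wants_error)
    keys, drive_bytes = on_drive_search(records, wants_error)
    print(f"Host-side filtering moved {host_bytes:,} bytes to the CPU")
    print(f"On-drive filtering moved {drive_bytes:,} bytes (matches: {keys})")
```

The same intuition applies to images, video, or model updates in federated learning: when the drive can run the recognition or training step locally, only the small result needs to travel to the CPU or GPU.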
Three Questions
Guests and Hosts
Guests: Vladimir Alves and Scott Shadley, NGD Systems
Hosts: Stephen Foskett and Chris Grundemann
Date: 7/13/2021 Tags: @SFoskett, @ChrisGrundemann, @SMShadley, @NGDSystems