Enabling Efficient and Flexible FPGA Virtualization for Deep Learning in the Cloud
FCCM, pp. 102-110, 2020.
EI indexed.
Abstract:
FPGAs have shown great potential in providing low-latency and energy-efficient solutions for deep neural network (DNN) inference applications. Currently, the majority of FPGA-based DNN accelerators in the cloud run in a time-division multiplexing way for multiple users sharing a single FPGA, and require re-compilation with ~100s overhead…