On the Burstiness of Distributed Machine Learning Traffic
CoRR (2023)
Abstract
Traffic from distributed training of machine learning (ML) models makes up a
large and growing fraction of the traffic mix in enterprise data centers. While
work on distributed ML abounds, the network traffic generated by distributed ML
has received little attention. Using measurements on a testbed network, we
investigate the traffic characteristics generated by the training of the
ResNet-50 neural network with an emphasis on studying its short-term
burstiness. For the latter we propose metrics that quantify traffic burstiness
at different time scales. Our analysis reveals that distributed ML traffic
exhibits a very high degree of burstiness on short time scales, exceeding a
60:1 peak-to-mean ratio on time intervals as long as 5 ms. We observe that
training software orchestrates transmissions in such a way that burst
transmissions from different sources within the same application do not result
in congestion and packet losses. An extrapolation of the measurement data to
multiple applications underscores the challenges of distributed ML traffic for
congestion and flow control algorithms.
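The abstract's headline number is a peak-to-mean ratio measured over fixed time intervals (e.g. 5 ms). As an illustration only, not the paper's actual metric definition, one plausible way to compute such a ratio from a packet trace is to bin bytes into fixed-length intervals and divide the busiest bin by the average over all bins in the observed span:

```python
from collections import defaultdict

def peak_to_mean(packets, bin_ms=5.0):
    """Peak-to-mean ratio of per-bin byte counts.

    packets: iterable of (timestamp_ms, size_bytes) pairs.
    Empty bins within the observed span count toward the mean,
    so concentrated bursts yield a high ratio.
    """
    bins = defaultdict(int)
    for t, size in packets:
        bins[int(t // bin_ms)] += size
    first, last = min(bins), max(bins)
    n_bins = last - first + 1          # include empty bins
    mean = sum(bins.values()) / n_bins
    return max(bins.values()) / mean

# Hypothetical trace: three packets in one 5 ms bin, one much later.
trace = [(0.5, 1500), (1.0, 1500), (1.5, 1500), (47.0, 1500)]
print(peak_to_mean(trace))  # 4500 / (6000 / 10) = 7.5
```

A trace with traffic spread evenly across bins would give a ratio near 1, whereas the 60:1 ratios reported above indicate that nearly all bytes land in a small fraction of the 5 ms intervals.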