EachPod

Efficient Deep Learning Parallelization using SOAP Search Space and FlexFlow Framework

Author
Arjun Srivastava
Published
Sat 31 Aug 2024
Episode Link
https://arjunsriva.com/podcast/podcasts/1807.05358/

The paper introduces the SOAP search space, comprising the Sample, Operation, Attribute, and Parameter dimensions, for finding fast parallelization strategies for deep neural network training. The FlexFlow framework explores this vast space efficiently using a guided randomized search algorithm paired with a novel execution simulator, achieving significant speedups in DNN training.
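To make the SOAP dimensions concrete, here is a minimal sketch of what a per-operation parallelization configuration might look like. All names and fields are illustrative assumptions, not FlexFlow's actual API: each operation independently chooses a degree of parallelism along the Sample, Attribute, and Parameter dimensions, while device placement covers the Operation dimension.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class OpConfig:
    """Hypothetical per-operation parallelization config (SOAP dimensions)."""
    sample: int = 1            # Sample: data parallelism over training samples
    attribute: int = 1         # Attribute: split over tensor attributes (e.g. image height)
    parameter: int = 1         # Parameter: model parallelism over weight partitions
    devices: Tuple[int, ...] = (0,)  # Operation: which devices run this op

    def degree(self) -> int:
        # Total number of parallel tasks this configuration creates.
        return self.sample * self.attribute * self.parameter

# A full strategy assigns a configuration to every operation in the network;
# different operations may use entirely different parallelization schemes.
strategy: Dict[str, OpConfig] = {
    "conv1": OpConfig(sample=4, devices=(0, 1, 2, 3)),               # pure data parallelism
    "fc1":   OpConfig(parameter=4, devices=(0, 1, 2, 3)),            # model parallelism
    "conv2": OpConfig(sample=2, attribute=2, devices=(0, 1, 2, 3)),  # hybrid strategy
}

# Sanity check: each op's parallel degree matches its device count.
assert all(cfg.degree() == len(cfg.devices) for cfg in strategy.values())
```

Allowing each operation its own configuration is what distinguishes the SOAP space from uniform data or model parallelism, where one scheme is applied to the whole network.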

The SOAP search space permits flexible parallelization strategies across the Sample, Operation, Attribute, and Parameter dimensions, yielding speedups of up to 3.8x over existing approaches. FlexFlow's execution simulator predicts a strategy's performance without running it on real hardware, which keeps the search fast and makes exploring the large space practical.
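The guided randomized search can be sketched as a Metropolis-style loop: propose a small random change to the current strategy, score it with the simulator, and accept improvements always and regressions with decaying probability. The cost function below is a toy stand-in for the simulator, and the proposal step is deliberately simplified; both are assumptions for illustration, not FlexFlow's actual cost model.

```python
import math
import random

def simulated_cost(strategy):
    # Toy stand-in for the execution simulator: pretend the optimal
    # degree of parallelism for every operation is 4 (pure assumption).
    return sum((d - 4) ** 2 + 1 for d in strategy.values())

def propose(strategy):
    # Randomly re-parallelize one operation (the random mutation step).
    new = dict(strategy)
    op = random.choice(sorted(new))
    new[op] = random.choice([1, 2, 4, 8])
    return new

def guided_search(strategy, iters=1000, beta=0.5):
    """MCMC-style search: simulate candidates instead of executing them."""
    cur, cur_cost = strategy, simulated_cost(strategy)
    best, best_cost = cur, cur_cost
    for _ in range(iters):
        cand = propose(cur)
        cand_cost = simulated_cost(cand)
        # Always accept improvements; accept regressions with
        # probability exp(-beta * delta) to escape local minima.
        delta = cand_cost - cur_cost
        if delta <= 0 or random.random() < math.exp(-beta * delta):
            cur, cur_cost = cand, cand_cost
            if cur_cost < best_cost:
                best, best_cost = cur, cur_cost
    return best, best_cost

random.seed(0)
init = {"conv1": 1, "conv2": 1, "fc1": 1}  # degrees of parallelism per op
best, cost = guided_search(init)
```

Because every candidate is scored by the simulator rather than a real training run, thousands of strategies can be evaluated in the time a single measured execution would take.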

Read full paper: https://arxiv.org/abs/1807.05358

Tags: Deep Learning, Parallelization, Distributed Computing, Neural Networks, Optimization
