Workshop: Sample Transfer Optimization with Adaptive Deep Neural Network
Abstract: Transfer configurations play a crucial role in achieving desirable performance in high-speed networks, where suboptimal settings can lead to poor transfer throughput. However, discovering the optimal configuration for a given transfer task is a difficult problem, as it depends on various factors including dataset characteristics and network settings. The state-of-the-art transfer tuning solutions rely on sample transfers and evaluate different transfer configurations in an attempt to discover the optimal one in real time. Yet, current approaches to running sample transfers incur significant delay and measurement errors, thus limiting the gains offered by tuning algorithms. In this paper, we take advantage of feedforward deep neural networks (DNNs) to minimize the execution time of sample transfers without sacrificing measurement accuracy. To achieve this goal, we collected 115K data transfer logs in four networks and trained multiple DNNs that predict the convergence time of transfers by analyzing real-time throughput metrics. The results, gathered in various networks with a rich set of transfer configurations, indicate that DNNs can reduce the error rate by up to 50% compared to the state-of-the-art solution while achieving similar sample transfer execution time in most cases. Moreover, by tuning the hyperparameters and model settings, one can balance execution time against error rate based on the specific needs of the user or application.
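The core idea of the approach above can be illustrated with a minimal sketch: a small feedforward network that maps a window of recent real-time throughput measurements to a predicted convergence time for the sample transfer. This is not the authors' implementation; the layer sizes, window length of 8 samples, and ReLU/linear activation choices are illustrative assumptions, and a real model would be trained on the collected transfer logs.

```python
import numpy as np

def relu(x):
    # Standard rectified-linear activation for the hidden layer.
    return np.maximum(0.0, x)

class ThroughputDNN:
    """Toy MLP: window of throughput samples -> predicted convergence time (s).

    Weights are randomly initialized here for illustration; in practice they
    would be learned from historical transfer logs.
    """
    def __init__(self, window=8, hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (window, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0.0, 0.1, (hidden, 1))
        self.b2 = np.zeros(1)

    def predict(self, throughput_window):
        # Forward pass: one hidden ReLU layer, one linear output unit.
        h = relu(np.asarray(throughput_window, dtype=float) @ self.W1 + self.b1)
        return float(h @ self.W2 + self.b2)

# Usage: feed the most recent 8 throughput measurements (e.g., in Gbps)
# from an in-progress sample transfer to estimate time to convergence.
model = ThroughputDNN()
eta = model.predict([9.1, 9.4, 9.3, 9.5, 9.5, 9.6, 9.6, 9.6])
```

In a tuning loop, such a predictor would let the system stop a sample transfer as soon as its throughput is predicted to have converged, rather than waiting a fixed measurement interval.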