Imperial College London

Professor Peter Pietzuch

Faculty of Engineering, Department of Computing

Professor of Distributed Systems

Contact

+44 (0)20 7594 8314

Location

442 Huxley Building, South Kensington Campus


Publications

Citation

BibTeX format

@inproceedings{Mai:2020,
author = {Mai, L and Li, G and Wagenlander, M and Fertakis, K and Brabete, A-O and Pietzuch, P},
booktitle = {14th USENIX Symposium on Operating Systems Design and Implementation (OSDI 20)},
pages = {937--954},
publisher = {USENIX},
title = {KungFu: making training in distributed machine learning adaptive},
url = {https://www.usenix.org/conference/osdi20/presentation/mai},
year = {2020}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB - When using distributed machine learning (ML) systems to train models on a cluster of worker machines, users must configure a large number of parameters: hyper-parameters (e.g. the batch size and the learning rate) affect model convergence; system parameters (e.g. the number of workers and their communication topology) impact training performance. In current systems, adapting such parameters during training is ill-supported. Users must set system parameters at deployment time, and provide fixed adaptation schedules for hyper-parameters in the training program. We describe KungFu, a distributed ML library for TensorFlow that is designed to enable adaptive training. KungFu allows users to express high-level Adaptation Policies (APs) that describe how to change hyper- and system parameters during training. APs take real-time monitored metrics (e.g. signal-to-noise ratios and noise scale) as input and trigger control actions (e.g. cluster rescaling or synchronisation strategy updates). For execution, APs are translated into monitoring and control operators, which are embedded in the dataflow graph. APs exploit an efficient asynchronous collective communication layer, which ensures concurrency and consistency of monitoring and adaptation operations.
AU - Mai,L
AU - Li,G
AU - Wagenlander,M
AU - Fertakis,K
AU - Brabete,A-O
AU - Pietzuch,P
EP - 954
PB - USENIX
PY - 2020///
SP - 937
TI - KungFu: making training in distributed machine learning adaptive
UR - https://www.usenix.org/conference/osdi20/presentation/mai
UR - http://hdl.handle.net/10044/1/85597
ER -
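
The abstract describes Adaptation Policies as mappings from real-time monitored metrics to control actions. The short Python sketch below illustrates that idea only; it is not the KungFu API, and every name, metric value and threshold in it is an invented assumption.

# Hypothetical sketch of an "adaptation policy" in the sense of the abstract:
# a trigger predicate over monitored metrics plus a control action.
# Not the KungFu API; all names and values are illustrative.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Metrics:
    step: int
    gradient_noise_scale: float      # example monitored metric
    signal_to_noise_ratio: float     # example monitored metric


@dataclass
class AdaptationPolicy:
    """Maps monitored metrics to a control action (e.g. cluster rescaling)."""
    trigger: Callable[[Metrics], bool]
    action: Callable[[Metrics], None]

    def maybe_apply(self, metrics: Metrics) -> None:
        if self.trigger(metrics):
            self.action(metrics)


def rescale_cluster(metrics: Metrics) -> None:
    # Placeholder for a cluster-rescaling control action.
    print(f"step {metrics.step}: noise scale {metrics.gradient_noise_scale:.1f} "
          f"is high, requesting more workers")


policies: List[AdaptationPolicy] = [
    AdaptationPolicy(
        trigger=lambda m: m.gradient_noise_scale > 1000.0,
        action=rescale_cluster,
    ),
]

# Toy loop: in a real system the metrics would come from monitoring operators;
# here they are synthetic values.
for step in range(3):
    m = Metrics(step=step,
                gradient_noise_scale=500.0 * (step + 1),
                signal_to_noise_ratio=0.1)
    for p in policies:
        p.maybe_apply(m)

In KungFu itself, according to the abstract, such policies are translated into monitoring and control operators embedded in the TensorFlow dataflow graph rather than executed as a separate Python loop.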