A Generalized Framework for Population Based Training. (arXiv:1902.01894v1 [cs.AI])

Population Based Training (PBT) is a recent approach that jointly optimizes
neural network weights and hyperparameters by periodically copying the weights
of the best performers and mutating hyperparameters during training. Previous
PBT implementations have been synchronized glass-box systems. We propose a
general, black-box PBT framework that distributes many asynchronous "trials"
(a small number of training steps with warm-starting) across a cluster,
coordinated by the PBT controller. The black-box design makes no assumptions
about model architectures, loss functions, or training procedures. Our system
supports dynamic hyperparameter schedules to optimize both differentiable and
non-differentiable metrics. We apply our system to train a state-of-the-art
WaveNet generative model for human voice synthesis. We show that our PBT
system achieves better accuracy, lower sensitivity, and faster convergence
than existing methods, given the same computational resources.
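The exploit-and-explore loop at the heart of PBT can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the names (`train_step`, `evaluate`, `pbt`), the one-dimensional objective, and the perturbation factors are all assumptions chosen for clarity; a real system would run trials asynchronously across a cluster, as the abstract describes.

```python
import random

def train_step(weights, lr):
    # Toy objective: move `weights` toward the optimum at 1.0.
    return weights + lr * (1.0 - weights)

def evaluate(weights):
    # Higher is better: negative distance to the optimum.
    return -abs(1.0 - weights)

def pbt(pop_size=4, trials=20, steps_per_trial=5, seed=0):
    # Hypothetical single-process sketch of PBT's exploit/explore cycle.
    rng = random.Random(seed)
    population = [
        {"weights": 0.0, "lr": rng.uniform(0.01, 0.5)} for _ in range(pop_size)
    ]
    for _ in range(trials):
        # A "trial": a small number of training steps with warm-started weights.
        for member in population:  # run asynchronously in a real system
            for _ in range(steps_per_trial):
                member["weights"] = train_step(member["weights"], member["lr"])
        ranked = sorted(population, key=lambda m: evaluate(m["weights"]),
                        reverse=True)
        best, worst = ranked[0], ranked[-1]
        worst["weights"] = best["weights"]                  # exploit: copy weights
        worst["lr"] = best["lr"] * rng.choice([0.8, 1.2])   # explore: mutate
    return max(evaluate(m["weights"]) for m in population)

print(pbt())  # best score approaches 0.0 (the optimum)
```

Because the evaluation metric here is only compared and never differentiated, the same loop works for non-differentiable objectives, which is the property the black-box framework exploits.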
