
Communication-Efficient and Decentralized Multi-Task Boosting while Learning the Collaboration Graph

2019-01-24
Valentina Zantedeschi, Aurélien Bellet, Marc Tommasi

Abstract

We study the decentralized machine learning scenario where many users collaborate to learn personalized models based on (i) their local datasets and (ii) a similarity graph over the users’ learning tasks. Our approach trains nonlinear classifiers in a multi-task boosting manner without exchanging personal data and with low communication costs. When background knowledge about task similarities is not available, we propose to jointly learn the personalized models and a sparse collaboration graph through an alternating optimization procedure. We analyze the convergence rate, memory consumption and communication complexity of our decentralized algorithms, and demonstrate the benefits of our approach compared to competing techniques on synthetic and real datasets.
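The alternating optimization the abstract describes — fix the graph and update the personalized models, then fix the models and re-estimate a sparse collaboration graph — can be sketched as follows. This is an illustrative toy, not the paper's algorithm: the linear models, the blending rule, and the soft-thresholded similarity weights are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, d = 5, 3

# Local datasets: each user holds a small private regression dataset.
X = [rng.normal(size=(20, d)) for _ in range(n_users)]
true_w = rng.normal(size=d)
y = [Xi @ true_w + 0.1 * rng.normal(size=20) for Xi in X]

# Personalized models (one weight vector per user) and an initially
# uniform collaboration graph (row-stochastic weight matrix).
W = np.array([np.linalg.lstsq(Xi, yi, rcond=None)[0] for Xi, yi in zip(X, y)])
G = np.ones((n_users, n_users)) / n_users

for _ in range(10):
    # Step 1 (models, graph fixed): blend each user's local fit with the
    # graph-weighted average of the other users' current models.
    local = np.array([np.linalg.lstsq(Xi, yi, rcond=None)[0]
                      for Xi, yi in zip(X, y)])
    W = 0.5 * local + 0.5 * (G @ W)

    # Step 2 (graph, models fixed): similarity decays with model distance;
    # a soft threshold sparsifies the graph before row renormalization.
    D = np.linalg.norm(W[:, None, :] - W[None, :, :], axis=-1)
    G = np.exp(-D)
    np.fill_diagonal(G, 0.0)
    G = np.maximum(G - 0.3, 0.0)  # hypothetical sparsity threshold
    row_sums = G.sum(axis=1, keepdims=True)
    G = np.where(row_sums > 0, G / np.maximum(row_sums, 1e-12), G)
```

In a decentralized deployment, step 1 would exchange only model parameters with graph neighbors (never local data), which is what keeps communication costs low.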

URL

http://arxiv.org/abs/1901.08460

PDF

http://arxiv.org/pdf/1901.08460
