Byzantine-robust distributed support vector machine

Xiaozhou Wang, Weidong Liu, Xiaojun Mao*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The development of information technology has diversified data sources and produced large-scale data sets, calling for distributed learning algorithms. In distributed systems, some local machines may behave abnormally and send arbitrary information to the central machine (known as Byzantine failures), which can invalidate distributed algorithms that assume a faultless system. This paper studies Byzantine-robust distributed algorithms for support vector machines (SVMs) in the context of binary classification. Despite a vast literature on Byzantine problems, much less is known about the theoretical properties of Byzantine-robust SVMs because of their unique challenges. We propose two distributed gradient descent algorithms for SVMs, in which the median and trimmed-mean aggregation operations effectively defend against Byzantine failures. Theoretically, we establish the convergence of the proposed estimators and provide their statistical error rates; after a certain number of iterations, the estimators achieve near-optimal rates. Simulation studies and real data analysis demonstrate the performance of the proposed Byzantine-robust distributed algorithms.
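The abstract's core idea, gradient descent for the SVM hinge loss where the central machine aggregates worker gradients by a coordinate-wise median or trimmed mean instead of a plain average, can be sketched as follows. This is a minimal NumPy illustration of the aggregation principle only; the function names, learning rate, regularization constant, and Byzantine-attack model are illustrative assumptions, not the paper's actual algorithms or theoretical setting.

```python
import numpy as np

def hinge_gradient(w, X, y, lam=0.01):
    """Subgradient of (1/n) * sum max(0, 1 - y_i <x_i, w>) + (lam/2)||w||^2."""
    margins = y * (X @ w)
    viol = margins < 1.0  # points violating the margin contribute -y_i x_i
    return -(X[viol] * y[viol, None]).sum(axis=0) / len(y) + lam * w

def aggregate(grads, rule="median", beta=0.1):
    """Byzantine-robust aggregation of worker gradients (coordinate-wise)."""
    G = np.asarray(grads)
    if rule == "median":
        return np.median(G, axis=0)
    # trimmed mean: drop the beta fraction of largest/smallest values per coordinate
    k = int(beta * len(G))
    S = np.sort(G, axis=0)
    return S[k:len(G) - k].mean(axis=0)

def robust_distributed_svm(datasets, byzantine_ids=(), rule="median",
                           lr=0.5, iters=200, lam=0.01, seed=0):
    """Central machine runs gradient descent on robustly aggregated gradients.

    `datasets` is a list of (X, y) pairs, one per local machine; machines in
    `byzantine_ids` send arbitrary (here: large Gaussian) vectors instead of
    their true gradients -- a hypothetical attack for demonstration.
    """
    d = datasets[0][0].shape[1]
    rng = np.random.default_rng(seed)
    w = np.zeros(d)
    for _ in range(iters):
        grads = []
        for j, (X, y) in enumerate(datasets):
            if j in byzantine_ids:
                grads.append(rng.normal(scale=10.0, size=d))  # arbitrary message
            else:
                grads.append(hinge_gradient(w, X, y, lam))
        w -= lr * aggregate(grads, rule)
    return w
```

With a simple mean in place of `aggregate`, even one Byzantine machine can steer the iterate arbitrarily; the median and trimmed mean keep the update close to the honest majority's gradient, which is the robustness mechanism the abstract refers to.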

Original language: English
Pages (from-to): 707-728
Number of pages: 22
Journal: Science China Mathematics
Volume: 68
Issue number: 3
DOIs
State: Published - Mar 2025

Keywords

  • 62H30
  • 68W15
  • Byzantine robustness
  • convergence
  • distributed learning
  • support vector machine
