On convex stochastic variance reduced gradient for adversarial machine learning

Saikiran Bulusu, Qunwei Li, Pramod K. Varshney

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We study the finite-sum problem in an adversarial, distributed setting using stochastic variance reduced gradient (SVRG) optimization. Here, a fraction of the workers are assumed to be Byzantine, exhibiting adversarial behavior by providing arbitrary data. We propose a robust scheme to combat the actions of Byzantine adversaries in this setting, and provide rates of convergence for the convex case. This is the first study of SVRG in an adversarial setting.
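The setting described in the abstract can be illustrated with a minimal sketch: workers compute SVRG-style variance-reduced gradients on a convex finite-sum problem, a fraction of them (the Byzantines) return arbitrary vectors, and the server applies a robust aggregation rule before updating. The coordinate-wise median used below is a standard Byzantine-robust rule chosen for illustration; the paper's specific scheme, the problem instance, and all parameter values here are assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic convex finite-sum problem (least squares):
# f(w) = (1/n) * sum_i 0.5 * (x_i . w - y_i)^2
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true

def grad_batch(w, idx):
    """Stochastic gradient of f on the mini-batch indexed by idx."""
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ w - yb) / len(idx)

def full_grad(w):
    """Full gradient of f, computed once per SVRG epoch at the snapshot."""
    return X.T @ (X @ w - y) / n

def robust_aggregate(grads):
    """Coordinate-wise median of the workers' reported gradients --
    one common Byzantine-robust rule (an illustrative stand-in)."""
    return np.median(np.stack(grads), axis=0)

num_workers, num_byzantine = 10, 3   # a fraction of workers is Byzantine
w = np.zeros(d)
eta = 0.05                           # step size (assumed value)

for epoch in range(30):
    w_snap = w.copy()
    mu = full_grad(w_snap)           # full gradient at the snapshot point
    for _ in range(20):
        grads = []
        for m in range(num_workers):
            if m < num_byzantine:
                # Byzantine worker: reports an arbitrary vector
                grads.append(rng.normal(scale=10.0, size=d))
            else:
                idx = rng.integers(0, n, size=8)
                # SVRG variance-reduced estimator:
                # g_i(w) - g_i(w_snap) + full_grad(w_snap)
                grads.append(grad_batch(w, idx) - grad_batch(w_snap, idx) + mu)
        w = w - eta * robust_aggregate(grads)
```

Despite 3 of 10 workers reporting pure noise, the median-aggregated SVRG iterates drive `w` toward `w_true`; with averaging instead of a robust rule, the same run would be dominated by the Byzantine vectors.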

Original language: English (US)
Title of host publication: GlobalSIP 2019 - 7th IEEE Global Conference on Signal and Information Processing, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728127231
DOIs
State: Published - Nov 2019
Event: 7th IEEE Global Conference on Signal and Information Processing, GlobalSIP 2019 - Ottawa, Canada
Duration: Nov 11 2019 - Nov 14 2019

Publication series

Name: GlobalSIP 2019 - 7th IEEE Global Conference on Signal and Information Processing, Proceedings

Conference

Conference: 7th IEEE Global Conference on Signal and Information Processing, GlobalSIP 2019
Country: Canada
City: Ottawa
Period: 11/11/19 - 11/14/19

Keywords

  • Adversarial machine learning
  • Byzantines
  • Distributed optimization
  • Stochastic Gradient Descent (SGD)
  • Stochastic variance reduced gradient (SVRG)

ASJC Scopus subject areas

  • Information Systems
  • Information Systems and Management
  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Signal Processing

