Nimbus: Secure and Efficient Two-Party Inference for Transformers

  • Zhengyi Li
  • Kang Yang*
  • Jin Tan
  • Wen Jie Lu
  • Haoqi Wu
  • Xiao Wang
  • Yu Yu
  • Derun Zhao
  • Yancheng Zheng
  • Minyi Guo
  • Jingwen Leng*

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

4 Scopus citations

Abstract

Transformer models have gained significant attention due to their power in machine learning tasks. Their extensive deployment has raised concerns about the potential leakage of sensitive information during inference. However, when applied to Transformers, existing approaches based on secure two-party computation (2PC) suffer from two efficiency limitations: (1) resource-intensive matrix multiplications in linear layers, and (2) complex non-linear activation functions such as GELU and Softmax. This work presents Nimbus, a new two-party inference framework for Transformer models. For the linear layer, we propose a new 2PC paradigm together with an encoding approach that securely computes matrix multiplications based on an outer-product insight, achieving 2.9× ∼ 12.5× performance improvements over the state-of-the-art (SOTA) protocol. For the non-linear layer, based on a new observation about the input distribution, we propose a low-degree polynomial approximation for GELU and Softmax that improves the performance of the SOTA polynomial approximation by 2.9× ∼ 4.0×, with an average accuracy loss of only 0.08% compared to non-2PC inference without privacy. Compared with the SOTA two-party inference, Nimbus improves the end-to-end performance of BERTbase inference by 2.7× ∼ 4.7× across different network settings.
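The following minimal Python sketch (not the paper's actual 2PC protocol; the ranges, polynomial degree, and function names are assumptions made for illustration) shows the two plaintext ideas the abstract refers to: viewing a matrix product as a sum of outer products between columns of the activation and rows of the weight, and fitting a low-degree polynomial to GELU on a narrow input range to mimic exploiting a concentrated activation distribution.

```python
import numpy as np
from math import erf, sqrt

def matmul_outer_product(X: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Compute X @ W as a sum of rank-1 outer products (columns of X with rows of W)."""
    n, k = X.shape
    k2, m = W.shape
    assert k == k2, "inner dimensions must match"
    acc = np.zeros((n, m))
    for i in range(k):
        acc += np.outer(X[:, i], W[i, :])
    return acc

def gelu(x: np.ndarray) -> np.ndarray:
    """Exact GELU, used as the fitting target."""
    return 0.5 * x * (1.0 + np.vectorize(erf)(x / sqrt(2.0)))

# Fit a degree-4 polynomial to GELU on a hypothetical narrow range [-5, 3];
# the paper's actual fitting procedure, range, and degree may differ.
xs = np.linspace(-5.0, 3.0, 2001)
coeffs = np.polyfit(xs, gelu(xs), deg=4)
max_err = np.abs(np.polyval(coeffs, xs) - gelu(xs)).max()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, W = rng.standard_normal((8, 16)), rng.standard_normal((16, 4))
    # Outer-product decomposition matches the ordinary matrix product.
    assert np.allclose(matmul_outer_product(X, W), X @ W)
    print(f"degree-4 GELU approximation, max abs error on [-5, 3]: {max_err:.4f}")
```

In a 2PC setting, the outer-product view matters because each rank-1 term pairs an entire column of the input with an entire row of the weight, which is amenable to row/column-wise ciphertext encodings rather than element-wise ones; the sketch above only demonstrates the plaintext decomposition.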

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 37
State: Published - 2024
Externally published: Yes
Event: 38th Conference on Neural Information Processing Systems, NeurIPS 2024 - Vancouver, Canada
Duration: 9 Dec 2024 → 15 Dec 2024
