AMPS-Inf: Automatic Model Partitioning for Serverless Inference with Cost Efficiency

Jananie Jarachanthan, Li Chen, Fei Xu, Bo Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The salient pay-per-use nature of serverless computing has driven its continuous adoption as an alternative computing paradigm for various workloads. Yet, challenges remain open when shifting machine learning workloads to the serverless environment. Specifically, the restriction on deployment size over serverless platforms, combined with the complexity of neural network models, makes it difficult to deploy large models in a single serverless function. In this paper, we aim to fully exploit the advantages of the serverless computing paradigm for machine learning workloads, targeting lower management burden and overall cost while meeting the response-time Service Level Objective (SLO). We design and implement AMPS-Inf, an autonomous framework customized for model inference in serverless computing. Driven by cost-efficiency and timely response, AMPS-Inf automatically generates the optimal execution and resource provisioning plans for inference workloads. The core of AMPS-Inf is the formulation and solution of a Mixed-Integer Quadratic Programming (MIQP) problem for model partitioning and resource provisioning, with the objective of minimizing cost without violating the response-time SLO. We deploy AMPS-Inf on the AWS Lambda platform, evaluate it with state-of-the-art pre-trained Keras models including ResNet50, Inception-V3, and Xception, and compare it with Amazon SageMaker and three baselines. Experimental results demonstrate that AMPS-Inf achieves up to 98% cost savings without degrading response-time performance.
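To convey the general shape of such a problem, the following is a minimal illustrative sketch, not necessarily the authors' exact formulation. It assumes hypothetical symbols: binary variables x_{ij} assigning model layer i to partition (serverless function) j, a memory allocation m_j per function, an estimated per-partition execution time t_j(x, m_j), inter-function data-transfer time d(x), a unit price p per GB-second (AWS Lambda bills by allocated memory multiplied by execution duration), and a response-time bound T_{SLO}.

    \min_{x,\, m} \; \sum_{j} p \, m_j \, t_j(x, m_j)
    \text{s.t.} \quad \sum_{j} x_{ij} = 1 \quad \forall i \quad \text{(each layer placed in exactly one partition)}
    \qquad \sum_{j} t_j(x, m_j) + d(x) \le T_{SLO} \quad \text{(end-to-end response-time SLO)}
    \qquad x_{ij} \in \{0, 1\}, \quad m_{\min} \le m_j \le m_{\max} \quad \text{(platform memory limits)}

Under these assumptions, the product m_j \cdot t_j of two decision-dependent quantities in the objective is what makes the problem quadratic rather than linear, while the binary assignment variables make it mixed-integer.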

Original language: English
Title of host publication: 50th International Conference on Parallel Processing, ICPP 2021 - Main Conference Proceedings
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450390682
DOIs
State: Published - 9 Aug 2021
Event: 50th International Conference on Parallel Processing, ICPP 2021 - Virtual, Online, United States
Duration: 9 Aug 2021 – 12 Aug 2021

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 50th International Conference on Parallel Processing, ICPP 2021
Country/Territory: United States
City: Virtual, Online
Period: 9/08/21 – 12/08/21

Keywords

  • cost efficiency
  • machine learning inference
  • serverless computing
