Scheduling Resources to Multiple Pipelines of One Query in a Main Memory Database Cluster

Zhuhe Fang, Chuliang Weng*, Li Wang, Huiqi Hu, Aoying Zhou

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

To fully utilize the resources of a main memory database cluster, we additionally take independent parallelism into account to parallelize multiple pipelines of one query. However, scheduling resources to multiple pipelines is an intractable problem. Traditional static approaches may waste resources and yield a suboptimal execution order of pipelines, because actual data distributions and fluctuating workloads are hard to predict at compile time. In response, we propose a dynamic scheduling algorithm, List with Filling and Preemption (LFPS), based on two novel techniques. (1) Adaptive filling improves resource utilization by issuing extra pipelines to adaptively fill idle resource 'holes' during execution. (2) Rank-based preemption strictly guarantees that pipelines on the critical path are scheduled first at run time. Interestingly, the latter helps the former fill idle 'holes' with best effort so that multiple pipelines finish as soon as possible. We implement LFPS in our prototype database system. Under TPC-H workloads, experiments show that our work improves the finish time of parallelizable pipelines from one query by up to 2.5X over a static approach and 2.1X over serialized execution.
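The core idea behind LFPS, rank-based, critical-path-first dispatch of pipelines with idle slots filled by extra ready pipelines, can be illustrated with a minimal Python sketch. This is not the paper's implementation: it omits run-time preemption, and the pipeline names, costs, and the `rank`/`schedule` functions are all hypothetical.

```python
# Hypothetical pipeline DAG for one query: each pipeline has an
# estimated cost and a list of successor pipelines it feeds into.
costs = {"scan_A": 4, "scan_B": 3, "join_AB": 5, "scan_C": 2, "join_ABC": 6}
succs = {"scan_A": ["join_AB"], "scan_B": ["join_AB"],
         "join_AB": ["join_ABC"], "scan_C": ["join_ABC"], "join_ABC": []}

def rank(p):
    """Upward rank: cost of p plus the longest downstream path.
    Pipelines on the critical path get the highest ranks."""
    return costs[p] + max((rank(s) for s in succs[p]), default=0)

def schedule(slots):
    """Rank-ordered list scheduling with idle-slot filling: whenever a
    slot frees up, dispatch the ready pipeline with the highest rank,
    so critical-path pipelines always go first."""
    preds = {p: sum(p in ss for ss in succs.values()) for p in costs}
    ready = [p for p, n in preds.items() if n == 0]
    running, t, order = [], 0, []          # running: (finish_time, pipeline)
    while ready or running:
        while ready and len(running) < slots:   # fill every idle slot
            p = max(ready, key=rank)            # critical path first
            ready.remove(p)
            running.append((t + costs[p], p))
            order.append(p)
        t, done = min(running)                  # advance to next finish
        running.remove((t, done))
        for s in succs[done]:                   # release successors
            preds[s] -= 1
            if preds[s] == 0:
                ready.append(s)
    return order, t

order, makespan = schedule(slots=2)
```

With two slots, the sketch dispatches `scan_A` and `scan_B` first (highest ranks), fills the slot freed by `scan_B` with `scan_C`, and only then runs the joins; LFPS additionally preempts lower-rank pipelines when a critical-path pipeline becomes ready.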

Original language: English
Article number: 8566007
Pages (from-to): 533-546
Number of pages: 14
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 32
Issue number: 3
DOIs
State: Published - 1 Mar 2020

Keywords

  • Main memory database
  • filling
  • independent parallelism
  • preemption
  • query processing
  • resource scheduling
