Understanding the Reproducibility Issues of Monkey for GUI Testing

Huiyu Liu, Qichao Kong, Jue Wang, Ting Su, Haiying Sun*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Automated GUI testing is an essential activity in developing Android apps. Monkey is a widely used automated input generation (AIG) tool that efficiently and effectively detects crash bugs in Android apps. However, it struggles to reproduce the crash bugs it detects. To understand the symptoms and root causes of these failures, we conducted a comprehensive study of Monkey's reproducibility issues on Android apps. We focused on Monkey's ability to reproduce crash bugs with its built-in replay functionality and investigated the root causes of its failures. Specifically, we selected six popular open-source apps and automatically instrumented them to monitor the invocations of their event handlers. We then ran 6,000 Monkey test cases on these instrumented apps and collected 56 unique crash bugs. For each bug, we replayed it 200 times using Monkey's replay function and calculated the success rate. Through manual analysis of screen recordings, event-handler log files, and the apps' source code, we pinpointed five root causes of Monkey's reproducibility issues: Injection Failure, Event Ambiguity, Data Loading, Widget Loading, and Dynamic Content. Only 36.6% of the replays successfully reproduced the crash bugs, highlighting Monkey's limitations in consistently reproducing the crashes it detects. We further examined the unsuccessful replays to attribute each failure to a root cause and offer insights for developing future AIG tools.
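The replay protocol in the abstract (200 replays per detected bug, with an overall success rate computed over all replays) can be sketched as follows. The bug identifiers and outcome counts below are illustrative, not the study's data; in practice each replay would re-run Monkey with the same pseudo-random seed (Monkey's `-s` option), which makes it regenerate the same event sequence.

```python
from dataclasses import dataclass

@dataclass
class ReplayResult:
    bug_id: str
    successes: int   # replays that reproduced the crash
    attempts: int    # total replays (200 per bug in the study)

def success_rate(results):
    """Overall reproduction rate: successful replays / all replays."""
    total_ok = sum(r.successes for r in results)
    total = sum(r.attempts for r in results)
    return total_ok / total if total else 0.0

# Illustrative outcomes for three hypothetical bugs.
results = [
    ReplayResult("bug-001", successes=180, attempts=200),
    ReplayResult("bug-002", successes=0, attempts=200),
    ReplayResult("bug-003", successes=40, attempts=200),
]
print(f"Overall reproduction rate: {success_rate(results):.1%}")
```

Aggregating over all replays (rather than averaging per-bug rates) matches the abstract's framing, where 36.6% refers to the fraction of all replays that reproduced their crash.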

Original language: English
Title of host publication: Dependable Software Engineering. Theories, Tools, and Applications - 9th International Symposium, SETTA 2023, Proceedings
Editors: Holger Hermanns, Jun Sun, Lei Bu
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 132-151
Number of pages: 20
ISBN (Print): 9789819986637
DOIs
State: Published - 2024
Event: 9th International Symposium on Dependable Software Engineering: Theories, Tools and Applications, SETTA 2023 - Nanjing, China
Duration: 27 Nov 2023 - 29 Nov 2023

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 14464 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 9th International Symposium on Dependable Software Engineering: Theories, Tools and Applications, SETTA 2023
Country/Territory: China
City: Nanjing
Period: 27/11/23 - 29/11/23

Keywords

  • Android GUI Testing
  • Empirical Study
  • Reproducibility
