Abstract:
Android apps are commonly used nowadays as smartphones have become irreplaceable parts of modern life. To ensure that these apps work correctly, developers need to test them. Testing these apps is laborious, tedious, and often time-consuming. Thus, many automated testing tools for Android have been proposed. These tools generate test cases that aim to achieve as much code coverage as possible. Many testing methodologies are employed, such as model-based testing, search-based testing, random testing, fuzzing, concolic execution, and mutation. Despite much effort, it is not perfectly clear how far these testing tools can cover user behaviours. To fill this gap, we want to measure the gap between the coverage of automated testing tools and manual testing. In this preliminary work, we selected a set of 11 Android apps and ran state-of-the-art automated testing tools on them. We also manually tested these apps by following a guideline on actions that we need to exhaust when exploring the apps. Our work highlights that automated tools need to close some gaps before they can achieve coverage that is comparable to manual testing. We also present some limitations that future automated tools need to overcome to achieve such coverage.

CCS CONCEPTS
• Software and its engineering → Software verification and validation.
Published in: 2024 IEEE/ACM 11th International Conference on Mobile Software Engineering and Systems (MOBILESoft)
Date of Conference: 14-15 April 2024
Date Added to IEEE Xplore: 17 June 2024
Conference Location: Lisbon, Portugal