A Model-Driven Performance Testing Approach for Session-Based Software Systems.

Schulz, Eike (2013) A Model-Driven Performance Testing Approach for Session-Based Software Systems. (Student research project), Kiel University, Kiel, Germany, 56 pp.

Schulz2013AModelDrivenPerformanceTestingApproachForSessionBasedSoftwareSystems.pdf - Submitted Version


The need for high-quality software systems, particularly in e-commerce, is undoubted. Customers generally interpret performance characteristics such as responsiveness or data throughput as indicators of system quality; consequently, the overall performance of a system should be kept high. This requirement implies the need for appropriate and reliable performance testing methods. This paper introduces a model-driven performance testing approach consisting of six tasks, whose underlying ideas are generally applicable to performance tests targeting session-based systems. It provides a systematic procedure for setting up a workload generation environment based on an analytical model. Furthermore, a model-driven implementation of the approach is presented as a proof of concept.

The generation of synthetic workload that matches the workload produced by a population of real users is one of the main challenges in performance testing, particularly in load testing. This requires a suitable underlying model for producing appropriate requests to be sent to the system under test. Our approach uses an analytical model for probabilistic workload generation, which is part of the Markov4JMeter add-on for the performance testing tool JMeter. It is assumed that the behavior specification of the system under test provides domain-specific use cases as inputs to the workload generation model, whereas concrete input values are extracted from measurements. This extraction process constitutes the model-driven part of our approach.
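To illustrate the idea of probabilistic, session-based workload generation, the following is a minimal sketch of a Markov-chain session generator. The state names (use cases) and transition probabilities below are invented for illustration; they are not taken from the thesis or from Markov4JMeter itself, which additionally supports guards, actions, and think times.

```python
import random

# Hypothetical behavior model: each state is a domain-specific use case,
# and each transition carries a probability. The probabilities per state
# sum to 1.0; "logout" is an absorbing state that ends the session.
TRANSITIONS = {
    "login":  [("browse", 0.7), ("search", 0.3)],
    "browse": [("browse", 0.4), ("search", 0.2), ("buy", 0.2), ("logout", 0.2)],
    "search": [("browse", 0.5), ("buy", 0.2), ("logout", 0.3)],
    "buy":    [("logout", 1.0)],
    "logout": [],  # no outgoing transitions: the session ends here
}

def generate_session(start="login", rng=None):
    """Walk the Markov chain from `start` until an absorbing state is reached,
    returning the resulting sequence of use-case invocations."""
    rng = rng or random.Random()
    state, session = start, [start]
    while TRANSITIONS[state]:
        states, weights = zip(*TRANSITIONS[state])
        state = rng.choices(states, weights=weights, k=1)[0]
        session.append(state)
    return session

if __name__ == "__main__":
    # Generate one synthetic user session (seeded for reproducibility).
    print(generate_session(rng=random.Random(42)))
```

Each generated session can then be replayed against the system under test; a load generator would run many such walks concurrently to emulate a user population.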

In the evaluation part, an implementation of our approach is applied to a case study system, demonstrating its practicability. To assess the validity of the approach, several metrics, including a methodology for measuring them, are discussed. We also identify several open issues that must be addressed to show that results obtained with an implementation of our approach are reasonable.

Document Type: Thesis (Student research project)
Keywords: Performance testing, Markov4JMeter, workload generation
Research affiliation: Kiel University > Software Engineering
Projects: Kieker
Date Deposited: 17 Oct 2013 06:53
Last Modified: 24 Nov 2013 10:16
URI: https://oceanrep.geomar.de/id/eprint/22127
