Performance Tester Diary - Episode 5
Introduction
This is Chapter 5, and the final instalment, of our journey through non-functional testing with Otto Perf as he performance tests a new application for OctoCar.
In Chapter 4 Otto was looking at ways to check the impact of his performance tests on the infrastructure that the application was running on. He discovered that the application had been instrumented with Dynatrace, and that he could use it to analyse all aspects of the application architecture.
He also discovered that with the use of a custom JMeter header in his test scripts he could easily identify the transactions that were generated as part of his performance testing. The ability to create custom metrics based on his tests and to produce dashboards meant that Otto was in a very good place when it came to monitoring.
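The idea behind the custom header can be sketched as follows. This is an illustrative example, not Otto's actual script: the field names follow the `x-dynatrace-test` convention that Dynatrace's load-testing integration recognises (LSN = load script name, TSN = test step name, VU = virtual user), but the values are invented here.

```python
# Hypothetical sketch: build an identifying header so that requests
# generated by a load test can be separated from real user traffic
# in the monitoring tool. Values are illustrative only.

def build_test_header(script_name: str, step_name: str, virtual_user: int) -> dict:
    """Return a header that tags a request as load-test traffic."""
    value = ";".join([
        f"LSN={script_name}",   # which load script generated the request
        f"TSN={step_name}",     # which transaction/step within the script
        f"VU={virtual_user}",   # which virtual user sent it
    ])
    return {"x-dynatrace-test": value}

headers = build_test_header("octocar_checkout", "login", 12)
print(headers["x-dynatrace-test"])
# → LSN=octocar_checkout;TSN=login;VU=12
```

In JMeter itself this is not code at all: the same effect comes from adding the header in an HTTP Header Manager so every sampler in the test plan carries it.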
The introduction of application monitoring, and a subsequent increase in server capacity, uncovered an issue with Otto's reporting process: the application had been regressing over time, but because the transactions being measured were still within their agreed non-functional requirement tolerances, the regression had gone unnoticed.
He did some work to track trends across his results so that transaction regression would be picked up in the future. This trend analysis also gave him the ability to run tests at different times of the day and under alternative load profiles, and to compare the differences. In this chapter we will see development finish, and Otto will discover that the end of a programme is not the end of performance testing and will learn about the benefits that creating a regression test can bring.
Otto will also find alternative uses for his performance tests beyond their primary purpose of generating load and concurrency, and will start to understand that the assets created when building performance tests can benefit many other IT and non-IT activities.