
Documentation and Agile Performance Testing

Once upon a time, documentation was one of the most important aspects of Quality Assurance, and this was not limited to functional test efforts but applied to non-functional testing as well.

We spent days, weeks, even months creating Performance Test Strategies, Approaches, Plans, Test Cases, Completion Reports and more.

Most of these documents were required before any automation could be written and before a sensible performance testing framework could be considered.

Before performance testing began, during the performance test cycles and after the tests completed, there was a constant cycle of documentation creation, review, update, review and sign-off.

With a bit of Performance Testing in the middle.

Surely many of us remember the difficulty in getting some of these significant and lengthy documents approved by many, many stakeholders.

Before I go on to tell you why documentation is overrated, I want to caveat it with the fact that for some organisations it is a necessity, as they are following the wishes of their customers and clients, and some organisations still follow a strict waterfall approach to software development where that way of working delivers for them.

This is all fine; this post is more about Documentation for Performance Testing in Agile Delivery, and that is an important distinction.


Is it being reviewed?

If you do work for an organisation where documentation must be produced, I want to share a story with you.

A long time ago I worked for an organisation that was delivering a huge, organisation-wide programme of work. The Quality Assurance team was charged with producing document after document outlining the approach QA would take, and some of these documents were huge, with a significant number of senior signatories.

One of the guiding principles of the documentation was the ‘Elements of Testing’, a common theme across all the documents. A colleague of mine decided to check how thorough the review and sign-off process really was by globally changing ‘Elements of Testing’ to ‘Elephants of Testing’.

The document came back approved from all on the approvers list!

The moral of the story: check that your efforts are being properly reviewed. If you are putting the effort into creating the documents, make sure the reviewers are putting the same effort into reading them.

What is the problem in the Agile World?

Back to the main point of this post: the problem that I think affects the performance testing effort is that some organisations want performance testers to ensure that software developed in an agile framework performs against requirements, whilst still insisting that huge volumes of documentation accompany this effort.

This is extremely difficult, and it means the performance team does not spend its time creating, improving and enhancing tests, or analysing the execution cycles.

Instead, they spend their time writing documentation in the form of strategies and approaches when the project starts, which means they are not able to start writing and executing tests as the first stories are being played.

They are also writing completion reports when they should be analysing the effects of the load generated by the performance tests on the application under test.

What can we do about this?

I am not saying we should not produce any documentation when it comes to performance testing. This would be wrong. I am just saying that the amount of documentation needs to be suitable for the way you are delivering software.

I have never been in favour of huge amounts of up-front documentation when it comes to performance testing. You will need to define how you plan on running the performance testing in terms of tools and frameworks, and you must create a set of well-formed, testable requirements. By well-formed I mean defined in such a way that they are easy to understand and have a quantifiable measure to determine success or failure, for example: ‘95% of search transactions complete in under 2 seconds with 100 concurrent users’.

OctoPerf has a blog post about requirements and how to make them testable which is worth a read; you can find it here.

Once you have your requirements, the measure of success is simple: do the results meet the requirements? With daily execution cycles in your agile approach to performance testing, a simple pass/fail of each test against your requirements is all that is needed.

This means you are not spending too long poring over results on a daily basis, and you can focus your effort elsewhere.

As performance testers we are experts in automation, so there is a strong argument for building an automated solution that checks results against requirements, and for running this automation at the end of your execution cycles.
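To make that concrete, here is a minimal sketch of what such a check might look like. It assumes JMeter writes its results to a results.jtl file in the default CSV format (with its standard label, elapsed and success columns) and that requirements are expressed as 95th percentile response times per transaction; the file name, labels and thresholds are placeholders you would replace with your own.

```python
"""
Minimal sketch: compare JMeter results against performance requirements.
Assumptions: results.jtl is a default CSV-format JTL file, and requirements
are 95th percentile response times (ms) per transaction label.
"""
import csv
from collections import defaultdict

# Hypothetical requirements: transaction label -> 95th percentile threshold in ms.
REQUIREMENTS = {
    "Login": 2000,
    "Search": 1500,
    "Checkout": 3000,
}

def percentile(values, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(values)
    index = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[index]

def check(jtl_path="results.jtl"):
    timings = defaultdict(list)
    with open(jtl_path, newline="") as jtl:
        for row in csv.DictReader(jtl):
            if row["success"] == "true":          # only count successful samples
                timings[row["label"]].append(int(row["elapsed"]))

    all_passed = True
    for label, threshold in REQUIREMENTS.items():
        samples = timings.get(label)
        if not samples:
            print(f"FAIL {label}: no samples recorded")
            all_passed = False
            continue
        p95 = percentile(samples, 95)
        status = "PASS" if p95 <= threshold else "FAIL"
        all_passed = all_passed and status == "PASS"
        print(f"{status} {label}: 95th percentile {p95} ms (requirement {threshold} ms)")
    return all_passed

if __name__ == "__main__":
    raise SystemExit(0 if check() else 1)
```

Returning a non-zero exit code when any requirement fails makes it easy to wire a check like this into a pipeline, so a daily execution cycle can flag problems without anyone manually trawling through results.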

Other things to think about include adding custom headers to your JMeter tests so that test traffic is easy to identify, and, if you work in an organisation that uses tools such as Dynatrace or AppDynamics, writing your JMeter results to these tools, where they will be readily available for performance analysis. Alternatively, you can let OctoPerf do it for you.
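If you do want your JMeter figures alongside your monitoring data, one lightweight option is to push aggregated results to your APM tool's metrics API at the end of a run. The sketch below is purely illustrative: the endpoint, token and line-protocol payload are assumptions loosely modelled on a Dynatrace-style metric ingest API, so check your own tool's documentation for the exact format; a JMeter Backend Listener or OctoPerf's built-in integrations may well be a better fit in practice.

```python
"""
Illustrative sketch only: push a summary metric from a JMeter run to an
APM tool's metrics ingest endpoint. URL, token and payload format are
placeholder assumptions; verify them against your tool's documentation.
"""
import urllib.request

APM_URL = "https://YOUR-ENVIRONMENT.example.com/api/v2/metrics/ingest"  # placeholder
API_TOKEN = "REPLACE_ME"                                                # placeholder

def push_p95(label: str, p95_ms: int) -> None:
    # One metric line per transaction, tagged so dashboards can separate
    # load-test traffic from real user traffic.
    payload = f"jmeter.response_time.p95,transaction={label},source=loadtest {p95_ms}\n"
    request = urllib.request.Request(
        APM_URL,
        data=payload.encode("utf-8"),
        headers={
            "Authorization": f"Api-Token {API_TOKEN}",
            "Content-Type": "text/plain; charset=utf-8",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        print(f"Pushed {label} p95={p95_ms} ms, HTTP {response.status}")

if __name__ == "__main__":
    push_p95("Login", 1850)   # example values only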

Change is not easy

I understand that for some organisations, even those that consider themselves agile, documentation overload is still an issue, and changing project stakeholders’ perception of this is not easy.

We understand that for some, the leap of faith required to go from huge amounts of documentation to very little, and in effect to trust the tests instead, is difficult. The important point to get across is that it is the volume of documentation being reduced, not its quality, and that the quality of the tests and the professionalism of the performance tester are not changing.

Less documentation does not mean less complete testing; it means more complete testing, because the performance testers have more time to focus on writing quality tests rather than writing documentation about the tests.

Benefits of making the change

The overriding benefit of reducing the amount of documentation you produce is that you give the performance testers more time to focus on the most important task of building quality, reusable, stable tests without the need for constant document production.

Conclusion

Documentation in performance testing is overrated and gets in the way of producing quality, reusable tests, and I think any organisation would prefer better tests to better documentation.

We have said that it is not a case of producing no documentation at all, but a small amount of quality documentation, especially requirements, is much better than huge amounts of badly constructed, badly written documentation that is difficult to follow.

Change in some organisations is not easy and it will require effort, but the benefits of better performance testing will be worth it.
