Performance Testing in a Scrum Framework

Agile development teams generally follow the principles of Scrum, where individual teams work together to manage their workload through a shared set of values, principles, and practices.

From a development perspective this gives a team, comprising a Product Owner, Scrum Master, and Development Team, the autonomy to work and deliver in an environment that suits its needs and helps it deliver change for the organisation in a way that maximises efficiency.

This blog post is not an overview of the Scrum methodology, but it does require some understanding of the processes involved, and these will be discussed throughout this post. What we are trying to do is understand how, in an Agile delivery framework, we can make sure that performance testing is not lost or overlooked. Scrum teams work in short sprints, which means that your performance testing must, like the unit tests built by the development teams, be lightweight and, well... agile, for want of a better word.


Scrum Framework

[Figure: the Scrum Framework]

Let’s take a look at the Scrum Framework and familiarise ourselves with its principles. The picture above was taken from the Scrum Alliance website, which is a good source of information on Scrum. This information can be found here.

We will now walk through the various stages of the Scrum Framework and consider how performance testing needs to fit into each of them. Before we do, it is important to understand that this will only work if your development team has either a dedicated performance tester or a performance tester who is shared across a few teams.

To ensure that what you deliver performs acceptably, performance testing needs to be considered at each stage of the process, and this requires a dedicated resource.

Scrum Phases

Product Backlog

The definition of this is:

Product Backlog - An emergent, ordered list of what is needed to improve the product and includes the product goal.

You need to ensure that your performance requirements for the product are included in this backlog. There is a blog post on the definition of requirements that can be found here.

Each new service, piece of functionality, or update to existing functionality defined in the product backlog needs a complementary set of performance requirements. When the Sprint Backlog, which we will discuss later, is defined, the performance requirement items can also be included, ensuring they form part of the Sprint goal.
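As an illustration, a performance requirement can be captured as a small structured record alongside the backlog item it complements. This is a minimal sketch in Python; the story identifier, transaction name, and targets are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class PerformanceRequirement:
    """A non-functional requirement attached to a product backlog item."""
    backlog_item: str        # identifier of the functional story it complements
    transaction: str         # business transaction under test
    p95_ms: float            # 95th percentile response time target, in ms
    throughput_per_min: int  # sustained transactions per minute

def meets_requirement(req: PerformanceRequirement,
                      measured_p95_ms: float, measured_tpm: int) -> bool:
    """True when measured results satisfy the stated targets."""
    return measured_p95_ms <= req.p95_ms and measured_tpm >= req.throughput_per_min

# Illustrative values only
req = PerformanceRequirement("STORY-101", "submit_order",
                             p95_ms=800, throughput_per_min=120)
print(meets_requirement(req, measured_p95_ms=650, measured_tpm=150))  # True
```

Keeping the targets next to the story makes them visible during refinement and gives the Sprint a concrete pass/fail criterion for performance.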

Sprint Planning

The definition of this is:

Sprint Planning - The entire scrum team establishes the sprint goal, what can be done, and how the chosen work will be completed. Planning should be timeboxed to a maximum of 8 hours for a month-long sprint, with a shorter timebox for shorter sprints.

You need to ensure that for each new piece of functionality, or update to an existing piece of functionality, performance testing work is added to the estimate. You can decide at this point whether you need to write new tests or whether an existing test will cover the development being completed.

You can even start to leverage the benefits of Test-Driven Development where tests are developed before code is written. There is a blog post here that outlines how this can be accomplished for performance testing.
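As a sketch of what test-first performance testing can look like, the following defines a performance budget before the feature exists. `submit_order` is a hypothetical stub standing in for the real call, and the sample count and budget are illustrative, not recommendations:

```python
import statistics
import time

def submit_order():
    """Stub for the feature under development; replace with the real call
    once the code exists (the name is hypothetical)."""
    time.sleep(0.01)  # placeholder work

def test_submit_order_p95_under_budget(samples=20, budget_ms=800):
    """Written before the feature: the test defines the performance budget
    the implementation must meet, in the spirit of test-driven development."""
    timings_ms = []
    for _ in range(samples):
        start = time.perf_counter()
        submit_order()
        timings_ms.append((time.perf_counter() - start) * 1000)
    p95 = statistics.quantiles(timings_ms, n=100)[94]  # 95th percentile
    assert p95 <= budget_ms, f"p95 {p95:.0f} ms exceeds {budget_ms} ms budget"

test_submit_order_p95_under_budget()
```

The test fails (stays "red") until an implementation that meets the budget exists, exactly as a functional unit test would.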

Sprint Backlog

The definition of this is:

Sprint Backlog - The set of product backlog items selected for the sprint by the developers (team members), plus a plan for delivering the increment and realizing the sprint goal.

You need to ensure that the time taken to develop and execute your performance testing is considered. You need to understand where you will need to build new performance tests and where you can leverage existing ones.

Part of the plan must include adding any new performance tests developed during the upcoming Sprint to your regression capability. Test execution in the upcoming Sprint will, as already discussed, be a combination of new and existing tests, and all new tests need to be added to your existing test suite for use in future sprints. The time taken to do this must be included in the Sprint Backlog plan.
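One lightweight way to capture this is to treat the regression suite as a list that each Sprint's new tests join once they have been executed. A minimal sketch, with hypothetical test names:

```python
# Tests carried over from earlier sprints (names are illustrative)
regression_suite = ["login", "search", "checkout"]

def plan_sprint_execution(new_tests, suite):
    """This sprint runs the existing suite plus the new tests; the new
    tests then join the suite for all future sprints."""
    to_run = suite + new_tests
    for t in new_tests:
        if t not in suite:
            suite.append(t)
    return to_run

print(plan_sprint_execution(["submit_order"], regression_suite))
# ['login', 'search', 'checkout', 'submit_order']
print(regression_suite)  # the suite now includes submit_order for next sprint
```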

Sprint

The definition of this is:

The Sprint - The heartbeat of scrum. Each sprint should bring the product closer to the product goal and is a month or less in length.

This part of the Scrum methodology is where new tests are written and existing ones executed. For each sprint you will need to:

  • Write new tests for new functionality.

  • Run existing tests for functionality being updated as part of this sprint and contrast this with results from previous performance tests.

  • Run existing tests for functionality not directly being updated as part of this sprint, as there may be an impact on common code, and contrast the results with those from previous tests.

  • Add new tests, once you have verified they perform within your requirements, to the existing test suite for inclusion in all future executions.

If new functionality does not perform acceptably, then this needs to be addressed in this Sprint. If any of the existing tests show a regression from a performance perspective, then this also needs to be addressed as part of this Sprint.
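Contrasting current results with those from previous sprints can be automated. The sketch below compares p95 timings per transaction between two sprints and flags anything that has slowed beyond a tolerance; the transaction names, figures, and 10% tolerance are all hypothetical:

```python
def detect_regressions(previous, current, tolerance=0.10):
    """Compare p95 timings (ms) per transaction between sprints; flag any
    transaction that has slowed by more than the tolerance (10% here)."""
    regressions = {}
    for name, prev_p95 in previous.items():
        cur_p95 = current.get(name)
        if cur_p95 is not None and cur_p95 > prev_p95 * (1 + tolerance):
            regressions[name] = (prev_p95, cur_p95)
    return regressions

previous_sprint = {"login": 300, "search": 450, "checkout": 700}
this_sprint     = {"login": 310, "search": 620, "checkout": 690}
print(detect_regressions(previous_sprint, this_sprint))
# {'search': (450, 620)} -> search has regressed and must be fixed this Sprint
```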

Daily Scrum

The definition of this is:

Daily Scrum - The developers (team members delivering the work) inspect the progress toward the sprint goal and adapt the sprint backlog as necessary, adjusting the upcoming planned work. A daily scrum should be timeboxed to 15 minutes each day.

This is an important time to raise any performance issues, primarily because development time needs to be allocated to resolving them. This will impact the sprint backlog, as it may mean that features are at risk of not being included in this Sprint if effort is diverted elsewhere.

This is where the performance testing must be accurate and empirical evidence must be made available: the Scrum Master will want a good reason to divert resources away from development, and you need to provide one. The performance of each feature in each Sprint must meet your non-functional requirements, and this must be prioritised over the delivery of more features.

Sprint Review

The definition of this is:

Sprint Review - The entire scrum team inspects the sprint's outcome with stakeholders and determines future adaptations. Stakeholders are invited to provide feedback on the increment.

The performance outcomes of all the features included in the Sprint need to be discussed here. If you are finding common performance issues, or common functionality that always presents performance issues, then this is the time to raise them.

In agile teams everyone is responsible for the quality of the product, so constructive feedback on performance should be embraced rather than challenged.

Increment

The definition of this is:

Increment - A sum of usable sprint backlog items completed by the developers in the sprint that meets the definition of done, plus the value of all the increments that came before. Each increment is a recognizable, visibly improved, operating version of the product.

Your performance testing will also increment as you add the performance tests created in the last Sprint to the regression suite. As this regression pack grows it becomes a huge asset to the programme: it can be run between Sprints, or even as part of the production deployment pipeline, to assess performance before promotion to production.
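A simple way to use the regression pack as a pipeline gate is a script that exits non-zero when any measured result breaches its budget, which fails the deployment step. A sketch follows; the transactions, budgets, and measured figures are hypothetical, and in practice the results would be parsed from your load-testing tool's report:

```python
import sys

# Hypothetical measured p95 results (ms) from the regression pack run
results = {"login": 310, "search": 460, "checkout": 690}
# Agreed performance budgets (ms) per transaction
budgets = {"login": 400, "search": 500, "checkout": 800}

failures = {t: (p95, budgets[t])
            for t, p95 in results.items() if p95 > budgets[t]}

if failures:
    for t, (p95, budget) in failures.items():
        print(f"FAIL {t}: p95 {p95} ms > budget {budget} ms")
    sys.exit(1)  # non-zero exit fails the pipeline: do not promote

print("all performance budgets met; safe to promote")
```

Most CI tools treat a non-zero exit code as a failed stage, so this check blocks promotion to production without any tool-specific integration.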

An example of how you can use your performance testing in push-to-production pipelines, as well as a definition of what these are, can be found here.

Sprint Retrospective

The definition of this is:

Sprint Retrospective - The scrum team inspects how the last sprint went regarding individuals, interactions, processes, tools, and definition of done. The team identifies improvements to make the next sprint more effective and enjoyable. This is the conclusion of the sprint.

You need to consider how performance testing went as part of the Sprint, where efficiencies can be made, and how the pairing with developers worked.

Conclusion

This is the definition of how Scrum works as a whole:

Scrum accountabilities, artifacts, and events work together within a sprint cycle. The product owner defines a vision using information from stakeholders and users. They identify and define pieces of value that can be delivered to move closer towards the product goal. Before the developers can work on any pieces of value, the product owner must order the backlog so that the team knows what is most important. The team can help the product owner further refine what needs to be done, and the product owner may rely on the developers to help them understand requirements and make trade-off decisions. (This is where refinement becomes an important tool for the scrum team.)

During sprint planning, the developers pull a chunk from the top of the product backlog and decide how they will complete it. The team has a set time frame, the sprint, to complete their work. They meet at the daily scrum to inspect progress towards the sprint goal and plan for the upcoming day. Along the way, the scrum master keeps the team focused on the sprint goal and can help the team improve as a whole.

At the end of the sprint, the work should be potentially shippable and ready to be used by a user or shown to a stakeholder. After each sprint, the team conducts a sprint review on the Increment and a retrospective on the process. Then they choose the next chunk of the backlog and the cycle repeats.

Transitioning to an agile framework such as scrum requires a new mindset and overall cultural adjustments. And like all change, it doesn't come easy. But when teams and organizations fully commit to scrum, they'll discover a new sense of flexibility, creativity, and inspiration - all of which will lead to greater results.

From a performance testing perspective, you are using the Scrum framework not only to test new features as they are developed but also to ensure that performance does not regress in existing features.

Being embedded in the development teams means that performance issues are resolved as they are found, and fixing them is much quicker than investigating after development has completed.

If you are new to this way of working it may seem strange at first, but once the benefits of early detection and resolution of performance issues are seen first-hand, you will find it a very effective way of delivering change.
