
Using JMeter Assertions

We are going to discuss JMeter Assertions in this post. There is already a fantastic Ultimate Guide to JMeter Assertions in the JMeter section of the OctoPerf Blog, which can be found here. Before reading this post, it is advisable to read that guide to get a firm understanding of how assertions work and the various types that exist.

This blog post builds on that information and considers how assertions can help your performance testing, especially when building complex tests and using assertions to make decisions as your tests run.

The aim of this post is not to reiterate the contents of the Ultimate Guide but to take one or two of the assertions as examples and discuss how they can be used in the wider context of performance testing scenarios.

The reality is that any of the assertions can be used to manage your performance testing; which are best suited depends on the technology you are testing.


Cost of Assertions

This is mentioned in the Ultimate Guide and is important. Using Assertions does have an impact on CPU and Memory, and you need to be mindful of this. If you are using a cloud-based solution for your load injector, then you should be able to offset the overhead by dynamically increasing the amount of CPU and Memory available to your load injector.

If, however, you are using a physical load injector then you will have to use your Assertions sparingly; we will discuss this further during the post. We will step through some of the ways that Assertion use can complement your performance testing, and hopefully our set of examples will encourage you to look at other ways they can be useful.

There are many combinations of Assertions that can be used to create complex performance tests, certainly too many to discuss in one post. We will therefore cover a few examples that we have used in the past, which will provide you with a starting point for using them in your own real-world scenarios.

Is our test working?

Before running a full-scale performance test, you might want to check that the application under test is running as expected. Or maybe check whether it is correctly configured after a deployment, for example, before you start to place it under load.

Assertions are ideal for this because for a Sanity test you will be placing a small load on the system and any CPU or Memory constraints will not be an issue. Let’s explore how we can use assertions to check an application is ready to be performance tested.

We will build a dummy test:

jmeter-test-initial

Very simple test using Fragments and a number of Dummy Samplers

The Logon Fragment looks like this:

jmeter-logon-dummy-sampler

The Application Dummy Sampler looks like this:

jmeter-application-dummy-sampler

The Claims Dummy Sampler looks like this:

jmeter-claims-dummy-sampler

The Quote Dummy Sampler looks like this:

jmeter-quote-dummy-sampler

The Logoff Dummy Sampler looks like this:

jmeter-logoff-dummy-sampler

Very straightforward: if we run the Thread Group we have created, we get positive dummy responses for all samplers.

jmeter-initial-test-results

If we think about how this would work as a sanity test, we might conclude:

  • If logon fails, stop the test
  • If application fails, stop the test, because we cannot execute a quote or claim without an application
  • If a quote fails, continue, because we could still execute a claim
  • If a claim fails, continue to logoff

This is obviously very simplistic but would be the sort of thing you would want to accomplish in your sanity tests, where you want to continue to check the application under certain conditions but not under others. Let’s consider how we would go about doing this using Assertions.

We start by getting our application sampler to fail; we do this by unchecking the Successful sample box.

set-application-to-fail

If we run our test and look at the results, we see that Application fails but the test continues.

jmeter-application-fail-one

This is not what we want to happen: according to our sanity test conditions, the test should stop.

We could set the Thread Group to Stop Thread on Sampler error:

jmeter-thread-group-stop

And this would work for our application sampler failure:

jmeter-application-fail-two

But it would not allow us to check the quote or claim failure conditions outlined above, as the test would not move on. Let's reset the Thread Group to Continue on sampler failure and add some assertions to manage the flow of the sanity test based on our conditions.

For the purposes of this sanity test we are going to assume that all valid responses return a 200 response code. There are many valid response codes, as outlined in the Mozilla response status codes documentation, and your application may return a different one for a valid response; you would tailor your assertions accordingly.
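For instance, here is a minimal sketch of a check that accepts more than one valid code; the set of codes is an assumption you would replace with whatever your application actually returns:

// Hypothetical set of response codes this application treats as valid.
def validCodes = ["200", "201", "204"]
if(!validCodes.contains(SampleResult.getResponseCode())) {
    SampleResult.setSuccessful(false)
}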

We have added a JSR223 Assertion to the Logon and Application samplers, as we want the test to stop if these return a non-valid response, and a Response Assertion to the Quote, Claim and Logoff samplers, as we want to just report on these but let the test continue.

Let’s look at both of these Assertions, starting with the Response Assertion that simply checks for a 200 response.

response-assertion-example
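Assuming the standard Response Assertion GUI, this means setting the Field to Test to Response Code, the Pattern Matching Rule to Equals, and adding 200 as the pattern to test.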

We will then update the Response Code of the Quote, Claim and Logoff Dummy Samplers to be 401. The example below is the Claim sampler, but the other two are identical.

dummy-sampler-failure-response-code

If we run our test, we can see that it reports failed assertions but continues anyway.

response-assertion-failure

Let’s now look at the JSR223 Assertion that checks for a response and determines what to do.

jsr223-assertion-example

Let’s look at the code in a bit more detail:

// Stop the whole test if the sampler did not return a 200 response.
if(SampleResult.getResponseCode() != "200") {
    SampleResult.setSuccessful(false)
    ctx.getEngine().stopTest()
}
else {
    SampleResult.setSuccessful(true)
}

We are using some of the exposed script variables in the form of SampleResult and ctx. For more info, check the Javadoc for the sample result and JMeter Context.

We are checking for a 200 response: if found, we mark the sampler as successful; if not, we mark it as a failure and stop the test.
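As an aside, the JSR223 Assertion also exposes an AssertionResult variable, so an alternative sketch is to fail the assertion itself, with a message, rather than setting the sampler result directly:

// Alternative sketch: mark the assertion as failed with a message,
// then stop the test; the failure message wording is our own.
if(SampleResult.getResponseCode() != "200") {
    AssertionResult.setFailure(true)
    AssertionResult.setFailureMessage("Expected 200 but got " + SampleResult.getResponseCode())
    ctx.getEngine().stopTest()
}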

We have already demonstrated in the earlier execution that the success path works, as we saw the test complete. We will now set the Application sampler to return a 401 and check that the test stops and reports a sampler failure.

set-application-to-fail-two

And if we execute the test:

jsr223-assertion-fail

Our test is stopped as defined in our criteria above.

It is likely that you do not want to build separate sanity tests and would rather use your existing tests for both sanity testing and performance testing. A very simple way of doing this is to update your JSR223 Assertion to be something like this:

// Only enforce the check when the sanity_test property is set to true.
if(props.get("sanity_test") == "true") {
    if(SampleResult.getResponseCode() != "200") {
        SampleResult.setSuccessful(false)
        ctx.getEngine().stopTest()
    }
    else {
        SampleResult.setSuccessful(true)
    }
}

Here we only run the JSR223 Assertion logic if the sanity_test JMeter Property is set to true. If the property is not set, props.get("sanity_test") returns null, the comparison evaluates to false and the check is skipped.

This will not work for our Response Assertions, so you may have to use JSR223 Assertions throughout your tests.

If we were to then run the test by using this command:

./jmeter.sh -t ~/JMeterAssertions.jmx -Jsanity_test=true

We would see that the Assertion is called in our test and the execution is halted because we are still returning a 401 from the Application sampler.

jmeter-command-line-true

And if we set the parameter to false:

./jmeter.sh -t ~/JMeterAssertions.jmx -Jsanity_test=false

Then our assertion is not called and therefore the sampler does not fail.

jmeter-command-line-false
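As an alternative to passing -J on the command line, the sanity_test property could also be set in JMeter's user.properties file, although passing it per run keeps the switch explicit.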

Stopping performance tests

We have already used this technique in the section above, but it is worth briefly noting that you can use Assertions to stop tests based on any sampler result.

Using the techniques above will allow you to stop tests on response conditions during your performance tests. As long as the JSR223 Assertion is used sparingly and is not overly complicated, its overhead on CPU and Memory is negligible.

Stopping performance tests is only really a good idea on wholesale failure, as a test that is slow with many failures can still tell you a lot about the performance of your application under test. A performance test full of slow response times can be seen as a good test, as it helps you uncover where your performance bottlenecks are. But it is worth understanding that you can use Assertion logic to stop tests early should you want to, as sketched below.
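For example, here is a minimal sketch of a JSR223 Assertion that halts the test when a sampler becomes unacceptably slow; the 10 second ceiling is an assumption you would tune to your own requirements:

// Stop the test if this sampler took longer than an assumed 10 second
// ceiling; SampleResult.getTime() returns the elapsed time in milliseconds.
if(SampleResult.getTime() > 10000) {
    SampleResult.setSuccessful(false)
    ctx.getEngine().stopTest()
}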

Making a logical decision

We are going to look at how we can use assertions to introduce some conditional logic into your tests.

There are a number of Logic Controllers in JMeter, and these are very powerful and extremely flexible. Using them in conjunction with Assertions to determine your Test Plan logic is very easy, and we will look at how you can accomplish this in this section. For our example we are going to use a While Controller, an If Controller and a Counter.

These coupled with some Assertion logic will help us build a robust performance test.

As the logic in our JSR223 Assertions is relatively simple, the CPU and Memory footprint should not be so great as to impact your performance tests under load. Before we look at the updates we are going to make to our JMeter Test Plan, let's again define some conditions to manage:

  • If logon fails, stop the test
  • If application fails, try again for up to 5 iterations; if one cannot be created, log off
  • Only run quote if application is successful
  • Only run claim if application is successful
  • If claim fails, continue to logoff

Before we start, we are going to duplicate the fragments we created and make our changes to the copies, as we will make the script available in this blog post and want to preserve the previous examples.

duplicate-fragments

Our first change is to the Logon Assertion.

jmeter-logon-assertion-one

// Record the logon outcome in a variable for the flow-control logic below.
if(SampleResult.getResponseCode() != "200") {
    SampleResult.setSuccessful(false)
    vars.put("Logon_Check", "false")
}
else {
    SampleResult.setSuccessful(true)
    vars.put("Logon_Check", "true")
}

We are going to set a variable called Logon_Check to be true or false depending on the status of the Assertion.

We now move to the Application fragment where we have changed the assertion to be:

jmeter-application-assertion-one

// Record the application outcome; the While Controller keys off this variable.
if(SampleResult.getResponseCode() != "200") {
    SampleResult.setSuccessful(false)
    vars.put("Application_Check", "false")
}
else {
    SampleResult.setSuccessful(true)
    vars.put("Application_Check", "true")
}

We are going to set a variable called Application_Check to be true or false depending on the status of the Assertion.

We have also created a Counter:

jmeter-application-counter

This effectively iterates 5 times, and its value is held in a variable called applicationCounter.

And we have a While Controller that manages the iteration of the application sampler.

jmeter-application-while-controller

The logic in the While Controller is:

${__groovy(vars.get('Logon_Check') == 'true' && vars.get('Application_Check') != 'true' && (vars.get('applicationCounter') as int) < 5,)}

Which effectively reads: keep iterating while:

  • Logon is successful
  • Application Check is unsuccessful
  • The iteration count is less than 5

If any of these conditions is not met, then the While Loop will exit.

Effectively, the While Loop will not run if logon is unsuccessful; it will exit once an application sampler succeeds, or once it has tried unsuccessfully to create an application 5 times.

The Quote and Claim fragments both have the same If Controller.

jmeter-claim-assertion

jmeter-quote-assertion

${__groovy(vars.get('Logon_Check') == 'true' && vars.get('Application_Check') == 'true')}

Which checks that logon is successful and an application has been created.

The Logoff fragment has this If Controller.

jmeter_logoff_assertion

${__groovy(vars.get('Logon_Check') == 'true')}

Which checks for a successful logon before logging off.

Let’s check that our logic works. Firstly, we will return a 401 response from logon.

jmeter-execution-logon-failure

We see that if logon fails nothing else is executed.

We will now set logon to return a 200 but application to return a 401.

jmeter-execution-application-failure

We see that logon is successful, we try 5 times to create an application, and we then log off.

We will now set application to return a 200.

jmeter-execution-all-success

And we can see that our test completes successfully.

End of test outputs

Finally, we will briefly touch on the fact that you can use the same assertion logic to determine which tasks run at the end of your test, perhaps as part of the tearDown Thread Group. You will, however, need to use a Property rather than a Variable, as the scope needs to span Thread Groups.
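As a minimal sketch, the Application assertion from earlier could set a Property alongside the Variable so the tearDown Thread Group can see it; the property name here is our own choice:

// Record the outcome in a property as well as a variable, so that the
// tearDown Thread Group (which runs as a separate thread group) can read it.
if(SampleResult.getResponseCode() != "200") {
    SampleResult.setSuccessful(false)
    props.put("Application_Check_Prop", "false")
}
else {
    SampleResult.setSuccessful(true)
    props.put("Application_Check_Prop", "true")
}

An If Controller in the tearDown Thread Group could then use ${__groovy(props.get('Application_Check_Prop') == 'true')} as its condition.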

Conclusion

These are four very simple examples of how Assertions can be used in your performance testing without consuming too much CPU and Memory, excessive consumption being one of the drawbacks of using them extensively.

Hopefully this blog post has given you some ideas on how you can embellish your tests using Assertions outside of their more obvious use of checking for valid responses.

The script used in this post is available here.
