JMeter

Performance Testing, Artificial Intelligence and Machine Learning

We are going to look at how performance testing can work hand in hand with Artificial Intelligence and Machine Learning:

  • We will look at how we can use data gathered from performance test scenarios during execution to determine what functionality and what concurrent volumes and load profiles we should be testing.
  • We will use very simple mathematical models to do this in conjunction with a very simple database model with several manual steps to simulate the machine learning.

There are many Artificial Intelligence solutions to choose from, and for the purposes of this post this simplified approach is the easiest way to discuss the principles of Artificial Intelligence working alongside performance testing, rather than focusing on a particular framework.
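To give a taste of the kind of simple mathematical model we mean, here is a minimal sketch in Java. The peak values, the growth calculation and the safety margin are all invented for illustration and stand in for the data we would gather from previous test executions and store in the simple database model described above.

```java
// Hypothetical sketch: derive the next test's target concurrency from
// peak volumes recorded during previous test executions.
import java.util.List;

public class LoadProfilePredictor {

    public static void main(String[] args) {
        // Peak concurrent users observed in the last five executions
        // (illustrative values only).
        List<Integer> observedPeaks = List.of(180, 195, 210, 230, 250);

        // Simple linear trend: average growth between consecutive executions.
        double totalGrowth = 0;
        for (int i = 1; i < observedPeaks.size(); i++) {
            totalGrowth += observedPeaks.get(i) - observedPeaks.get(i - 1);
        }
        double averageGrowth = totalGrowth / (observedPeaks.size() - 1);

        // Predicted peak for the next execution, plus a 20% safety margin.
        int lastPeak = observedPeaks.get(observedPeaks.size() - 1);
        long nextTarget = Math.round((lastPeak + averageGrowth) * 1.2);

        System.out.println("Suggested concurrency for next test: " + nextTarget);
    }
}
```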

Continuous Delivery Test Reporting

We said in a previous blog post on documentation that if you are building performance tests to support either Continuous Integration or Continuous Delivery, then having to produce performance testing documentation before and after each test does not fit with this methodology.

This got us thinking that if we truly want to use an agile framework and continually push code to production, then we also need to consider how we monitor and report on the performance test results produced by our testing tools.

If you have an agile testing process, then you will have functional tests built into your deployment process that run in each environment. If that process is mature and robust, it deploys to production with no manual intervention once all the functional tests and any deployment checks have passed.

In reality we also need a performance testing stage in this process, as each release needs to go through a performance test, whether that is a full suite of peak volume, soak and scalability test scenarios or a regression test of core functionality.

This is what we are going to discuss in this blog post, because if your organisation wants to push code directly to production then you do not have the luxury of running your performance test, analysing the results and writing up a report to be signed off.
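One way to remove that manual step is to have the pipeline evaluate the results itself. Below is a minimal sketch of such a gate, assuming the test writes a CSV JTL file named results.jtl with the default JMeter headers; the file name and thresholds are illustrative assumptions rather than recommendations.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class PerformanceGate {

    public static void main(String[] args) throws IOException {
        // Assumes a CSV JTL with the default header row and no embedded
        // commas in label or message fields (a simplification).
        List<String> lines = Files.readAllLines(Path.of("results.jtl"));
        List<String> header = Arrays.asList(lines.get(0).split(","));
        int elapsedIdx = header.indexOf("elapsed");
        int successIdx = header.indexOf("success");

        List<Long> elapsed = new ArrayList<>();
        long failures = 0;
        for (String line : lines.subList(1, lines.size())) {
            String[] cols = line.split(",");
            elapsed.add(Long.parseLong(cols[elapsedIdx]));
            if (!Boolean.parseBoolean(cols[successIdx])) {
                failures++;
            }
        }

        Collections.sort(elapsed);
        long p95 = elapsed.get((int) Math.ceil(elapsed.size() * 0.95) - 1);
        double errorRate = (double) failures / elapsed.size();

        // Illustrative thresholds: 95th percentile under 2 seconds and an
        // error rate below 1%; adjust to your own non-functional targets.
        boolean passed = p95 < 2000 && errorRate < 0.01;
        System.out.printf("p95=%dms errorRate=%.2f%% -> %s%n",
                p95, errorRate * 100, passed ? "PASS" : "FAIL");
        System.exit(passed ? 0 : 1);
    }
}
```

A non-zero exit code is enough for most CI tools to fail the pipeline stage, which replaces the manual analysis and sign-off with an automated check against agreed thresholds.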

Updating JMeter Performance Tests with an XML parser

When building performance tests, we all understand the value of using properties or variables to store static values outside of our tests. This ensures that any changes to these values need only be made in one place rather than having to make these changes in many tests.

Sometimes, though, you may have inherited a suite of JMeter tests, or you were under pressure to develop these tests and in order to do so you hardcoded values in them. This means that if anything changes, whether an endpoint, the server name or even the payload of a sampler, you need to update these static values in your tests.
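As an illustration of the approach, here is a minimal sketch using the JDK's built-in DOM and XPath support, assuming the hardcoded value is the server name stored in the standard HTTPSampler.domain property of each HTTP Request sampler; the file name and replacement host are placeholders.

```java
import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class JmxServerNameUpdater {

    public static void main(String[] args) throws Exception {
        // Parse the JMeter test plan (a .jmx file is plain XML).
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File("test-plan.jmx"));

        // Each HTTP Request sampler stores its host in this stringProp.
        NodeList domains = (NodeList) XPathFactory.newInstance().newXPath()
                .evaluate("//stringProp[@name='HTTPSampler.domain']",
                        doc, XPathConstants.NODESET);

        for (int i = 0; i < domains.getLength(); i++) {
            domains.item(i).setTextContent("new-server.example.com");
        }

        // Write the updated test plan back to disk.
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc),
                        new StreamResult(new File("test-plan.jmx")));
    }
}
```

The same pattern applies to any other stringProp in the test plan, such as the port or path properties of a sampler, so one small utility can clean up an entire inherited test suite.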

How to load test LDAP with JMeter

We are going to discuss how we can test the Lightweight Directory Access Protocol (LDAP) using JMeter. The principles of LDAP can be quite complicated, as its origins lie in the X.500 specification, which documents a suite of directory protocols developed by the International Telecommunication Union in the 1980s.

It is likely that you have heard of LDAP as the directory protocol that underpins Active Directory (AD), Microsoft's directory service, which many organisations use to support user authentication and role profiles on their company networks.

It is important to understand that LDAP is not exclusive to Microsoft, although it does allow users to query AD and authenticate access to it.

To demonstrate how we can test LDAP using JMeter we are going to use an online test server provided by Forumsystems, which means you can follow this tutorial even if you don't have access to an LDAP server of your own. If you need to look at the solution, the JMeter project can be found here.

A limitation of using this online LDAP test server is that we only have read-only access, meaning we can only test the following functions:

  • Bind
  • Unbind
  • Search
  • Compare

But JMeter also offers the ability to:

  • Rename
  • Add
  • Delete
  • Modify

We will be unable to demonstrate these operations on LDAP entries using this service, but we will nonetheless provide some examples of these test types.
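To give a feel for what a Bind followed by a Search looks like outside of JMeter, here is a minimal sketch using plain JNDI. The host, bind DN, password, search base and filter are assumptions based on the commonly published details for the Forumsystems test server, so verify them against the provider's documentation before use.

```java
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;

public class LdapBindAndSearch {

    public static void main(String[] args) throws Exception {
        // Connection details are assumptions based on publicly documented
        // values for the Forumsystems test server; check before use.
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://ldap.forumsys.com:389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, "cn=read-only-admin,dc=example,dc=com");
        env.put(Context.SECURITY_CREDENTIALS, "password");

        // Bind: creating the context authenticates against the server.
        DirContext ctx = new InitialDirContext(env);

        // Search: look for a user entry below the base DN.
        SearchControls controls = new SearchControls();
        controls.setSearchScope(SearchControls.SUBTREE_SCOPE);
        NamingEnumeration<SearchResult> results =
                ctx.search("dc=example,dc=com", "(uid=einstein)", controls);
        while (results.hasMore()) {
            System.out.println(results.next().getNameInNamespace());
        }

        // Unbind: closing the context releases the connection.
        ctx.close();
    }
}
```

The JMeter LDAP Request sampler wraps exactly these operations, so once you understand the bind, search and unbind cycle the sampler configuration becomes straightforward.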

Producing JMeter Tests from OpenAPI

Swagger, in conjunction with OpenAPI, is a way for REST APIs to be built, documented and consumed. The definition is written in either YAML or JSON. OpenAPI and Swagger are both open source, their use is commonplace amongst development teams, and they can make the life of a performance tester a lot easier as they allow us to build JMeter tests directly from the definition of the endpoints.

The objective of this post is to look at how we can produce JMeter tests directly from a Swagger definition. If you are unfamiliar with Swagger it is worth spending some time reading the online overview, as it will provide a good understanding of how the specification works.

Clearly, in order to leverage the benefits of building performance tests from Swagger definitions, you are going to need your development teams to use it to describe the API contract for the services they are producing; for the purposes of this post we will use a test definition called Swagger Petstore.
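As a rough illustration of why a machine-readable definition helps, here is a minimal sketch that uses the swagger-parser library (assumed to be on the classpath as io.swagger.parser.v3:swagger-parser) to read a locally saved copy of the Petstore definition and list every method and path, which is exactly the information needed to create one HTTP Request sampler per endpoint. The file name petstore.yaml is a placeholder.

```java
import io.swagger.v3.oas.models.OpenAPI;
import io.swagger.v3.parser.OpenAPIV3Parser;

public class EndpointLister {

    public static void main(String[] args) {
        // Parse the OpenAPI/Swagger definition from a local file.
        OpenAPI api = new OpenAPIV3Parser().read("petstore.yaml");

        // Print the HTTP method, path and operationId of every operation.
        api.getPaths().forEach((path, item) ->
                item.readOperationsMap().forEach((method, operation) ->
                        System.out.printf("%s %s (%s)%n",
                                method, path, operation.getOperationId())));
    }
}
```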