Load Testing Blog

New Features Tour - March 2025

Watch Our Latest Webinar: New Features Tour

We’re excited to share the recording of our latest webinar, New Features Tour - March 2025!

Over the past 12 months, we improved both our SaaS and On-Premise versions, and this session showcases all the powerful updates we’ve introduced.

What’s New?

  • Trend Report – Track performance evolution over time
  • Insights – Automatically detect potential issues in your test
  • JTL Import – Seamlessly import & analyze your JMeter result files
  • Data Faker – Generate realistic test data effortlessly

Watch the full webinar to see these features in action and learn how they can optimize your performance testing!

Streamlining QA with functional and performance testing integration

Join Danielle Forier, Software Quality Assurance Analyst, as she shares the journey of how her QA team transformed their testing strategy by integrating functional and performance testing. Discover how reusable scripts and the right tools helped them achieve seamless workflows, greater efficiency, and the scalability needed to manage a growing and complex product portfolio.

Danielle Forier
Danielle Forier is an Information Technology leader with over 15 years of experience and a successful track record in key management, consulting, and individual contributor positions. Hands-on business experience in commodity trading, finance, logistics order-to-cash, and SDLC best practices makes her a proven IT change agent.

Cooperative U - Case study

Cooperative U (formerly known as Système U) is a large French retailers' cooperative made up of hundreds of independent hypermarkets and supermarkets, which makes it one of the largest retail groups in France. Its entire IT development and management is handled by U Tech (formerly U IRIS), making performance testing one of its many responsibilities, in collaboration with SIGMA, a digital solutions provider they work closely with.

In 2022, after having been a Neoload customer for multiple years, U Tech decided to rethink its performance strategy and looked for ways to improve its efficiency.

It was during this process that discussions with Octoperf started, which ultimately led to Cooperative U's decision to switch from Neoload to Octoperf.

This article explains why Cooperative U made this decision, how they implemented the change with SIGMA's help, and what they plan to do next to integrate Octoperf even further into their IT environment.

Generating Quality Data

The problem with test data is that it can become stale very quickly, either because testing consumes it or because it naturally ages in the test environments.

This is not just an issue for performance testing, although the volumes of data sometimes required for performance testing do make it harder. It also affects functional testing, batch testing, and business acceptance testing, among others.

We have previously written posts on how, after performance testing completes, the data created by the execution of your tests may be of use to other members of your test or development community. We have also discussed how to make the most effective use of existing data already present in your performance testing environment.

But in both cases, this happens during or after your performance testing; before a performance test can even be executed, you need quality data in your test environments. This is also true for functional, batch and user acceptance testing; really, it's true for any activity that consumes data in the test environments.
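As a rough illustration of the idea behind on-demand data generation (a minimal sketch using only Python's standard library; the field names, name pools and value ranges here are invented for the example, not taken from any particular tool), creating synthetic records at the moment they are needed sidesteps the staleness problem entirely:

```python
import random
from datetime import date, timedelta

# Small illustrative name pools; a real generator would draw from much
# larger dictionaries or locale-aware data sets.
FIRST_NAMES = ["Alice", "Bruno", "Chloe", "David", "Emma"]
LAST_NAMES = ["Martin", "Bernard", "Dubois", "Petit", "Robert"]

def fake_customer(rng: random.Random) -> dict:
    """Build one synthetic customer record with plausible, fresh values."""
    first = rng.choice(FIRST_NAMES)
    last = rng.choice(LAST_NAMES)
    # Order date within the last 30 days, so the data never ages out
    # of business rules that reject old records.
    order_date = date.today() - timedelta(days=rng.randint(0, 29))
    return {
        "name": f"{first} {last}",
        "email": f"{first.lower()}.{last.lower()}@example.com",
        "order_date": order_date.isoformat(),
        "order_total": round(rng.uniform(5.0, 500.0), 2),
    }

rng = random.Random(42)  # fixed seed: reproducible data sets across test runs
customers = [fake_customer(rng) for _ in range(3)]
for c in customers:
    print(c["name"], c["email"], c["order_date"], c["order_total"])
```

Seeding the generator is a deliberate choice: the same seed reproduces the same data set, which makes test runs comparable while still letting you scale the volume up for performance tests simply by generating more records.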

Performance Test Results Trend Analysis

In this post we are going to look at how you can spot trends in your performance test results and use this trend analysis to shape the way you approach performance testing. Performance testing can generate a large volume of data, and using it to define your future performance testing coverage, scope and volumes can add real benefit to your non-functional testing process.

To use data to support performance testing, you need to store it in a way that makes it accessible and comparable. The ideal solution, and the one we will discuss in this post, is to store this data in a database. We will look at the benefits of storing large amounts of performance data, especially data that spans a significant period of time, and at ways of using this data to define performance test coverage and identify areas of performance concern.
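A minimal sketch of this approach, assuming a simple results table and invented run figures (the schema, transaction name and numbers below are illustrative; an in-memory SQLite database stands in for whatever database you actually use):

```python
import sqlite3

# In-memory DB for the sketch; in practice you would use a persistent
# file or a shared database server so results accumulate across runs.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE results (
        run_date TEXT,
        transaction_name TEXT,
        avg_response_ms REAL,
        error_rate REAL
    )
""")

# Hypothetical results from three successive runs of the same scenario.
rows = [
    ("2025-01-15", "login", 210.0, 0.1),
    ("2025-02-15", "login", 245.0, 0.2),
    ("2025-03-15", "login", 290.0, 0.4),
]
conn.executemany("INSERT INTO results VALUES (?, ?, ?, ?)", rows)

# Trend query: average response time per run, ordered chronologically.
# A steady increase flags a transaction worth deeper test coverage.
trend = conn.execute("""
    SELECT run_date, avg_response_ms
    FROM results
    WHERE transaction_name = 'login'
    ORDER BY run_date
""").fetchall()

# Run-over-run deltas; all-positive deltas indicate a degradation trend.
deltas = [later[1] - earlier[1] for earlier, later in zip(trend, trend[1:])]
if all(d > 0 for d in deltas):
    print("login response time is trending upward:", [r[1] for r in trend])
```

Because every run lands in the same table with the same schema, the comparison is a single query rather than a manual diff of report files, and the same table can later drive coverage decisions (e.g. which transactions degrade fastest).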