Load Testing Blog

Welcome to OctoPerf's blog, where we embark on a journey through the dynamic landscape of load testing, a crucial aspect of ensuring your applications thrive in the face of user demands.

Whether you're a seasoned performance engineer or just diving into the world of load testing, our blog is your go-to resource for insights, best practices, and cutting-edge tools.

Category Highlights

  • JMeter Load-testing


    Load-testing using the n°1 open-source solution: Apache JMeter™.

    Learn More

  • Real-Browser Load-testing


    Load-testing using real-browser solutions like Playwright and Selenium WebDriver.

    Learn More

  • Load-testing Methodology


    Load-testing methodology and tools comparison.

    Learn More

  • OctoPerf's releases


    Stay tuned for the latest releases of the JMeter™ Performance Center: OctoPerf!

    Learn More

Start Your Load Testing Journey Today

Whether you're a developer aiming to optimize your code, a QA engineer ensuring application resilience, or a business owner safeguarding user satisfaction, load testing is your ally. Join us on this expedition into the heart of performance excellence.

Are you ready to transform your understanding of load testing? Let the journey begin!

OctoPerf is JMeter on steroids!
Schedule a Demo

Latest Blog Posts

Feature Highlight - Import JTL results from JMeter

OctoPerf offers a JTL import feature that allows anyone to import JMeter results and manipulate them through our reporting UI for free.

What is JTL

JTL stands for JMeter Test Logs. It is the recommended output format for JMeter test results.

It is basically a CSV file that contains a single line for each individual sample result:

(Screenshot: example JTL file)

It is important to note that the JTL file is not structured: it contains the results of a given JMeter instance across all running threads. Because of this, it can be difficult to differentiate samples that share a similar label.
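To make this more concrete, here is a minimal sketch of how such a file could be processed outside of JMeter. It assumes a JTL exported as CSV with the default column names (label, elapsed, success); the file name results.jtl is purely illustrative.

```python
import csv
from collections import defaultdict

def summarize_jtl(path):
    """Aggregate a JTL (CSV) file per sample label: count, average elapsed time, errors."""
    stats = defaultdict(lambda: {"count": 0, "elapsed_sum": 0, "errors": 0})
    with open(path, newline="") as jtl:
        for row in csv.DictReader(jtl):
            entry = stats[row["label"]]
            entry["count"] += 1
            entry["elapsed_sum"] += int(row["elapsed"])
            if row["success"].lower() != "true":
                entry["errors"] += 1
    for label, entry in sorted(stats.items()):
        avg = entry["elapsed_sum"] / entry["count"]
        print(f"{label}: {entry['count']} samples, avg {avg:.0f} ms, {entry['errors']} errors")

summarize_jtl("results.jtl")  # hypothetical file name
```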

Cooperative U - Case study

Cooperative U (formerly known as Système U) is a huge French retailers' cooperative, made up of hundreds of independent hypermarkets and supermarkets, which makes it one of the largest retail groups in France. Its entire IT development and management is handled by U Tech (formerly U IRIS), making performance testing one of its many responsibilities, in collaboration with SIGMA, a digital solutions provider they work closely with.

In 2022, after being a Neoload customer for multiple years, U Tech decided to rethink its performance strategy and looked for ways to improve its efficiency.

It was during this process that discussions with OctoPerf started, which ultimately led to Cooperative U's decision to switch from Neoload to OctoPerf.

This article will explain why Cooperative U made this decision, how they implemented the change with SIGMA's help, and what they plan to do next to integrate OctoPerf even further into their IT environment.

Generating Quality Data

The problem with test data is that it can become stale very quickly, either because it is consumed by testing or because it naturally ages in the test environments.

This is not just an issue for performance testing, although the volumes of data sometimes required for performance testing do make it harder. It also affects functional testing, batch testing and business acceptance testing, amongst others.

We have previously written posts on how, after performance testing completes, the data created by your test executions may be of use to other members of your test or development community. We have also discussed how to make the most effective use of the data already present in your performance testing environment.

But in both cases this happens during or after your performance testing: for performance tests to be executed in the first place, you need quality data in your test environments. The same is true for functional, batch and user acceptance testing; really, it's true for any activity that relies on data in the test environments.

Performance Test Results Trend Analysis

In this post we are going to look at how you can spot trends in your performance test results and use this trend analysis to help shape the way you address performance testing. Performance testing can generate a large volume of data, and using it to define your future performance testing coverage, scope and volumes can add real benefit to your non-functional testing process.

To use data to support performance testing, you need to store it in a way that makes it accessible and comparable. The ideal solution, and the one we will discuss in this post, is to store this data in a database. This post will look at the benefits of storing large amounts of performance data, especially data that spans a significant period of time, and at ways of using this data to define performance test coverage and identify areas of performance concern.
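As a minimal illustration of the idea, the sketch below stores one summary row per test run in SQLite and queries the trend of average response times for a given label. The table layout, column names and sample values are assumptions made for the example, not the schema discussed in the post.

```python
import sqlite3

# Minimal, assumed schema: one summary row per sampler label per test run.
conn = sqlite3.connect("perf_results.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS run_summary (
        run_date   TEXT,   -- ISO date of the test run
        label      TEXT,   -- business transaction / sampler label
        avg_ms     REAL,   -- average response time in milliseconds
        error_rate REAL    -- proportion of failed samples (0.0 to 1.0)
    )
""")

# Store one run's figures (placeholder values for illustration).
conn.execute(
    "INSERT INTO run_summary VALUES (?, ?, ?, ?)",
    ("2024-01-15", "Login", 420.0, 0.02),
)
conn.commit()

# Trend of average response time for a given label, oldest run first.
rows = conn.execute(
    "SELECT run_date, avg_ms FROM run_summary WHERE label = ? ORDER BY run_date",
    ("Login",),
).fetchall()
for run_date, avg_ms in rows:
    print(f"{run_date}: {avg_ms:.0f} ms")
```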

OctoPerf v15.1 - Continuous integration and new trend report

Version 15 of OctoPerf brings a huge focus on continuous integration and DevOps practices. We've taken steps to highlight existing integrations, added many more, and thrown an entirely new report type on top of it.

Continuous integration improvements

It's always been possible to automate OctoPerf tests using our existing plugins or simply through our API. That said, the setup of those integrations comes with a learning curve. To make this curve as flat as possible, we've added a new wizard and the possibility to get a PDF report automatically at the end of the test.
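To give a rough idea of what API-driven automation can look like from a CI pipeline, here is a hedged sketch that launches a test and polls until it finishes. The base URL, endpoint paths, SCENARIO_ID placeholder and response fields are illustrative assumptions, not OctoPerf's documented API; refer to the API documentation for the actual calls.

```python
import time
import requests

# All URLs, paths and JSON fields below are illustrative assumptions,
# not OctoPerf's documented API; check the API reference for real endpoints.
API = "https://api.octoperf.com"            # assumed base URL
HEADERS = {"Authorization": "Bearer <YOUR_API_TOKEN>"}
SCENARIO_ID = "<scenario-id>"               # placeholder

# Launch a test run for the scenario (hypothetical endpoint).
run = requests.post(f"{API}/runtime/scenarios/run/{SCENARIO_ID}", headers=HEADERS).json()
run_id = run["id"]

# Poll until the run reaches a terminal state (hypothetical endpoint and field).
while True:
    state = requests.get(f"{API}/runtime/benches/{run_id}", headers=HEADERS).json()["state"]
    if state in ("FINISHED", "ABORTED", "ERROR"):
        break
    time.sleep(30)

print(f"Run {run_id} ended with state {state}")
```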