Load Testing Blog

SNCF - Case study

Most of you will already have recognized the name SNCF; it is hard to miss when you live in France. But for everyone else, allow me a quick reminder of what SNCF stands for.

The Société Nationale des Chemins de fer Français (Chemin de fer, literally, 'path of iron', means railway) is France's national state-owned railway company. It operates 32,000 km (20,000 mi) of route and in 2017 had €33.5 billion of sales in 120 countries. The SNCF Group employs more than 260,000 people.

source: Wikipedia

Lately, SNCF's IT strategy could be summarized as follows:

  • Maintain state-of-the-art, multi-cloud application execution capabilities,
  • Work as a business partner with hand-picked software vendors, helping them grow and learning from a real-life use case.

This means re-thinking the strategy in many areas, including performance testing.

Julien Leclère is the Head of the Software Factory at SNCF, which covers a range of 1,500 applications.
The factory provides assets to manage the entire application lifecycle. Julien was looking for
a solution that could fit into the factory while still meeting a wide variety of requirements.

To help with this task, Julien was assisted by Joaquin De Ibar Aguado, who took on the role of project manager for the migration to OctoPerf. Joaquin would help integrate OctoPerf into the factory as well as migrate a few key projects as a proof of concept.

Asynchronous API Performance Testing with JMeter

The principles behind performance testing APIs do not differ from the principles behind performance testing any other application.

Many APIs, however, are asynchronous: a valid response from the API does not necessarily mean the transaction is complete, which can be a problem when measuring API performance.

There are, however, ways around this, and we will explore them in this post.
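The core idea is to measure the full transaction, not just the initial HTTP response: submit the request, then poll a status endpoint until the work actually completes, and time the whole span. In JMeter this is typically built with a While Controller around a status sampler; the pattern itself can be sketched in Python. Everything below is a hypothetical stand-in, not a real API:

```python
import time

class FakeAsyncApi:
    """Hypothetical asynchronous API: submit() returns immediately with a
    job id; the job only finishes some time later."""

    def __init__(self, processing_seconds):
        self._jobs = {}
        self._processing = processing_seconds

    def submit(self):
        job_id = len(self._jobs) + 1
        # The "200 OK" here does NOT mean the work is done.
        self._jobs[job_id] = time.monotonic() + self._processing
        return job_id

    def status(self, job_id):
        return "DONE" if time.monotonic() >= self._jobs[job_id] else "PENDING"

def measure_end_to_end(api, poll_interval=0.05, timeout=5.0):
    """Time the full transaction: submit, then poll until completion."""
    start = time.monotonic()
    job_id = api.submit()
    while time.monotonic() - start < timeout:
        if api.status(job_id) == "DONE":
            return time.monotonic() - start
        time.sleep(poll_interval)
    raise TimeoutError("job never completed")

api = FakeAsyncApi(processing_seconds=0.2)
elapsed = measure_end_to_end(api)
print(f"end-to-end: {elapsed:.2f}s")
```

The measured time includes the asynchronous processing, which is what the submit response alone would hide. Note that the poll interval itself adds up to one interval of measurement error, so pick it small relative to the response times you care about.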

Sample JMX and DB files for this blog post are available for download:

LINKBYNET and OctoPerf join forces to test Majikan's En'jo mobile app.

As a service provider, LINKBYNET understands the stakes of application performance. It is in this context that MAJIKAN called on LINKBYNET's performance teams. Beyond the app's features, the client's main goal was to win users over by optimizing the quality of the user experience of its En'jo application.

En'jo is a new app that connects individuals with tradespeople for urgent repair needs. Like any app ahead of its launch, En'jo faced numerous performance-related challenges. An app that suffers from slowness or performance issues from the moment it launches will struggle to win users over. Moreover, the return on investment can be considerably delayed, and word of mouth from mobile users pointing out the app's flaws can damage the brand image of both the product and the company. Mobile users expect companies to be proactive, so MAJIKAN's challenge was to launch its En'jo app and have it fully operational immediately.


Documentation and Agile Performance Testing

Once upon a time, documentation was one of the most important aspects of Quality Assurance, and this was not limited to functional test efforts but extended to non-functional testing as well.

We spent days, weeks, even months creating Performance Test Strategies, Approaches, Plans, Test Cases, Completion Reports and so on.

Most of these documents were required before any automation could be written and before a sensible performance testing framework could be considered.

Documentation was expected before performance testing began, during the test cycles, and after the tests completed: a constant cycle of creation, review, update, review, and sign-off.

With a bit of Performance Testing in the middle.

Surely many of us remember the difficulty in getting some of these significant and lengthy documents approved by many, many stakeholders.

Before I go on to tell you why documentation is overrated, a caveat: for some organisations it is a necessity, because they are following the wishes of their customers and clients; others still follow a strict waterfall approach to software development, and that way of working delivers for them.

That is all fine. This post is about documentation for performance testing in Agile delivery, and that is an important distinction.

Hidden Benefits of Performance Testing

If you are a performance testing specialist, a QA Manager, a Programme Manager, or anyone else involved in producing quality software, then you understand why performance testing is required and how it helps ensure your products meet your quality criteria for release into production.

The cost of performance testing is easily worth the investment: software that performs well not only earns you and your company a reputation for delivering performant software, but business users will, in my opinion, overlook and embrace small functional workarounds if the software performs well.

It is also worth investing in robust, reusable performance tests for performance regression testing; that alone justifies the cost of keeping them current and executable against the latest version of the code, a maintenance activity that must continue as your product changes and evolves.

Some organisations use their performance tests purely for performance testing, but these tests have other uses, and by putting them to those additional uses you can save the time and money spent building and maintaining alternative tests.

In this post we will look at some of those additional uses, which further justify building a comprehensive performance testing suite.