INDUSTRY

Law

PROJECT

Performance testing for a law firm solution yields performance improvements (using LoadRunner and JMeter)

SYNOPSIS

Fimatix - Performance Testing Case Study

THE BACKGROUND

Fimatix were approached by a law firm client to carry out performance testing of a large-scale HR & Finance transformation project supporting over 7,000 users, working with a mixture of on-premises, cloud-based and vendor-managed applications and application components.

The customer already had a small amount of in-house performance testing capability, but no assets, infrastructure or processes that could be reused.

THE SCOPE

Performance risks were assessed, resulting in the build and execution of performance tests against the two most critical applications and multiple middleware components.

THE APPROACH

Fimatix had to define an approach for performance testing, including the infrastructure, processes and testing tools to be used. The discovery phase assessed 12 different application areas, looking at volumetrics, performance risks, architecture and the degree of change.

We used a combination of Apache JMeter and OpenText LoadRunner to write the scripts, covering both traditional HTTP(S) web applications and Citrix-based applications, launched in tandem from an OpenText LoadRunner Controller instance. Calls recorded in Apache JMeter had to be wrapped in transaction controllers so that the transactions within the JMeter scripts were reported in the OpenText LoadRunner Analysis tool.
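Conceptually, a transaction controller is a named timer that brackets a group of related calls so their combined response time is reported as one business transaction. The sketch below illustrates that grouping idea in Python; the transaction name and URL are placeholders, and this is the concept only, not the JMeter or LoadRunner mechanism itself:

```python
import time
import urllib.request
from contextlib import contextmanager

@contextmanager
def transaction(name, results):
    """Bracket a group of calls so they report as one named transaction."""
    start = time.perf_counter()
    try:
        yield
    finally:
        results.setdefault(name, []).append(time.perf_counter() - start)

results = {}
with transaction("enter_client_details", results):  # hypothetical transaction name
    # Every call inside the block counts toward the transaction's response time.
    urllib.request.urlopen("https://example.com/").read()

for name, timings in results.items():
    print(f"{name}: mean {sum(timings) / len(timings):.3f}s over {len(timings)} run(s)")
```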

Performance testing assets were set up from scratch, including processes, scripts, data and monitoring. Monitoring was configured in Azure and within physical and virtual servers to measure performance metrics such as CPU, memory and I/O.
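For illustration, the sketch below samples the same classes of metric (CPU, memory, disk I/O) on a single host. It assumes the third-party psutil package and stands in for the Azure and server-level monitoring actually used:

```python
import psutil  # third-party: pip install psutil

def sample_metrics(interval=5, samples=3):
    """Print CPU, memory and disk I/O deltas at a fixed interval."""
    last_io = psutil.disk_io_counters()
    for _ in range(samples):
        cpu = psutil.cpu_percent(interval=interval)  # blocks for `interval` seconds
        mem = psutil.virtual_memory().percent
        io = psutil.disk_io_counters()
        read_mb = (io.read_bytes - last_io.read_bytes) / 1e6
        write_mb = (io.write_bytes - last_io.write_bytes) / 1e6
        last_io = io
        print(f"cpu={cpu:.1f}% mem={mem:.1f}% read={read_mb:.1f}MB write={write_mb:.1f}MB")

if __name__ == "__main__":
    sample_metrics()
```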

Numerous SQL queries were analysed, and tuning changes were made during the testing phase alongside application changes to address performance issues.
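A typical tuning change of this kind is adding an index so a hot query stops scanning the whole table. The sketch below demonstrates the pattern with SQLite standing in for the production database; the table and column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE matters (id INTEGER PRIMARY KEY, client_id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO matters (client_id, status) VALUES (?, ?)",
    [(i % 500, "open") for i in range(10_000)],
)

query = "SELECT COUNT(*) FROM matters WHERE client_id = ?"

# Before tuning: SQLite reports a full table scan for the lookup.
for row in conn.execute("EXPLAIN QUERY PLAN " + query, (42,)):
    print("before:", row)

# Tuning change: index the column the hot query filters on.
conn.execute("CREATE INDEX idx_matters_client ON matters (client_id)")

# After tuning: the plan uses the index instead of scanning every row.
for row in conn.execute("EXPLAIN QUERY PLAN " + query, (42,)):
    print("after:", row)
```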

THE CHALLENGES

i). LOCKED-DOWN ENVIRONMENT - A secure, locked-down environment meant that setting up for performance testing required strong management.

ii). IMMATURE APPLICATIONS - One of the biggest challenges was testing relatively immature applications that were changing on a regular basis. Easily maintainable test scripts had to be created so that performance test scripts could be updated quickly in line with incoming development changes, keeping the performance testing on track.

iii). MULTIPLE TOOLS - The best fit for this exercise was Apache JMeter for web-based traffic and OpenText LoadRunner for Citrix-based traffic, with an OpenText LoadRunner Controller running the tests and generating the traffic from the remote machines.

iv). TESTING RESPONSES FROM MULTIPLE GEOGRAPHICAL LOCATIONS - To analyse response times from several geographical areas, remotely located workstations had to be set up to generate the load, along with Azure virtual machines for the Citrix-based performance testing (see the sketch after this list).
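To make results from the different locations comparable, each load generator's measurements can be tagged with where they came from and summarised side by side. A minimal sketch of that per-location reporting, with invented location names and timings:

```python
from statistics import median, quantiles

# Response times in seconds, tagged by the load generator's location.
# Location names and figures are invented for illustration.
measurements = {
    "london": [0.8, 0.9, 1.1, 0.7, 1.4, 0.9, 1.0, 1.2],
    "manchester": [1.1, 1.3, 1.0, 1.6, 1.2, 1.4, 1.1, 1.5],
    "azure-citrix": [2.1, 1.9, 2.4, 2.0, 2.6, 2.2, 1.8, 2.3],
}

print(f"{'location':<14}{'median':>8}{'p90':>8}")
for location, times in measurements.items():
    p90 = quantiles(times, n=10)[-1]  # 90th percentile
    print(f"{location:<14}{median(times):>8.2f}{p90:>8.2f}")
```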

THE RESULTS

•  Response times reduced.

•  Data integration from one component to another measured and reported.

•  Tuning opportunities identified and implemented.

From the start of testing to the end, application performance was significantly improved in a number of areas, with some calls seeing a 95% reduction in response times and the number of performance-related errors significantly reduced.

The main observation from performance testing was that the end-user experience when entering client and matter details into the system was very slow under load. This was due to the poor performance of background calls that update input forms dynamically depending on user input (for example, when a user selected a main menu item, a call was invoked to the server to supply the appropriate sub-menu items). The upshot was that, under load, a user could be part-way through populating the input form when the background call finally completed and refreshed the screen, erasing details the user had only just entered.
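As the next paragraph notes, the fix here was to make the calls respond faster; the failure mode itself, though, is a classic stale-response race, where a slow reply lands after newer state and overwrites it. The sketch below is a generic illustration of that race and of a guard against it, not the client's actual code:

```python
import asyncio

async def fetch_submenu(menu_item, delay):
    """Stand-in for the background call; `delay` models server load."""
    await asyncio.sleep(delay)
    return f"sub-menu items for {menu_item}"

async def main():
    form = {"submenu": None}
    latest_request = 0

    async def on_menu_select(menu_item, delay):
        nonlocal latest_request
        latest_request += 1
        token = latest_request          # remember which request this reply belongs to
        result = await fetch_submenu(menu_item, delay)
        # Guard: drop the reply if a newer request has superseded it, so a slow
        # background call cannot refresh the form over newer state.
        if token == latest_request:
            form["submenu"] = result

    # A slow first call and a quick second one complete out of order.
    await asyncio.gather(on_menu_select("clients", delay=0.5),
                         on_menu_select("matters", delay=0.1))
    print(form["submenu"])  # -> sub-menu items for matters (stale reply dropped)

asyncio.run(main())
```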

Over the course of testing, such background calls were fixed to respond far quicker, significantly improving the user experience. For example, a call that previously took 51.3 seconds for 10 concurrent users now takes 2.8 seconds for the same load, a 94.54% reduction in response time.

THE OUTCOME

The customer was able to go live on schedule with their solution, with the confidence of a significantly improved user experience in the form of a more robust, performant application. All test assets were fully handed over, along with instructions and guidance on how to run and maintain them as part of the customer's own internal testing process.