Performance Problems during Functional Testing

Functional Testing can be delayed because of Performance Problems

Performance problems are not confined to production environments. Increasingly, Testing Performance have observed them occurring in test environments, and not just the performance test environment.

This can be hugely frustrating for program managers and testers alike. Delays to all forms of testing caused by poor response times or stability issues are a big headache for companies trying to roll out an application on time and within budget.

Often performance or stability problems in test environments go undiagnosed. Vague references are made to the database being slow, or to network issues.

Developers are busy resolving functional issues, whilst system administrators are preparing the production environment. Testers and program managers often find themselves in a quandary: diverting resources from bug fixing, or from production platform preparations, can initially add to the delays caused by performance problems.

Quick-fix solutions, such as moving a test environment onto a bigger platform, seldom resolve the underlying issues.

Performance testing takes place alongside other types of testing, either on the same platform or, more usually, on a separate one. Performance testing can be a long-winded process: preparing scripts, user accounts and other input data can take a substantial amount of time. The aim of performance testing is firmly targeted at delivering performance improvements in time for rollout to production.

Resolution of these issues cannot wait for the formal performance testing process to complete.  Action needs to be taken immediately to keep the project running to schedule.

Testing Performance recommend the following approach for dealing with performance problems in the test environment. Of course, no two applications are the same, and the approach needs to be flexible enough to deal with the different circumstances of each application.

Gather together a small informal group consisting of representatives from each of the main areas involved with the application. This would typically include:

•    DBA
•    Middleware administrator
•    Possibly a Web or Citrix Administrator
•    Performance Tester
•    Application Designer
•    Functional Tester
•    Business User
•    Any other appropriate person

It is not necessary for these people to be fully dedicated to looking at performance issues.  They would however be given responsibility and ownership of performance issues in the test environment for their own area.
Responsibilities would include:
•    Liaising with other members in their team regarding performance issues
•    Investigating and defining performance issues in their area
•    Dispelling myth and rumour, and replacing them with facts that clearly define the performance issues
•    Most importantly, talking to one another about the application, its characteristics and behaviour

Performance problems in test environments are seldom limited to a single cause. There are usually many, with some problems causing knock-on effects. An example is queuing on a middleware component such as WebLogic or WebSphere. The queuing may be caused, at least in part, by slow response times from the database rather than by a shortage of threads. Adding more threads to resolve the problem could actually make performance worse, not better. The middleware administrator and the database administrator need to talk to each other in order to work out what is going on.
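To see why adding threads can backfire, consider a deliberately simplified back-of-the-envelope model. This is not a real load test; the capacity figure, service time and linear contention formula are illustrative assumptions only:

```python
# Illustrative model: a middleware thread pool calling a database that can
# service only `db_capacity` requests truly in parallel. Beyond that point,
# contention is assumed to stretch every call linearly -- a simplification,
# but enough to show the shape of the problem.

def simulate(threads, db_capacity=4, base_ms=50):
    """Return (per-request latency in ms, throughput in requests/sec)."""
    latency_ms = base_ms * max(1, threads / db_capacity)
    throughput = threads * 1000 / latency_ms
    return latency_ms, throughput

# Once the database is saturated (4 concurrent calls here), throughput
# stays flat while per-request latency climbs with every extra thread.
for n in (4, 8, 16):
    latency, tps = simulate(n)
    print(f"{n:2d} threads: {latency:6.0f} ms/request, {tps:5.0f} req/s")
```

In this model, doubling the thread pool beyond the database's capacity doubles response times without adding a single request per second of throughput, which is exactly the trap described above.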

Some typical performance problems experienced in test environments are as follows:

  • The production configuration for a component has been applied to the same component in the test environment.  The production platform is normally significantly larger than the test platform, in terms of both CPU and memory.  The production configuration may not fit within the memory footprint of the test platform, in effect over-committing system resources.
  • The test environment has not been configured at all.  Every parameter is left at its default value, under-utilising system resources.
  • Increasingly, functional testing takes place against a database loaded with a large volume of data.  A new or modified application may contain inefficient SQL; this is to be expected, as developers are often unable to fully tune SQL on a development platform.  Traditionally this was not a problem in application testing, because the trend had been to use only a small subset of test data.
  • Inefficient configuration of connections between layers of architectural components.
  • Incorrectly applied load balancing across components, resulting in some servers being over-utilised and others under-utilised.
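The inefficient-SQL point is easy to demonstrate. The sketch below uses SQLite purely for illustration, with invented table and column names; the same check, inspecting the query plan for full-table scans before a production-sized dataset arrives, applies to any database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")

query = "SELECT total FROM orders WHERE customer = ?"

# Without an index the planner falls back to a full table scan: harmless
# against a developer's few hundred rows, painful against millions.
plan = conn.execute("EXPLAIN QUERY PLAN " + query, ("acme",)).fetchall()
print(plan[0][3])   # e.g. "SCAN orders" (exact wording varies by SQLite version)

# Adding the missing index turns the scan into an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan = conn.execute("EXPLAIN QUERY PLAN " + query, ("acme",)).fetchall()
print(plan[0][3])   # e.g. "SEARCH orders USING INDEX idx_orders_customer (customer=?)"
```

Running this kind of plan check as part of functional testing, once the test database has been loaded with realistic volumes, surfaces SQL problems long before the formal performance test.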

Change control of performance changes is very important.  Rigorous control and management of functional defects is standard during the testing phase of application development; the same cannot be said of performance changes.  How does one migrate a change to the configuration file of a system component?  Good communication and management of performance changes is required to ensure they eventually make their way, correctly, to the production platform.
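One low-ceremony way to get that control is to keep component configuration files under version control, so that every tuning change made in the test environment is recorded with its reason and can be promoted to production deliberately. A sketch using Git, in which the file name, property name and ticket IDs are invented for illustration:

```shell
# Minimal sketch: track a middleware config file in Git so tuning
# changes carry an audit trail. All names below are illustrative.
dir=$(mktemp -d)
cd "$dir"
git init -q .
git config user.email "tester@example.com"
git config user.name "Perf Tester"

echo "thread_pool_size=25" > appserver.properties
git add appserver.properties
git commit -q -m "PERF-042: baseline thread pool size in test"

echo "thread_pool_size=40" > appserver.properties
git commit -q -am "PERF-042: raise pool after DB index fix"

# The history now answers "what changed, when, and why" for this file.
git log --oneline -- appserver.properties
```

With this in place, promoting performance changes to production becomes a matter of reviewing a short, documented history rather than reconstructing undocumented edits from memory.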