Performance Testing: Key Considerations
The Pega® Platform differs from a standard Java application in several ways, but the
approach to performance testing and tuning Pega is the same. To ensure that each performance
test is successful, take the following key considerations into account.
Key considerations
• Define the Application Usage Model (AUM) and evaluate it with the business to ensure it is realistic.
• Create the scenarios from the key functional areas as detailed in the AUM.
• Evaluate and validate test data requirements.
• Create test user IDs with correct roles and privileges.
• Estimate the amount of work that should come in and do not process more than what
was expected. A common mistake is to process a whole day’s volume in one hour. (See
the pacing sketch after this list.)
• Tune and optimize the chosen scenarios.
• Use Pega tools PAL / DB Trace to identify issues.
• Fix issues, then validate the fixes (e.g., a new database index or a tuned query).
• Baseline scenarios on a single JVM.
• Validate performance metrics for low user concurrency (10-50).
• Increase concurrent usage to evaluate optimal user concurrency per JVM.
• Diversify test IDs. Using a single test ID to test concurrent scenarios can lead to errors.
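As a hypothetical illustration of the volume arithmetic above, the sketch below derives a
target arrival rate and per-user pacing from AUM figures. All numbers (daily volume,
business hours, virtual users) are placeholder assumptions; substitute values from your
own AUM.

    // Hypothetical sketch: derive test pacing from assumed AUM volumes.
    public class AumPacing {
        public static void main(String[] args) {
            double casesPerDay = 40_000;   // assumed daily volume from the AUM
            double businessHours = 8;      // hours over which the work actually arrives
            double virtualUsers = 100;     // concurrent virtual users in the test

            double casesPerHour = casesPerDay / businessHours;    // 5,000 cases/hr
            double casesPerSecond = casesPerHour / 3600;          // ~1.39 cases/s
            double pacingSeconds = virtualUsers / casesPerSecond; // ~72 s between cases per user

            System.out.printf("Target rate: %.2f cases/s (%.0f cases/hr)%n",
                    casesPerSecond, casesPerHour);
            System.out.printf("Pacing: each virtual user submits one case every %.0f s%n",
                    pacingSeconds);
        }
    }

In this example, pushing the full daily volume through a one-hour test window would
overstate the real arrival rate eightfold, which is exactly the mistake the consideration
above warns against.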
Test approach
The approach used for performance testing a Pega application is outlined in the sections that follow.
Test objective
The aim of the test is to measure the ability of the environment to process the selected business
functions within the timeframes or rates of concurrency specified in the service level
agreements (SLAs). The tests described below should be carried out during this test phase where
applicable; the business or other requirements will usually determine which tests apply.
Load test
A test type concerned with measuring the behavior of a component or system under
increasing load (e.g., the number of parallel users or transactions) to determine what
load the component or system can handle. This test can also be used to “break” the
component or system.
Stress test
Testing conducted to evaluate a system or component at or beyond the limits of its
specified requirements. This test can also be used to determine how the system or
component handles variations in peak load. Testers may use this test to try to
“break” the component or system.
Soak test
Ensures that performance bottlenecks do not occur when the application runs for a
long period of time under typical (or peak) load (e.g., it looks for memory leaks).
This testing normally mimics one or more typical production days and must include all
background processing in the application.
Scale test
Testing conducted to evaluate how an application will perform at scale in an
environment. The aim is to prove linear scalability within the confines of the testing
environment. Extrapolation of results may be required if the testing environment is
not identical to production.
An additional aspect of scale testing is testing regional usage of the application to
ensure that an acceptable level of performance is achieved for users in different
geographical locations or on terminal emulation software.
Approach overview
1. Prepare and audit the test environment (system test, clone, pre-production).
2. Complete, review, and agree upon the engagement document with all stakeholders.
3. Build simulators, test harnesses, and any stubs required.
4. Define AUM.
5. Set up monitoring and diagnostic tools.
6. Verify stability of performance environment.
7. Build benchmark scenarios and undertake performance optimization.
8. Execute acceptance testing.
9. Analyze and compile results.
10. Tune application and environment, re-baseline, and retest.
11. Execute assessment testing.
Test results
The following artifacts are required to evaluate the outcome of each iteration of the
performance test.
• Pega application and alert log files
• Pega log usage data
• Database statistics data
o Oracle diagnostic reports, such as:
▪ AWR
▪ ADDM
▪ ASH
• Operating system data
o CPU utilization
o Memory utilization
• JVM memory
o Verbose garbage collection data (see the sample JVM flags after this list)
• Network utilization data
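As a hypothetical illustration, verbose GC logging is typically enabled with JVM
arguments such as the following; the exact flags depend on the JVM vendor and version,
so treat these as assumptions to verify against your own runtime documentation.

    -verbose:gc -Xloggc:gc.log -XX:+PrintGCDetails -XX:+PrintGCDateStamps   (HotSpot, Java 8)
    -Xlog:gc*:file=gc.log                                                   (HotSpot, Java 9+)
    -verbose:gc -Xverbosegclog:gc.log                                       (IBM J9, common under WebSphere)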
It is a best practice to ensure that each testing iteration starts from the same set of data and that
the log files listed above are deleted before each iteration. When using Oracle, Flashback
functionality should be considered as an option to return the database to the same point in
time before each test.
Before each testing phase is executed, Pega and WebSphere need to be restarted to ensure that
the log files cover only the elapsed time of that test.
JMeter
• Apache JMeter is a 100 percent pure Java desktop application designed to load test
software (such as a web application).
• It may be used to test the performance of both static and dynamic resources, such as static
files, Java Servlets, CGI scripts, Java objects, databases, and FTP servers.
• JMeter can be used to simulate a heavy load on a server, network, or object to test its
strength or to analyze overall performance under different load types (see the sketch
after the list below).
Types of test plans supported with JMeter
• Web test plan/advanced web test plan
• JDBC
• FTP
• JMS Point-to-Point/JMS Topic
• LDAP/LDAP Extended
• WebServices (SOAP)
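As a minimal sketch of what a JMeter load test looks like when driven from its Java API:
the class and method names below come from JMeter’s public API, while the JMeter home
path, target host, path, thread count, and ramp-up are placeholder assumptions to adapt
to your environment.

    // Minimal sketch: build and run a JMeter test plan programmatically.
    import org.apache.jmeter.control.LoopController;
    import org.apache.jmeter.engine.StandardJMeterEngine;
    import org.apache.jmeter.protocol.http.sampler.HTTPSamplerProxy;
    import org.apache.jmeter.testelement.TestPlan;
    import org.apache.jmeter.threads.ThreadGroup;
    import org.apache.jmeter.util.JMeterUtils;
    import org.apache.jorphan.collections.HashTree;

    public class BaselineLoadTest {
        public static void main(String[] args) {
            // Point at a local JMeter installation (placeholder path).
            JMeterUtils.setJMeterHome("/opt/apache-jmeter");
            JMeterUtils.loadJMeterProperties("/opt/apache-jmeter/bin/jmeter.properties");
            JMeterUtils.initLocale();

            // The HTTP request each virtual user repeats (placeholder host and path).
            HTTPSamplerProxy sampler = new HTTPSamplerProxy();
            sampler.setDomain("pega.example.com");
            sampler.setPort(8080);
            sampler.setPath("/prweb");
            sampler.setMethod("GET");

            // Each thread executes the sampler 10 times.
            LoopController loop = new LoopController();
            loop.setLoops(10);
            loop.setFirst(true);
            loop.initialize();

            // 25 virtual users ramped up over 30 seconds.
            ThreadGroup threads = new ThreadGroup();
            threads.setNumThreads(25);
            threads.setRampUp(30);
            threads.setSamplerController(loop);

            // Assemble the plan tree: plan -> thread group -> sampler.
            TestPlan plan = new TestPlan("Baseline scenario, single JVM");
            HashTree tree = new HashTree();
            tree.add(plan);
            HashTree threadTree = tree.add(plan, threads);
            threadTree.add(sampler);

            StandardJMeterEngine engine = new StandardJMeterEngine();
            engine.configure(tree);
            engine.run();
        }
    }

The same plan can equally be built in the JMeter GUI and saved as a .jmx file; driving it
from code is simply convenient for scripting the baseline-and-ramp procedure described in
the key considerations above.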
LoadRunner
HPE LoadRunner is a software testing tool from Hewlett Packard Enterprise. It is used to test
applications, measuring system behavior and performance under load. HPE LoadRunner can
simulate thousands of users concurrently using application software, recording and later
analyzing the performance of key components of the application.
LoadRunner simulates user activity by generating messages between application components or
by simulating interactions with the user interface, such as keypresses or mouse movements. The
messages and interactions to be generated are stored in scripts. LoadRunner can generate the
scripts by recording them, such as logging HTTP requests.
Our organization has completed the first Robotic Process Automation (RPA)
implementation using Pega. How can we make sure the performance of the system will
not be an issue in production?
An RPA or Robotic Desktop Automation (RDA) application does not need any special performance
testing. Unlike other technologies, robotics simply overlays the applications that are already in
place. In simple terms, the "robots" press the same buttons within any given application that a
user would press when performing the same transaction sequence. Load behavior therefore
depends on the speed and performance of the underlying applications: an RPA/RDA application
can move no faster than those applications allow, and the robots themselves can always keep
pace with them. So the load burden does not typically fall on the RPA/RDA implementation.
In the field, when conducting an RPA ("batch") transaction, it is always good practice to test with
higher volumes to make sure nothing crashes. But since the robots keep the load on the
underlying systems, things are often "perceived" as being better. That is mostly because the
processing can happen off-hours, or more robots can be assigned to do the same work (as long
as the underlying applications can handle multiple robots).
Our organization has completed the first implementation of outbound marketing. We can
test the campaign using seed and distribution tests, but how can we do the load testing?
In addition to multi-channel Next-Best-Action, Pega Marketing enables the creation of simple
outbound campaigns in which a message can be sent to a target audience over a single
outbound channel. Testing the outbound campaign is tricky, because the rules are not exercised
with the same kind of data an actual campaign needs.
There are some open source tools that can be used for performance testing of email outbound
marketing campaigns.
FakeSMTP is a free fake SMTP server written in Java with a GUI that makes it easy to test emails
in applications. Configure your application to use "localhost" as its SMTP server, and all emails
will be captured and displayed in this software. Visit http://nilhcem.com/FakeSMTP/ for more
information.
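As a hypothetical sketch of the client side of such a test, the snippet below sends one
message to a locally running FakeSMTP instance via the JavaMail (javax.mail) API, which
is assumed to be on the classpath. The port, addresses, and message content are
placeholder assumptions; match the port to whatever FakeSMTP is configured to listen on.

    // Hypothetical sketch: send a test email to a local FakeSMTP instance.
    import java.util.Properties;
    import javax.mail.Message;
    import javax.mail.MessagingException;
    import javax.mail.Session;
    import javax.mail.Transport;
    import javax.mail.internet.InternetAddress;
    import javax.mail.internet.MimeMessage;

    public class FakeSmtpCheck {
        public static void main(String[] args) throws MessagingException {
            Properties props = new Properties();
            props.put("mail.smtp.host", "localhost"); // FakeSMTP runs locally
            props.put("mail.smtp.port", "25");        // match the port set in the FakeSMTP UI

            Session session = Session.getInstance(props);
            MimeMessage msg = new MimeMessage(session);
            msg.setFrom(new InternetAddress("campaign@example.com"));   // placeholder sender
            msg.setRecipients(Message.RecipientType.TO,
                    InternetAddress.parse("customer@example.com"));     // placeholder recipient
            msg.setSubject("Outbound campaign smoke test");
            msg.setText("If this message appears in the FakeSMTP GUI, mail routing is correct.");
            Transport.send(msg);
        }
    }

For load testing, the same send can be wrapped in a loop or thread pool to approximate
campaign volumes without touching a real mail relay.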
DevNull SMTP is an imitation SMTP server designed for testing purposes. It helps identify
problems with an email server.