Performance Paper - OpenText Documentum XCP 20.4
Version: 20.4
Task/Topic: Performance
Audience: Administrators, Decision Makers
Platform: Windows
Document ID: 100031
Updated: October 29, 2020
Performance Paper
OpenText™ Documentum xCP 20.4
OpenText Performance Team
Contents
Audience
Disclaimer
Executive Summary
Assessment Overview
    Objectives
Testing Methodology
    Test Setup
    Test Strategy
    Test Data Composition
Test Results
    Designer Test
        Designer Operation
        Application Deployment
    Runtime Test with CAPINS
        CAPINS Low Load Test
        CAPINS Gauging Test
        CAPINS Standard Load Test
        CAPINS Endurance Test
    Runtime Test with Clinical
        Clinical Low Load Test
        Clinical Gauging Test
        Clinical Standard Load Test
        Clinical Endurance Test
Conclusion
Appendices
    Appendix A - Test Environment
    Appendix B - Application and System Tuning Guide
    Appendix C - Detailed Test Scenarios
        Designer Test Cases
        CAPINS Test Cases
        Clinical Test Cases
    Appendix D - Glossary
Audience
This document is intended for a technical audience planning an implementation of
OpenText™ products. OpenText recommends consulting OpenText Professional
Services, who can assist with the specific details of individual implementation
architectures.
Disclaimer
The tests and results described in this document apply only to the OpenText
configuration described herein. For testing or certification of other configurations,
contact OpenText Corporation for more information.
All tests described in this document were run on equipment located in the OpenText
Performance Laboratory and were performed by the OpenText Performance
Engineering Group. Note that using a configuration similar to that described in this
document, or any other certified configuration, does not guarantee the results
documented herein. There may be parameters or variables that were not contemplated
during these performance tests that could affect results in other test environments.
For any OpenText production deployment, OpenText recommends a rigorous
performance evaluation of the specific environment and applications to ensure that
there are no configuration or custom development bottlenecks present that hinder
overall performance.
All load test results in this paper are based on server-side measurements and do not
capture browser rendering time. Actual end-to-end timings, including client-side (e.g.
browser) time, may vary significantly depending on the client machine specifications,
the client network, browser variations, and other conditions of the user's environment.
Executive Summary
OpenText™ Documentum xCP is a key part of the Documentum product and solutions
portfolio. It’s a flexible development platform that brings together document and data
capture, ECM, BPM, Search and Analytics, Collaboration, Customer Communications,
and Records management for automating complex, information-intensive processes to
drive better business decisions.
This report describes the testing efforts undertaken by the OpenText Performance
Engineering Group to analyze the performance of xCP 20.4 when running on a
given hardware environment.
The performance tests focused on xCP Designer performance and runtime
performance with reference applications. The tests established a performance
baseline for the CAPINS application, which exercises common xCP features, and for
the Clinical application, which exercises the new Case Management features
introduced in xCP 20.4.
Major findings from the assessment are:
1. xCP 20.4 Designer operation and application deployment response times varied
with application size. With a medium-sized refapp such as CAPINS, normal
Designer operation response times were all under 3 seconds; the first deployment
took under 10 minutes, and a subsequent deployment took around 3 minutes.
2. With the given hardware, the gauging point for the CAPINS refapp on xCP 20.4
was 250 concurrent users within an acceptable SLA (response time < 3 seconds);
the major limiting factor was Documentum Server CPU usage.
3. For the Clinical application, which combines the new Case Management features
introduced in xCP 20.4 with Process Management features, the gauging point on
the given hardware was also 250 users, and response times for all transactions
were under 2 seconds.
4. No memory or session leaks were observed during the 48-hour endurance tests
for either the CAPINS refapp or the Clinical (Enhanced Process Management)
application.
Assessment Overview
Objectives
This assessment strives to test the stability, resource consumption and responsiveness
of OpenText Documentum xCP 20.4 on a typical hardware architecture deployment.
Testing Methodology
Test Setup
The performance test was conducted in a distributed system using VMware virtual
machines. Refer to Appendix A for the details of the configuration. The diagram below
illustrates the infrastructure of the testing environment.
Figure 1 - Test Environment Architecture
Test Strategy
All load tests used LoadRunner to simulate the user transactions and performance
scenarios. The test strategy for each type of test is described below. For details of the
test cases and scenario settings, refer to Appendix C.
• Gauging Tests
This test was designed to increase the number of virtual users (workload) until
hardware resources such as CPU, RAM, network bandwidth, or I/O were at or near
saturation while the response time for individual transactions was still satisfactory.
The gauging point is the maximum load that the system can support without
compromising the end-user performance experience. Response time and various
server-side metrics were collected during the test.
• Endurance Test
This test was executed to confirm that the system could run a specific user load
over a long period (48 hours) without any performance degradation or resource
leakage.
• Designer Test
This test was executed manually to verify performance of various operations and
application deployment with xCP 20.4 designer using CAPINS refapp.
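The gauging-test procedure above can be sketched as a simple search over ramp steps; a minimal Python illustration with made-up load/response-time pairs (the 3-second SLA threshold is taken from this paper, everything else is hypothetical):

```python
# Sketch of the gauging-point search described above: step up the virtual-user
# count and keep the highest load whose 90th-percentile response time still
# meets the SLA. The load/response-time pairs below are illustrative values,
# not measurements from this paper.
SLA_SECONDS = 3.0

def gauging_point(step_results):
    """step_results: list of (virtual_users, p90_response_time) in ramp order."""
    best = 0
    for users, p90 in step_results:
        if p90 < SLA_SECONDS:
            best = users
        else:
            break  # the first step that violates the SLA ends the search
    return best

ramp = [(50, 0.8), (100, 1.1), (150, 1.6), (200, 2.2), (250, 2.8), (300, 3.9)]
print(gauging_point(ramp))  # -> 250
```

In a real run the per-step percentiles would come from the load tool's measurements, and server-side metrics would be checked alongside the SLA.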
Test Results
Designer Test
Designer Operation
Response Time
The table below shows the transaction response times for various Designer
operations. With the auto-build function turned off, the Designer does not rebuild the
project for each create/update operation, so response times were much better for
operations such as "creation" and "update".
Application Deployment
Response Time
For reference application deployment, the time depends on the application size and
platform. The table below shows the detailed response times for CAPINS application
deployment in xCP 20.4 on the given hardware; the longest part is the Documentum
repository deployment, whose duration depends on the complexity of the application.
Table 3 - Application Deployment Response Time

xCP 20.4 CAPINS deploy (Windows 2019 / Oracle 19 / Java 11) | Response Time (s)
Designer validating, generating source, compiling, packaging | 53
Invoking xDA | 46
Deploy validation | 17
Invoking xDA | 38
Deploy validation | 10
Pre deploy | 4
Redeploy:
Repository deployment | 8
xPlore deployment | 3
BAM deployment | 12
app war deployment | 66
TOTAL TIME (seconds) | 182
Throughput
Client-Side Metrics
The table below shows the key transactions' response times for CAPINS during the
test. All transactions' average response times were under 3 seconds except the
transaction "Show Cars Without Range Last Page", which was expected to be slower
and is shown for comparison with the transaction "Show Cars With Range Last Page".
In xCP, it is recommended to always use a return range in advanced repository
queries for better performance.
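The Minimum/Average/Maximum/Std. Dev/90 Percent columns in the tables that follow are standard aggregates over per-transaction timings. A minimal sketch of how such aggregates can be computed from raw samples (the sample values are made up, and LoadRunner's own percentile method may differ from this nearest-rank version):

```python
import statistics

def summarize(samples):
    """Aggregate raw response-time samples into the columns used in the
    client-side metrics tables: minimum, average, maximum, population
    standard deviation, nearest-rank 90th percentile, and count."""
    ordered = sorted(samples)
    # nearest-rank 90th percentile: value at rank ceil(0.9 * n) in sorted order
    rank = max(1, -(-len(ordered) * 9 // 10))
    return {
        "Minimum": ordered[0],
        "Average": round(statistics.fmean(ordered), 2),
        "Maximum": ordered[-1],
        "Std. Dev": round(statistics.pstdev(ordered), 2),
        "90 Percent": ordered[rank - 1],
        "Count": len(ordered),
    }

# Hypothetical timings (seconds) for one transaction
stats = summarize([0.10, 0.12, 0.15, 0.11, 0.50, 0.13, 0.12, 0.14, 0.16, 0.12])
print(stats)
```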
Table 5 - CAPINS Low Load Test Client-Side Metrics

Feature to Cover | Test Transaction | Minimum | Average | Maximum | Std. Dev | 90 Percent | Count
Login | Get Signing page | 0.02 | 0.02 | 0.06 | 0.00 | 0.02 | 895
 | Login | 0.08 | 0.12 | 0.30 | 0.03 | 0.16 | 892
 | Create Customer - Page Load | 0.07 | 0.15 | 2.16 | 0.15 | 0.19 | 892
 | Create Insurance Customer | 0.96 | 1.38 | 3.23 | 0.27 | 1.66 | 890
 | View Customer Details | 0.13 | 0.50 | 1.40 | 0.16 | 0.64 | 1759
 | Submit Car Insurance Request | 0.31 | 0.56 | 1.22 | 0.15 | 0.72 | 1758
Admin | Create City | 0.08 | 0.13 | 0.25 | 0.04 | 0.18 | 120
 | Attach Telematic (Type Fragment) | 0.13 | 0.24 | 1.61 | 0.13 | 0.31 | 880
Business Object | Edit Partner Set Gold (Type Fragment) | 0.08 | 0.14 | 0.29 | 0.07 | 0.27 | 50
 | List Alerts (Alert Query) | 0.06 | 0.10 | 0.38 | 0.06 | 0.13 | 110
 | Create a Cabinet | 0.09 | 0.16 | 0.28 | 0.06 | 0.25 | 118
 | Click on Manager Cars (Editable Grid) | 0.07 | 0.11 | 0.19 | 0.03 | 0.16 | 118
 | Update Address Relationship (Context-menus) | 0.09 | 0.16 | 1.35 | 0.12 | 0.25 | 890
 | Add Comment on Customer (Comments) | 0.20 | 0.38 | 0.92 | 0.13 | 0.56 | 886
 | Search for all Customers | 0.61 | 0.75 | 2.71 | 0.13 | 0.83 | 1755
 | Search for Particular Customer | 0.29 | 0.40 | 2.66 | 0.19 | 0.47 | 1756
Full Text Search | Facet filter on Car Brand | 0.36 | 0.47 | 2.11 | 0.17 | 0.56 | 1760
 | Reference Document Search | 0.44 | 0.62 | 3.06 | 0.35 | 0.70 | 380
 | Facet filter on Last Month | 0.30 | 0.37 | 0.87 | 0.08 | 0.46 | 378
 | Click Agent Inbox | 0.18 | 0.29 | 1.98 | 0.15 | 0.36 | 1758
 | Agent Acquire Task | 0.12 | 0.22 | 0.66 | 0.08 | 0.29 | 1756
 | Agent View Task | 0.53 | 1.11 | 4.08 | 0.28 | 1.28 | 1748
 | Complete Task - Finish Task | 0.17 | 0.32 | 0.83 | 0.11 | 0.44 | 1410
BPM | Get Next Task | 0.39 | 0.61 | 1.14 | 0.16 | 0.81 | 104
 | Click on Accordion inbox (Accordion) | 0.47 | 0.54 | 0.65 | 0.04 | 0.60 | 108
Figure 2 - CAPINS Gauging Test Result
Throughput
Client-Side Metrics
The table below shows the detailed CAPINS key transactions' response times during
the test.
Table 7 - CAPINS Standard Load Test Client Metrics

Feature to Cover | Test Transaction | Minimum | Average | Maximum | Std. Dev | 90 Percent | Count
Login | Get Signing page | 0.02 | 0.02 | 0.09 | 0.01 | 0.02 | 1204
 | Login | 0.09 | 0.21 | 3.66 | 0.15 | 0.28 | 1204
 | Create Customer - Page Load | 0.04 | 0.16 | 1.15 | 0.08 | 0.22 | 1200
 | Create Insurance Customer | 1.12 | 1.73 | 4.74 | 0.39 | 2.18 | 1193
 | View Customer Details | 0.17 | 0.74 | 5.00 | 0.27 | 0.93 | 2424
 | Submit Car Insurance Request | 0.40 | 0.71 | 3.00 | 0.20 | 0.91 | 2416
Admin | Create City | 0.09 | 0.20 | 0.75 | 0.07 | 0.28 | 159
 | Attach Telematic (Type Fragment) | 0.19 | 0.38 | 3.82 | 0.20 | 0.49 | 1195
 | Edit Partner Set Gold (Type Fragment) | 0.10 | 0.19 | 0.76 | 0.10 | 0.30 | 72
Business Object | List Alerts (Alert Query) | 0.07 | 0.18 | 3.59 | 0.30 | 0.26 | 147
 | Create a Cabinet | 0.12 | 0.25 | 1.87 | 0.20 | 0.32 | 169
 | Click on Manager Cars (Editable Grid) | 0.09 | 0.21 | 1.34 | 0.16 | 0.28 | 168
 | Click to Update Car (Editable Grid) | 0.19 | 0.37 | 2.15 | 0.18 | 0.50 | 169
 | Add a New Car (Editable Grid) | 0.17 | 0.34 | 1.09 | 0.16 | 0.47 | 169
 | Update Address Relationship (Context-menus) | 0.11 | 0.34 | 4.19 | 0.28 | 0.50 | 1205
 | Add Comment on Customer (Comments) | 0.14 | 0.34 | 3.01 | 0.18 | 0.48 | 1206
 | Search for all Customers | 0.65 | 0.90 | 4.06 | 0.17 | 1.02 | 2423
 | Search for Particular Customer | 0.30 | 0.49 | 4.47 | 0.22 | 0.63 | 2427
Full Text Search | Facet filter on Car Brand | 0.37 | 0.61 | 4.56 | 0.27 | 0.73 | 2421
 | Reference Document Search | 0.51 | 0.70 | 2.19 | 0.16 | 0.84 | 506
 | Facet filter on Last Month | 0.32 | 0.50 | 2.05 | 0.18 | 0.65 | 501
 | Click Agent Inbox | 0.25 | 0.46 | 4.59 | 0.27 | 0.57 | 2404
 | Agent Acquire Task | 0.14 | 0.30 | 2.67 | 0.15 | 0.38 | 2406
 | Agent View Task | 0.63 | 1.47 | 6.12 | 0.47 | 1.80 | 2412
BPM | Complete Task - Finish Task | 0.21 | 0.58 | 5.93 | 0.45 | 0.51 | 1943
 | Get Next Task | 0.52 | 0.85 | 2.79 | 0.24 | 1.06 | 139
 | Click on Accordion inbox (Accordion) | 0.66 | 0.87 | 2.27 | 0.18 | 1.04 | 147
 | Manager View Refdoc Task (Lifecycle) | 0.98 | 1.48 | 3.75 | 0.37 | 1.92 | 202
 | View Reference Folder | 0.05 | 0.12 | 1.01 | 0.08 | 0.18 | 199
 | Click on Application Folder (Level 2) | 0.12 | 0.17 | 0.30 | 0.04 | 0.23 | 200
 | Click on Propose New (Menu bar) | 0.04 | 0.13 | 1.01 | 0.08 | 0.18 | 518
 | Click on Propose New (Content Tree) | 0.09 | 0.14 | 0.96 | 0.07 | 0.19 | 515
Navigation | Import Reference document | 0.60 | 1.01 | 5.14 | 0.40 | 1.24 | 504
 | Click Image View Customer (Image type) | 0.76 | 1.17 | 7.62 | 0.44 | 1.46 | 1198
 | Click Search for Car | 0.14 | 0.26 | 0.56 | 0.08 | 0.35 | 141
 | Click on Download Document List (excel export) | 0.49 | 0.70 | 3.58 | 0.24 | 0.84 | 507
 | Show List of Users (session parameter) | 0.02 | 0.07 | 0.29 | 0.06 | 0.15 | 251
BAM | Customer Document Reports | 0.06 | 0.13 | 1.09 | 0.09 | 0.19 | 204
 | View Agent Home Page | 0.29 | 0.47 | 4.39 | 0.17 | 0.60 | 1205
 | View PDF File (ContentViewer) | 0.43 | 0.83 | 5.30 | 0.39 | 1.06 | 495
Viewer | View PDF File (Advanced Viewer) | 0.31 | 0.77 | 5.90 | 0.42 | 1.04 | 496
 | Advanced Viewer Add Annotation | 0.08 | 0.21 | 3.32 | 0.19 | 0.29 | 496
Permission meta data | Show Cabinet Without Fetch Perm | 0.08 | 0.17 | 0.72 | 0.08 | 0.24 | 242
 | Show Cabinet with Fetch Perm | 0.19 | 0.35 | 1.35 | 0.12 | 0.44 | 243
Inline BO creation | Create Employee by Process | 0.34 | 0.57 | 1.39 | 0.16 | 0.77 | 249
 | Create Employee InlineBO | 0.51 | 0.76 | 1.72 | 0.16 | 0.95 | 248
 | Show Cars with Range | 0.14 | 0.22 | 0.52 | 0.06 | 0.30 | 244
Advanced Repository query | Show Cars with Range Last Page | 0.15 | 0.19 | 0.44 | 0.04 | 0.22 | 247
 | Show Cars Without Range | 0.08 | 0.14 | 0.39 | 0.05 | 0.19 | 248
 | Show Cars Without Range Last Page | 3.14 | 3.91 | 5.52 | 0.45 | 4.56 | 249
 | Create Shareable Object | 0.16 | 0.36 | 1.58 | 0.13 | 0.47 | 203
 | RTQ for Shareable Objects | 0.08 | 0.18 | 0.67 | 0.08 | 0.26 | 203
LWSO | RTQ for Shareable Objects without LWSO | 0.08 | 0.20 | 1.43 | 0.12 | 0.27 | 204
 | Create LWSO Object | 0.16 | 0.40 | 4.77 | 0.34 | 0.52 | 204
 | Create LWSO Objects using Batch | 0.36 | 0.71 | 4.42 | 0.39 | 0.94 | 208
 | View Shareable Object | 0.32 | 0.65 | 5.07 | 0.37 | 0.89 | 209
 | RTQ for LWSO Objects | 0.13 | 0.26 | 1.22 | 0.11 | 0.37 | 210
 | Materialize LWSO Object | 0.09 | 0.16 | 0.59 | 0.06 | 0.23 | 210
 | Dematerialize LWSO Object | 0.06 | 0.12 | 0.43 | 0.05 | 0.18 | 211
 | RTQ for Shareable Objects with LWSO | 0.07 | 0.21 | 3.51 | 0.25 | 0.31 | 211
 | Reparent LWSO Object | 0.08 | 0.14 | 0.54 | 0.06 | 0.20 | 206
 | Delete LWSO Object | 0.34 | 0.60 | 5.46 | 0.40 | 0.77 | 192
 | Delete Shareable Object | 0.11 | 0.20 | 0.56 | 0.06 | 0.25 | 194
Sign Out | Sign Out | 0.00 | 0.00 | 0.05 | 0.00 | 0.00 | 1195
Server-Side Metrics
The table below shows the server metrics during the load test, the Documentum
server's CPU usage was around 87% which was the major limiting factor, no other
resource usage saturated.
Table 8 - CAPINS Standard Load Test Server-Side Metrics

Metric | Documentum | Oracle | App | CTS | CIS | xPlore | BAM
Avg. CPU Usage (%) | 86.79 | 65.43 | 18.95 | 18.92 | 19.52 | 31.57 | 8.18
Avg. Processor Queue Length | 7.49 | 1.30 | 0.05 | 0.02 | 0.17 | 0.25 | 0.13
Avg. Available Memory (GB) | 0.89 of 8 | 0.82 of 8 | 2.84 of 8 | 3.73 of 8 | 3.78 of 4 | 0.61 of 8 | 0.48 of 4
Avg. Committed Memory (GB) | 10.32 | 10.76 | 6.39 | 4.22 | 4.69 | 8.56 | 4.11
Avg. Disk Queue Length | 0.83 | DATA: 0.74, LOG: 1.95 | 0.02 | 0.02 | 0.02 | 0.35 | 0.00
Avg. Read Rate (MB/s) | 0.32 | DATA: 0.61, LOG: 0.06 | 0.02 | 0.05 | 0.02 | 0.65 | 0.00
Avg. Write Rate (MB/s) | 1.16 | DATA: 2.05, LOG: 41.75 | 0.14 | 0.19 | 0.06 | 1.24 | 0.00
Avg. Disk Reads/sec | 41.57 | DATA: 38.85, LOG: 0.65 | 0.25 | 0.49 | 0.26 | 17.59 | 0.01
Avg. Disk Writes/sec | 144.86 | DATA: 98.50, LOG: 127.59 | 4.18 | 2.61 | 4.20 | 44.01 | 0.55
Avg. Received Rate (MB/s) | 11.62 | 8.72 | 1.30 | 0.47 | 0.16 | 0.19 | 0.44
Avg. Sent Rate (MB/s) | 10.53 | 10.64 | 0.59 | 0.45 | 0.12 | 0.17 | 0.12
Throughput
Client-Side Metrics
The chart below shows all the transactions' average response times were steady
during the endurance test.
Figure 3 - CAPINS Endurance Test Response Time Trends
The table below shows the detailed transaction response times during the endurance
test; no noticeable performance degradation was observed.
Table 10 - CAPINS Endurance Test Client-Side Metrics

Feature to Cover | Test Transaction | Minimum | Average | Maximum | Std. Dev | 90 Percent | Count
Login | Get Signing page | 0.01 | 0.02 | 0.12 | 0.00 | 0.02 | 46880
 | Login | 0.08 | 0.18 | 12.05 | 0.30 | 0.23 | 46945
Business Object | Create Customer - Page Load | 0.04 | 0.14 | 15.63 | 0.33 | 0.17 | 46776
 | Create Insurance Customer | 1.01 | 1.53 | 18.46 | 0.49 | 1.79 | 46792
 | View Customer Details | 0.14 | 0.61 | 18.29 | 0.58 | 0.74 | 93143
 | Submit Car Insurance Request | 0.33 | 0.63 | 2.91 | 0.15 | 0.76 | 93072
Admin | Create City | 0.08 | 0.18 | 2.73 | 0.13 | 0.24 | 6320
 | Attach Telematic (Type Fragment) | 0.14 | 0.30 | 5.47 | 0.19 | 0.39 | 46728
 | Edit Partner Set Gold (Type Fragment) | 0.09 | 0.18 | 1.11 | 0.09 | 0.25 | 3224
 | List Alerts (Alert Query) | 0.06 | 0.14 | 1.48 | 0.12 | 0.20 | 5608
 | Create a Cabinet | 0.09 | 0.19 | 1.43 | 0.09 | 0.26 | 6344
 | Click on Manager Cars (Editable Grid) | 0.08 | 0.17 | 1.46 | 0.10 | 0.23 | 6336
 | Click to Update Car (Editable Grid) | 0.17 | 0.34 | 3.93 | 0.28 | 0.44 | 6353
 | Add a New Car (Editable Grid) | 0.14 | 0.29 | 1.39 | 0.12 | 0.39 | 6368
 | Update Address Relationship (Context-menus) | 0.09 | 0.33 | 14.08 | 0.43 | 0.45 | 46816
 | Add Comment on Customer (Comments) | 0.12 | 0.29 | 13.92 | 0.28 | 0.38 | 46784
 | Search for all Customers | 0.64 | 0.87 | 19.42 | 0.50 | 0.95 | 93112
 | Search for Particular Customer | 0.30 | 0.43 | 18.94 | 0.35 | 0.52 | 93041
Full Text Search | Facet filter on Car Brand | 0.35 | 0.55 | 17.96 | 0.51 | 0.63 | 93160
 | Reference Document Search | 0.48 | 0.69 | 17.32 | 0.48 | 0.79 | 19816
 | Facet filter on Last Month | 0.30 | 0.44 | 3.57 | 0.17 | 0.56 | 19712
 | Click Agent Inbox | 0.23 | 0.43 | 15.33 | 0.46 | 0.50 | 93024
 | Agent Acquire Task | 0.12 | 0.27 | 13.67 | 0.32 | 0.33 | 92880
 | Agent View Task | 0.51 | 1.27 | 17.27 | 0.51 | 1.51 | 92984
BPM | Complete Task - Finish Task | 0.17 | 0.37 | 16.75 | 0.46 | 0.45 | 74512
 | Get Next Task | 0.44 | 0.78 | 2.81 | 0.22 | 1.02 | 5568
 | Click on Accordion inbox (Accordion) | 0.62 | 0.85 | 2.86 | 0.19 | 1.00 | 5568
 | Manager View Refdoc Task (Lifecycle) | 0.81 | 1.35 | 6.29 | 0.42 | 1.68 | 6080
 | View Reference Folder | 0.05 | 0.11 | 1.79 | 0.08 | 0.15 | 7824
 | Click on Application Folder (Level 2) | 0.11 | 0.16 | 1.00 | 0.06 | 0.21 | 7784
 | Click on Propose New (Menu bar) | 0.03 | 0.10 | 1.70 | 0.05 | 0.14 | 19977
Navigation | Click on Propose New (Content Tree) | 0.09 | 0.16 | 13.21 | 0.52 | 0.17 | 19968
 | Import Reference document | 0.52 | 0.91 | 17.65 | 0.72 | 1.09 | 19880
 | Click Image View Customer (Image type) | 0.58 | 1.12 | 18.54 | 0.62 | 1.34 | 46881
 | Click Search for Car | 0.13 | 0.24 | 1.02 | 0.08 | 0.31 | 5568
 | Click on Download Document List (excel export) | 0.48 | 0.67 | 15.16 | 0.51 | 0.78 | 19760
 | Show List of Users (session parameter) | 0.01 | 0.05 | 0.44 | 0.05 | 0.10 | 10224
BAM | Customer Document Reports | 0.05 | 0.10 | 1.41 | 0.06 | 0.14 | 7856
 | View Agent Home Page | 0.26 | 0.42 | 3.24 | 0.14 | 0.51 | 46936
 | View PDF File (ContentViewer) | 0.34 | 0.74 | 5.74 | 0.31 | 0.98 | 19568
Viewer | View PDF File (Advanced Viewer) | 0.26 | 0.67 | 5.34 | 0.31 | 0.94 | 19393
 | Advanced Viewer Add Annotation | 0.07 | 0.19 | 4.45 | 0.17 | 0.28 | 19328
Permission meta data | Show Cabinet Without Fetch Perm | 0.07 | 0.14 | 2.14 | 0.13 | 0.19 | 10184
 | Show Cabinet with Fetch Perm | 0.16 | 0.34 | 14.88 | 0.60 | 0.40 | 10184
Inline BO creation | Create Employee by Process | 0.30 | 0.55 | 17.19 | 0.69 | 0.70 | 10048
 | Create Employee InlineBO | 0.41 | 0.70 | 2.43 | 0.19 | 0.88 | 10056
 | Show Cars with Range | 0.14 | 0.21 | 0.96 | 0.07 | 0.28 | 10185
Advanced Repository query | Show Cars with Range Last Page | 0.15 | 0.19 | 0.67 | 0.04 | 0.23 | 10192
 | Show Cars Without Range | 0.08 | 0.12 | 1.77 | 0.08 | 0.17 | 10224
 | Show Cars Without Range Last Page | 2.97 | 3.93 | 7.65 | 0.55 | 4.58 | 10152
 | Create Shareable Object | 0.14 | 0.30 | 3.47 | 0.18 | 0.40 | 7768
 | RTQ for Shareable Objects | 0.06 | 0.14 | 0.76 | 0.07 | 0.21 | 7745
 | RTQ for Shareable Objects without LWSO | 0.07 | 0.18 | 2.23 | 0.15 | 0.25 | 7752
 | Create LWSO Object | 0.16 | 0.36 | 4.60 | 0.39 | 0.43 | 7824
 | Create LWSO Objects using Batch | 0.34 | 0.62 | 14.86 | 0.70 | 0.74 | 7840
 | View Shareable Object | 0.25 | 0.56 | 6.15 | 0.33 | 0.71 | 7801
LWSO | RTQ for LWSO Objects | 0.11 | 0.23 | 6.35 | 0.31 | 0.29 | 7808
 | Materialize LWSO Object | 0.07 | 0.15 | 4.33 | 0.20 | 0.20 | 7801
 | Dematerialize LWSO Object | 0.06 | 0.13 | 13.73 | 0.62 | 0.14 | 7816
 | RTQ for Shareable Objects with LWSO | 0.06 | 0.19 | 16.33 | 0.74 | 0.24 | 7848
 | Reparent LWSO Object | 0.06 | 0.13 | 4.93 | 0.22 | 0.17 | 7864
 | Delete LWSO Object | 0.28 | 0.50 | 3.70 | 0.20 | 0.63 | 7649
 | Delete Shareable Object | 0.09 | 0.17 | 1.53 | 0.07 | 0.23 | 7577
Sign Out | Sign Out | 0.00 | 0.00 | 0.07 | 0.00 | 0.00 | 46488
Server-Side Metrics
The charts below show that the servers' CPU and memory usage was stable during
the 48-hour endurance test; no resource leakage was observed.
Figure 4 - CAPINS Endurance Test Server CPU Usage Trends
Figure 5 - CAPINS Endurance Test Server Memory Usage Trends
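Reading endurance charts like these for leakage amounts to checking that memory usage shows no steady upward trend. A minimal sketch with synthetic sample data (the 10 MB/hour threshold is an arbitrary illustration, not a value from this test):

```python
# Illustrative leak check of the kind implied by the endurance charts above:
# fit a least-squares line to periodic memory-usage samples and flag a steady
# upward trend. The sample data below is synthetic, not from this test.
def slope_mb_per_hour(samples, interval_hours=1.0):
    """Least-squares slope of evenly spaced (time, memory-MB) samples."""
    n = len(samples)
    xs = [i * interval_hours for i in range(n)]
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def looks_like_leak(samples, threshold_mb_per_hour=10.0):
    return slope_mb_per_hour(samples) > threshold_mb_per_hour

stable = [4100, 4120, 4090, 4110, 4105, 4095, 4115, 4100]   # flat, noisy
leaking = [4100, 4180, 4260, 4330, 4410, 4490, 4570, 4640]  # steady growth
print(looks_like_leak(stable), looks_like_leak(leaking))  # -> False True
```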
Throughput
Client-Side Metrics
The table below shows the transactions' response times during the test; all
transactions' average response times were under 2 seconds.
Table 12 - Clinical Low Load Test Client-Side Metrics

Test Transaction | Minimum | Average | Maximum | Std. Dev | 90 Percent | Count
Get_Signin_Page | 0.01 | 0.02 | 0.05 | 0.00 | 0.02 | 3852
Figure 6 - Clinical Gauging Test Result
Throughput
Client-Side Metrics
The table below shows the detailed transactions' response times during the test; the
response times were consistent.
Table 14 - Clinical Standard Load Test Client Metrics

Test Transaction | Minimum | Average | Maximum | Std. Dev | 90 Percent | Count
Get_Signin_Page | 0.01 | 0.02 | 0.16 | 0.01 | 0.02 | 9279
Server-Side Metrics
The table below shows the major server metrics during the load test. The Documentum
server's CPU usage was around 79%, which was the major limiting factor; no other
resource usage was saturated.
Table 15 - Clinical Standard Load Test Server-Side Metrics

Metric | Documentum | Oracle | Apphost | CTS | xPlore
Avg. CPU Usage (%) | 78.77 | 51.37 | 38.60 | 8.93 | 12.81
Avg. Processor Queue Length | 4.43 | 0.25 | 0.35 | 0.01 | 0.04
Avg. Available Memory (GB) | 0.56 of 8 | 0.36 of 8 | 3.68 of 8 | 0.79 of 8 | 2.47 of 8
Avg. Committed Memory (GB) | 10.45 | 9.84 | 6.28 | 4.19 | 8.29
Avg. Disk Queue Length | 1.38 | DATA: 0.18, LOG: 0.96 | 0.13 | 0.01 | 0.02
Avg. Read Rate (MB/s) | 0.19 | DATA: 0.16, LOG: 0.36 | 0.01 | 0.02 | 0.03
Avg. Write Rate (MB/s) | 1.47 | DATA: 1.84, LOG: 9.36 | 0.10 | 0.09 | 0.15
Avg. Disk Reads/sec | 51.07 | DATA: 15.87, LOG: 9.70 | 0.09 | 0.33 | 0.27
Avg. Disk Writes/sec | 272.20 | DATA: 30.52, LOG: 84.35 | 1.94 | 2.32 | 7.91
Avg. Received Rate (MB/s) | 12.50 | 9.30 | 1.82 | 0.21 | 0.04
Avg. Sent Rate (MB/s) | 11.01 | 11.11 | 1.26 | 0.15 | 0.02
Throughput
Client-Side Metrics
The chart below shows all the Clinical transactions' average response times were
steady during the endurance test.
Figure 7 - Clinical Endurance Test Response Time Trends
The table below shows the detailed transaction response times during the endurance
test; no noticeable performance degradation was observed.
Table 17 – Clinical Endurance Test Client-Side Metrics
Test Transaction | Minimum | Average | Maximum | Std. Dev | 90 Percent | Count
Server-Side Metrics
The charts below show that the servers' CPU and memory usage was stable during
the endurance test; no resource leakage was observed.
Figure 8 - Clinical Endurance Test Server CPU Usage Trends
Figure 9 - Clinical Endurance Test Server Memory Usage Trends
Conclusion
• xCP 20.4 provided consistent response times at the supported load level for both
the CAPINS application, which covers most common xCP transactions, and the
Clinical application, which covers case management/process management
transactions. (Detailed test scenario functions are provided in Appendix C.)
• Resource usage grew linearly with the user load in xCP 20.4, and the major
performance-limiting factor was Documentum server CPU usage.
• On the Designer side, operation time depends on application size. With a
medium-sized application such as CAPINS, all operation response times were
acceptable, and turning Auto Build off gives a better user experience.
• No performance degradation or resource leakage was observed during the
48-hour endurance tests in xCP 20.4.
• All test results and analysis are intended to offer high-level guidance on how
xCP 20.4 performs in a given environment that is representative of a typical user
profile. They reflect lab-based test results.
Appendices
Appendix A - Test Environment

Server | OS | CPU | RAM | Disk | Network | Software
CTS Server | Windows 2019 | Intel Xeon E5-2620 2.0GHz * 8 cores | 8 GB | 50 GB VM Disk | 10 Gbps | Open JDK 11, CTS 20.4
Database | Windows 2019 | Intel Xeon E5-2699 2.2GHz * 8 cores | 8 GB | 100 GB SAN, 50 GB VM Disk | 10 Gbps | Oracle 19.3
CIS Server | Windows 2019 | Intel Xeon E5-2620 2.0GHz * 4 cores | 4 GB | 50 GB VM Disk | 10 Gbps | Open JDK 11, CIS 20.2
xPlore Server | Windows 2019 | Intel Xeon E5-2695 2.4GHz * 4 cores | 8 GB | 100 GB SAN, 50 GB VM Disk | 10 Gbps | Open JDK 11, xPlore 20.2
BAM Server | Windows 2019 | Intel Xeon E5-2620 2.0GHz * 4 cores | 4 GB | 50 GB VM Disk | 10 Gbps | Open JDK 11, Tomcat 9.0.26, BAM 20.4
Test Client | Windows 2012 | Intel Xeon E5-2670 2.6GHz * 4 cores | 8 GB | 50 GB VM Disk | 10 Gbps | HPC 12.55
Appendix B - Application and System Tuning Guide

Component | Parameter | Description | Default Value | Tuned Value
xPlore Index Server Config | CPS-requests-max-size | Max request size for CPS | 1000 | 5000
xPlore Index Server Config | index-requests-batch-size | Index batch size | 500 | 1000
xPlore Index Server Config | index-thread-wait-time | Index threads wait time | 5000 | 10000
Some indexes were added in the database to achieve better performance. In xCP, it is
recommended to add indexes for the attributes used in RTQ queries for better
performance.
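The effect of such an index can be illustrated with SQLite as a stand-in for the Oracle database used in this test; the table and attribute names below are hypothetical, not the actual CAPINS schema:

```python
import sqlite3

# Illustration of the indexing advice above, using SQLite as a stand-in for
# the Oracle database in the test environment. Table and attribute names are
# hypothetical, not taken from the CAPINS data model.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE capins_customer (r_object_id TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO capins_customer VALUES (?, ?)",
    [(f"09{i:014d}", f"city_{i % 50}") for i in range(1000)],
)

def plan(sql):
    """Return the query plan detail strings for a statement."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM capins_customer WHERE city = 'city_7'"
plan_before = plan(query)  # full table scan: every row is examined
conn.execute("CREATE INDEX idx_customer_city ON capins_customer (city)")
plan_after = plan(query)   # index search: only matching rows are touched
print(plan_before)
print(plan_after)
```

The same principle applies to attributes filtered by an xCP real-time query: without an index the database scans the underlying table on every execution.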
Appendix C - Detailed Test Scenarios
Designer Test Cases

Page Modification:
• Add an Accordion
• Delete an Accordion
• Add a Border layout
• Delete a Border layout
• Add a Google map
• Delete a Google map
• Paste a simple widget
• Paste a complex widget (with data service)

Query Preview:
• Open preview RTQ
• Execute RTQ
• Open preview FTQ
• Execute FTQ

Others:
• Open an Application
• Refresh Workspace
• Show Data Service Instances
• Delete duplicates of an artifact
• Save a BO with 20 pages opened
• Bind data service for 5 existing pages
• Bind data service for 10 existing pages
• Bind data service for 25 existing pages
• Bind data service for 5 existing fragments
• Bind data service for 10 existing fragments
• Bind data service for 25 existing fragments
• Open Online Help
• Switch Artifacts in Dynamic Help
• Import an existing Application
CAPINS Test Cases
The table below describes the performance scenarios and user distribution used in
the xCP 20.4 runtime tests for the CAPINS refapp; a pace time (3-5 minutes) is added
at the end of each scenario.
Table 22 - CAPINS Test Scenario

Scenario 1: Request Car Insurance (user distribution: 54%)
A customer service agent creates a new customer and a car insurance request,
reviews the new task and sends it to his/her manager for review; if needed, searches
tasks with a signed contract and completes them, then searches the insurance request
and attaches a telematic.
1. The agent user logs in to the system.
2. Views the agent home page.
3. Views customer details from the home page and edits the customer (address and comment).
4. Clicks "Create Customer", inputs the necessary info, and clicks "Save".
5. Searches for the customer and creates an Insurance Request for the customer.
6. Goes to "Inbox" and searches for the agent review task, then acquires the task and views the task details.
7. Completes the task with (20%) or without (80%) expert review.
8. Searches for the signed contract task, acquires the task, views the task details, and completes the task.
9. Searches for the insurance request and attaches a telematic.
10. Logs out.
Clinical Test Cases

Scenario: Case Node (40% of users)
1. Login
2. Go to Case Cabinet
3. Create a new Clinical Case
4. Navigate Case
5. Acquire State Plan
6. Create a new Case Node
7. Complete State Plan
8. Back to Case View
9. Acquire State Develop
10. Modify the Case Node
11. Complete State Develop
12. Back to Case View
13. Acquire State Analysis
14. Go to Case Node View
15. Create a New Object
16. Complete State Analysis
17. Resume Instance
18. Click Change Supervisor
19. Filter Username
20. Finish Change Supervisor
21. Logout

Scenario: Navigation and Search (30% of users)
1. Login
2. Go to Repo Browser
3. Expand Cabinet
4. Browse Cabinet
5. Browse Folder
6. Navigate Cabinet
7. Create Folder
8. Import Document
9. View Document
10. Delete Document
11. Remove Folder
12. Go to Search
Appendix D - Glossary
Table 24 - Glossary
Term Description
CAPINS | Casualty and Property Insurance application, a simple application deployed in xCP to verify major runtime features.
Clinical | An application that includes the Case Management and Process Management features.
Think time | A delay between transactions to simulate end-user actions.
Pace time | A delay at the end of the test script to simulate end-user behavior (taking a break after a round of operations).
HPS | HTTP requests per second.
TPS | Transactions per second.
ACS | The protocol between the client/browser and Documentum used to retrieve document content.
CTS | Content Transform Server for Documentum; provides content transformation capabilities for various content formats.
CIS | Content Intelligence Server for Documentum; provides content classification and categorization functions for metadata.
DA | Documentum Administrator, the server for administration operations in Documentum.
BAM | Business Activity Monitor, the server that monitors Workflow/Task/Activity operations and provides reports for Documentum xCP.
BPS | Business Process Integrator, the server that provides inbound messaging capabilities to Documentum applications.
About OpenText
OpenText enables the digital world, creating a better way for organizations to work with information, on premises or in the
cloud. For more information about OpenText (NASDAQ: OTEX, TSX: OTC) visit opentext.com.
Connect with us:
www.opentext.com/contact
Copyright © 2020 Open Text SA or Open Text ULC (in Canada).
All rights reserved. Trademarks owned by Open Text SA or Open Text ULC (in Canada).