Oracle Database 11g
Real Application Testing
What is Real Application Testing?
• New database option, available with Enterprise Edition (EE) only
• Includes two new features
• Database Replay (DB Replay)
• SQL Performance Analyzer (SPA)
• SQL Tuning Set (STS) functionality is shared between
Real Application Testing option and EM Tuning Pack
• Advanced functionality requires additional packs
• Diagnostic Pack for DB Replay
• Tuning Pack for SPA
• Price (?)
2
Real Application Testing
• Value
• Reduces testing cost
• Improves testing quality
• Business Benefit
• Faster technology adoption
• Lower risk
[Diagram: change lifecycle — Change → Test → Deploy → Remediate]
Solution for the Agile Business
3
Database Replay
4
The Need for Database Replay
• System changes like hardware/software upgrades are a fact of life
• Customers want to identify the full impact of a change before going live
• Extensive testing and validation can be expensive in time and $$$
• Despite expensive testing, the success rate is low
• Many issues go undetected
• Changes can impact system availability and performance negatively
• Cause of low success rate
• Inability to properly test with real-world production workloads leaves many issues undetected
• Database Replay makes it possible to do real-world testing
5
Database Replay
• Recreate the actual production database workload in a test environment
• Identify, analyze and fix potential instabilities before making changes to production
• Capture Workload in Production
• Capture the full production workload with real load & concurrency info
• Move the captured workload to the test system
• Replay Workload in Test
• Make the desired changes in the test system
• Replay workload with production load & concurrency
• Honor commit ordering
• Analyze & Report
• Errors
• Data divergence
• Performance divergence
6
Database Replay: Supported Changes
[Diagram: clients → middle tier → database → storage; recording of external client requests happens at the database tier, so changes to the client and middle tier are not covered]
Changes Supported
• Database upgrades, patches
• Schema, parameters
• RAC nodes, interconnect
• OS platforms, OS upgrades
• CPU, memory
• Storage
• Etc.
7
Comparison of LoadRunner & DB Replay
Testing e-Business Suite
[Chart: time taken (days) per testing phase — Install & Setup, Understand Application Usage, Identify Key Transactions, Generate Workload, Run Test — for LoadRunner vs. DB Replay]
Total Testing Time
• DB Replay: 2 weeks
• LoadRunner: 30 weeks
8
Why DB Replay?
From (150 days) → To (10 days):
• Artificial workloads → Production workloads
• Partial workflows → Complete workflows
• Months of development → Days of development
• Manual intensive → Automated
• High risk → Low risk
9
Database Replay Overview
10
Step 1: Workload Capture
• All external client requests captured in binary files
• System background and internal activity excluded
• Minimal performance overhead for capture
• For RAC, shared and local file systems supported
• Specify an interesting time period for capture, e.g., peak workload, month-end processing, etc.
[Diagram: production system — clients → middle tier → database writing capture files (File 1 … File n) to a file system]
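For example, a minimal capture could be started and stopped with the DBMS_WORKLOAD_CAPTURE package (a sketch; directory path and capture name are illustrative):

  -- Directory object that will hold the binary capture files
  CREATE OR REPLACE DIRECTORY capture_dir AS '/u01/app/oracle/rat/capture';

  BEGIN
    -- Start capturing all external client requests
    DBMS_WORKLOAD_CAPTURE.START_CAPTURE(
      name     => 'peak_workload_capture',
      dir      => 'CAPTURE_DIR',
      duration => NULL);                -- NULL = run until FINISH_CAPTURE is called
  END;
  /

  -- ... let the interesting period elapse (peak workload, month-end run) ...

  BEGIN
    DBMS_WORKLOAD_CAPTURE.FINISH_CAPTURE;
  END;
  /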
11
Capture Options
• Workload can be filtered to customize what is captured
• Filter Types
• Inclusion Filters: Specifies which sessions should be captured
• Exclusion Filters: Specifies which sessions should NOT be
captured
• Filter Attributes: Workload capture can be filtered using any of the
following session attributes
• User
• Program
• Module
• Action
• Service
• Session ID
• Workload capture can be run on demand or scheduled to run at a later time
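As a sketch, a filter on the PROGRAM attribute might look like this (filter name and value are illustrative; whether filters act as inclusion or exclusion filters is governed by the default_action argument of START_CAPTURE):

  BEGIN
    -- With the default default_action => 'INCLUDE', this filter excludes
    -- sessions whose program name matches the value from the capture
    DBMS_WORKLOAD_CAPTURE.ADD_FILTER(
      fname      => 'skip_batch_reports',
      fattribute => 'PROGRAM',
      fvalue     => 'report_batch%');
  END;
  /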
12
Step 2: Process Workload Files
• Set up the test system
• Logically similar data as production
• Use RMAN to physically restore the production DB from backup
• Use snapshot standby
• Use imp/exp, Data Pump, etc.
• Processing transforms the captured data into replay files and generates the necessary metadata
• Must be done on the same database version as the replay system
• Recommended to process on the test system
• Once processed, the workload can be replayed many times
• For RAC, if using a local file system, copy all capture files to a single location for replay
[Diagram: capture files (File 1 … File n) are processed into replay files plus metadata on the test system]
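Processing itself is a single call on the test system, pointed at the directory object holding the copied capture files (directory name is illustrative):

  BEGIN
    DBMS_WORKLOAD_REPLAY.PROCESS_CAPTURE(capture_dir => 'REPLAY_DIR');
  END;
  /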
13
Step 3: Replay Workload
• The Replay Driver is a special client program that consumes the processed workload and sends requests to the replay system
• Replayed requests preserve the timing, concurrency and dependencies seen on the capture system
• The Replay Driver consists of one or more replay clients; for workloads with high concurrency it may be necessary to start multiple clients to drive the workload
• Once all replay clients are started, workload replay is initiated by the user
[Diagram: replay clients in the Replay Driver read the replay files and metadata and drive the replay system]
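A minimal replay sequence, assuming the processed files sit in a REPLAY_DIR directory object and the replay clients are started with the wrc tool (names, connect string and paths are illustrative):

  BEGIN
    -- Associate this replay with the processed capture files
    DBMS_WORKLOAD_REPLAY.INITIALIZE_REPLAY(
      replay_name => 'upgrade_test_1',
      replay_dir  => 'REPLAY_DIR');

    -- Optionally remap captured connection strings to the test system
    -- DBMS_WORKLOAD_REPLAY.REMAP_CONNECTION(
    --   connection_id     => 1,
    --   replay_connection => 'testhost:1521/testdb');

    -- Defaults give a fully synchronized replay
    DBMS_WORKLOAD_REPLAY.PREPARE_REPLAY;
  END;
  /

  -- From the OS shell, start one or more replay clients, e.g.:
  --   wrc system/<password> mode=replay replaydir=/u01/app/oracle/rat/replay
  -- (running wrc mode=calibrate replaydir=... first suggests how many clients to start)

  BEGIN
    DBMS_WORKLOAD_REPLAY.START_REPLAY;
  END;
  /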
14
Replay Options
• Synchronized Replay
• Workload is replayed in full synchronized mode
• Exact same concurrency and timing as production workload
• Transaction commit order is honored
• Ensures minimal data divergence
• Unsynchronized Replay
• Workload can be replayed in unsynchronized mode
• Useful for load/stress testing
• High Data Divergence
• Three (3) parameters provided to control degree of synchronization
• Think time synchronization
• Commit order synchronization
• Connect (logon) time synchronization
15
Replay Options
• Unsynchronized Replay Parameters
• Think time synchronization
• Controls think time between database calls
• Auto (Default): Adjusts think time so as to maintain captured request rate
• Percentage
• 0% No think time, highest possible request rate
• <100% Higher request rate
• 100% Exact think time
• >100% Lower request rate
• Commit order synchronization
• Controls commit order between transactions
• In unsynchronized mode, commit order is not honored – transactions are committed as soon as the commit call is issued
• Connect (logon) time synchronization
• Controls when sessions are created
• 0%: All sessions are connected immediately
• 100% (Default): Sessions connect at same time as in captured system
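An unsynchronized, maximum-rate replay might be prepared as follows (a sketch assuming the 11gR1 BOOLEAN form of the synchronization parameter):

  BEGIN
    DBMS_WORKLOAD_REPLAY.PREPARE_REPLAY(
      synchronization    => FALSE,   -- do not honor captured commit order
      think_time_scale   => 0,       -- 0%: no think time, highest request rate
      connect_time_scale => 0);      -- 0%: all sessions connect immediately
  END;
  /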
16
Replay Options
• Number of Replay Clients
• Configurable by user
• Client Calibration Advisor recommends number of replay
clients needed for specific workload
• Replay clients are multithreaded clients that can drive
multiple workload sessions each
17
Analysis & Reporting
• Comprehensive reports are provided for analysis purposes
• Three (3) types of divergence are reported
• Data Divergence: Number of rows returned by each call are
compared and divergences reported
• Error Divergence: For each call error divergence is reported
• New: Error encountered during replay not seen during capture
• Not Found: Error encountered during capture not seen during
replay
• Mutated: Different error produced in replay than during capture
• Performance Divergence
• Capture and Replay Report: Provides high-level performance
information
• ADDM Report: Provides in-depth performance analysis
• AWR, ASH Report: Facilitates comparative or skew analysis
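The capture and replay reports can be generated with the REPORT functions; the IDs below are illustrative and can be looked up in DBA_WORKLOAD_CAPTURES and DBA_WORKLOAD_REPLAYS:

  DECLARE
    cap_rep CLOB;
    rep_rep CLOB;
  BEGIN
    cap_rep := DBMS_WORKLOAD_CAPTURE.REPORT(capture_id => 1, format => 'TEXT');
    rep_rep := DBMS_WORKLOAD_REPLAY.REPORT(replay_id => 1, format => 'TEXT');
  END;
  /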
18
Workload Types Supported
Supported
• All SQL (DML, DDL, PL/SQL) with practically all types of binds
• Full LOB functionality (Cursor based and direct OCI)
• Local transactions
• Login/Logoffs
• Session switching
• Limited PL/SQL RPCs
Limitations
• Direct path load, import/export
• OCI based object navigation (ADTs) and REF binds
• Streams, non-PL/SQL based AQ
• Distributed txns, remote describe/commit operations
• Flashback
• Shared Server
19
EM Interface: DB Replay Summary
20
Best Practices
• Capture
• Provide adequate disk space for captured workload (binary files)
• Database restart (Optional): Recommended to minimize divergence during
replay
• For RAC, use shared file system
• Test System Setup
• Ensure data in test is identical to production as of capture start time to
minimize data divergence during replay
• Use RMAN backup/restore or Snapshot Standby feature to setup test system
• Reset system clock to same time as production if application logic involves
SYSDATE usage
• Process Workload
• Processing workload has performance overhead and can possibly take a
long time
• Process the workload on the test system rather than the production system
• Replay
• Use Client Calibration Advisor to identify number of replay clients needed to
replay workload properly
21
DB Replay Security Model
• Any “Non-SYS” user with DBA Role
• SYSDBA role is not mandatory
• Does not have to be the user whose workload is captured
• “Execute” privileges on DBMS_WORKLOAD_CAPTURE/REPLAY procedures
• DBMS_WORKLOAD_CAPTURE
• START_CAPTURE, FINISH_CAPTURE, REPORT, ADD_FILTER, DELETE_FILTER
• DBMS_WORKLOAD_REPLAY
• PROCESS_CAPTURE, INITIALIZE_REPLAY, PREPARE_REPLAY, START_REPLAY, CANCEL, REPORT, ADD_FILTER, REMAP_CONNECTION
• Capture and Replay user can be different if they have
appropriate privileges
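A sketch of the grants for a dedicated capture/replay administrator (user name is illustrative):

  GRANT DBA TO rat_admin;
  GRANT EXECUTE ON DBMS_WORKLOAD_CAPTURE TO rat_admin;
  GRANT EXECUTE ON DBMS_WORKLOAD_REPLAY TO rat_admin;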
22
SQL Performance
Analyzer (SPA)
23
SQL Performance Analyzer (SPA)
• Test the impact of a change on SQL query performance
• Capture SQL workload in production, including statistics & bind variables
• Re-execute SQL queries in the test environment
• Use SQL Tuning Advisor to tune regressions
[Diagram: production system (capture SQL) → test system (re-execute SQL queries)]
24
SPA Benefits
• Enables identification of SQL performance regressions before end users are impacted
• SPA can help with any change that impacts SQL execution plan
• DB upgrades
• Optimizer statistics refresh
• New indexes, Materialized Views, Partitions, etc.
• Fix regressed SQL with SQL Tuning Advisor and SQL Plan
Baselines
• Integrated with query optimizer
• Captures SQL workload with low overhead
25
SQL Performance Analyzer Workflow
[Diagram: SPA workflow]
• Initial environment: execute production SQL and gather stats
• Make change → post-change environment: re-execute SQL → compare performance
• Tune SQL (SQL Tuning Advisor) → tuned environment: execute SQL → compare performance
26
Step 1: Capture SQL Workload
• SQL Tuning Set (STS) used to store the SQL workload
• STS includes:
• SQL text
• Bind variables
• Execution plans
• Execution statistics
• Incremental capture used to populate the STS from the cursor cache over a period of time
• The SQL tuning set's filtering and ranking capabilities filter out undesirable SQL
[Diagram: cursor cache → (incremental capture) → SQL tuning set on the production database]
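A minimal capture sketch using DBMS_SQLTUNE (STS name, duration and polling interval are illustrative):

  BEGIN
    DBMS_SQLTUNE.CREATE_SQLSET(
      sqlset_name => 'spa_sts',
      description => 'Workload for SQL Performance Analyzer');

    -- Poll the cursor cache every 5 minutes for 1 hour and merge the
    -- captured SQL, binds, plans and statistics into the STS
    DBMS_SQLTUNE.CAPTURE_CURSOR_CACHE_SQLSET(
      sqlset_name     => 'spa_sts',
      time_limit      => 3600,
      repeat_interval => 300);
  END;
  /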
27
Step 2: Move SQL Workload to Test System
[Diagram: SQL tuning set exported from the production database and imported into the test database]
• Copy the SQL tuning set to a staging table (“pack”)
• Transport the staging table to the test system (Data Pump, db link, etc.)
• Copy the SQL tuning set from the staging table (“unpack”)
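Pack/unpack sketch with DBMS_SQLTUNE (STS and staging table names are illustrative):

  -- On production: stage the STS
  BEGIN
    DBMS_SQLTUNE.CREATE_STGTAB_SQLSET(table_name => 'SPA_STS_STAGE');
    DBMS_SQLTUNE.PACK_STGTAB_SQLSET(
      sqlset_name        => 'spa_sts',
      staging_table_name => 'SPA_STS_STAGE');
  END;
  /

  -- Transport SPA_STS_STAGE to the test system (Data Pump, db link, ...), then:
  BEGIN
    DBMS_SQLTUNE.UNPACK_STGTAB_SQLSET(
      sqlset_name        => '%',
      replace            => TRUE,
      staging_table_name => 'SPA_STS_STAGE');
  END;
  /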
28
Step 3: Execute SQL Before Making Change
• “Before change” SQL performance is the SQL workload performance baseline
• SQL performance = execution plans + execution statistics
• Test-execute SQL in the SQL tuning set:
• Produce execution plans and statistics
• Execute SQL serially (no concurrency)
• Every SQL is executed only once
• Skip DDL/DML effects
• Alternatively, explain-plan the SQL in the SQL tuning set to generate SQL plans only
[Diagram: SQL Performance Analyzer fetches each SQL from the STS, test-executes it, and saves the “before change” execution plan and statistics]
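A sketch of taking the “before change” baseline with DBMS_SQLPA (task, STS and execution names are illustrative):

  DECLARE
    tname VARCHAR2(128);
  BEGIN
    tname := DBMS_SQLPA.CREATE_ANALYSIS_TASK(
               sqlset_name => 'spa_sts',
               task_name   => 'spa_upgrade_task');

    DBMS_SQLPA.EXECUTE_ANALYSIS_TASK(
      task_name      => 'spa_upgrade_task',
      execution_type => 'TEST EXECUTE',      -- or 'EXPLAIN PLAN' for plans only
      execution_name => 'before_change');
  END;
  /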
29
Step 4: Execute SQL After Making Change
• Manually implement the planned change:
• Database upgrade
• Implementation of tuning recommendations
• Schema changes
• Statistics gathering
• Database parameter changes
• OS/hardware changes, etc.
• Re-execute SQL after the change:
• Test-execute SQL in the SQL tuning set to generate SQL execution plans and statistics
• Explain-plan SQL in the SQL tuning set to generate SQL plans
[Diagram: SQL Performance Analyzer re-runs the STS and saves the “after change” results alongside the completed “before change” trial]
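After the change is in place on the test system, the same task is executed again (a sketch; names carried over from the previous step):

  BEGIN
    DBMS_SQLPA.EXECUTE_ANALYSIS_TASK(
      task_name      => 'spa_upgrade_task',
      execution_type => 'TEST EXECUTE',
      execution_name => 'after_change');
  END;
  /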
30
Step 5: Compare & Analyze Performance
• Compare performance using different metrics:
• Elapsed Time
• Parse Time
• Execute Elapsed Time
• Execute CPU Time
• Buffer Gets
• Disk Reads
• Disk Writes
• Optimizer Cost
• SPA Report shows the impact of the change for each SQL:
• Improved SQL
• Regressed SQL
• Unchanged SQL
• SQL with errors
• Tune regressed SQL using SQL Tuning Advisor
• Analysis results can be used to seed the SQL Plan Management repository
[Diagram: SQL Performance Analyzer compares the completed “before change” and “after change” trials and produces the analysis report]
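A comparison sketch; the comparison metric is illustrative (elapsed time is the default) and the report call uses positional arguments for report type, level and section:

  DECLARE
    rep CLOB;
  BEGIN
    DBMS_SQLPA.EXECUTE_ANALYSIS_TASK(
      task_name        => 'spa_upgrade_task',
      execution_type   => 'COMPARE PERFORMANCE',
      execution_params => DBMS_ADVISOR.ARGLIST('comparison_metric', 'buffer_gets'));

    rep := DBMS_SQLPA.REPORT_ANALYSIS_TASK('spa_upgrade_task', 'text', 'typical', 'summary');
  END;
  /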
31
EM Interface: SPA Report
32
SPA Security Model
• Based on DBMS_SQLTUNE/SQL Tuning Advisor
• Requires “ADVISOR” and “ADMINISTER [ANY] SQL TUNING SET” privileges
• Create/Drop/Alter SQL Profile privileges
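A sketch of the corresponding grants (user name is illustrative):

  GRANT ADVISOR TO spa_user;
  GRANT ADMINISTER SQL TUNING SET TO spa_user;
  -- Needed only if SQL Tuning Advisor recommendations are implemented as SQL profiles
  GRANT CREATE ANY SQL PROFILE TO spa_user;
  GRANT ALTER ANY SQL PROFILE TO spa_user;
  GRANT DROP ANY SQL PROFILE TO spa_user;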
33
Q & A