Friday, March 27, 2009

Performance Test Report of Oracle 11i ERP Application Using HP LoadRunner 9.0

Executive Summary:-
This document describes the performance testing of an Oracle 11i Financials application
at a government finance department. The production application is hosted on HP-UX
servers. HP LoadRunner 9.0 was used for load, volume and stress testing of the Oracle
application, which was found to perform well and remain stable at high loads. This
document presents the system configuration, workload, testing methodology, test
results, error analysis and bottleneck analysis.
1. Introduction
The government finance department uses the Oracle 11i application for processing its financial
operations. In production, the Oracle 11i application is hosted on HP-UX servers with an Oracle 10g
database, with a total of 14 CPUs, 32 GB RAM and 300 GB of disk.
Performance of the system is critical in terms of user response times, overall transaction
throughput, concurrent-user capacity, and stability under high load.
The objectives of the load/volume/stress testing are to:
 Determine the stability of the application under sustained periods of 90%+ CPU usage
 Evaluate the response times of the most frequently used transactions under high load
 Measure throughput and CPU utilization
2. Objective of Performance Testing
The main objectives for the client in conducting the load testing exercise are:
 To ensure that the system delivers good performance.
 To ensure that the system can deliver the required level of business activity
with the peak number of concurrent users and the expected activity mix.
 To confirm the sizing and scalability of the machine for the current peak load.
 To determine whether the system is reliable under peak load.
3. Scope of Work
3.1 In Scope
The following are in scope for this assignment:
 Requirements and workload analysis
 Test plan
 Creating the test suite
 Setup of the test environment at the benchmark center
 Monitoring system resource (CPU) utilization during the performance tests
 Analysis of the test results
 Identifying the bottlenecks in the application
Tata Consultancy Services
Performance Test Report
Document Name : Performance test report_Knowmax Page 5 of 17
TCS Confidential
3.2 Out of Scope
The following are out of scope for this assignment:
 Analysis of business processes and business functionality
 Performance tuning of the Oracle ERP application, the Oracle 10g database and
the HP-UX servers
 The HR (Position form) and Payment modules, which could not be tested
because of Arabic characters in the forms
4. System Configuration
The production application configuration is:

Machines              CPU                            RAM    Hard disk  Operating System / Database
One Apps Application  6 CPU, 64-bit @ 999-1000 MHz   12 GB  300 GB     HP-UX 11.23, 64-bit
DB (2 nos.)           8 CPU, 64-bit @ 999-1000 MHz   20 GB  300 GB     Oracle 10g database

The test bed configuration is:

Machines              CPU                            RAM    Hard disk  Operating System / Database
One                   8 CPU @ 1000 MHz               20 GB  300 GB     HP-UX 11.23, 64-bit, with
                                                                       Oracle 10g database

From the two tables above, it is clear that the production environment has considerably more
hardware capacity than the test environment. Hardware configuration is a key element in the
performance of a system.
5. Workload
5.1 Transactions and Concurrent Programs
The financial applications in Oracle 11i were load, volume and stress tested. In the
financial applications, the following transactions were relevant to the benchmark:
Financial Transactions
 Create Account Payable (AP) Invoice
 Create Account Receivable (AR) Invoice
 Create Account Receivable (AR) Receipt
 Budget (Work sheet creation)
 Budget Revisions
 Group scenarios combining AR-Invoice, AR-Receipt, Budget Worksheet and Budget
Revision.
In addition, the following concurrent batch programs were constantly running in the
background
Concurrent Batch Programs
 Invoice Register
 Payables Approval
 Payables Accounting Process
 Payables Transfer to General Ledger
 Invoice History Report
5.2 Transaction Mix
The following four scenarios were identified, each with a different mix of
transactions:
 Worksheet form(Budget Module)
 Budget Revisions(Budget Module)
 Invoice form (AR Module)
 Receipt form (AR Module)
6. Load Testing Methodology
This section details the methodology used for load testing.
6.1 Testing Strategy
The following strategy was used for performance testing:
 Test from current to expected business volumes incrementally
 Component-based testing to identify individual application bottlenecks
 End-to-end tests to ensure that each application meets its individual expected
volumes
 Monitor business transaction throughput and resource (CPU) utilization
6.2 Test Methodology:-
The following flow chart depicts the basic testing methodology
Following methodologies were used for the performance tests.
1. Analyze Requirement:-
First, we gathered the requirements and the workflows of the Oracle ERP
applications with the help of functional and technical consultants.
2. Setup
The setup steps involved:
 Installation and configuration of Oracle Financial
 Loading baseline data required for the tests from production environment
 Configuring test scenarios for testing
 Configuring the load injection machines
 Executing pilot tests for verifying correctness of test bed
 Setting the system to a consistent state before each run
 Configuring Oracle concurrent batch jobs to run in the background during each
test
[Flow chart: Analyze Requirement → System Configuration and Workload Analysis → Benchmark
Model → Setup Environment (H/W and Application Setup) → Develop Scripts for Testing and
Monitoring → Execute Tests → Monitor Resources → Analyze Performance → Validate →
Change Parameters / Repeat Test]
3. Creation of Test Scripts
LoadRunner 9.0 was used to record transactions for all types of users.
This generated scripts, some of which were modified to correctly simulate
the various user processing scenarios.
4. Execution
Tests were executed using HP LoadRunner 9.0. The tool was configured to
generate transactions according to the chosen mix of scripts.
The Vuser count was ramped up from 30 to 200 Vusers, or until the
application's bottleneck was identified. During the tests, LoadRunner replayed
the think time captured during recording, randomized between 50% and 150% of
the recorded value. This run-time setting keeps the scripts running smoothly
while making the scenario as realistic as possible.
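As a rough sketch, the ramp-up and think-time behaviour described above can be modelled as follows. The function names, the step size, and the uniform distribution are illustrative assumptions, not LoadRunner APIs:

```python
import random

def randomized_think_time(recorded_think_time, low=0.5, high=1.5):
    """Return a think time between 50% and 150% of the recorded value,
    mimicking the 'random percentage of recorded think time' run-time
    setting used in these tests."""
    return recorded_think_time * random.uniform(low, high)

def ramp_up_schedule(start=30, end=200, step=10):
    """Vuser levels from 30 up to 200; the actual increments used in the
    report varied (30, 50, 75, 100, 125, 150, 200)."""
    return list(range(start, end + 1, step))

# Example: a recorded 8-second think time replays between 4 s and 12 s.
t = randomized_think_time(8.0)
assert 4.0 <= t <= 12.0
```

The randomization prevents all Vusers from pausing for identical durations, which would otherwise synchronize their requests into artificial load spikes.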
5. Validation
Each test was validated for correctness by checking the success logs in the
LoadRunner reporting tool. It was also verified that the injected data appeared
in the back end after the test. The results of each run were compared against
previous runs for consistency, and repeatability of results was verified on a
regular basis.
6. Test Analysis
At the end of each performance test, the results were analyzed to determine
response times, CPU utilization and throughput. This was done using the analysis
and reporting tools built into LoadRunner 9.0.
7. Tools Usage:-
To detect and monitor the application server performance quickly and
systematically, the following tools were used:
 HP LoadRunner agents for Oracle performance monitoring
 The sar and top utilities on HP-UX to analyze CPU utilization
 Custom SQL scripts to monitor the performance of the Oracle database
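As an illustration of how sar output can be post-processed into the CPU figures reported below, the following sketch averages %usr + %sys from `sar -u` style text. The column layout is assumed from HP-UX sar and may need adjusting for other flavors:

```python
def parse_sar_cpu(sar_text):
    """Parse `sar -u` style output (time, %usr, %sys, %wio, %idle) and
    return the average total CPU utilization (%usr + %sys)."""
    samples = []
    for line in sar_text.splitlines():
        fields = line.split()
        # Expect: HH:MM:SS  %usr  %sys  %wio  %idle
        if len(fields) == 5 and fields[0].count(":") == 2:
            try:
                usr, sys_pct = float(fields[1]), float(fields[2])
            except ValueError:
                continue  # skip the non-numeric header row
            samples.append(usr + sys_pct)
    return sum(samples) / len(samples) if samples else 0.0

sample = """HP-UX host B.11.23 U ia64
12:00:01    %usr    %sys    %wio   %idle
12:00:31      33       5       2      60
12:01:01      60      15       5      20
"""
avg = parse_sar_cpu(sample)  # average of 38 and 75, i.e. 56.5
```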
7. Test Results:
7.1 Response time of individual Scenarios
All times are in seconds. The percentile columns report the 90th percentile, except for the
Account Payable - Invoice rows, which report the 95th percentile. Values missing from the
original table are shown as "-".

Scenario                                  Vusers  Avg scenario   90%/95% resp.  Avg scenario   90%/95% resp.
                                                  time (with     time (with     time (without  time (without
                                                  think time)    think time)    think time)    think time)
Budget (Worksheet)                        50      87.078         97.556         9.185          11.52
                                          100     154.961        178.344        7.853          10.167
                                          125     162.996        220.823        52.189         111.625
Account Receivable - Invoice Generation   50      49.36          59.093         9.364          11.373
                                          100     63.935         83.18          23.912         40.757
                                          150     82.616         126.439        42.612         85.532
Account Receivable - Receipt              30      19.599         26.109         -              -
                                          50      34.218         66.261         -              -
                                          100     122.687        139.903        10.361         12.595
                                          150     137.729        158.627        22.353         34.345
                                          200     169.285        232.252        53.829         115.051
Account Payable - Invoice (95%)           30      273.045        295.817        19.9           25.093
                                          50      272.62         293.054        20.181         24.86
                                          75      274.864        295.836        21.476         27.42
                                          100     288.937        337.743        35.456         81.847
                                          150     310.429        367.463        37.061         92.713
Budget Revision                           25      88.594         97.504         13.471         14.03
                                          50      88.668         98.423         13.897         14.897
                                          100     89.678         99.271         14.616         15.642
                                          125     90.695         100.409        15.608         17.933
The table above shows the average response time of each scenario for a given number of
concurrent users, both with think time (the recorded think time plus a 50% to 150%
variance) and with a think time of 0 seconds.
When the number of concurrent users is ramped up to 125, 150 and 200, there is a sharp
degradation in response times across the different scenarios.
Think time: the time a user pauses between steps. Here, the think time is the value
captured during recording, varied between 50% and 150% of that value. Think time is a
mechanism that makes a load test more accurately reflect real user behavior.
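The 90%/95% columns are percentile response times. A minimal sketch of the nearest-rank percentile calculation is shown below; LoadRunner's exact computation may differ slightly, and the sample values are illustrative:

```python
def percentile(samples, pct):
    """Return the pct-th percentile (e.g. 90 or 95) of a list of
    transaction response times, using the nearest-rank method: the
    smallest value that at least pct% of the samples do not exceed."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    rank = max(1, -(-len(ordered) * pct // 100))  # ceil(n * pct / 100)
    return ordered[rank - 1]

times = [9.1, 9.8, 10.2, 10.9, 11.4, 11.5, 12.0, 13.3, 15.8, 41.0]
p90 = percentile(times, 90)  # 15.8: 9 of the 10 samples are <= 15.8
p95 = percentile(times, 95)  # 41.0
```

Note how one outlier (41.0 s) barely moves the 90th percentile but dominates the 95th, which is why the report's 95% columns degrade faster than the averages.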
7.2 Graphs generated during testing
Following are the graphs generated by the LoadRunner 9.0 tool. Only those for
the Worksheet form of the Budget module are included.
7.2.1 Worksheet form of Budget module
a. Running Vusers –Average Transaction Response time
Graph                              Scale  Measurement             Graph's  Graph's  Graph's  Graph's  Graph's
                                                                  Min.     Ave.     Max.     Median   SD
Running Vusers                     1      Run                     0.0      79.676   118      94       36.006
Average Transaction Response Time  1      Action_Transaction      122.763  159.158  180.061  166.796  17.958
Average Transaction Response Time  1      Login_Tr_1              20.94    42.098   58.591   45.102   9.851
Average Transaction Response Time  1      LogOff_Tr_3             11.969   23.137   30.972   24.547   4.984
Average Transaction Response Time  1      vuser_end_Transaction   0.0      0.0      0.001    0.0      0.0
Average Transaction Response Time  1      vuser_init_Transaction  0.0      0.001    0.004    0.0      0.001
Average Transaction Response Time  1      Worksheet_Tr_2          38.706   42.235   43.416   42.262   0.769
Comment:-
From the Average Transaction Response Time vs. Running Vusers graph, we can see that
at an elapsed time of 10:40 min, with 98 running Vusers, the average transaction
response time increases sharply. We can conclude that performance degrades from this
point.
b. Server Performance:-
Scale  Measurement                                                    Min.   Ave.    Max.    SD
100    Average load (Unix Kernel Statistics): 192.168.1.23            0.035  0.408   0.535   0.112
1      CPU Utilization (Unix Kernel Statistics): 192.168.1.23         0.664  38.122  80.737  11.426
100    Paging rate (Unix Kernel Statistics): 192.168.1.23             0.0    0.041   1.669   0.134
10     System mode CPU Utilization (Unix Kernel Stats): 192.168.1.23  0.333  5.12    15.913  1.808
1      User mode CPU Utilization (Unix Kernel Stats): 192.168.1.23    0.166  33.003  64.824  9.93

Comment: - The CPU usage is in sync with the load, i.e. it increases with load.
c. Throughput Graph:-
Scale  Measurement  Graph's Min.  Graph's Ave.  Graph's Max.  Graph's Median  Graph's SD
1      Throughput   236.684       99414.293     148347.031    101159.305      31225.136

Comment: - In the throughput graph, bytes/sec increases with the load (number of
Vusers), so we can say that the bandwidth performance is good.
7.2.2 Group Scenarios
a. Average Transaction Response time graph
Scale  Measurement                Graph's Min.  Graph's Ave.  Graph's Max.  Graph's Median  Graph's SD
1      Action_Transaction         11.78         15.29         25.084        14.024          3.061
1      AR_INV_Invoice_Gen_Tr_2    2.526         3.338         5.435         3.082           0.682
1      AR_INV_LogIn_TR_1          0.406         0.555         0.823         0.521           0.101
1      AR_INV_LogOff_TR_3         0.423         0.593         1.058         0.558           0.133
1      AR_Receipt_LoggOff_TR_3    0.466         0.6           0.856         0.585           0.099
1      AR_Receipt_Logon_TR_1      0.657         0.869         1.272         0.819           0.141
1      AR_REceipt_Tr_2            6.163         7.874         12.028        7.492           1.237
1      B_Revision_LogOff_Tr_3     0.448         0.614         0.873         0.58            0.106
1      B_Revisions_Logon_Tr_1     0.594         0.893         1.439         0.847           0.2
1      Budget_Revision_Tr_2       15.619        22.436        50.04         21.169          6.587
1      BW_Login_Tr_1              0.677         0.941         1.557         0.87            0.205
1      BW_LogOff_Tr_3             0.437         0.576         1.13          0.517           0.143
1      BW_Worksheet_Tr_2          3.348         16.623        72.411        7.296           20.555
1      vuser_end_Transaction      0.0           25.324        303.404       0.0             83.8
1      vuser_init_Transaction     0.0           0.0           0.0           0.0             0.0
b. Server Performance:-
Scale  Measurement                                             Min.   Ave.    Max.     SD
10     Average load (Unix Kernel Statistics): 192.168.1.23     0.012  1.315   3.922    1.033
1      CPU Utilization (Unix Kernel Statistics): 192.168.1.23  0.832  69.977  100      31.346
1      Paging rate (Unix Kernel Statistics): 192.168.1.23      0.0    1.072   283.321  15.458
c. Throughput:-
Scale  Measurement  Graph's Min.  Graph's Ave.  Graph's Max.  Graph's Median  Graph's SD
1      Throughput   12357.583     152859.67     245890.555    161081.969      47846.328
Following are some points to note:
 The standard deviations of Budget_Revision_Tr_2 and BW_Worksheet_Tr_2 are very high.
From this we can say that performance is degrading for these forms.
 In the throughput graph, bytes/sec increases with the load (number of Vusers), so we
can conclude that the bandwidth performance is good.
 The maximum CPU utilization touched 100% at an elapsed time of 14 min 10 sec, and
the server performance graph shows that CPU utilization stayed at 100% for 5 minutes.
 Because of this, we got an 'Internal Server Error (500)'. The individual average
response time graphs show that at 14 min 10 sec the response time peaked for most
transactions. We can conclude that for most transactions, average response time and
CPU performance degraded at 14 min 10 sec.
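The high-SD observation above can be automated. This sketch flags transactions whose standard deviation is large relative to their mean response time; the 0.5 threshold is an illustrative assumption, not a value from the report:

```python
def mean_and_sd(samples):
    """Population mean and standard deviation of response-time samples."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean, var ** 0.5

def flag_unstable(stats, ratio=0.5):
    """Flag transactions whose SD exceeds `ratio` times their mean
    response time, i.e. whose behaviour is highly variable under load."""
    return [name for name, (mean, sd) in stats.items()
            if mean > 0 and sd / mean > ratio]

# Mean/SD pairs taken from the group-scenario table above.
stats = {
    "Budget_Revision_Tr_2": (22.436, 6.587),
    "BW_Worksheet_Tr_2": (16.623, 20.555),
    "AR_REceipt_Tr_2": (7.874, 1.237),
}
unstable = flag_unstable(stats)  # ["BW_Worksheet_Tr_2"]
```

By this measure BW_Worksheet_Tr_2 (SD larger than its mean) stands out clearly, while Budget_Revision_Tr_2 sits just below the example threshold.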
8. Error analysis
Following are some of the common errors faced during the testing:

 Error: The last popup/alert message received: "stop - APP-FND-01516: Invalid
application username, password, or database. Username: APPLSYSPUB, Database: PDEVCLON"
Reason: Application problem

 Error: Action.c(207): Error -26377: No match found for the requested parameter
"OATrxnID19". Check whether the requested boundaries exist in the response data.
Also, if the data you want to save exceeds 1024 bytes, use
web_set_max_html_param_len to increase the parameter size.
Reason: Data problem

 Error: Action.c(207): Error -26612: HTTP Status-Code=500 (Internal Server Error)
for "http://ntistoad.mof.gov.sa:8061/OA_HTML/fndvald.jsp"
Reason: Application and server performance

 Error: Action.c(216): Error -27791: Server "ntistoad.mof.gov.sa" has shut down the
connection prematurely
Reason: Oracle application problem

 Error: ERR: 47197: Monitor name: UNIX Resources. Internal rpc error (error code: 2).
Machine: 192.168.1.23. Hint: Check that RPC on this machine is up and running. Check
that the rstat daemon on this machine is up and running (use the rpcinfo utility for
this verification). Details: RPC: RPC call failed. RPC-TCP: recv()/recvfrom() failed.
RPC-TCP: Timeout reached. (entry point: Factory::CollectData). [MsgId: MMSG-47197]
Reason: Network bandwidth problem
9. Bottleneck Analysis
The following table indicates the bottleneck point (the number of Vusers at which
performance degrades sharply) for each individual scenario:

Scenario                                   Number of Vusers
Worksheet generation in Budget module      98
Budget Revisions                           80 to 90
Account Receivable - Invoice Generation    118
Account Receivable - Receipt               176
Account Payable - Invoice Generation       109
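The bottleneck points above were read off the response-time graphs. A simple slope-based heuristic (an assumption for illustration, not the method actually used in the report) can approximate the same idea from (Vusers, response time) pairs:

```python
def find_bottleneck(points, factor=2.0):
    """Given (vusers, avg_response_time) pairs sorted by Vuser count,
    return the Vuser level at which the marginal response-time cost per
    added user jumps by more than `factor` versus the previous step."""
    prev_slope = None
    for (u0, t0), (u1, t1) in zip(points, points[1:]):
        slope = (t1 - t0) / (u1 - u0)  # seconds of delay per added Vuser
        if prev_slope is not None and prev_slope > 0 and slope > factor * prev_slope:
            return u1
        prev_slope = slope
    return None  # no knee detected in the measured range

# Without-think-time averages for AR - Receipt from section 7.1.
ar_receipt = [(100, 10.361), (150, 22.353), (200, 53.829)]
knee = find_bottleneck(ar_receipt)  # 200, the level where the slope doubles
```

With only three measured levels the heuristic can only point at 200 Vusers; the report's finer-grained graphs place the actual knee earlier, at 176.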
The following table indicates the bottleneck of the group scenarios running concurrently:
10. Conclusion
This document has described the performance testing of an Oracle Financials
implementation at a government finance department. The performance tests were
conducted on an HP-UX server running Oracle 11i, using HP LoadRunner 9.0 as the load
injection tool. The load/stress/volume tests were conducted using proven tools and
methodology. Under the test conditions, the application demonstrated stability and
acceptable performance even at CPU utilizations exceeding 90%. The performance tests
were executed in a test environment, and analysis of the test results helped identify
the application's bottlenecks.