
MKV Performance Testing Process Guideline


File Name: MKV Performance Testing Process Guideline
Company: StarCite
Author: Sincky
Date: 3/17/2009
Status: Draft

Revision History:
Date       Modifier  Reason             Chapter  Description
3/17/2009  Sincky                                First Version
3/27/2009  Sincky    Team reviewed all           Second Version

Contents:
Section I PERFORMANCE TESTING
1. Performance Testing Planning
1.1 Performance Requirement Analysis
1.2 Performance Testing Strategy
2. Performance Scenario Definition
2.1 Performance Test Case
2.1.1 Performance test case definition
2.1.2 Performance test case review
2.2 Scenario Design
2.2.1 VU Scripts
2.2.2 Scenario Design
2.2.3 Scenario Review
2.3 Scenario Execution
2.3.1 Performance Testing Schedule
2.3.2 LR Machine Summary
2.3.3 Performance Testing Preparation
2.3.4 Performance Testing Execution
3. VU Scripts Development
3.1 Preparation
3.2 Develop HTTP VU Script
3.2.1 Record HTTP VU Script
3.2.2 Edit HTTP VU Script
3.2.3 Debug/Replay HTTP VU Script
3.3 Develop Web Service VU Script
3.3.1 Record Web Service VU Script
3.3.2 Edit Web Service VU Script
3.3.3 Debug/Replay Web Service VU Script
3.4 Common Actions usage
3.5 Review/Save VU Scripts
Section II PERFORMANCE ANALYSIS
4. LoadRunner Analysis
5. Database Analysis
5.1 SQL Profiler Analysis
5.2 Business Data Analysis
6. Java Profiler Analysis
7. Server Log Analysis
8. LR Customized Log Analysis
9. Performance Testing Report
9.1 Performance Testing Report Introduction
9.2 Performance Testing Report Writing
9.3 Performance Testing Report Saving
Section III PERFORMANCE TUNING

Section I PERFORMANCE TESTING

1. Performance Testing Planning

Performance testing is a subset of software system testing; it exercises the application from one perspective to determine how fast some aspect of the AUT (Application Under Test) performs under a particular workload. With it, QA can validate and verify quality attributes of the AUT, such as scalability, reliability, and stability.

Performance testing can serve different purposes, so it is necessary to define a specific performance goal for each AUT. For a classic Web/HTTP application, the performance goal generally includes the following metrics (a rough sizing sketch follows the list):
· Response Time: the amount of time it takes for a server to respond to a request
· Throughput: the number of requests the application can serve per unit of time; it is frequently measured as requests or logical transactions per second
· Resource Utilization: a measure of how much server and network resource the application consumes; it includes CPU, memory, disk I/O, network I/O, etc.
· Load (Workload): the total number of users and concurrent active users, data volumes, transaction volumes, etc.
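These metrics constrain one another. As a rough cross-check (not part of the original guideline), Little's Law says that concurrent users ≈ throughput × (response time + think time). A minimal C sketch with hypothetical numbers:

    #include <stdio.h>

    /* Rough sizing check based on Little's Law:
       concurrent users = throughput x (response time + think time).
       All figures below are hypothetical examples, not MKV targets. */
    int main(void)
    {
        double throughput    = 20.0; /* transactions per second */
        double response_time = 3.0;  /* seconds, e.g. the 3-second page goal */
        double think_time    = 2.0;  /* seconds a simulated user pauses */

        printf("Expected concurrent users: %.0f\n",
               throughput * (response_time + think_time));
        return 0;
    }

If a goal sheet specifies any two of load, throughput, and response time, this identity gives a quick sanity check on the third.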
1.1 Performance Requirement Analysis

Owner: Product Manager / Technical Architect

It is critical to detail the performance specifications (requirements) in any performance test plan. The Product Manager or Technical Architect defines the performance requirements. We follow these steps:

1. Analyze the production environment of the AUT by asking the following questions:
· In detail, what is the performance test scope? Which subsystems, interfaces, components, etc. are in and out of scope for this testing?
· For the user interfaces (UIs) involved, how many concurrent users are expected for each (specify peak vs. normal)?
· What does the target system (hardware) look like (specify all server and network appliance configurations)?
· What is the application workload mix of each application component? (for example: 20% login, 40% search, 30% item select, 10% checkout; see the sketch after this section)
· What is the system workload mix? Multiple workloads may be simulated in a single performance test (for example: 30% Workload A, 20% Workload B, 50% Workload C)
· What are the time requirements for any/all backend batch processes (specify peak vs. nominal)?
· Where are the synchronous and asynchronous calls?

With these questions answered, we need to:
· Analyze user behaviors and action distributions, to learn the key workflows of real users and identify the workflows that are important and the ones that pose the most risk to our performance objectives
· Investigate the hardware structure and application topology
· Research the development technology of the application

2. Refer to industry standards:
· Based on the production behaviors, figure out the gaps and differences from the industry standard
· Based on the current hardware and topology, construct a testing environment that simulates the production structure
· Based on the current development technology, research and summarize a checklist of the common weaknesses and bottlenecks of applications developed with that technology

3. Define measurable goals for the performance testing of the AUT; different applications and different workflows get different goals. For example:
-- Technical Goals
· Response Time: 3 seconds for each web page display, excluding the Report feature
· Throughput: an average of 500K/second for each application server
· Resource Utilization: CPU utilization is lower than 90%
-- Business Goals
· Workload: 100 users send 10000 RFPs on the CDI site during 1 hour
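The workload-mix percentages gathered above translate directly into how each virtual user picks its next action. A minimal C sketch of weighted action selection, using the hypothetical 20/40/30/10 mix from the example:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Weighted selection of the next user action, following the hypothetical
       mix from 1.1: 20% login, 40% search, 30% item select, 10% checkout. */
    static const char *next_action(void)
    {
        int r = rand() % 100;          /* uniform value in 0..99 */
        if (r < 20) return "login";    /* 0..19  -> 20% */
        if (r < 60) return "search";   /* 20..59 -> 40% */
        if (r < 90) return "select";   /* 60..89 -> 30% */
        return "checkout";             /* 90..99 -> 10% */
    }

    int main(void)
    {
        int i;
        srand((unsigned)time(NULL));
        for (i = 0; i < 10; i++)
            printf("%s\n", next_action());
        return 0;
    }

In a real LR scenario the mix is normally configured through Run Logic percentages in the Run-Time settings rather than in code; the sketch only makes the arithmetic concrete.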
1.2 Performance Testing Strategy

Owner: Technical Architect / QA Leader / Dev

Different performance goals call for different testing strategies. Generally we have the following performance testing types:
· Load Testing: verifies application behavior under normal and peak load conditions. The load can be the expected number of concurrent users performing a specific number of transactions within a set duration. This testing yields the response times of all the important business-critical transactions and can then point toward the bottlenecks in the AUT
· Stress Testing: evaluates the AUT's behavior when it is pushed beyond its normal load. It determines the application's robustness under extreme load and helps us determine whether the application will perform sufficiently if the current load goes well above the expected load
· Endurance Testing: determines whether the AUT can sustain the continuous expected load when running for a long time; it is used to identify the stability of the AUT
· Capacity Testing: also called limit testing; its purpose is to find the threshold value at which resource utilization still remains stable, beyond which additional load will break the AUT, such as finding the maximum number of concurrent users or the maximum throughput value

In our teams, we divide performance testing into 3 categories:
· User Story Testing: according to the specific definition of a User Story, we test a specific performance goal; for example, after a program is refactored for a feature such as Response RFP, we create a performance testing User Story to test whether the response time of Response RFP has improved compared to before, under the same load. In this category, we can use Load Testing / Stress Testing
· Production Release Testing: its purpose is to get a benchmark for the current release to compare with the previous or the next version; with this, we can track and measure the performance targets and trends of the AUT. In this category, we can use Load Testing / Endurance Testing
· Weekly Tuning Testing: in daily performance work, we test the threshold value of every key workflow, such as the maximum successful count of the Response action in a clean house with no other business operating simultaneously in the current release; we also test mixed workflows over a long time. The purpose is to find more bottlenecks and weaknesses in the AUT, and to make sure the whole system gets better after each tuning. In this category, we can use Load Testing / Stress Testing / Endurance Testing / Capacity Testing

2. Performance Scenario Definition

Summary: A scenario is the entry point of performance testing; it is the design of the user behavior simulation in LoadRunner.

2.1 Performance Test Case

A performance test case is the specific design for a performance requirement. According to the requirement, select the appropriate strategy to satisfy the required performance goals.

2.1.1 Performance test case definition

Owner: Performance QA / QA Leader

Whatever the category of performance testing, it is necessary to create an Excel document describing the scenario. The keys in this document are (a pacing calculation based on item 4's volume goals is sketched at the end of 2.1):
1. Explain the purpose of this scenario
2. Define the performance goal
3. Define the performance testing strategy
4. Describe the user behavior categories, such as Send RFP / Do Response etc., plus the restrictions on the RFPs, such as the RFP count / source site / destination site, clean house distribution rate, failure criteria, etc.
5. Define the SLA details so as to call out the risks
6. Define the specific key metrics to be monitored or collected
7. Define the status of the analysis tools, such as whether to open JConsole / GCView / SQL Profiler

2.1.2 Performance test case review

Owner: Performance QA

1. After the test case is defined, it should be reviewed. The reviewers include: Technical Leader / QA Leader, Performance QA, Performance Dev. Through this review, everyone makes sure the purpose of the case is clear and the performance target is measurable
2. After the review, the test case file is saved as a separate file in the US SVN folder "https://scm.onvantage.com/svn/OnVantage/qatest/scripts/performance/loadrunner/MKV/testcase"; an example is "https://scm.onvantage.com/svn/OnVantage/qatest/scripts/performance/loadrunner/MKV/testcase/MKVProductionScenario_M14.xls"
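Item 4 of the test case definition fixes user counts and transaction volumes; the pacing each virtual user needs follows from simple arithmetic. A minimal C sketch using the hypothetical business goal from 1.1 (100 users sending 10000 RFPs during 1 hour):

    #include <stdio.h>

    /* Derive per-user pacing from a volume goal. Figures are the hypothetical
       business goal from section 1.1. */
    int main(void)
    {
        int users        = 100;
        int transactions = 10000;   /* total RFPs to send */
        int duration_s   = 3600;    /* one hour */

        double per_user = (double)transactions / users;   /* 100 RFPs per user */
        double pacing_s = duration_s / per_user;          /* 36 s between iterations */

        printf("Each user sends %.0f RFPs, one every %.0f seconds\n",
               per_user, pacing_s);
        return 0;
    }

The resulting pacing (one iteration every 36 seconds per user, under these assumed numbers) is what would go into the scenario's schedule and Run-Time pacing settings.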
2.2 Scenario Design

2.2.1 VU Scripts

Owner: Performance QA

According to the performance test case, the LR scenario can be designed. The VU scripts of a scenario come from:
1. Existing VU scripts
2. New VU scripts – refer to 3. VU Scripts Development

2.2.2 Scenario Design

Owner: Performance QA

When designing the performance scenario in LoadRunner, we should design the scenario contents:
· Group name
· VU script path – using a relative path for each VU script is a must
· VU load count and load schedule
· Load Generator for each group

Notes (a think-time sketch follows these notes):
· For each VU script, we need to set the Run-Time settings, especially Run Logic / Log / Think Time / Network Speed etc.; on the other side, make sure the parameter list in the script is sufficient and correct for this group
· For the Load Generator, we need to set up a remote machine as the load generator prior to designing this scenario, configure a temporary folder to save the VU scripts, and then connect it. Also, it is mandatory to add the common actions into the scenario, with 1 separate VU each, if they are used in this scenario
· It is not necessary to define Monitors in the scenario at design time; we suggest defining them before executing the scenario
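The Think Time entry in the Run-Time settings controls how recorded pauses are replayed. In the script itself a pause is just an lr_think_time() call; a minimal VU script sketch (the URLs are hypothetical):

    Action()
    {
        web_url("home",
                "URL=http://mkv.example.com/home",   /* hypothetical URL */
                LAST);

        /* Recorded user pause. The Controller's Run-Time settings decide
           whether this replays as recorded, as a random percentage of the
           recorded value, or is ignored altogether. */
        lr_think_time(10);

        web_url("search",
                "URL=http://mkv.example.com/search", /* hypothetical URL */
                LAST);

        return 0;
    }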
2.2.3 Scenario Review

1. After designing the scenario, it should be documented. We have an Excel file in US SVN that records all of the LR scenarios; every time a new scenario is designed, we should create a new sheet in it describing all the information. The file is "https://scm.onvantage.com/svn/OnVantage/Qatest/scripts/performance/loadrunner/MKV/scenarios/MKV LR scenarios 2009/MKV LR scenarios introductions.xls"
2. After documenting the scenario, we should review it inside the team. The reviewers include: Technical Leader / QA Leader, Performance QA, Performance Dev
Reviewed contents:
· LR scenario file
· Introduction file for this scenario
Checklist:
· Make sure the design of this scenario matches the test case exactly
· Make sure logic requirements such as RFP Distribution can be implemented
3. After the scenario is reviewed, it should be run at least once, until it runs without errors caused by the scripts
4. Finally, compress the scenario into a zip file including all of its folders and files. Generally a scenario file includes 2 folders:
· MKV WorkFlow – saves all of the VU scripts used by this scenario
· MKV Scenarios – the LRS file that saves this scenario
5. Upload the zip file to the US SVN folder "https://scm.onvantage.com/svn/OnVantage/qatest/scripts/performance/loadrunner/MKV/scenarios/MKV LR scenarios 2009"
6. Update the file "https://scm.onvantage.com/svn/OnVantage/qatest/scripts/performance/loadrunner/MKV/scenarios/MKV LR scenarios 2009/MKV LR scenarios introductions.xls" in SVN and commit it

2.3 Scenario Execution

2.3.1 Performance Testing Schedule

Owner: Performance QA / QA Leader

According to the category of performance testing, we define the execution schedules:
· User Story Scenario: following the schedule of the User Story in Rally, Performance QA downloads the zip file of the relevant LR scenario from US SVN to the LR Controller machine, reviews the LR configuration and testing environment settings, then runs it
· Production Release Scenario: following the schedule of the version release, Performance QA downloads the zip file of the LR scenario from US SVN to the LR Controller machine, reviews the LR configuration and testing environment settings, then runs it once before each release due date
· Weekly Testing/Tuning Scenario: during each release milestone, Performance QA designs some key business scenarios from the production release test case, downloads them from US SVN to the LR Controller machine, reviews the LR configuration and testing environment settings, then runs them each week

2.3.2 LR Machine Summary

· SH Controller machine: 10.201.10.216
· US Controller machine: 10.201.30.57
You can get details about them, including the Load Generators, from the SH Wiki: http://10.201.10.43/JSPWiki/attach/Loadrunner/Loadrunner+Controller+and+load+generators+Information.xls?version=5

2.3.3 Performance Testing Preparation

Owner: Performance QA

After downloading the LR scenario to the LR Controller machine, unzip it to a dedicated folder. Before running it, we should double-check the following:
· On the Controller machine, we should check:
· Run-Time settings for every VU group
· Parameter lists for every VU group
· Load Generator for every VU group (try to use machines local to the Controller machine, and make sure their displayed statuses are Ready)
· Schedule settings for every VU group
· Result folder for this scenario
· Hosts file on the Controller machine
· Cleanup of the customized log folder "C:\lr_summary_log" on the Load Generator machines (a logging helper is sketched at the end of this section)
· The session timeout on the Controller/Generator machines
· Common Action VU groups added to the Generator machine if needed (1 VU per common-action script, running 1 iteration)
· Make sure all of the measurement metrics are configured correctly
Note: if the LR scenario file needs to be updated, make sure to update the corresponding file in US SVN as well.
· On the Controller machine, we should define Monitors to satisfy the performance testing goal before running this scenario.
The following built-in graphs are mandatory:
· Running VUsers
· Transaction Response Time
· Hits per Second
· Throughput
· HTTP Responses per Second
· Pages Downloaded per Second – this one is optional
The following monitors should also be added:
· SQL Server (for all of the DB servers)
· Apache Web Server
· Windows Resources (for all of the Web/APP/JMS servers) – this one is optional
· On the server machines, we should check:
· The previous server log files have been backed up on all of the JBoss machines
· The current build version of the performance environment
· JMSQ cleanup and DBCC have been done for all of the message queues
· All JBoss services on the APP servers have been restarted
· Each JBoss server is running in the intended Normal/Debug mode, as a console or a service
· The link of MV response to the MOCK server has been checked with the SCM team, to make sure the onvantage files are configured there
· If a Weekly or User Story scenario is to be executed, it is necessary to configure:
· The JConsole tool running against the JBoss servers, to analyze Java memory utilization or leaks later
· The JProfiler tool running against the JBoss servers, to analyze execution measurements of the whole Java application
· The SQL Profiler tool running against the DB server with the exact template, to analyze DB-level performance bottlenecks later; please refer to 5. Database Analysis
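The checklist above clears "C:\lr_summary_log" on the Load Generators because VU scripts append their customized summary entries there (these files are analyzed later under 8. LR Customized Log Analysis). A minimal sketch of such a logging helper, assuming plain-text comma-separated entries; the helper name and file name are hypothetical:

    /* Hypothetical helper: append one line to the customized summary log on
       the Load Generator (the folder cleaned up in the checklist above).
       VuGen convention: hold the file handle in a long. */
    void write_summary_log(char *message)
    {
        long log_file;

        log_file = (long)fopen("C:\\lr_summary_log\\summary.log", "a");
        if (log_file == 0) {
            lr_error_message("Cannot open the customized summary log");
            return;
        }
        fprintf(log_file, "%s,%s\n",
                lr_eval_string("{loginUserName}"),   /* parameter assumed defined */
                message);
        fclose(log_file);
    }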
2.3.4 Performance Testing Execution

Owner: Performance QA

As soon as all of the preparations are done, QA can run the LR scenario on the Controller machine. If needed, you can either run it for a while or run a smoke test with QTP against the current performance environment, to guarantee its usability and stability.

3. VU Scripts Development

Summary: The purpose of a VU script is to demonstrate the behavior of a single user against the AUT (Application Under Test). With LoadRunner VUGen, you can create a VU script to simulate the behaviors of a real user. Based on the protocols the AUT uses for communication between server and clients, you have to select a single protocol or multiple protocols in VUGen to record or develop the scripts.

3.1 Preparation

Owner: Performance QA

Before developing a VU script, according to the test cases, it is necessary to (a parameterization sketch follows this list):
· Know the recording protocol for VUGen (such as HTTP/Web or Web Service etc.)
· Guarantee that functional testing has passed for the AUT
· Be familiar with the business logic of the VU script
· Identify the simulated execution time for the business logic
· Prepare all of the test data used to parameterize the VU script
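The prepared test data typically ends up as a data file behind a VUGen parameter, which the script then references as {paramName}. A minimal sketch, assuming hypothetical parameter names, form fields, and URL:

    Action()
    {
        /* {loginUserName} and {loginPassword} are assumed to be defined in
           the script's Parameter List, backed by the prepared data file. */
        lr_output_message("Logging in as %s", lr_eval_string("{loginUserName}"));

        web_submit_data("login",
            "Action=http://mkv.example.com/login.do",  /* hypothetical URL */
            "Method=POST",
            ITEMDATA,
            "Name=username", "Value={loginUserName}", ENDITEM,
            "Name=password", "Value={loginPassword}", ENDITEM,
            LAST);

        return 0;
    }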
3.2 Develop HTTP VU Script

Owner: Performance QA

3.2.1 Record HTTP VU Script

· For a Web application, we might select the Ajax (Click & Script) protocol to simplify the recording work with VUGen; it is more specific and clear, but it does not support JS & ActiveX execution. Note: we can add Web (HTTP/HTML) code into Ajax (Click & Script) code, but it is not possible to add Ajax code into Web code
· Sometimes both protocols are used to record VU scripts
· Once multiple protocols are used in one script, you might encounter context switch issues; these can be avoided by using a function that forces a web page refresh, such as web_browser(… Navigate= …)
· Try to split every web page request into one Action in VUGen, and use multiple Actions to represent the whole business behavior of the AUT
· Keep every Action's name meaningful, for example: click_meeting_name_link_on_XXpage
· If recording VU scripts on a local machine, make sure the locale setting of your machine is the same as the Controller's (US locale is preferred)
· Use the local network rather than a high-latency or wireless network to record
· Record the same business behavior twice or more with different accounts, compare the source code of the recordings with the "compare with scripts" option, and find out the correlation parameters that are really needed (see the correlation sketch at the end of this section)

3.2.2 Edit HTTP VU Script

After recording a VU script, QA should edit it manually according to the requirements defined in the test case.

3.2.2.1 Add Transaction

· Every Action must contain at least one Transaction in the VU script, to measure the response time of one Request–Response pair for every page refresh
· The Transaction name must be the Struts method name of the action that triggers the page refresh, such as "cdisales_rfp_EditRfpSave_do"
· Every end condition must be asserted with a content check function rather than a bare lr_end_transaction("cdisales_rfp_EditRfpSave_do", LR_AUTO);
E.g.:

    web_reg_find("Text=AD_HOC_Report_7", "SaveCount=HTML_view_Count7", LAST);
    lr_start_transaction("cdisales_rfp_EditRfpSave_do");
    web_button(…, LAST);
    if (atoi(lr_eval_string("{HTML_view_Count7}")) > 0) {
        lr_end_transaction("cdisales_rfp_EditRfpSave_do", LR_PASS);
    } else {
        lr_end_transaction("cdisales_rfp_EditRfpSave_do", LR_FAIL);
        lr_error_message("SendRFP button does not work by: %s",
                         lr_eval_string("{loginUserName}--{UserID}"));
        lr_exit(LR_EXIT_ITERATION_AND_CONTINUE, LR_FAIL);
    }
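The comparison of two recordings in 3.2.1 identifies which recorded values are really dynamic; each of those is then captured with web_reg_save_param before the request that returns it, and replayed as a parameter afterwards. A minimal correlation sketch, with a hypothetical token, boundaries, and URLs:

    /* Capture a dynamic value from the next server response; the boundaries
       are hypothetical and must be taken from the actual page source. */
    web_reg_save_param("sessionToken",
        "LB=name=\"token\" value=\"",
        "RB=\"",
        "NotFound=ERROR",
        LAST);

    web_url("open_rfp_page",
        "URL=http://mkv.example.com/rfp/edit.do",  /* hypothetical URL */
        LAST);

    /* Replay the captured value in the follow-up request instead of the
       recorded literal. */
    web_submit_data("save_rfp",
        "Action=http://mkv.example.com/rfp/save.do",  /* hypothetical URL */
        "Method=POST",
        ITEMDATA,
        "Name=token", "Value={sessionToken}", ENDITEM,
        LAST);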