  1. #1
    Join Date
    Aug 2000
    Vancouver, BC, Canada

    Re: Identifying Test Cases and Planning Scenarios for Performance Testing

    Here is a portion of our internal manual that I wrote on workload characterization; it might give you some additional ideas:

    1 Select Workload

    Workload characterization involves studying the user and machine environment, observing key characteristics, and developing a workload model that can be used repeatedly.

    Once a workload model is available, the effect of changes in the workload and system can be easily evaluated by changing the parameters of the model. In addition, workload characterization can help you to determine what's normal, prepare a baseline for historical comparison, comply with management reporting, and identify candidates for optimization.

    Typically, you'll find that the alternatives you're asked to evaluate are stated in business- or user-oriented units instead of machine-oriented units. For example, you'll be asked to model the performance impact of adding 150 new users in the East Coast office compared to 150 new users in the main office.

    The advantage of user-oriented units is that the client can easily relate to them and can frequently give a good forecast of future workload. The disadvantage is that they do not automatically relate to the system resources used by a workload. As the performance tester/analyst, you must quantify the relationships yourself, turning an increase in the number of users into an increase in the total resources used by each workload class. A usable relationship is not guaranteed to exist.
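
    For example, here is a rough sketch in C of turning a user-count forecast into machine-oriented resource estimates. Every per-user figure is invented for illustration; real values must come from your own measurements:

        #include <stdio.h>

        /* Hypothetical per-user resource costs for one workload class,
           as measured during workload characterization. */
        typedef struct {
            const char *class_name;
            double cpu_sec_per_user_hr; /* CPU seconds consumed per user per hour */
            double io_ops_per_user_hr;  /* physical I/O operations per user per hour */
        } ClassCost;

        int main(void)
        {
            /* Assumed figures, for illustration only. */
            ClassCost data_entry = { "Data entry", 2.5, 400.0 };
            int current_users = 300;
            int added_users   = 150;  /* e.g. the new East Coast office */

            double cpu_now   = current_users * data_entry.cpu_sec_per_user_hr;
            double cpu_after = (current_users + added_users) * data_entry.cpu_sec_per_user_hr;
            double io_after  = (current_users + added_users) * data_entry.io_ops_per_user_hr;

            printf("%s: CPU sec/hr %.0f -> %.0f, I/O ops/hr %.0f\n",
                   data_entry.class_name, cpu_now, cpu_after, io_after);
            return 0;
        }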

    1.1.1 Input
    Using the user-oriented information from your study of the environment and the system measures gathered in the Define Services/Components activity, you need to divide the total system workload into workload classes.

    1.1.2 Process
    A workload class is a combination of transactions that have enough in common to be considered one homogeneous workload. The goal is to have complete coverage of system workload with no overlap of classes.

    Depending upon the evaluation technique being used, workloads may be characterized in different forms.
    Some attributes used for classifying a workload include:
    * Resource usage
      * Database activity
      * File server activity
      * Server processes
    * User scenarios
    * Business functions
    * Application type
      * Data entry
      * Reporting
      * Search function
      * Data retrieval
      * Batch program
    * Geographical orientation
      * In-house
      * Remote location through DSL/ISDN
      * Remote location through T1 link
      * Internet
      * Dialup
    * Visibility
      * Front-office activities (web pages, functionality used while talking on the phone to a client)
      * Back-office activities (research, printing, etc.)
    * Organizational unit
      * Workload divided per group, department, or division
      * Workload divided per role (executive, director, manager, group lead, worker)
    * Etc.

    At times, it may be appropriate to use multiple criteria to define a single class. Regardless of how you choose to characterize the workload classes, the resulting workload model should capture quantitative information about the real system workload.

    Workload parameters are characteristics of users' requests such as traffic intensity, system load, and resource consumption. Pay special attention to those workload parameters that will be affected by your alternative performance objectives as well as those that have a significant impact on performance - CPU utilization, total I/O operations, memory usage, page faults, and elapsed time.
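
    As an illustration, the parameters captured for one class might be recorded in a structure like this (field names and figures are my own, purely illustrative):

        /* One row of a workload model: quantitative parameters captured
           for a single workload class. All names are hypothetical. */
        typedef struct {
            const char *class_name;
            double arrivals_per_sec;    /* traffic intensity */
            double cpu_util_pct;        /* CPU utilization attributed to the class */
            double io_ops_per_sec;      /* total I/O operations */
            double memory_mb;           /* memory usage */
            double page_faults_per_sec; /* paging activity */
            double elapsed_sec;         /* mean elapsed (response) time */
        } WorkloadParams;

        /* Example entry; every figure is invented. */
        WorkloadParams reporting = { "Reporting", 0.8, 12.0, 150.0, 512.0, 3.0, 4.2 };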

    1.1.3 Output
    The output of the workload definition process is a workload model that, once implemented in the performance test tool, will realistically simulate production activity on your system under test.
    The creation of this model is the most important step in running an effective and realistic simulation.
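
    For instance, the core of such a model often reduces to a weighted transaction mix that the test tool replays. A minimal sketch in C, with transaction names, weights, and think times invented for illustration:

        #include <stdio.h>
        #include <stdlib.h>

        /* A workload model reduced to its core: each transaction type
           and its share of the total mix. All figures are invented. */
        typedef struct {
            const char *transaction;
            double mix_pct;        /* share of all transactions; must sum to 100 */
            double think_time_sec; /* pause a simulated user takes afterwards */
        } MixEntry;

        static const MixEntry model[] = {
            { "Login",       10.0,  5.0 },
            { "Search",      45.0,  8.0 },
            { "Form submit", 25.0, 12.0 },
            { "Report",      15.0, 20.0 },
            { "Logout",       5.0,  0.0 },
        };

        /* Pick the next transaction according to the mix percentages. */
        static const char *next_transaction(void)
        {
            double r = 100.0 * rand() / (double)RAND_MAX, acc = 0.0;
            for (size_t i = 0; i < sizeof model / sizeof model[0]; i++) {
                acc += model[i].mix_pct;
                if (r <= acc)
                    return model[i].transaction;
            }
            return model[0].transaction; /* rounding fallback */
        }

        int main(void)
        {
            for (int i = 0; i < 10; i++)
                printf("simulated step %d: %s\n", i, next_transaction());
            return 0;
        }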

    Hope this helps,

    Roland Stens

  2. #2
    Junior Member
    Join Date
    Dec 2002
    Chicago, IL USA

    Identifying Test Cases and Planning Scenarios for Performance Testing

    Hello friends...
    I have a few questions regarding performance testing. The tool we are using is Mercury Interactive's LoadRunner.

    Our goal is to measure how the applications perform over our network infrastructure. We don't have to test application scalability under load; we just have to measure how long various transactions take under different network settings (bandwidth, latency, etc.).

    Now I want to record the test scripts using LoadRunner, and I have to identify the transactions to measure.

    I have identified some transactions; some of them are a) login, b) logout, c) registration, d) form submission, e) downloading a web page (say, the home page), and f) downloading a file (say, a PDF file via a hyperlink).
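
    For example, this is roughly how I have marked the home-page download in my script so far (a trimmed sketch; the URL is made up and I'm not sure this is right):

        Action()
        {
            /* Mark the start of the "download_home_page" transaction. */
            lr_start_transaction("download_home_page");

            /* Fetch the page; URL is a placeholder for our real server. */
            web_url("home",
                "URL=http://ourserver/home",
                "Resource=0",
                "Mode=HTML",
                LAST);

            /* LR_AUTO lets LoadRunner pass/fail based on the request result. */
            lr_end_transaction("download_home_page", LR_AUTO);

            return 0;
        }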

    My questions are:
    1) Is my transaction identification correct?
    2) Can a page download or a file download be treated as a transaction when recording it in a LoadRunner script?
    3) Did I miss any category of transactions that is generally considered when testing web applications?

    Can someone also advise me on how to plan scenarios? First I have to document the scenarios and then create them in the LoadRunner Controller.

    If anyone wants to evaluate the test cases I have created, I can e-mail them if you give me your e-mail address. Though they are pretty simple test cases, I just want to know whether I am on the right path.

    By the way, this is my first testing assignment; I actually come from the application development world.

    Thanks In Advance



