I am involved with a group called WOPR, the Workshop on Performance and Reliability. We put on two invitation-only peer workshops a year, and are always looking for new testers to come share fresh ideas.

We are about to close applications for WOPR15, hosted this fall in San Jose by eBay. The announcement is below. Please consider applying in the next couple of days, particularly if you have not attended before.

Thanks for your time, and I hope we see you at this or some future WOPR.

Eric Proegler

WOPR15 Call for Proposals (CFP)
The content owner, Mike Bonar, along with the WOPR organizers, invites you to submit your proposal(s) for the next WOPR.

Theme: Building Performance Test Scenarios with a Context-Driven Approach

Conference Location and Key Dates
eBay, San Jose, CA, USA
WOPR15: Thursday-Saturday, October 28 - 30, 2010
Pre-WOPR Dinner: Wednesday, October 27, 2010
Deadline for Proposals: August 16, 2010
Selections will be Completed By: August 30, 2010

Applying for WOPR
Apply for WOPR here.

Presentations will be selected by the WOPR organizers and invitees notified by email according to the above dates.

WOPR15 Theme Description
Every testing project has its own success criteria, and factors that affect how it will unfold over time, such as people, requirements, deadlines, budget, existing test plans and artifacts, available tools, and other external factors. In every project, these factors change; in other words, every project has its own context. To conduct the most effective performance tests, we need to be aware of the context in which we find ourselves, how it influences our actions and decisions, and how we can adapt our efforts to make our testing project more successful.

At WOPR15, we would like to focus on how the context of your testing project influenced your design and use of performance test scenarios. Performance test scenarios live throughout the entire lifecycle of a testing project, including planning, execution, and reporting, so your experience report could describe any part of a project, or all of it. Don't be too concerned if you feel you don't have much to say about performance test scenarios in particular but still have an interesting experience you wish to share. The theme is designed to help WOPR participants focus on areas where we can learn and explore how we react to and customize testing in response to context.

Some points to help you think about performance test scenarios:
A performance test scenario is a description of an event or series of actions and events.
Performance test scenarios may describe performance testing workloads, but they can also describe what activities we simulate, how many threads we run in the simulation, the environment we use to test, how we structure our test schedule, and who performs the tests.
Performance test scenarios may describe what resources we monitor, which errors we record, which results we capture, how many tests we run, and how we measure success.
A performance test scenario is also a "projection" of actions and events. In other words, a performance test scenario is "forward looking." The tests we perform may or may not give us the results we want, and we are often surprised by the results we get.
Your thinking and choices, as captured in performance test scenarios, may contain assumptions and value judgments you may not be aware of; or they may be very deliberate.

Some questions to help you focus your experience report:
How did you choose the performance test scenarios for your project?
Did the structure of your team influence the performance test scenario selection process?
Did you document your performance test scenarios, and if so in what format?
If you did not document performance test scenarios, how did you proceed, and how did that work for you?
What kinds of things influenced how you executed your performance test scenarios?
Did you drop some performance test scenarios and pick up others as the test progressed?
Did your approach to performance test scenarios change as the project progressed?
Were there any particular challenges associated with performance test scenarios?
Did you present the results for your performance test scenarios individually or all together at the end of testing?
How do we know we've chosen "good" performance test scenarios?
How do we know we've chosen "enough" performance test scenarios?

Proposal Objectives
WOPR15 is seeking experience reports (ERs) of your relevant experiences and innovations from past projects and from your current initiatives (as-yet unfinished and unproven projects). For a description and samples of ERs, see the Paper Guidance and Papers on the WOPR web site.

While these pages describe other types of presentations, we are primarily interested in hearing about an experience. We are more interested in effective presentations and enlightening exchanges than in formal papers. A detailed paper is welcome though not required. For your presentation, an organized outline is enough.

We are looking for informative, in-depth storytelling by experienced practitioners. Your proposal to present should contain enough substance for us to understand and evaluate it. Content is more important than format. Your presentation should omit any confidential data (anything that requires an NDA).

Reports and presentations are welcome over a broad range of topics related to performance testing. The test domain is broad and may include real-time embedded devices, web sites, and international telecom networks.

About WOPR
In the view of knowledgeable observers, WOPR attracts the best and the brightest performance testers and managers as participants. In fact, many participants have world-class reputations.

WOPR conferences are invitation-only. We strive to make every conference an exquisite opportunity for learning and professional growth. They are intimate; we restrict attendance to fewer than 25 people.

WOPR is generally over-subscribed; we have sometimes had to turn away two or more applicants for every one we invite. We usually have more presentations than can fit into the workshop; not everyone who submits a presentation will be asked to speak.

One of the important goals of WOPR is community building among performance and reliability test professionals. We encourage insightful, talented people of varied experience levels and backgrounds to apply. Even if you do not believe you have a relevant experience, we welcome people who work in performance and reliability testing disciplines to contribute to the discussion. If you have an interest in attending WOPR, even as a non-presenter, please apply for consideration.

WOPR conferences and tutorials are priced as close to free as we can make them, as we are a self-funded not-for-profit organization.

Read more. If you have questions, please contact the organizers.

WOPR is a not-for-profit, low-cost workshop; however, we do have expenses, and we ask WOPR participants to help us offset them. Thanks to the generosity of eBay, our host, the expense-sharing amount for WOPR15 has been set at $300. If you are invited to the workshop, you will be asked to pay the expense-sharing fee to accept your invitation.