JMeter interview questions
Can anyone let me know where I can find JMeter interview questions?
Appreciate any help!
I have a lot of experience with LoadRunner but very minimal experience with JMeter. I am a QA Manager. I need to hire a good performance test engineer, and we use JMeter in our company. None of the other team members have any experience with performance testing.
I am looking for some good advanced JMeter interview questions (and answers). I could not find anything beyond basic questions online. Any help will be appreciated.
Most of my experience has been with Grinder and SOASTA, but I can tell you some things you'll want to ask any load test engineer.
1) Load testing basics
- recording of traffic
- setting up a recording proxy
- parameterization of requests
- scaling up virtual users
- setting up a distributed load test rig
- mocking out 3rd party services
- ramping up / ramping down
- setting up accounts and cleaning up after load tests
- precautions to take during load testing
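Two of the basics above, ramping up virtual users and parameterizing requests, can be sketched in a few lines. This is a conceptual illustration only: in JMeter itself these are handled by the Thread Group's ramp-up period and the CSV Data Set Config element, and the function names below are my own, not JMeter APIs.

```python
def ramp_schedule(total_users, ramp_seconds):
    """Spread virtual-user start times evenly across the ramp-up window,
    the same linear model JMeter's Thread Group uses."""
    interval = ramp_seconds / total_users
    return [round(i * interval, 2) for i in range(total_users)]

def parameterize(template, row):
    """Substitute per-user test data into a request template, analogous
    to ${variable} references fed from a CSV data set."""
    return template.format(**row)

if __name__ == "__main__":
    # 10 users over 20 seconds -> one new user starts every 2 seconds
    print(ramp_schedule(10, 20))
    # each virtual user gets its own credentials instead of a hard-coded value
    users = [{"user": "alice", "token": "t1"}, {"user": "bob", "token": "t2"}]
    for row in users:
        print(parameterize("GET /api/orders?user={user}&auth={token}", row))
```

A candidate should be able to explain why the hard-coded recording must be parameterized like this before it is scaled up, and what goes wrong (cache hits, unique-constraint violations) when it isn't.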
2) Best practices
- When is it ok to shortcut something and when is it not?
- What should be the ideal test composition?
- What are the best practices in test flow composition?
- Which parts should be stressed more than others?
- Strategies to provide realistic feedback rather than feel-good results.
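On test composition: one common way to get realistic rather than feel-good results is to weight each transaction by its observed share of production traffic. A hypothetical sketch of that allocation (the function name and shares are illustrative):

```python
def allocate_users(total_users, traffic_share):
    """Distribute virtual users across transactions in proportion to
    production traffic share, using largest-remainder rounding so the
    allocation always sums to total_users."""
    raw = {name: total_users * share for name, share in traffic_share.items()}
    alloc = {name: int(v) for name, v in raw.items()}
    leftover = total_users - sum(alloc.values())
    # hand any remaining users to the transactions with the largest
    # fractional parts, so rounding error doesn't drop a transaction
    for name in sorted(raw, key=lambda n: raw[n] - alloc[n], reverse=True)[:leftover]:
        alloc[name] += 1
    return alloc

if __name__ == "__main__":
    print(allocate_users(200, {"browse": 0.5, "search": 0.25, "checkout": 0.25}))
```

A candidate who proposes hammering only the checkout flow, or splitting users evenly across all flows, is describing a workload model that production will never see.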
Thanks David. If anyone with JMeter experience has more insights, I'd love to hear from them.
Whether you're looking for the questions or the answers, I believe The Ultimate JMeter Resource List will be helpful.
In any case please clarify what exactly you're looking for.
Here are some questions that I use when interviewing potential performance engineers:
1. What tools have you used for load testing? Which do you prefer, and why?
2. [JMeter only] What is a test fragment, and how is it used?
3. [JMeter only] How can you reduce overhead when running a test?
4. What are the most important system metrics to monitor when running load tests?
5. What tools/plugins do you use for monitoring system health (CPU, memory, etc.)?
6. How do you capture and store session data in your test scripts?
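Question 6 is really about correlation: capturing a dynamic value (session id, CSRF token) from one response and replaying it in the next request. JMeter does this with post-processors such as the Regular Expression Extractor or JSON Extractor; the standalone sketch below shows the same idea with illustrative names, so don't expect these exact functions in any tool.

```python
import re

def extract_token(response_body, pattern=r'name="csrf" value="([^"]+)"'):
    """Pull a dynamic token out of a response body, like a
    Regular Expression Extractor with one capture group."""
    match = re.search(pattern, response_body)
    return match.group(1) if match else None

def next_request(token):
    """Reuse the captured token in the follow-up request, the way a
    ${token} variable reference would be used in a JMeter sampler."""
    return {"method": "POST", "path": "/checkout", "headers": {"X-CSRF": token}}

if __name__ == "__main__":
    body = '<input type="hidden" name="csrf" value="abc123">'
    print(next_request(extract_token(body)))
```

A strong candidate will explain why a recorded script replays fine once but fails under load until values like this are correlated.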
Thanks email@example.com and Pastor of Muppets.
There is a real problem with your approach, independent of tool: You don't know the profession/field that you are interviewing for.
This happens quite often in the field of performance testing. Unfortunately, it is also often paired with a candidate and an interviewer who have studied the same question set downloaded from the Internet. A hire is made, and then there is no success. This is usually followed by a chorus of "blame the tool...." You should also be aware that most public repositories of questions contain deliberately incorrect control questions that have been seeded by professionals over the years, both to identify the source of the questions being studied by candidates and to reveal whether the candidate actually understands the field rather than parroting a response. If neither you nor the candidate has the expertise to spot the control questions, then your conversation will be just that: a parroted response.
The core problem is that the performance testing profession is 85-95% foundation skills and 5-15% tool/mechanical skills, depending upon the tool and the project at hand. Yet, people seem to be obsessed with interviewing for the tool and the tool alone. I would submit to you that you should be interviewing for foundation skills first, for someone weak in these areas will produce little to no value as a performance tester independent of whether they are able to answer the questions downloaded accurately.
Let's peel back the onion to areas where you are able to interview that do have a great impact on the output of your performance tester.
1. They need to be a great tester. They need to understand the entire scientific method (steps, expected results, control elements, data) and be able to map all of these items to manual tests, automated manual tests, and performance tests.
2. Requirements. What makes for a solid, testable performance requirement? How are these requirements similar to, and different from, functional requirements?
3. Architecture. This speaks to two of the most common questions across all tools: "What protocol?" and "What/how do I monitor?" A failure here means low to no value.
4. Analysis. The value of the performance test is not the test, it is not the test reports, it is the analysis of why a requirement was not met and where the issue is likely located. A performance tester of any measure will have quite a few stories of odd/weird/unusual/incredibly valuable defects that were found. They should be able to walk you through the entire process of discovery on the client side through how the source was tracked down. Do not be surprised if these stories involve lots of late nights, pizza and sketches on napkins after adult beverages.
5. Project management. The building of a performance test is a development exercise, and it comes with everything else a development exercise comes with: projecting labor based upon the requirements, that is, labor for building the test, managing/building the test data, executing the test, analyzing the results, etc. Your team members should be conversant with project management terms related to budgeting and variances. They should be able to communicate with the master project manager in their terms for tasks and deliverables, and explain why a two-week black box is not appropriate.
This is before you add architecture specific skills in Operating Systems, Networking, Database systems, general systems analysis skills related to troubleshooting in complex environments, etc...