Created attachment 35911 [details]
Results

Our load tests recently began running at half the normal rate. Response times were slightly longer, but the requests per minute had dropped to half the expected rate. We're running Java 8.

Our load tests were written in JMeter 3.3. After some investigation we found that opening a JMeter 3.3 file with JMeter 4.0 and saving it (converting it to 4.0) was the culprit. The test runs at twice the rpm in JMeter 3.3 vs. the SAME jmx file saved in 4.0.

I ran 4 tests; the results are described below and attached.

Test 1: ran the test with JMeter 3.3
Test 2: ran the test with JMeter 4.0

These two tests yield the same results in terms of throughput, response time and error rate.

Test 3: I then opened the .jmx in JMeter 4.0 and saved it again, not making any changes to it, but now it is a "JMeter 4.0" test. I then ran the test in JMeter 3.3. This is where it gets interesting; the throughput was now half of the throughput achieved in the first two tests.

Test 4: I ran the same test (as in Test 3) again in JMeter 4.0; again the test only produced half the original throughput.
Thanks for the report, but we need more information:

- jmeter process parameters (ps -eaf | grep jmeter)
- jmeter.log
- user.properties / jmeter.properties
- test configuration: distributed or not
- any third-party plugins used
- JVM parameters
- the test plan

Thanks
The changed max response times are interesting. It might be helpful to take a few thread dumps while the tests are running (before saving the plans and after reloading them).

If you are uncomfortable sharing your test plan, you could try to diff the plans (original vs. saved) and look at the changes.
(In reply to Jehan Coetzee from comment #0)
> Test 3 : I then opened the .jmx in Jmeter4.0 and saved it again . Not making
> any changes to it, but now it is a "Jmeter4.0" test. I then ran the test in
> jmeter 3.3

Doing this is wrong; JMeter never guarantees upward compatibility. As an example, if your test uses a JSR223 Test Element with "Cache compiled" checked, by doing what you describe you would end up in 3.3 with it unchecked, because the XML saving format has changed.

> This is where it gets interesting; the throughput was now half of the
> throughput achieved in the first two tests.
>
> Test 4 : I ran the same test (as in Test3) again in Jmeter4.0, again the
> test only produced half the original throughput

If you have a JSR223 Test Element, then this would also be explainable.

Please note that if we don't get any feedback within 7 days, we'll be closing this ticket.

Regards
Good day

Yes, the test is using a JSR223 element, and a diff of the two test plans shows:

<stringProp name="cacheKey"></stringProp>
vs.
<stringProp name="cacheKey">true</stringProp>

I will disable the element and retest.
Disabling the JSR223 element made a difference and the throughput returned to "normal".

I will retest with a script that does not contain the element and report back.
I wonder if this could be solved automatically when Apache JMeter 4.0 opens a file produced by 3.x. Am I right that 3.x stores the value "true"? Could 4.0 recognize it and act accordingly?
4.0 only stores true/false, while 3.3 stored a UUID. There is no issue here, as the mistake is to use a 4.0 file in 3.3.
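To illustrate, a minimal sketch of how the saved property differs between versions (the UUID value below is an illustrative placeholder, not taken from the attached plans):

```xml
<!-- Saved by JMeter 3.3: cacheKey holds a generated UUID -->
<stringProp name="cacheKey">xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx</stringProp>

<!-- Saved by JMeter 4.0: cacheKey holds a boolean reflecting the
     "Cache compiled" checkbox of the JSR223 element -->
<stringProp name="cacheKey">true</stringProp>
```

When a 4.0-saved plan is loaded by 3.3, the boolean value is not what 3.3 expects, so the element presumably behaves as if "Cache compiled" were unchecked and the script is recompiled on every execution, which would explain the throughput drop reported above.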
> use a 4.0 file in a 3.3

Thanks, now I see.
(In reply to Jehan Coetzee from comment #5)
> Disabling the JSR223 element made a difference and the throughput returned
> to "normal".
>
> I will retest with a script that does not contain the element and feedback.

Any feedback?

Thanks
Closing, as there was no feedback from the user, but the last tests tend to confirm the issue. If the reporter can confirm (or not) through the last tests he mentions, it would be nice.
This issue has been migrated to GitHub: https://github.com/apache/jmeter/issues/4777