Bug 64457 (Acceptance, Criteria, Results) - Generation of automatic result (pass/warning/fail) based on defined performance acceptance criteria
Summary: Generation of automatic result (pass/warning/fail) based on defined performance acceptance criteria
Status: NEW
Alias: Acceptance, Criteria, Results
Product: JMeter - Now in Github
Classification: Unclassified
Component: Main
Version: 5.3
Hardware: All
OS: All
Importance: P1 enhancement
Target Milestone: JMETER_5.4
Assignee: JMeter issues mailing list
URL: https://jmeter.apache.org/usermanual/...
Keywords:
Duplicates: 62267
Depends on:
Blocks:
 
Reported: 2020-05-20 12:02 UTC by piotr.mirek
Modified: 2020-07-13 09:59 UTC (History)
4 users



Attachments
prototypes of requested features (810.33 KB, image/png)
2020-05-20 12:03 UTC, piotr.mirek
dashboard with acceptance criteria (58.17 KB, image/png)
2020-05-20 12:05 UTC, piotr.mirek
dashboard element prototype (57.11 KB, image/png)
2020-05-20 12:06 UTC, piotr.mirek
PAC config element prototype (49.19 KB, image/png)
2020-05-20 12:07 UTC, piotr.mirek
test plan example (79.32 KB, image/png)
2020-05-20 12:07 UTC, piotr.mirek
dashboard element prototype (57.11 KB, image/png)
2020-05-20 12:37 UTC, piotr.mirek

Description piotr.mirek 2020-05-20 12:02:13 UTC
##### TEST RESULT (PASS/WARNING/FAIL) EVALUATION

Currently the Dashboard provides a very nice report, but it lacks an important feature: a clear indication of whether the test passed or failed.

It should be possible for the user to define this based on performance SLAs/requirements, so the result can be evaluated easily. Some examples:

 * test is considered failed if 'http_sampler_login' 95th percentile > 300ms
 * test is considered failed if 'oracle_sampler_commit_new_user' avg > 500ms
 * test is considered failed if 'soap_set_options' throughput TPS < 20
 * test raises a warning if 'rest_get_owner' error % > 1 and < 5
 * test is considered failed if 'rest_get_owner' error % > 5

Defining the failure condition is a must in this case; the warning level is beneficial to raise attention that a problem may be close.
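The per-criterion checks above could be sketched as follows. This is only an illustration of the requested behavior, not an existing JMeter API; the names `Criterion` and `evaluate` are hypothetical, and a real implementation would also need a "below" direction for throughput criteria like TPS < 20.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Criterion:
    """One acceptance criterion for a sampler statistic (names illustrative)."""
    metric: str                   # e.g. "pct95", "avg", "error_pct"
    warn_above: Optional[float]   # WARN when the value exceeds this threshold
    fail_above: Optional[float]   # FAIL when the value exceeds this threshold

def evaluate(criterion: Criterion, value: float) -> str:
    """Return PASS/WARN/FAIL for a single measured value."""
    if criterion.fail_above is not None and value > criterion.fail_above:
        return "FAIL"
    if criterion.warn_above is not None and value > criterion.warn_above:
        return "WARN"
    return "PASS"

# Examples from the report:
# 'http_sampler_login' 95th percentile > 300 ms => FAIL
print(evaluate(Criterion("pct95", None, 300.0), 350.0))    # FAIL
# 'rest_get_owner' error % > 1 and < 5 => WARN
print(evaluate(Criterion("error_pct", 1.0, 5.0), 2.3))     # WARN
```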

##### TEST RESULT DEFINITION

* no errors & no warnings => test result is PASSED
* 1 or more warnings => test result is WARN
* 1 or more errors => test result is FAILED

This should be reflected in the Dashboard statistics with the relevant colors (please see attachment), using CSS classes for errors/warnings/failures.
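The aggregation rule above reduces to a simple precedence check, sketched here under the same hypothetical naming as before (failures outrank warnings, which outrank passes):

```python
def overall_result(criterion_results: list[str]) -> str:
    """Aggregate per-criterion results into the overall test verdict.

    1 or more FAILs => FAILED; else 1 or more WARNs => WARN; else PASSED.
    """
    if "FAIL" in criterion_results:
        return "FAILED"
    if "WARN" in criterion_results:
        return "WARN"
    return "PASSED"

print(overall_result(["PASS", "WARN", "PASS"]))  # WARN
print(overall_result(["PASS", "PASS"]))          # PASSED
```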

##### PAC CONFIG ELEMENT

To achieve this, there should be a config element - let's call it PAC (performance acceptance criteria) for now - that can be added to each sampler.

The PAC should optionally allow setting pass/warning/fail ranges for each value that can be seen in the dashboard main statistics, e.g.

Executions
 
 * #Samples 
 * Error % 
 * KO 

Response Times (ms) 

 * Average 
 * Min 
 * Max 
 * 90th pct 
 * 95th pct 
 * 99th pct 

Throughput 

 * Transactions/s 

Network (KB/sec) 

 * Received 
 * Sent 

Please see the PAC element prototype in the attachments for a better overview.

##### TEST DASHBOARD/RESULTS ELEMENT

It should be possible to add the test dashboard as an element in the test plan, so it can be visualized together with the PAC config elements in the same plan (see attachments).

If possible, the dashboard element should pick up data from new PAC config elements on-the-fly (please see the attachment).

The evaluation result should also be provided in JSON format, as a file that can be easily integrated into CI/CD pipelines (parsing JSON is "cheaper" than XML, and .json files can easily be sent via curl to many endpoints).
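One possible shape for such a machine-readable result file is sketched below; every field name here is illustrative, since no format is defined in the request.

```python
import json

# Hypothetical result structure; field names are assumptions, not a spec.
result = {
    "overall": "FAILED",
    "criteria": [
        {"sampler": "http_sampler_login", "metric": "pct95",
         "fail_above": 300.0, "value": 350.0, "result": "FAIL"},
        {"sampler": "rest_get_owner", "metric": "error_pct",
         "warn_above": 1.0, "fail_above": 5.0, "value": 2.3,
         "result": "WARN"},
    ],
}

# A CI/CD job could parse this file and fail the build on "FAILED".
print(json.dumps(result, indent=2))
```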

Please see the test plan example in the attachments.
Comment 1 piotr.mirek 2020-05-20 12:03:23 UTC
Created attachment 37253 [details]
prototypes of requested features
Comment 2 Philippe Mouawad 2020-05-20 12:04:28 UTC
*** Bug 62267 has been marked as a duplicate of this bug. ***
Comment 3 piotr.mirek 2020-05-20 12:05:20 UTC
Created attachment 37254 [details]
dashboard with acceptance criteria
Comment 4 piotr.mirek 2020-05-20 12:06:15 UTC
Created attachment 37255 [details]
dashboard element prototype
Comment 5 piotr.mirek 2020-05-20 12:07:14 UTC
Created attachment 37256 [details]
PAC config element prototype
Comment 6 piotr.mirek 2020-05-20 12:07:53 UTC
Created attachment 37257 [details]
test plan example
Comment 7 piotr.mirek 2020-05-20 12:10:48 UTC
The attachments contain graphical samples and prototypes of how the requested features could look.

The current JMeter dashboard report is informative only; the requested changes would make it more of a decision maker and easier to use within CI/CD pipelines and cloud integrations.
Comment 8 piotr.mirek 2020-05-20 12:37:27 UTC
Created attachment 37258 [details]
dashboard element prototype
Comment 9 piotr.mirek 2020-07-13 09:59:33 UTC
If possible, please try to implement the same scoping policy as for other JMeter components, where a config element applies to the hash tree beneath its parent.

E.g. a PAC config element assigned to a thread group (or simple controller, etc.):
 - applies to all samplers under the thread group
 - can be overridden by assigning a PAC config element to a specific sampler

This way you avoid assigning multiple PACs with the same requirements across a whole group.
Comment 10 The ASF infrastructure team 2022-09-24 20:38:19 UTC
This issue has been migrated to GitHub: https://github.com/apache/jmeter/issues/5321