##### TEST RESULT (PASS/WARNING/FAIL) EVALUATION

Currently the Dashboard provides a very nice report, but I find it lacks an important feature: a clear indication of whether the test passed or not. Users should be able to define this based on performance SLAs/requirements, so the result can be evaluated easily. Some examples:

* test is considered failed if 'http_sampler_login' 95th percentile > 300 ms
* test is considered failed if 'oracle_sampler_commit_new_user' avg > 500 ms
* test is considered failed if 'soap_set_options' throughput < 20 TPS
* test raises a warning if 'rest_get_owner' error % > 1 and < 5
* test is considered failed if 'rest_get_owner' error % > 5

Defining the failure condition is a must in this case; the warning level is beneficial to raise attention that a problem may be close.

#### TEST RESULT DEFINITION

* no errors & no warnings => test result is PASSED
* 1 or more warnings => test result is WARN
* 1 or more errors => test result is FAILED

This should be reflected in the Dashboard statistics with the relevant colors (please see attachment), using CSS classes for errors/warnings/failures.

##### PAC CONFIG ELEMENT

To achieve this, there should be a config element - let's call it PAC (performance acceptance criteria) for now - that can be added to each sampler. PAC should make it possible to optionally set pass/warning/fail ranges for each value that can be seen in the dashboard main statistics, e.g.:

Executions
* #Samples
* Error %
* KO

Response Times (ms)
* Average
* Min
* Max
* 90th pct
* 95th pct
* 99th pct

Throughput
* Transactions/s

Network (KB/sec)
* Received
* Sent

Please see the PAC element prototype in the attachments for a better overview.

##### TEST DASHBOARD/RESULTS ELEMENT

It should be possible to add the test dashboard as an element in the test plan, so it can be visualized together with the PAC config elements in the same plan (see attachments). If possible, the dashboard element should pick up data from new PAC config elements on-the-fly.
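The evaluation rules above could be sketched roughly as follows. This is only an illustrative sketch, not JMeter code: the names `Criterion`, `Verdict`, and `evaluate` are hypothetical, and real PAC criteria would need to support more comparison operators and metrics than the simple upper-bound check shown here.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the requested PASS/WARN/FAIL evaluation.
// All names here are illustrative, not part of the JMeter API.
public class PacEvaluation {

    enum Verdict { PASSED, WARN, FAILED }

    // One acceptance criterion: a sampler metric checked against
    // optional warning and failure thresholds (upper bounds here).
    record Criterion(String sampler, String metric,
                     double warnAbove, double failAbove) {

        Verdict check(double observed) {
            if (observed > failAbove) return Verdict.FAILED;
            if (observed > warnAbove) return Verdict.WARN;
            return Verdict.PASSED;
        }
    }

    // Overall result per the TEST RESULT DEFINITION above:
    // any FAILED => FAILED, else any WARN => WARN, else PASSED.
    static Verdict evaluate(List<Verdict> results) {
        if (results.contains(Verdict.FAILED)) return Verdict.FAILED;
        if (results.contains(Verdict.WARN)) return Verdict.WARN;
        return Verdict.PASSED;
    }

    public static void main(String[] args) {
        Criterion login95 = new Criterion("http_sampler_login", "pct95", 250, 300);
        Criterion errPct  = new Criterion("rest_get_owner", "error%", 1, 5);

        List<Verdict> results = new ArrayList<>();
        results.add(login95.check(280)); // above warn, below fail -> WARN
        results.add(errPct.check(0.4));  // below warn -> PASSED

        System.out.println(evaluate(results)); // prints WARN
    }
}
```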
(please see the attachment) The evaluation result should also be provided in JSON format, as a file that can be easily integrated into CI/CD pipelines (parsing JSON is "cheaper" than XML, and .json files can easily be sent via curl to many endpoints). Please see the test plan example in the attachments.
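Such a JSON result file might look something like the sketch below. The field names (`result`, `criteria`, `verdict`, etc.) are purely hypothetical; the actual schema would be up to the implementers.

```json
{
  "result": "FAILED",
  "criteria": [
    {
      "sampler": "http_sampler_login",
      "metric": "pct95",
      "failAbove": 300,
      "observed": 342,
      "verdict": "FAILED"
    },
    {
      "sampler": "rest_get_owner",
      "metric": "errorPercent",
      "warnAbove": 1,
      "failAbove": 5,
      "observed": 2.1,
      "verdict": "WARN"
    }
  ]
}
```

A CI job could then read the top-level `result` field (e.g. with a JSON-aware tool) and fail the build when it is not `PASSED`.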
Created attachment 37253 [details] prototypes of requested features
*** Bug 62267 has been marked as a duplicate of this bug. ***
Created attachment 37254 [details] dashboard with acceptance criteria
Created attachment 37255 [details] dashboard element prototype
Created attachment 37256 [details] PAC config element prototype
Created attachment 37257 [details] test plan example
The attachments contain graphical samples/prototypes of how the requested features could look. The current JMeter dashboard report is informative only; the requested changes would make it more of a decision maker and easier to use within CI/CD pipelines and cloud integrations.
Created attachment 37258 [details] dashboard element prototype
If possible, please try to implement the same policy as for other JMeter components, which scopes an element to a specific hash tree. E.g. a PAC config element assigned to a thread group (or simple controller, etc.):

- applies to all samplers under the thread group
- can be overridden by assigning a PAC config element to a specific sampler

This way you avoid assigning multiple PACs with the same requirements across a whole group.
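The scoping rule requested above amounts to "the nearest PAC walking up the tree wins". A minimal sketch, assuming a hypothetical `TreeNode`/`Pac` model (not JMeter's actual HashTree API):

```java
import java.util.Optional;

// Hypothetical sketch of PAC scope resolution: a sampler inherits the
// PAC of its nearest enclosing node, so a sampler-level PAC overrides
// one attached to the thread group. Names are illustrative only.
class PacScope {

    record Pac(String name) {}

    static class TreeNode {
        final TreeNode parent;
        final Pac pac; // PAC attached at this level, or null

        TreeNode(TreeNode parent, Pac pac) {
            this.parent = parent;
            this.pac = pac;
        }
    }

    // Walk up from the sampler's node; the first PAC found applies.
    static Optional<Pac> effectivePac(TreeNode samplerNode) {
        for (TreeNode n = samplerNode; n != null; n = n.parent) {
            if (n.pac != null) return Optional.of(n.pac);
        }
        return Optional.empty();
    }
}
```

With this rule, one PAC on the thread group covers every sampler beneath it, and attaching a second PAC directly to one sampler overrides the group-level criteria for that sampler only.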
This issue has been migrated to GitHub: https://github.com/apache/jmeter/issues/5321