Custom metrics

We assume you are already familiar with the general approach to metrics aggregation and storage.

The Jagger test framework lets you work with custom performance metrics. That means you can create, collect, aggregate, and store your own metrics, and make pass/fail decisions based on them.

Custom metrics collection
The framework exposes a metric service for working with custom metrics.
MetricService allows you to:
  • create a metric
    • set a unique id for the metric
    • set a display name - the text displayed in reports
    • set flags controlling which values are aggregated (summary, detailed results)
    • set which aggregator(s) are used for this metric
  • save metric values
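To show how the pieces in the list above fit together, here is a toy, self-contained stand-in for a metric description builder. The class and method names below are illustrative only - they are not the actual Jagger API (the real MetricDescription is shown in the example further down).

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of a metric description: id, display name, aggregation
// flags (summary vs. detailed results), and a list of aggregators.
public class MetricDescriptionSketch {
    private final String id;                 // unique metric id
    private String displayName;              // text shown in reports
    private boolean showSummary;             // aggregate a single summary value
    private boolean plotData;                // keep detailed (time series) results
    private final List<String> aggregators = new ArrayList<>();

    public MetricDescriptionSketch(String id) { this.id = id; }

    public MetricDescriptionSketch displayName(String name) { this.displayName = name; return this; }
    public MetricDescriptionSketch showSummary(boolean v) { this.showSummary = v; return this; }
    public MetricDescriptionSketch plotData(boolean v) { this.plotData = v; return this; }
    public MetricDescriptionSketch addAggregator(String aggregator) { aggregators.add(aggregator); return this; }

    @Override
    public String toString() {
        return id + " (" + displayName + "), summary=" + showSummary
                + ", detailed=" + plotData + ", aggregators=" + aggregators;
    }

    public static void main(String[] args) {
        MetricDescriptionSketch d = new MetricDescriptionSketch("example-duration-metric")
                .displayName("Example duration metric, ms")
                .showSummary(true)
                .plotData(true)
                .addAggregator("avg")
                .addAggregator("max");
        System.out.println(d);
    }
}
```

The fluent builder style mirrors how the real MetricDescription is configured in the example below.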

You can find more details about Jagger services in the chapter User actions during the load test.

Custom metrics collection. Case 1
In this example we create a metric before running the performance test and save a value on every successful request to the SUT. After the test is over, the results for this metric are aggregated by multiple aggregators. More details about metrics collection can be found in the chapter Collecting metrics.
package com.griddynamics.jagger.engine.e1.collector.invocation;

import com.griddynamics.jagger.engine.e1.Provider;
import com.griddynamics.jagger.engine.e1.collector.AvgMetricAggregatorProvider;
import com.griddynamics.jagger.engine.e1.collector.MaxMetricAggregatorProvider;
import com.griddynamics.jagger.engine.e1.collector.MetricDescription;
import com.griddynamics.jagger.engine.e1.collector.MinMetricAggregatorProvider;
import com.griddynamics.jagger.engine.e1.collector.PercentileAggregatorProvider;
import com.griddynamics.jagger.engine.e1.collector.invocation.InvocationInfo;
import com.griddynamics.jagger.engine.e1.collector.invocation.InvocationListener;
import com.griddynamics.jagger.engine.e1.services.ServicesAware;
import com.griddynamics.jagger.invoker.InvocationException;

public class ExampleInvocationListener extends ServicesAware implements Provider<InvocationListener> {

    private final String metricName = "example-duration-metric";

    @Override
    protected void init() {
        //begin: following section is used for docu generation - example of the metric with multiple aggregators
        getMetricService().createMetric(new MetricDescription(metricName)
                .displayName("Example duration metric, ms")
                .addAggregator(new MinMetricAggregatorProvider())
                .addAggregator(new MaxMetricAggregatorProvider())
                .addAggregator(new AvgMetricAggregatorProvider())
                .addAggregator(new PercentileAggregatorProvider(40D))
                .addAggregator(new PercentileAggregatorProvider(50D))
                .addAggregator(new PercentileAggregatorProvider(60D))
                .addAggregator(new PercentileAggregatorProvider(70D))
                .addAggregator(new PercentileAggregatorProvider(80D))
                .addAggregator(new PercentileAggregatorProvider(90D))
                .addAggregator(new PercentileAggregatorProvider(95D))
                .addAggregator(new PercentileAggregatorProvider(99D)));
        //end: following section is used for docu generation - example of the metric with multiple aggregators
    }

    @Override
    public InvocationListener provide() {
        return new InvocationListener() {
            @Override
            public void onStart(InvocationInfo invocationInfo) { }

            @Override
            public void onSuccess(InvocationInfo invocationInfo) {
                if (invocationInfo.getResult() != null) {
                    getMetricService().saveValue(metricName, invocationInfo.getDuration());
                }
            }

            @Override
            public void onFail(InvocationInfo invocationInfo, InvocationException e) { }

            @Override
            public void onError(InvocationInfo invocationInfo, Throwable error) { }
        };
    }
}
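To illustrate what the aggregators configured above compute over the saved values, here is a plain-Java sketch with no Jagger dependency. The nearest-rank percentile used below is an approximation - Jagger's PercentileAggregatorProvider may use a different interpolation.

```java
import java.util.Arrays;

// Computes min, max, avg, and percentiles over a set of saved durations,
// mimicking what the configured aggregators produce after the test.
public class AggregatorSketch {
    // Nearest-rank percentile over a sorted array (approximation).
    static double percentile(double[] sorted, double p) {
        int idx = (int) Math.ceil(p / 100.0 * sorted.length) - 1;
        return sorted[Math.max(0, idx)];
    }

    public static void main(String[] args) {
        // Durations (ms) that saveValue() might have recorded per invocation.
        double[] durations = {12.0, 15.0, 11.0, 30.0, 14.0, 13.0, 90.0, 16.0};
        double[] sorted = durations.clone();
        Arrays.sort(sorted);

        double min = sorted[0];
        double max = sorted[sorted.length - 1];
        double avg = Arrays.stream(sorted).average().orElse(0.0);

        System.out.printf("min=%.1f max=%.1f avg=%.3f p90=%.1f p99=%.1f%n",
                min, max, avg, percentile(sorted, 90), percentile(sorted, 99));
        // prints min=11.0 max=90.0 avg=25.125 p90=90.0 p99=90.0
    }
}
```

Note how a single outlier (90 ms) dominates max and the high percentiles while the average stays moderate - this is why the example registers several aggregators for one metric.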
Custom metrics collection. Case 2
Another option is to collect metrics with some external tool. After the test is over, you can read the time series values from this tool and store them in the framework DB. In this case you can execute your code in a test or test group listener, in the onStop method.
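The reading step can be sketched as follows: parse the time series exported by the external tool, then hand the points to the metric service inside onStop. The CSV format and the timestamped saveValue call in the comment are assumptions for illustration; check the MetricService javadoc for the exact signatures.

```java
import java.util.ArrayList;
import java.util.List;

// Parses a hypothetical "timestampMs,value" export from an external
// monitoring tool into points ready to be stored as a custom metric.
public class ExternalSeriesSketch {
    // One point of the external tool's time series.
    record Point(long timestampMs, double value) { }

    static List<Point> parse(List<String> csvLines) {
        List<Point> points = new ArrayList<>();
        for (String line : csvLines) {
            String[] parts = line.split(",");
            points.add(new Point(Long.parseLong(parts[0].trim()),
                                 Double.parseDouble(parts[1].trim())));
        }
        return points;
    }

    public static void main(String[] args) {
        List<Point> series = parse(List.of("1000,42.5", "2000,43.1", "3000,41.9"));
        // Inside a real Jagger test(-group) listener's onStop you would loop
        // over these points and hand them to the metric service, e.g.
        //   getMetricService().saveValue(metricId, p.value(), p.timestampMs());
        // (a saveValue overload taking a timestamp is an assumption here).
        for (Point p : series) {
            System.out.println(p.timestampMs() + " -> " + p.value());
        }
    }
}
```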