Performance Plugin


Plugin Information

Plugin ID: performance
Latest Release: 1.10
Latest Release Date: Jan 27, 2014
Required Core: 1.480
Dependencies: (none listed)
Source Code: GitHub
Issue Tracking: Open Issues
Maintainer(s): Manuel Carrasco Monino (id: manolo), Vergnes (id: vergnes), Arnaud Espy (id: aespy)

Usage (installations per month):
2013-Apr 3629
2013-May 3720
2013-Jun 3745
2013-Jul 3900
2013-Aug 3859
2013-Sep 4048
2013-Oct 4246
2013-Nov 4283
2013-Dec 4200
2014-Jan 4452
2014-Feb 4616
2014-Mar 4845

This plugin allows you to capture reports from JMeter and JUnit. Jenkins then generates charts showing the trend of performance and robustness.
It can also set the final build status to good, unstable or failed, based on the reported error percentage.
Report formats supported:

  • JMeter XML format (CSV is currently not supported, see JENKINS-16627)
  • JUnit format (used by Soapui for example)
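
A minimal JMeter XML (.jtl) result file looks roughly like this (a sketch adapted from the larger sample posted in the comments below; the values are illustrative):

  <?xml version="1.0" encoding="UTF-8"?>
  <testResults version="1.2">
    <!-- t = elapsed time in ms, lt = latency, ts = timestamp,
         s = success flag, lb = sampler label, rm = response message -->
    <sample t="154" lt="0" ts="1286452090735" s="true"
            lb="Home page" rm="SUCCESS"/>
  </testResults>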

Changelog:

v1.10

  • FIX: Cache preprocessed JMeter Reports to avoid performance issues.
  • FEATURE: Added comparison between builds
  • FIX: UI bug always showing same values regardless of what was saved.
  • FEATURE: Average response time thresholds per jtl file.
  • FIX: Corrected a bug where the 'All URIs' row just displayed the last entry again.
  • FEATURE: Added some useful metrics to the summary details table. JMeter records the size of each response, which can be used to calculate bandwidth usage in performance tests.

v1.9

  • FIX: don't use ; as separator in cookie value
  • FEATURE: added csv parser
  • FEATURE: added response time trend graph for selected build
  • FEATURE: build trends for response time
  • FEATURE: consider the time for each test case in a test suite
  • FEATURE: simple cache added
  • FEATURE: new response time graphs for the selected build and URI report
  • FEATURE: new graphs for response time trends
  • FEATURE: parse JMeter summarizer files

v1.8

  • FIX: parsing results of long running tests
  • FIX: differences not shown for old builds
  • FEATURE: more information columns in the report map
  • CHANGES: use negative values to indicate no threshold (this allows using 0% thresholds)
  • FEATURE: graphs available on the reports
  • FEATURE: url parameter (buildCount) to control the number of builds to display
  • FEATURE: get a larger image when clicking on a graph
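
For example, the number of builds shown can be limited by appending buildCount to the trend page URL (hypothetical job URL; the exact path segment depends on the plugin's action URL):

  http://jenkins.example.com/job/my-job/performance?buildCount=10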

v1.7

  • FIX: an unstable test set the final build status incorrectly when a previous test failed.
  • FIX: JENKINS-9655, didn't parse JUnit reports correctly (patch: Attila-Mihaly)

v1.6

  • Fixed the JMeter parser when the report contains nested XML tags.

v1.5

  • Now computes the median and 90% line in JMeter test results.

v1.4

  • Just a control version published after migrating the plugin to the GitHub infrastructure.

v1.3

  • Formalized an extension point to define custom parsers, so it should be easier to add new parsers.
  • JMeter and JUnit parsers have been split in different classes.
  • Added a new Trend report.
  • Fixed an NPE when a build was failed (JENKINS-5224, JENKINS-6908)

v1.2

  • Support for Ant FileSet patterns to search for report files.
  • Improved css.
  • Localized UI elements
  • Added Spanish translation

v1.0

  • First release; moved code from the JMeter plugin v0.3.0 to Performance v1.0.
  • Added the ability to parse JUnit XML report files.

Jenkins Configuration

  • Add a new report, selecting the appropriate parser for your reports (JMeter, JUnit).
  • Configure the search pattern that selects the files to be parsed by the Performance plugin; example patterns are shown below the list. Each parser has a default pattern that is used if you leave the input box blank.
  • You can configure the error percentage thresholds that mark the project unstable or failed, or leave them blank to disable the feature.
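
For example (the JMeter pattern appears in the comments below; the JUnit one assumes Maven Surefire's default output location):

  **/*.jtl                            JMeter result files
  **/target/surefire-reports/*.xml    JUnit result files produced by Maven Surefire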

Performance Plugin usage

  • As soon as you have configured Jenkins and run a first build, you'll notice a new entry in the left-hand menu: Performance Trend.
  • If you have just one report file, its graph will appear on the main page.
  • If you have more than one report file, click on "Performance Trend" and the graphs will appear.

  • The Filter trend data link on the Performance Trend page lets you configure the graphs. Clicking on it opens the graph configuration menu.

         This configuration is saved in a cookie named performance. The default configuration is: "Show all the builds".

       

  • The Last report link on the Performance Trend page gives the detailed information of each report for the last build.
  • You can access the data of older builds through the Performance report entry in each build's menu.

  • On the Performance Trend page, the Trend report links show a report with the historical data of each build.

Configuring a project to run JMeter performance tests

Although there are different ways to run JMeter tests, this section explains how to run them using Ant.
Once you have your build.xml ready to run JMeter, add your project to Jenkins as a free-style project that uses Ant, and configure it following the instructions above.

Finally, run your project, setting the property jmeter-home to the appropriate folder on your machine:

  ant "-Djmeter-home=C:\jmeter" -f build.xml

The build.xml below shows one way to set this up:
<project default="all">
  <!-- ant-jmeter.jar comes with jmeter, be sure this is the release you have -->
  <path id="ant.jmeter.classpath">
    <pathelement
       location="${jmeter-home}/extras/ant-jmeter-1.1.1.jar" />
  </path>
  <taskdef
    name="jmeter"
    classname="org.programmerplanet.ant.taskdefs.jmeter.JMeterTask"
    classpathref="ant.jmeter.classpath" />
  <target name="clean">
    <delete dir="results"/>
    <delete file="jmeter.log"/>
    <mkdir dir="results/jtl"/>
    <mkdir dir="results/html"/>
  </target>
  <target name="test" depends="clean">
    <jmeter
       jmeterhome="${jmeter-home}"
       resultlogdir="results/jtl">
      <testplans dir="test/jmeter" includes="*.jmx"/>
      <property name="jmeter.save.saveservice.output_format" value="xml"/>
    </jmeter>
  </target>
  <!-- This is not needed for the plugin, but it produces a nice html report
       which can be saved using Jenkins' archive artifact feature -->
  <target name="report" depends="test">
    <xslt
       basedir="results/jtl"
       destdir="results/html"
       includes="*.jtl"
       style="${jmeter-home}/extras/jmeter-results-detail-report_21.xsl"/>
  </target>
  <target name="all" depends="test, report"/>
</project>
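
With this build file, the report search pattern to configure in the Performance plugin would typically be results/jtl/*.jtl, matching the resultlogdir set above.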

Errors

If you get the error `java.lang.NoClassDefFoundError: Could not initialize class org.jfree.chart.JFreeChart` when the plugin generates the charts, it is usually because the JVM is trying to use a graphical display (X server) that is not available on the Jenkins machine. Set the property `-Djava.awt.headless=true` when starting your servlet container. Note that this normally does not happen when running the embedded servlet container Jenkins is packaged with (Jetty).
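
For example, when running Jenkins as a standalone process (a sketch; with Tomcat you would typically add the flag to CATALINA_OPTS instead):

$ java -Djava.awt.headless=true -jar jenkins.war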

https://groups.google.com/forum/#!topic/jenkinsci-users/o_Dr7Tn0i3U

Compiling

To use the latest plugin release, you need to download, compile and install it by hand. To do that, you need Git, Maven and Java installed on your computer.

$ git clone https://github.com/jenkinsci/performance-plugin.git performance
$ cd performance
$ mvn package
$ cp target/performance.hpi <path_to_jenkins>/data/plugins

Remember to restart Jenkins in order to reload the plugin.
You can read more about plugin development on these pages:

TODO:

Add support for other report formats.

Comments

  1. Aug 17, 2010

    C M says:

    JUnit reports does not work :(

    I've set JUnit properly in hudson, but it produces wrong results. All tests have min set to 9223372036854775807 and max to -9223372036854775808.

    I've noticed this logs in hudson console:

    Performance: Parsing JMeter report file TEST-hudson.test.ATest.xml
    Performance: Parsing JMeter report file TESTS-TestSuites.xml

    It tries to parse JUnit test output as JMeter.

    1. Aug 24, 2010

      manuel_carrasco - says:

      - Do you have selected correctly the parser?

      - Could you send me these two files?

  2. Aug 23, 2010

    build bot says:

    Has anyone else had experience using the performance plugin with JUnit XML reporting as it claims to work with?

    I read that the output report has to be from SOAPU...

    Can someone please post a working example file please!

    Thanks

    1. Aug 24, 2010

      manuel_carrasco - says:

      What do you want? A SoapUI example or just a JUnit XML file?

      I have not used SoapUI; I have just tested the feature by parsing the output files generated by Maven (Surefire) when running the tests.

  3. Aug 30, 2010

    Samarjeet Mohanty says:

    I installed this plugin, configured it on Hudson, and am able to get the Trend and Performance Reports successfully. However, there seems to be a bug in the reporting structure. In my JMeter test plan, I have a CSV Data Set Config (an external CSV or TXT file) from which I read values into my test plan. I see that when I run an execution, the plugin reports in Hudson show the timings for the CSV config element too. This skews the entire calculation of Max, Min and Avg across all transactions. Ideally, only the response times for transaction controllers should be shown, NOT for config elements. (Attached a sample view for reference)

    If anybody has faced this issue or has any suggestions/comments on it, I would appreciate it if you could throw some light on it.

    Thanks.

  4. Aug 31, 2010

    Peter Koch says:

    We also have problems creating Reports with JUnit...min is always 9223372036854775807, max -9223372036854775808.
    Does not seem to work :-(

    We use Hudson 1.336 and Performance Plugin 1.3, Maven and JUnit 3.8

    The xml File looks like this:

    <?xml version="1.0" encoding="UTF-8" ?>
    <testsuite failures="0" time="1.108" errors="0" skipped="0" tests="2" name="com.jaxlion.base.LogPathInfoTest">
    <properties>
    <property name="java.runtime.name" value="Java(TM) SE Runtime Environment"/>
    <property name="sun.boot.library.path" value="/usr/java/jdk1.6.0_16/jre/lib/amd64"/>
    <property name="java.vm.version" value="14.2-b01"/>
    <property name="java.vm.vendor" value="Sun Microsystems Inc."/>
    <property name="java.vendor.url" value="http://java.sun.com/"/>
    <property name="path.separator" value=":"/>
    <property name="java.vm.name" value="Java HotSpot(TM) 64-Bit Server VM"/>
    <property name="file.encoding.pkg" value="sun.io"/>
    <property name="user.country" value="US"/>
    <property name="sun.java.launcher" value="SUN_STANDARD"/>
    <property name="sun.os.patch.level" value="unknown"/>
    <property name="java.vm.specification.name" value="Java Virtual Machine Specification"/>
    <property name="user.dir" value="/opt/build/hudson/jobs/jaxlion-core/workspace/trunk"/>
    <property name="jaxlion.started" value="Wed Aug 25 07:35:00 CEST 2010"/>
    <property name="java.runtime.version" value="1.6.0_16-b01"/>
    <property name="java.awt.graphicsenv" value="sun.awt.X11GraphicsEnvironment"/>
    <property name="basedir" value="/opt/build/hudson/jobs/jaxlion-core/workspace/trunk"/>
    <property name="java.endorsed.dirs" value="/usr/java/jdk1.6.0_16/jre/lib/endorsed"/>
    <property name="os.arch" value="amd64"/>
    <property name="surefire.real.class.path" value="/tmp/surefirebooter5448448631377469438.jar"/>
    <property name="java.io.tmpdir" value="/tmp"/>
    <property name="line.separator" value="
    "/>
    <property name="java.vm.specification.vendor" value="Sun Microsystems Inc."/>
    <property name="os.name" value="Linux"/>
    <property name="sun.jnu.encoding" value="UTF-8"/>
    <property name="java.library.path" value="/usr/java/jdk1.6.0_16/jre/lib/amd64/server:/usr/java/jdk1.6.0_16/jre/lib/amd64:/usr/java/jdk1.6.0_16/jre/../lib/amd64:/usr/java/packages/lib/amd64:/lib:/usr/lib"/>
    <property name="javax.xml.parsers.SAXParserFactory" value="org.apache.xerces.jaxp.SAXParserFactoryImpl"/>
    <property name="surefire.test.class.path" value="/opt/build/hudson/jobs/jaxlion-core/workspace/trunk/target/test-classes:/opt/build/hudson/jobs/jaxlion-core/workspace/trunk/target/generated-classes/cobertura:/opt/tomcat/temp/.m2/repository/ch/loewenfels/loepa-commons/1.0.15/loepa-commons-1.0.15.jar:/opt/tomcat/temp/.m2/repository/commons-lang/commons-lang/2.4/commons-lang-2.4.jar:/opt/tomcat/temp/.m2/repository/junit/junit/3.8.2/junit-3.8.2.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/slf4j-api/1.5.11/slf4j-api-1.5.11.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/jul-to-slf4j/1.5.11/jul-to-slf4j-1.5.11.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/jcl-over-slf4j/1.5.11/jcl-over-slf4j-1.5.11.jar:/opt/tomcat/temp/.m2/repository/ch/qos/logback/logback-classic/0.9.20/logback-classic-0.9.20.jar:/opt/tomcat/temp/.m2/repository/ch/qos/logback/logback-core/0.9.20/logback-core-0.9.20.jar:/opt/tomcat/temp/.m2/repository/javax/mail/mail/1.4/mail-1.4.jar:/opt/tomcat/temp/.m2/repository/commons-io/commons-io/1.4/commons-io-1.4.jar:/opt/tomcat/temp/.m2/repository/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar:/opt/tomcat/temp/.m2/repository/xml-apis/xml-apis/1.3.04/xml-apis-1.3.04.jar:/opt/tomcat/temp/.m2/repository/xalan/xalan/2.7.1/xalan-2.7.1.jar:/opt/tomcat/temp/.m2/repository/xalan/serializer/2.7.1/serializer-2.7.1.jar:/opt/tomcat/temp/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/opt/tomcat/temp/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/opt/tomcat/temp/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/opt/tomcat/temp/.m2/repository/tar/tar/2.3/tar-2.3.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-context/2.5.6/spring-context-2.5.6.jar:/opt/tomcat/temp/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-beans/2.5.6/spring-beans-2.5.6.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-core/2.5.6/spring-core-2.5.6.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-aop/2.5.6/spring-aop-2.5.6.jar:/opt/tomcat/temp/.m2/repository/cglib/cglib-nodep/2.2/cglib-nodep-2.2.jar:/opt/tomcat/temp/.m2/repository/org/mockito/mockito-all/1.8.4/mockito-all-1.8.4.jar:/opt/tomcat/temp/.m2/repository/net/sourceforge/cobertura/cobertura/1.9.2/cobertura-1.9.2.jar:/opt/tomcat/temp/.m2/repository/org/apache/ant/ant/1.7.0/ant-1.7.0.jar:/opt/tomcat/temp/.m2/repository/org/apache/ant/ant-launcher/1.7.0/ant-launcher-1.7.0.jar:"/>
    <property name="java.specification.name" value="Java Platform API Specification"/>
    <property name="java.class.version" value="50.0"/>
    <property name="sun.management.compiler" value="HotSpot 64-Bit Server Compiler"/>
    <property name="os.version" value="2.6.27.42-0.1-default"/>
    <property name="user.home" value="/opt/tomcat/temp"/>
    <property name="user.timezone" value="Europe/Zurich"/>
    <property name="java.awt.printerjob" value="sun.print.PSPrinterJob"/>
    <property name="java.specification.version" value="1.6"/>
    <property name="file.encoding" value="UTF-8"/>
    <property name="javax.xml.transform.TransformerFactory" value="org.apache.xalan.processor.TransformerFactoryImpl"/>
    <property name="user.name" value="tomcat"/>
    <property name="java.class.path" value="/opt/build/hudson/jobs/jaxlion-core/workspace/trunk/target/test-classes:/opt/build/hudson/jobs/jaxlion-core/workspace/trunk/target/generated-classes/cobertura:/opt/tomcat/temp/.m2/repository/ch/loewenfels/loepa-commons/1.0.15/loepa-commons-1.0.15.jar:/opt/tomcat/temp/.m2/repository/commons-lang/commons-lang/2.4/commons-lang-2.4.jar:/opt/tomcat/temp/.m2/repository/junit/junit/3.8.2/junit-3.8.2.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/slf4j-api/1.5.11/slf4j-api-1.5.11.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/jul-to-slf4j/1.5.11/jul-to-slf4j-1.5.11.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/jcl-over-slf4j/1.5.11/jcl-over-slf4j-1.5.11.jar:/opt/tomcat/temp/.m2/repository/ch/qos/logback/logback-classic/0.9.20/logback-classic-0.9.20.jar:/opt/tomcat/temp/.m2/repository/ch/qos/logback/logback-core/0.9.20/logback-core-0.9.20.jar:/opt/tomcat/temp/.m2/repository/javax/mail/mail/1.4/mail-1.4.jar:/opt/tomcat/temp/.m2/repository/commons-io/commons-io/1.4/commons-io-1.4.jar:/opt/tomcat/temp/.m2/repository/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar:/opt/tomcat/temp/.m2/repository/xml-apis/xml-apis/1.3.04/xml-apis-1.3.04.jar:/opt/tomcat/temp/.m2/repository/xalan/xalan/2.7.1/xalan-2.7.1.jar:/opt/tomcat/temp/.m2/repository/xalan/serializer/2.7.1/serializer-2.7.1.jar:/opt/tomcat/temp/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/opt/tomcat/temp/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/opt/tomcat/temp/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/opt/tomcat/temp/.m2/repository/tar/tar/2.3/tar-2.3.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-context/2.5.6/spring-context-2.5.6.jar:/opt/tomcat/temp/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-beans/2.5.6/spring-beans-2.5.6.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-core/2.5.6/spring-core-2.5.6.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-aop/2.5.6/spring-aop-2.5.6.jar:/opt/tomcat/temp/.m2/repository/cglib/cglib-nodep/2.2/cglib-nodep-2.2.jar:/opt/tomcat/temp/.m2/repository/org/mockito/mockito-all/1.8.4/mockito-all-1.8.4.jar:/opt/tomcat/temp/.m2/repository/net/sourceforge/cobertura/cobertura/1.9.2/cobertura-1.9.2.jar:/opt/tomcat/temp/.m2/repository/org/apache/ant/ant/1.7.0/ant-1.7.0.jar:/opt/tomcat/temp/.m2/repository/org/apache/ant/ant-launcher/1.7.0/ant-launcher-1.7.0.jar:"/>
    <property name="java.vm.specification.version" value="1.0"/>
    <property name="sun.arch.data.model" value="64"/>
    <property name="java.home" value="/usr/java/jdk1.6.0_16/jre"/>
    <property name="java.specification.vendor" value="Sun Microsystems Inc."/>
    <property name="user.language" value="en"/>
    <property name="java.vm.info" value="mixed mode"/>
    <property name="java.version" value="1.6.0_16"/>
    <property name="java.ext.dirs" value="/usr/java/jdk1.6.0_16/jre/lib/ext:/usr/java/packages/lib/ext"/>
    <property name="sun.boot.class.path" value="/usr/java/jdk1.6.0_16/jre/lib/resources.jar:/usr/java/jdk1.6.0_16/jre/lib/rt.jar:/usr/java/jdk1.6.0_16/jre/lib/sunrsasign.jar:/usr/java/jdk1.6.0_16/jre/lib/jsse.jar:/usr/java/jdk1.6.0_16/jre/lib/jce.jar:/usr/java/jdk1.6.0_16/jre/lib/charsets.jar:/usr/java/jdk1.6.0_16/jre/classes"/>
    <property name="java.vendor" value="Sun Microsystems Inc."/>
    <property name="javax.xml.parsers.DocumentBuilderFactory" value="org.apache.xerces.jaxp.DocumentBuilderFactoryImpl"/>
    <property name="localRepository" value="/opt/tomcat/temp/.m2/repository"/>
    <property name="file.separator" value="/"/>
    <property name="java.vendor.url.bug" value="http://java.sun.com/cgi-bin/bugreport.cgi"/>
    <property name="sun.cpu.endian" value="little"/>
    <property name="sun.io.unicode.encoding" value="UnicodeLittle"/>
    <property name="sun.cpu.isalist" value=""/>
    </properties>
    <testcase time="1.097" classname="com.jaxlion.base.LogPathInfoTest" name="testGetServiceLogPath"/>
    <testcase time="0" classname="com.jaxlion.base.LogPathInfoTest" name="testGetSystemLogPath"/>
    </testsuite>

    1. Sep 08, 2010

      Brian Roe says:

      I also get the same results creating Reports with JUnit:  min is always 9223372036854775807, max -9223372036854775808.

      I thought it might be a JUnit version issue, but we are using Junit 4.5, and Hudson 1.375 and Performance Plugin 1.3.

      Doesn't anyone know how to get it working right?  Bueller?  Bueller?

      1. Sep 08, 2010

        Brian Roe says:

        I dug a little more, and found this (by clicking on the Help icon in the Configure page for a job for which I installed the Performance plugin): 

        This plugin understands the JMeter analysis report XML format and the SOAPUI report in JUnit format.  (links to http://www.soapui.org )
        This plug-in does not perform the actual analysis; it only displays useful information about analysis results, such as average responding time, historical result trend, web UI for viewing analysis reports, and so on.

    2. Sep 30, 2010

      Peter Koch says:

      I dug a little more using the Performance Plugin with JUnit reports... I downloaded the source and enhanced the PerformanceReportTest class with an additional test using my own JUnit report file. The JUnitParser works fine in this unit test.

      But on Hudson, it doesn't work :-(. The plug-in finds the test files, but the report and trend are wrong (min is always 9223372036854775807, max -922337203685477580).

      Any idea?

  5. Sep 05, 2010

    Tarun Kumar Bhadauria says:

    Well, I am having a tough time getting Hudson to read my JMeter report.

    If I specify the absolute path to my JMeter report file - "C:/SelNG/jmeter2/target/jmeter-reports/GoogleAdvanceSearch-100906.xml" - I get to see the performance report. But I cannot use an absolute path, as the report name "GoogleAdvanceSearch-100906.xml" contains a time stamp. So I tried to use the pattern illustrated above, "**/*.xml". Now when I build, I encounter the following exception -

    *****************************Performance: Recording JMeter reports '*/.xml'
    Performance: no JMeter files matching '*/.xml' have been found. Has the report generated?. Setting Build to FAILURE
    Finished: FAILURE
    ********************

    Is there any thing I am missing here?

    Thanks in advance
    Tarun K

    1. Sep 08, 2010

      Samarjeet Mohanty says:

      Hi Tarun,
      Is it the "JMeter" report or the "JUnit" report you're trying to read?
      If it's a JMeter report, you have to specify "**/*.jtl". This works perfectly fine.

      Cheers..

      1. Sep 08, 2010

        Tarun Kumar Bhadauria says:

        Actually it is XML which is generated by the JMeter Maven plugin. Hence I specified the path as - "*/**.xml"

        but I always encounter exception -

        ****************************Performance: Recording JMeter reports '*/*.xml'
        Performance: no JMeter files matching '*/.xml' have been found. Has the report generated?. Setting Build to FAILURE
        Finished: FAILURE
        ********************

        1. Oct 11, 2010

          manuel_carrasco - says:

          Replace '*/.xml' with '**/*.xml'.

  6. Sep 16, 2010

    John Wood says:

    Great plugin - thanks!

    Just one question: it seems like the trend graphs don't appear while a build is in progress. Can anyone else confirm this?

    Our test suites tend to take a long time to run and it's kind of a pain to have them not visible during those times.

    Is there anything I can do?

    Regards,

    John Wood

    1. Nov 11, 2011

      Kresten Vester says:

      I saw the same problem, so I added the graphs to the Performance Reports of each build.

      I changed the PerformanceReportMap.java file to this:

      PerformanceReportMap.java
      package hudson.plugins.performance;
      
      import hudson.model.AbstractBuild;
      import hudson.model.ModelObject;
      
      import java.io.File;
      import java.io.FileFilter;
      import java.io.IOException;
      import java.io.UnsupportedEncodingException;
      import java.net.URLDecoder;
      import java.util.ArrayList;
      import java.util.Arrays;
      import java.util.Collection;
      import java.util.Collections;
      import java.util.LinkedHashMap;
      import java.util.List;
      import java.util.Map;
      import java.util.StringTokenizer;
      
      import hudson.model.TaskListener;
      import hudson.util.ChartUtil;
      import hudson.util.ChartUtil.NumberOnlyBuildLabel;
      import hudson.util.DataSetBuilder;
      import java.io.FilenameFilter;
      import org.kohsuke.stapler.StaplerRequest;
      import org.kohsuke.stapler.StaplerResponse;
      
      /**
       * Root object of a performance report.
       */
      public class PerformanceReportMap implements ModelObject {
      
          /**
           * The {@link PerformanceBuildAction} that this report belongs to.
           */
          private transient PerformanceBuildAction buildAction;
          /**
           * {@link PerformanceReport}s are keyed by {@link PerformanceReport#reportFileName}
           *
           * Test names are arbitrary human-readable and URL-safe string that identifies an individual report.
           */
          private Map<String, PerformanceReport> performanceReportMap = new LinkedHashMap<String, PerformanceReport>();
          private static final String PERFORMANCE_REPORTS_DIRECTORY = "performance-reports";
      
          /**
           * Parses the reports and build a {@link PerformanceReportMap}.
           *
           * @throws IOException
           *      If a report fails to parse.
           */
          PerformanceReportMap(final PerformanceBuildAction buildAction, TaskListener listener)
                  throws IOException {
              this.buildAction = buildAction;
              parseReports(getBuild(), listener, new PerformanceReportCollector() {
      
                  public void addAll(Collection<PerformanceReport> reports) {
                      for (PerformanceReport r : reports) {
                          r.setBuildAction(buildAction);
                          performanceReportMap.put(r.getReportFileName(), r);
                      }
                  }
              }, null);
          }
      
          private void addAll(Collection<PerformanceReport> reports) {
              for (PerformanceReport r : reports) {
                  r.setBuildAction(buildAction);
                  performanceReportMap.put(r.getReportFileName(), r);
              }
          }
      
          public AbstractBuild<?, ?> getBuild() {
              return buildAction.getBuild();
          }
      
          PerformanceBuildAction getBuildAction() {
              return buildAction;
          }
      
          public String getDisplayName() {
              return Messages.Report_DisplayName();
          }
      
          public List<PerformanceReport> getPerformanceListOrdered() {
              List<PerformanceReport> listPerformance = new ArrayList<PerformanceReport>(
                      getPerformanceReportMap().values());
              Collections.sort(listPerformance);
              return listPerformance;
          }
      
          public Map<String, PerformanceReport> getPerformanceReportMap() {
              return performanceReportMap;
          }
      
          /**
           * <p>
           * Give the Performance report with the parameter for name in Bean
           * </p>
           *
           * @param performanceReportName
           * @return
           */
          public PerformanceReport getPerformanceReport(String performanceReportName) {
              return performanceReportMap.get(performanceReportName);
          }
      
          /**
           * Get a URI report within a Performance report file
           *
           * @param uriReport
           *            "Performance report file name";"URI name"
           * @return
           */
          public UriReport getUriReport(String uriReport) {
              if (uriReport != null) {
                  String uriReportDecoded;
                  try {
                      uriReportDecoded = URLDecoder.decode(uriReport.replace(
                              UriReport.END_PERFORMANCE_PARAMETER, ""), "UTF-8");
                  } catch (UnsupportedEncodingException e) {
                      e.printStackTrace();
                      return null;
                  }
                  StringTokenizer st = new StringTokenizer(uriReportDecoded,
                          GraphConfigurationDetail.SEPARATOR);
                  return getPerformanceReportMap().get(st.nextToken()).getUriReportMap().get(
                          st.nextToken());
              } else {
                  return null;
              }
          }
      
          public String getUrlName() {
              return "performanceReportList";
          }
      
          void setBuildAction(PerformanceBuildAction buildAction) {
              this.buildAction = buildAction;
          }
      
          public void setPerformanceReportMap(
                  Map<String, PerformanceReport> performanceReportMap) {
              this.performanceReportMap = performanceReportMap;
          }
      
          public static String getPerformanceReportFileRelativePath(
                  String parserDisplayName, String reportFileName) {
              return getRelativePath(parserDisplayName, reportFileName);
          }
      
          public static String getPerformanceReportDirRelativePath() {
              return getRelativePath();
          }
      
          private static String getRelativePath(String... suffixes) {
              StringBuilder sb = new StringBuilder(100);
              sb.append(PERFORMANCE_REPORTS_DIRECTORY);
              for (String suffix : suffixes) {
                  sb.append(File.separator).append(suffix);
              }
              return sb.toString();
          }
      
          /**
           * <p>
           * Verify if the PerformanceReport exist the performanceReportName must to be like it
           * is in the build
           * </p>
           *
           * @param performanceReportName
           * @return boolean
           */
          public boolean isFailed(String performanceReportName) {
              return getPerformanceReport(performanceReportName) == null;
          }
      
          public void doRespondingTimeGraph(StaplerRequest request,
                  StaplerResponse response) throws IOException {
              String parameter = request.getParameter("performanceReportPosition");
              AbstractBuild<?, ?> previousBuild = getBuild();
              final Map<AbstractBuild<?, ?>, Map<String, PerformanceReport>> buildReports = new LinkedHashMap<AbstractBuild<?, ?>, Map<String, PerformanceReport>>();
              while (previousBuild != null) {
                  final AbstractBuild<?, ?> currentBuild = previousBuild;
                  parseReports(currentBuild, TaskListener.NULL, new PerformanceReportCollector() {
      
                      public void addAll(Collection<PerformanceReport> parse) {
                          for (PerformanceReport performanceReport : parse) {
                              if (buildReports.get(currentBuild) == null) {
                                  Map<String, PerformanceReport> map = new LinkedHashMap<String, PerformanceReport>();
                                  buildReports.put(currentBuild, map);
                              }
                              buildReports.get(currentBuild).put(performanceReport.getReportFileName(), performanceReport);
                          }
                      }
                  }, parameter);
                  previousBuild = previousBuild.getPreviousBuild();
              }
              //Now we should have the data necessary to generate the graphs!
              DataSetBuilder<String, NumberOnlyBuildLabel> dataSetBuilderAverage = new DataSetBuilder<String, NumberOnlyBuildLabel>();
              for (AbstractBuild<?, ?> currentBuild : buildReports.keySet()) {
                  NumberOnlyBuildLabel label = new NumberOnlyBuildLabel(currentBuild);
                  PerformanceReport report = buildReports.get(currentBuild).get(parameter);
                  dataSetBuilderAverage.add(report.getAverage(), Messages.ProjectAction_Average(), label);
              }
              ChartUtil.generateGraph(request, response,
                      PerformanceProjectAction.createRespondingTimeChart(dataSetBuilderAverage.build()), 400, 200);
          }
      
          private void parseReports(AbstractBuild<?, ?> build, TaskListener listener, PerformanceReportCollector collector, final String filename) throws IOException {
              File repo = new File(build.getRootDir(),
                      PerformanceReportMap.getPerformanceReportDirRelativePath());
      
              // files directly under the directory are for JMeter, for compatibility reasons.
              File[] files = repo.listFiles(new FileFilter() {
      
                  public boolean accept(File f) {
                      return !f.isDirectory();
                  }
              });
              // this may fail, if the build itself failed, we need to recover gracefully
              if (files != null) {
                  addAll(new JMeterParser("").parse(build,
                          Arrays.asList(files), listener));
              }
      
              // otherwise subdirectory name designates the parser ID.
              File[] dirs = repo.listFiles(new FileFilter() {
      
                  public boolean accept(File f) {
                      return f.isDirectory();
                  }
              });
              // this may fail, if the build itself failed, we need to recover gracefully
              if (dirs != null) {
                  for (File dir : dirs) {
                      PerformanceReportParser p = buildAction.getParserByDisplayName(dir.getName());
                      if (p != null) {
                          File[] listFiles = dir.listFiles(new FilenameFilter() {
      
                              public boolean accept(File dir, String name) {
                                  if(filename == null){
                                      return true;
                                  }
                                  if (name.equals(filename)) {
                                      return true;
                                  }
                                  return false;
                              }
                          });
                          collector.addAll(p.parse(build, Arrays.asList(listFiles), listener));
                      }
                  }
              }
          }
      
          private interface PerformanceReportCollector {
      
              public void addAll(Collection<PerformanceReport> parse);
          }
      }
      

      And then I changed the matching index.jelly to:

      hudson.plugins.performance.PerformanceReportMap/index.jelly
      <j:jelly xmlns:j="jelly:core" xmlns:st="jelly:stapler" xmlns:d="jelly:define"
      	xmlns:l="/lib/layout" xmlns:t="/lib/hudson" xmlns:f="/lib/form">
        <l:layout xmlns:jm="/hudson/plugins/performance/tags" css="/plugin/performance/css/style.css">
        <st:include it="${it.build}" page="sidepanel.jelly" />
          <l:main-panel>
            <j:forEach var="performanceReport" items="${it.getPerformanceListOrdered()}">
              <h2>${%Performance Breakdown by URI}: ${performanceReport.getReportFileName()}</h2>
              <img class="trend" src="./respondingTimeGraph?width=600&amp;height=225&amp;performanceReportPosition=${performanceReport.getReportFileName()}" width="600" height="225" />
              <table class="sortable source" border="1">
                <jm:captionLine />
                <j:forEach var="uriReport" items="${performanceReport.getUriListOrdered()}">
                  <tr class="${h.ifThenElse(uriReport.failed,'red','')}">
                    <td class="left">
                      <a href="./uriReport/${uriReport.encodeUriReport()}">
                        <st:out value="${uriReport.getUri()}" />
                      </a>
                    </td>
                    <jm:summaryTable it="${uriReport}" />
                  </tr>
                </j:forEach>
                <tr class="bold">
                  <td class="left bold">${%All URIs}</td>
                  <jm:summaryTable it="${performanceReport}" />
                </tr>
              </table>
            </j:forEach>
          </l:main-panel>
        </l:layout>
      </j:jelly>
      

      The code could be a bit cleaner and better shared with PerformanceProjectAction, which also generates graphs, but the output is fine, I think.

  7. Oct 07, 2010

    Thomas Grönwall says:

    I generate the jtl files in my JUnit tests, to measure response times in acceptance tests (using Selenium WebDriver).
    I get the Performance Report tables all right for each build in Hudson, but I don't get the trend graphs for the job. I see the names of the files, but the frames for the graphs on the performance trend page are empty. If I click the "Trend report" link, I get a stack trace. The error says that there is a parse error (java.lang.NumberFormatException: null) in my generated files, but not what is wrong. So my guess is that some attribute needed to produce the trend, but not needed for the tables, is missing.
    Here is an example of a file I have generated:

    <?xml version="1.0" encoding="UTF-8"?>
    <testResults version="1.2">
    <sample
     rm="SUCCESS"
     s="true"
     t="154"
     it="0"
     lt="0"
     dt="text"
     de="UTF-8"
     lb="{By.xpath: /*}"
     ts="1286452090735"
    >
    <samplerData class="java.lang.String">se.dreampark.test.BasicSmokeTest</samplerData>
    </sample>
    <sample
     rm="SUCCESS"
     s="true"
     t="5267"
     it="0"
     lt="0"
     dt="text"
     de="UTF-8"
     lb="{WebElement {By.xpath: //*[@id='loadingIcon']} is not visible}"
     ts="1286452090735"
    >
    <samplerData class="java.lang.String">se.dreampark.test.BasicSmokeTest</samplerData>
    </sample>
    <sample
     rm="SUCCESS"
     s="true"
     t="15"
     it="0"
     lt="0"
     dt="text"
     de="UTF-8"
     lb="{By.xpath: //*[@id='loadingIcon']}"
     ts="1286452090735"
    >
    <samplerData class="java.lang.String">se.dreampark.test.BasicSmokeTest</samplerData>
    </sample>
    </testResults>
    
    
    The stack trace:
    
    java.lang.NumberFormatException: null
    	at java.lang.Long.parseLong(Long.java:375)
    	at java.lang.Long.valueOf(Long.java:525)
    	at hudson.plugins.performance.JMeterParser$1.startElement(JMeterParser.java:84)
    	at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.startElement(AbstractSAXParser.java:501)
    	at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanStartElement(XMLDocumentFragmentScannerImpl.java:1359)
    	at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:2747)
    	at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:648)
    	at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:510)
    	at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:807)
    	at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:737)
    	at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:107)
    	at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1205)
    	at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:522)
    	at javax.xml.parsers.SAXParser.parse(SAXParser.java:395)
    	at javax.xml.parsers.SAXParser.parse(SAXParser.java:331)
    	at hudson.plugins.performance.JMeterParser.parse(JMeterParser.java:65)
    	at hudson.plugins.performance.PerformanceReportMap.<init>(PerformanceReportMap.java:60)
    	at hudson.plugins.performance.PerformanceBuildAction.getPerformanceReportMap(PerformanceBuildAction.java:78)
    	at hudson.plugins.performance.PerformanceProjectAction.getTrendReportData(PerformanceProjectAction.java:422)
    	at hudson.plugins.performance.PerformanceProjectAction.createTrendReport(PerformanceProjectAction.java:395)
    	at hudson.plugins.performance.PerformanceProjectAction.getDynamic(PerformanceProjectAction.java:368)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    	at java.lang.reflect.Method.invoke(Method.java:597)
    	at org.kohsuke.stapler.Function$InstanceFunction.invoke(Function.java:259)
    	at org.kohsuke.stapler.Function.bindAndInvoke(Function.java:126)
    	at org.kohsuke.stapler.MetaClass$13.dispatch(MetaClass.java:357)
    	at org.kohsuke.stapler.Stapler.invoke(Stapler.java:534)
    	at org.kohsuke.stapler.MetaClass$13.dispatch(MetaClass.java:359)
    	at org.kohsuke.stapler.Stapler.invoke(Stapler.java:534)
    	at org.kohsuke.stapler.MetaClass$7.doDispatch(MetaClass.java:219)
    	at org.kohsuke.stapler.NameBasedDispatcher.dispatch(NameBasedDispatcher.java:30)
    	at org.kohsuke.stapler.Stapler.invoke(Stapler.java:534)
    	at org.kohsuke.stapler.Stapler.invoke(Stapler.java:450)
    	at org.kohsuke.stapler.Stapler.service(Stapler.java:132)
    	at javax.servlet.http.HttpServlet.service(HttpServlet.java:45)
    	at winstone.ServletConfiguration.execute(ServletConfiguration.java:249)
    	at winstone.RequestDispatcher.forward(RequestDispatcher.java:335)
    	at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:378)
    	at hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:94)
    	at org.jvnet.hudson.plugins.greenballs.GreenBallFilter.doFilter(GreenBallFilter.java:47)
    	at hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:97)
    	at hudson.util.PluginServletFilter.doFilter(PluginServletFilter.java:86)
    	at winstone.FilterConfiguration.execute(FilterConfiguration.java:195)
    	at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:368)
    	at hudson.security.csrf.CrumbFilter.doFilter(CrumbFilter.java:47)
    	at winstone.FilterConfiguration.execute(FilterConfiguration.java:195)
    	at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:368)
    	at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:84)
    	at hudson.security.UnwrapSecurityExceptionFilter.doFilter(UnwrapSecurityExceptionFilter.java:51)
    	at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
    	at org.acegisecurity.ui.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:166)
    	at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
    	at org.acegisecurity.providers.anonymous.AnonymousProcessingFilter.doFilter(AnonymousProcessingFilter.java:125)
    	at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
    	at org.acegisecurity.ui.rememberme.RememberMeProcessingFilter.doFilter(RememberMeProcessingFilter.java:142)
    	at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
    	at org.acegisecurity.ui.AbstractProcessingFilter.doFilter(AbstractProcessingFilter.java:271)
    	at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
    	at org.acegisecurity.ui.basicauth.BasicProcessingFilter.doFilter(BasicProcessingFilter.java:173)
    	at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
    	at org.acegisecurity.context.HttpSessionContextIntegrationFilter.doFilter(HttpSessionContextIntegrationFilter.java:249)
    	at hudson.security.HttpSessionContextIntegrationFilter2.doFilter(HttpSessionContextIntegrationFilter2.java:66)
    	at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
    	at hudson.security.ChainedServletFilter.doFilter(ChainedServletFilter.java:76)
    	at hudson.security.HudsonFilter.doFilter(HudsonFilter.java:164)
    	at winstone.FilterConfiguration.execute(FilterConfiguration.java:195)
    	at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:368)
    	at winstone.RequestDispatcher.forward(RequestDispatcher.java:333)
    	at winstone.RequestHandlerThread.processRequest(RequestHandlerThread.java:244)
    	at winstone.RequestHandlerThread.run(RequestHandlerThread.java:150)
    	at java.lang.Thread.run(Thread.java:619)
    
    1. Oct 14, 2010

      Thomas Grönwall says:

      Today I found the problem. Some of the builds had errors in the generated jtl files. When I removed those builds, it worked. I am still a bit puzzled, since the error above came even when I applied a filter to show the trend for only the latest 2 builds (i.e. not including the builds with corrupt jtl files). That indicates that the plugin reads jtl files from all builds, even if I apply a filter.

  8. Oct 13, 2010

    Link says:

    Is there a way to include Throughput or any other variables into the performance graphs? I understand that information is included in the jtl files, but I'm unsure of a setting to enable Throughput and other metrics and incorporate them in the Hudson performance charts. I'm also very new to the xslt format, so any advice would help immensely.

    Thanks

  9. Nov 30, 2010

    Henri Gomez says:

    What about adding a column with requests/seconds for JMeter scripts ?

  10. Jan 09, 2011

    Senthil M says:

    Is there a way to add/edit graph metrics? I would like to see throughput & 90%ile data and remove min & max information.

  11. Feb 11, 2011

    Christian says:

    Hi everybody,

    I also couldn't get JUnit reports to work; I always got the same results as described above.

    I had a look into the source code and found that the JUnit support does not seem to be finished yet. The way it is implemented right now, JUnit test reports must reside under <build-dir>\performance-reports\hudson.plugins.performance.JUnitParser$DescriptorImpl, but the files are not copied there automatically. I tried correcting the source code, but I still could not get it working completely.

    The easiest way to get this plugin running with JUnit reports is the following: in PerformanceReportMap.java (line 62), replace JMeterParser with JUnitParser, as sketched below. (Note that the plugin then doesn't work with JMeter anymore.) Additionally, you have to disable one unit test, which would otherwise prevent you from building the plugin: at the bottom of PerformancePublisherTest.java, comment out the wc.getPage() calls.

    This is just a workaround; combined support for JMeter and JUnit requires a bit more work. I'd be happy to help if I can!
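
    For reference, the call in question appears in the parseReports method of the PerformanceReportMap source pasted in an earlier comment; the workaround amounts to something like this (a sketch, untested, assuming JUnitParser accepts the same glob-pattern constructor argument as JMeterParser):

        // before: files directly under performance-reports/ are parsed as JMeter results
        // addAll(new JMeterParser("").parse(build, Arrays.asList(files), listener));
        // workaround: force JUnit parsing instead (disables JMeter support, as noted above)
        addAll(new JUnitParser("").parse(build, Arrays.asList(files), listener));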

  12. May 14, 2011

    Henri Gomez says:

    We've got many huge JMeter reports, and Jenkins can lock up while the performance plugin is reparsing them.

    Why not store a copy of the parsed results under the performance-reports folder to save this time-consuming task?

  13. Aug 03, 2011

    yannack bbbb says:

    Hi everyone,

    I have discovered this plugin and it serves me well. It's a great tool!

    I don't use it in a "standard" way though... I hacked up a script which measures the number of SQL queries my test suites make and exports that to a .jtl file. I can then visualize the performance of my code optimization in terms of the number of requests made to the DB. I basically use the delay field of jtl files to insert my request number. So each test produces an entry in a jtl file, and I have 4 jtl files (one each for INSERT, DELETE, UPDATE and SELECT). If anyone is interested, I can share the tools I hacked for this. I use them under Linux with a PostgreSQL database, but I am sure they could be adapted to other databases/OSs as they aren't very complex. These are command line tools, so no need to integrate anything in your code except a system call every now and then to mark the logs. I'll give more details by mail if you are interested.

    Here are a few suggestions for this plugin which would make the overall experience better:

    - could I point to a master JTL file which I would like to put up on the front page (when there is only one JTL file being parsed by the performance plugin it shows up directly there, i would like to be able to pick one of my JTL files as a "master" to put up on the front page)

    - it would also be nice to be able to choose which lines are drawn on the graphs. The 90% line makes less sense for me than the max, for instance. Also, I have no use for the "error" graph: because of the way I hacked the format, I never get any failures

    - it would be great to be able to choose the labels to apply to the graphs, and the units, or at least override the defaults (for me especially, seeing how I mentally translate milliseconds into "number of requests", but for i18n purposes or other stuff, I am sure this would be useful)

    - during a rebuild, the performance trend cannot be seen. This would be nice though.

    - finally, for deployment purposes, it would be great if a performance.hpi could be provided directly instead of having to compile this by hand, this way automatic upgrades could be done from within jenkins, etc.

    I hope I posted these suggestions in the right place,

    Thanks again for a great plugin, really useful!

    Yannack

  14. May 15, 2012

    Bernard Sarter says:

    Hi,

    First, thanks for this very useful plugin !

    In fact, I'm facing an issue when unexpected exceptions are raised during JUnit tests; the JUnit report then looks like the following example:

    <?xml version="1.0" encoding="UTF-8"?>
    <testsuites name="my-tests" tests="9" errors="2" failures="0" ignored="0">
    <testsuite name="foo" time="0.56">
    <testcase name="myTtest" classname="bar" time="0.266">
    java.lang.NullPointerException
    ...
    

    In such a case, I have "errors=2", so I would expect to be able to use the Performance plugin to force the build to be marked as "failed", but I can't.
    Since the header contains "failures=0", the build is always considered OK...

    Would it be possible to add an option to check the "errors" field as well, and to mark the build as failed if the percentage of errors is higher than a given threshold?

    Many thanks,
    Bernard.

    1. Jul 18, 2013

      hbjastad - says:

      Hi Bernard, I completely agree - so I created an issue for it: https://issues.jenkins-ci.org/browse/JENKINS-18811

      Hopefully, this can be prioritized in the near future...though I see that the last release date was April 2012...

      1. Jan 17, 2014

        Kirill Loginov says:

        It would be very helpful to mark the build as failed if the average "Responding Time" (shown on the "Performance Trend" graphs) differs too much from the previous build's.

        (sorry for my English)

  15. Jan 29, 2014

    Philippe Koutsoulis says:

    Hi,

    Thanks for that great plugin :)

    I am using the Performance plugin to display JMeter results.
    Is it possible to modify it to use median values instead of average values when "Performance Per Test Case Mode" is checked?
    For example, by changing public void doRespondingTimeGraphPerTestCaseMode (?)

  16. Mar 26, 2014

    Dmitriy Korobskiy says:

    This is a fantastic plug-in, and I'm rolling it out for my team to capture a performance baseline of our functional tests. I only wish there were some help for relative threshold configuration. Neither the context help nor this page documents it for the latest version.

    What exactly do "Unstable % Range" and "Failed % Range" mean? Do "(-)" values actually mean negative percentages (i.e. performance improvements), and why would I ever want to fail a build based on negative values? Does "-" mean minimum and "+" maximum? Can a build be failed based on test outliers (e.g. a single non-performing test), or on the performance of the entire test suite?

    Figuring these things out is quite hard for me and it's basically trial and error on a 15 minute job.

  17. Mar 26, 2014

    Dmitriy Korobskiy says:

    It took me 11 builds. A bunch of those failed on performance improvements, which does not make sense to me. In the end, I figured out how to configure thresholds to mark builds unstable if performance degrades more than 25%:

    • Mode = Relative Threshold
    • Unstable % Range: -999 to +25
    • Failed % Range: -999 to +999
    • Compare with Build number = a baseline build number, e.g. #18 here
    • Compare based on = Median response time

    I still have a couple of things to clarify. Could you explain:

    1. What is the median response time in this case, exactly? Median between what? There is only one test time for a particular test and a particular build (#18 here), so what does median mean here?
    2. If there are 2 tests with degradation above threshold, it is reported that the second one marked the build unstable. Why not the first one or ideally all tests above the threshold? See sample report from my build below. The first test exceeding threshold of 9% is "Multiple Search Criteria", but only the second one ("Network Analysis Choices") is reported.
    Performance: Percentage of relative difference outside -99.0 to +99.0 % sets the build as failure
    Performance: Percentage of relative difference outside -99.0 to +9.0 % sets the build as unstable
    
    
    
    Performance: Recording JUnit reports 'build/cucumber/*.xml'
    
    Comparison build no. - 18 and 26 using Median response time
    
    
    ====================================================================================================================================
    PrevBuildURI	CurrentBuildURI		PrevBuildURIMed		CurrentBuildURIMed	RelativeDiff	RelativeDiffPercentage
    ====================================================================================================================================
    Search by PI, case-insensitive	Search by PI, case-insensitive		44315			10324			-33991.0		-76.7
    Search by Organization, case-insensitive	Search by Organization, case-insensitive		92196			68074			-24122.0		-26.16
    Warning on all blanks in search	Warning on all blanks in search		1538			1509			-29.0		-1.89
    Search displays a message if nothing found and preserves user input	Search displays a message if nothing found and preserves user input		5097			4444			-653.0		-12.81
    Pagination of PI search results	Pagination of PI search results		18558			18705			147.0		0.79
    Select all	Select all		123646			128703			5057.0		4.09
    Multiple Search Criteria	Multiple Search Criteria		70983			83962			12979.0		18.28
    Topical Analysis Search with Co-PIs	Topical Analysis Search with Co-PIs		39866			42491			2625.0		6.58
    Sorting PI hitlist	Sorting PI hitlist		80044			79765			-279.0		-0.35
    Go to the geospatial analysis	Go to the geospatial analysis		607			608			1.0		0.16
    Geospatial Analysis Choices	Geospatial Analysis Choices		13620			13510			-110.0		-0.81
    Retain last analysis result	Retain last analysis result		3989			4060			71.0		1.78
    Clear last analysis result	Clear last analysis result		4850			4749			-101.0		-2.08
    Go to the network analysis	Go to the network analysis		600			632			32.0		5.33
    Network Analysis Choices	Network Analysis Choices		16895			18872			1977.0		11.7
    Go to the temporal analysis	Go to the temporal analysis		730			650			-80.0		-10.96
    Temporal Analysis Choices	Temporal Analysis Choices		11763			11781			18.0		0.15
    Go to the topical analysis	Go to the topical analysis		630			667			37.0		5.87
    Topical Analysis Choices	Topical Analysis Choices		37696			37631			-65.0		-0.17
    Generate visualizations and see them in history	Generate visualizations and see them in history		18414			18038			-376.0		-2.04
    Warnings for fiscal year selections	Warnings for fiscal year selections		12912			12920			8.0		0.06
    Check geocode for Geospatial Analysis	Check geocode for Geospatial Analysis		14249			14296			47.0		0.33
    "Next" button should be enabled when one or more PIs are selected	"Next" button should be enabled when one or more PIs are selected		10423			10449			26.0		0.25
    ------------------------------------------------------------------------------------------------------------------------------------
    
    The label "Network Analysis Choices" made the build unstable
    Build step 'Publish Performance test result report' changed build result to UNSTABLE
    

    3. Is there ever going to be a difference between PrevBuildURI and CurrentBuildURI?

  18. Apr 08, 2014

    Dmitriy Korobskiy says:

    I have a couple of issues in another job.

    1. Threshold analysis reports only the last unit test even though 22 tests are run and JUnit reports results just fine. Both JUnit publisher and Performance are configured to use build/test-results/*.xml.
    Also, Performance Trend reports on all 22 tests fine.

    Recording test results
    Performance: No threshold configured for making the test failure
    Performance: No threshold configured for making the test unstable
    
    
    
    Performance: Recording JUnit reports 'build/test-results/*.xml'
    
    Comparison build no. - 854 and 859 using Average response time
    
    
    ====================================================================================================================================
    PrevBuildURI	CurrentBuildURI		PrevBuildURIAvg		CurrentBuildURIAvg	RelativeDiff	RelativeDiffPercentage 
    ====================================================================================================================================
    testTopicalAnalysis	testTopicalAnalysis		2621			4197			1576.0		60.13
    ------------------------------------------------------------------------------------------------------------------------------------
    
    The label "testTopicalAnalysis" made the build unstable
    Build step 'Publish Performance test result report' changed build result to UNSTABLE
    

    2. When threshold levels exceed a certain value (99%, I think), "No threshold configured" is reported in the console while thresholds are actually in effect. In this example, the unstable thresholds were set to -999% to 25%, and the build was still set to unstable.