Re: [wtp-releng] Performance Tests


>  - 2.0.2 I-builds - Do I compare the results of a 2.0.2 I-build against the 2.0.1 build?
>    Or against the results of the previous 2.0.2 I-build?
>  - 3.0 M-builds - I will compare 3.0 M1 against 2.0.1. But what about 3.0 M2? Against 3.0 M1, or again against 2.0.1?

2.0.1.
We want a common, steady base measurement to compare against. It's no big deal if there's a little
week-to-week variation, but the goal is to improve on (or at least be no worse than) the previous release.
But ... I also think the base measurements should be re-run each time the regular measurements are taken,
just to make sure the machine is in a steady state.
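
(In case it helps: assuming the tests go through the standard org.eclipse.test.performance framework, the build being measured and the reference build are usually selected with VM properties along these lines, where the host and jvm values are just placeholders for whatever the perf machine uses, and the PerfBuilder scripts may already set them for you:

  -Declipse.perf.config=build=M-2.0.2-20071004053715;host=<perf-machine>;jvm=<jvm>
  -Declipse.perf.assertAgainst=build=R-2.0.1-20070926042742;host=<perf-machine>;jvm=<jvm>

So switching the reference base should just be a matter of pointing eclipse.perf.assertAgainst at a different build id.)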

We could fix that archive issue if anyone thought it worth using 2.0 as the base. That'd be my preference, but it's no big deal if it's too much work.
If it's just a matter of FTP'ing the zip files,
you can use the "mirror form" of the URL to get the file; if it's not found on 'downloads', it will automatically look
in archives. But I'm not sure whether that's the issue you're having.
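
(From memory, and so worth double-checking, the mirror form is the download.php flavor of the URL, something like
  http://www.eclipse.org/downloads/download.php?file=/webtools/downloads/drops/R2.0/R-2.0.1-20070926042742/<zip-file-name>
with the actual zip file name filled in.)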

I also think it'll be easy to fix the 12-hour run time ... just take out any suites that run too long, open a bug on the subproject that owns them,
and let them create a more reasonable test. The total time should be no more than 4 hours, IMHO, and ideally just 1 or 2. The regular
JUnit tests take about 30 minutes.
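
(Whether that means dropping a target from the test scripts or trimming an aggregate suite is up to the subproject. Just as a sketch, with made-up class names rather than any real WTP suite, the JUnit-level version would look like:

  import junit.framework.Test;
  import junit.framework.TestSuite;

  public class ExamplePerfTests {
      public static Test suite() {
          TestSuite suite = new TestSuite("Example perf suite");
          suite.addTestSuite(FastScenarioPerfTest.class);
          // suite.addTestSuite(VeryLongScenarioPerfTest.class); // removed until the owning subproject shortens it
          return suite;
      }
  }
)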

Thanks!





From: "Raev, Kaloyan" <kaloyan.raev@xxxxxxx>
To: <wtp-releng@xxxxxxxxxxx>
Date: 12/03/2007 10:02 AM
Subject: [wtp-releng] Performance Tests





Hello,

I am happy to announce that I have the performance tests set up and running successfully.
I have already executed the perf tests on two builds:

http://download.eclipse.org/webtools/downloads/drops/R2.0/R-2.0.1-20070926042742/
http://download.eclipse.org/webtools/downloads/drops/R2.0/M-2.0.2-20071004053715/

Since all builds older than 2.0.1 have already been moved to the archive site, I cannot use the PerfBuilder tool to execute the perf tests for them. This is why I use the official 2.0.1 release as the "reference base". The perf test results of the first 2.0.2 I-build (M-2.0.2-20071004053715) are compared against the results of that 2.0.1 release.

In the next days I am going to produce perf test results for all declared builds on the WTP download site. One execution lasts around 12 hours. But before that, I want to ask what the convention is for choosing the reference build for the perf results:

        - 2.0.2 I-builds - Do I compare the results of a 2.0.2 I-build against the 2.0.1 build, or against the results of the previous 2.0.2 I-build?

        - 3.0 M-builds - I will compare 3.0 M1 against 2.0.1. But what about 3.0 M2? Against 3.0 M1, or again against 2.0.1?

And one more thing. There are still 5 failures in the j2ee core perf test:
http://download.eclipse.org/webtools/downloads/drops/R2.0/M-2.0.2-20071004053715/perfresults/html/org.eclipse.jst.j2ee.core.tests.performance_.html

The reason is one and the same: "illegal data set: contains neither AVERAGE nor AFTER values".
Is this a bug, or is there still something wrong in my setup?
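
(For reference, and just as a simplified sketch rather than the actual WTP test code, a performance test normally produces those AVERAGE/AFTER data points via the org.eclipse.test.performance API roughly like this; the class and scenario names here are made up:

  import org.eclipse.test.performance.Dimension;
  import org.eclipse.test.performance.PerformanceTestCase;

  public class ExampleCreationPerfTest extends PerformanceTestCase {
      public void testCreateModel() throws Exception {
          tagAsSummary("Example scenario", Dimension.ELAPSED_PROCESS);
          for (int i = 0; i < 10; i++) {
              startMeasuring();   // begin a measured interval
              // ... operation being measured ...
              stopMeasuring();    // record the data points for this interval
          }
          commitMeasurements();   // stores the measured data; without it there are no AVERAGE/AFTER values
          assertPerformance();    // compares against the reference build
      }
  }

So a data set with neither AVERAGE nor AFTER values usually means no measurements were committed for that scenario.)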

Greetings,
Kaloyan Raev

Eclipse WTP Committer
Senior Developer

NW C JS TOOLS JEE (BG)

SAP Labs Bulgaria

T +359/2/9157-416

mailto:kaloyan.raev@xxxxxxx
www.sap.com


