Copyright © 2005 Markus Barchfeld
 Eclipse Corner Article

 

Build and Test Automation for plug-ins and features

Summary
Eclipse offers the possibility to build plug-ins automatically outside the Eclipse IDE, a so-called "headless build". Eclipse itself is built headless, and since Eclipse is an assembly of plug-ins, this capability is also available for any other plug-in. Although the setup of automatic building and testing requires only a couple of files, it can nonetheless be tedious work. This article shares the experiences and lessons learned while setting up automatic building and testing for an Open-Source Eclipse plug-in called RDT, the Ruby Development Tools.

By Markus Barchfeld, Zuehlke Engineering
May 29, 2005


Environment

All the techniques, examples and screenshots covered in this article refer to Eclipse 3.0. If you want to follow the examples, you will need an Eclipse 3.0.x installation. They might work with Eclipse 3.1 as well, but this is untested.

The first part of this article gives an overview of the building steps and artifacts of the headless build and introduces the required files to build RDT as example. The second part shows how the Eclipse Test Framework can be leveraged to extend the automatic build by running a suite of test cases on the built plug-ins.

Part 1 - Automated Build

Build a feature using PDE GUI Tools

Starting the development of a plug-in is quite easy with the PDE environment. There are tutorials and examples available which create sample plug-in projects in the workspace. Compilation is done automatically from the IDE, and running the project is as simple as starting a runtime workbench, thanks to the self-hosting feature of Eclipse.

For deployment you need to create a zip file which contains at least the plugin.xml and a jar file with the classes of the plug-in. The inclusion of further files (e.g. html files, images) can be controlled with the build.properties file; the PDE (Plug-in Development Environment) provides the "Build Properties Editor" for convenient editing of this file. [1] explains the content of build.properties. Fig. 1 shows the menu entry for the creation of an Ant build file. The build file will be called build.xml and will be located in the same directory as the plugin.xml. The build file creation can be controlled with build.properties; some of the properties are described in [2], see also Fig. 5. If you need to customize the build process beyond the possibilities of build.properties, a custom build file can be provided. Of course, the custom build file must provide the same "interface" as the generated build file, i.e. it must contain some mandatory targets, which are explained in [12].
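To illustrate, here is a minimal sketch of a plug-in build.properties; the jar name, source folder and included directories are made up for this example and will differ in your plug-in:

# minimal sketch of a plug-in build.properties (names are examples only)
source.sample.jar = src/
bin.includes = plugin.xml,\
               sample.jar,\
               icons/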

Fig 1. Create Ant Build file


The simplest method of deployment is to choose a zip file as the deliverable and then to unzip the build output into the plugins directory of your target Eclipse installation. More sophisticated deployment options are update sites and product extensions. If the project grows, you may want to split it up into several plug-in projects, at the very least into plug-in projects which contain GUI components and core plug-in projects which do not. They can still be built and deployed separately, but having an Eclipse feature provides additional advantages, so you probably want to wrap up the plug-ins with a feature project.

Fig. 2 shows the Feature Manifest Editor. The Export... button opens the Feature Export Dialog (Fig. 3), with which you can create the deliverable: either a single zip file or the file structure for an update site. Behind the scenes the feature build process simply builds every contained plug-in as described above and then collects the results. All the generated build files are temporary and will be deleted after the build has finished. The build file generation for every contained plug-in works in exactly the same way as if called from the "Create Ant Build File" menu entry, which also means that the build.properties file of every plug-in is considered in the same way as if the plug-in were built standalone.

Additionally, a build.xml file is created for the feature project. Like a plug-in's build file, the build.xml for the feature can be created with the PDE Tools->Create Ant Build File context menu on the feature.xml file. Similar to the plug-ins, there is a build.properties file for the feature which can be used to customize the build process. For example, the files you choose to include in a binary build are defined in the bin.includes property of build.properties. Some of the properties of build.properties can be conveniently set with the Feature Manifest Editor; for instance, the bin.includes property can be assembled on the "Build" tab with a tree control and check boxes.
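For comparison, a minimal sketch of a feature-level build.properties, with only the feature manifest included in the binary build:

# minimal sketch of a feature build.properties
bin.includes = feature.xml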

Although the Feature Export Dialog allows storing the build action in a single build script, it is not yet the solution for a fully automated build process, because this script cannot be run outside of an Eclipse workbench. Its purpose is to save the settings chosen in the Feature Export Dialog and to conveniently rerun the same export by choosing Run->Ant Build ([3]). For building outside of Eclipse, PDE provides some Ant tasks which are explained in [4]. In order to use them you do not have to create Ant scripts from scratch, though: there is already an infrastructure for headless builds which can be leveraged.

Fig 2. Feature Manifest Editor

 

Fig 3. Export Features Dialog

Build headless

There are many reasons for creating a batch build process, such as providing a nightly or continuous build. This requires a fully automatic build which can even start with the retrieval of the sources from CVS. Eclipse itself is built in this way, and because "everything is a plug-in" the same mechanisms can be applied to the build of arbitrary plug-ins. However, it is necessary to bundle your plug-ins into a feature project if you want to use the headless build. Even if there is no real need to deploy your plug-in(s) as a feature, a feature project can still be used for the build process only. In this case no feature manifest will be added to the deliverable.

The infrastructure is provided by the PDE and the RelEng (Release Engineering) plug-ins. The org.eclipse.releng.basebuilder plug-in contains a complete Eclipse installation for the build, but any Eclipse-SDK is sufficient. The org.eclipse.releng.eclipsebuilder plug-in contains control files for the build of parts like JDT, PDE, and all the other plug-ins which are part of the Eclipse distribution.

Although the build.xml provided in org.eclipse.releng.eclipsebuilder looks like a plain Ant build file, it requires Ant tasks which are provided by the PDE. Therefore the script must be executed in an Eclipse environment. Because starting up the Eclipse workbench would not be suitable for batch processing, a "headless" startup mode is provided, which means that Eclipse is started without loading GUI plug-ins. To start up a headless Eclipse instance which then executes an Ant build file, Eclipse must be started as the AntRunner application. An example is given in Fig. 10.

Building sdk.examples headless

As an example, let's build one of the Eclipse deliverables headless. The feature sdk.examples is chosen because it is quite small; the deliverable is about 1.8 MB ([5]). The headless build takes about 10 minutes on my 1.6 GHz Win XP laptop, most of which is spent fetching the sources over a 64k internet connection. In addition to an Eclipse installation there are two prerequisites for the headless build: a command-line CVS client (the cvs command must be available in the path) and internet access to the Eclipse CVS repository at dev.eclipse.org.

The first step is to get org.eclipse.releng.eclipsebuilder:

D:\build>cvs -d :pserver:anonymous@dev.eclipse.org:/cvsroot/eclipse export -r R3_0_2 org.eclipse.releng.eclipsebuilder

The readme.html in org.eclipse.releng.eclipsebuilder ([6]) describes how to build a component. By default, components are built with an Eclipse instance (the build host) defined in org.eclipse.releng.basebuilder. But because the basebuilder is rather large, we can also use an existing Eclipse 3.0.2 installation as the build host. Because it is not assured that an Eclipse 3.0.2 host can compile the current Eclipse components, we use the 3.0.2 version of eclipsebuilder in this example. In order to fetch the 3.0.2 version of sdk.examples instead of the HEAD version, we have to modify org.eclipse.releng.eclipsebuilder/sdk.examples/build.properties and set the property mapVersionTag appropriately:

mapVersionTag=R3_0_2

Now the build can be started from the command line:

D:\build\org.eclipse.releng.eclipsebuilder>set ECLIPSE_HOME=D:\eclipse\eclipse-3.0.2
D:\build\org.eclipse.releng.eclipsebuilder>java -cp %ECLIPSE_HOME%\startup.jar org.eclipse.core.launcher.Main
 -application org.eclipse.ant.core.antRunner -buildfile build.xml 
 -Dcomponent=sdk.examples -Dconfigs="*,*,*" -Dbaseos=win32 -Dbasews=win32 -Dbasearch=x86 -Djavacfailonerror=true
 -Dpde.build.scripts=%ECLIPSE_HOME%/plugins/org.eclipse.pde.build_3.0.1/scripts -DbaseLocation=%ECLIPSE_HOME%

The property "component" is used to define that the files in the sdk.examples subdirectory are used in the headless build, which are build.properties and customTargets.xml. The build takes place in D:\build\org.eclipse.releng.eclipsebuilder\src by default. This can be changed with the property buildDirectory. Either by adding -DbuildDirectory=${basedir}/newDirectory to the command above or by changing buildDirectory in org.eclipse.releng.eclipsebuilder/sdk.examples/build.properties.

After the build has finished, the deliverable and compile logs can be found in the build output directory, a subdirectory of the build directory. The name of the output directory is defined by the buildLabel property. By default it starts with "I-" and includes the timestamp of the build.

The baseLocation is used during the build to provide missing plug-ins, i.e. plug-ins which are not part of the deliverable to build and have therefore not been fetched from CVS. After the plug-ins have been fetched, PDE reads in the plug-ins and creates a build-time Eclipse host. If baseLocation is given, the Eclipse installation at baseLocation is added to this build-time Eclipse host. In our case this is necessary because sdk.examples depends on the plug-ins in the Eclipse SDK and can therefore not be built standalone. Note that the baseLocation is independent of the build host, so the build host can be another version of Eclipse; in the example we could use Eclipse 3.0.2 as the build host and Eclipse 3.0 as the baseLocation. The baseLocation must not contain any of the plug-ins to be built. If the baseLocation contained the sdk.examples, an error would occur.

The next section looks behind the scenes of the headless build.

Build Phases

Fig. 4 shows the files which drive the build process. While build.xml and genericTargets.xml are provided by the PDE, two files have to be provided for the plug-in to be built: customTargets.xml and build.properties. Examples for these two files can be found in org.eclipse.releng.eclipsebuilder\sdk.examples; if you did not download org.eclipse.releng.eclipsebuilder in the last section, see [7]. The build.properties file allows customizing various build parameters, for a reference see Fig. 5. Fig. 6 shows the interaction of the three build files, build.xml, genericTargets.xml and customTargets.xml, which accomplish the build. The main phases of the build process are declared in build.xml: PreBuild, Fetch, Generate, Process, Assemble and PostBuild.

build.xml (in org.eclipse.pde.build_<version>/scripts)
  The main build script. It provides a skeleton for the build process, from the PreBuild to the PostBuild targets.
genericTargets.xml (in org.eclipse.pde.build_<version>/scripts)
  Contains targets like fetchElement, generateScript, processElement and assembleElement.
customTargets.xml (in the location specified by the property builder)
  This build script is included from the main build.xml file and delegates to targets in genericTargets.xml. Its main responsibilities are defining the features to build and fetching the map files.
build.properties (in the location specified by the property builder)
  This property file contains properties for fetching the sources, building and compiling.
Fig 4. Files for controlling the build process


Build folder
  buildDirectory: The relative path to a directory where the source for the build will be exported, where scripts will be generated and where the end products of the build will be located. On Windows systems, this directory should be close to the drive root to avoid path length limitations, particularly at compile time.
  baseLocation: A directory separate from ${buildDirectory} which contains pre-built plug-ins against which to compile. ${baseLocation} must not contain any features, plug-ins or fragments which are already or will be located in ${buildDirectory}.
  baseos, basews, basearch, basenl: The os, ws, arch and nl values of the pre-built Eclipse found in ${baseLocation}.

Build target (what is being built)
  configs: An ampersand-separated list of configurations to build for an element, where a configuration is specified as <os>,<ws>,<arch>, e.g. configs="win32,win32,x86 & linux,motif,x86 & linux,gtk,x86". Typically used to build a feature that is os, ws or arch specific. A non-platform-specific configuration is specified with "*,*,*".
  collectingFolder: The directory in which built features and plug-ins are gathered. This is typically set to "eclipse".
  archivePrefix: The top level directory in the assembled distribution. This is typically set to "eclipse".
  buildType: A letter (I, N, S, R or M) used to identify builds as Integration, Nightly, Stable, Release or Maintenance builds.
  buildId: The build name. Default set to "build" in the template build.properties.
  buildLabel: The name of the directory which will contain the end result of the build. Set to ${buildType}.${buildId} in the template build.properties. This directory will be created in ${buildDirectory}.
  timestamp: A timestamp used to fill in the value for buildId in about.mappings files. Also used to name the build output directory, e.g. I-build-<timestamp>.

Repository
  mapVersionTag: Sets the tag attribute in the call to the Ant <cvs> task which checks out the map file project.
  fetchTag: Sets the tag or branch used when exporting the modules used in the build. For example, setting fetchTag=HEAD will fetch the HEAD stream of the source for all features, plug-ins and fragments listed in the map files instead of fetching the tag specified in the map entry for that element. This is used in the Eclipse build process to produce the nightly build.

Java Compiler
  bootclasspath: Sets the value for the attribute "bootclasspath" in calls to the Ant <javac> task in a plug-in's build.xml.
  javacDebugInfo: Sets the value for the attribute "debug" in calls to the Ant <javac> task in a plug-in's build.xml and determines whether debug info is included in the output jars. Set to on in the template build.properties.
  javacFailOnError: Sets the value for the attribute "failonerror" in calls to the Ant <javac> task in a plug-in's build.xml. The build will continue even if there are compilation errors when this is set to false.
  javacSource: Sets the value for the attribute "source" in calls to the Ant <javac> task in a plug-in's build.xml, i.e. the value of the -source command line switch for javac 1.4. Used when compiling the jars for the plug-ins. Default set to 1.3 in the generated build.xml for plug-ins and fragments.
  javacTarget: Sets the value for the attribute "target" in calls to the Ant <javac> task in a plug-in's build.xml, i.e. the value of the -target command line switch for javac. Used when compiling the jars for the plug-ins. Default set to 1.1 in the generated build.xml for plug-ins and fragments.
  javacVerbose: Sets the value for the attribute "verbose" in calls to the Ant <javac> task in a plug-in's build.xml and asks the compiler for verbose output. Default set to true.

Misc
  zipargs: Arguments to pass to the zip executable. Setting it to -y on Linux preserves symbolic links.
Fig 5. build.properties


Fig 6. Script interaction for the build process

PreBuild and Fetch phase

Fig. 7 gives an overview of the main activities and the files created from these activities for the complete build process. The getMapFiles and concatenate activities are part of the PreBuild phase; fetch and getFromCVS are part of the Fetch phase.

The most important task in the PreBuild phase is to retrieve the map files, which is implemented in the getMapFiles target in customTargets.xml. Usually the getMapFiles target consists of a CVS command to fetch the map files. A map file is a Java property file which contains mappings of elements to their CVS locations and access methods. It consists of one map file entry for each feature being built, its <plugin> elements, and its <includes> elements (i.e. nested features and their plug-ins). Adding a plug-in or fragment to a feature therefore requires updating the map files with the new element.

Map file entries use the following format:

feature|fragment|plugin@elementId=<cvs tag>,<access method>:<cvsuser>@<cvs repository>,<cvs password>[,<repository path> (no starting slash) ]

The <repository path> is only required when the module (or directory) containing the source for the element does not match the elementId or if the directory is not at the root of the repository. Because all the map files are concatenated into a single file called directory.txt after getMapFiles has finished, the format of directory.txt described in [4] also applies to the format of a map file.

The fetch phase starts with the creation of an Ant script called retrieve.xml, which contains a build target for fetching the feature.xml. This target is then executed and the information about the plug-ins provided by the feature is read in. For every provided plug-in a corresponding build target is added to the fetch build script. Both files (retrieve.xml and feature.xml) are only temporary; they will be deleted after the fetch build script has been generated.

The fetch build script is named fetch_<element_id>.xml, where <element_id> is substituted with the feature id read from the feature.xml file (not the property "id" given in customTargets.xml). Running the default target "fetch" of this build script calls the target getFromCVS for the feature itself and all included plug-ins. The sources of a plug-in are not fetched again if its plugin.xml already exists, so if you run the full build twice in the same directory, the sources for the plug-ins are not fetched twice. You need a working CVS connection nonetheless, because the retrieve.xml file is created and executed in any case. Consequently, you would end up with an incomplete fetch_<element_id>.xml file if you ran the fetch phase without a CVS connection.



Fig 7. File flow of the build process

Generate Phase

After the feature and plug-ins have been copied to the build target directory, the build scripts are generated in the same way as they would be with the PDE GUI (see above). The created files are the build.xml for the feature, a build.xml for every contained plug-in, and the assemble script assemble.<id>.all.xml, which is used in the assemble phase described below.

Process and Assemble Phase

At this point of the build process the build files are in place, and the compilation is started by a call to processElement in genericTargets.xml. This in turn calls the target build.jars in the build.xml file in the feature directory. There the compilation starts, and the warning and error messages of the Java compiler are collected separately for each jar file. The files are named <jarFileName>.bin.log and reside in a subdirectory "temp.folder". Later these files are gathered and copied to the build output directory.

The assembly is started from the assembleElement target in genericTargets.xml which then calls a target named assemble.${id} in customTargets.xml. This gives the chance to do some work before or after the assembly, but the main part of the work can just be delegated to the assemble.${id}.all.xml in the build target directory.

The assemble script copies all the build results to a temporary subdirectory in the build target directory and finally zips them up. The files being collected are feature manifest files (feature.xml), plug-in manifest files (plugin.xml) and the built jar files. The inclusion of arbitrary files can be specified with the bin.includes property of the plug-in's build.properties file ([1]).

PostBuild Phase

This is a build hook which is empty by default and allows performing actions with the build output. We will use this later for test automation.

After modifications to the build files you can save time by running only a particular phase of the build process. For example, if you changed a property which only affects the process phase, you can run this phase alone by appending the target name process after the -buildfile argument on the command line:

java -cp %ECLIPSE_HOME%\startup.jar org.eclipse.core.launcher.Main -application org.eclipse.ant.core.antRunner -buildfile %buildfile% process

How to build your Plug-In

Now we have built sdk.examples headless and have an overview of the build process. Build automation for your plug-in is not far away anymore, with one constraint: you need to provide a feature project if you want to fully leverage the headless build as you have seen it for sdk.examples. So if you have not created a feature project yet, now is the time. With the IDE you can create a feature project with File->New->Project...->Plug-in Development->Feature Project. In the wizard which opens, enter a feature name and then select all the plug-ins you want to include in your build. If you do not want to deploy your plug-in(s) as a feature, you can configure the feature project so that it is only used to drive the build process but is not included in the deliverable. To do so, just remove the bin.includes property from the build.properties file of the feature project.

With the feature in place, four files need to be provided in order to get a one-step build: a map file, a customTargets.xml, a build.properties and a run script which starts the headless build.

Fig. 8 shows a part of the map file for RDT. The CVS method used is ext, which allows configuring access parameters like user name, password or SSH proxy locally. In case pserver should be used, the getMapFiles target of customTargets.xml replaces ext with pserver.

 feature@org.rubypeople.rdt=HEAD,:ext:cvs.sf.net:/cvsroot/rubyeclipse,,org.rubypeople.rdt-feature
 plugin@org.kxml2=HEAD,:ext:cvs.sf.net:/cvsroot/rubyeclipse,
 plugin@org.rubypeople.rdt=HEAD,:ext:cvs.sf.net:/cvsroot/rubyeclipse,
 plugin@org.rubypeople.rdt.core=HEAD,:ext:cvs.sf.net:/cvsroot/rubyeclipse,
Fig 8. A section from rdt.map

As seen above customTargets.xml is the build file which provides hooks to customize the build process. A template for customTargets.xml can be found in [9]. Fig. 9 shows the parts of the customTargets.xml file which have to be modified as a minimum for the build process:

  1. Define the feature to be built in the allElements target. The given id is not the feature id as defined in the feature manifest but only an identifier for the lookup in the map file.
  2. Retrieve map files in the getMapFiles target. Usually the map files are fetched from CVS and copied to the maps subdirectory of ${buildDirectory}.
  3. Provide a target called assemble.<id>. This can just delegate to the assemble script which has been created in the generate phase of the build process.
<target name="allElements">
  <ant antfile="${genericTargets}" target="${target}" >
 <property name="type" value="feature" />
  <property name="id" value="org.rubypeople.rdt" />
 </ant>
</target>
<target name="getMapFiles">
    <property name="cvsRoot" value=":ext:cvs.sf.net:/cvsroot/rubyeclipse" />
  <cvs cvsroot="${cvsRoot}"
    dest="${buildDirectory}/maps"
   command="export -r ${mapVersionTag} org.rubypeople.rdt.build/map"/>
</target>
<target name="assemble.org.rubypeople.rdt">
  <ant antfile="${assembleScriptName}" dir="${buildDirectory}">
 <property name="zipargs" value="" />
  </ant>
</target>
Fig 9. Mandatory changes in customTargets.xml

Build RDT as example

All the files which configure the headless build for RDT can be found in [10]. In order to build RDT, you must have an Eclipse 3.0.x installation, CVS installed and CVS access to the internet (firewall!). The sources will be fetched from cvs.sf.net, which provides anonymous access.

First we need to get the bootstrap for the headless build:

cvs -d :pserver:anonymous@cvs.sf.net:/cvsroot/rubyeclipse export -r R0_5_0 org.rubypeople.rdt.build/bootstrap

After a successful retrieval you will find the following files: README, run.bat, run.sh, customTargets.xml and build.properties. The map file specifies that the HEAD version of all plug-ins be fetched. In order to get stable sources, the property fetchTag is set in build.properties and overrides the settings from the map file. See [9] for a description of the build file options. Now change the variables in the run script (run.bat on Windows or run.sh on UNIX) so that they reflect your PC's environment.
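As an illustration, here is a sketch of the kind of settings the run script expects. The variable names vm, eclipseDir and buildfile are taken from the command line shown later in this article; the concrete paths are placeholders which you have to adapt:

rem sketch of run.bat settings (the paths are placeholders, adapt them to your machine)
set vm=java
set eclipseDir=D:\eclipse\eclipse-3.0.2
rem buildfile points to the main PDE build script (see Fig. 4)
set buildfile=%eclipseDir%\plugins\org.eclipse.pde.build_3.0.1\scripts\build.xml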

Now the execution of the run script should build RDT. If it does not, test your skills in handling the lengthy Ant stack traces.

The build.properties file which is provided as a template in [9] produces a result zip file whose paths carry an "eclipse" prefix, e.g. "eclipse/feature/...". This is inconvenient if the zip file is to be extracted into a renamed Eclipse installation directory, e.g. eclipse-3.0.2. Therefore the archive path names should be relative to the Eclipse installation directory, which can be achieved by setting the properties collectingFolder and archivePrefix to ".".
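The corresponding entries in the build.properties of the headless build then look like this:

collectingFolder = .
archivePrefix = .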

Debug the Build Process

If you reach a point where the error messages during the build are more confusing than helpful, you can debug the build run. A first measure is to set the AntRunner to verbose mode: add "-verbose" after the antRunner application argument in the java command line. This gives detailed information about which Ant targets are executed and is particularly useful for debugging problems with accessing CVS. If that does not help, the build run itself can be debugged. Instead of starting the build host from a shell script (.sh or .bat), you can start the build host from the Eclipse IDE with a run configuration. The custom Ant tasks which PDE provides reside in the org.eclipse.pde.build plug-in project, where you can set breakpoints or catch exceptions. Here is a step-by-step guide on how to set up an AntRunner Run-time Workbench:

  1. Start Eclipse and switch to an empty workspace
  2. File>Import, External plug-ins and Fragments
  3. Select "Projects with source folders"
  4. Add all plug-ins with org.eclipse.pde.*
  5. Create a Run-time Workbench run configuration (Run->Run..., Run-time Workbench)
    1. Choose "org.eclipse.ant.core.antRunner" as "Run an application"
    2. Add Program Arguments:
      1. -Dbuilder=$builder, where $builder is the directory where your customTargets.xml is located
      2. -DbaseLocation=$baseLocation, where $baseLocation is the directory of your Eclipse installation (see above)
  6. Add a breakpoint, e.g. in the method "generate" in org.eclipse.pde.build/src-pdebuild/org.eclipse.pde.internal.build.FetchScriptGenerator
  7. Start the debug session
Fig 10. Run configuration for building sdk.examples

Fig. 10 shows the run configuration dialog for building sdk.examples, which we have built from the command line above. The full Program Arguments are:

-buildfile "D:\build\org.eclipse.releng.eclipsebuilder\build.xml" -Dcomponent=sdk.examples -Dconfigs="*,*,*"
-Dbaseos=win32 -Dbasews=win32 -Dbasearch=x86 -Djavacfailonerror=true
-Dpde.build.scripts=D:/eclipse/eclipse-3.0.2/plugins/org.eclipse.pde.build_3.0.1/scripts -DbaseLocation=D:\eclipse\eclipse-3.0.2

Note the location of the build workspace. The log file in D:\build\build-workspace\.metadata\.log often provides useful information if a build fails.

Include sources

Unfortunately the headless build does not support all of the settings provided by the Feature Export Dialog (Fig. 3). For example, the inclusion of source jar files is not possible. This limitation can be worked around by using a source feature or plug-in. Building a source feature is quite convenient and needs only two entries to be added:

  1. Add a Line to build.properties of your feature project:
    generate.feature@org.rubypeople.rdt.source=org.rubypeople.rdt
    
  2. Add an xml element to feature.xml:
    <includes id="org.rubypeople.rdt.source" version="0.0.0"/>
    
    If the version is set to 0.0.0, the version of the containing feature will be used.

Further customization of the source feature generation can be achieved by creating the subdirectories "sourceTemplatePlugin" and "sourceTemplateFeature" in your feature project. These directories should contain the files that are included in the root of the generated source feature and plug-in. The feature.xml and plugin.xml files are not required since these are generated. A build.properties is required in the sourceTemplatePlugin directory. This should contain a "bin.includes" setting as well as the entry "sourcePlugin = true". The plugin.xml file and src/ directory should be listed in bin.includes.
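Putting the entries just described together, the build.properties in sourceTemplatePlugin could look like this minimal sketch:

# build.properties in sourceTemplatePlugin (minimal sketch)
sourcePlugin = true
bin.includes = plugin.xml,\
               src/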

If there is an includes tag in feature.xml, building with the Export... button from the Feature Manifest Editor fails ([16]). As a workaround the feature.xml of the RDT project contains only an XML comment. The comment is replaced with the includes tag in the postFetch target of customTargets.xml. The generate.feature property must be treated in a similar way.
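How the replacement is done is not prescribed; a hedged sketch using the Ant replace task could look like the following. The placeholder comment and the file location are assumptions which you would adapt to your feature:

<target name="postFetch">
  <!-- Replace the placeholder comment in the fetched feature.xml with the includes element.
       The token text is an assumption; use whatever comment your feature.xml really contains. -->
  <replace file="${buildDirectory}/features/org.rubypeople.rdt/feature.xml"
           token="&lt;!-- source.feature.placeholder --&gt;"
           value="&lt;includes id=&quot;org.rubypeople.rdt.source&quot; version=&quot;0.0.0&quot;/&gt;"/>
</target>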

Build for Update Site

If you want to build files for an update site, customTargets.xml can be modified to call the build.update.jar target of the feature build file. The results can then be copied to the update site location. Fig. 11 shows the changes in customTargets.xml, following the idea in [11].

<target name="postBuild">
  <property name="UpdateSiteStagingLocation" value="${buildDirectory}/updateSite"/>
  <property name="sitePackagePrefix" value="org.rubypeople.updatesite"/>
  <antcall target="generateUpdateSite"/>
</target>
<target name="generateUpdateSite">
  <!-- Create the directory structure -->
  <mkdir dir="${UpdateSiteStagingLocation}"/>
  <mkdir dir="${UpdateSiteStagingLocation}/features"/>
  <mkdir dir="${UpdateSiteStagingLocation}/plugins"/>
  <!-- Build the jar files -->
  <antcall target="allElements">
    <param name="genericTargets" value="${builder}/customTargets.xml"/>
    <param name="target" value="updateSiteExport"/>
  </antcall>
  <antcall target="copySiteXmlFromCvs"/>
  <antcall target="createNightlyBuildSiteXml"/>
</target>
<target name="updateSiteExport">
  <ant antfile="build.xml" dir="${buildDirectory}/features/${id}/" target="build.update.jar">
    <property name="feature.destination" value="${UpdateSiteStagingLocation}/features"/>
    <property name="plugin.destination" value="${UpdateSiteStagingLocation}/plugins"/>
  </ant>
</target>
<target name="copySiteXmlFromCvs" unless="isNightlyBuild">
  <!-- connect to CVS and fetch site.xml, copy to ${UpdateSiteStagingLocation}/site.xml afterwards -->
</target>
<target name="createNightlyBuildSiteXml" if="isNightlyBuild">
  <!-- create ${UpdateSiteStagingLocation}/site.xml which contains only the nightly build version -->
</target>
Fig 11. Building for update site
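The site.xml which the last two targets provide is the usual update site manifest. A minimal sketch, in which the feature version and the category name are placeholders, looks like this:

<!-- minimal update site manifest; version and category are placeholders -->
<site>
   <feature url="features/org.rubypeople.rdt_0.5.0.jar" id="org.rubypeople.rdt" version="0.5.0">
      <category name="rdt"/>
   </feature>
   <category-def name="rdt" label="Ruby Development Tools"/>
</site>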

Part 2 - Automated Tests

After the deliverable has been created, the wish may arise to deploy it and to run a test suite with unit and integration tests. The reasons to do this automatically as well, and to present the results along with the deliverable, are manifold.

Fortunately, Eclipse provides a framework for test automation, which can also be leveraged for test automation of arbitrary Eclipse features. The test results of every Eclipse build are publicly available in the Eclipse download directories; see [13] for the test results of the 3.0.2 build.

Eclipse Test Framework

For running the tests on your PC, get eclipse-Automated-Tests-<version>.zip, which contains a zip file with the unit tests (eclipse-junit-tests-<version>.zip), shell scripts, build files and documentation. The steps for running the tests are described in the file testFramework.html ([14]). Basically, the shell script runtests starts up a new AntRunner Eclipse instance which executes one of the targets of the test.xml build file. Therefore an Eclipse installation must exist in the eclipse subdirectory, which is called the build host in the following. If there is no build host installation, the script tries to unzip an SDK zip which it assumes to be in the same directory. In addition, the script extracts the org.eclipse.test plug-in from eclipse-junit-tests-<version>.zip into the build host installation.

In order to run some of the JDT tests, the following steps are necessary:

  1. Download eclipse-Automated-Tests-3.0.2.zip ([15]) and unzip it to $ECLIPSE_AUTOMATED_TESTS
  2. Copy the eclipse-SDK-3.0.2 zip to $ECLIPSE_AUTOMATED_TESTS
  3. cd to $ECLIPSE_AUTOMATED_TESTS and type "runtests.bat jdtui"

The last step executes the non-interactive UI tests for the JDT (there are also interactive UI tests, which require a test person to verify, for example, the layout and interactive behaviour of dialogs). Although the tests are non-interactive, you will watch a workbench come up, and during the execution of the test suite you can witness projects being created and closed, editors being opened and so on. After the test run has finished, the test results are placed in $ECLIPSE_AUTOMATED_TESTS/results in html and xml format.

While the tests run, we can have a look behind the scenes. Fig. 12 lists the build files for test automation; Fig. 13 shows the interaction between them. At first the runtests script creates the build host and launches an Eclipse instance as AntRunner. The AntRunner executes the jdt target of test.xml. During the process another Eclipse installation is created in $ECLIPSE_AUTOMATED_TESTS/test-eclipse, called the test host in the following. The test host consists of the provided SDK and the full content of eclipse-junit-tests-3.0.2.zip. A test host instance is then launched with the application type org.eclipse.test.uitestapplication (target ui-test in library.xml, see below). The class name of the test suite is passed to the uitestapplication. After the test suite has completed, the test results can be found in $ECLIPSE_AUTOMATED_TESTS/results.

test.xml (in the unzip location of eclipse-Automated-Tests-<version>.zip)
  Provides the entry point for testing, the runtests target.
test.xml (in the plug-in directory of the test suite to be run)
  Must provide a run target. Its main purpose is to set properties and delegate to the ui-test target in library.xml. After the tests have run, the collect target in library.xml is called to create the result files. A skeleton can be found in [14].
library.xml (in plugins/org.eclipse.test, delivered in eclipse-junit-tests-<version>.zip, which itself is part of eclipse-Automated-Tests-<version>.zip)
  Contains the eclipse-test target, which launches a new Eclipse instance within which the tests are executed. Tests can be run with or without UI.
Fig 12. Files for controlling the test process



Fig 13. Script interaction for the test process

Automate Tests for your Plug-In

Now it is time to think about the effort needed to automate the tests for your plug-in. At the time I automated the tests for RDT, the tests resided in extra plug-ins and there was an AllTests suite which gathered all the tests. With these preconditions, there were three steps on the road to test automation, each described in one of the following sections:

  1. Build a deployable which includes the tests
  2. Add a test.xml
  3. Add post-build behavior in customTargets.xml

Build a deployable which includes the tests

Because the deliverable for your plug-in (the zip file or update site files) should not contain the tests, another zip file for the tests must be created, called the test deliverable in the following. If you followed the pattern of the Eclipse plug-ins and created a tests plug-in project for each of your plug-ins, this can easily be achieved. If there is more than one test plug-in project, a test feature can be created to bundle them. To do so, you can add another feature project. Alternatively, if you want to keep the number of projects small and do not need to create the feature interactively from the feature manifest file, you can add a subdirectory to the existing feature project. The map file can then be used to customize the fetch process so that a feature directory is created from the content of the subdirectory. An example is given in the first line of Fig. 14.

feature@org.rubypeople.rdt-tests=HEAD,:ext:cvs.sf.net:/cvsroot/rubyeclipse,,org.rubypeople.rdt-feature/unit-tests-feature
plugin@org.rubypeople.rdt.ui.tests=HEAD,:ext:cvs.sf.net:/cvsroot/rubyeclipse,
plugin@org.rubypeople.rdt.debug.core.tests=HEAD,:ext:cvs.sf.net:/cvsroot/rubyeclipse,
...
Fig 14. Section from rdt.map for test feature
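For completeness, here is a sketch (not the actual RDT file) of what the feature manifest in the unit-tests-feature subdirectory might contain; it simply lists the test plug-ins:

<!-- sketch only: a feature manifest which bundles the test plug-ins -->
<feature id="org.rubypeople.rdt-tests" label="RDT Tests" version="0.5.0">
   <plugin id="org.rubypeople.rdt.ui.tests" version="0.0.0"/>
   <plugin id="org.rubypeople.rdt.debug.core.tests" version="0.0.0"/>
</feature>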

Building the test deliverable is identical to building the deliverable itself. If you can use the same build properties as for the deliverable, just add an additional entry to the allElements target in customTargets.xml. Fig. 15 shows how the allElements target is adapted for RDT in order to create the deliverable and test deliverable in one step.

<target name="allElements"><target name="allElements">

  ...
 
  <ant antfile="${genericTargets}" target="${target}" >
    <property name="type" value="feature" />
    <property name="id" value="org.rubypeople.rdt-tests" />
  </ant> 
</target>
<target name="postBuild"> 
  <antcall target="test"></antcall>
</target>

<target name="test">
  <echo message="Setting up tests in ${eclipseAutomatedTestHome}"/>
    
  <copy file="${buildDirectory}/${buildLabel}/org.rubypeople.rdt-${buildId}.zip" tofile="${eclipseAutomatedTestHome}/eclipse-SDK-RDT-${buildId}.zip" />
  <copy file="${buildDirectory}/${buildLabel}/org.rubypeople.rdt-tests-${buildId}.zip" tofile="${eclipseAutomatedTestHome}/eclipse-junit-tests-rdt-${buildId}.zip" />

  <ant antfile="${eclipseAutomatedTestHome}/test.xml" target="runtests" dir="${eclipseAutomatedTestHome}"> 
    <property name="os" value="${baseos}" />
    <property name="ws" value="${basews}" />
    <property name="arch" value="${basearch}" />
    <property name="testPlugin" value="org.rubypeople.rdt.tests.all_0.5.0" /> 
    <property name="report" value="org.rubypeople.rdt.tests.all" /> 
  </ant> 
</target>
Fig 15. Targets from customTargets.xml for testing

Add a test.xml

A test.xml file must exist for every test plug-in you want to run. Because every test run starts up a new Eclipse instance, it is convenient to have a plug-in which bundles all the tests into a single AllTests suite. Then there is only one test run and only one test.xml file has to be maintained. A template for such a test.xml file can be found in [14]. Fig. 16 shows the important targets of the test.xml for the RDT plug-in containing the AllTests suite. The entry point is the run target, which is called from ${eclipseAutomatedTestHome}/test.xml and depends on the targets init, suite and cleanup. The suite target calls ui-test in library.xml with the necessary properties to start up the test host and run TS_RdtAllTests.

 <target name="init">
    <tstamp/>
    <delete>
      <fileset dir="${eclipse-home}" includes="org*.xml"/>
    </delete>
  </target>
      
  <!-- This target defines the tests that need to be run. -->
  <target name="suite">
    <property name="rdt-folder" value="${eclipse-home}/rdt_folder"/>
    <delete dir="${rdt-tests-workspace}" quiet="true"/>
    <ant target="ui-test" antfile="${library-file}" dir="${eclipse-home}">
      <property name="data-dir" value="${rdt-tests-workspace}"/>
      <property name="plugin-name" value="${plugin-name}"/>
      <property name="classname" value="org.rubypeople.rdt.tests.all.TS_RdtAllTests"/>   

      <property name="vmargs" value="-Drdt.rubyInterpreter=&quot;${rdt.rubyInterpreter}&quot;"/>

    </ant>
  </target>
      
 
  <!-- This target holds code to cleanup the testing environment after -->
  <!-- after all of the tests have been run. You can use this target to -->
  <!-- delete temporary files that have been created. -->
  <target name="cleanup">
  </target>
      
  <!-- This target runs the test suite. Any actions that need to happen -->
  <!-- after all the tests have been run should go here. -->
  <target name="run" depends="init,suite,cleanup">
    <ant target="collect" antfile="${library-file}" dir="${eclipse-home}">
      <property name="includes" value="org*.xml"/>
      <property name="output-file" value="${plugin-name}.xml"/>
    </ant>
  </target>
Fig 16. Important targets of org.rubypeople.rdt.tests.all/test.xml

Add post-build behavior in customTargets.xml

Fig. 15 shows the test target of customTargets.xml, which is executed in the PostBuild phase of the build process. The property eclipseAutomatedTestHome is set to the directory into which eclipse-Automated-Tests-<version>.zip has been extracted. The first step in the test target is to copy the two deliverables into this directory, following a naming scheme which is given in ${eclipseAutomatedTestHome}/test.xml. Thanks to these names, the runtests target of test.xml extracts the deliverables into the test host installation (in ${eclipseAutomatedTestHome}/test-eclipse/eclipse). The second step is to call the runtests target of ${eclipseAutomatedTestHome}/test.xml. The property testPlugin defines which tests to run, and the report property defines the name of the generated report.

Compared to the example above, where we ran the JDT UI tests, there is no need to create the build host in $ECLIPSE_AUTOMATED_TESTS/eclipse, because the target runtests in test.xml is called directly from the postBuild target in customTargets.xml. So the build host of the automatic build is reused as the build host for the automatic tests. Fig. 17 shows the difference compared to Fig. 13.

Fig 17. Script interaction for the test process

PDE JUnit tests open up a workbench window. If you run your tests on a remote *nix box, you can use Xvfb (X virtual frame buffer) to provide a dummy environment for displaying windows.
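A minimal sketch of such a setup on the remote machine, assuming display :1 is free, would be:

Xvfb :1 -screen 0 1024x768x24 &   # start a virtual X server on display :1
export DISPLAY=:1                 # let the test workbench open its windows there
./run.sh                          # start the headless build and tests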

Run the RDT tests

If you want to run the RDT tests, follow the instructions above for building RDT and set the additional test-related properties in run.bat or run.sh: eclipseAutomatedTestHome must point to the directory into which eclipse-Automated-Tests-<version>.zip has been extracted, and rdt.rubyInterpreter must point to the Ruby interpreter to be used by the tests.

After you call run.bat or run.sh, the tests will be executed in the postBuild phase. If you want to run the tests separately, you can insert the postBuild target after the -buildfile argument in the antRunner command line:

%vm% -cp %eclipseDir%\startup.jar ... org.eclipse.core.launcher.Main -application org.eclipse.ant.core.antRunner -buildfile %buildfile% postBuild -data ...

Conclusion

Hopefully this article has helped you in setting up project automation for your Eclipse plug-in project. It has shown the automation mechanisms used in building Eclipse itself and how they can be leveraged to build and test arbitrary plug-ins automatically with a single command. Further project automation can be achieved from the inside or from the outside: from the inside by using the hooks provided in the build files for activities like enhanced reporting, and from the outside by calling the build command as part of a broader build process from tools like CruiseControl or Maven.
For the discussion of questions about this article and experiences with Eclipse build and test automation in general, I'd like to invite you to the RDT developer wiki [17] or to e-mail me [18].

Acknowledgments

I would like to thank Sonia Dimitrov and Pascal Rapicault for their article draft, which I have weaved into this article. Many thanks also to Pat McCarthy for his valuable comments.

References

[1] PDE Help on build.properties, Eclipse 3.0 Help
[2] PDE Help on Generating Build Files, Eclipse 3.0 Help
[3] Generated feature build scripts and headless invocation, Eclipse Bugs
[4] PDE Help on headless build, Eclipse 3.0 Help
[5] eclipse-examples-3.0-win32.zip, Eclipse Download Area
[6] org.eclipse.releng.eclipsebuilder/readme.html, Eclipse CVS
[7] org.eclipse.releng.eclipsebuilder/sdk.examples, Eclipse CVS
[8] JDT feature, Eclipse CVS
[9] org.eclipse.pde.build/scripts, Eclipse CVS
[10] RDT bootstrap files for headless build, RDT CVS
[11] Way to automatically build for update site?, Eclipse News
[12] Using PDE Build, Eclipse CVS
[13] Eclipse Test Results for 3.0.2, Eclipse Download Area
[14] testframework.html, Eclipse Download Area
[15] eclipse-Automated-Tests-3.0.2.zip, Eclipse Download Area
[16] generate.feature and building using export button in feature.xml, Eclipse Bugs
[17] feedback page, RDT developer wiki
[18] contact, user page at sf.net

ref: bug 87151

Java and all Java-based trademarks and logos are trademarks or registered trademarks of Sun Microsystems, Inc. in the United States, other countries, or both.