Building Eclipse Plugins with Maven 2

Summary

In a mature and agile development environment, it is vital that developers are kept productive and that builds are done continuously and dependably. Eclipse is a great environment for developers, and Maven 2 (in conjunction with Continuum or CruiseControl) is a great environment for continuous integration. As with most great software, both Eclipse and Maven 2 tend to be somewhat opinionated, and the two don't always see eye to eye on how things should be done. This article describes how to use Maven 2 with Eclipse in general. In particular, we will focus on how to efficiently develop, package and test Eclipse plugins using Eclipse (Callisto) and Maven 2 (2.0.4).

By Peter H. Petersen, Princeton Softech, Inc.
Sumit Gupta, Princeton Softech, Inc.
February 19, 2007

Environment

[Note]

The information provided in this article pertains to the following:

  • Eclipse SDK 3.2.0 (Callisto), along with the Eclipse test framework
  • Maven 2.0.4
  • Continuum 1.0.3
  • CVS for source code management

[Note]

Since we began writing this article, we have actually successfully upgraded Callisto from SDK 3.2.0 to SDK 3.2.1 along with most of the rest of the Callisto projects.

Introduction

Before delving into the nitty-gritty details, let's start with a little bit of background on Eclipse and Maven 2 and also cover the goals and requirements for the build system we put together.

Eclipse

If you've already worked with Eclipse on a daily basis, you can probably skip this section. If not, a little background about how Eclipse likes (or wants) to do things might be helpful.

Like most other IDEs, Eclipse organizes your work into projects. A project is essentially a directory, in which your packages, source files and other resources (like XML files, icons, images and resource bundles) are created and modified. What sets Eclipse apart from most other IDEs is its notion of a workspace. Eclipse cannot function without one and historically it's been the de facto location for your projects to live. The workspace also contains Eclipse's metadata directory (appropriately called .metadata), which contains the settings, preferences and other information that needs to persist between invocations.

In earlier versions of Eclipse, it was not possible to have projects in other locations, but this constraint has mostly been removed in the later versions.

The more serious constraint, with respect to the Eclipse workspace, is that Eclipse wants it to be flat. As a rule of thumb, an Eclipse workspace cannot have Java projects that exist in a hierarchy, so people have gotten used to simply organizing their projects as a flat list of top-level directories and then managing their inter-dependencies through Eclipse's built-in facility for project dependencies and build path management. For most tools and add-ons, this constraint is not necessarily a problem; Ant, for example, is often used in conjunction with Eclipse to do automated builds and testing.

Maven 2

Maven 2 is an elegant project build/management framework that follows the mantra of "Convention over Configuration" made popular by the "Ruby on Rails" community. The premise is that by following the conventions laid out by Maven 2, a developer can very quickly put together a project that can be compiled, tested and packaged in a meaningful way and even get code coverage to boot!

Maven 2 can be used to build pretty much anything. It does this by stepping through a series of phases in an attempt to reach the goal of the build. All these phases put together are called a lifecycle. The phases that come into play for a given lifecycle depend on the kind of library or executable you are building. The default lifecycle for building Java "jar" libraries is as follows:

Figure 1. Maven 2 default lifecycle

[Figure: the default lifecycle phases in order, e.g. validate, compile, test, package, integration-test, install and deploy]

You can find a reference to all the build phases in the Introduction to the Lifecycle guide on the Maven site. As you will see later on, we tie into some of these build phases to facilitate the building and testing of Eclipse plugins.

Following the conventions in Maven 2 mostly boils down to abiding by the directory structure that Maven 2 predefines. In other words, any code in the predefined source directory will be compiled and packaged, any test code in the predefined test source directory will be compiled and run against the code in source, and so on.
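
To make those conventions concrete, here is a minimal sketch of the layout Maven 2 expects for a plain jar project (the annotations are ours; only the directories you actually need have to exist):

    my-project/
        pom.xml                   <- the Project Object Model
        src/
            main/
                java/             <- production code
                resources/        <- resources packaged into the artifact
            test/
                java/             <- unit test code
                resources/        <- resources available on the test classpath
        target/                   <- all build output (generated by Maven 2)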

Maven 2 draws all truth from the Project Object Model (POM), which is defined in a file named pom.xml at the root of your project. We often say in-house, "if it ain't in the POM, it doesn't exist". POMs, and therefore projects, can be arranged in hierarchies. A POM will inherit properties that it does not define itself from its parent. This inheritance feature can obviously be powerful when attempting to manage projects and can reduce duplicated effort, but may irk some of our well-loved IDEs.

Goals and Requirements

When choosing the tools to use in a development and build environment, it's important to understand what the goals are. For example, in some development organizations the primary concern is developer productivity, while "ancillary" disciplines like change and configuration management, build, integration and testing take a back seat to that. In other organizations, it's the other way around, and developers may well suffer from having to work with a difficult or arcane build environment.

Today, many development organizations have managed to come up with a reasonable "middle of the road" tooling and methodology story that keeps the developers productive (and mostly happy) but also ensures that builds can be done repeatedly and with a large degree of confidence and automation.

When we set out to design and implement our build environment, we already knew that we'd want to use Eclipse as the Java IDE, as we're implementing our product's design-time environment as a Rich Client Platform (RCP) application that will also leverage 3rd-party plugins. On the other hand, we also had hard requirements on the overall build system to support an agile development methodology with continuous integration and a very high degree of transparency in terms of the overall state of affairs with the source code and releasable bits. So it was clear that we wanted to completely automate the following:

  • Compilation
  • Unit testing
  • Javadoc generation
  • Code coverage
  • Code style checking
  • Release
Of course, all of this has to happen on several different platforms, each with their own quirks, special requirements and caveats.

Our first conclusion was that relying on Ant to get all this done would be a huge effort - especially in the area of dependency management and transparency.

Our second conclusion was that it would be imperative to not cripple or impede the tools we picked. For example, we wanted to leverage all of Eclipse's Plugin Development Environment (PDE) to ensure developer productivity, but we also wanted to use Maven 2 the way it was meant to be used.

The result is an environment that required very little compromise on either Eclipse's or Maven 2's part.

Using Eclipse and Maven 2 with Standard Packaging

Before getting knee-deep in the intricacies of building Eclipse plugins with Maven 2, we want to cover the details of how to use Eclipse and Maven 2 together in general. So, if you build plain old jars and/or J2EE deployable artifacts, like ears and wars, this section covers most of what you need.

The primary thing that kept us sane (or at least on track), while getting the build environment together, was keeping in mind that from the point of view of Maven 2, there is no Eclipse. Like the kid in The Matrix, telling Neo that "there is no spoon", we repeatedly had to remind each other that "there is no Eclipse."

For most things, this is actually not that big of an issue, but as you'll see below, it's not quite that straightforward for Eclipse plugins.

Location of Source Code

The first, and perhaps most crucial, thing to get in place (and get used to) is the location of the source code. As mentioned above, most Eclipse developers are probably used to simply stashing away all their source in a flat workspace, but Maven 2 ultimately doesn't like that. More accurately: Maven 2 can handle a flat layout, but a lot of Maven 2 plugins and ancillary tools assume that the source is hierarchical (and since Maven 2 inherently understands that, it can give you more help). So, after spending quite a bit of time trying to get all the pieces to play nice in a flat workspace, we grabbed the bull by the horns and went with Maven 2's convention. Since Eclipse ultimately doesn't like to have nested Java projects in the workspace, we decided to keep all of our projects in a location other than Eclipse's workspace.

The net result of this is that we, by convention, keep all our source code in a directory called projects and only use the Eclipse workspace for the metadata. On Windows we keep the projects directory in the root of either the C:\ or D:\ drive (for command-line convenience), and on other operating systems we keep it in the user's home directory. We keep the Eclipse workspace in the default location (usually the user's home directory).

Source Code Layout

The second thing to get used to is having your projects organized in a hierarchy. As mentioned above, a hierarchical project structure is the Maven 2 convention and when adhering to that, Maven 2 can provide you with more help.

By convention, a project created underneath another project becomes a child of that. In Maven 2 terminology the child project becomes a module in the parent project. This is crucial for cutting down on configuration (remember: Maven 2 is all about "Convention over Configuration") and we leverage this capability to both organize and configure our projects. Specifically, we have a root project that holds the "super POM". Underneath that, we have a module for each type of project we need, like J2EE and Eclipse plugins. The actual projects of that type in turn live underneath their respective parent. Artifacts that are platform neutral and whose packaging is jar simply live underneath the root project, as they don't require any further configuration.

Just like in an OO class hierarchy, the common, general information lives at the top and the specialized information lives further down the hierarchy. For example, all the repository and plugin configuration that is common to all the projects sits in the "super POM" and configuration specific to all the Eclipse plugins sits in the Eclipse plugin parent POM. This way, comparatively little configuration is needed in each of the projects that most developers are working on. Specifically, our plugin project POMs only contain the group ID, artifact ID and version of the parent and the plugin itself.
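
As an illustration, a child project's POM then contains little more than its parent's coordinates and its own artifact id. The sketch below is illustrative only - the group and artifact ids and version are made up:

    <project>
        <modelVersion>4.0.0</modelVersion>
        <parent>
            <groupId>com.example.product</groupId>
            <artifactId>product-parent</artifactId>
            <version>1.0.0-SNAPSHOT</version>
        </parent>
        <!-- groupId and version are inherited from the parent -->
        <artifactId>product-util</artifactId>
        <packaging>jar</packaging>
    </project>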

Source Code Management (SCM)

Source code management is largely "business as usual", but a few things must be kept in mind. We happen to use CVS, but other supported SCM systems, such as Subversion, should be usable with the same results and caveats described below.

As mentioned above, we decided to go the hierarchical route. One major contributor to that decision happens to be Continuum, which has some hard and fast assumptions about how things are organized. While trying to use Maven 2 with a flat layout, we found out the hard way that Continuum (at the time of writing, we're using version 1.0.3) does not grok the flat layout. Continuum uses Maven 2 to do the actual build, so it must fetch the latest sources from the source code repository before invoking Maven 2. Each POM may have SCM information in it, but in order to build a top-level project along with all of its modules, Continuum is forced to "calculate" where each of the modules lives, since it can't very well get a module's SCM information out of its POM until it has checked it out of SCM.

The net result of this is that only top-level projects are actual CVS modules; all child projects are simply sub-directories underneath the module. If you, like us, were used to the flat layout with every project being a module, this may take a little getting used to.

One of a few (in our opinion minor) issues we've encountered with this is that sometimes Eclipse gets a little confused about the state of affairs with respect to SCM. We tend to keep the relevant (grand)parent project(s) open in Eclipse as well as the actual projects we write code in, since a majority of the configuration information we keep is stored in the parent POM(s). Since the child projects are sub-directories under their parent, Eclipse essentially ends up with two (or more) views of each project's directory structure: one for the project itself and one underneath the project's directory under its parent. This leads to mixed messages about whether or not a project's code and resources are up to date; and when adding a new child project to CVS, we often end up seeing the newly created CVS directories as new Java packages. More often than not, a quick refresh in the right spot will fix everything, but sometimes we need to close and reopen a project to get rid of the normally hidden CVS directories.

[Tip]

Getting used to hitting F5 in the topmost parent project helps keep things synchronized.

The Maven 2 Plugin for Eclipse

As we point out throughout this document, Maven 2, like Eclipse, is a framework that relies on plugins to do the actual work. In the case of Maven 2, plugins are responsible for things like compilation, testing, code instrumentation (e.g. for code coverage), reporting and packaging. Not surprisingly, Maven 2 plugins can be built in a number of different ways (even using different languages) and, also not surprisingly, you can use Maven 2 to build and package Maven 2 plugins. Maven 2 plugins that are written in Java are made up of one or more so-called Mojos. A Mojo is a Java class that contains some special Javadoc tags (and other goodies) that allow Maven 2 to figure out which parameters it accepts and requires and what phase (or phases) of the lifecycle the class should partake in.

The nice folks at Codehaus have made a bunch of Mojos available for public consumption, amongst which is an Eclipse plugin for Maven 2.

This plugin is one of the primary bridges we use between Maven 2 and Eclipse. After reminding ourselves once more that "there is no spoon" and thus no Eclipse, we never check any Eclipse project artifacts into SCM, i.e. .project, .classpath, .wtpmodules, .components, .settings etc. So once a project is checked out of SCM, you can duly go to the root of the project you are interested in working on and type mvn eclipse:eclipse, which looks at the project's POM file and conveniently generates appropriate .project and .classpath files as well as the rest of the necessary Eclipse artifacts. As mentioned above, we keep all our code in a projects directory, so at this point a developer would use Eclipse's Import -> Existing Projects into Workspace option to get the project into Eclipse.

This whole process allows us to maintain a single source of truth for all project information, the POM.

What Makes Eclipse Plugins a Special Case?

Now that we've covered Eclipse and Maven 2 in general, we can move on to dealing with the Eclipse plugins. What makes Eclipse plugins so different that they need special attention in the first place? Well, it depends on what you're trying to accomplish. If your only goal is to build plugins using Maven 2, you could almost use Maven 2 out-of-the-box, but if you also want to be able to use PDE for launch and debug and have the ability to do automated tests, code coverage, etc., then you need to handle a few things in a way that Maven 2 doesn't support without some tweaking.

[Note]

When we talk about Maven 2, we're actually talking about the available Maven 2 plugins rather than Maven 2 proper. Like Eclipse/RCP, Maven 2 is a framework in which numerous capabilities have been implemented. As you'll see in the following sections, we've solved the plugin special case handling by creating a Maven 2 plugin that does what we need.

Packaging

At face value, an Eclipse plugin is nothing special: it's a jar or directory containing the usual suspects:

  • Compiled classes
  • The manifest
  • I18N/L10N resources
  • Images and icons
and of course the plugin specifics:
  • plugin.xml
  • plugin.properties
possibly along with:
  • Other jars

[Note]

With Eclipse now being OSGi based, the manifest technically is also a plugin-specific resource. The manifest now declares things like name, ID and version of the plugin (or bundle in OSGi speak) as well as the dependencies; all these details used to be expressed in the plugin.xml file.
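
For reference, a minimal sketch of such an OSGi manifest (META-INF/MANIFEST.MF); the plugin ids and version shown here are made up:

    Manifest-Version: 1.0
    Bundle-ManifestVersion: 2
    Bundle-Name: Example Plug-in
    Bundle-SymbolicName: com.example.product.ui; singleton:=true
    Bundle-Version: 1.0.0
    Require-Bundle: org.eclipse.ui, org.eclipse.core.runtime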

In the case where a plugin contains other jars, it is deployed as a directory-based plugin, whereas a plugin that does not have jars and also doesn't need a directory-based home can (and should) be packaged as a single jar.

In the latter case, this maps well to the default Maven 2 packaging type of jar, but Maven 2 really doesn't have the notion of a directory packaging, so it is necessary to use the pom type. The reason for this is that when a plugin exposes packages that live in one or more jars, Maven 2 needs to know about those jars, so that it can satisfy dependencies. Without this, other plugins that depend upon classes in such a jar could never be compiled.

Layout

One of the really nice features of PDE is its ability to do self-hosting of plugins under development. Essentially, PDE can launch a second instance of your development environment that also contains the plugin(s) under development. This behavior makes debugging and testing very easy, but requires that the plugin projects are laid out exactly the way the plugin will be packaged when released.

[Tip]

The default behavior of PDE is to launch the second instance of Eclipse with all the plugins of the development environment along with all plugins under development in that environment. This, of course, can be configured through the launch configuration editor's "Plug-ins" tab.

The challenge here is that Maven 2, by default, expects everything to be neatly tucked away under the src/main/resources directory. If we stuck all the plugin's icons and images along with the plugin.xml and plugin.properties files in that directory, Maven 2 would correctly package the plugin for release but PDE wouldn't be able to launch it correctly, making debugging the plugin virtually impossible.

Dependency Management

Maven 2 must have all the dependencies spelled out in the POM for it to resolve and manage them. In contrast, Eclipse plugins must have their dependencies clearly defined in their manifest file. This is a major point of conflict, but while Maven 2 can be taught otherwise, Eclipse will not budge on this issue. All dependencies must exist in the manifest and they are resolved from there during development as well as at runtime.

Testing

Perhaps the biggest difference between plain old jar projects and an Eclipse plugin project is that of unit testing.

The PDE folks have, of course, done an excellent job of making running, debugging and testing plugins easy. For the most part, there is really no difference between unit testing a set of "normal" classes and a set of plugin classes when working in Eclipse.

The challenge, when using Maven 2 (or Ant or another "external to Eclipse" build infrastructure), is that unit testing has to take place in the context of an Eclipse instance. All but the simplest plugin classes cannot execute correctly without all of the core Eclipse infrastructure around them, along with whatever other Eclipse plugins the plugin in question depends upon.

[Note]

We have decided to not attempt to unit test SWT UI pieces - a decision, by the way, that is orthogonal to our choice of using Maven 2. Regardless of your build environment, Eclipse plugins are usually tested using JUnit and trying to automate SWT events etc. in that environment does not seem productive. By convention, we try to keep the UI layer as thin as humanly possible and keep all the business logic squirreled away in classes that don't interact directly with SWT and only sparsely so with JFace - making it much more natural to test with JUnit. Also, we tend to not directly unit test code that is generated (like EMF Ecore models or Hibernate POJOs) and since we use the very excellent Jigloo GUI builder from Cloudgarden to create all the Composites for our views, wizard pages etc. the amount of direct SWT code we write ourselves is fairly limited. Our company uses dedicated UI testing tools to do automated functional testing of the GUI.

The solution to this, as provided by the Eclipse folks, is the Eclipse test framework. This framework is essentially an Eclipse plugin that provides the ability to run an Eclipse workbench and get it to execute a suite of JUnit tests, using Ant. You can download the test framework from the same location where you get the Eclipse SDK proper.

A final, minor thing needs to be dealt with: Eclipse, of course, uses JNI to provide the underlying implementation for SWT. When working in Eclipse, this is effectively not an issue because the PDE self hosting mechanism makes both the cross-platform SWT API as well as the platform specific implementation available. In a multi-platform environment, the correct SWT plugin needs to be made available for the compile phase, before any testing can take place.
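
One way to express this in Maven 2 - shown here purely as a sketch of the standard profile mechanism, not our exact configuration - is to activate a platform-specific SWT dependency per operating system. The group and artifact ids below are assumptions about how the SWT fragments might be deployed to the repository:

    <profiles>
        <profile>
            <id>swt-win32</id>
            <activation>
                <os><family>windows</family></os>
            </activation>
            <dependencies>
                <dependency>
                    <groupId>com.example.eclipse</groupId>
                    <artifactId>org.eclipse.swt.win32.win32.x86</artifactId>
                    <version>3.2.0</version>
                </dependency>
            </dependencies>
        </profile>
        <profile>
            <id>swt-linux-gtk</id>
            <activation>
                <os><family>unix</family></os>
            </activation>
            <dependencies>
                <dependency>
                    <groupId>com.example.eclipse</groupId>
                    <artifactId>org.eclipse.swt.gtk.linux.x86</artifactId>
                    <version>3.2.0</version>
                </dependency>
            </dependencies>
        </profile>
    </profiles>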

The Solution

Now that you understand the various problems we were facing, you can hopefully appreciate that we didn't just run straight to the keyboard to write some Mojos just because we could (cough, cough).

Just to re-iterate, the solution keeps in mind two key requirements of our development process. The first is that when a developer is writing, compiling and testing code in Eclipse, he or she should not really have to care about Maven 2. The other is that the developer or machine building and testing the Eclipse plugins using Maven 2 should not need to know anything about Eclipse.

[Note]

The build system and infrastructure solution presented here does not yet handle any aspects of Eclipse features. At the time of writing, we believe that features need to be dealt with only during the Maven 2 release phase, which we would like to cover in part 2 of this article.

We realized that the Eclipse plugins we write can be fundamentally put into two categories. We had Eclipse plugins that contained code in them, which we called Source Plugins and we had other Eclipse plugins whose sole purpose was to expose jars that other plugins depended on. We called these Binary Plugins. Just to be completely honest, we did have plugins that contained jars as well as code, but to keep things simple (as we like to do) we decided that it was easy enough to keep our Eclipse plugins purely source or binary.

After some trials with implementing this concept using the default build lifecycle and the predefined packaging types pom and jar, we reverted to using packaging types that we created, called source-plugin (for all Eclipse plugin projects that contained source code) and binary-plugin (for all Eclipse projects that simply exposed jars). This allowed us to control two of the issues of concern, packaging and testing. In order to create our own packaging types, we had to define our own Maven 2 lifecycles. The configuration and some explanation for this can be found in the section called “Lifecycle Definitions”.

We then proceeded to solve all the issues described in the above sections by writing Mojos or by providing configuration to existing Mojos. These are all described in some detail in the following sub-sections. Most of the Mojos we wrote help existing Maven 2 plugins do their work despite the fact that information they need lives in places they do not inherently understand (but Eclipse does). Some Mojos we wrote go the other way and take information from a Maven 2 POM and generate appropriate files that allow Eclipse to obtain the same information.

Maven 2 Plugin (Mojos)

  • Deploy Mojo. (Issue: Maven 2 has to find Eclipse dependencies)

    We set out to use Maven 2 the way it was meant to. Maven 2 does an excellent job of dependency management. All it needs is a groupId, artifactId and a version to locate any artifact in a repository. We were at first diligent about trying to find all the Eclipse plugins that our plugins would depend upon in well known Maven 2 repositories. When that failed, we came up with a Mojo that would scrape all the plugins in an Eclipse installation and deploy them in our internal Maven 2 repository.

    The Deploy Mojo does exactly this. It knows how to scrape an existing Eclipse installation and create a script file that contains all the mvn deploy commands needed to deploy all the jars.

    For a jar based plugin, we simply deploy the jar, using the plugin's id and version as the Maven 2 artifact id and version.

    For a directory based plugin, we deploy a pom for the entire plugin that contains the dependencies on the jars it needs and/or provides. This way, when another plugin depends on it, the POM provides all the jars so that compilation will work.

    Example 1. Eclipse Scrape and Deploy

    protected void doExecute()
            throws MojoExecutionException, MojoFailureException {
        if (!eclipseHome.isDirectory()) {
            throw new MojoExecutionException(
                "The 'eclipseHome' location '" +
                 eclipseHome +
                  "' is not a valid directory");
        }
        if (repositoryHome == null) {
            repositoryHome = new File(
                System.getProperty(KEY_USER_DIR));
            getLog().info("Defaulting 'repositoryHome' to '" +
                repositoryHome + "'");
        }
        if (!repositoryHome.isDirectory()) {
            throw new MojoExecutionException(
                "The 'repositoryHome' location '" +
                repositoryHome +
                "' is not a valid directory");
        }
        if (prefix == null) {
            getLog().info("Defaulting 'prefix' to '<none>'");
        }
        if (prefix != null) {
            String[] temp = prefix.split(",");
            for (int i = 0; i < temp.length; i++) {
                String prefix = temp[i];
                if (!prefix.endsWith(".")) {
                    prefix = prefix + ".";
                }
                prefixes.add(prefix);
            }
        }
        try {
            populateRepository();
        } catch (IOException e) {
            throw new MojoExecutionException(
                "Repository population failed", e);
        }
        try {
            writePOMs();
        } catch (IOException e) {
            throw new MojoExecutionException(
                "POM writing failed", e);
        }
        try {
            createPOMDeploymentScript();
        } catch (IOException e) {
            throw new MojoExecutionException(
                "POM deployment script creation failed", e);
        }
    }
                                

    [Note]

    An Eclipse plugin manifest lets you specify which packages, if any, a plugin actually exposes for other plugins to use, regardless of the actual packages found in all its jars. When we scrape an Eclipse installation we do not honor this information and simply make all the jars available. Since Maven 2's compiler plugin does not have an equivalent capability (compiling against only a subset of the packages in a jar), we decided that this would be a minor issue.
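
    To make the directory-based case concrete, the deployment script ends up deploying a POM along these lines for each such plugin; the group id, version and the nested jar's artifact id are made up for illustration:

    <project>
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.example.eclipse</groupId>
        <artifactId>org.eclipse.datatools.connectivity</artifactId>
        <version>1.0.0</version>
        <packaging>pom</packaging>
        <dependencies>
            <!-- one entry per jar found inside the plugin directory -->
            <dependency>
                <groupId>com.example.eclipse</groupId>
                <artifactId>org.eclipse.datatools.connectivity.lib</artifactId>
                <version>1.0.0</version>
            </dependency>
        </dependencies>
    </project>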

  • POM Modifier Mojo. (Issue: Eclipse plugin must have dependencies in its manifest file)

    For all our source-plugin projects we define the dependencies in the manifest file, keeping Eclipse happy and making developing Eclipse plugins in Eclipse as easy as it should be. Now, all this while we've claimed that "if it ain't in the POM, it doesn't exist", and by putting dependencies in the manifest file, Maven 2 can't resolve any of them for us. The solution here is to dynamically augment the POM, during the build, with information from the Eclipse plugin's manifest. Thus we created this Mojo, which parses the manifest file and modifies the POM (in memory) with the dependencies in the appropriate build phase.

    Example 2. In-Memory POM Modification

    protected void doExecute()
            throws MojoExecutionException, MojoFailureException {
        if (logModifications) {
            logOriginalPom();
        }
        if ((manifest == null) || !manifest.exists()) {
            String path = baseDirectory.getAbsolutePath() +
                File.separator +
                ManifestConstants.MANIFEST_DIRECTORY +
                File.separator +
                ManifestConstants.MANIFEST_FILE_NAME;
            manifest = new File(path);
        }
        if (manifest.exists()) {
            try {
                FileInputStream fileInputStream =
                    new FileInputStream(manifest);
                Manifest manifest = new Manifest(fileInputStream);
                ManifestParser parser = new ManifestParser(manifest);
                parser.setGroupId(mavenProject.getGroupId());
                parser.parse();
                DependencyManagement parsedManagement =
                    parser.getDependencyManagement();
                List dependencies = parsedManagement.getDependencies();
                if (dependencies != null && dependencies.size() > 0) {
                    Iterator iterator = dependencies.iterator();
                    while (iterator.hasNext()) {
                        Dependency dependency =
                            (Dependency) iterator.next();
                        if (dependency.getVersion() == null) {
                            dependency.setVersion(parser.getVersion());
                        }
                        convertToSnapshotVersion(dependency);
                        if (ManifestConstants.SCOPE_SYSTEM.equals(
                            dependency.getScope())) {
                            String systemPath =
                                dependency.getSystemPath();
                            File file = new File(systemPath);
                            if (!file.exists()) {
                                file = new File(baseDirectory,
                                    systemPath);
                                dependency.setSystemPath(
                                    file.getAbsolutePath());
                            }
                        } else {
                            String type =
                                mavenProject.getProperties().
                                    getProperty(
                                        dependency.getArtifactId());
                            if (type != null) {
                                dependency.setType(type);
                            } else {
                                Artifact artifact =
                                    artifactFactory.createBuildArtifact(
                                        dependency.getGroupId(),
                                        dependency.getArtifactId(),
                                        dependency.getVersion(),
                                        "jar");
                                try {
                                    artifactResolver.resolve(
                                        artifact,
                                        mavenProject.
                                            getRemoteArtifactRepositories(),
                                        artifactRepository);
                                    dependency.setType("jar");
                                } catch (Exception e) {
                                    // Do Nothing
                                    // since the default is 'pom'
                                }
                            }
                        }
                        mavenProject.getDependencies().add(dependency);
                    }
                }
            } catch (IOException e) {
                throw new MojoExecutionException(
                    "Failed to update POM from Eclipse Manifest", e);
            }
        } else if (requireManifest) {
            throw new MojoExecutionException(
                "The Manifest file cannot be found.");
        }
        if (logModifications) {
            logModifiedPom();
        }
    }
                                

    We attach this Mojo to the process-resources phase of the build lifecycle as it is right before the compile phase.

    A sticky point in all of this is that manifest file entries for dependencies generally don't contain the same amount of information that Maven 2 POMs do. As mentioned, we use the Eclipse plugin id and version as the Maven 2 artifact id and version. Currently, Eclipse is smart enough to pick a matching, dependent plugin based on the id alone, but Maven 2 still requires a specific version spelled out. Luckily, the OSGi infrastructure does support the presence of a version and the Eclipse manifest editor also has support for entering it, so it is a requirement - for this Mojo to work correctly - that all Eclipse plugin dependencies have a version. Admittedly, this is a tad annoying, but since it's defaulted to 1.0.0 it's easy to recognize when Maven 2 complains about a missing Callisto jar with that version.

    Figure 2. An Eclipse Plugin with Versioned Dependencies

    [Figure: the plugin manifest editor's Dependencies page, with an explicit version entered for each required plugin]
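
    In manifest terms, this boils down to Require-Bundle entries that each carry an explicit version; a sketch with made-up version numbers:

    Require-Bundle: org.eclipse.ui;bundle-version="3.2.0",
     org.eclipse.core.runtime;bundle-version="3.2.0",
     com.example.product.core;bundle-version="1.0.0"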

    Having resolved the version, there is still no groupId and certainly no packaging type to be found. At this point we go by the assumption that all Eclipse plugins we develop and all Eclipse plugins we scrape belong to the same groupId. That leaves us with the problem of finding out what packaging type the dependency is. Plugins come in two flavors: jar ("mapped" to the jar packaging) and directory (mapped to the pom packaging). Since Eclipse and OSGi don't make any distinction between the two, there is no way to specify this in the manifest, so we have solved this issue in two ways:

    • We look for a pom and if that doesn't exist, we assume it's a jar. Regrettably, this can be quite slow, because Maven 2 tries to find all artifacts both locally and remotely.
    • We support the configuration of the plugin packaging in the POM, so before resolving a dependency we check that configuration and if found - we're done. For this we use Maven 2 properties, mapping the artifactId of an artifact to its packaging type. The listing below is a snippet from a pom.xml file.

      Example 3. Properties Map Artifact Packaging

      <properties>
          <org.eclipse.core.runtime>
              jar
          </org.eclipse.core.runtime>
          <org.eclipse.jface>
              jar
          </org.eclipse.jface>
          <org.eclipse.ui>
              jar
          </org.eclipse.ui>
          <org.eclipse.datatools.connectivity>
              pom
          </org.eclipse.datatools.connectivity>
          <org.eclipse.datatools.modelbase.dbdefinition>
              pom
          </org.eclipse.datatools.modelbase.dbdefinition>
          <org.eclipse.datatools.modelbase.sql>
              pom
          </org.eclipse.datatools.modelbase.sql>
          <org.eclipse.datatools.connectivity.sqm.core>
              jar
          </org.eclipse.datatools.connectivity.sqm.core>
          <org.eclipse.datatools.connectivity.sqm.core.ui>
              pom
          </org.eclipse.datatools.connectivity.sqm.core.ui>
          <org.eclipse.datatools.connectivity.sqm.server.ui>
              pom
          </org.eclipse.datatools.connectivity.sqm.server.ui>
          <org.eclipse.datatools.connectivity.ui>
              pom
          </org.eclipse.datatools.connectivity.ui>
          <org.eclipse.datatools.connectivity.ui.dse>
              pom
          </org.eclipse.datatools.connectivity.ui.dse>
          <org.junit>
              pom
          </org.junit>
      </properties>
                                                  

    For troubleshooting purposes, the Mojo usually spits out a before and after image of the POM; that way it's easy to see what it has done and fix any issues there might be.

  • Plugin Packaging Mojos. (Issue: Eclipse plugins need to be packaged a certain way)

    This came mostly as an unfortunate side effect of using our own Maven 2 lifecycle definitions and our own packaging types. The default packaging Mojos have some assumptions about how certain packaging types should be packaged and how others (like pom packagings) can be excluded from the norm. Since there is no extension mechanism for packaging types, we couldn't define our binary-plugin as a kind of pom packaging. Hence these Mojos, which know how to treat our packagings.

    This Mojo, while packaging, will also pick up the manifest file from the META-INF directory located at the root of the plugin project. At the time of writing this article, the default jar packager in Maven 2 would ignore this manifest file even if it was specified as a resource.

    Example 4. Eclipse Plugin Packaging

    protected void doExecute()
            throws MojoExecutionException, MojoFailureException {
        String packaging = project.getPackaging();
        if ((packaging.equals("pom")) ||
            (packaging.equals("binary-plugin"))) {
            getLog().debug(
                "Nothing to do for packaging '" +
                packaging +
                "'");
            return;
        }
        File jarFile = createArchive();
        String classifier = getClassifier();
        if (classifier != null) {
            projectHelper.attachArtifact(
                getProject(),
                "jar",
                classifier,
                jarFile);
        } else {
            getProject().getArtifact().setFile(jarFile);
        }
    }
                                

  • Manifest Generator Mojo. (Issue: Maven wants all dependencies in the POM)

    This Mojo pertains to the plugins with binary-plugin packaging only. Since these plugins are made up of 3rd party jars only, there is no code to maintain, check in or build. In fact, everything needed can be well described in the POM, so that's the only artifact we keep in CVS. For build, packaging and release this is all good, but when testing other plugins that depend on a plugin like this, PDE will have a problem because it requires a valid manifest and all the jars to actually exist in the project.

    The solution here is to get all the project information, stored in the POM, from Maven 2 and have the Mojo create a valid Eclipse plugin manifest on the fly. Also, since the POM spells out the dependencies, the Mojo can copy all the jars from the Maven 2 repository into the right spot in the plugin project, enabling PDE to execute any other plugin that depends on a binary-plugin type plugin.

    Example 5. Eclipse Plugin Manifest Generation

    protected void doExecute()
            throws MojoExecutionException, MojoFailureException {
        File manifestDirectory = new File(
            destinationDirectory,
            MANIFEST_DIRECTORY);
        getLog().debug("The manifestDir is " + manifestDirectory);
        if (!manifestDirectory.exists()) {
            if (!manifestDirectory.mkdir()) {
                throw new MojoExecutionException(
                "Unable to create directory '" +
                manifestDirectory +
                "'");
            }
        }
        libDirectory = new File(destinationDirectory, LIB_DIRECTORY);
        getLog().debug("The libDir is " + libDirectory);
        if (!libDirectory.exists()) {
            if (!libDirectory.mkdir()) {
                throw new MojoExecutionException(
                    "Unable to create directory '" +
                    libDirectory +
                    "'");
            }
        }
        File manifestFile = getManifestFile(manifestDirectory);
        Manifest manifest = new Manifest();
        Attributes mainAttributes = manifest.getMainAttributes();
        writeInitialManifestAttributes(mainAttributes);
        resolveBundleClasspathEntries(mainAttributes);
        writeManifestToFile(manifestFile, manifest);
    }
                                

    [Note]

    The only artifact we ever check in is the POM itself, since everything else either is managed by Maven 2 or can be generated cheaply. Ordinarily we would have preferred to stick the jars and the manifest under the target directory, but PDE wants everything in its proper place, right under the project's directory. This is not really a problem, other than the fact that we have to remember to not check the manifest or the jars into CVS and mvn clean doesn't clean the jars or the manifest. We handle the former partly through process (only a few developers "own" and manipulate the binary-plugin packaged plugins) and partly by keeping .cvsignore up to date. The latter could be handled by tweaking the clean phase, but it has not annoyed us enough yet to do so.

    [Note]

    Eclipse and OSGi allow you to specify which packages in a jar are actually available for other plugins to use. Maven 2 does not have anything equivalent, so when generating the Eclipse plugin manifest on the fly, we introspect all the dependency jars and export all packages in these jars. While there might be scenarios where we really only want some jar's packages exposed and not necessarily packages from any jars it depends upon (and thus are required dependencies), we don't see this as enough of an issue to change the process.

  • PDE Test Mojo. (Issue: Need to be able to run tests in Eclipse)

    Since one of our goals was to use the standard Maven 2 lifecycle and associated phases, we wanted to allow the developers to simply use mvn test for testing and mvn cobertura:cobertura for code coverage analysis. This, of course, is a bit of a challenge, since Maven 2 doesn't know anything about Eclipse and its test framework.

    [Note]

    Since the Eclipse test framework is Ant based, it needs a build file that it can execute. This build file usually relies on a more complex build file, provided by the test framework, so it's actually fairly simple. A typical build file contains targets for initialization; execution of the test suite; cleanup and a target for running the previous three in the right order. The init, cleanup and run targets are pretty much the same for all plugins but the suite target contains an Ant call (to the provided build file) for each test. Maintaining this target, of course, is a manual and error-prone task.

    The solution, provided by this Mojo, is to automatically orchestrate the entire test phase - and do it in such a way that the rest of the Maven 2 build infrastructure can work with it.

    Example 6. The PDE Test Orchestration

    protected void doExecute()
            throws MojoExecutionException, MojoFailureException {
        if (!needsToRunTests()) {
            return;
        }
        setupDefaults();
        PDETestEclipseValidator validator =
            new PDETestEclipseValidator(getLog(),
                baseDirectory,
                eclipseDirectory,
                prefixes,
                testFrameworkVersion);
        validator.execute();
        File targetDirectory = new File(baseDirectory,
            PDETestConstants.TARGET_DIRECTORY);
        File testDirectory = new File(targetDirectory,
            PDETestConstants.PDE_TEST_DIRECTORY);
        if (!testDirectory.exists()) {
            if (!testDirectory.mkdirs()) {
                throw new MojoExecutionException(
                    "Unable to create '" +
                    PDETestConstants.PDE_TEST_DIRECTORY +
                    "' directory '" +
                    testDirectory +
                    "'");
            }
        }
        PDETestEclipseCleaner cleaner = new PDETestEclipseCleaner(
            getLog(),
            baseDirectory,
            eclipseDirectory,
            prefixes);
        cleaner.execute();
        File buildFile = new File(testDirectory, "test.xml");
        PDETestFileBuilder builder = new PDETestFileBuilder(
            getLog(),
            baseDirectory,
            eclipseDirectory,
            prefixes,
            project,
            buildFile,
            testOutputDirectory,
            project.getArtifactId(),
            testFrameworkVersion);
        builder.execute();
        PDETestDeployer deployer = new PDETestDeployer(
            getLog(),
            baseDirectory,
            eclipseDirectory,
            prefixes,
            project,
            session);
        deployer.execute();
        PDETestRunner runner = new PDETestRunner(
            getLog(),
            baseDirectory,
            eclipseDirectory,
            prefixes,
            buildFile,
            testFailureIgnore);
        runner.execute();
    }
                                

    [Note]

    The helper classes, like PDETestEclipseValidator used by the Mojo, are purposefully modeled closely after a Mojo proper. However, since they're not actually Mojos, they don't require the extra Javadoc-based markup and can be used more freely.

    The first step, done by the Mojo, is to validate the Eclipse test instance. Since the test framework cannot do self hosting, like PDE, a second Eclipse instance is required for testing. By convention, we keep that in the user's home directory, but the location can be configured. So before doing anything else, a sanity check is made to ensure that there is a valid Eclipse instance with the correct version of the test framework installed.

    Example 7. The PDE Test Eclipse Validation

    protected void doExecute()
            throws MojoExecutionException, MojoFailureException {
        File eclipseDirectory = getEclipseDirectory();
        getLog().info(
            "Validating target Eclipse environment in '" +
            eclipseDirectory +
            "'...");
        if (!eclipseDirectory.isDirectory()) {
            throw new MojoExecutionException(
            "The Eclipse directory location '" +
            eclipseDirectory +
            "' is not a valid directory");
        }
        File startupJarFile = getStartupJarFile();
        getLog().debug(
            "Eclipse startup jar must be in '" +
            startupJarFile +
            ".");
        if (!startupJarFile.exists()) {
            getLog().debug(
                "The file '" +
                startupJarFile +
                "' does not exist.");
            throw new MojoExecutionException(
            "The required startup jar file was not found in '" +
            eclipseDirectory +
            "'");
        }
        File pluginsDirectory = getPluginsDirectory();
        getLog().debug(
            "Eclipse plugins directory must be in '" +
            pluginsDirectory +
            ".");
        if (!pluginsDirectory.isDirectory()) {
            getLog().debug(
            "The location '" +
            pluginsDirectory +
            "' does not exist or is not a directory.");
            throw new MojoExecutionException(
                "The required plugins directory was not found in '" +
                eclipseDirectory +
                "'");
        }
        getLog().debug(
            "The Eclipse PDE test framework version is '" +
            testFrameworkVersion +
            "'.");
        File testPluginDirectory = new File(
            pluginsDirectory,
            "org.eclipse.test_" +
            testFrameworkVersion);
        getLog().debug(
            "The Eclipse PDE test framework plugin must be in '" +
            testPluginDirectory +
            "'.");
        if (!testPluginDirectory.isDirectory()) {
            getLog().debug(
                "The Eclipse PDE test framework plugin does not exist.");
            throw new MojoExecutionException(
                "The required Eclipse test framework plugin '" +
                "org.eclipse.test_" +
                testFrameworkVersion +
                "' was not found in '" +
                pluginsDirectory +
                "'");
        }
    }
                                

    The second step, done by the Mojo, is to clean out any plugins deployed for a previous test. The Mojo has a configurable parameter that specifies the plugin prefix(es) under test. These are the plugins not provided by Callisto. If not configured, the default is the first two segments of the artifact id of the artifact under test (which usually ends up being the reverse of your top-level internet domain name). For example, if you're testing a plugin with id com.princetonsoftech.foo.bar, the default list of prefixes will contain a single entry of com.princetonsoftech. If you're not using the reverse internet domain notation, the default list will contain a single entry with the entire artifact id (in which case you probably will need to configure the list). Any plugin that has an id that matches any of the prefix(es) is simply deleted (so do not configure that list with org.eclipse as that will render your Eclipse installation fairly useless).
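
    A minimal sketch of how that default could be derived from the artifact id (illustrative only; the actual helper in our Mojo may differ slightly):

    /**
     * Derives the default plugin prefix from an artifact id, e.g.
     * "com.princetonsoftech.foo.bar" yields "com.princetonsoftech".
     */
    private String deriveDefaultPrefix(String artifactId) {
        String[] segments = artifactId.split("\\.");
        if (segments.length < 2) {
            // Not reverse-domain style; fall back to the entire artifact id.
            return artifactId;
        }
        return segments[0] + "." + segments[1];
    }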

    Example 8. The PDE Test Cleanup

    protected void doExecute()
            throws MojoExecutionException, MojoFailureException {
        File eclipseDirectory = getEclipseDirectory();
        getLog().info(
            "Cleaning target Eclipse environment in '" +
            eclipseDirectory +
            "'...");
        File pluginsDirectory = getPluginsDirectory();
        getLog().debug(
            "Looking for plugins in '" +
            pluginsDirectory +
            "'...");
        File[] files = pluginsDirectory.listFiles();
        for (int i = 0; i < files.length; i++) {
            String name = files[i].getName();
            if (!isMatchForPrefix(name)) {
                getLog().debug("Skipping plugin '" + name + "'.");
                continue;
            }
            getLog().info("Cleaning plug-in '" + name + "'...");
            if (files[i].isDirectory()) {
                deleteDirectory(files[i]);
            } else {
                deleteFile(files[i]);
            }
        }
    }
                                

    The third step, done by the Mojo, is to locate all JUnit tests and automatically construct the required Ant build file that will be used by the test framework. The Mojo pretty much follows the same conventions as the normal Surefire plugin and will pick anything that starts or ends with Test and doesn't start with Abstract. In the next release of the Mojo, we'll make that configurable. A small twist here is that Cobertura relies on the test phase to execute the tests (which is good, by the way, since it makes code coverage analysis as easy to orchestrate as a test is), so we need to detect if the Cobertura run-time jar is needed. We do this simply by looking for it in the list of dependencies; if found, we add it to the boot classpath of Eclipse for each test launch. This is conveniently and easily done in the build file, using the vmargs property. Also, when running with Cobertura, the cobertura.ser file needs to be moved to the home directory of Eclipse and it needs to be moved back after the run. In either case, the JUnit XML report files, needed by Surefire for formatting, need to be copied to the plugin's target/surefire-reports directory. Since Ant is so excellent at doing this, we simply construct the build file with all the necessary file operations.

    Example 9. The PDE Test Ant File Construction

    protected void doExecute()
            throws MojoExecutionException, MojoFailureException {
        getLog().info(
            "Building 'test.xml' for PDE testing in '" +
            file +
            "'...");
        findCoberturaJar();
        buffer.append("<?xml version=\"1.0\"?>\n\n");
        buffer.append("<!-- Auto-generated file - DO NOT MODIFY! -->\n\n");
        buffer.append("<project name=\"");
        getLog().debug("The artifact id is '" + artifactId + "'.");
        buffer.append(artifactId);
        buffer.append("\" default=\"run\" basedir=\"");
        buffer.append(getBaseDirectory().getAbsolutePath());
        buffer.append("\">\n");
        appendProperties();
        appendInitTarget();
        List testClasses = new ArrayList();
        findTestClasses(directory, testClasses);
        appendSuiteTarget(testClasses);
        appendCleanupTarget();
        appendRunTarget();
        buffer.append("</project>\n");
        try {
            FileWriter writer = new FileWriter(file);
            writer.write(buffer.toString());
            writer.close();
        } catch (IOException e) {
            throw new MojoExecutionException(
                "Unable to create 'test.xml'", e);
        }
    }
                                

    [Note]

    Since we always construct the build file from scratch during each test run, we stick it in target/pde-test, since we obviously never check it in or otherwise manipulate it manually.

    You can take a look at a sample test.xml file, built for a Cobertura analysis, here.
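
    For readers without access to that sample, a heavily abbreviated sketch of what the generated file looks like follows; the library.xml location, target names and property names are our assumptions about the Eclipse test framework rather than a verbatim copy:

    <?xml version="1.0"?>
    <!-- Auto-generated file - DO NOT MODIFY! -->
    <project name="com.example.product.ui" default="run" basedir=".">
        <property name="eclipse-home" value="${user.home}/eclipse"/>
        <property name="library-file"
            value="${eclipse-home}/plugins/org.eclipse.test_3.1.0/library.xml"/>

        <target name="init">
            <!-- delete results from any previous run -->
        </target>

        <target name="suite">
            <!-- one <ant> call per test class found by the Mojo -->
            <ant target="core-test" antfile="${library-file}" dir="${eclipse-home}">
                <property name="plugin-name" value="com.example.product.ui"/>
                <property name="classname" value="com.example.product.ui.FooTest"/>
            </ant>
        </target>

        <target name="cleanup">
            <!-- copy the JUnit XML reports to target/surefire-reports -->
        </target>

        <target name="run" depends="init,suite,cleanup"/>
    </project>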

    The fourth step, done by the Mojo, is to deploy the plugin under test along with all of its dependencies. This is actually a fairly easy thing to do, since Maven 2 has all the information needed.

    Example 10. The PDE Test Artifact Deployment

    protected void doExecute()
            throws MojoExecutionException, MojoFailureException {
        getLog().info(
            "Deploying test artifacts and dependencies to target " + 
            "Eclipse environment in '" +
            getEclipseDirectory() +
            "'...");
        deployPluginArtifact();
        deployDependencies();
    }
    
    /**
     * Deploys the plugin artifact itself.
     * @throws MojoExecutionException
     */
    private void deployPluginArtifact() throws MojoExecutionException {
        getLog().debug("Deploying the plugin artifact...");
        File targetDirectory = new File(
            getBaseDirectory(),
            PDETestConstants.TARGET_DIRECTORY);
        File testDirectory = new File(
            targetDirectory,
            PDETestConstants.PDE_TEST_DIRECTORY);
        String artifactName =
            project.getArtifactId() + "-" +
            project.getVersion() +
            DeployConstants.EXTENSION_JAR;
        File sourceFile = new File(testDirectory, artifactName);
        File pluginsDirectory = getPluginsDirectory();
        File destinationFile = new File(pluginsDirectory, artifactName);
        try {
            copyFile(sourceFile, destinationFile);
        } catch (IOException e) {
            throw new MojoExecutionException(
            "Unable to deploy plugin artifact", e);
        }
    }
                                

    As is the case for the pre-test cleanup step, the Mojo needs to know which dependent plugins to look for and which ones are provided. So the same prefix list configuration (or suitable default) is employed to take care of that. A jar based plugin, with the source-plugin packaging, is easy, because it involves nothing more than copying a jar from the Maven 2 repository to the Eclipse plugins directory. A directory based plugin is a little more complex because in our world that always implies that its packaging is binary-plugin and, as outlined above, the only artifact we keep around for those is the pom.xml file. As also mentioned earlier, we need to reify these plugins on the fly to enable PDE launch, debug and test, so the PDE test Mojo reuses that code to create each binary plugin directly in the Eclipse plugins directory.

    Example 11. The PDE Test Dependency Deployment

    /**
     * Deploys the plugin artifact's dependencies.
     * @throws MojoExecutionException 
     * @throws MojoFailureException 
     */
    private void deployDependencies()
            throws MojoExecutionException, MojoFailureException {
        getLog().debug("Deploying the dependent artifacts...");
        Iterator artifacts = project.getDependencyArtifacts().iterator();
        while (artifacts.hasNext()) {
            Artifact artifact = (Artifact) artifacts.next();
            getLog().debug("Processing artifact: " + artifact);
            String scope = artifact.getScope();
            if (!Artifact.SCOPE_COMPILE.equals(scope)) {
                getLog().debug(
                    "Skipping artifact with '" +
                    scope +
                    "' scope.");
                continue;
            }
            if (isMatchForPrefix(artifact.getArtifactId())) {
                String type = artifact.getType();
                if (DeployConstants.PACKAGING_JAR.equals(type)) {
                    deployJar(artifact);
                } else if (DeployConstants.PACKAGING_POM.equals(type)) {
                    deployPom(artifact);
                } else {
                    getLog().warn(
                        "Don't know how to deploy artifact with type '" +
                        type +
                        "'.");
                }
            }
        }
    }
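
    The deployJar() and deployPom() helpers are not listed here. As a rough sketch - reusing the copyFile() and getPluginsDirectory() helpers seen above, and assuming the artifact has been resolved so that getFile() points at the jar in the local Maven 2 repository - deploying a jar based dependency is little more than a copy (deployPom() instead reuses the binary plugin reification code mentioned above):

    /**
     * Sketch: deploys a jar based dependency by copying it from the
     * local repository into the Eclipse plugins directory.
     * @throws MojoExecutionException
     */
    private void deployJar(Artifact artifact) throws MojoExecutionException {
        getLog().debug("Deploying jar for artifact: " + artifact);
        File sourceFile = artifact.getFile();
        File destinationFile = new File(
            getPluginsDirectory(),
            sourceFile.getName());
        try {
            copyFile(sourceFile, destinationFile);
        } catch (IOException e) {
            throw new MojoExecutionException(
                "Unable to deploy dependency '" + artifact + "'", e);
        }
    }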
                                

    The fifth, and last, step for the Mojo is to launch Eclipse with the newly constructed build file. For that purpose, we've created a simple program runner class that uses the standard Java Runtime.exec() method to launch a Process, along with a couple of Threads to read the standard output and error streams.

    Example 12. The PDE Test Execution

    protected void doExecute()
            throws MojoExecutionException, MojoFailureException {
        String command = createCommand();
        ProgramRunner runner = new ProgramRunner(command);
        try {
            runner.execute();
        } catch (IOException e) {
            throw new MojoExecutionException(
                "The Eclipse launch failed", e);
        } catch (InterruptedException e) {
            throw new MojoExecutionException(
                "The Eclipse execution was interrupted", e);
        }
        extractErrorsAndFailures();
    }
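
    The ProgramRunner class itself is not listed here. The following is a minimal sketch of what such a runner might look like, assuming a blocking execute() that launches the process with Runtime.exec() and drains the two streams on separate threads; the actual class may well differ in detail:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.io.PrintStream;

    public class ProgramRunner {
        private final String command;

        public ProgramRunner(String command) {
            this.command = command;
        }

        /** Launches the command and blocks until the process has completed. */
        public int execute() throws IOException, InterruptedException {
            Process process = Runtime.getRuntime().exec(command);
            Thread out = drain(process.getInputStream(), System.out);
            Thread err = drain(process.getErrorStream(), System.err);
            int exitCode = process.waitFor();
            out.join();
            err.join();
            return exitCode;
        }

        /** Starts a thread that copies the given stream to the target, line by line. */
        private Thread drain(final InputStream stream, final PrintStream target) {
            Thread thread = new Thread(new Runnable() {
                public void run() {
                    try {
                        BufferedReader reader =
                            new BufferedReader(new InputStreamReader(stream));
                        String line;
                        while ((line = reader.readLine()) != null) {
                            target.println(line);
                        }
                    } catch (IOException e) {
                        // The process has most likely terminated; nothing to do.
                    }
                }
            });
            thread.start();
            return thread;
        }
    }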
                                

    One minor twist we had to take care of here is the underlying operating system and architecture. The Eclipse launcher usually takes care of this, but since the test framework command line requires the user to specify these values explicitly, the Mojo has a few simple methods to determine the OS and architecture values.
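
    As a sketch, these values can be derived from the standard os.name and os.arch system properties (the method names and the exact set of platforms covered by the real Mojo may differ):

    /** Sketch: maps os.name onto the OS value expected by the test framework. */
    private String getOs() {
        String name = System.getProperty("os.name").toLowerCase();
        if (name.indexOf("windows") >= 0) {
            return "win32";
        } else if (name.indexOf("mac") >= 0) {
            return "macosx";
        }
        return "linux";
    }

    /** Sketch: maps os.arch onto the architecture value expected by the test framework. */
    private String getArch() {
        String arch = System.getProperty("os.arch").toLowerCase();
        return (arch.indexOf("86") >= 0) ? "x86" : arch;
    }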

    When using Maven 2 with Surefire, you'll notice that in addition to producing the text files used for reporting, it also prints the test statistics on standard out. The Eclipse test framework's Ant file doesn't do this, so the Mojo reads all the JUnit XML files and formats the pertinent content in a way similar to Surefire. Lastly, from an Ant perspective each test execution is a successful build as long as the Ant file runs without error - regardless of whether or not any test failures or errors occurred. This can be confusing for developers who are used to Surefire, so in addition to listing the test results on standard out, the Mojo also checks whether there were any failures or errors - and if so, it fails the Maven build.

    Example 13. The PDE Test Dump and Extract

    /**
     * Extracts errors and failures from the surefire reports.
     * @throws MojoExecutionException
     * @throws MojoFailureException
     */
    private void extractErrorsAndFailures()
            throws MojoExecutionException, MojoFailureException {
        getLog().info(
            "Extracting errors and failures from surefire reports");
        System.out.println(
            "-------------------------------------------------------");
        System.out.println(" T E S T S ");
        System.out.println(
            "-------------------------------------------------------");
        File targetDirectory = new File(
            getBaseDirectory(),
            PDETestConstants.TARGET_DIRECTORY);
        File surefireDirectory = new File(
            targetDirectory,
            PDETestConstants.SUREFIRE_DIRECTORY);
        if (!surefireDirectory.isDirectory()) {
            getLog().warn(
                "The surefire reports directory '" +
                surefireDirectory +
                "' was not created.");
            return;
        }
        File[] files = surefireDirectory.listFiles();
        if ((files == null) || (files.length == 0)) {
            getLog().warn("No surefire reports were moved.");
            return;
        }
        try {
            DocumentBuilderFactory factory =
                DocumentBuilderFactory.newInstance();
            builder = factory.newDocumentBuilder();
        } catch (Exception e) {
            throw new MojoExecutionException(
                "Unable to create XML document builder", e);
        }
        for (int i = 0; i < files.length; i++) {
            String name = files[i].getName();
            if (name.endsWith(".xml")) {
                extractErrorsAndFailures(files[i]);
            }
        }
        if ((failures > 0) || (errors > 0)) {
            if (testFailureIgnore) {
                getLog().info("Ignoring test failures");
                return;
            }
            StringBuffer buffer = new StringBuffer();
            buffer.append("There are test ");
            if (failures > 0) {
                buffer.append("failures");
                if (errors > 0) {
                    buffer.append(" and ");
                }
            }
            if (errors > 0) {
                buffer.append("errors");
            }
            buffer.append(".");
            throw new MojoFailureException(buffer.toString());
        }
    }
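
    The per-file overload of extractErrorsAndFailures() is not listed above. Roughly, it parses each JUnit report with the document builder created earlier and adds the counts from the testsuite element to the running totals, echoing a Surefire-like summary line. A sketch, using the org.w3c.dom API (the real method likely formats more of the report's content):

    /**
     * Sketch: extracts the counts from a single JUnit XML report.
     * @throws MojoExecutionException
     */
    private void extractErrorsAndFailures(File file)
            throws MojoExecutionException {
        try {
            Document document = builder.parse(file);
            Element testsuite = document.getDocumentElement();
            int fileTests = Integer.parseInt(testsuite.getAttribute("tests"));
            int fileFailures = Integer.parseInt(testsuite.getAttribute("failures"));
            int fileErrors = Integer.parseInt(testsuite.getAttribute("errors"));
            System.out.println(
                "Tests run: " + fileTests +
                ", Failures: " + fileFailures +
                ", Errors: " + fileErrors +
                " - " + testsuite.getAttribute("name"));
            failures += fileFailures;
            errors += fileErrors;
        } catch (Exception e) {
            throw new MojoExecutionException(
                "Unable to parse test report '" + file + "'", e);
        }
    }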
                                

Lifecycle Definitions

Maven 2 allows you to specify custom lifecycle definitions. To do this, all you need to do is place a file called components.xml in your Maven 2 plugin jar under META-INF/plexus and then configure your plugin as an extension. The definitions you provide take the form of Plexus components.

For our solution we essentially define a set of components that know how to handle artifacts with the packaging source-plugin or binary-plugin, and also know what lifecycle phases these artifacts should have for their default lifecycle.

  • source-plugin. An Eclipse plugin that contains source code

    This Lifecycle definition is meant for all Eclipse plugins that you are developing that contain any kind of source. We put the following definition in a components.xml in our plugin.

    [Note]

    When you define your own packaging, Maven 2 by convention will install the artifact with the extension being the new packaging type. So in this case it would be .source-plugin. We play a little trick here and say that the extension for source-plugin artifacts should be jar. That way, any Java tooling still knows what the artifact is and can compile against it.

    Example 14. Components entry for source-plugin

    <component>
        <role>
            org.apache.maven.lifecycle.mapping.LifecycleMapping
        </role>
        <role-hint>source-plugin</role-hint>
        <implementation>
            org.apache.maven.lifecycle.mapping.DefaultLifecycleMapping
        </implementation>
        <configuration>
            <lifecycles>
                <lifecycle>
                    <id>default</id>
                    <phases>
                        <process-resources>
                            org.apache.maven.plugins:maven-resources-plugin:resources
                        </process-resources>
                        <compile>
                            org.apache.maven.plugins:maven-compiler-plugin:compile
                        </compile>
                        <process-test-resources>
                            org.apache.maven.plugins:maven-resources-plugin:testResources
                        </process-test-resources>
                        <test-compile>
                            org.apache.maven.plugins:maven-compiler-plugin:testCompile
                        </test-compile>
                        <test>
                            org.apache.maven.plugins:maven-psteclipse-plugin:test
                        </test>
                        <package>
                            org.apache.maven.plugins:maven-psteclipse-plugin:package
                        </package>
                        <install>
                            org.apache.maven.plugins:maven-install-plugin:install
                        </install>
                        <deploy>
                            org.apache.maven.plugins:maven-deploy-plugin:deploy
                        </deploy>
                    </phases>
                </lifecycle>
            </lifecycles>
        </configuration>
    </component>
                                    

  • binary-plugin. An Eclipse plugin that only contains jars and no source code

    This Lifecycle definition is meant for all Eclipse plugins that expose the dependencies that your source-plugins need. They do not contain any kind of source. We put the following definition in a components.xml in our plugin.

    [Note]

    Again, as in the source-plugin case, we play a similar trick and define the extension of all binary-plugin artifacts to be pom. That way, we only ever have POMs or jars to deal with.

    Example 15. Components entry for binary-plugin

    <component>
        <role>
            org.apache.maven.lifecycle.mapping.LifecycleMapping
        </role>
        <role-hint>binary-plugin</role-hint>
        <implementation>
            org.apache.maven.lifecycle.mapping.DefaultLifecycleMapping
        </implementation>
        <configuration>
            <lifecycles>
                <lifecycle>
                    <id>default</id>
                    <phases>
                        <package>
                            org.apache.maven.plugins:maven-site-plugin:attach-descriptor
                        </package>
                        <install>
                            org.apache.maven.plugins:maven-psteclipse-plugin:install
                        </install>
                        <deploy>
                            org.apache.maven.plugins:maven-deploy-plugin:deploy
                        </deploy>
                    </phases>
                    <optional-mojos>
                        <optional-mojo>
                            org.apache.maven.plugins:maven-site-plugin:attach-descriptor
                        </optional-mojo>
                    </optional-mojos>
                </lifecycle>
            </lifecycles>
        </configuration>
    </component>
    
    <component>
        <role>
            org.apache.maven.artifact.handler.ArtifactHandler
        </role>
        <role-hint>source-plugin</role-hint>
        <implementation>
            org.apache.maven.artifact.handler.DefaultArtifactHandler
        </implementation>
        <configuration>
            <extension>jar</extension>
            <type>source-plugin</type>
            <packaging>source-plugin</packaging>
            <language>java</language>
            <addedToClasspath>true</addedToClasspath>
        </configuration>
    </component>
    
    <component>
        <role>
            org.apache.maven.artifact.handler.ArtifactHandler
        </role>
        <role-hint>binary-plugin</role-hint>
        <implementation>
            org.apache.maven.artifact.handler.DefaultArtifactHandler
        </implementation>
        <configuration>
            <extension>pom</extension>
            <type>binary-plugin</type>
            <packaging>binary-plugin</packaging>
            <language>java</language>
            <addedToClasspath>true</addedToClasspath>
        </configuration>
    </component>
                                    

Configuration

We have arranged our project hierarchy so that all our Eclipse plugin projects sit under one parent. We call this our Eclipse Plugin Super POM. This has been convenient as all configuration that plugin projects require can be defined once.

Besides the configuration for the Maven 2 plugin that we provide (described later in the section called “POM Configuration”), the only other configuration necessary for all of this to work is that of the resources:

Example 16. Resources configuration for Eclipse plugins

<resources>
    <resource>
        <directory>src/main/java</directory>
        <includes>
            <include>**/*.properties</include>
            <include>**/*.xml</include>
        </includes>
    </resource>
    <resource>
        <directory>.</directory>
        <includes>
            <include>plugin.xml</include>
            <include>plugin.properties</include>
            <include>model/**</include>
            <include>icons/**</include>
        </includes>
    </resource>
</resources>
                    

Putting It All Together

Now that we have gone through, in some detail, what the issues are and how we went about resolving them, you probably just want to know how to pull it all together. In brief, you need our Mojos, you need to configure your POM(s), and you need to provide an Eclipse installation for testing purposes, separate from the Eclipse you use for development. The reason for this is that the test framework requires that all the plugins under test (and their dependencies) are installed. Unlike PDE, it cannot (yet) magically add the plugin(s) under development without touching the Eclipse installation.

Getting the Mojos

The Mojos are available under the Apache 2.0 license at http://www.princetonsoftech.com/eclipse/mojos.zip

Installing the Test Eclipse Instance

If you follow the convention over configuration mantra, install a Callisto release into your home directory. I.e. on Windows your Eclipse directory would be C:\Documents and Settings\username\eclipse; on Unix/Linux/BSD it would be /home/username/eclipse; and on Mac OS X it would be /Users/username/eclipse. The reason for this convention is not a Maven 2 one, but rather the need to support the Eclipse test framework on multiple platforms. Between Windows, Unix/Linux and Mac OS X, the user's home directory is simply a convenient common denominator.
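
A minimal sketch of how this platform-neutral default can be computed (the actual default of the eclipseDirectory parameter in our Mojos may be expressed differently):

// Sketch: the conventional default location of the test Eclipse instance
// and its plugins directory.
File eclipseDirectory = new File(System.getProperty("user.home"), "eclipse");
File pluginsDirectory = new File(eclipseDirectory, "plugins");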

Besides the Callisto SDK, you will need eclipse-test-framework-3.2.zip unzipped on top of your installation, as well as the set of base Eclipse plugins that you depend on (DTP, WTP, EMF, GEF etc.)

[Tip]

The Eclipse test framework provides a lot of the same JUnit infrastructure that is already provided by the SDK, so depending on your unzipper, it may prompt you for permission to overwrite certain files. Simply accept all the overwrites and proceed.

And that is pretty much it. If you decide to install Eclipse somewhere else, simply specify the eclipseDirectory parameter in the plugin configuration in the POM.

[Note]

Please do not attempt to use your normal Eclipse instance for Maven 2 testing. The plugins you test, along with their dependencies, will repeatedly be deployed into - and cleaned from - the plugins directory and may therefore wreak havoc on your development environment. Since the test instance is only ever used for the automated testing, the repeated deploy/clean cycles happen only when it is not actually running.

POM Configuration

To have some chance of success, you will need to stick the following configuration in your Eclipse plugin projects' POM, or in a super POM for all your Eclipse plugin projects. You can also take a look at all the below configuration put together in a super POM here.

Configuring the PST Maven 2 Eclipse plugin

Once you have obtained our plugin (we are calling it the "PST Maven 2 Eclipse plugin" for now) you can configure it the following way.

[Note]

Please take special note of the <extensions/> element. If you wind up writing your own plugin that contains a components.xml, you will need to specify the configuration element <extensions>true</extensions>.

Example 17. PST Eclipse configuration

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-psteclipse-plugin</artifactId>
    <version>1.0.6</version>
    <extensions>true</extensions>
    <configuration>
        <logModifications>true</logModifications>
        <testFrameworkVersion>3.1.0</testFrameworkVersion>
    </configuration>
    <executions>
        <execution>
            <id>test-package</id>
            <phase>test-compile</phase>
            <goals>
                <goal>testPackage</goal>
            </goals>
        </execution>
        <execution>
            <id>update</id>
            <phase>process-resources</phase>
            <goals>
                <goal>update</goal>
            </goals>
        </execution>
        <execution>
            <id>update-site-classpath</id>
            <phase>pre-site</phase>
            <goals>
                <goal>update</goal>
            </goals>
        </execution>
    </executions>
</plugin>
                        

Configuring the Maven 2 Eclipse plugin

To get the Maven 2 Eclipse plugin (i.e. mvn eclipse:eclipse) to generate the correct Eclipse project artifacts for your Eclipse plugins you will need to configure it the following way.

Example 18. Eclipse plugin configuration

<plugin>
    <artifactId>maven-eclipse-plugin</artifactId>
    <configuration>
        <projectnatures>
            <projectnature>
                org.eclipse.jdt.core.javanature
            </projectnature>
            <projectnature>
                org.eclipse.pde.PluginNature
            </projectnature>
        </projectnatures>
        <buildcommands>
            <buildcommand>
                org.eclipse.jdt.core.javabuilder
            </buildcommand>
            <buildcommand>
                org.eclipse.pde.ManifestBuilder
            </buildcommand>
            <buildcommand>
                org.eclipse.pde.SchemaBuilder
            </buildcommand>    
        </buildcommands>
        <classpathContainers>
            <classpathContainer>
                org.eclipse.pde.core.requiredPlugins
            </classpathContainer>
        </classpathContainers>
    </configuration>
</plugin>
                        

Adding Profiles for various platforms

This configuration mainly aids our test Mojos in loading dependencies without having to know which platform they are running on. Maven 2 profiles come in handy for writing platform-neutral Mojos.

Example 19. Sample Profile Configuration

<profiles>
    <profile>
        <id>org.eclipse.swt.win32</id>
        <activation>
            <os>
                <name>windows xp</name>
                <arch>x86</arch>
            </os>
        </activation>
        <dependencies>
            <dependency>
                <groupId>com.princetonsoftech.apollo</groupId>
                <artifactId>org.eclipse.swt.win32.win32.x86</artifactId>
                <version>3.2.1</version>
                <type>jar</type>
            </dependency>
        </dependencies>
    </profile>
    <profile>
        <id>org.eclipse.swt.macosx</id>
        <activation>
            <os>
                <name>mac os x</name>
                <arch>i386</arch>
            </os>
        </activation>
        <dependencies>
            <dependency>
                <groupId>com.princetonsoftech.apollo</groupId>
                <artifactId>org.eclipse.swt.carbon.macosx</artifactId>
                <version>3.2.1</version>
                <type>jar</type>
            </dependency>
        </dependencies>
    </profile>
    <profile>
        <id>org.eclipse.swt.linux</id>
        <activation>
            <os>
                <name>linux</name>
                <arch>i386</arch>
            </os>
        </activation>
        <dependencies>
            <dependency>
                <groupId>com.princetonsoftech.apollo</groupId>
                <artifactId>org.eclipse.swt.gtk.linux.x86</artifactId>
                <version>3.2.1</version>
                <type>jar</type>
            </dependency>
        </dependencies>
    </profile>
</profiles>
                        

Using it all together

Just to round things off, the following is a brief description of the development process for Eclipse binary and source plugins. The descriptions below assume that all configuration/installation specified in the previous section has been done and that all the projects sit in a hierarchy.

Developing binary plugins

You can begin by creating a Maven 2 project using the standard archetype. Do this under the project that contains the POM with all the configuration described in the previous section.

mvn archetype:create -DgroupId=com.mycompany.app -DartifactId=my-binary-plugin
This should give you the basic template for your project.

Now once you do a

mvn eclipse:eclipse
you will be able to import the project into Eclipse. Open up the POM file (pom.xml) in Eclipse and change the <packaging> element to <packaging>binary-plugin</packaging>. Also add, directly to the POM file, all the dependencies that you want the binary plugin to expose. It may look like the example below when you are done:

Example 20. Example binary plugin POM

<project xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
        http://maven.apache.org/maven-v4_0_0.xsd">
    <parent>
        <artifactId>com.princetonsoftech.eclipse.plugin</artifactId>
        <groupId>com.princetonsoftech.eclipse</groupId>
         <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.mycompany.app</groupId>
    <artifactId>my-binary-plugin</artifactId>
    <name>Example Binary Plug-in</name>
    <version>1.0-SNAPSHOT</version>
    <packaging>binary-plugin</packaging>
    <dependencies>
        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate</artifactId>
            <version>3.1.3</version>
        </dependency>
    </dependencies>
</project>
                    

Before other Eclipse plugins can depend upon this newly created binary plugin, you must create all the PDE artifacts from the POM. To do this, type

mvn psteclipse:eclipse-plugin
This will copy all the dependencies defined in the POM into a lib directory and generate an OSGi manifest file (MANIFEST.MF) in META-INF. Before Eclipse knows what PDE artifacts to export and make available to other plugins, you must open the manifest file in Eclipse's PDE manifest editor and select the "Update the classpath and the compiler compliance settings" link on the editor's "Overview" tab.
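
To give an idea of what gets generated, the following is a purely illustrative sketch of writing such a manifest with the standard java.util.jar classes. The headers and values produced by the psteclipse:eclipse-plugin goal will differ in detail (it presumably derives the Bundle-ClassPath entries from the dependencies it copied into lib):

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.jar.Attributes;
import java.util.jar.Manifest;

public class ManifestSketch {
    /** Writes a minimal OSGi manifest for a binary plugin (illustrative only). */
    public static void writeManifest(File baseDirectory) throws IOException {
        Manifest manifest = new Manifest();
        Attributes attributes = manifest.getMainAttributes();
        attributes.put(Attributes.Name.MANIFEST_VERSION, "1.0");
        attributes.putValue("Bundle-ManifestVersion", "2");
        attributes.putValue("Bundle-SymbolicName", "com.mycompany.app.my-binary-plugin");
        attributes.putValue("Bundle-Version", "1.0.0");
        attributes.putValue("Bundle-ClassPath", "lib/hibernate-3.1.3.jar");
        File metaInf = new File(baseDirectory, "META-INF");
        metaInf.mkdirs();
        FileOutputStream out = new FileOutputStream(new File(metaInf, "MANIFEST.MF"));
        try {
            manifest.write(out);
        } finally {
            out.close();
        }
    }
}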

Figure 3. Eclipse PDE manifest editor


[Note]

Whenever you change the POM for a binary plugin, the above steps must be executed again in order to copy the correct dependent artifacts, recreate the manifest, and update the classpath and compiler compliance settings.

[Note]

The PDE test Mojo, described earlier, leverages this exact same mechanism when deploying the dependent plugins for a source plugin being tested, but rather than creating the PDE artifacts in your project directory, it creates them directly in the plugins/ directory in your test Eclipse instance.

When doing a Maven 2 build of a binary plugin, the behavior seen is that of the "pom" packaging type. What this means is that no actual compilation, packaging or testing is performed, but all dependencies are resolved and provided to the dependent plugins.

Developing source plugins

Creating a source plugin is essentially similar to creating a binary one. Simply do

mvn archetype:create -DgroupId=com.mycompany.app -DartifactId=my-source-plugin
and do a
mvn eclipse:eclipse

As with the binary plugin, import the project into Eclipse and change the packaging. However, since this is a source plugin, you must change the packaging to source-plugin. Your source plugin POM will end up looking something like the following:

Example 21. Example source plugin POM

<project xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
        http://maven.apache.org/maven-v4_0_0.xsd">
    <parent>
        <artifactId>com.princetonsoftech.eclipse.plugin</artifactId>
        <groupId>com.princetonsoftech.eclipse</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.mycompany.app</groupId>
    <artifactId>my-source-plugin</artifactId>
    <name>Example Source Plug-in</name>
    <version>1.0-SNAPSHOT</version>
    <packaging>source-plugin</packaging>
</project>
                    

[Note]

Please remember that source plugins do not ever have dependencies listed in the POM. You must add them as OSGi dependencies in the manifest. The dependencies are injected into the in-memory representation of the POM during the build.
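
For illustration only, here is a sketch of reading those OSGi dependencies back out of the manifest with the standard java.util.jar API; how our Mojos actually map the bundle names onto Maven 2 artifacts and inject them into the in-memory POM is not shown:

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.jar.Manifest;

public class ManifestDependencyReader {
    /** Returns the raw Require-Bundle header of the plugin, or null if absent. */
    public String readRequiredBundles(File baseDirectory) throws IOException {
        File manifestFile = new File(baseDirectory, "META-INF/MANIFEST.MF");
        FileInputStream in = new FileInputStream(manifestFile);
        try {
            Manifest manifest = new Manifest(in);
            return manifest.getMainAttributes().getValue("Require-Bundle");
        } finally {
            in.close();
        }
    }
}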

From now on you simply develop and test your Eclipse plugin in PDE as per usual. When building the plugin using Maven 2, the normal build lifecycle comes into effect. For example, to do a clean install of your plugin, do

mvn clean install
and to get a Cobertura code coverage report, do
mvn cobertura:cobertura

Acknowledgements

Inspired by the excellent article by Chris Aniszczyk and Lawrence Mandel of IBM, we decided to write this article in DocBook. We took the liberty of further refining their kindly donated XSL stylesheet to include the "Eclipse Corner Article" banner and image. Please find the stylesheet here. [The stylesheet has since been further updated. - ed]

External Links

Legal Notices

Java and all Java-based trademarks are trademarks of Sun Microsystems, Inc. in the United States, other countries, or both.

Linux is a trademark of Linus Torvalds in the United States, other countries, or both.

Windows is a trademark of Microsoft Corporation in the United States, other countries, or both.

UNIX is a registered trademark of The Open Group in the United States and other countries.

Other company, product, or service names may be trademarks or service marks of others.