I had understood that "sign" would also pack. By default that does not happen in
our build, and there are no pack.gz files after signing.
Should I do something more specific to get pack?
The signing script does, sort of, run pack before it signs, but only the
"conditioning" part of pack, sometimes called "repack". This makes sure a
signed jar can later be packed in the most efficient way, and that once it is
finally packed it can be unpacked without invalidating the signature. In
other words, it does the things to the jar that have to be done before it is signed.
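In case it helps, the standalone equivalent of that conditioning step is the
JDK's pack200 tool with its repack option (the jar name below is just a
placeholder; your signing script may invoke it differently):

# normalize ("repack") a jar in place so a later pack/unpack round trip
# does not invalidate the signature applied afterwards
pack200 --repack somebundle_1.0.0.jar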
Actually producing the *.pack.gz files is a separate step that you need to do
later. Luckily there is a handy Ant task that will both produce the pack.gz
files and update the repository metadata appropriately:
p2.process.artifacts. I run the task with "antRunner" from Eclipse, since it
is a p2 task. Something like the following:
<project default="processArtifacts" basedir=".">
  <target name="processArtifacts">
    <!-- repoDirLocation is normally set by the caller, but can be set here
         locally if running stand-alone -->
    <property
        name="repoDirLocation"
        value="/opt/public/webtools/committers/wtp-R3.3.0-S/20110127064115/S-3.3.0-20110127064115/repository"/>
    <p2.process.artifacts
        pack="true"
        repositoryPath="file:/${repoDirLocation}"/>
    <!-- clean up some unnecessary files -->
    <delete failonerror="false">
      <fileset
          dir="${repoDirLocation}"
          includes="artifacts.jar.pack.gz, content.jar.pack.gz, site.xml"/>
    </delete>
  </target>
</project>
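For what it's worth, running a buildfile like that with antRunner looks
roughly like the following from a shell (the buildfile name and paths are
placeholders; adjust for your install):

# run the buildfile with Eclipse's headless Ant runner;
# the repo location can be passed in as a property instead of hard-coding it
./eclipse -nosplash -application org.eclipse.ant.core.antRunner \
    -buildfile processArtifacts.xml \
    -DrepoDirLocation=/path/to/your/repository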
Part of the "logic" behind this process is that it is often best to always
produce the signed jars in a repo (some of us do that every build), and to
archive the whole repo at that point, so it contains only metadata and jars.
Then, when producing the final repo that you know will be accessed via http,
run p2.process.artifacts with the 'pack' attribute (a time-consuming
process), since the pack.gz files are mostly useful for saving bandwidth
when downloading over http.
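If you want to spot-check that the round trip really does preserve the
signatures, something like the following on one of the generated files
should do it (the file names here are made up):

# unpack one of the generated pack.gz files and verify the signature still holds
unpack200 plugins/somebundle_1.0.0.jar.pack.gz somebundle_1.0.0.jar
jarsigner -verify somebundle_1.0.0.jar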
HTH