RE: [cdt-dev] CDT Conference call - Build Model Docs

Thanks Alain,
You've raised some issues I had not considered. My comments below.

> I want to point out a difficulty: not all framework environments
> can be coerced into doing this that easily. For example, in an
> environment that uses recursive makefiles to get the information on
> how to build a specific resource, you pretty much have to walk the
> entire Makefile structure (reimplementing GNU make) to get to the
> right rule. Sometimes the info can be fetched by a dry-run or a
> "make -p", as you pointed out below.
> 
> # make --dry-run foo.o
> gcc -D_MIPS_ -c src/foo_stub.c -o foo.o
> 
> Some environment builds are a mix of shell scripts, other build
> frameworks ride on top of databases, etc.
> I've seen customers drop excellent IDEs because it was not possible
> to use them in their framework, and I would like to see flexibility
> in the CDT.
> 
> 

Ugh! Too true about the cobbled-together build environments. To add to the
list, there are a lot of GNU projects using the automake/autoconf tools as
well. I keep coming back to a key requirement for the parser (which in turn
provides the information we need for indexing, searching, code assist,
refactoring, etc.): knowing the include paths and defines for a particular
project configuration. To get this from existing projects, we were thinking
there are two options for such users. The first is to remain an "unmanaged"
system. Under that scenario, users would have to enter environment
information, include paths, and build macros manually, and maintain them
whenever they changed. Is that a pain point? You bet! However, it
represents the lowest level of information we can get away with storing
and still offer those additional features.
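
To make that lowest level concrete, here is a minimal sketch of what an
unmanaged project might have to store per configuration. The
BuildConfigInfo class and its method names are invented purely for
illustration; this is not actual or proposed CDT API, just the shape of
the data.

    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    // Hypothetical container for the minimum the parser needs:
    // include paths and preprocessor defines, entered by the user.
    class BuildConfigInfo {
        private final List<String> includePaths = new ArrayList<>();
        private final Map<String, String> defines = new LinkedHashMap<>();

        void addIncludePath(String path)          { includePaths.add(path); }
        void addDefine(String name, String value) { defines.put(name, value); }

        List<String> getIncludePaths() { return includePaths; }
        Map<String, String> getDefines() { return defines; }
    }

    // e.g. cfg.addIncludePath("/usr/include"); cfg.addDefine("_MIPS_", "1");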

The other option for existing customers is for us to take over managing the
makefiles/build. This would likely involve a one-time, best-effort parse of
the makefile system to determine this information automagically, but as you
point out, any number of (sadly typical) roll-your-own build environments
can and will trip us up. In this case there is a pain point for the user,
but if it happens only once, maybe that barrier is low enough. I doubt it,
but it's the best we can do.
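
As a rough illustration of what that one-time, best-effort parse might do,
the sketch below runs "make -n" and scrapes -I and -D flags out of the
echoed compiler lines. Everything here is my own assumption, in particular
that the build echoes plain gcc-style command lines; it is a sketch of the
idea, not CDT code.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.util.LinkedHashSet;
    import java.util.Set;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class DryRunScanner {
        // Matches -Ipath / -I path and -DNAME / -DNAME=VALUE tokens.
        private static final Pattern FLAG = Pattern.compile("-(I|D)\\s*(\\S+)");

        public static void main(String[] args) throws Exception {
            Set<String> includes = new LinkedHashSet<>();
            Set<String> defines = new LinkedHashSet<>();
            Process make = new ProcessBuilder("make", "-n")
                    .redirectErrorStream(true).start();
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(make.getInputStream()))) {
                String line;
                while ((line = r.readLine()) != null) {
                    Matcher m = FLAG.matcher(line);
                    while (m.find()) {
                        if (m.group(1).equals("I")) includes.add(m.group(2));
                        else defines.add(m.group(2));
                    }
                }
            }
            make.waitFor();
            System.out.println("include paths: " + includes);
            System.out.println("defines:       " + defines);
        }
    }

Of course this is exactly the kind of thing the roll-your-own environments
above will defeat, which is why it can only ever be best effort.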

A lot of this ties into your points about integration below.

> 
> There are many levels of integration:
> - (High) Take over the entire implementation: on some environments,
>   because of the complexity, it will be simpler. For example, the
>   QNX builder is extremely smart/mature, and yes, we could
>   completely implement the build model; for now we take over the
>   builder at the Eclipse/platform level, but by doing this, other
>   components cannot access the information. (Note: this is an
>   example; we do not know yet how we will integrate our builder
>   into the CDT Build Model.)
> 
> - (Medium) Extending at some key points: for example, if the
>   information is kept in a database somewhere and all that is
>   needed is to implement, say, the IToolChain or maybe the
>   ICBuilder, etc.
> 
> - (Low) Lowest level: just the parser for a tool; for example, the
>   compiler is the Microsoft or Watcom compiler with different
>   error patterns, but the default scheme is fine.
> 
> 
I want to make sure we are talking about the same thing. You have stated
that you think the build model should be completely replaceable so you can
use your QNX builder. The thing is, the build model is a repository for
information (at least in my mind), and the builder is its client. The
extension point for an ISV to insert information about their own tools is
the toolchain (and maybe a replacement builder). I guess you would like to
use your QNX builder as both the build model AND the builder. If we come up
with a decent interface, could you not extend your builder to realize it,
so that the parser (and any other client) can extract the information it
needs directly from you?
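
To be concrete about what "realize the interface" could mean, here is a
hypothetical sketch. The IBuildInfoProvider name and its methods are mine,
not proposed CDT API; the point is only that a builder implementing
something of this shape would let the parser and other clients query it
directly.

    import java.util.List;
    import java.util.Map;

    // Hypothetical: the minimal contract a smart builder (QNX or
    // otherwise) could realize so clients read settings straight from it.
    interface IBuildInfoProvider {
        // Include search paths in effect when building the given resource.
        List<String> getIncludePaths(String resourcePath);

        // Preprocessor symbols (name -> value) for the given resource.
        Map<String, String> getDefinedSymbols(String resourcePath);
    }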

My other primary goal with the build model and builder is to make the CDT
useful out of the box to as wide a range of developers as possible. Other
vendors can decide the level of integration they want to support. I guess
the challenge for us is to make the extension points and the build model
interfaces sufficiently general for both toolchain suppliers and build
model clients.
 
> 
> 
> Here is how the code is now for the default builder (ACBuilder,
> CBuilder). There is an ErrorParserManager attached to the builder,
> and a chance to add as many IErrorParser classes as needed. The
> default now is:
> 
> ErrorParserManager.addParser(new GCCErrorParser());
> ErrorParserManager.addParser(new GLDErrorParser());
> ErrorParserManager.addParser(new GASErrorParser());
> ErrorParserManager.addParser(new MakeErrorParser());
> 
> The builder/ErrorParserManager gives all of the IErrorParser[] a
> chance to have a look at the output and do some matching. If a
> pattern matches, the IErrorParser sets a ProblemMarker.
> 
> There was, however, no UI component to this, and the idea was to
> let the user or ISV add/remove IErrorParsers on a builder's
> ErrorParserManager via a preference page or extension points.
> 
> The other problems were doing a "make clean" and the passing of
> specific targets. Some targets, like "clean", are special, since
> the entire build state has to be flushed.
> 

Thanks for the tip. The idea that the parsers are attached to the builder
is fine, as long as ISVs can supply specific parsers for their tools, all
the parsers get a crack at the output, and something helps them collaborate
so that errors show up on the task list without duplicates. This sounds
like it could be a lot of work.
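
For what it's worth, here is a sketch of the kind of ISV-supplied parser I
have in mind, for the Microsoft compiler case from your "Low" integration
level. I am assuming a per-line processLine() style of contract like the
one you describe; the interface shown, the MSVC regular expression, and the
reportProblem() stand-in are all invented for illustration and are not the
actual CDT interfaces.

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Hypothetical per-line contract in the spirit of IErrorParser.
    interface IErrorParser {
        boolean processLine(String line);
    }

    // Matches Microsoft-style diagnostics such as:
    //   foo.c(42) : error C2065: 'bar' : undeclared identifier
    class MSVCErrorParser implements IErrorParser {
        private static final Pattern MSVC = Pattern.compile(
            "^(.+)\\((\\d+)\\)\\s*:\\s*(error|warning)\\s+(\\w+)\\s*:\\s*(.*)$");

        public boolean processLine(String line) {
            Matcher m = MSVC.matcher(line);
            if (!m.matches()) {
                return false;   // not ours; give the next parser a crack
            }
            String file = m.group(1);
            int lineNo = Integer.parseInt(m.group(2));
            boolean isError = "error".equals(m.group(3));
            reportProblem(file, lineNo, isError, m.group(4) + ": " + m.group(5));
            return true;        // consumed, so no duplicate markers
        }

        // Stand-in for setting a ProblemMarker on the resource.
        private void reportProblem(String file, int line, boolean err, String msg) {
            System.out.printf("%s:%d [%s] %s%n",
                    file, line, err ? "error" : "warning", msg);
        }
    }

Returning true on a match is one simple way the parsers could collaborate
to avoid the duplicate-marker problem.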

