[m-rev.] for review: analysis framework (1/2)

Julien Fischer juliensf at cs.mu.OZ.AU
Mon Jan 23 16:04:13 AEDT 2006


On Mon, 23 Jan 2006, Peter Wang wrote:

> On 2006-01-16, Julien Fischer <juliensf at cs.mu.OZ.AU> wrote:
> >
> > On Mon, 16 Jan 2006, Peter Wang wrote:
> >
> > > Estimated hours taken: 30
> > > Branches: main
> > >
> > > Some work on the intermodule analysis framework.  The main changes are that
> > > modules and analysis results have statuses associated with them, which are
> > > saved into the `.analysis' files, and there is now code to handle intermodule
> >
> > s/in/into/
>
> s/into/in ?
>

which are saved in the `.analysis' files ...

...

> > >       Also, it made two `Call' or two
> > > 	`Answer' values hard to compare, as a `FuncInfo' value had to be
> > > 	present for a comparison call to be made, and it was not always
> > > 	obvious where that `FuncInfo' value would come from.  I have changed
> > > 	it so that any information which might be stored in a
> > > 	`FuncInfo' should be stored in the corresponding `Call' value itself.
> > >
> >
> > I don't understand this one; could you provide an example?
>
> The methods in the partial_order typeclass both require a FuncInfo value, so
> to compare any two Call (or Answer) values you need to get a FuncInfo value
> from somewhere.  It is usually supplied by the analysis pass which is
> invoking the framework.
>
> Where this is hard is when we are ready to write everything out to disk at
> the end of all analyses.  The analysis results have been recorded in a map,
> and you have the IMDG for the current module.  For each analysis result you
> need to compare new Answers with old Answers, for which you need a FuncInfo,
> which would have to be stored in the map as well.  Then you need to look up
> the IMDG to see if another module was using an old result, which involves
> comparing Call patterns, which also requires a FuncInfo from somewhere.
> Any time you have two FuncInfos, you either need to add partial_order methods
> that take two FuncInfos, or you have to choose one of them arbitrarily.
>
> Since FuncInfos would have to be read/written with every Call and Answer
> value, including IMDGs, I decided that analyses might as well put whatever
> information would have gone into a FuncInfo into the Call or Answer as they
> see fit.  Nick's thesis deals only with call and answer patterns, and the
> unused args code just uses the FuncInfo to store the arity so that it can
> work out the representation of the `bottom' answer (which isn't used at all
> so far).
> Unless it will be useful for many other analyses, I would really like to
> be rid of the FuncInfos.
>

Okay.  I suggest including most of that justification in the log
message, though.
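
For anyone else following along, here is roughly how I read the change;
this is only a sketch with made-up names, not the actual declarations in
the analysis library:

    % Before: the partial_order methods take a FuncInfo, so comparing any
    % two Call (or Answer) values needs a FuncInfo from somewhere.
    %
    :- typeclass partial_order(T) where [
        pred more_precise_than(func_info::in, T::in, T::in) is semidet,
        pred equivalent(func_info::in, T::in, T::in) is semidet
    ].

    % After: whatever would have gone into the FuncInfo (e.g. the arity,
    % for unused args) lives in the Call or Answer type itself, so values
    % pulled out of the result maps or the IMDG can be compared directly.
    %
    :- typeclass partial_order(T) where [
        pred more_precise_than(T::in, T::in) is semidet,
        pred equivalent(T::in, T::in) is semidet
    ].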


> > > compiler/mmc_analysis.m:
> > > 	Add the trail usage analysis to the list of analyses to be used with
> > > 	the intermodule analysis framework.
> > > 	Update the entry for unused argument elimination.
> >
> > At the moment the results from the intermodule analysis framework should
> > correspond to the pragmas in the .opt files when compiling with
> > --intermodule-optimization (although not with
> > --transitive-intermodule-optimization).
> > You should check that this is the case.
>
> It mostly does, although the results can differ depending on the order
> of compilation, because the .analysis files are currently written out as
> a byproduct of doing other things.  The trail usage analysis was writing
> out results for non-exported procedures as well; I've fixed that now.
>

That's fine; I just wanted to make sure that the results weren't wildly
inconsistent.

...

> > > @@ -163,49 +231,122 @@
> > >  	throw(invalid_analysis_file)
> > >      ).
> > >
> > > +read_module_imdg(Info, ModuleId, ModuleEntries, !IO) :-
> > > +    read_analysis_file(Info ^ compiler, ModuleId, ".imdg",
> >
> > Make sure that mmake realclean (clean?) and the corresponding functionality in
> > mmc --make know how to clean up any new filetypes you add.
>
> Done.  It won't clean up the .analysis, .imdg, and .request files for
> imported library modules, though.  Those should probably not be written
> in the first place, but I'm leaving that for later.
>

Fine.

> >
> > > +record_dependency(CallerModuleId, AnalysisName, CalleeModuleId, FuncId, Call,
> > > +	!Info) :-
> > > +    (if CallerModuleId = CalleeModuleId then
> > > +	% XXX this assertion breaks compiling the standard library with
> > > +	% --analyse-trail-usage at the moment
> >
> > How is it breaking it?
>
> record_dependency is being called with CalleeModuleId = CallerModuleId
> for some procedures during trail usage analysis.  I haven't figured it
> out as yet.
>
> > > +% XXX make this enableable with a command-line option.  A problem is that we
> > > +% don't want to make the analysis directory dependent on anything in the
> > > +% compiler directory.
> > > +
> > > +:- pred debug_msg(pred(io, io)::in(pred(di, uo) is det), io::di, io::uo)
> > > +    is det.
> > > +
> > > +debug_msg(_P, !IO) :-
> > > +    % P(!IO),
> > > +    true.
> >
> > I suggest using a mutable to keep track of whether debugging traces
> > are enabled and then exporting a predicate from the analysis library that
> > client compilers can use to turn debugging on and off.
>
> Are mutables supported in non-C backends yet?
>

No, but the compiler doesn't currently build in the non-C backends anyway,
and if/when that changes, implementing mutables shouldn't be much of a
problem, so I would still suggest using one for this.
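
Something along these lines is what I had in mind; the mutable and
predicate names below are just for illustration, not necessarily what
the analysis library should end up with:

    % Illustrative sketch: a bool mutable attached to the I/O state,
    % defaulting to `no' (tracing off).  Needs `:- import_module bool.'
    %
    :- mutable(debug_analysis, bool, no, ground,
        [untrailed, attach_to_io_state]).

    % Exported from the analysis library so that client compilers can
    % switch debugging traces on and off.
    %
    :- pred enable_debug_messages(bool::in, io::di, io::uo) is det.

    enable_debug_messages(Debug, !IO) :-
        set_debug_analysis(Debug, !IO).

    debug_msg(P, !IO) :-
        get_debug_analysis(Debug, !IO),
        (
            Debug = yes,
            P(!IO)
        ;
            Debug = no
        ).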

> I've followed the rest of your suggestions and fixed a few bugs.

The rest looks fine.

Cheers,
Julien.