[m-rev.] diff: increase from_ground_term_threshold to 1000000 to avoid regressions

Zoltan Somogyi zs at csse.unimelb.edu.au
Mon Apr 27 15:20:58 AEST 2009


On 27-Apr-2009, Peter Ross <pro at missioncriticalit.com> wrote:
> Increase from_ground_term_threshold from 5 to 1000000 to avoid
> the two regressions reported in bugs 93 and 94 in the bug database.

Bug 93 is not so much a bug as an issue that arises because we have not
(yet) resolved the tension between two competing optimizations. Opt1 is
replacing code that constructs ground terms at runtime with references
to static ground terms; opt2 is compile-time garbage collection. Opt1
requires ground terms to be non-unique; opt2 requires ground terms to be
unique. When I implemented the special handling of large ground terms,
I obviously did what opt1 wants, since opt2 did not yet work anyway. Just
as obviously, deciding between unique and non-unique based only on the
size of the term is not a good idea, but what should the criterion be?

I think all ground terms should be non-unique, and ctgc should introduce
code that makes unique copies of ground terms whenever it needs them. It
has to think about this possibility *anyway*, because it has to balance
the gain in speed from ctgc against the loss in speed from having to
build the ground term from scratch each time. Asking it to record the
result of its analysis in the form of a call to a copy predicate, or
in the form of replacing a from_ground_term scope with a conjunction of
construct_dynamically unifications, doesn't sound like asking too much.
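
To illustrate the proposal, here is a hypothetical Mercury-style sketch
(the predicate name and term are invented for the example; the comments
show conceptual compiler behaviour, not actual compiler output):

```mercury
% A predicate whose body constructs a large ground term.
:- pred get_list(list(int)::out) is det.
get_list(X) :-
    % After opt1, this construction is replaced by a reference
    % to a static ground term, so X is shared (non-unique).
    X = [1, 2, 3].

% If ctgc (opt2) decides it is worth reusing X's cells, it would
% itself arrange for a unique copy, e.g. by replacing the static
% reference with a conjunction of construct_dynamically
% unifications, conceptually:
%
%     X = [1 | T1], T1 = [2 | T2], T2 = [3 | T3], T3 = []
```

Under this scheme the default stays cheap (one static term, built once),
and ctgc pays the reconstruction cost only where its analysis says the
reuse is worth it.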

Zoltan.
--------------------------------------------------------------------------
mercury-reviews mailing list
Post messages to:       mercury-reviews at csse.unimelb.edu.au
Administrative Queries: owner-mercury-reviews at csse.unimelb.edu.au
Subscriptions:          mercury-reviews-request at csse.unimelb.edu.au
--------------------------------------------------------------------------


