Dominique de Waleffe ddw at miscrit.be
Tue Sep 16 19:33:40 AEST 1997

[on Mercury 0.7] ...which now compiles a lot faster. Great.

I wrote a small program that has to process some 900,000 records. It is
a filter that outputs one line per input line, with no data
dependencies between lines.

When I run it, it reports having done some 1424 GCs in 6 min 35 secs
(the elapsed time), which seems like a lot. Memory usage seems to
remain stable at around 4MB RAM + 4MB paged out.
Is there a way to reduce the number of GCs by letting the memory grow
much larger? Setting MERCURY_OPTIONS='-sh2000' apparently has
no effect.
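For reference, here is a sketch of what I am trying (the -sh size follows the convention from my attempt above; GC_INITIAL_HEAP_SIZE is the Boehm collector's own environment variable, which may be the knob that actually matters in a conservative-GC grade -- whether either takes effect here is exactly my question):

```shell
#!/bin/sh
# Ask the Mercury runtime for a larger heap (size in kilobytes,
# per the -sh option syntax from the attempt above):
MERCURY_OPTIONS='-sh8192'
export MERCURY_OPTIONS

# In conservative-GC grades the Boehm collector reads its own
# environment variable instead; GC_INITIAL_HEAP_SIZE is in bytes:
GC_INITIAL_HEAP_SIZE=`expr 32 \* 1024 \* 1024`
export GC_INITIAL_HEAP_SIZE

echo "MERCURY_OPTIONS=$MERCURY_OPTIONS"
echo "GC_INITIAL_HEAP_SIZE=$GC_INITIAL_HEAP_SIZE"
```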

I also tried the plain asm_fast grade (without GC), but the program
bombs out (apparently when it hits the memory limit) and triggers the
Cygnus exception handler loop...

Also, on NT (Cygnus beta-17, no patch) the time reported by
report_stats is always 0. Is that a Cygnus problem or a Mercury problem?


Dominique de Waleffe   Email: ddw at acm.org, ddw at myself.com, ddw at miscrit.be
Mission Critical       WWW:   http://www.miscrit.be/~ddw
Phone: +32 2 759 95 60  Fax: +32 2 759 27 60
PGP key fingerprint: F9 CC 23 74 44 62 7C F3  8C 12 DF 71 BB 60 54 98
