[m-rev.] for review: new termination analyser (part 4 of 6)

Mark Brown mark at cs.mu.OZ.AU
Sat Apr 2 01:55:17 AEST 2005


On 24-Mar-2005, Julien Fischer <juliensf at cs.mu.OZ.AU> wrote:
> 
> ===================================================================
> RCS file: compiler/term_constr_build.m
> diff -N compiler/term_constr_build.m
> --- /dev/null	1 Jan 1970 00:00:00 -0000
> +++ compiler/term_constr_build.m	24 Mar 2005 04:22:07 -0000
> +
> +:- type build_options.
> +
> +:- func build_options_init(functor_info, bool, bool) = build_options.

You should document the above function...

> +%-----------------------------------------------------------------------------%
> +%
> +% Build pass options.
> +%
> +
> +:- type build_options
> +    --->    build_options(
> +            functor_info    :: functor_info,
> +            failure_constrs :: bool,
> +            arg_size_only   :: bool

...and also document these fields.

> +    ).
> +
> +build_options_init(Norm, Failure, ArgSizeOnly) =
> +    build_options(Norm, Failure, ArgSizeOnly).
> +
> +%-----------------------------------------------------------------------------%
> +
> +% This information is accumulated while building the abstract representation
> +% of a SCC.  After we have finished we write it to the module_info.
> +% We cannot do this while we are building each individual procedure because we
> +% will not have all the information we need until we have finished the
> +% processing the entire SCC.

s/finished the/finished/

> +build_abstract_goal_2(conj(Goals)-_, AbstractGoal, !Info) :-
> +    list.map_foldl(build_abstract_goal, Goals, AbstractGoals0, !Info),
> +    AbstractGoals = simplify_conjuncts(AbstractGoals0),
> +    AbstractGoal = conj(AbstractGoals, [], []).

It would be nicer to write a separate build_abstract_conj predicate
and reuse it for the par_conj and if_then_else cases.
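
For example, something along these lines (an untested sketch; the predicate
name is just a suggestion, and the body is lifted straight from your conj
case):

```mercury
:- pred build_abstract_conj(hlds_goals::in, abstract_goal::out,
    traversal_info::in, traversal_info::out) is det.

build_abstract_conj(Conjuncts, AbstractGoal, !Info) :-
    % Abstract each conjunct in turn, then simplify the resulting
    % list of abstract goals.
    list.map_foldl(build_abstract_goal, Conjuncts, AbstractGoals0, !Info),
    AbstractGoals = simplify_conjuncts(AbstractGoals0),
    AbstractGoal = conj(AbstractGoals, [], []).
```

The conj, par_conj and if_then_else cases could then each be a single call
to this predicate.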

> +
> +% NOTE: for switches and disjunctions we add the variables that
> +% are local to the entire switch/disjunction to the list of variables
> +% that are local to each case/disjunct.  The reason for this is that
> +% the projection operation distributes over the convex hull operation
> +% and it is more efficient to eliminate the variables from each branch
> +% *before* taking the convex hull.  This is because the transformation
> +% matrix used by the convex hull operation (see polyhedron.m) will
> +% usually be much larger for the entire disjunction than the matrix used
> +% for each case/disjunct.

This comment belongs lower down, above the definition of build_abstract_disj.

> +build_abstract_goal_2(Goal - _, AbstractGoal, !Info) :-
> +    Goal = if_then_else(_, Cond, Then, Else),
> +    %
> +    % Reduce the if-then goals to an abstract conjunction.
> +    %
> +    list.map_foldl(build_abstract_goal, [Cond, Then], AbstractConjuncts,
> +        !Info),
> +    AbstractSuccessGoal = conj(AbstractConjuncts, [], []),
> +    %
> +    % Work out a failure constraint for the Cond and then abstract the else
> +    % branch.  We won't bother do any other simplifications here as the AR
> +    % simplification pass will sort all of this out.
> +    %
> +    CondFail = find_failure_constraint_for_goal(Cond, !.Info),
> +    %
> +    % XXX FIXME - the local/non-local variable sets end up
> +    % being incorrect here.

Is this one you need to fix now?

> +    % For non-recursive calls look up the argument size constraints for the
> +    % callee procedure and build an abstract primitive goal to store them.
> +    %
> +    % If we are running termination analysis, as opposed to just the IR
> +    % analysis then we also need to check that the termination status of the
> +    % callee procedure.
> +    %
> +:- pred build_non_recursive_call(pred_proc_id::in, size_vars::in,
> +    prog_context::in, abstract_goal::out, traversal_info::in,
> +    traversal_info::out) is det.
> +
> +build_non_recursive_call(CalleePPId, CallerArgs, Context, AbstractGoal,
> +        !Info) :-
> +    ModuleInfo = !.Info ^ module_info,
> +    CallerPPId = !.Info ^ ppid,
> +    ZeroVars = !.Info ^ zeros,
> +    module_info_pred_proc_info(ModuleInfo, CalleePPId, _, CalleeProcInfo),
> +    %
> +    % Check the termination status of the callee procedure if we are running a
> +    % full analysis - ignore it if we are only running the IR analysis.
> +    %
> +    proc_info_get_termination2_info(CalleeProcInfo, CalleeTerm2Info),
> +    ( !.Info ^ arg_analysis_only = no ->
> +        TermStatus = CalleeTerm2Info ^ term_status,
> +        (
> +            TermStatus = yes(can_loop(_))
> +        ->
> +            Error = Context - can_loop_proc_called(CallerPPId,
> +                CalleePPId),
> +            info_update_errors(Error, !Info)
> +        ;
> +            TermStatus = yes(cannot_loop(_))
> +        ->
> +            true
> +        ;
> +            unexpected(this_file, "Callee procedure has no " ++
> +                "termination status.")
> +        )
> +    ;
> +        true
> +    ),

The above code would be better written as nested switches.
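
Something like the following, perhaps (an untested sketch, reusing the
names from the quoted code):

```mercury
proc_info_get_termination2_info(CalleeProcInfo, CalleeTerm2Info),
ArgAnalysisOnly = !.Info ^ arg_analysis_only,
(
    ArgAnalysisOnly = yes
;
    ArgAnalysisOnly = no,
    MaybeTermStatus = CalleeTerm2Info ^ term_status,
    (
        MaybeTermStatus = yes(TermStatus),
        (
            TermStatus = can_loop(_),
            Error = Context - can_loop_proc_called(CallerPPId,
                CalleePPId),
            info_update_errors(Error, !Info)
        ;
            TermStatus = cannot_loop(_)
        )
    ;
        MaybeTermStatus = no,
        unexpected(this_file, "Callee procedure has no " ++
            "termination status.")
    )
),
```

Written this way the compiler's switch detection checks that every case is
covered, so you get a determinism error rather than a silent fall-through
if another termination status is ever added.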

> +%------------------------------------------------------------------------------%
> +%
> +% Additional predicates for abstracting switches and disjunctions.
> +%
> +
> +:- type disj_info
> +    --->    switch(prog_var, list(case))
> +    ;       non_switch(hlds_goals).
> +
> +:- pred build_abstract_disj(disj_info::in, abstract_goal::out,
> +    traversal_info::in, traversal_info::out) is det.
> +
> +build_abstract_disj(Type, AbstractGoal, !Info) :-
> +    (
> +        Type = non_switch(Goals),
> +        build_abstract_disj_acc(Goals, [], AbstractGoals, !Info)
> +    ;
> +        Type = switch(SwitchVar, Cases),
> +        build_abstract_switch_acc(SwitchVar, Cases, [], AbstractGoals,
> +            !Info)
> +    ),
> +    ( AbstractGoals = [] ->
> +        AbstractGoal = primitive(polyhedron.identity, [], [])
> +    ; AbstractGoals = [Goal] ->
> +        AbstractGoal = Goal
> +    ;
> +        DisjSize = list.length(AbstractGoals),
> +        AbstractGoal = disj(AbstractGoals, DisjSize, [], [])
> +    ).

The above code would be better written as a switch.
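
That is, switch on the shape of the list instead of chaining if-then-elses
(untested sketch):

```mercury
(
    AbstractGoals = [],
    % The empty disjunction is simply `true'.
    AbstractGoal = primitive(polyhedron.identity, [], [])
;
    AbstractGoals = [Goal],
    AbstractGoal = Goal
;
    AbstractGoals = [_, _ | _],
    DisjSize = list.length(AbstractGoals),
    AbstractGoal = disj(AbstractGoals, DisjSize, [], [])
).
```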

> +%------------------------------------------------------------------------------%
> +%
> +% Additional predicates for abstracting unifications.
> +%

Why do the "additional" predicates come first?  In any case, the predicates
in this section don't appear to be used anywhere else, so this section can
be removed.

> +%-----------------------------------------------------------------------------
> +%
> +% Procedures for abstracting unifications.
> +%
> +

> +:- pred filter_args_and_modes(map(prog_var, (type))::in, list(prog_var)::in,
> +    list(prog_var)::out, list(uni_mode)::in, list(uni_mode)::out) is det.
> +
> +filter_args_and_modes(VarTypes, Args0, Args, Modes0, Modes) :-
> +    ArgsAndModes0 = assoc_list.from_corresponding_lists(Args0, Modes0),
> +    IsNotTypeInfo = (pred(ArgMode::in) is semidet :-
> +        map.lookup(VarTypes, fst(ArgMode), Type),
> +        not is_introduced_type_info_type(Type)
> +    ),
> +    list.filter(IsNotTypeInfo, ArgsAndModes0, ArgsAndModes),
> +    assoc_list.keys_and_values(ArgsAndModes, Args, Modes).

This predicate could be better named.  It would also be easier to read and
more efficient if you traversed the two lists concurrently rather than
building an assoc_list.
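
For example (an untested sketch; filter_typeinfo_args_and_modes is just a
suggested name):

```mercury
:- pred filter_typeinfo_args_and_modes(map(prog_var, (type))::in,
    list(prog_var)::in, list(prog_var)::out,
    list(uni_mode)::in, list(uni_mode)::out) is det.

filter_typeinfo_args_and_modes(_, [], [], [], []).
filter_typeinfo_args_and_modes(_, [_ | _], _, [], _) :-
    unexpected(this_file, "unmatched lists.").
filter_typeinfo_args_and_modes(_, [], _, [_ | _], _) :-
    unexpected(this_file, "unmatched lists.").
filter_typeinfo_args_and_modes(VarTypes, [Arg | Args0], Args,
        [Mode | Modes0], Modes) :-
    filter_typeinfo_args_and_modes(VarTypes, Args0, Args1, Modes0, Modes1),
    map.lookup(VarTypes, Arg, Type),
    % Drop the argument and its mode if it is an introduced
    % type_info (or similar) argument.
    ( is_introduced_type_info_type(Type) ->
        Args = Args1,
        Modes = Modes1
    ;
        Args = [Arg | Args1],
        Modes = [Mode | Modes1]
    ).
```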

> +
> +    % Assignment and simple_test unifications of the form X = Y
> +    % are abstracted as |X| - |Y| = 0.
> +    %
> +:- pred build_abstract_simple_or_assign_unify(prog_var::in, prog_var::in,
> +    constraints::out, traversal_info::in, traversal_info::out) is det.
> +
> +build_abstract_simple_or_assign_unify(LeftProgVar, RightProgVar,
> +        Constraints, !Info) :-
> +    SizeVarMap = !.Info ^ var_map,
> +    Zeros  = !.Info ^ zeros,
> +    LeftSizeVar = prog_var_to_size_var(SizeVarMap, LeftProgVar),
> +    RightSizeVar = prog_var_to_size_var(SizeVarMap, RightProgVar),
> +    (
> +        set.member(LeftSizeVar, Zeros),
> +        set.member(RightSizeVar, Zeros)
> +    ->
> +        Constraints = []    % `true' constraint.
> +    ;
> +        (
> +            (set.member(LeftSizeVar, Zeros)
> +            ; set.member(RightSizeVar, Zeros))
> +        ->
> +            unexpected(this_file, "zero unified with non-zero.")
> +        ;
> +            % Create non-negativity constraints.
> +            %
> +            NonNegConstrs = list.map(make_nonneg_constr,
> +                [LeftSizeVar, RightSizeVar]),
> +            Terms = [LeftSizeVar - one, RightSizeVar - (-one)],
> +            AssignConstr = constraint(Terms, (=), zero),
> +            % XXX I don't think this call to simplify helps anymore.
> +            Constraints = simplify_constraints([AssignConstr | NonNegConstrs])
> +        )

This goal doesn't need the parentheses and indentation.

> +    ).
> +
> +%-----------------------------------------------------------------------------%
> +%
> +% Failure constraints.
> +%
> +
> +% The idea here is that if a goal can fail, the the fact that it fails

s/the the/the/

> Index: compiler/term_constr_data.m
> ===================================================================
> RCS file: compiler/term_constr_data.m
> diff -N compiler/term_constr_data.m
> --- /dev/null	1 Jan 1970 00:00:00 -0000
> +++ compiler/term_constr_data.m	23 Mar 2005 22:58:56 -0000
> +%------------------------------------------------------------------------------%
> +%
> +% Mapping the HLDS to the AR
> +%
> +
> +% 1. unification
> +%
> +% A HLDS unification of the form:
> +%
> +%       X = f(A, B, C)
> +%
> +% is converted to a AR primitive goal of the form:
> +%
> +%       { |X| = |A| + |B| + |C| + |f| }
> +%
> +% where |X| represents the size of the variable X (according to whatever
> +% measure we are using).  There will also additional non-negativity
> +% constraints on any variables that have non-zero size type.  Variables
> +% of that have zero-size type are not included at all.  Variables that

s/of that have zero-size/that have zero size/

> +% represent polymorphic types are included.  The code in
> +% (term_constr_fixpoint.m and term_constr_pass2.m that processes calls
> +% is responsible for dealing with the situation where a polymorphic
> +% procedure is called with zero sized arguments).

Remove the parentheses.

> +%
> +% 2. conjunction and parallel conjunction
> +%
> +% A HLDS conjunction (A, B) is converted to an AR conjunction.  Parallel
> +% conjunction is treated the same way.
> +%
> +% 3. disjunction and switches.
> +%
> +% A HLDS disjunction (A ; B) is converted to an AR disjunction.  Switches
> +% are similar although we also have to add any constraints on the variable
> +% being switched on.
> +%
> +% 4. calls
> +%
> +% A HLDS call to a procedure lower down the call graph is abstracted as
> +% an AR primitive.  A call to something in the same SCC becomes an AR call.
> +%
> +% 5. negation.
> +%
> +% A HLDS negation is abstracted as an AR primitive.
> +% XXX Mention failure constraints here.

Any reason you haven't addressed this XXX in this change?

> +%
> +% 6. existential quantification.
> +%
> +% Existential quantifications do not affect term sizes are ignored.

s/are/and are/

Presumably this applies to all scoped hlds goals?  You should update this
comment now.

> +%
> +% 8. if-then-else.
> +%
> +% ( Cond -> Then ; Else ) is abstracted as
> +%
> +%  disj(conj(|Cond|, |Then|), conj(neg(|Cond|), |Else|))
> +%
> +% (using |Cond| to represent the abstraction of Cond).

Better to write that as "using |Goal| to represent the abstraction of Goal".

> +        zeros :: zero_vars,
> +            % The size_vars that have zero-size.

s/zero-size/zero size/

> +simplify_abstract_rep(conj(!.Conjuncts, Locals, NonLocals), Goal) :-
> +    list.map(simplify_abstract_rep, !Conjuncts),
> +    list.filter(isnt(is_empty_primitive), !Conjuncts),
> +    flatten_conjuncts(!Conjuncts),
> +    list.filter(isnt(is_empty_conj), !Conjuncts),
> +    ( !.Conjuncts = [Conjunct] ->
> +        Goal = update_local_and_nonlocal_vars(Conjunct, Locals,
> +            NonLocals)

This warrants a comment as to why the (non-)locals need updating.

> +    ;
> +        Goal = conj(!.Conjuncts, Locals, NonLocals)
> +    ).
> +
> +    % Given a conjunction of the abstract goals of the form:
> +    % conj(A, B, C, Goals), where A, B and C are primitive goals
> +    % transform this to conj(A /\ B /\ C, Goals) where `/\' is
> +    % the intersection operator for convex polyhedra.  This tries
> +    % to simplify the conjunction as much as possible so that
> +    % we don't end up doing as much work during the fixpoint calculation.

This description doesn't match the code.  The code takes the intersection
of adjacent primitives anywhere in the list, whereas the description covers
only the leading primitives at the start of the list.

> +    %
> +    % Note: because intersection is commutative we could go further
> +    % and take the intersection of all the primitive goals in a
> +    % conjunction but that unnecessarily increases the size of the edge
> +    % labels in pass 2.
> +    %
> +:- pred flatten_conjuncts(abstract_goals::in, abstract_goals::out) is det.
> +
> +flatten_conjuncts([], []).
> +flatten_conjuncts([Goal], [Goal]).
> +flatten_conjuncts([GoalA, GoalB | Others0], NewGoals) :-
> +    flatten_conjuncts([GoalB | Others0], Others1),
> +    (
> +        Others1 = [GoalC | Others],
> +        GoalC = primitive(PolyC, LocalsC, NonLocalsC),
> +        GoalA = primitive(PolyA, LocalsA, NonLocalsA)
> +    ->
> +        Poly = polyhedron.intersection(PolyC, PolyA),
> +        Locals = LocalsC ++ LocalsA,
> +        NonLocals = NonLocalsC ++ NonLocalsA,
> +        Goal = primitive(Poly, Locals, NonLocals),
> +        NewGoals = [Goal | Others]
> +    ;
> +        NewGoals = [GoalA | Others1]
> +    ).
> +

> +%-----------------------------------------------------------------------------%
> +%
> +% Predicates for printing out the abstract data structure.
> +% (These are for debugging only)
> +%
> +
> +dump_abstract_scc(SCC, Module, !IO) :- dump_abstract_scc(SCC, 0, Module, !IO).
> +
> +dump_abstract_scc(SCC, Indent, Module, !IO) :-
> +    list.foldl((pred(Proc::in, !.IO::di, !:IO::uo) is det :-
> +            indent_line(Indent, !IO),

You shouldn't need this indentation, since dump_abstract_proc will do that.

> +            dump_abstract_proc(Proc, Indent, Module, !IO)
> +        ), SCC, !IO).
> +
> +dump_abstract_proc(Proc, Indent, Module, !IO) :-
> +    Proc = abstract_proc(AbstractPPId, _, _, _, HeadVars, _ ,_,
> +        Body, _, Varset, _, _),
> +    indent_line(Indent, !IO),
> +    AbstractPPId = real(PPId),
> +    hlds_out.write_pred_proc_id(Module, PPId, !IO),
> +    io.write_string(" : [", !IO),
> +    WriteHeadVars = (pred(Var::in, !.IO::di, !:IO::uo) is det :-
> +        varset.lookup_name(Varset, Var, VarName),
> +        io.format(VarName ++ "[%d]", [i(term.var_id(Var))], !IO)
> +    ),
> +    io.write_list(HeadVars, ", ", WriteHeadVars, !IO),
> +    io.write_string(" ] :- \n", !IO),
> +    dump_abstract_goal(Module, Varset, Indent + 1, Body, !IO).
> +
> +%-----------------------------------------------------------------------------%
> +%
> +% Predicates for simplifying conjuncts.
> +%
> +
> +% XXX Make this part of the other AR simplification predicates.
> +
> +simplify_conjuncts(Goals0) = Goals :-
> +    simplify_conjuncts(Goals0, Goals).
> +
> +:- pred simplify_conjuncts(abstract_goals::in, abstract_goals::out) is det.
> +
> +simplify_conjuncts(Goals0, Goals) :-
> +    (
> +        Goals0 = []
> +    ->
> +        Goals = []
> +    ;
> +        Goals0 = [Goal]
> +    ->
> +        Goals = [Goal]
> +    ;
> +        % If the list of conjuncts starts with two primitives
> +        % join them together into a single primitive.
> +        Goals0 = [GoalA, GoalB | OtherGoals],
> +        GoalA  = primitive(PolyA,  LocalsA, NonLocalsA),
> +        GoalB  = primitive(PolyB,  LocalsB, NonLocalsB)
> +    ->
> +        Poly = polyhedron.intersection(PolyA, PolyB),
> +        Locals = LocalsA ++ LocalsB,
> +        NonLocals = NonLocalsA ++ NonLocalsB,
> +        Goal = primitive(Poly, Locals, NonLocals),
> +        Goals1 = [Goal | OtherGoals],
> +        simplify_conjuncts(Goals1, Goals)
> +    ;
> +        % A /\ (B \/ C) <= (A /\ B) \/ (A \/ C).
> +        % Applying this identity improves the accuracy of
> +        % the analysis.  At the moment we only apply it
> +        % in the situation where there is a primitive, followed
> +        % by a disjunction.
> +        %
> +        % XXX This is really the wrong place to do this -
> +        % during the fixpoint calculation would be better.

Presumably this is the reason that the code is commented out.  Is it wrong
because it is incorrect, or because it doesn't improve the accuracy as well
as it should?

> +        %
> +    %   Goals0 = [GoalA, GoalB | OtherGoals],
> +    %   GoalA  = primitive(_PolyA, LocalsA, NonLocalsA),
> +    %   GoalB  = disj(Disjuncts0, DisjLocals, DisjNonLocals)
> +    %->
> +    %   ExpandConj = (func(Disj0) = Disj :-
> +    %       Conjs = simplify_conjuncts([GoalA, Disj0]),
> +    %       Disj  = conj(Conjs, LocalsA, NonLocalsA)
> +    %   ),
> +    %   Disjuncts = list.map(ExpandConj, Disjuncts0),
> +    %   Disjunct = disj(Disjuncts, DisjLocals, DisjNonLocals),
> +    %   Goals1 = [Disjunct | OtherGoals],
> +    %   simplify_conjuncts(Goals1, Goals)
> +    %;
> +        Goals = Goals0
> +    ).
> +

To be continued...
