An Example of Structured Explanation Generation
This paper describes an explanation mechanism designed to explain the behavior of a goal-driven system that navigates its way around the streets of Princeton. The mechanism embodies a model in which explanation is viewed as a form of goal-directed, purposive discourse. In this model, discourse is produced by achieving goals at several levels, each level posing goals to be satisfied by lower levels, until a set of instructions is reached that a sentence generator can follow to produce English. These levels perform linearization, selection, and local coherence operations. The model was implemented and tested for a particular method of linearization called structured linearization, which exploits the order inherent in the knowledge being explained to provide the ordering needed for text generation. An example of an actual run of the system is shown. The explanation mechanism is currently being enhanced with the addition of a dialogue management system; a brief sketch of this work is also presented.
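The idea behind structured linearization can be sketched in code. The following is a minimal illustrative sketch, not the paper's implementation: all names (`RouteStep`, `Goal`, `structured_linearization`, `realize`) and the sample route are assumptions introduced for illustration. It shows discourse goals refined into sub-goals whose order is taken directly from the order already present in the knowledge being explained (here, the steps of a route), with the leaves serving as instructions a sentence generator could follow.

```python
# Illustrative sketch only: structured linearization orders discourse
# goals by the order inherent in the knowledge being explained --
# here, the sequential steps of a route.

from dataclasses import dataclass, field
from typing import List

@dataclass
class RouteStep:
    action: str      # e.g. "turn left"
    landmark: str    # e.g. "Nassau Street"

@dataclass
class Goal:
    """A discourse goal; leaf goals become sentence-generator instructions."""
    content: str
    subgoals: List["Goal"] = field(default_factory=list)

def structured_linearization(steps: List[RouteStep]) -> Goal:
    # The route is inherently ordered, so each step simply becomes
    # the next sub-goal, in the same order -- no separate text-ordering
    # decision is needed.
    explain = Goal("explain-route")
    for step in steps:
        explain.subgoals.append(Goal(f"describe: {step.action} at {step.landmark}"))
    return explain

def realize(goal: Goal) -> List[str]:
    # Walk the goal tree top-down; leaves yield the instructions that
    # a sentence generator would turn into English.
    if not goal.subgoals:
        return [goal.content]
    out: List[str] = []
    for sub in goal.subgoals:
        out.extend(realize(sub))
    return out

route = [RouteStep("turn left", "Nassau Street"),
         RouteStep("go straight", "Washington Road")]
instructions = realize(structured_linearization(route))
```

The point of the sketch is that the ordering step is trivial when the knowledge itself is ordered; the higher levels only decompose goals, and the text order falls out of the knowledge order.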