Abstracts of Selected Papers by Dale Miller

(Last changed: 7 July 1995)


Observations about using logic as a specification language

Dale Miller

Invited paper at the 1995 GULP-PRODE Joint Conference on Declarative Programming, Marina di Vietri (Salerno, Italy), 11-14 September 1995.

This extended abstract contains some non-technical observations about the roles that logic can play in the specification of computational systems. In particular, computation-as-deduction, meta-programming, and higher-order abstract syntax are briefly discussed.



Forum: A Multiple-Conclusion Specification Logic

Dale Miller

This paper will appear in Theoretical Computer Science (TCS).

The theory of cut-free sequent proofs has been used to motivate and justify the design of a number of logic programming languages. Two such languages, lambda Prolog and its linear logic refinement, Lolli, provide for various forms of abstraction (modules, abstract data types, and higher-order programming) but lack primitives for concurrency. The logic programming language LO (Linear Objects) provides some primitives for concurrency but lacks abstraction mechanisms. In this paper we present Forum, a logic programming presentation of all of linear logic that modularly extends lambda Prolog, Lolli, and LO. Forum, therefore, allows specifications to incorporate both abstractions and concurrency. To illustrate the new expressive strengths of Forum, we specify in it a sequent calculus proof system and the operational semantics of a programming language that incorporates references and concurrency. We also show that the meta-theory of linear logic can be used to prove properties of the object-languages specified in Forum.
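
The shape of Forum clauses, sketched here in notation only (this is not the paper's concrete syntax), suggests where the new expressiveness comes from. Where a Prolog clause has a single atomic head, a Forum clause may have a head that is the par of several atoms:

    forall x1 ... xn. (G -o (A1 par ... par Am))

Backchaining on such a clause consumes all of the atoms A1, ..., Am from the sequent in a single step, and it is this simultaneous consumption that allows several agents, each represented by an atom, to synchronize.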



Higher-Order Logic Programming

Gopalan Nadathur and Dale Miller

To appear in Volume 5 of the Handbook of Logic in Artificial Intelligence and Logic Programming.

This paper, which is to be a chapter in a handbook, provides an exposition of the notion of higher-order logic programming. It begins with an informal consideration of the nature of a higher-order extension to usual logic programming languages. Such an extension is then presented formally by outlining a suitable higher-order logic --- Church's simple theory of types --- and by describing a version of Horn clauses within this logic. Various proof-theoretic and computational properties of these higher-order Horn clauses that are relevant to their use in programming are observed. These observations eventually permit the description of an actual higher-order logic programming language. The applications of this language are examined. It is shown, first of all, that the language supports the forms of higher-order programming that are familiar from other programming paradigms. The true novelty of the language, however, is that it permits typed lambda terms to be used as data structures. This feature leads to several important applications for the language in the realm of meta-programming. Examples that bring out this facet of the language are presented. A complete exploitation of this meta-programming capability requires certain additional abstraction mechanisms to be present in the language. This issue is discussed, the logic of hereditary Harrop formulas is presented as a means for realizing the needed strengthening of the language, and the virtues of the features arising out of the extension to the logic are illustrated.

  1. Introduction
  2. Motivating a Higher-Order Extension to Horn Clauses
  3. A Higher-Order Logic
    1. The Language
    2. Equality between Terms
    3. The Notion of Derivation
    4. A Notion of Models
    5. Predicate Variables and the Subformula Property
  4. Higher-Order Horn Clauses
  5. The Meaning of Computations
    1. Restriction to Positive Terms
    2. Provability and Operational Semantics
  6. Towards a Practical Realization
    1. The Higher-Order Unification Problem
    2. P-Derivations
    3. Designing an Actual Interpreter
  7. Examples of Higher-Order Programming
    1. A Concrete Syntax for Programs
    2. Some Simple Higher-Order Programs
    3. Implementing Tactics and Tacticals
    4. Comparison with Higher-Order Functional Programming
  8. Using lambda-Terms as Data Structures
    1. Implementing an Interpreter for Horn Clauses
    2. Dealing with Functional Programs as Data
    3. A Shortcoming of Horn Clauses for Meta-Programming
  9. Hereditary Harrop Formulas
    1. Permitting Universal Quantifiers and Implications in Goals
    2. Recursion over Structures that Incorporate Binding
  10. Conclusion
  11. Acknowledgements
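
For a small taste of the higher-order programming treated in Section 7, the following is the familiar mappred relation, written here in lambda Prolog concrete syntax as an illustration (a standard example, not quoted from the chapter); it applies a predicate P to corresponding members of two lists:

    type mappred  (A -> B -> o) -> list A -> list B -> o.
    mappred P nil nil.
    mappred P (X::L) (Y::K) :- P X Y, mappred P L K.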


A Multiple-Conclusion Meta-Logic

Dale Miller

Proceedings of the 1994 Symposium on Logic in Computer Science, edited by S. Abramsky, pp. 272-281.

The theory of cut-free sequent proofs has been used to motivate and justify the design of a number of logic programming languages. Two such languages, lambda Prolog and its linear logic refinement, Lolli, provide for various forms of abstraction (modules, abstract data types, higher-order programming) but lack primitives for concurrency. The logic programming language LO (Linear Objects) provides for concurrency but lacks abstraction mechanisms. In this paper we present Forum, a logic programming presentation of all of linear logic that modularly extends the languages lambda Prolog, Lolli, and LO. Forum, therefore, allows specifications to incorporate both abstractions and concurrency. As a meta-language, Forum greatly extends the expressiveness of these other logic programming languages. To illustrate its expressive strength, we specify in Forum a sequent calculus proof system and the operational semantics of a functional programming language that incorporates such non-functional features as counters and references.



Logic Programming in a Fragment of Intuitionistic Linear Logic

Joshua S. Hodas and Dale Miller

Information and Computation, 110(2):327-365, 1 May 1994. An earlier version appeared in LICS'91.

When logic programming is based on the proof theory of intuitionistic logic, it is natural to allow implications in goals and in the bodies of clauses. Attempting to prove a goal of the form (D => G) from the context (set of formulas) Gamma leads to an attempt to prove the goal G in the extended context Gamma U {D}. Thus, during the bottom-up search for a cut-free proof, contexts, represented as the left-hand sides of intuitionistic sequents, grow as stacks. While such an intuitionistic notion of context provides for elegant specifications of many computations, contexts can be made more expressive and flexible if they are based on linear logic. After presenting two equivalent formulations of a fragment of linear logic, we show that the fragment has a goal-directed interpretation, thereby partially justifying calling it a logic programming language. Logic programs based on the intuitionistic theory of hereditary Harrop formulas can be modularly embedded into this linear logic setting. Programming examples taken from theorem proving, natural language parsing, and database programming are presented: each example requires a linear, rather than intuitionistic, notion of context to be modeled adequately. An interpreter for this logic programming language must address the problem of splitting contexts; that is, when attempting to prove a multiplicative conjunction (tensor), say G1 x G2, from the context Delta, the latter must be split into disjoint contexts Delta1 and Delta2 for which G1 follows from Delta1 and G2 follows from Delta2. Since there is an exponential number of such splits, it is important to delay the choice of a split as much as possible. A mechanism for the lazy splitting of contexts is presented based on viewing proof search as a process that takes a context, consumes part of it, and returns the rest (to be consumed elsewhere). In addition, we use collections of Kripke interpretations indexed by a commutative monoid to provide models for this logic programming language and show that logic programs admit a canonical model.
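
The lazy-splitting mechanism can itself be sketched as a small logic program over input-output contexts (a minimal illustration with made-up predicate names; the paper develops this model in full):

    % (prove In Out G): goal G is provable using exactly those formulas
    % of the input context In that do not reappear in the output Out.
    prove In Out (tensor G1 G2) :- prove In Mid G1, prove Mid Out G2.
    % an atomic goal consumes one matching formula from the context.
    prove In Out A :- pick A In Out.

The first clause shows the point of the model: the split of the context between G1 and G2 is never guessed, since G2 simply receives whatever G1 leaves behind.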



A Proposal for Modules in lambda Prolog

Dale Miller

Proceedings of the 1993 Workshop on Extensions to Logic Programming, edited by Roy Dyckhoff, Springer-Verlag, Lecture Notes in Computer Science, 1994.

Higher-order hereditary Harrop formulas, the underlying logical foundation of lambda Prolog, are more expressive than first-order Horn clauses, the logical foundation of Prolog. In particular, various forms of scoping and abstraction are supported by the logic of higher-order hereditary Harrop formulas while they are not supported by first-order Horn clauses. Various papers have argued that the scoping and abstraction available in this richer logic can be used to provide for modular programming, abstract data types, and state encapsulation. None of these papers, however, has dealt with the problems of programming-in-the-large, that is, the essentially linguistic problems of combining code from different textual sources (say, different files on a persistent store) into one logic program. In this paper, I propose a module system for lambda Prolog and focus mostly on its static semantics.
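
For a flavor of the programming-in-the-large issues, a module in the proposed style might package a signature together with clauses roughly as follows (the concrete syntax here is only suggestive, not the paper's; the constructors null and cns are assumed):

    module lists.
    kind list    type -> type.
    type null    list A.
    type cns     A -> list A -> list A.
    type append  list A -> list A -> list A -> o.

    append null K K.
    append (cns X L) K (cns X M) :- append L K M.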



The pi-calculus as a theory in linear logic: Preliminary results

Dale Miller

Proceedings of the 1992 Workshop on Extensions to Logic Programming, edited by E. Lamma and P. Mello. Springer-Verlag, LNCS 660, pp. 242 - 265.

The agent expressions of the pi-calculus can be translated into a theory of linear logic in such a way that the reflexive and transitive closure of pi-calculus (unlabeled) reduction is identified with ``entailed-by''. Under this translation, parallel composition is mapped to the multiplicative disjunction (``par'') and restriction is mapped to universal quantification. Prefixing, non-deterministic choice (+), replication (!), and the match guard are all represented using non-logical constants, which are specified using a simple form of axiom, called here a process clause. These process clauses resemble Horn clauses except that they may have multiple conclusions; that is, their heads may be the par of atomic formulas. Such multiple conclusion clauses are used to axiomatize communications among agents. Given this translation, it is natural to ask to what extent proof theory can be used to understand the meta-theory of the pi-calculus. We present some preliminary results along this line for pi0, the ``propositional'' fragment of the pi-calculus, which lacks restriction and value passing (pi0 is a subset of CCS). Using ideas from proof theory, we introduce co-agents and show that they can specify some testing equivalences for pi0. If negation-as-failure-to-prove is permitted as a co-agent combinator, then testing equivalence based on co-agents yields observational equivalence for pi0. This latter result follows from observing that co-agents directly represent formulas in the Hennessy-Milner modal logic.
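
For instance, communication between an output-prefixed agent and an input-prefixed agent can be axiomatized by a process clause of roughly the following form (illustrative notation, with non-logical constants out and in; value passing is elided, as in pi0):

    forall x forall P forall Q. ((P par Q) -o ((out x P) par (in x Q)))

Read with ``entailed-by'', the par of the two prefixed agents reduces, in one communication step, to the par of their continuations P and Q.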



Unification Under a Mixed Prefix

Dale Miller

Journal of Symbolic Computation, Vol. 14 (1992), 321-358.

Unification problems are identified with conjunctions of equations between simply typed lambda-terms where free variables in the equations can be either universally or existentially quantified. Two schemes for simplifying quantifier alternation, called Skolemization and raising (a dual of Skolemization), are presented. In this setting, where variables of functional type can be quantified and not all types contain closed terms, the naive generalization of first-order Skolemization has several technical problems that are addressed. The method of searching for pre-unifiers described by Huet is easily extended to this setting, although solving the so-called flexible-flexible unification problems is undecidable since types may be empty. Unification problems may have numerous incomparable unifiers. Occasionally, unifiers share common structure, called factors. Various situations in which factors exist are explored.
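
To see the duality, consider the prefix (forall x exists y. B). Raising rewrites it to (exists h forall x. B[y := (h x)]), introducing a new existential variable h of functional type, while Skolemization instead yields (forall x. B[y := (f x)]) for a new constant f: one transformation moves the existential quantifier out over the universal, the other eliminates it.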



From Operational Semantics to Abstract Machines

John Hannan and Dale Miller

Mathematical Structures in Computer Science, Vol. 2, No. 4 (1992), 415-459.

We consider the problem of mechanically constructing abstract machines from operational semantics, producing intermediate-level specifications of evaluators guaranteed to be correct with respect to the operational semantics. We construct these machines by repeatedly applying correctness-preserving transformations to operational semantics until the resulting specifications have the form of abstract machines. Though not automatable in general, this approach to constructing machine implementations can be mechanized, providing machine-verified correctness proofs. As examples we present the transformation of specifications for both call-by-name and call-by-value evaluation of the untyped lambda-calculus into abstract machines that implement such evaluation strategies. We also present extensions to the call-by-value machine for a language containing constructs for recursion, conditionals, concrete data types, and built-in functions. In all cases, the correctness of the derived abstract machines follows from the (generally transparent) correctness of the initial operational semantic specification and the correctness of the transformations applied.
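
The starting point of such a derivation is an ordinary big-step specification; for call-by-name evaluation of the untyped lambda-calculus it consists of just two rules (a standard specification, recalled here for orientation rather than quoted from the paper):

    (lam x. M) eval (lam x. M)
    (M N) eval V  if  M eval (lam x. P)  and  P[N/x] eval V

Successive correctness-preserving transformations of rules such as these eventually produce a machine in the style of the Krivine machine.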



Unification of Simply Typed Lambda-Terms as Logic Programming

Dale Miller

Eighth International Conference on Logic Programming, edited by Koichi Furukawa, Paris, France. MIT Press, June 1991, pp. 255 - 269.

The unification of simply typed lambda-terms modulo the rules of beta- and eta-conversions is often called ``higher-order'' unification because of the possible presence of variables of functional type. This kind of unification is undecidable in general and if unifiers exist, most general unifiers may not exist. In this paper, we show that such unification problems can be coded as a query of the logic programming language L_lambda in a natural and clear fashion. In a sense, the translation only involves explicitly axiomatizing in L_lambda the notions of equality and substitution of the simply typed lambda-calculus: the rest of the unification process can be viewed as simply an interpreter of L_lambda searching for proofs using those axioms.
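
The flavor of this axiomatization can be seen in the ``copy'' clauses for object-level equality of untyped lambda-terms (one common formulation, stated here with assumed constructors app and abs; the paper develops the simply typed case):

    copy (app M N) (app P Q) :- copy M P, copy N Q.
    copy (abs M) (abs N) :- pi x\ pi y\ (copy x y => copy (M x) (N y)).

Here pi is the universal quantifier of L_lambda and => is implication in goals: to relate two abstractions, introduce two fresh variables, assume they are copies of one another, and recursively copy the bodies.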



Uniform proofs as a foundation for logic programming

Dale Miller, Gopalan Nadathur, Frank Pfenning, and Andre Scedrov

Annals of Pure and Applied Logic, Vol. 51 (1991), 125 - 157.

A proof-theoretic characterization of logical languages that form suitable bases for Prolog-like programming languages is provided. This characterization is based on the principle that the declarative meaning of a logic program, provided by provability in a logical system, should coincide with its operational meaning, provided by interpreting logical connectives as simple and fixed search instructions. The operational semantics is formalized by the identification of a class of cut-free sequent proofs called uniform proofs. A uniform proof is one that can be found by a goal-directed search that respects the interpretation of the logical connectives as search instructions. The concept of a uniform proof is used to define the notion of an abstract logic programming language, and it is shown that first-order and higher-order Horn clauses with classical provability are examples of such a language. Horn clauses are then generalized to hereditary Harrop formulas and it is shown that first-order and higher-order versions of this new class of formulas are also abstract logic programming languages if the inference rules are those of either intuitionistic or minimal logic. The programming language significance of the various generalizations to first-order Horn clauses is briefly discussed.
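
Concretely, the fixed search instructions read the connectives of a goal as follows: to prove G1 and G2, prove both conjuncts (AND); to prove G1 or G2, prove one of them (OR); to prove an existential goal, prove some instance of it (INSTANCE); to prove D => G, augment the program with D and prove G (AUGMENT); and to prove a universal goal, prove a generic instance built with a new constant (GENERIC).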



A Logic Programming Language with Lambda-Abstraction, Function Variables, and Simple Unification

Dale Miller

Journal of Logic and Computation, Vol. 1, No. 4 (1991), 497 - 536.

It has been argued elsewhere that a logic programming language with function variables and lambda-abstractions within terms makes a good meta-programming language, especially when an object-language contains notions of bound variables and scope. The lambda Prolog logic programming language and the related Elf and Isabelle systems provide meta-programs with both function variables and lambda-abstractions by containing implementations of higher-order unification. This paper presents a logic programming language, called L_lambda, that also contains both function variables and lambda-abstractions, although certain restrictions are placed on occurrences of function variables. As a result of these restrictions, an implementation of L_lambda does not need to implement full higher-order unification. Instead, an extension to first-order unification that respects bound variable names and scopes is all that is required. Such unification problems are shown to be decidable and to possess most general unifiers when unifiers exist. A unification algorithm and logic programming interpreter are described and proved correct. Several examples of using L_lambda as a meta-programming language are presented.
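
For example, under this restriction the equation (F x y) = (g y x), in which the function variable F is applied to the distinct bound variables x and y, has the most general unifier F = (lambda x lambda y (g y x)); an occurrence such as (F (g x)), where a function variable is applied to a non-variable argument, falls outside L_lambda.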



An Extension to ML to Handle Bound Variables in Data Structures: Preliminary Report

Dale Miller

Proceedings of the Logical Frameworks BRA Workshop, May 1990.

Most conventional programming languages have direct methods for representing first-order terms (say, via concrete datatypes in ML). If it is necessary to represent structures containing bound variables, such as lambda-terms, formulas, types, or proofs, these must first be mapped into first-order terms, and then a significant number of auxiliary procedures must be implemented to manage bound variable names, check for free occurrences, do substitution, test for equality modulo alpha-conversion, etc. We shall show how the applicative core of the ML programming language can be enhanced so that lambda-terms can be represented more directly and so that the enhanced language, called ML_lambda, provides a more elegant method of manipulating bound variables within data structures. In fact, the names of bound variables will not be accessible to the ML_lambda programmer. This extension to ML involves the following: the introduction of the new type constructor ('a => 'b) for the type of lambda-terms formed by abstracting a parameter of type 'a out of a term of type 'b; a very restricted and simple form of higher-order pattern matching; a method for extending a given data structure with a new constructor; and a method for extending function definitions to handle such new constructors. We present several examples of ML_lambda programs.



A Logic Programming Approach to Manipulating Formulas and Programs

Dale Miller and Gopalan Nadathur

IEEE Symposium on Logic Programming, San Francisco, edited by Seif Haridi, pp. 379-388, September 1987.

First-order Horn clause logic can be extended to a higher-order setting in which function and predicate symbols can be variables and terms are replaced with simply typed lambda-terms. For such a logic programming language to be complete in principle, it must incorporate higher-order unification. Although higher-order unification is more complex than usual first-order unification, its availability makes writing certain kinds of programs far more straightforward. In this paper, we present such programs written in a higher-order version of Prolog called lambda Prolog. These programs manipulate structures, such as formulas and programs, which contain abstractions or bound variables. We show how a simple natural deduction theorem prover can be implemented in this language. Similarly we demonstrate how several simple program transformers for a functional programming language can be written in lambda Prolog. These lambda Prolog programs exploit the availability of lambda-conversion and higher-order unification to elegantly deal with several of the awkward aspects inherent in these tasks. We argue that writing similar procedures in other languages would yield programs which are much less perspicuous.
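
Two clauses of such a natural deduction prover suggest the style (a sketch in the spirit of the paper, not an excerpt from it; imp and forall are assumed object-level constructors):

    prove (imp A B) :- hyp A => prove B.
    prove (forall B) :- pi x\ prove (B x).

Implication introduction uses an implication in the goal to give the new hypothesis a scope, and universal introduction uses the quantifier pi to generate a fresh eigenvariable; neither rule needs any explicit management of variable names.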



Encoding Transition Systems in Sequent Calculus: Preliminary Report

Raymond McDowell, Dale Miller, and Catuscia Palamidessi

To appear in the Proceedings of the 1996 Workshop on Linear Logic, Electronic Notes in Theoretical Computer Science (ENTCS).

Linear logic has been used to specify the operational semantics of various process calculi. In this paper we explore how meta-level judgments, such as simulation and bisimulation, can be established using such encodings. In general, linear logic is too weak to derive such judgments, and we focus on an extension to linear logic using definitions. We explore this extension in the context of transition systems.
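
For example, simulation is naturally treated as a definition rather than as a theory: sim P Q is defined by a clause of roughly the form

    sim P Q ::= forall A forall P'. (P --A--> P') implies
                  (exists Q'. (Q --A--> Q') and (sim P' Q'))

and the rules that definitions add to linear logic then allow such judgments to be both used and established inside proofs (the ::= notation above is only suggestive of the definition mechanism).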


