Comète-Parsifal Seminar (Abstracts 2008)


Thu Dec 11, 10:30 Carsten Schuermann Celf: An implementation of the concurrent logical framework CLF.
Tue Dec 02, 14:30 Romain Beauxis and David Baelde Focusing in the asynchronous pi-calculus
Tue Nov 04, 14:30 Alessio Guglielmi Proof Identity, Proof Complexity and Deep Inference
Tue Oct 21, 11:00 Stefan Hetzl On the non-confluence of cut-elimination
Fri Oct 03, 14:30 Stefan Haar Partial Orders are good for you!
Mon Sep 15, 13:30 Linda Postniece Cut-elimination and Proof-search for Bi-Intuitionistic Logic Using Nested Sequents
Mon Sep 15, 14:45 Revantha Ramanayake Valentini's cut-elimination for provability logic resolved
Thu Jul 24, 16:00 Yuxi Fu Expressiveness of pi and CCS
Thu Jul 24, 15:00 Yuxin Deng Testing Probabilistic Processes
Tue Jun 17, 14:30 Simon Kramer Reducing Provability to Knowledge in Multi-Agent Systems
Mon Jun 16, 14:30 Roberto Bagnara On the Design of Generic Static Analyzers for Imperative Languages
Fri Jun 13, 14:30 Jorge A. Perez On the Expressiveness and Decidability of Higher-Order Process Calculi
Thu May 29, 11:00 Maribel Fernandez Nominal Matching and Alpha-Equivalence
Wed May 14, 16:00 Simon Kramer A General Definition of Malware
Tue Apr 08, 14:30 Carlos Olarte On the Expressive Power of Universal Timed Concurrent Constraint Programming
Tue Apr 01, 14:30 Luca Fossati A Petri Net Model of Handshake Circuits
Tue Mar 25, 14:30 Jesus Aranda CCS with Replication in the Chomsky Hierarchy: The Expressive Power of Divergence
Tue Mar 18, 14:30 Kazushige Terui Computational Ludics
Thu Mar 13, 10:30 Christelle Braun Overview on Quantum Computation and Quantum Information Theory
Thu Feb 28, 14:30 Noam Zeilberger Focusing with higher-order rules
Fri Feb 22, 14:00 Daniele Gorla Towards a Unified Approach to Encodability and Separation Results for Process Calculi
Tue Feb 19, 14:00 Romain Beauxis A probabilistic powerdomain extension for the concurrent constraint programming
Tue Jan 22, 14:00 Matteo Capelletti Parsing with Non-Associative Lambek Grammars
Tue Jan 15, 14:00 Kosta Dosen General Proof Theory (Théorie Générale de la Démonstration)


Dec 11 2008, 10:30
Speaker: Carsten Schuermann
Title: Celf: An implementation of the concurrent logical framework CLF.

CLF (Concurrent LF) is a logical framework for specifying and implementing deductive and concurrent systems from areas such as programming language theory, security protocol analysis, process algebras, and logics. Celf is an implementation of the CLF type theory, which extends the LF type theory with linear types to support the representation of state and with a monad to support the representation of concurrency.

In my talk I will give a gentle introduction to the CLF type theory and demo the type inference and logic programming features of the Celf system.

Dec 02 2008, 14:30
Speaker: Romain Beauxis and David Baelde
Title: Focusing in the asynchronous pi-calculus

During the asynchronous parallel execution of two concurrent programs, the executions of the programs are interleaved. This means that if the first program takes three steps during its execution and the second takes two, then any of the possible interleavings may be observed, so an explicit exploration of all possible executions of the two programs in parallel has a very high complexity.
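The combinatorics behind this blow-up can be checked directly: two independent programs with m and n steps admit C(m+n, m) interleavings. A small illustrative script (our own example, not from the talk):

```python
from itertools import combinations
from math import comb

def interleavings(p, q):
    """All interleavings of two step sequences p and q,
    preserving the internal order of each sequence."""
    n = len(p) + len(q)
    results = []
    for positions in combinations(range(n), len(p)):
        pos, pi, qi = set(positions), iter(p), iter(q)
        merged = [next(pi) if i in pos else next(qi) for i in range(n)]
        results.append(tuple(merged))
    return results

runs = interleavings(["a1", "a2", "a3"], ["b1", "b2"])
print(len(runs), comb(5, 2))  # 10 10 -- matching C(3+2, 3)
```

With three and two steps there are already ten schedules to explore; the count grows exponentially with the step budget, which is what focalisation is meant to tame.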

A similar issue was already raised in linear logic, leading to the definition of a certain class of focalised executions. Such executions constrain the evaluation of the two programs so that all the transitions of the first one are performed before those of the second begin. It is then proved, in linear logic, that proof search is not weakened, while the complexity of the search is greatly reduced.

The work presented here proposes a transposition of this technique to the asynchronous pi-calculus. We define the notion of focalised transitions for the asynchronous pi-calculus, which greatly reduces the complexity of evaluating a process. Furthermore, we prove that focalised transitions preserve all the available information on the processes.

Nov 04 2008, 14:30
Speaker: Alessio Guglielmi (University of Bath and INRIA Nancy Grand-Est)
Title: Proof Identity, Proof Complexity and Deep Inference

We have been dealing with mathematical proofs for millennia, but we still don't know them enough to answer elementary questions, such as: `When are two proofs identical?' and `How complex need proofs be?' These questions correspond to open problems at the crossing of logic and computer science: respectively, the so-called `Hilbert's 24th problem' and `coNP vs NP'.

To deal with proof identity, we need new proof systems with a much improved ability to see the fine detail of proofs, i.e., we need systems whose inference rules only deal with a minimal amount of information. These rules are called `local' and cannot be obtained in traditional proof theory. However, locality is achievable by adopting a relatively novel approach to the structure of proofs, which we call `deep inference'. Thanks to locality, not only are we able to see more detail, we are also able to discard unnecessary detail, the so-called `bureaucracy' of traditional proof systems. In particular, we can show that normalisation can be achieved in a largely syntax-independent way by removing bureaucracy from deep-inference proofs, and reducing them to their `atomic flow'.

We are about to satisfactorily solve the proof identity problem, by a coordinated effort involving several research groups. We have also made progress towards reducing the complexity of proofs and of their normalisation.

I will give an accessible survey of these topics, for anybody with an interest in the theoretical foundations of computer science. More on deep inference at

Oct 21 2008, 11:00
Speaker: Stefan Hetzl
Title: On the non-confluence of cut-elimination

Cut-elimination is one of the most important proof transformations in logic. It was introduced by Gentzen as a tool for consistency proofs in the tradition of Hilbert's programme, but it has since found many other applications. One of them is the application of cut-elimination to the analysis of concrete mathematical proofs ('proof mining'). A famous example is Girard's demonstration that the original combinatorial proof of van der Waerden's theorem can be obtained from the topological Furstenberg-Weiss proof.

However, cut-elimination is an inherently non-deterministic process. At each stage one can choose between different cuts to reduce and, for a single cut, between different ways of reducing it. It is not clear to what extent this formal non-determinism may lead to mathematical differences in the resulting elementary proofs. Understanding the possible span of the results of cut-elimination procedures is of fundamental importance for judging proof analyses based on these methods.

In this talk I will treat the standard syntactic cut-elimination for first-order classical logic and report on a recent result obtained jointly with M. Baaz on the non-confluence of this reduction: it is possible to construct a sequence of polynomial-size proofs such that a proof in this sequence has non-elementarily many different cut-free normal forms (each of non-elementary size). These normal forms are different in a strong sense: they not only represent different Herbrand-disjunctions but also have different propositional structures, and hence are not substitution instances of a single general proof.

This result shows that the constructive content of a cut-free proof does not only depend on the proof with cuts from which it was generated but also on the way the cuts are eliminated.
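For readers unfamiliar with the terminology: a function is elementary if it is bounded by a tower of exponentials of some fixed height, so "non-elementarily many" means more than any such tower. The yardstick tower function, evaluated for small arguments (our own illustration, not part of the result):

```python
def tower(n, base=2):
    """tower(0) = 1 and tower(n) = base ** tower(n - 1):
    a tower of exponentials of height n."""
    result = 1
    for _ in range(n):
        result = base ** result
    return result

print([tower(n) for n in range(5)])  # [1, 2, 4, 16, 65536]
```

Already tower(5) has nearly twenty thousand decimal digits, which conveys how fast the number of normal forms is allowed to grow here.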

Oct 03 2008, 14:30
Speaker: Stefan Haar
Title: Partial Orders are good for you!

The talk takes a stand to advocate the use of Petri net-type models and partial order unfolding semantics for the analysis of large networked systems, exemplified by fault diagnosis in telecommunication networks. The methodology of asynchronous diagnosis is presented, followed by a discussion of related problems that are fun but challenging, such as probabilistic processes, distribution of unfolding, and testing of input/output automata over partially ordered I/O streams.
Sep 15 2008, 13:30
Speaker: Linda Postniece
Title: Cut-elimination and Proof-search for Bi-Intuitionistic Logic Using Nested Sequents

Bi-intuitionistic logic is the extension of intuitionistic logic with a connective dual to implication. We propose a new sequent calculus for bi-intuitionistic logic which sits somewhere between display calculi and traditional sequent calculi by using nested sequents. Our calculus enjoys a simple (purely syntactic) cut-elimination proof, as do display calculi. But it has an easily derivable variant calculus which is amenable to automated proof search, as are (some) traditional sequent calculi. As far as we know, our new calculus is the first sequent calculus for bi-intuitionistic logic which uses no semantic additions like labels, which has a purely syntactic cut-elimination proof, and which can be used naturally for backwards proof-search. (Joint work with Rajeev Goré and Alwen Tiu)
Sep 15 2008, 14:45
Speaker: Revantha Ramanayake
Title: Valentini's cut-elimination for provability logic resolved

In 1983, Valentini presented a syntactic proof of cut-elimination for the provability logic GL. The sequent calculus he used was built from sets, as opposed to multisets, thus avoiding an explicit contraction rule. From a syntactic viewpoint it is more satisfying and formal to explicitly identify applications of the contraction rule. However, it has been claimed recently that Valentini's arguments to eliminate the cut-rule do not terminate for sequents built from multisets. This claim has led to much confusion regarding the source of failure. Consequently, various authors have sought new proofs of cut-elimination for GL in a multiset setting. In this talk we identify the mistake in the claim and present a proof of cut-elimination for sequents built from multisets based on Valentini's original arguments. Complications arising from the contraction rule require special attention. Finally we discuss a generalisation of the cut-elimination procedure to tackle two logics that have similar characteristic axioms, namely Grz and Go.
Jul 24 2008, 15:00
Speaker: Yuxin Deng
Title: Testing Probabilistic Processes

In this talk I will review a natural extension of the testing theory of De Nicola & Hennessy to processes with probabilistic and nondeterministic choice. For both the may and must testing preorders, I present their characterisations in terms of simulations, modal logics, and inequational axioms in a probabilistic extension of the process algebra CSP.

(The talk is based on joint work with Rob van Glabbeek, Matthew Hennessy, Carroll Morgan, and Chenyi Zhang)

Jul 24 2008, 16:00
Speaker: Yuxi Fu
Title: Expressiveness of pi and CCS

Subbisimilarity is proposed as a general tool to classify the relative expressive power of process calculi. The expressiveness of several variants of CCS is compared in terms of the subbisimilarity relationship. Similar investigation is also carried out for the variants of the pi calculus. The relative expressiveness of the different forms of the choice operation and the different ways of introducing infinite behaviors are systematically studied in both the frameworks of CCS and pi. Some issues concerning the expressiveness of both CCS and pi are clarified. Several open problems are solved along the way. The subbisimilarity approach and the relative expressiveness results can be applied to show the independence of the operators of the pi calculus. The definition of the subbisimilarity relationship can be further strengthened with computational requirement, which leads to a uniform treatment of computation and interaction.
Jun 13 2008, 14:30
Speaker: Jorge A. Perez
Title: On the Expressiveness and Decidability of Higher-Order Process Calculi

In this talk I will give an overview of a recent work on the expressiveness of higher-order calculi. In higher-order process calculi the values exchanged in communications may contain processes. I will introduce HOcore, a core calculus of higher-order concurrency; it has only the operators necessary to express higher-order communications: input prefix, process output, and parallel composition. By exhibiting a nearly deterministic encoding of Minsky Machines, HOcore is shown to be Turing Complete and therefore its termination problem is undecidable. Strong bisimilarity, however, is proved to be decidable. Further, the main forms of strong bisimilarity for higher-order processes (higher-order bisimilarity, context bisimilarity, normal bisimilarity, barbed congruence) coincide. They also coincide with their asynchronous versions.
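A (two-counter) Minsky machine, the model encoded here, has only increment and test-and-decrement instructions over counters; its halting problem is already undecidable, which is why an encoding of it yields Turing completeness. A minimal interpreter for the model, with an illustrative transfer program (our own sketch, not the HOcore encoding itself):

```python
def run_minsky(program, c0=0, c1=0, max_steps=10_000):
    """Run a two-counter Minsky machine.
    Instructions: ("INC", r, nxt)      increment counter r, jump to nxt;
                  ("DECJZ", r, nz, z)  if counter r > 0, decrement and go to nz,
                                       otherwise go to z.
    The machine halts when the program counter leaves the program."""
    counters, pc = [c0, c1], 0
    for _ in range(max_steps):
        if not (0 <= pc < len(program)):
            return counters  # halted
        instr = program[pc]
        if instr[0] == "INC":
            counters[instr[1]] += 1
            pc = instr[2]
        else:  # DECJZ
            _, r, nz, z = instr
            if counters[r] > 0:
                counters[r] -= 1
                pc = nz
            else:
                pc = z
    raise RuntimeError("step budget exhausted")

# Transfer the contents of counter 0 into counter 1, then halt.
prog = [("DECJZ", 0, 1, 2), ("INC", 1, 0)]
print(run_minsky(prog, c0=3, c1=4))  # [0, 7]
```

Encoding each instruction and counter as a process, so that machine steps correspond to reductions, is the standard shape of such expressiveness proofs.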

If time permits, I will discuss two additional results:

(i) a sound and complete axiomatization of bisimilarity, and

(ii) the proof that bisimilarity becomes undecidable if at least four static (i.e., top-level) restrictions are added to the calculus.

Jun 17 2008, 14:30
Speaker: Simon Kramer
Title: Reducing Provability to Knowledge in Multi-Agent Systems

We construct a reduction of provability to a combination of four kinds of knowledge in multi-agent systems. These kinds are: individual knowledge (knowledge of messages), plain propositional knowledge (knowledge that a state of affairs is the case), common knowledge (propositional knowledge shared in a community of agents), and a new kind of knowledge, namely adductive knowledge (propositional knowledge contingent on the adduction of certain individual knowledge, e.g., through oracle invocation).

Keywords: modal logic (conditional, epistemic, provability, and temporal), knowledge (individual, propositional, common, and adductive), designated-verifier proofs, multi-agent systems

Jun 16 2008, 14:30
Speaker: Roberto Bagnara
Title: On the Design of Generic Static Analyzers for Imperative Languages

The design and implementation of precise static analyzers for significant fragments of modern imperative languages like C, C++, Java and Python is a challenging problem. Here we consider a core imperative language that has several features found in mainstream languages, such as recursive functions, run-time system and user-defined exceptions, and a realistic data and memory model. For this language we provide a concrete semantics -- characterizing both finite and infinite computations -- and a generic abstract semantics that we prove sound with respect to the concrete one. We say the abstract semantics is generic since it is designed to be completely parametric on the analysis domains: in particular, it provides support for relational domains (i.e., abstract domains that can capture the relationships between different data objects). We also sketch how the proposed methodology can be extended to accommodate a larger language that includes pointers, compound data objects and non-structured control flow mechanisms. The approach, which is based on structured, big-step operational semantics and on abstract interpretation, is modular in that the overall static analyzer is naturally partitioned into components with clearly identified responsibilities and interfaces, something that greatly simplifies both the proof of correctness and the implementation.
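To give a feel for what "parametric on the analysis domain" means, here is a toy abstract-interpretation step over the classical interval domain; the domain and the analyzed snippet are our own illustration, not the analyzer described in the talk:

```python
class Interval:
    """Abstract values [lo, hi]; a None bound stands for -inf / +inf."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def join(self, other):
        """Least upper bound: used to merge branches of a conditional."""
        lo = None if None in (self.lo, other.lo) else min(self.lo, other.lo)
        hi = None if None in (self.hi, other.hi) else max(self.hi, other.hi)
        return Interval(lo, hi)
    def add(self, other):
        """Abstract addition: add bounds componentwise."""
        lo = None if None in (self.lo, other.lo) else self.lo + other.lo
        hi = None if None in (self.hi, other.hi) else self.hi + other.hi
        return Interval(lo, hi)
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Abstractly execute:  x = 1; if (*) x = x + 2;  -- join the two branches.
x = Interval(1, 1)
x_then = x.add(Interval(2, 2))
print(x.join(x_then))  # [1, 3]
```

A generic analyzer is written against the lattice interface (join, abstract operations); swapping in a relational domain with the same interface changes the precision but not the analyzer.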
May 29 2008, 11:00
Speaker: Maribel Fernandez
Title: Nominal Matching and Alpha-Equivalence

Nominal matching is matching modulo alpha-equality, and has applications in programming languages and theorem proving, amongst others. In this talk I will describe efficient algorithms to check the validity of equations involving binders, and also to solve matching problems modulo alpha-equivalence, using the nominal approach.
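As background, alpha-equivalence itself can be decided by comparing the binder-relative (de Bruijn-style) positions of variables. The checker below is our own sketch of that idea, not the nominal algorithms of the talk, which instead manipulate atom permutations and freshness constraints:

```python
def alpha_eq(t, u, env_t=None, env_u=None):
    """t, u are terms: ("var", name) | ("lam", name, body) | ("app", f, a).
    Each env maps a bound name to its binder depth; free names must
    match literally, bound names must match by depth."""
    env_t = env_t or {}
    env_u = env_u or {}
    if t[0] != u[0]:
        return False
    if t[0] == "var":
        return env_t.get(t[1], t[1]) == env_u.get(u[1], u[1])
    if t[0] == "lam":
        d = len(env_t)  # fresh depth for this binder
        return alpha_eq(t[2], u[2], {**env_t, t[1]: d}, {**env_u, u[1]: d})
    return alpha_eq(t[1], u[1], env_t, env_u) and alpha_eq(t[2], u[2], env_t, env_u)

# \x. x  is alpha-equal to  \y. y,  but  \x. z  is not alpha-equal to  \x. x
print(alpha_eq(("lam", "x", ("var", "x")), ("lam", "y", ("var", "y"))))  # True
print(alpha_eq(("lam", "x", ("var", "z")), ("lam", "x", ("var", "x"))))  # False
```

The nominal approach achieves the same equivalence check without a fixed traversal order, which is what makes efficient matching modulo alpha possible.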
May 14 2008, 16:00
Speaker: Simon Kramer
Title: A General Definition of Malware

We propose a general, formal definition of malware in the language of modal logic. Our definition is general thanks to its abstract formulation, which, being abstract, is independent of --- but nonetheless generally applicable to --- the manifold concrete manifestations of malware. From our formulation of malware, we derive equally general and formal definitions of benware (not malware), anti-malware ("antibodies" against malware), and medware ("medicine" for affected software). We obtain results that express fundamental relations between malware, benware, anti-malware, and medware. Our general defining principle is causation of (in)correctness.


Keywords: malware (computer viruses, worms, Trojan horses, etc.), benware, anti-malware, medware, modal logic, formal software engineering

Apr 08 2008, 14:30
Speaker: Carlos Olarte
Title: On the Expressive Power of Universal Timed Concurrent Constraint Programming

The timed concurrent constraint programming model tcc is a process calculus, closely related to first-order linear-temporal logic (FLTL), for modeling reactive systems. Universal tcc (utcc) is a conservative extension of tcc with an abstraction operator for modeling mobile reactive systems. It has been shown in the literature that tcc processes are finite-state and thus cannot encode Turing-powerful formalisms. In contrast, here we provide encodings in utcc of formalisms such as Minsky machines and the lambda-calculus. Although both formalisms are Turing-equivalent, these encodings serve different purposes: the encoding of Minsky machines uses a monadic constraint system which, together with the utcc theory, allows us to prove a new undecidability result for a fragment of FLTL, namely the undecidability of the validity problem for monadic FLTL without equality and function symbols. This result justifies the restriction imposed in previous decidability results on the quantification of flexible variables in monadic FLTL.

The encoding of the lambda-calculus uses instead a polyadic constraint system but it is compositional unlike that of Minsky machines. Compositionality is an important property of an encoding as it may allow structural analysis in utcc of the source terms; i.e., functional programs.

In summary, we show the full computational expressiveness of utcc, its compositional correspondence to functional programming (lambda-calculus), and its applicability for proving new results in standard concurrency formalisms such as FLTL.

Apr 01 2008, 14:30
Speaker: Luca Fossati
Title: A Petri Net Model of Handshake Circuits

We propose a Petri net model of handshake circuits. Handshake circuits are asynchronous circuits which enforce several properties, such as absence of transmission interference and insensitivity to propagation delays on wires. We introduce the notion of handshake Petri net, a Petri net with a specific external interface. We show that the set of observable quiescent traces generated by such a net captures the properties defining a handshake protocol. Conversely, we show that for any handshake protocol we can construct a corresponding net. We also study different subclasses of the model. Many examples are provided.
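To fix intuitions about the base model: a Petri net transition is enabled when every input place carries enough tokens, and firing consumes input tokens and produces output tokens. A minimal firing-rule sketch over a req/ack handshake cycle (our own illustration; handshake Petri nets add an external interface on top of this):

```python
from collections import Counter

def enabled(marking, transition):
    """A transition (inputs, outputs) is enabled if the marking covers its inputs."""
    inputs, _ = transition
    return all(marking[p] >= n for p, n in inputs.items())

def fire(marking, transition):
    """Fire an enabled transition: consume input tokens, produce output tokens."""
    inputs, outputs = transition
    assert enabled(marking, transition), "transition not enabled"
    new = Counter(marking)
    new.subtract(inputs)
    new.update(outputs)
    return +new  # unary + drops places with zero tokens

# req/ack handshake cycle:  idle --req--> waiting --ack--> idle
req = ({"idle": 1}, {"waiting": 1})
ack = ({"waiting": 1}, {"idle": 1})
m = fire(Counter({"idle": 1}), req)
print(dict(m))          # {'waiting': 1}
print(enabled(m, ack))  # True
```

The alternation enforced by the single circulating token is exactly the kind of protocol property that the quiescent-trace semantics is meant to capture.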

Mar 25 2008, 14:30
Speaker: Jesus Aranda
Title: CCS with Replication in the Chomsky Hierarchy: The Expressive Power of Divergence

A remarkable result in [4] shows that, in spite of being less expressive than CCS w.r.t. weak bisimilarity, CCS! (a CCS variant where infinite behavior is specified by using replication rather than recursion) is Turing powerful. This is done by encoding Random Access Machines (RAM) in CCS!. The encoding is said to be non-faithful because it may move from a state which can lead to termination into a divergent one which does not correspond to any configuration of the encoded RAM; i.e., the encoding is not termination-preserving. In this paper we study the existence of faithful encodings into CCS! of models of computability strictly less expressive than Turing machines, namely grammars of Types 1 (Context Sensitive Languages), 2 (Context Free Languages) and 3 (Regular Languages) in the Chomsky Hierarchy. We provide faithful encodings of Type 3 grammars. We show that it is impossible to provide a faithful encoding of Type 2 grammars and that termination-preserving CCS! processes can generate languages which are not Type 2. We finally show that the languages generated by termination-preserving CCS! processes are Type 1.

[4] N. Busi, M. Gabbrielli, and G. Zavattaro. Comparing recursion, replication, and iteration in process calculi. In ICALP'04, volume 3142 of Lecture Notes in Computer Science, pages 307-319. Springer-Verlag, 2004.
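As a reminder of what the lowest level of the hierarchy looks like: Type 3 (regular) grammars have rules of the shape X -> aY or X -> a. A small enumerator for such a grammar (our own example grammar, unrelated to the CCS! encodings):

```python
def derive(grammar, start, max_len):
    """Enumerate the words of a right-linear (Type 3) grammar up to max_len.
    Rules: nonterminal -> list of (terminal, next-nonterminal-or-None)."""
    words, frontier = set(), {("", start)}
    while frontier:
        prefix, nt = frontier.pop()
        for terminal, nxt in grammar[nt]:
            w = prefix + terminal
            if len(w) > max_len:
                continue
            if nxt is None:
                words.add(w)      # rule X -> a finishes a word
            else:
                frontier.add((w, nxt))  # rule X -> aY continues derivation
    return sorted(words)

# S -> aS | b   generates the regular language a*b
grammar = {"S": [("a", "S"), ("b", None)]}
print(derive(grammar, "S", 4))  # ['aaab', 'aab', 'ab', 'b']
```

A faithful encoding of such a grammar must generate exactly these words without introducing spurious divergent runs, which is the termination-preservation constraint studied in the talk.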

Mar 18 2008, 14:30
Speaker: Kazushige Terui
Title: Computational Ludics

We present a computationally adapted version of Girard's ludics, aiming at giving a foundational framework for computability and complexity theory. We introduce a handy syntax for designs that simplifies and extends Curien's concrete syntax with explicit cuts and identities. We also consider design generators, which allow for finite representation of some infinite designs. A normalization procedure in the style of Krivine's abstract machine works on generators directly, giving rise to an effective means of normalization.

Our extended system enjoys the analytical theorems of ludics in a suitably modified form. In addition, we prove a general form of full completeness with respect to coinductively defined behaviours and infinitary proofs.

Finally, an analysis is made of those designs representing data objects. We show that any set of finite data forms a behaviour, and then discuss acceptance of languages via normalization.

Mar 13 2008, 10:30
Speaker: Christelle Braun
Title: Overview on Quantum Computation and Quantum Information Theory

Quantum information theory results from the effort in recent decades to generalize classical information theory to the quantum world. This cross-disciplinary study of information processing within quantum mechanics has led to novel achievable limits for computation and communication capabilities, and has demonstrated remarkable effects such as quantum cryptography and teleportation. After a brief overview of the required basics of quantum mechanics, we will present quantum computation and its relation to the classical approach, before giving more insight into quantum information theory and sketching exciting illustrations of this promising field of research.
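As a concrete taste of the formalism: a qubit state is a unit vector of amplitudes, gates are unitary maps, and measurement probabilities are squared amplitude magnitudes (the Born rule). Applying a Hadamard gate to |0> is the standard first example, sketched here with plain tuples (our own illustration):

```python
import math

# A qubit state is an amplitude pair (a0, a1); |0> = (1, 0).
ket0 = (1.0, 0.0)

def hadamard(state):
    """H = 1/sqrt(2) * [[1, 1], [1, -1]], the basic superposition-creating gate."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

def probabilities(state):
    """Born rule: the probability of measuring outcome i is |amplitude_i|^2."""
    a0, a1 = state
    return (abs(a0) ** 2, abs(a1) ** 2)

plus = hadamard(ket0)
print(probabilities(plus))  # approximately (0.5, 0.5): an equal superposition
print(hadamard(plus))       # approximately (1.0, 0.0): H is its own inverse
```

That a second Hadamard undoes the first, by interference of amplitudes rather than of probabilities, is exactly the kind of effect with no classical counterpart.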

Feb 28 2008, 14:30
Speaker: Noam Zeilberger
Title: Focusing with higher-order rules

What makes Andreoli's notion of focusing proofs so robust? In this talk, I will try to argue for the naturality of focusing by giving a new presentation that makes very explicit the duality between focus and inversion, and between positive and negative polarity. Loosely inspired by Girard's ludics, the idea is to stratify the definition of a focusing sequent calculus into two stages: first define a form of "pattern-typing", and then define the focusing judgments by quantifying over patterns. Focus quantifies existentially, while inversion quantifies universally: positive connectives quantify over constructor patterns, negative connectives over destructor patterns. Besides giving a simple account of evaluation order and pattern-matching, this style of presentation has the benefit of being amenable to modular extension, since we can potentially encode many features into pattern-typing without affecting the overall structure of focusing proofs, or the meta-proofs of identity and cut. I will describe a few of these extensions, which are still preliminary.

Feb 22 2008, 14:00
Speaker: Daniele Gorla
Title: Towards a Unified Approach to Encodability and Separation Results for Process Calculi

In this talk, we present a unified approach to evaluating the relative expressive power of process calculi. In particular, we identify a small set of criteria (that have already been somehow presented in the literature) that an encoding should satisfy to be considered a good means for language comparison. We argue that the combination of such criteria is a valid proposal by noting that:

(i) the best known encodings that have appeared in the literature satisfy them;

(ii) this notion is not trivial, because there exist encodings that do not satisfy all the criteria we have proposed;

(iii) the best known separation results can be formulated in terms of our criteria; and

(iv) some widely believed (but never proved) separation results can be proved by using the criteria we propose.

Moreover, the way in which we prove known separation results is easier than the way in which such results were originally proved.

Feb 19 2008, 14:00
Speaker: Romain Beauxis
Title: A probabilistic powerdomain extension for the concurrent constraint programming

Concurrent constraint programming (CCP, [1]) is a model of computation based on the notion of a store as the information available to a process. Each process is equipped with a store, with respect to which it tests and adds constraints. During the execution, the store can only increase, in a domain-theoretic fashion.

A domain-theoretic denotational semantics has been defined in [2] that maps a process to the supremum store that it can reach. It is then possible to compute this supremum store by a fixed-point construction, based on the grammar of the process.

This work studies an extension of concurrent constraint programming with probabilistic executions. Although similar kinds of extensions have already been investigated in the literature [3, 4], we are interested in extending the original operational and denotational semantics in the most natural way. To do this, we propose to model probabilistic CCP with a probabilistic powerdomain, similar to the one described in [5], built on top of the original constraint lattice.

We apply these results to a very simple probabilistic CCP, based on the original CCP with blind probabilistic choice added. The results show a very natural extension of the original denotational semantics.
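The monotone-store idea is easy to prototype: take the constraint lattice to be sets of ground facts ordered by inclusion, let `tell` add information, and let `ask` suspend until the store entails its guard. A deterministic toy scheduler (our own sketch; the real model works over an arbitrary constraint system, and the probabilistic extension replaces the lattice by a powerdomain):

```python
def run_ccp(agents):
    """Each agent is a list of ("tell", fact) / ("ask", fact) steps.
    The store only grows; an "ask" suspends its agent until the fact is entailed.
    Runs round-robin sweeps until no agent can make progress."""
    store = set()
    pcs = [0] * len(agents)  # one program counter per agent
    progress = True
    while progress:
        progress = False
        for i, agent in enumerate(agents):
            while pcs[i] < len(agent):
                op, fact = agent[pcs[i]]
                if op == "tell":
                    store.add(fact)       # monotone: facts are never retracted
                elif fact not in store:   # ask on an unentailed fact: suspend
                    break
                pcs[i] += 1
                progress = True
    return store

producer = [("tell", "x=1"), ("tell", "y=2")]
consumer = [("ask", "y=2"), ("tell", "z=3")]
print(sorted(run_ccp([consumer, producer])))  # ['x=1', 'y=2', 'z=3']
```

The returned store is the least fixed point of the sweep, mirroring the fixed-point construction of the denotational semantics of [2] on this tiny lattice.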


[1]: Concurrent Constraint Programming Languages, Vijay A. Saraswat, PhD Thesis, Carnegie Mellon University, 1989

[2]: Semantic Foundations of Concurrent Constraint Programming, Vijay A. Saraswat, Martin Rinard and Prakash Panangaden, Conference Record of the Eighteenth Annual ACM Symposium on Principles of Programming Languages, 1991

[3]: Probabilistic Concurrent Constraint Programming, Vineet Gupta and Radha Jagadeesan and Vijay A. Saraswat, International Conference on Concurrency Theory, 1997

[4]: Stochastic Processes as Concurrent Constraint Programs, Vineet Gupta and Radha Jagadeesan and Prakash Panangaden, Symposium on Principles of Programming Languages, 1999

[5]: The troublesome probabilistic powerdomain, A. Jung and R. Tix, Third Workshop on Computation and Approximation, 1998

Jan 22 2008, 14:00
Speaker: Matteo Capelletti
Title: Parsing with Non-Associative Lambek Grammars

The Non-Associative Lambek calculus (NL, Lambek 1961) is a logic without structural rules which can be applied to the syntactic and semantic analysis (of fragments) of natural language. If we look at it as a logical system, we may want to develop theorem-proving methods for its sequents; if instead we look at it as a grammar system, we are interested in developing parsing systems for its grammars. For NL, the second problem is more general than the first (since it does not assume any given structure), and it is the one that I will address in my talk. However, the solution that I propose for parsing with NL grammars also provides (and in fact presupposes) a solution to the theorem-proving problem. Altogether, I will illustrate a normal-form polynomial parsing method for NL grammars.
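The flavour of categorial parsing can be conveyed on the application-only (AB) fragment: a word sequence is grammatical if its categories combine, under some bracketing, into the goal category via forward application (A/B, B => A) and backward application (B, B\A => A). A CYK-style recognizer over all bracketings (our own sketch; the talk's normal-form method for full NL is considerably more refined):

```python
def combine(x, y):
    """Forward application A/B, B => A and backward application B, B\\A => A.
    Categories: atoms are strings; ("/", A, B) is A/B; ("\\", B, A) is B\\A."""
    results = set()
    if isinstance(x, tuple) and x[0] == "/" and x[2] == y:
        results.add(x[1])
    if isinstance(y, tuple) and y[0] == "\\" and y[1] == x:
        results.add(y[2])
    return results

def recognize(cats, goal):
    """CYK chart over all binary bracketings of the category sequence."""
    n = len(cats)
    table = {(i, i + 1): {cats[i]} for i in range(n)}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            cell = set()
            for k in range(i + 1, i + span):
                for x in table[(i, k)]:
                    for y in table[(k, i + span)]:
                        cell |= combine(x, y)
            table[(i, i + span)] = cell
    return goal in table[(0, n)]

# "John" : np,   "saw" : (np \ s) / np,   "Mary" : np
NP, S = "np", "s"
saw = ("/", ("\\", NP, S), NP)
print(recognize([NP, saw, NP], S))  # True
```

The chart has O(n^2) cells, each filled from O(n) split points, which is the shape of the polynomial bound; NL proper additionally requires reasoning about the logical rules beyond bare application.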

Jan 15 2008, 14:00
Speaker: Kosta Dosen
Title: General Proof Theory (Théorie Générale de la Démonstration)

The central question of general proof theory is the question of the identity of proofs, a question treated either in the typed lambda calculus or in category theory. A survey will be given of the results obtained, mainly by treating the question in the second way.

Page maintainer: Carlos Olarte