Fall 2000, CSE 520: Lectures 21 and 22


A Calculus of Communicating Systems

The Calculus of Communicating Systems (CCS) was developed by Milner around 1980.

Motivation: A calculus with few, orthogonal mechanisms, able to represent all the relevant concepts of concurrent computations.

The aim of Milner was also to design a calculus with "maximum expressive power" (for concurrency) and the smallest number of concepts. More complex mechanisms should be built by using the basic ones.

Why we need a specific calculus for concurrency

We have seen that the lambda calculus is computationally complete, i.e. it is able to express any computable function. Thus we might wonder why we need another calculus for concurrent computation. The point is that the function computed by a program does not always represent all the interesting aspects of the computation. In particular, in the case of concurrent computations, there are some aspects which cannot be captured by a sequential model. One main such aspect is the following:

Interaction among processes

In sequential programs the various components of a system interact in a rigid, fixed way. Examples: concatenation of commands, function application.

In concurrency the interaction possibilities are much richer. Example: consider the following two program fragments:

A: x := 1
B: x := 0; x := x+1
In a sequential computation model, A and B are equivalent (i.e. they induce the same state transformation) in any context. In a concurrent computation model, on the contrary, there are contexts which distinguish them. Consider for instance the composition with the following fragment:
C: x := 2
We have that A | C and B | C (where | stands for parallel composition) are not equivalent. In fact, the first can produce only states where x is 1 or 2, while the latter can also produce the state where x is 3 (run x := 0, then x := 2, then x := x+1).

Nondeterminism

We have seen in the previous example that parallel composition can induce nondeterminism. We need to spend a few words on this concept, since it is a different kind of nondeterminism from the one you might be acquainted with.

Nondeterminism in sequential models

In this context nondeterminism is a convenient tool for expressing solutions to certain problems in an easy way, or for studying complexity (examples: search for a path in a graph, search for a proof, etc.). Examples of nondeterministic formalisms are nondeterministic automata and nondeterministic Turing machines. The characteristic of nondeterminism in these sequential formalisms is that it adds no computational power: a nondeterministic machine can always be simulated by a deterministic one (e.g. by backtracking or by exhaustive search), so a "wrong choice" can always be remedied.

Nondeterminism in concurrent models

In this context nondeterminism arises because of the way processes may interact with each other. Its characteristics are different: the choices are made at run time (by the scheduler, or by the relative speed of the processes), and they cannot be undone, since there is no backtracking in a distributed system. Because of this second point, controlling nondeterminism (i.e. trying to reduce the possibility of "wrong choices") is even more important here than in sequential programming. In sequential programming it is just a matter of efficiency; here it is a matter of avoiding crashes.

To illustrate what the "undesirable situations" are, consider the example of the dining philosophers:

n philosophers are sitting at a circular table. Between each two philosophers there is a fork (hence there are n forks on the table). Each philosopher can either think or eat. In order to eat, he needs two forks, and he can take only one fork at a time. All philosophers are the same, in the sense that they follow "the same attitude about thinking and eating". Also all forks are the same. Hence the situation is completely symmetric, i.e. there are no privileges, no pre-established ordering, etc.
This example is paradigmatic of a situation in which processes compete for shared and distributed resources. The "bad situations" are "deadlock" (each philosopher holds one fork, and nobody eats) and "starvation" (some philosopher never eats because his neighbors are quicker in getting the forks).

The problem of the dining philosophers is to guarantee maximal independence (hence avoid having a scheduler or a monitor who decides whose turn it is to eat) while avoiding those bad situations. Note that, even if we "convince" each philosopher to give back the fork in case of deadlock, it is not so easy to avoid starvation, because we could enter a loop in which all philosophers take one fork each, detect deadlock, put back the fork, take one fork again, etc.

This example was proposed by Dijkstra in the 70s as a benchmark to test the expressiveness of concurrent languages. It was observed by Rabin in the 80s that a completely distributed, symmetric solution, must rely on probabilistic methods. In this solution the starvation possibility is not ruled out, but has probability 0.
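As a preview of the CCS notation introduced below, here is a naive sketch (ours, not part of the classical formulation) of the system for n = 2; the names take_i, put_i and eat_i are chosen for illustration:

   Fork1 =def= take1.put1.Fork1
   Fork2 =def= take2.put2.Fork2
   Phil1 =def= ^take1.^take2.eat1.^put1.^put2.Phil1
   Phil2 =def= ^take2.^take1.eat2.^put2.^put1.Phil2
   Sys   =def= (Phil1 | Phil2 | Fork1 | Fork2)\{take1, take2, put1, put2}

If Phil1 takes Fork1 and Phil2 takes Fork2 (each philosopher starting with the fork on his own side), then each waits forever for the fork held by the other: this is exactly the deadlock described above.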

Interaction

In order to have the maximum expressivity with the smallest number of constructs, we need to understand what the basic kind of interaction is (the same, of course, should be done also for the other concepts relevant for concurrency). In general, interaction has to do with communication.

Concurrent systems offer several kinds of communication, depending on the medium. Examples are: shared variables, buffers and message queues, and channels connecting the processes directly.

Each of these kinds can be further divided into various cases: for instance, the communication can be synchronous or asynchronous, one-to-one or one-to-many. What is the basic kind of communication? Milner's answer was: none of the above! In his view, the reduction to essential principles requires avoiding the separation between agents (active entities) and resources (passive entities). In his approach, everything is a process, more or less active.

Thus the fundamental kind of interaction is not the one between two processes P and Q communicating via a buffer B, but rather between P and B, and between Q and B. In Milner's view, the fundamental model of interaction is synchronous and symmetric, i.e. the partners act at the same time, performing complementary actions. This kind of interaction is called handshaking: the partners agree simultaneously on performing the two (complementary) actions.

In the following, the complement of an action a will be denoted by ^a. Usually we will regard a as the action of "receiving along channel (or interface, or port) a", and ^a as the action of "sending along channel (interface, port) a". But let us not forget that this terminology is purely a convention: the two actions have really the same status from every possible point of view. We will also use the terms "input" and "output" to denote the same distinction between the two counterparts of the action.

If we call in the interface where the buffer B receives data, and out the interface where its data are made available, then the buffer can be specified as follows (assuming for simplicity that it has only one cell, i.e. that it can store only one datum at a time):

B =def= in(x).B'(x)
B'(x) =def= ^out(x).B
The "." here is called "action prefixing" and denotes sequentialization; i.e. B'(x) becomes active only after the action in(x) has been performed. The sending and the receiving processes will then be specified as follows (assuming that P send the datum d):
P =def= ^in(d).P'
Q =def= out(x).Q'(x)
As explained above, the complementary actions ^in(d) and in(x) must take place at the same time (and cause the instantiation of x with d). The same holds for ^out(x) (by then instantiated to ^out(d)) and out(x). In other words, we want the system P | B | Q to evolve as follows:
P | B | Q --> P' | B'(d) | Q --> P' | B | Q'(d)

Structural Operational Semantics

In sequential (functional) languages, beta reduction captures the essential mechanism in the evolution of computation. It is a structural semantics, in the sense that the evolution of a complex term is defined in terms of the evolution of its components (i.e. if M -> M' then MN -> M'N, etc.). The other semantics we have seen (lazy and eager evaluation) are also defined structurally.

In concurrency, in order to achieve a structural definition, we must add some information to the transition relation (specifying the behaviour of processes). In particular, to model interaction, we have to specify the action that is being performed during a transition. Transitions will then be formalized as a relation between two processes (or configurations) and one action.

A process with an input prefix can make a transition by performing the corresponding input action:

   a.P -a-> P 
Analogously, a process with an output prefix can make a transition by performing the corresponding output action:
   ^a.P -^a-> P 
Finally, the interaction between two parallel processes is captured by the following rule:
    P -^a-> P'  Q -a-> Q'
   ----------------------
     P | Q -tau-> P'| Q'   
Here the label tau in the conclusion represents a "silent action", and is the only action which does not have a complement. This expresses the fact that if P and Q are interacting with each other, they cannot (at the same time) interact with anybody else (two-way interaction).

Two parallel processes should not be obliged to interact at every step. For this reason, we also need another rule for parallel composition, which models the situation in which one process makes a step and the other does not (it stays idle). The rule is the following:

        P -a-> P'  
   ------------------
    P | Q -a-> P'| Q  
Of course there will also be the symmetric rule (where the roles of P and Q are exchanged), and the symbol "a" here can represent an output action or tau as well.

In some formalisms for concurrency there are also other rules, to represent the fact that two processes can be active at the same time independently, i.e. without interacting. These are called "true concurrency" models. In CCS, however, the two rules above (and the symmetric of the second) are all we have for the parallel construct. This kind of model of concurrent computation is called "interleaving": the actions of the processes are interleaved so that at each moment only one (at most) is observed.
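For instance (a small example of ours), the rules above give the process a.0 | ^a.0 exactly three transitions:

   a.0 | ^a.0  -a->   0 | ^a.0
   a.0 | ^a.0  -^a->  a.0 | 0
   a.0 | ^a.0  -tau-> 0 | 0

The first two are derived with the idling rule (and its symmetric version), the third with the interaction rule.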

In the above example, we have the possibility of communicating different values through the buffer. This feature is called "parameter passing". In order to model it, one possibility is to enrich the rule for interaction in the following way:

    P -^a(d)-> P'  Q -a(x)-> Q'(x)
   --------------------------------
         P | Q -tau-> P'| Q'(d)   
We leave as an exercise to apply these rules to prove the two transitions of the above system (P | B | Q).
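As a hint (our sketch), the first transition is obtained by combining the enriched interaction rule with the idling rule for parallel composition:

    P -^in(d)-> P'    B -in(x)-> B'(x)
   ------------------------------------
        P | B -tau-> P' | B'(d)
   ------------------------------------
      P | B | Q -tau-> P' | B'(d) | Q

The second transition is derived analogously, with B' and Q synchronizing on out.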

Note: CCS does not deal explicitly with parameter passing. We will see later how parameter passing is usually modeled in CCS.

Other useful operators

We have seen the input and output actions, the silent (tau) action, the prefix, and the parallel operators. There are other syntactic constructs that are useful in concurrency, and are incorporated in CCS. These are:

Nondeterministic choice

This construct is written as + and represents alternative choice: P + Q is a process that can behave either as P or as Q. This construct represents the nondeterminism that is unavoidable in the semantics of concurrent processes. The reason why it has been introduced also as a linguistic notion (i.e. in the syntax) is to have a richer calculus, which helps in specifying, and reasoning about, concurrent systems (as an analogy, consider the introduction of complex numbers to help solve equations over the real numbers). It is a controversial construct, however, and many other proposals for concurrency do not accept + as a primitive concept.
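For instance (our example), the process a.0 + b.0 has exactly two transitions:

   a.0 + b.0 -a-> 0
   a.0 + b.0 -b-> 0

and performing either one discards the other alternative.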

Restriction

P\L, where L is a set of visible actions, denotes the process P deprived of its capability to perform any of the actions contained in L (together with their complements). It is used to represent locality of names (i.e. to express the fact that the names in L are local to P), and also to enforce process synchronization, as we will see later.
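For instance (our example), (a.P)\{a} has no transitions at all, while in (a.P | ^a.Q)\{a} neither component can perform its action alone, so the only possible transition is the synchronization:

   (a.P | ^a.Q)\{a} -tau-> (P | Q)\{a}

This is how restriction is used to enforce synchronization.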

Relabeling

Given a function f on actions, P[f] represents the process that behaves like P, except that all the actions are transformed by f. Namely, for every action a that P performs, P[f] performs f(a). The relabeling functions are usually assumed to preserve complementarity, i.e. f(^a) = ^f(a), and to leave tau fixed. It should be remarked that relabeling is another controversial construct, which is not universally accepted as a primitive notion.
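For instance (our example), writing [out|->a] for the function that maps out to a and leaves the other names unchanged, we have:

   (^out.B)[out|->a] -^a-> B[out|->a]

This is exactly the mechanism used later to connect two one-cell buffers.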

Recursion

Recursion will be treated by using a fixpoint operator. Thus we will write fix(X=P), where P is a process expression possibly containing the variable X, to mean the solution of the equation X = P. Often we will write the equation explicitly, and use the variable X instead of the fixpoint expression. Alternative notations are: fix X.P, mu X.P and fix(\X.P).
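For instance (our example), fix(X = a.X) is the process that performs a forever:

   fix(X = a.X) -a-> fix(X = a.X)

since the unfolding of the body, a.fix(X = a.X), has this transition.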

Inaction

Finally we need a constant (0-ary operator) as the basis for the construction of our processes. We will use, in our presentation, the constant 0, which represents the process that does not do anything (inaction).

The language CCS

We give now the formal syntactic and semantic definition of CCS. We will assume a set of actions Act, formed by a set A, the set of complementary actions ^A, and the silent action tau. In the following, the elements of Act are denoted by a,b,c,... The elements of Act\{tau} are called visible actions.

Syntax

The CCS processes are generated by the following grammar
 
   P ::= 0          inaction
       | a.P        prefix (a is an action in Act)
       | P | P      parallel
       | P + P      choice
       | P\L        restriction
       | P[f]       relabeling
       | fix(X=P)   recursion
       | X          variable
Note: often, instead of writing fix(X=P), we will simply write X in the process expression, and write X =def= P (or X = P) separately.

Operational semantics

The rules of the operational semantics are as follows
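(In the rules below, a ranges over Act; the symmetric versions of the rules for + and | are omitted.)

   Prefix:
                 -----------
                  a.P -a-> P

   Choice:
                    P -a-> P'
                 ---------------
                  P + Q -a-> P'

   Parallelism (idling):
                    P -a-> P'
                 ------------------
                  P | Q -a-> P'| Q

   Parallelism (interaction, a visible):
                  P -a-> P'   Q -^a-> Q'
                 ------------------------
                   P | Q -tau-> P'| Q'

   Restriction (a and ^a not in L):
                    P -a-> P'
                 ---------------
                  P\L -a-> P'\L

   Relabeling:
                    P -a-> P'
                 --------------------
                  P[f] -f(a)-> P'[f]

   Recursion:
                  P[fix(X=P)/X] -a-> P'
                 -----------------------
                   fix(X=P) -a-> P'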

Modeling value-passing

The input and output actions of CCS are not parametrized on values. However, the passing of values can easily be encoded with these simple actions. If we want to model the passing of values along a port a, i.e. the actions ^a(v) and a(x), it is sufficient to assume actions ^a_v and a_v for each value v. The output ^a(v).P is then represented by ^a_v.P, and the input process a(x).P can be defined as

   Sum_v a_v.P_v

where the sum ranges over all the possible values v, and P_v represents the process obtained by substituting x with v in P, i.e. P[v/x].
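For instance (our example), if the only values are the booleans, then a(x).P stands for the binary sum

   a_true.P[true/x] + a_false.P[false/x]

Note that the encoding yields a finite process expression only when the set of values is finite.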

Example

Let us consider again the example of the two processes communicating via a buffer. For simplicity, we assume that there is only one value, hence we do not need to parametrize the actions with values.
B =def= in.B'
B' =def= ^out.B
P =def= ^in.P'
Q =def= out.Q'
We have the following graph:
                 P|B|Q -out-> P|B|Q' ...
                 / | \ 
               in  |  ^in
               /   |   \
         P|B'|Q   tau  P'|B|Q 
            /  \   |   /  \
         out   ^in |  in  out
          /      \ | /      \
      P|B'|Q'   P'|B'|Q    P'|B|Q' 
      ...         ...        ...
However, the situation changes if we put a restriction on all the visible actions. The resulting processes are forced to synchronize:
(P | B | Q)\{in,out} -tau-> (P' | B' | Q)\{in,out} -tau-> (P' | B | Q')\{in,out}

Specification and verification

One of the uses of CCS is to specify concurrent systems. For example, a buffer with two positions can be specified in the following way. Let Bi denote a buffer with i empty positions (i = 0, 1 or 2). We have:
   B2 =def= in.B1
   B1 =def= in.B0 + ^out.B2
   B0 =def= ^out.B1

Suppose now that we decide to implement the two-position buffer by using the one-position buffers that we defined earlier. Such an implementation can be done in the following way. The idea is to connect the "out" port of the first buffer with the "in" port of the second buffer. This can be done by renaming "out" to "a" in the first buffer and "in" to "a" in the second buffer, where "a" is a new name. Furthermore, we have to restrict the use of "a", so that an external process cannot access it: it must be for the "internal use" of the two buffers only. In conclusion, we can define the implementation of the two-position buffer, call it Imp, as follows:

   Imp =def= (B[out|->a] | B[in|->a])\{a}
We will see later that B2 and Imp can be considered equivalent in some precise sense. Proving that they are equivalent is called "verification" (verification of the correctness of the implementation with respect to the specification).

Bisimulation Semantics

If two processes have isomorphic transition graphs, they can certainly be considered equivalent. However there are more cases in which it seems reasonable to identify two processes.

Example 1

Consider the processes
  R =def= (P | Q)\{b}, where
  P =def= a.^b.P
  Q =def= c.b.Q
and
  S =def= a.c.tau.S + c.a.tau.S
R and S are equivalent, since they have isomorphic transition graphs.
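Indeed, writing R1 = (^b.P | Q)\{b}, R2 = (P | b.Q)\{b} and R3 = (^b.P | b.Q)\{b} (abbreviations of ours), the graph of R is

   R -a-> R1 -c-> R3 -tau-> R
   R -c-> R2 -a-> R3

which matches, state by state, the graph of S:

   S -a-> c.tau.S -c-> tau.S -tau-> S
   S -c-> a.tau.S -a-> tau.S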

Example 2

Consider the two processes
   P =def= a.b.P 
and
   Q =def= a.(b.Q + b.Q)
These two processes clearly have the same behaviour and should be identified, despite the fact that their transition graphs are not isomorphic. For this purpose, the notion of bisimulation equivalence has been introduced.

Strong bisimulation

In this equivalence, tau actions count just like any other action.

We say that a relation R is a (strong) bisimulation iff for every two processes P and Q such that (P,Q) is in R, we have:
- if P -a-> P', then there exists Q' such that Q -a-> Q' and (P',Q') is in R;
- if Q -a-> Q', then there exists P' such that P -a-> P' and (P',Q') is in R.

We will say that two processes P and Q are (strongly) bisimilar iff there exists a bisimulation relation R such that (P,Q) is in R.

Examples: a.b.0 + c.b.0 and c.b.0 + a.b.0 are strongly bisimilar. Also fix(X = a.b.X) and fix(Y = a.(b.Y + b.Y)) are strongly bisimilar. On the contrary, the processes a.(b.0 + c.0) and a.b.0 + a.c.0 are not strongly bisimilar: after the action a, the first can still choose between b and c, while the second is committed to one of them.
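For the first pair, a bisimulation containing it can be exhibited explicitly (our example):

   R = { (a.b.0 + c.b.0, c.b.0 + a.b.0), (b.0, b.0), (0, 0) }

Every transition of one component of a pair is matched by an equally labeled transition of the other, ending in a pair which is again in R.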

Weak bisimulation

This definition abstracts with respect to tau actions. It identifies processes like those given in the specification and the implementation of the two-position buffer. In the following, given a visible action a, P =a=> Q stands for: there exist P' and P'' such that P -tau->* P' -a-> P'' -tau->* Q, where -tau->* is the reflexive and transitive closure of -tau->. For a tau action, P =tau=> Q represents a sequence of tau transitions, possibly empty, from P to Q.

A relation R is a weak bisimulation iff for every two processes P and Q, if (P,Q) is in R, then:
- if P -a-> P', then there exists Q' such that Q =a=> Q' and (P',Q') is in R;
- if Q -a-> Q', then there exists P' such that P =a=> P' and (P',Q') is in R.

We will say that two processes P and Q are weakly bisimilar iff there exists a weak bisimulation relation R such that (P,Q) is in R.
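For instance (our example), a.0 and tau.a.0 are weakly bisimilar but not strongly bisimilar: the initial tau transition of the latter is matched by the empty sequence of tau transitions of the former. A suitable weak bisimulation is

   R = { (a.0, tau.a.0), (a.0, a.0), (0, 0) }

In the same way, the specification B2 and the implementation Imp of the two-position buffer above can be shown weakly (but not strongly) bisimilar.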

Process algebra

Process algebra is the axiomatic approach to process theory. A process algebra is any system of axioms that establishes a theory of equality over processes.

One particularly interesting process algebra is the one that captures the notion of (strong or weak) bisimulation. More precisely, this p.a. consists of a systems of equality axioms Ax such that Ax |= P = Q iff P is bisimilar (weakly bisimilar) to Q. For the axioms we refer to the book of Milner "Communication and Concurrency" Prentice Hall 1989 Chapter 3 and 7.