\section{The \tcc{} model}\label{tcc}
This section is devoted to reviewing the formalization and technical development of the intuitions discussed in the previous section. Some readers may prefer to look at the more concrete examples in subsequent sections, and consult this section as needed for technical definitions. For this purpose, the syntax, denotational semantics and operational semantics of \tcc{} are summarized in Tables~\ref{syntax-table}, \ref{densem-table} and \ref{opsem-table}, respectively.

\subsection{Constraint systems}
Formally, constraint systems $\cal C$ are essentially arbitrary systems of partial information \cite{cccc,my-book}. They are taken to consist of a set $D$ of tokens with minimal first-order structure (variables, substitutions, existential quantification and conjunction), together with an entailment relation $\vdash_{\cal C}$ that specifies when a token $x$ must follow given that the tokens $y_1, \ldots, y_n$ are known to hold. Intuitionistic disjunction can be built in purely formally on top of the underlying constraint system. In particular, the constraint system needs to provide us only with an algorithm that will answer queries of the form ``is $c$ entailed by the store $S$''. To answer the query ``is $c \vee d$ entailed by the store $S$'', send two separate queries, for $c$ and for $d$, and return yes if either returns yes. Conversely, when a constraint $c \vee d$ is sent to the store, the store appears as if split into two, one receiving $c$ and the other $d$. All subsequent queries are answered successfully only if {\em both} splits answer successfully. (A small sketch of this scheme appears at the end of this subsection.)
Examples of constraint systems: (1)~\herbrand{}, underlying logic programming (variables range over finite trees, tokens specify equality of terms), (2)~FD \cite{cc-FD} for finite domains (variables range over finite domains, with tokens for equality and for specifying that a variable lies in a given range), etc. A third example, \gentzen{}, will be the concrete constraint system used in the programming examples in this paper. For this constraint system, we assume that we are given a vocabulary of terms that includes variables, and a vocabulary of predicates with arities. The tokens of the constraint system are then atomic formulas, built out of the predicates and terms. The entailment relation is built out of structural rules only, i.e.{} it is induced by the identity relation on atomic formulas.
In this section, we will take as fixed a constraint system $\cal C$, with underlying set of tokens $D$. Let the entailment-closed subsets of $D$ be denoted by $|D|$. $(|D|, \subseteq)$ is a complete algebraic lattice; we will use the notation $\sqcup$ and $\sqcap$ for the joins and meets of this lattice. We shall call (finite) sets of tokens {\em primitive constraints} and designate them by the letters $c,d,e$. A (finite) {\em constraint} is an element of $|D|$ generated from a (finite) set of tokens; constraints will be designated by the letters $a,b$. For any primitive constraint $c$, we let $\LL{c}$ stand for the constraint generated by $c$.
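To make the treatment of disjunction concrete, the following is a minimal sketch (in Python; purely illustrative and not part of the formal development) of a disjunctive store over a \gentzen{}-like constraint system, in which tokens are atomic and a token is entailed by a set of tokens exactly when it is a member of the set. The class {\tt Store} and its methods are names invented for this example.
\begin{verbatim}
# Illustrative sketch only: a disjunctive store over an atomic-token
# constraint system, where entailment of a single token is membership.
class Store:
    def __init__(self, tokens=()):
        # a disjunctive store is a list of "splits" (plain token sets)
        self.splits = [set(tokens)]

    def tell(self, *disjuncts):
        # telling c1 v ... v ck splits every branch, one copy per disjunct
        self.splits = [branch | {d} for branch in self.splits
                                    for d in disjuncts]

    def ask(self, *disjuncts):
        # c1 v ... v ck is entailed iff every split entails some disjunct
        return all(any(d in branch for d in disjuncts)
                   for branch in self.splits)

s = Store()
s.tell("p", "q")            # tell p v q
assert s.ask("p", "q")      # p v q is entailed ...
assert not s.ask("p")       # ... but neither p alone ...
assert not s.ask("q")       # ... nor q alone is entailed
s.tell("r")                 # tell the atomic token r
assert s.ask("r")
\end{verbatim}
Telling a disjunction multiplies the number of splits; a real implementation would share structure between splits, but the observable question-answering behaviour is exactly the one described above.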
\subsection{Denotational intuitions}
Earlier work~\cite{popl91} identified a \cc{} program with the observations that can be made about the program. The observations of a program are the set of quiescent points -- the set of elements of $|D|$ on which the program quiesces without adding any new information. It was further shown that the set $S$ of quiescent points of a program is {\em determinate}, that is, it satisfies the property that for any point $x$, the greatest lower bound (glb) of any non-empty subset of $S$ above $x$ is also in $S$.\footnote{Such a set is said to be determinate because it can be used to define a partial function: associate with every point $x$ in the underlying partially ordered set $T$ the least element of $S$ above $x$, if any. These functions are partial {\em closure operators}, i.e., they are monotone, idempotent and extensive.} Such a model is rich enough to define an ``abort'' process that, in fact, quiesces on no input (and hence has the empty set of quiescent points).
What should the timed model be? Recall that execution within a time step is captured by execution in \cc{}, and that discrete time is modeled by the natural numbers. Combining these two ideas, we conclude that the timed model should be a sequence of models of \cc{} programs. Formally, this is captured by the intuition that ``Processes are relations extended over time''~\cite{IntCat}. The formal development follows. First, we extend the set of observations over time:
\begin{definition} \Obs{}, the set of observations, is the set of finite sequences of constraints. \end{definition}
Intuitively, we shall observe the {\em quiescent sequences} of constraints for the system. We let $s,u,v$ range over sequences of constraints. We use ``$\epsilon$'' to denote the empty sequence. The concatenation of sequences is denoted by ``$\cdot$''; for this purpose a constraint $a$ is regarded as the one-element sequence $\tuple{a}$. Given $S \subseteq \Obs{}$ and $s \in \Obs$, we will write $S\ {\bf after}\ s$ for the set $\{a \in |D| \alt s \cdot a \in S\}$ of quiescent points of $S$ in the instant after it has exhibited the observation $s$.
\begin{definition} $P \subseteq \Obs{}$ is a {\em process} iff it satisfies the following conditions: \begin{enumerate} \item~({\em Non-emptiness}) $\epsilon \in P$, \item~({\em Prefix-closure}) $s \in P$ whenever $s\cdot t \in P$, and \item~({\em Determinacy}) $P\; {\bf after}\; s$ is determinate whenever $s \in P$. \end{enumerate} \end{definition}
We will let \PROC{} designate the set of all processes, and let $P,Q$ range over \PROC{}.\footnote{An alternate way of thinking of processes is that they correspond exactly to the resting points of partial closure operators on \OBS{} (in the lexicographic ordering on sequences) that are {\em stratified} ($f(s)$, if defined, has the same cardinality as $s$), and {\em causal} ($f(s)$ is a prefix of $f(s\cdot a)$, assuming both are defined). Space does not permit discussing this alternate viewpoint in detail.}

\subsection{Process algebra}
We now identify basic processes and process combinators in the model. The combinators fall into two categories:
\begin{description}
\item[CCP constructs: ] Tell, Parallel composition and Timed Positive Ask are inherited from CCP. These do not, by themselves, cause ``extension over time''. Parallel composition is the explicit representation of concurrency in the language. Timed Positive Ask and Tell combine to provide synchronization capabilities.
\item[Timing constructs: ] Timed Negative Ask, Unit Delay and Abortion, which cause extension over time. Unit Delay causes a process to be started in the next time instant. Timed Negative Ask is a conditional version of Unit Delay, based on detection of negative information.
It causes a process to be started in the next time instant if, on quiescence of the current time instant, the store was not strong enough to entail some information.
\end{description}
The formal description of these combinators follows. We present the denotational and operational semantics simultaneously.

\begin{table} \begin{tabular}{|l|} \hline \begin{minipage}[t]{\columnwidth}
{\bf Syntax.}
\[ \begin{array}{lllll}
\mbox{(Agents)}& A & {::=} & c & \mbox{(Tell)}\\
&& \alt & \nowt c \then {A} & \mbox{(Timed Positive Ask)}\\
&& \alt & \nowe c \else {A} & \mbox{(Timed Negative Ask)}\\
&& \alt & \next A & \mbox{(Unit Delay)}\\
&& \alt & \abort & \mbox{(Abort)}\\
&& \alt & \Skip & \mbox{(Skip)}\\
&& \alt & A \PAR A & \mbox{(Parallel composition)}\\
&& \alt & X\Hat A & \mbox{(Hiding)}\\
&& \alt & g & \mbox{(Procedure Call)} \\ \\
\mbox{(Procedure Calls)}& g & {::=} & p(t_1,\ldots,t_n) & \\
\mbox{(Declarations)} & D & {::=} & g::A & \mbox{(Definition)}\\
&& \alt & D.D & \mbox{(Conjunction)}\\
\mbox{(Programs)} & P & {::=} & \{D.A\}\\
\end{array} \]
\end{minipage}\\ \hline\end{tabular}
\caption{Syntax for \tcc{} languages}\label{syntax-table} \end{table}

\begin{table} \begin{tabular}{|l|} \hline \begin{minipage}[t]{\columnwidth}
{\bf Semantic Domain.} We take as fixed a constraint system $\cal C$, with underlying set of tokens $D$, and let the finitary inference relation $\vdash$ record which token follows from which collection of tokens. Let the $\vdash$-closed subsets of $D$ be denoted by $|D|$. $(|D|, \subseteq)$ is a complete algebraic lattice; we will use the notation $\sqcup$ and $\sqcap$ for the joins and meets of this lattice. A (finite) {\em constraint} is an element of $|D|$ generated from a (finite) set of tokens. We usually denote (finite) sets of tokens by the letters $c,d,e$, and constraints by the letters $a,b$. For any set of tokens $c$, we let $\LL{c}$ stand for the constraint generated by $c$.
\begin{definition} \Obs{}, the set of observations, is the set of finite sequences of constraints. \end{definition}
Concatenation of sequences is denoted by ``$\cdot$''; for this purpose a constraint $a$ is regarded as the one-element sequence $\tuple{a}$. For any $S \subseteq \OBS$ and sequence $s \in \OBS$, let $S\; \mbox{\bf after}\; s$ be the set $\{a \in |D| \alt s\cdot a \in S\}$. A subset $S$ of a partially ordered set $T$ is said to be {\em determinate} if for every $x \in T$, the subset of $S$ above $x$ is empty or contains a minimum.
\begin{definition} A process $P \in \PROC{}$ is a non-empty, prefix-closed subset of $\Obs$ such that $P\; {\bf after}\; s$ is determinate for all $s \in P$. \end{definition}
{\bf Semantic Equations.} For a sequence of constraints $s$, by $\exists X.s$ we shall mean the element-wise existential quantification over $X$.
\begin{eqnarray*}
\LL{c} &=& \{\epsilon\} \cup \{d\cdot s \in \Obs \alt d \supseteq \LL{c}\} \\
\LL{\nowt c \then A} &=& \{\epsilon\} \cup \{d \cdot s \in \Obs\alt d \supseteq \LL{c} \entails d \cdot s \in \LL{A}\} \\
\LL{\nowe c \else A} &=& \{\epsilon\} \cup \{d \cdot s \in \Obs\alt d \not\supseteq \LL{c} \entails s \in \LL{A}\} \\
\LL{\next A} &=& \{\epsilon\} \cup \{d \cdot s \in \Obs\alt s \in \LL{A}\} \\
\LL{\abort} &=& \{\epsilon\}\\
\LL{\Skip} &=& \OBS\\
\LL{A \PAR B} &=& \{s \in \Obs \alt s \in \LL{A} \wedge s \in \LL{B} \} \\
\LL{X\Hat A} &=& \{ s \in \Obs \alt \exists X.s = \exists X.t, \mbox{ for some } t \in \LL{A}\}
\end{eqnarray*}
The semantics of recursion is handled in the usual way by taking least fixed points of the associated functional.
\end{minipage}\\ \hline\end{tabular}
\caption{Denotational Semantics for \tcc{} languages}\label{densem-table} \end{table}

\begin{table} \begin{tabular}{|l|} \hline \begin{minipage}[t]{\columnwidth}
\paragraph{Configuration.} A configuration is merely a multiset $\Gamma$ of agents. Multiset union is viewed as parallel composition. We implicitly represent the store in a configuration: for $\Gamma$ a configuration, $\sigma(\Gamma)$ denotes the sub-multiset of tokens in $\Gamma$.
{\bf Transition rules.} Two binary transition relations $\derives, \timederives$ are defined over configurations. $\derives$ represents transitions {\em within} a time instant. $\timederives$ represents a transition from one time instant to the next.
{\bf Axioms for $\derives$.} The binary relation $\derives$ on configurations is the least relation satisfying the rules:
\begin{eqnarray*}
(\Gamma, \Skip) &\derives& \Gamma\\
(\Gamma, \abort) &\derives& \abort \\
(\Gamma, A \PAR B) &\derives& (\Gamma, A, B)\\
(\Gamma, X \Hat A) &\derives& (\Gamma, A[Y/X]) \ \ \ (Y \mbox{not free in } \Gamma)\\
(\Gamma, p(t_1,\ldots, t_n))& \derives& (\Gamma,A[X_1 \mapsto t_1, \ldots, X_n \mapsto t_n])
\end{eqnarray*}
and
\begin{eqnarray*}
{\from{\sigma(\Gamma) \vdash c}\infer{(\Gamma, \nowt c \then{A})\derives (\Gamma, A)} }\\
{\from{\sigma(\Gamma) \vdash c}\infer{(\Gamma, \nowe c \else{B})\derives \Gamma}}\\
\end{eqnarray*}
In the case of procedure calls we assume that the underlying program contains a clause $p(X_1,\ldots, X_n)::A$.
{\bf Axioms for $\timederives$.} The binary relation $\timederives$ is the least relation satisfying the single rule:
\begin{eqnarray*}
\from{\Delta, \{\nowe c_i \else {A_i}\alt i < n \}\not\derives}
\infer{\Delta, \{\nowe c_i \else {A_i}\alt i < n\}, \{\next{B_j} \alt j < m \} \timederives \{A_i \alt i < n\},\{B_j \alt j < m\}}
\end{eqnarray*}
Above, $\Delta$ ranges over multisets of agents that do not contain Timed Negative Asks or Unit Delays. Note that the empty configuration can always make a $\timederives$ transition, and that a configuration containing \abort{} can never quiesce (that is, if $\Gamma$ contains $\abort$, then there is always a $\Delta$, namely, \abort{}, such that $\Gamma \derives \Delta$). Therefore there are configurations $\Gamma$ such that for no $\Delta_1, \Delta_2$ do we have:
$$\Gamma \starderives \Delta_1 \timederives \Delta_2$$
In that sense $\timederives$ is ``partial''.
\end{minipage}\\ \hline\end{tabular}
\caption{Operational Semantics for \tcc{} languages}\label{opsem-table} \end{table}

The operational semantics is defined via two binary transition relations $\derives, \timederives$ over configurations. $\derives$ represents transitions {\em within} a time instant.
This relation is intimately related to the transition relation in CCP since it represents computation within a time step. $\timederives$ represents a transition from one time instant to the next. A configuration will be simply a (possibly empty) multiset of {\em agents}, the syntactic representatives of processes. We implicitly represent the store in a configuration: for $\Gamma$ a multiset of agents, we will let $\sigma(\Gamma)$ be the sub-multiset of tokens in $\Gamma$.

\subsection{Timing constructs}

\paragraph{Skip.} \Skip{} is the process that does nothing at all at every time instant. Hence every sequence of constraints is quiescent for it. Thus:
\begin{eqnarray*} \LL{\Skip} & \defeq & \OBS \end{eqnarray*}
Operationally, \Skip{} has no effect:
\begin{eqnarray*} \Gamma, \Skip \derives \Gamma \end{eqnarray*}

\paragraph{Abortion.} \abort{} is the process that instantly causes all interactions with the environment to cease. Hence the only observation that can be made of it is $\epsilon$.
\begin{eqnarray*} \LL{\abort} & \defeq & \{\epsilon\} \end{eqnarray*}
Operationally, \abort{} annihilates the environment:
\begin{eqnarray*} \Gamma, \abort \derives \abort \end{eqnarray*}
Note that this implies that a configuration $\Gamma$ containing $\abort$ cannot ``$\derives$-quiesce'', that is, reach, through $\derives$ transitions, a state which has no $\derives$ transition emanating from it.

\paragraph{Unit Delay.} Unit Delay is a variant of the unit delay primitives in synchronous languages~\cite{Milner83,BerryFSTTCS}; intuitively, $A=\next B$ is the process that behaves like $B$ in the next time instant. It performs no computation in the present instant. When is $o = a \cdot s$, an arbitrary element of $\Obs$, an observation of $A$? There are no restrictions on $a$. Since $A$ behaves like $B$ from the next time instant, $s$ must be an observation of $B$.
\begin{eqnarray*} \LL{\next B} &\defeq& \{\epsilon\} \cup \{a \cdot s\in \Obs \alt s \in \LL{B}\} \end{eqnarray*}

\paragraph{Timed Negative Asks.} Timed Negative Ask is the only way to detect ``negative information'' in \tcc{}. Intuitively, $A = \nowe c \else B$ is the process that behaves like $B$ in the next time instant if, on quiescence of the current time instant, the store was not strong enough to entail $c$.\footnote{In reality, we allow the more general combinator $A=\nowe \{c_1;\ldots; c_n\} \else B$ --- it behaves like $B$ provided that the store on quiescence is not above each of the constraints $c_i$. The details are straightforward.} When is $o = a \cdot s$, an arbitrary element of $\Obs$, an observation of $A$? There are two possibilities. Either $a \supseteq \LL{c}$ or not. If $a \supseteq \LL{c}$, $A$ behaves like \Skip{}, so $s$ can be any observation whatsoever. If not, $A$ behaves like $B$ from the next time instant; so $s$ must be an observation of $B$. Thus:
\begin{eqnarray*} \LL{\nowe c\else B} & \defeq & \{\epsilon\} \cup \{a \cdot s\in \Obs \alt a \not\supseteq \LL{c} \entails s \in \LL{B}\} \end{eqnarray*}
Operationally, if the current store entails $c$, then we can eliminate $A$:
\begin{eqnarray*} \from{\sigma(\Gamma) \vdash c} \infer{(\Gamma, \nowe c \else{B}) \derives \Gamma} \end{eqnarray*}
In the following, for an agent $\nowe c \else A$ we say that $c$ is the ``antecedent'' and $A$ the ``body''.

\paragraph{Operational semantics: the $\timederives$ relation.} Consider now the situation in which computation in the current time instant has quiesced.
This means that all reductions that could have been caused by the entailment of the antecedents of Positive Asks have been performed, all Negative Asks whose antecedents are entailed have been eliminated, and computation has not aborted. Computation can now progress to the next time instant. The active agents at the next time instant are the bodies of the remaining Negative Ask agents and the bodies of agents within a Unit Delay. All other agents, including the current store, will be discarded. Formally, if $\Delta$ ranges over multisets of agents (including constraints) other than Negative Asks or Unit Delays, we can write the transition rule as:
\begin{eqnarray*}\label{nextelserule}
\from{\Delta, \{\nowe c_i \else {A_i}\alt i < n \} \not\derives}
\infer{\Delta, \{\nowe c_i \else {A_i}\alt i < n\}, \{ \next{B_j} \alt j < m \} \timederives \{A_i \alt i < n\},\{B_j \alt j < m\}}
\end{eqnarray*}
Note that a configuration $\Gamma$ can make a $\timederives$-transition even if $\Gamma$ is the empty multiset of agents; however, it cannot make any $\timederives$-transition if it contains \abort.

\paragraph{Guarded Recursion.} A {\em program} is a pair of declarations $D$ and an agent $A$; a declaration is of the form $g::A$, where $g$ is of the form $p(X_1,\ldots, X_n)$, for $p$ a procedure name and $X_i$ variables. (In the following, we let $F,G$ range over programs.) We require that recursion be guarded, {\em i.e.}{} every occurrence of the recursion variable lies within the scope of an {\bf else} or a {\bf next} (a small sketch of this check is given below). The important consequence of this restriction is that the recursion equations have {\em unique} solutions, and that computation in each time-step is lexically bounded (i.e., bounded by the size of the program).
% This is a familiar restriction from process algebra.
To ensure bounded ``extension in time'' we also need to ensure that recursions are {\em bounded}, that is, at run-time there are only boundedly many different procedure calls. There are several ways of achieving this. For instance, we could require that procedures take no parameters. This is a familiar restriction in the realm of synchronous programming; for example, the sole recursion construct in \Esterel{} is ``loop''. We shall make the slightly more liberal assumption that any recursive procedure call takes exactly the same parameters as the procedure declaration. That is, for any set of mutually recursive declarations:
\begin{ccprogram}
\agent p(X_1, \ldots, X_n)::A..
\agent q(Y_1, \ldots, Y_m)::B..
\ldots
\end{ccprogram}
\noindent any call to a procedure {\tt p}, {\tt q}, \ldots in {\tt A}, {\tt B}, \ldots is exactly of the form ${\tt p(X_1, \ldots, X_n)}$, $\tt q(Y_1, \ldots, Y_m)$, \ldots. (This restriction is equivalent to the parameterless procedures restriction provided that procedure definitions are allowed to be nested within agents, so that bodies of procedure definitions can refer to variables in the lexical scope.) From a logic programming perspective, this restriction is not as severe as it may seem because constraints are not carried over from one time instant to the next: hence the same variable name may be ``reused'' at subsequent time-instants. Note that procedures with parameters that take values in finite domains can be compiled away using the parameterless procedure and hiding constructs available in \tcc{}.
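The guardedness restriction can be checked by a purely syntactic traversal of the agent. The following is a minimal sketch (in Python, illustrative only; the tuple encoding of agents and the function {\tt guarded} are invented for the example, and hiding is omitted): a procedure call is acceptable once it occurs under a {\bf next} or under the body of a Timed Negative Ask.
\begin{verbatim}
# Agents as nested tuples: ("tell", c), ("now_then", c, A),
# ("now_else", c, A), ("next", A), ("par", A, B), ("call", p),
# ("skip",), ("abort",).  Illustrative sketch of the guardedness check.
def guarded(agent, under_guard=False):
    kind = agent[0]
    if kind == "call":
        return under_guard
    if kind in ("tell", "skip", "abort"):
        return True
    if kind == "now_then":                     # a positive ask does not guard:
        return guarded(agent[2], under_guard)  # its body runs in this instant
    if kind in ("now_else", "next"):           # these bodies run next instant
        return guarded(agent[-1], True)
    if kind == "par":
        return (guarded(agent[1], under_guard) and
                guarded(agent[2], under_guard))
    raise ValueError(kind)

# clock :: tell(tick) || next clock          -- guarded
assert guarded(("par", ("tell", "tick"), ("next", ("call", "clock"))))
# bad :: (now tick then bad) || tell(tick)   -- not guarded
assert not guarded(("par", ("now_then", "tick", ("call", "bad")),
                    ("tell", "tick")))
\end{verbatim}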
The operational and denotational semantics of recursion is standard. In the following, we shall sometimes use the syntax $\mu p. A$ to denote an agent $p$ under the assumption that the program contains a (unique) procedure declaration of the form $p::A$.

\subsection{Combinators from CCP}
These combinators follow the traditional treatment~\cite{popl91}.

\paragraph{Tell.} This process adds $c$ to the current store. It is quiescent on any observation that contains at least as much information as $c$ at the first instant.
\begin{eqnarray*} \LL{c} &=& \{\epsilon\} \cup \{d \cdot s \in \Obs \alt d \supseteq \LL{c}\} \end{eqnarray*}
Since the store is represented in the configuration, there is no explicit operational transition.

\paragraph{Parallel composition.} What are the quiescent points of the parallel composition of two agents $A$ and $B$? Exactly those which are quiescent points of both $A$ and $B$:
\begin{eqnarray*} \LL{A \PAR B} &=& \LL{A} \cap \LL{B} \end{eqnarray*}
Operationally, the idea of parallel composition is already implicit in the ``,'' of multiset union~\cite{BoudolBerry90}:
\begin{eqnarray*} (\Gamma, A \PAR B) \derives (\Gamma, A, B) \end{eqnarray*}

\paragraph{Timed Positive Ask.} $B = \nowt c \then A$ is the process that checks whether the current store is strong enough to entail $c$ and, if so, behaves like $A$. When is $o = d \cdot s$, an arbitrary element of $\Obs$, in $\LL{B}$? There are two possibilities. Either $d \supseteq \LL{c}$ or not. In the first case, $B$ behaves like $A$; so $o$ must be an observation of $A$. Otherwise $o$ can be any observation whatsoever --- since $B$ will remain quiescent. Thus:
\begin{eqnarray*} \LL{\nowt c\then A} &=&\{\epsilon\} \cup \{d \cdot s\in \Obs \alt d \supseteq \LL{c} \entails d \cdot s \in \LL{A}\} \end{eqnarray*}
Operationally, we have:
\begin{eqnarray*} \from{\sigma(\Gamma) \vdash c} \infer{(\Gamma, \nowt c \then{A}) \derives (\Gamma, A)} \end{eqnarray*}

\paragraph{Hiding.} $X \Hat A$ is a process in which the variable $X$ is hidden from the environment. At any time step, if $f$ is the transformation performed by $A$ within that step, then $X \Hat A$ transforms its input $c$ into the value $c \sqcup \exists X. f(\exists X.c)$. No constraints cross time boundaries. In terms of quiescent sequences, this yields:
\begin{eqnarray*} \LL{X \Hat A} &=& \{ s \in \Obs \alt \exists X.s = \exists X.t, \mbox{ for some } t \in \LL{A}\} \end{eqnarray*}
where $\exists X.(c_1 \cdot c_2 \cdots c_n) \defeq (\exists X. c_1)\cdot(\exists X. c_2)\cdots(\exists X. c_n)$. For the operational semantics, we find a variable $Y$ that does not occur free in the configuration, and substitute it for the bound variable:
\begin{eqnarray*} (\Gamma, X \Hat A) &\derives& (\Gamma, A[Y/X]) \ \ \ (Y \mbox{not free in } \Gamma) \end{eqnarray*}
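To see the semantic equations at work, the following is a minimal executable sketch (in Python, illustrative only) of the denotations above for the \gentzen{} constraint system, in which a constraint is a finite set of atomic tokens and entailment is set inclusion. It reuses the tuple encoding of the earlier sketch; hiding and recursion are omitted. The function {\tt quiescent} tests whether a finite sequence of constraints belongs to $\LL{A}$.
\begin{verbatim}
# Illustrative sketch of the semantic equations for token-set constraints.
def quiescent(agent, s):
    if not s:                         # epsilon is an observation of everything
        return True
    d, rest = s[0], s[1:]
    kind = agent[0]
    if kind == "tell":                # d.s in [[c]]  iff  d entails c
        return agent[1] <= d
    if kind == "now_then":            # if d entails c, d.s must be in [[A]]
        return quiescent(agent[2], s) if agent[1] <= d else True
    if kind == "now_else":            # if d does not entail c, s must be in [[A]]
        return True if agent[1] <= d else quiescent(agent[2], rest)
    if kind == "next":                # s must be in [[A]]
        return quiescent(agent[1], rest)
    if kind == "par":                 # intersection of the two denotations
        return quiescent(agent[1], s) and quiescent(agent[2], s)
    if kind == "skip":
        return True
    if kind == "abort":               # only epsilon, handled above
        return False
    raise ValueError(kind)

tick, stop = frozenset({"tick"}), frozenset({"stop"})
A = ("now_else", stop, ("tell", tick))               # if no stop now, tick next
assert quiescent(A, [frozenset()])                   # quiet first instant is fine
assert not quiescent(A, [frozenset(), frozenset()])  # tick must then appear
assert quiescent(A, [frozenset(), tick])
assert quiescent(A, [stop, frozenset()])             # stop present: no demand
\end{verbatim}
The assertions spell out the quiescent sequences of a small Timed Negative Ask, matching the equation for $\nowe c \else B$ given earlier.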
\paragraph{Review.} This concludes our discussion of the basic mathematical and operational semantics of the \tcc{} constructs. The connections between operational and denotational semantics are straightforward. Indeed, it is not hard to show that the denotational semantics captures precisely the quiescent sequences of the operational semantics. Full abstraction follows from standard considerations \cite{tcc}. We emphasize that the definitions given above work for {\em any} constraint system, and are hence very general. While we do not have space to discuss this further, \cite{tcc} also provides a sound and essentially complete proof system for establishing equivalence of programs. Because \tcc{} programs have a logical reading, they may also be used to specify (safety) properties of other \tcc{} programs. Hence this proof system can be used for establishing that programs meet their safety specifications.

\cite{tcc} also discusses a compositional compilation algorithm for \tcc{} programs that takes as input programs with bounded recursion (as discussed above) and outputs a constraint-based FSA. The FSA accepts as input a constraint from the environment, computes (through a bounded number of calls to the underlying constraint system) the constraint that must be output at the current time step, and checks it against a bounded number of other constraints to determine the transition to the next state. The mere fact of such compilability into FSAs makes it possible to use \tcc{} to program reactive controllers, and to apply powerful analysis techniques for finite-state systems.
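We do not reproduce the compilation algorithm here. Purely to illustrate the interface that such a machine presents at each instant, the following sketch (in Python, with the same tuple encoding and token-set constraints as the earlier sketches; hiding, procedure calls and \abort{} are omitted) executes one time instant directly from the two transition relations: it accepts a constraint from the environment, returns the constraint output at this instant, and returns the agents active at the next instant.
\begin{verbatim}
# Illustrative sketch only: one time instant of execution, read off the
# two transition relations; this is not the compilation algorithm itself.
def instant(agents, env):
    store = set(env)                 # tokens told so far in this instant
    pending, changed = list(agents), True
    while changed:                   # iterate the within-instant relation
        changed, rest = False, []
        for a in pending:
            kind = a[0]
            if kind == "tell":
                if not a[1] <= store:
                    store |= a[1]
                    changed = True
            elif kind == "par":
                rest += [a[1], a[2]]
                changed = True
            elif kind == "now_then" and a[1] <= store:
                rest.append(a[2])    # ask fires: body joins the configuration
                changed = True
            elif kind == "now_else" and a[1] <= store:
                changed = True       # antecedent entailed: agent is discarded
            elif kind == "skip":
                pass                 # consumed
            else:
                rest.append(a)       # unfired asks and next's wait
        pending = rest
    # time-step rule: bodies of surviving else's and next's start next instant
    nxt = [a[2] for a in pending if a[0] == "now_else"]
    nxt += [a[1] for a in pending if a[0] == "next"]
    return frozenset(store), nxt

tick, stop = frozenset({"tick"}), frozenset({"stop"})
P = [("tell", tick), ("now_else", stop, ("tell", tick))]
out, state = instant(P, frozenset())     # no input from the environment
assert out == tick and state == [("tell", tick)]
out, state = instant(P, stop)            # the environment tells "stop"
assert out == stop | tick and state == []
\end{verbatim}
This is, of course, an interpreter rather than the compiled automaton; the point of the compilation in \cite{tcc} is that, for bounded-recursion programs, the set of reachable states is finite and can be computed before execution.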