\section{General programming techniques}
\def\do#1\watching#2{{\bf do}\ #1\ {\bf watching}\ #2}
We now show how to define some natural patterns of temporal activity in \tcc{} languages. Several interesting combinators are definable in the underlying model, and are hence definable for any \tcc{} language.\footnote{When we say that a combinator is definable in this model we mean that when it takes processes as arguments it returns a process. Furthermore, unless otherwise discussed, all combinators will be monotone and continuous in their process arguments. This means that they can be used freely in recursive definitions.} We then present explicit definitions through simplification laws that show how to eliminate the combinators in favor of the basic combinators described in the previous section. (In the following, when we write $A=B$, for two agents $A$ and $B$, we mean that the denotations of these agents are identical processes.)

\subsection{Some simple combinators}

\paragraph{Unconditional persistence.}
Let $\always A$ be the agent that behaves like $A$ at every time instant:
\[ \LL{\always A} = \{s \in \Obs \alt s^i \in \LL{A}, 0 \leq i \leq |s| \} \]
where $s^0=s$ and $(d \cdot s)^{n+1}=s^n$. We have the law:
\[ \always A = A \PAR \next \always A \]
Thus the agent $\always c$ ensures that the constraint $c$ will be posted at every time instant.

\paragraph{Conditionals.}
Let $\nowte c \then A \else B$ be the agent that behaves like $A$ if $c$ holds in the current time instant; otherwise it behaves like $B$ from the {\em next} time instant onwards. Its observations are:
\[ \LL{\nowte c \then A \else B} =
  \begin{array}[t]{ll}
  \{\epsilon\} \cup \{d \cdot s \in \Obs \alt & d \supseteq \LL{c} \Rightarrow d \cdot s \in \LL{A}, \\
  & d \not\supseteq \LL{c} \Rightarrow s \in \LL{B}\}
  \end{array}
\]
From this we can deduce:
\begin{eqnarray*}
  \nowte c \then A \else {B} &=& (\nowt c \then A \PAR \nowe c \else {B})
\end{eqnarray*}
Note that the conditional construct will take one branch or the other; there are no conditions under which it can ``suspend''. In contrast, assuming that the constraint $\neg c$ is definable in the underlying constraint system, the agent $(\nowt c \then A \PAR \nowt\neg c \then{\next B})$ can suspend without executing either $A$ in the current step or $B$ in the next step --- exactly in those stores which are not strong enough to entail either $c$ or $\neg c$.\footnote{For example, if $c$ is the constraint {\tt X=1}, then every store which has no constraints on {\tt X} will be unable to prove either {\tt X=1} or ${\tt X \neq 1}$.} This is a reflection of the intuitionistic nature of the implication underlying the $\nowt c \then A$ construct.

\paragraph{Multiple Prioritized Waits.}
Let $A$ be the agent:
\begin{equation}\label{await}
  \begin{array}{l}
  \Now \\
  \qqq \Case\ c_1\ {\bf do}\ A_1\\
  \qqq \Case\ c_2\ {\bf do}\ A_2\\
  \qqq \ldots \\
  \qqq \Case\ c_n\ {\bf do}\ A_n\\
  \qqq \Default\ B \\
  {\bf end}
  \end{array}
\end{equation}
defined to behave like $A_1$ if $c_1$ is true now; otherwise, from the next time instant, it behaves like $A_2$ if $c_1$ is not true but $c_2$ is, and so on, and like $B$ if none of the $c_i$ are true.
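For instance, an agent of this form can be used to give {\tt cancel} priority over {\tt start} (the signal names here are purely illustrative; a fuller treatment of this scenario appears in the ATM example of Section~\ref{ATM}):
\[
  \begin{array}{l}
  \Now \\
  \qqq \Case\ {\tt cancel}\ {\bf do}\ A_{\mathit{abort}}\\
  \qqq \Case\ {\tt start}\ {\bf do}\ A_{\mathit{go}}\\
  \qqq \Default\ A_{\mathit{idle}} \\
  {\bf end}
  \end{array}
\]
This agent launches $A_{\mathit{abort}}$ immediately if {\tt cancel} is entailed now, launches $A_{\mathit{go}}$ from the next instant if {\tt start} but not {\tt cancel} is entailed now, and otherwise falls through to $A_{\mathit{idle}}$ from the next instant.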
Following the same style of denotational reasoning as before, the denotation of the agent $A$ in (\ref{await}) is:
\[
  \begin{array}[t]{ll}
  & \{\epsilon\}\\
  \cup & \{d \cdot s \in \Obs \alt d \supseteq \LL{c_1}, d \cdot s \in \LL{A_1} \}\\
  \cup & \bigcup_{2 \leq i \leq n} \{d \cdot s \in \Obs \alt \forall j < i.\ d \not\supseteq \LL{c_j}, d \supseteq \LL{c_i}, s \in \LL{A_i}\} \\
  \cup & \{d \cdot s \in \Obs \alt \forall j \leq n.\ d \not\supseteq \LL{c_j}, s \in \LL{B}\}
  \end{array}
\]
From the structure of the denotation, it is easy to write down the following equivalent agent in the basic language:
\[
  \begin{array}{ll}
  & \nowt c_1 \then {A_1}\\
  \PAR & \nowt c_2 \then {\nowe c_1 \else A_2}\\
  \PAR & \nowt c_3 \then{\nowe (c_1 ; c_2) \else A_3} \\
  & \ldots\\
  \PAR & \nowt c_n \then{\nowe (c_1 ; \ldots ; c_{n-1})\else A_n}\\
  \PAR & \nowe (c_1 ; \ldots ; c_n) \else B
  \end{array}
\]

\subsection{Temporal control constructs}

\paragraph{Extended Wait.}
Let ${\bf whenever}\ c\ {\bf do}\ A$ be the agent that suspends until the first time-instant at which $c$ is true; it then behaves like $A$. In the following, for $s \in \OBS$, we use the notation $s \not\geq a$ to mean $b \not\supseteq a$ for every element $b \in s$. The denotation is:
\begin{eqnarray*}
\LL{{\bf whenever}\ c\ {\bf do}\ A} & = &
\begin{array}[t]{rl}
 & \{\epsilon\} \\
\cup & \{s\cdot d \cdot t \in \Obs \alt s \not\geq \LL{c}, d \supseteq \LL{c}, d \cdot t \in \LL{A}\} \\
\cup & \{s \in \Obs \alt s \not\geq \LL{c}\}
\end{array}
\end{eqnarray*}
with the law:
\[ \whenever c \do {A} = \nowte c \then A \else {(\whenever c \do {A})} \]

\paragraph{Watchdogs: $\mdow{P}{c}.$}
The \Esterel{} interrupt construct $\dow A \watching S$ \cite{BerryFSTTCS} cannot be translated precisely into \tcc. In \Esterel{} the computation of $A$ is aborted at the instant that $S$ is emitted. This leads to semantic complexities --- for example, the agent $\dow S \watching {\mbox{\bf immediately } S}$ is required to produce $S$; but if it produces $S$, it must be aborted before $S$ is produced! In \tcc{} it is not possible to specify that a computation be aborted in the current instant on receipt of positive information from the environment. Instead, we describe a construct that aborts the computation at the next {\em stable} instant, that is, at the next moment of quiescence. This is achieved through polling as shown below. $\mdow{P}{c}$ behaves like $P$ till the first time instant at which $c$ is entailed; from the next time instant onwards $P$ is killed. (We can similarly define the related {\em exception handler} primitive, $\mdow{P}{c}\; \mbox{\bf timeout}\; B$, that also activates a handler $B$ when $P$ is killed.) Formally,
\begin{eqnarray*}
\LL{\mdow{P}{c}} & \defeq &
\begin{array}[t]{rl}
 & \{s \in \OBS \alt s \in \LL{P}, s \not\geq \LL{c}\}\\
\cup& \{s \cdot a \cdot t \in \OBS{} \alt s \cdot a \in \LL{P}, a \supseteq \LL{c} \}
\end{array}
\end{eqnarray*}
It satisfies the following laws:
\begin{eqnarray*}
  \do d \watching c &=& d \\
  \do (A_1\PAR A_2) \watching c &=& \do A_1 \watching c \PAR \do A_2 \watching c \\
  \do (\nowt d \then A) \watching c &=& \nowt d \then (\do A \watching c)\\
  \do (\nowe d \else A) \watching c &=& \nowe (c; d) \else {(\do A \watching c)}\\
  \do (X\Hat A) \watching c &=& X\Hat(\do A \watching c)
\end{eqnarray*}
In the last rule, we assume that $X$ is not free in $c$ (this can always be achieved by renaming the bound variable).
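As a small illustration of how these laws push the watchdog inwards (here $c$, $d$, $e$ and $f$ are primitive constraints):
\begin{eqnarray*}
\do (\nowt d \then {e} \PAR \nowe d \else {f}) \watching c
  &=& \do (\nowt d \then {e}) \watching c \PAR \do (\nowe d \else {f}) \watching c\\
  &=& \nowt d \then {e} \PAR \nowe (c; d) \else {f}
\end{eqnarray*}
Thus $e$ is still posted if $d$ is entailed in the current instant, while the default $f$ is posted at the next instant only if neither $c$ nor $d$ is entailed now.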
We believe that the ``monotonic'' approximation of asynchronous interrupts afforded by \tcc{} is appropriate for all practical purposes. Consider the example of an Automated Teller Machine (presented in more detail in Section~\ref{ATM}). The action desired when the customer presses the {\tt start} button (say, $A$) might be different from the action desired when the customer presses the {\tt start} and {\tt cancel} buttons simultaneously (say, $B$). In \tcc{} the negative information that {\tt cancel} was not pressed must be converted into positive information --- and this can only be done at the next time instant. So the action associated with only the {\tt start} button being pressed would be taken for one time step before being aborted. If the exact functionality of \Esterel{} is desired, this can be obtained by sampling the button state at a higher frequency than the frequency driving the main program. (For example, this can be achieved by downsampling from the basic clock --- indeed there needs to be a clock more basic than the one associated with button-presses in order to time out an extremely slow customer.) The button sampler would use an adaptation of the mouse protocol (see below) to determine exactly how many buttons were pressed in a given interval and communicate that positive information at the end of the interval to the main program. The main program can then do $A$ on receipt of the information $\tt (start, \neg cancel)$ and $B$ on receipt of $\tt (start, cancel)$.

\paragraph{Suspension-Activation primitive: $\mbox{\bf S}_c\mbox{\bf A}_e(P).$}
This is a preemption primitive that is a variant of {\em weak suspension} in \Esterel~\cite{BerryFSTTCS}. $\mbox{\bf S}_c\mbox{\bf A}_e(P)$ behaves like $P$ till the first time instant at which $c$ is entailed; from the next time instant onwards $P$ is suspended (hence the ${\bf S}_c$). $P$ is reactivated at the first subsequent time instant at which $e$ is entailed (hence the ${\bf A}_e$). This, for instance, is the behavior of the $({\tt control-Z,fg})$ mechanism in Unix.
\begin{eqnarray*}
\LL{\mbox{\bf S}_c\mbox{\bf A}_e(P)} &\defeq &
\begin{array}[t]{rl}
 & \{ s\in \OBS{} \alt s \in \LL{P}, s \not\geq \LL{c} \} \\
\cup& \{ s \cdot a \cdot t \in \OBS{} \alt s \cdot a \in \LL{P}, s \not\geq \LL{c}, a \supseteq \LL{c}, t \not\geq \LL{e} \} \\
\cup & \{ s \cdot a \cdot t \cdot a' \cdot u \in \OBS{} \alt
  \begin{array}[t]{l}
  s \cdot a \cdot a' \cdot u \in \LL{P}, s \not\geq \LL{c}, \\
  a \supseteq \LL{c}, t \not\geq \LL{e}, a' \supseteq \LL{e} \}
  \end{array}
\end{array}
\end{eqnarray*}

\paragraph{Multiform time: {\bf time} $A$ {\bf on} $c$.}
One of the most attractive aspects of the synchronous programming languages is that no particular notion of time is built in. There is no basic signal, say a millisecond clock, with control constructs that are defined explicitly for that signal. (This is done, for instance, by Timed CSP \cite{timed-csp}.) Rather, the basic temporal sequencing constructs can be used uniformly with respect to {\em any} signal that extends over time. At first glance, it might seem that the Unit Delay and Negative Ask constructs refer explicitly to an underlying clock and hence do not support a multiform notion of time. This is, however, not the case. We shall see shortly that it is in fact possible to clock arbitrary processes uniformly using a general class of processes. For the moment, we concentrate on showing how an arbitrary process $A$ may be exposed to some ``sub-sampling'' of the basic clock.
Intuitively, {\bf time} $A$ {\bf on} $c$ should denote a process whose notion of time is the occurrence of the primitive constraint $c$ --- $A$ evolves only at the time instants at which the store entails $c$. Those instants at which $c$ is not entailed by the store should be regarded as completely invisible to $A$. In particular this implies that the notion of the ``next time instant'' in $A$ will in fact be the ``next time instant at which $c$ holds''. Formally, using the notation $s \downarrow a$ to denote the subsequence of $s$ consisting of those elements that are $\supseteq a$, we have:
\begin{eqnarray*}
\LL{{\bf time}\; A \; {\bf on} \;c} & \defeq &\{ s \in \OBS{} \alt s \downarrow \LL{c} \in \LL{A}\}
\end{eqnarray*}
The laws for {\bf time} $A$ {\bf on} $c$ are easy to write down. We defer an explicit presentation, since we now show how to define this construct in terms of the more general {\bf clock} construct.

\subsection{A general clocking construct: the {\bf clock} combinator}

The combinators discussed above have a common basic idea --- they allow for the introduction of time steps that are ignored by the underlying agent $A$. This suggests the introduction of a combinator that directly captures this intuition: \clock{B}{A} is a process that executes $A$ only at those instants which are quiescent points of $B$.

Let $P$ be a process and $t$ a sequence. We write $t_P$ for the maximal subsequence of $t$ that is an element of the process $P$; it is defined inductively by:
\begin{eqnarray*}
  \epsilon_P &=& \epsilon \\
  (s\cdot a)_P &=& \left\{
  \begin{array}{l}
  (s_P)\cdot a, \; \mbox{if} \; a \in (P \; {\bf after} \; (s_P))\\
  (s_P), \; \mbox{otherwise}
  \end{array}
  \right.
\end{eqnarray*}
Now, recognizing that $A$ is executed only at the quiescent points of $B$, we can state:
\begin{eqnarray*}
  \LL{\clock{B}{A}} & \defeq & \{ t \in \OBS \alt t_{\LL{B}} \in \LL{A} \}
\end{eqnarray*}
It is easy to see that $\LL{\clock{B}{A}}$ is non-empty and prefix-closed if $\LL{B}$ and $\LL{A}$ are. However, it may not be determinate for arbitrary $\LL{B}$. A necessary and sufficient condition for determinacy is that $\LL{B}\; {\bf after}\; s$ is upwards-closed, for all $s \in \LL{B}$. (Such processes can be thought of as arising by ``extending over time'' the basic processes of the form {\bf tell} $c$.) Accordingly, we identify the syntax for ``basic processes'' by:
\begin{eqnarray*}
  \begin{array}{rllll}
  \mbox{(Basic Agents)} & B &{::=}& & c \\
  &&& \alt &\nowt c \then \next B \\
  &&& \alt & \nowe c \else B \\
  &&& \alt & \next B\\
  &&& \alt & \Skip\\
  &&& \alt & \abort\\
  &&& \alt & B \PAR B \\
  &&& \alt & X \Hat B\\
  &&& \alt & g \\
  \quad\\
  \mbox{(Basic Procedures)} & D &{::=}&& g::B
  \end{array}
\end{eqnarray*}
Note that $\LL{\always{A}}$ and $\LL{\whenever c \do \next A}$ are basic processes if $\LL{A}$ is.

The {\bf clock} combinator is neither monotone nor anti-monotone in its first argument. Nevertheless, it is possible to establish
\begin{eqnarray*}
\clock{(\mu g.B)}{A} & =& \mu g.\clock{B}{A}
\end{eqnarray*}
using the fact that recursion is guarded and fixed-points are unique.
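To see the definition at work, recall that the agent $c$ is quiescent in an instant precisely when the store entails $c$, so that $\LL{\always c}$ consists of those sequences all of whose elements entail $c$. Hence $t_{\LL{\always c}} = t \downarrow \LL{c}$, and $\clock{(\always c)}{A}$ has exactly the denotation given above for {\bf time} $A$ {\bf on} $c$; this is one of the identities recorded below.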
Table~\ref{clock-def} in Section~\ref{g-imp} shows how to compositionally reduce programs containing this construct into programs that do not. (The table is given in the syntax of \tgentzen{}, explained in Section~\ref{tgentzen}.) Clearly, this construct is in the flavor of the {\tt when} construct (undersampling) in \Lustre\ and \Signal, generalized from boolean streams to arbitrary processes $B$. We now illustrate how some standard temporal constructs can be expressed using {\bf clocks}. The combinators introduced above can be expressed thus:
\begin{eqnarray*}
  \nowe c \else A &=& \clock{(\nowt c \then \next \abort)}{\next A}\\
  \whenever c \do A &=& \clock{c}{A}\\
  \mbox{\bf time}\; A \; \mbox{\bf on} \;c &=& \clock{(\always c)}{A}\\
  \mdow{A}{c} &=& \clock{(\whenever c \do \next \abort)}{A} \\
  \mbox{\bf S}_c\mbox{\bf A}_e(A) &=& \clock{(\whenever c \do \next e)}{A}
\end{eqnarray*}
Repeated suspension and resumption of a process $A$ can be expressed by:
\begin{eqnarray*}
  \clock{(\mu g. \whenever c \do \next (e \PAR \next{g}))}{A}
\end{eqnarray*}
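As a sanity check on the identity $\whenever c \do A = \clock{c}{A}$: the tell $c$ acts only in its first instant, where it is quiescent precisely at the stores entailing $c$, and it is quiescent at every store thereafter. Hence $t_{\LL{c}}$ is the suffix of $t$ beginning at the first element that entails $c$ (and is empty if there is no such element), which is exactly the portion of the computation that $\whenever c \do A$ hands over to $A$.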