\section{Synchronous programming}

Reactive systems~\cite{reactive-systems,berry-lang,Hbook} are those that react continuously to their environment, at a rate controlled by the environment. Execution in a reactive system proceeds in bursts of activity. In each phase, the environment stimulates the system with an input, obtains a response in bounded time, and may then be inactive (with respect to the system) for an arbitrary period of time before initiating the next burst. Examples of reactive systems are controllers and signal-processing systems. The primary issues that arise in programming reactive systems are time-criticality, reliability, and maintainability in the face of change.

\subsection{Automata}

Arguably, the most natural way of programming such systems is in terms of automata with simple loop-free transitions, to ensure bounded response. The transitions of the automaton correspond to simple actions executed by the program at every time step. Consider, for example, a simple protocol that implements the controller of the paper tray of a photocopier. The function of the controller is to switch the paper-tray motor on and off, always trying to keep the top of the paper stack next to the feeder. There are two sensors, $P$ and $E$, which are set to $1$ when the height of the paper is OK. Whenever the paper level falls, i.e.{}, one of the sensors becomes zero, the motor is activated to push up the paper stack. This protocol can be construed as a finite-state machine with two states, as in Figure~\ref{simple-automaton}.

\begin{figure}[htb]
\vskip 2in
\special{psfile=pic1.ps hoffset=-100 voffset=-500 hscale=100 vscale=100}
\caption{Automaton for a perfect paper tray}
\label{simple-automaton}
\end{figure}

However, automata have neither hierarchical nor parallel structure; in particular, small and succinct changes in the specification can lead to global changes in the automaton~\cite{Murakami-Sethi}. For example, suppose we also want the controller to take into account that the sensors may be broken, in which case it should stop the motor after a certain delay, to prevent it from damaging the copier. An automaton for this protocol is shown in Figure~\ref{complex-automaton}.

\begin{figure}[htb]
\vskip 4in
\special{psfile=pic2.ps hoffset=-100 voffset=-300 hscale=100 vscale=100}
\caption{Automaton for a paper tray with failure modes}
\label{complex-automaton}
\end{figure}

Note that there is no {\em structural} relationship between the two automata. This makes the maintenance of such code through changes in specification a very onerous task.
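To make the two-state protocol concrete, the following sketch implements it as a single bounded step function (in Python; all names are ours, since no code accompanies the figures). Extending this step function to the failure-mode protocol of Figure~\ref{complex-automaton} would require rewriting its transition structure rather than adding to it, which is exactly the maintenance problem just described.

\begin{verbatim}
# A minimal sketch of the two-state paper-tray controller (illustrative
# only).  p_ok and e_ok model the sensors P and E; the return value is
# the motor command (True = motor on).
def make_controller():
    state = {"mode": "idle"}            # two states: 'idle' and 'pushing'

    def step(p_ok, e_ok):
        level_ok = p_ok and e_ok
        if state["mode"] == "idle":
            if not level_ok:            # paper level fell: start the motor
                state["mode"] = "pushing"
        else:                           # currently 'pushing'
            if level_ok:                # stack back in position: stop
                state["mode"] = "idle"
        return state["mode"] == "pushing"
    return step

# Each call to step() is one reaction: bounded, loop-free work.
controller = make_controller()
print(controller(True, True))    # False: motor off
print(controller(True, False))   # True:  sensor E dropped, motor on
print(controller(True, True))    # False: level restored, motor off
\end{verbatim}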
\subsection{Process calculi}

Process calculi~\cite{Hoare85,Milner89a,Milner89} support parallel composition and communication/synchronization via rendezvous. However, these calculi do not specify the ``occurrence time'' of the rendezvous. Consequently, program execution is inherently indeterminate. Furthermore, preemption is not integrated into the calculi, so support for it is inadequate.

\subsection{Temporal logic programming}

Temporal logic programming languages~\cite{brzoska,metatem,baudinet,moszkowski,merz} achieve bounded response by imposing syntactic restrictions. This paradigm is inherently nondeterministic. Furthermore, to ensure true reactivity, the languages are forced to identify, {\em a priori}, global and fixed notions of ``system variables'' and ``environment variables''. This distinction has nebulous logical status, and goes against the algebraic view of parallel composition, where one process is part of the environment of the other process.

\subsection{Synchronous languages}

The problems with automata lead to the style of synchronous programming; the motivation behind these languages is to achieve the strong real-time guarantees of automata in the context of a higher level of system description. The class of synchronous languages~\cite{esterel,lustre,signal,statecharts} has the following features:
\begin{description}
\item[Perfect Synchrony: ] Program combinators are determinate primitives that respond instantaneously to input. Output at time $t$ is a function of the input up to and including time $t$; at any instant both the presence {\em and} the absence of signals can be detected.
\item[Bounded State: ] Programs that operate only on ``signals'' can be compiled into finite-state automata with simple transitions.
\item[Multiform Time: ] Physical time has the same status as any other external event.
\end{description}
Determinacy and synchrony ensure that programs are determinate in the input-output sense as well as the interactive temporal sense. The bounded-state property bounds the single-step execution time of the program and makes the synchrony assumption realizable in practice. The multiform nature of time allows the combination of programs with different notions of time.

There are two classes of languages in this genre: imperative real-time synchronous languages such as \Esterel{} allow flexible programming constructs and come with verification tools; the declarative real-time synchronous languages \Lustre{} and \Signal{} inherit elegant equational reasoning from the underlying dataflow model.

The incongruity between internal temporality (there is computation ``in between'' the stimulus from the environment and the response) and Perfect Synchrony (at each instant a signal is either present or absent; there is no ``in between'') affects programming language design and implementation. For example:
\begin{description}
\item[Temporal paradoxes: ] One can express programs that require a signal to be present at an instant only if it is not present at that instant. Indeed, the behavioral semantics of \cite{esterel} requires (effectively) complex default reasoning to determine what must happen at each step --- in a sense, ``stable models'' of a default theory must be computed by finding the fixed points of a system of equations involving non-monotone operators. Approximate static analysis, a calculus of potentials in the case of \Esterel, is used to eliminate programs with paradoxes (exact analysis is not possible for standard computability reasons). This results, in some cases, in rejecting intuitively correct programs~\cite{Hbook}. Furthermore, this poses problems in constructing process networks with cycles.
\item[Compilation is not compositional: ] The compilation of a program fragment has to take into account the static structure of the entire program~\cite[Page 93]{Hbook}. This is in direct contrast to the standard compilation procedures for traditional programming languages~\cite{sethi-ullman}.
\end{description}

\subsection{Our contributions}

A re-analysis of the elegant ideas underlying synchronous programming, starting from the viewpoint of asynchronous computation, leads us to the framework of {\em timed concurrent constraint programming}, henceforth called \tcc, with the following salient features.
\begin{description}
\item[Declarative view: ] \tcc{} has a fully abstract semantics based on solutions of equations. \tcc{} programs can be viewed as formulas in an intuitionistic linear-time temporal logic, and computation can be viewed as generating a ``least model'' of the program.
\item[Modularity: ] In the spirit of process algebra, we identify a set of basic combinators, from which programs and reasoning principles are built compositionally.
\item[Expressiveness: ] \tcc{} supports the {\em derivation} of a ``{\bf clock}'' construct that allows a process to be clocked by another (recursive) process; it generalizes the {\em undersampling} constructs of \Signal\ and \Lustre, and the preemption/abortion constructs supported by \Esterel{}. Thus, \tcc{} encapsulates the rudiments of a theory of preemption constructs. In addition, by selecting an appropriate underlying constraint system (e.g., \herbrand{}, which underlies (concurrent) logic programming languages), \tcc{} languages can be used to specify cyclic, dynamically changing networks of processes (cf.\ ``mobility''~\cite{Milner89}).
\item[Executability: ] Under simple syntactic conditions, \tcc{} programs may be compiled into finite-state ``constraint'' automata that have bounded response time.
\item[``Paradox-free'': ] \tcc{} resolves the tension between synchrony and causality.
\end{description}

\subsection{Basic Intuitions}

Our starting point is the paradigm of concurrent constraint programming, henceforth called CCP~\cite{my-book,popl91}. CCP is an asynchronous and determinate paradigm of computation; it is highly expressive and, for example, generalizes dataflow languages. The basic move in this paradigm is to replace the notion of store-as-valuation central to von Neumann computing with the notion that the store is a constraint, that is, a collection of pieces of partial information about the values that variables can take. Computation progresses via the monotonic accumulation of information. Concurrency arises because any number of agents may simultaneously interact with the store. The usual notions of ``read'' and ``write'' are replaced by {\em ask} and {\em tell} actions. A tell operation takes a constraint and conjoins it with the constraints already in the store; tells are executed asynchronously: thus, it is guaranteed that the store eventually satisfies the constraint, but no assumptions are made about the amount of time taken to achieve this result. Synchronization is achieved via the ask operation: ask takes a constraint (say, $c$) and uses it to probe the structure of the store; it succeeds if the store contains enough information to entail $c$, and the agent blocks if the store is not strong enough to entail the constraint it wishes to check.
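The ask/tell interface can be pictured with the small sketch below, written in Python over a deliberately trivial constraint system in which constraints are finite sets of ground facts and entailment is set inclusion; the representation and all names are illustrative assumptions, not part of any particular constraint system used in this paper.

\begin{verbatim}
# A minimal sketch of a CCP store.  Constraints are modelled as sets of
# ground facts; entailment is set inclusion.  Illustrative only.
class Store:
    def __init__(self):
        self.facts = set()              # the store only ever grows

    def tell(self, constraint):
        """Conjoin a constraint with the constraints already in the store."""
        self.facts |= set(constraint)

    def ask(self, constraint):
        """True iff the store entails the constraint; a real agent would
        suspend (block) until enough information has been told."""
        return set(constraint) <= self.facts

store = Store()
store.tell({"motor=on"})
print(store.ask({"motor=on"}))           # True: entailed by the store
print(store.ask({"motor=on", "jam"}))    # False: asking agent would block
store.tell({"jam"})
print(store.ask({"motor=on", "jam"}))    # True: information accumulates monotonically
\end{verbatim}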
The information accumulated by computation in the CCP paradigm is {\em positive} information: constraints on variables, or a piece of history recording that ``some event happened''. The fundamental move we now make is to consider the addition of {\em negative} information to the computation model: information of the form ``an event did not happen''. This is the essence of the ``time out'' notion in real-time programming: some sub-program may be aborted because of the absence of an event. How can a coherent conceptual framework be fashioned around the detection of negative information? The first task is to identify states of the system in which no more positive information is being generated: the {\em quiescent} points of the computation. Only at such moments does it make sense to say that a certain piece of information has not been generated. Now {\em time} can be introduced by identifying quiescent points as the markers that distinguish one time step from the next. This allows the introduction of primitives that can trigger an action in the {\em next} time interval if some event did not happen throughout the {\em previous} time interval. Since actions of the system can only affect its behavior at the current or future time steps, the negative information detected is {\em stable}. Hence there is some hope that a semantic treatment as pleasant as that for the untimed case will be possible.

Our basic ontological commitment, then, is to the following refinement of the Perfect Synchrony Hypothesis --- the {\em Timed Asynchrony Hypothesis}:
\begin{description}
\item[Bounded Asynchrony: ] Computation progresses asynchronously in bounded intervals of activity separated by arbitrary periods of quiescence. Program combinators are determinate primitives that respond in a bounded amount of time to the input.
\item[Strict Causality: ] Output at time $t$ is a function of the positive information input up to and including time $t$, and of the negative information input up to (but not including) time $t$. The {\em absence} of a signal in a period can be detected only at the end of that period, and hence can trigger activity only in the next interval.
\end{description}
The Timed Asynchrony Hypothesis addresses one practical and mathematical shortcoming of the Perfect Synchrony Hypothesis. It is more pleasant and practical to regard the computation at each external clock tick as having a notion of {\em internal temporality}: it happens {\em not instantaneously} but over a {\em very short} --- bounded --- period of time. This is the reality in any case --- sophisticated embedded real-time controllers may require that certain conditions be checked, then values computed for local variables, then some further conditions be checked, and so on. Indeed, even \Esterel{} has such an internal notion of temporality --- but it is achieved through a completely separate mechanism, the introduction of assignable variables and internal sequencing (``;''). In contrast, \tcc{} does away with the assignable variables of \Esterel{} in favor of the same logical (denotational) variables that are used for representing signals. Computation progresses through the asynchronous accumulation and detection of positive information within an interval, and of negative information at the end of an interval\footnote{Prima facie, this ``monotonic approximation'' of \Esterel{} may seem to be less powerful than \Esterel, which can react to negative information at the same clock tick. The variety of programming examples in the rest of the paper supports our argument for the expressiveness of \tcc.}.

To summarize, computation in \tcc{} proceeds in intervals. During each interval, positive information is accumulated and detected asynchronously, as in CCP. At the end of the interval, the absence of information can be detected, and the constraints accumulated in the interval are discarded. We do not provide for the implicit transfer of positive information across time boundaries, in order to keep the constraint store bounded in size; any such transfer must be programmed explicitly using the basic combinators.
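To make the interval structure concrete, the sketch below (Python, again over a trivial set-of-facts constraint system, with all rules and signal names invented for illustration) runs a single time step to quiescence, detects absences only at the end of the step, discards the store, and carries the triggered responses into the next step.

\begin{verbatim}
# A minimal sketch of one tcc-style time step (illustrative only).
# on_present: if the trigger fact is entailed, emit a fact NOW (same step).
# on_absent:  if the trigger fact is absent at quiescence, emit NEXT step.
def run_interval(inputs, on_present, on_absent):
    store = set(inputs)
    changed = True
    while changed:                       # accumulate positive information
        changed = False                  # until quiescence
        for trigger, emitted in on_present.items():
            if trigger in store and emitted not in store:
                store.add(emitted)
                changed = True
    carry = {emitted for trigger, emitted in on_absent.items()
             if trigger not in store}    # absence detected only at the end
    return store, carry                  # the store itself is discarded

on_present = {"low": "motor"}            # 'low' present => run the motor now
on_absent  = {"ok": "alarm"}             # 'ok' absent   => alarm next step

step1, carry = run_interval({"low"}, on_present, on_absent)
print(step1)   # {'low', 'motor'}: reaction within the interval
print(carry)   # {'alarm'}: 'ok' was absent, so the alarm fires next step
step2, _ = run_interval(carry | {"ok"}, on_present, on_absent)
print(step2)   # {'ok', 'alarm'}: the previous absence triggers activity now
\end{verbatim}

Note that nothing is transferred implicitly: the first step's store is thrown away, and only the explicitly computed carry set reaches the second step, mirroring the last point above.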