% Document Type: LaTeX \documentstyle[meaning]{article} \include{diagram} \newtheorem{theorem}{Theorem}[section] \newtheorem{proposition}[theorem]{Proposition} \newtheorem{lemma}[theorem]{Lemma} \newtheorem{corollary}[theorem]{Corollary} \newtheorem{definition}[theorem]{Definition} \newtheorem{defin}[theorem]{Definition} \newtheorem{assumption}{Assumption}[section] \newtheorem{convention}{Convention}[section] \newtheorem{example}{Example}[section] \newtheorem{comment}{Comment}[section] \newtheorem{digression}{Digression}[section] \newtheorem{myfact}{Fact}[section] \newtheorem{notation}{Notation}[section] \newenvironment{proof}{{\bf Proof:}}{\rule{2mm}{3mm}\\} \newcommand{\parlog}{{Parlog}} \newcommand{\CP}{{Concurrent Prolog}} \newcommand{\Var}{\mbox{\tt Var}} \def\from#1\infer#2{{{\textstyle #1}\over{\textstyle #2}}} \newcommand{\alt}{\mbox{$\;{\tt\char`\|}\;$}} \newcommand{\powerset}[1]{\mbox{${\cal P}(#1)$}} \newcommand{\powfin}[1]{\mbox{${\cal P}_{fin}(#1)$}} \newcommand{\newin}[2]{\mbox{${\sf new}\;#1\;{\sf in}\;#2$}} \newcommand{\then}{\mbox{$\;\rightarrow\;$}} %\newcommand{\to}{\mbox{$\;\rightarrow\;$}} \newcommand{\cc}{{\sf cc}} \newcommand{\fix}[2]{\mbox{$#1\ \underline{{\tt fix}}\ #2$}} \newcommand{\derives}{\longrightarrow} \newcommand{\imp}{\mbox{$\Rightarrow$}} \newcommand{\entails}{\mbox{$\Rightarrow$}} \newcommand{\pair}[2]{\mbox{$\langle #1,\; #2 \rangle$}} \newcommand{\tuple}[1]{\mbox{$\langle #1 \rangle$}} \newcommand{\defined}{\stackrel{\bf def}{=}} \newcommand{\false}{\mbox{\tt false}} \newcommand{\true}{\mbox{\tt true}} \newcommand{\filt}[1]{\mbox{${\cal F}(#1)$}} \newcommand{\forw}[1]{\mbox{${\bf F}(#1)$}} \newcommand{\fin}[1]{\mbox{${\cal P}_{fin}(#1)$}} \newcommand{\base}{\mbox{${\bf B}$}} \newcommand{\baseop}{\mbox{${\bf B}^{op}$}} \newcommand{\ppp}[1]{\mbox{${\cal P}(#1)$}} \newcommand{\mpre}{\mbox{${\bf \wedge-Preord}$}} \newcommand{\image}[2]{\mbox{$#1 (#2)$}} \newcommand{\up}[1]{\mbox{$(#1)\uparrow$}} \newcommand{\caladj}{\mbox{${\bf CAL}^{adj}$}} \newcommand{\self}[1]{\mbox{$[#1\rightarrow #1]$}} \newcommand{\close}[1]{\mbox{$\overline{#1}$}} \newcommand{\clopf}[1]{\mbox{${\bf Cl}(#1)$}} \newcommand{\twoimp}{\mbox{$\Leftrightarrow$}} \newcommand{\agent}[1]{\mbox{${\cal A}\Mean{#1}$}} \def\OO{{\cal O}} \def\DD{{\cal D}} \def\PP{{\cal P}} \def\CC{{\cal C}} \def\RR{{\cal R}} \def\glb{\sqcap} \def\lub{\sqcup} \def\dep{\mbox{\em dep}} \def\indep{\mbox{\em indep}} \def\S{\mbox{\bf S}} \def\ran{Range} \def\defeq{\stackrel{\Delta}{=}} \begin{document} \bibliographystyle{plain} \title{What is a Constraint System?} \author{Prakash Panangaden\thanks{Research supported by an operating grant from the Natural Sciences and Engineering Research Council and a team grant from Fonds pour la Formation de Chercheurs et l'Aide \'{a} la Recherche (Qu\'ebec).}, McGill University \\ Vijay Saraswat, Xerox PARC \\ P. J. Scott\thanks{Research supported by an operating grant from the Natural Sciences and Engineering Research Council, a team grant from Fonds pour la Formation de Chercheurs et l'Aide \'{a} la Recherche (Qu\'ebec), and the Ontario Ministry of Colleges and Universities Ontario-Qu\'ebec Exchange.}, University of Ottawa \\ R. A. G. 
Seely\thanks{Research supported by Fonds pour la Formation de Chercheurs et l'Aide \'{a} la Recherche (Qu\'ebec) and an operating grant from the Natural Sciences and Engineering Research Council}, McGill University and John Abbott College } \maketitle \thispagestyle{empty} \begin{abstract} We study a relationship between logic and computation via concurrent constraint programming. In previous papers it has been shown how a simple language for asynchronous concurrent processes can be interpreted in terms of constraints. In the present paper we show that the programming interpretation via closure operators is intimately related to the logic of the constraints. More precisely we show how the usual hyperdoctrinal description of first order logic can be functorially related to another hyperdoctrine built out of closure operators. The logical connectives map onto constructions on closure operators that turn out to model programming constructs, specifically conjunction becomes parallel composition and existential quantification becomes hiding of local variables. \end{abstract} \newpage \setcounter{page}{1} \section{Introduction} This paper develops a category theoretic view of the relationship between concurrent constraint programming (CCP) and logic. One may think of this as an explication of the relationship between logic and what is often called logic programming. The present treatment takes some new directions from the usual study that links the operational semantics to the model theoretic semantics (e.g. \cite{Apt82}) because of the desire to handle concurrency and synchronization in a logical way. Briefly these directions are: (i) the mathematical semantics is based on the idea of programs as {\em information transducers}; in the spirit of denotational semantics, programs are modelled as closure operators on certain algebraic lattices and (ii) the semantics treats concurrency in a natural and simple way; parallel composition of processes is one of the easiest combinators to describe. The CCP paradigm can be described in the following way. The crucial concept underlying this paradigm is to replace the notion of {\em store-as-valuation} behind imperative programming languages with the notion of {\em store-as-constraint}. A {\em constraint} is just a partial specification of the values of variables (e.g., $X \geq Y+Z$). Thus the usual notion of a ``store'' is replaced by the notion of a variable that may only be partly defined. This shift to partially specified values renders the usual notions of (imperative) ``write'' and ``read'' incoherent. Instead, \cite{saraswat-thesis-book} proposes the replacement of read with the notion of {\em Ask} and write with the notion of {\em Tell}. An Ask operation takes a constraint (say, $c$) and uses it to probe the structure of the store. It succeeds if the store contains enough information to entail $c$. Tell takes a constraint and conjoins it to the constraints already in place in the store. That is, the set of valuations describing the resultant store is the intersection of the set of valuations describing the original store and those describing the additional constraint. Thus, as computation progresses, more and more information is accumulated in the store---a basic step does not {\em change} the value of a variable but rules out certain values that were possible before; the store is {\em monotonically refined}. 
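The following small worked example (ours, chosen purely for illustration) may help fix these ideas.
\begin{example}
Suppose the store contains just the constraint $X \geq Y+Z$. An Ask of $X \geq Y$ cannot yet succeed, since the store does not entail it. A Tell of $Z \geq 0$ refines the store to $X \geq Y+Z \;\wedge\; Z \geq 0$, which does entail $X \geq Y$, so the Ask now succeeds. A further Tell of $Y = 2$ strengthens the store again; at no point is the ``value'' of any variable overwritten, and no valuation that was ruled out ever becomes possible again.
\end{example}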
This simple view of computation can be seen as arising as a purification of logic programming~\cite{saraswat-thesis-book}, an enhancement of functional programming, a generalization of imperative programming, and a {\em bidirectional} generalization of dataflow programming (cf., the I-structures of Id Nouveau~\cite{Jagadeesan89}). From the viewpoint of programming languages, the important contribution of the CCP paradigm is that {\em the programmer can work directly with the notion of partial information}. In other words, programming constructs provide direct access to the underlying information order. For this reason the present semantics, described and shown fully abstract in~\cite{Saraswat91}, is important. This paper has two main contributions. First, we propose to view constraint systems as certain hyperdoctrines, obtained from the usual categorical account of first-order logic by restricting our attention to the logic of conjunction and existential quantification. Second, we construct a hyperdoctrinal structure of closure operators and show that it arises as a functorial image of the first, with the (functorial images of) conjunction and existential quantification exactly matching the constructions on closure operators used to model parallel composition and hiding (local variables). Thus the functor describes the passage from logical constructs to program combinators in a simple and pleasing way. The idea of monotonic update is central to the theoretical treatment of I-structures in Id Nouveau~\cite{Jagadeesan89}; from the viewpoint of dataflow programming, the concurrent constraint paradigm is also a generalization in that the flow of information between two processes is {\em bidirectional}. The one other categorical approach to logic programming that we are aware of, by Asperti~\cite{Asperti90}, has a different focus. Like us, he uses categorical machinery to avoid talking about variables. However, he is interested in modelling the SLD resolution process, whereas our intention is to study a semantic theory of concurrent constraint programs based on closure operators. The rest of this paper is organized as follows. The next section briefly summarizes the theory of concurrent constraint languages~\cite{Saraswat90,Saraswat91}. (A detailed discussion of programming idioms within this paradigm is contained in the forthcoming book by Saraswat~\cite{saraswat-thesis-book}.) Subsequently, we describe the hyperdoctrine of constraint systems. Finally, we describe the hyperdoctrine of closure operators and exhibit its functorial connection with constraint systems. We conclude with a brief discussion of the possibilities for future work opened up by this paper. \subsection*{Semantics of Concurrent Constraint Languages} The discussion in this subsection is a condensation of the discussion in~\cite{Saraswat91}. The syntax and operational semantics of the language are given in Table~\ref{sem-table}. We use the letter $c$ to stand for an element of the constraint system. The basic combinators are {\bf ask} and {\bf tell}, written $c\rightarrow A$ and $c$ respectively. Intuitively, $c\rightarrow A$ executes by asking the store whether $c$ holds; if it does, then $A$ executes, otherwise the process suspends. The {\bf tell} combinator $c$ simply asserts the constraint $c$. The parallel composition of two agents is written $A_1 \parallel A_2$. Hiding is written \newin{x}{A}. Finally, there are procedure calls, including possibly recursive procedures, but we exclude them from the present discussion.
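To make these combinators concrete, the following small Python sketch implements the transition rules of Table~\ref{sem-table}. It is purely illustrative and entirely our own: the encoding of agents as tagged tuples, the representation of the store as a set of constraint tokens, the treatment of hiding by renaming the bound variable apart, and the toy syntactic entailment relation used at the end are all simplifying assumptions, not part of the formal development.
\begin{verbatim}
from itertools import count

# Agent constructors, following A ::= c | c -> A | A || A | new x in A.
# (This encoding of agents and constraints is our own, not the paper's.)
def tell(c):    return ("tell", c)
def ask(c, a):  return ("ask", c, a)
def par(a, b):  return ("par", a, b)
def new(x, a):  return ("new", x, a)

fresh = ("_v%d" % i for i in count())   # supply of fresh variable names

def rename(a, x, y):
    """Substitute variable name y for x throughout an agent; constraints
    are tuples whose entries may be variable names."""
    tag = a[0]
    if tag == "tell":
        return ("tell", tuple(y if t == x else t for t in a[1]))
    if tag == "ask":
        return ("ask", tuple(y if t == x else t for t in a[1]),
                rename(a[2], x, y))
    if tag == "par":
        return ("par", rename(a[1], x, y), rename(a[2], x, y))
    if tag == "new":                 # do not rename under a rebinding of x
        return a if a[1] == x else ("new", a[1], rename(a[2], x, y))

def step(agents, store, entails):
    """Apply one rule of Table 1 to the configuration <agents, store>,
    or return None if every remaining agent is a suspended ask.
    (The rules may fire in any order; this sketch scans left to right.)"""
    for i, a in enumerate(agents):
        rest, tag = agents[:i] + agents[i+1:], a[0]
        if tag == "tell":                 # <(Gamma, c), s> -> <Gamma, (s,c)>
            return rest, store | {a[1]}
        if tag == "ask" and entails(store, a[1]):   # fires only when s |- c
            return rest + [a[2]], store
        if tag == "par":                  # split A1 || A2 into two agents
            return rest + [a[1], a[2]], store
        if tag == "new":                  # rename x apart, then drop "new"
            return rest + [rename(a[2], a[1], next(fresh))], store
    return None

def run(agent, entails):
    agents, store = [agent], set()
    while True:
        nxt = step(agents, store, entails)
        if nxt is None:
            return agents, store    # quiescent (suspended asks may remain)
        agents, store = nxt

# Toy entailment: the store entails c exactly when c occurs in it literally.
syntactic = lambda store, c: c in store

prog = par(tell(("ge", "Y", 2)),
           ask(("ge", "Y", 2), tell(("ge", "X", "Y"))))
assert run(prog, syntactic) == ([], {("ge", "Y", 2), ("ge", "X", "Y")})
\end{verbatim}
The final lines run the parallel composition of a tell with a matching ask and reach the quiescent store one would expect from the rules of Table~\ref{sem-table}.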
\begin{table} \begin{tabular}{|l|} \hline \begin{minipage}[t]{\columnwidth} {\bf Syntax.} $$ \begin{array}{l} A {::=} c \alt c \rightarrow A \alt A \parallel A \alt \newin{x}{A} \end{array} $$ Configurations are pairs, consisting of a (possibly empty) multiset of agents, written $\Gamma$, and a (possibly empty) multiset of primitive constraints, the store, written $s$. The transition relation is defined between configurations. $$ \begin{array}{lcl} \pair{(\Gamma, A_1 \parallel A_2)}{s} &\to & \pair{(\Gamma,A_1,A_2)}{s}\\ \pair{(\Gamma, \newin{x}{A})}{s} & \to & \pair{(\Gamma, A)}{s}\\ \pair{(\Gamma, c\to A)}{s} & \to & \pair{(\Gamma,A)}{s}\;\mbox{where}\; s\vdash c\\ \pair{(\Gamma, c)}{s} & \to & \pair{\Gamma}{(s,c)} \end{array} $$ \end{minipage}\\ \hline\end{tabular} \caption{Operational semantics for the Ask-and-Tell \cc\ languages}\label{sem-table} \end{table} In the description of the operational semantics one can already see that there is a relation with a sequent-style presentation of logic. In the rule for \newin{x}{A}, we have assumed that $x$ is not free in $\Gamma$ or in $s$. In the denotational semantics one models processes as closure operators. Speaking operationally, in order to model a process {\em compositionally}, it suffices to record its resting points. Mathematically, this is mirrored by the fact that a closure operator is completely specified by its set of fixed points. One can form the intersection of the sets of fixed points of two closure operators; it turns out that this operation is exactly what one needs to model parallel composition. The denotational semantics is given in Table~\ref{den-sem-table}. We have left out details like the definition of the environment mechanism and procedures. The closure operators are described by giving their sets of fixed points. \begin{table} \begin{tabular}{|l|} \hline \begin{minipage}[t]{\columnwidth} {\bf Syntax.} $$ \begin{array}{l} A {::=} c \alt c \rightarrow A \alt A \parallel A \alt \newin{X}{A} \end{array} $$ {\bf Semantic Equations.} $$ \begin{array}{l} \agent{c} = \{d \in |D| \alt d \geq c\} \\ \agent{c \then A} = \{d \in |D| \alt d \geq c \entails d \in \agent{A}\} \\ \agent{A \parallel B} = \{d \in |D| \alt d \in \agent{A} \wedge d \in \agent{B} \} \\ \agent{\newin{X}{A}} = \{d \in |D| \alt \exists c \in \agent{A}. \exists_X d=\exists_X c\} \end{array} $$ Above, $c$ ranges over basic constraints, that is, finite sets of tokens. \end{minipage}\\ \hline\end{tabular} \caption{Denotational semantics for the Ask-and-Tell \cc\ languages}\label{den-sem-table} \end{table} \section{Constraint Systems as Hyperdoctrines} In this section we review the connection, originally elucidated by Lawvere~\cite{Lawvere63}, between ordinary first-order logic and category theory through the use of hyperdoctrines. We will give a minimal and simplified account that only encompasses conjunction and existential quantification. These are the two connectives that any constraint system must have. There is nothing original in this section; we follow the ideas in Seely's discussion~\cite{Seely83} of the connection between theories in first order logic presented in a natural deduction format and hyperdoctrines.
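Before developing the categorical machinery, it may help to return briefly to the semantic equations of Table~\ref{den-sem-table}. The following Python sketch computes their fixed-point sets over a toy constraint system; it is our own illustration, with $|D|$ taken to be the finite power-set lattice over a small token set and $\exists_X$ implemented simply by deleting the tokens that mention $X$. Conjunction appears as intersection of fixed-point sets (parallel composition), and the operation used for \newin{X}{A} is exactly the kind of existential quantification axiomatized in this section.
\begin{verbatim}
from itertools import combinations

# A token is an opaque (variable, atom) pair; an element of |D| is a subset
# of TOKENS, ordered by inclusion.  This finite power-set lattice is our own
# stand-in for the algebraic lattices used in the paper.
TOKENS = [("X", "a"), ("Y", "a"), ("Y", "b")]
D = [frozenset(s) for r in range(len(TOKENS) + 1)
     for s in combinations(TOKENS, r)]

# Fixed-point sets of Table 2 (d >= c is superset inclusion on frozensets).
def den_tell(c):    return {d for d in D if d >= c}
def den_ask(c, A):  return {d for d in D if not d >= c or d in A}
def den_par(A, B):  return A & B
def hide(x, d):     return frozenset(t for t in d if t[0] != x)
def den_new(x, A):  return {d for d in D
                            if any(hide(x, d) == hide(x, c) for c in A)}

def apply_closure(A, d):
    """clo(A)(d): the least element of the fixed-point set A above d.  The
    sets built above always contain the top element and are closed under
    intersection, so the intersection below is that least element."""
    return frozenset.intersection(*[s for s in A if s >= d])

# [[ {(X,a)}  ||  {(X,a)} -> {(Y,b)} ]] applied to the empty store gives the
# store {(X,a),(Y,b)}, matching what the operational rules would compute.
A = den_par(den_tell(frozenset({("X", "a")})),
            den_ask(frozenset({("X", "a")}), den_tell(frozenset({("Y", "b")}))))
assert apply_closure(A, frozenset()) == frozenset({("X", "a"), ("Y", "b")})

# Hiding: new Y in (tell {(Y,b)}) tells nothing visible about X.
assert den_new("Y", den_tell(frozenset({("Y", "b")}))) == set(D)
\end{verbatim}
Of course this toy power-set lattice forgets the entailment structure of a real constraint system; it is meant only to make concrete the shape of the semantic equations and the roles played by $\wedge$ and $\exists$.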
The most important point of this section is that the hyperdoctrinal presentation gives a general abstract definition of existential quantification, one which encompasses the usual logical notion but which also carries over to other hyperdoctrinal structures, as we shall see in the next section. In the hyperdoctrinal presentation, one has a family of categories, called the {\em fibres}, indexed by the objects of a cartesian category called the {\em base} category. Corresponding to the arrows of the base category are functors between the fibres. Thus, in general, a hyperdoctrine over the base category is a contravariant functor\footnote{Generally these are pseudofunctors, but in the posetal situation that we consider it does not make any difference.} to {\bf CAT}. For our purposes it will be sufficient to consider the fibres to be preordered sets. We define a constraint system as a simplified hyperdoctrine. \begin{defin} A (hyperdoctrinal) constraint system is a functor $\ppp{}:\base^{op} \longrightarrow \mpre$, where \mpre{} is the category of meet-preorders, and \base{} is cartesian. We assume that for each arrow $f$ in \base{}, the (monotone) function \ppp{f} (often written $f^*$) preserves meets and has a left adjoint, written $\exists_{f}$. We also require the following two conditions: \begin{enumerate} \item {\bf Beck condition:} If the following diagram is a pullback $$\square[A`B`A'`B';f`g`h`k]$$ and $\phi\in\ppp{B}$ then $\exists_g(f^*\phi) \sim k^*(\exists_h\phi)$, where $\sim$ means that we have two-way entailment. \item {\bf Frobenius Reciprocity:} For each $f:A\to B$ in \base{}, $\phi\in \ppp{A}$ and $\psi\in \ppp{B}$ we have $\exists_f(f^*\psi\wedge\phi)\sim \psi\wedge\exists_f\phi$. \end{enumerate} \end{defin} In the case where we have a constraint system syntactically presented as a concrete theory in first order conjunctive logic with existential quantification, the hyperdoctrine conditions are easy to check~\cite{Seely83}. At the least, the base category is generated by a set of basic data values $V$: the objects of \base{} are all finite products of $V$, including the empty product $1$, and the arrows form the smallest set of arrows containing all the projections and identity arrows and closed under composition and pairing. In general there may be other arrows between objects, corresponding to terms. \begin{convention} Since the objects of \base{} are indexed by nonnegative integers we will usually just write ${\bf n}$ when we mean the object $V^n$ of \base{}. \end{convention} \begin{convention} We will use Greek letters like $\phi$, $\psi$, etc.\ to stand for formulas. \end{convention} \begin{defin} Given a concrete constraint system we define a functor \ppp{} from \baseop{} to \mpre{} as follows. The functor \ppp{} takes an object ${\bf n}$ to the formulas with $n$ free variables constructed out of terms, variables, predicate symbols, conjunction and existential quantification. The preorder is the entailment relation: if $\phi\vdash\psi$ we define $\phi\leq\psi$ or, equivalently, $\phi\rightarrow\psi$. If $f$ is an arrow from ${\bf m}$ to ${\bf n}$ in \base{}, we define \ppp{f} from \ppp{{\bf n}} to \ppp{{\bf m}} by substitution: $\ppp{f}(\phi)[\vec{X}] = \phi[f(\vec{X})]$, where $\vec{X}$ is a vector of $m$ variables. \end{defin} Note that the arrows $\ppp{f}$ are clearly meet preserving when $f$ is one of the projection maps: in this case $\ppp{f}$ merely adjoins dummy variables, an operation defined structurally, and hence it commutes with conjunction.
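To fix ideas, here is how Frobenius reciprocity reads in the concrete, syntactically presented case just described; the instance is standard and is recorded only for illustration.
\begin{example}
Take $f$ to be the first projection $p:{\bf 2}\rightarrow{\bf 1}$, so that $p^*$ adjoins a dummy variable $Y$ and $\exists_p$ existentially quantifies $Y$ (as made precise in the next paragraphs). For $\psi\in\ppp{{\bf 1}}$ and $\phi\in\ppp{{\bf 2}}$, Frobenius reciprocity is the familiar two-way entailment
$$\exists Y\,(\psi[X]\wedge\phi[X,Y]) \;\sim\; \psi[X]\wedge\exists Y\,\phi[X,Y],$$
which holds precisely because $Y$ is not free in $\psi$.
\end{example}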
Now we give the abstract definition of existential quantification as the left adjoint to substitution. Consider, for definiteness, the fibres over ${\bf 1}$ and ${\bf 2}$, i.e.\ \ppp{{\bf 1}} and \ppp{{\bf 2}}. Recall that ${\bf 2}$ is just ${\bf 1}\times {\bf 1}$; let $p$ be the first projection from ${\bf 2}$ to ${\bf 1}$. The functor (monotone function between preorders) \ppp{p}, written $p^{*}$, from \ppp{{\bf 1}} to \ppp{{\bf 2}} is, according to the above, just $p^{*}(\phi)[X,Y] = \phi[p(X,Y)] = \phi[X]$. Because $p^{*}$ preserves meets, it has a left adjoint\footnote{This is easily proved for Galois connections between posets; see for example~\cite{Compendium}.}, written $\exists_p$, a functor from \ppp{{\bf 2}} to \ppp{{\bf 1}}. In fact, \[ (\exists_p\psi)[X] = \exists X'\exists Y'(p(X',Y') = X \wedge\psi[X',Y'] ).\] By Tarski's trick, this equals $\exists Y'\,\psi[X,Y']$. It is easy to prove directly (for intuitionistic first order logic with equality) that $\exists_p$ is left adjoint to substitution $p^{*}$. All this clearly extends to other maps between other fibres and to the multisorted case. Defined this way, the functor enjoys all the properties one normally expects of existential quantification. For example, $\phi \leq p^{*}(\exists_p\phi)$ is immediate and means the same as what is traditionally written as $\phi \leq \exists Y.\phi$. The Beck condition that we discussed above needs to be checked for only a few simple classes of diagrams. When the constraint system is derived from syntax these are easily checkable conditions on formulas. Frobenius reciprocity, when one has implication, is equivalent to saying that the maps of the form $f^*$ preserve implication. In the absence of an implication in the fibres we demand Frobenius reciprocity as a condition. Again, in the concrete case one can check Frobenius reciprocity easily. Seely~\cite{Seely83} shows how one can go back and forth between the hyperdoctrinal presentation of a first order theory and the more traditional presentation of such theories in a natural deduction format. \section{A Hyperdoctrine of Closure Operators} In this section we show how one can build two new hyperdoctrines, by defining a suitable functor from {\bf meet-Preord} to {\bf CAL}, the category of complete algebraic lattices, and then an endofunctor on {\bf CAL}. Roughly speaking, the first functor takes us from logic to ``information structure'' in a familiar way and the second takes us from an information structure to a collection of closure operators. The main point of this construction is that it provides the promised tie-up between existential quantification in the logic and block structure at the program level. The main technical difficulty is actually finding the functor that produces the last hyperdoctrine. We give the development at the lemma and theorem level; the detailed proofs of the lemmas that we have omitted are easy. The category {\bf meet-Preord} has preordered sets equipped with binary meets as objects and monotone functions as the morphisms. The category we actually use is not quite {\bf CAL} but instead \caladj{}, which has complete algebraic lattices as objects and adjunction pairs as morphisms. \begin{defin} The objects of the category \caladj{} are complete algebraic lattices. A morphism $m$ from $L$ to $M$ is an {\em adjunction pair} between $L$ and $M$: a pair of monotone functions \pair{f:L\rightarrow M}{g:M\rightarrow L} with $f$ left adjoint to $g$ (i.e.\ $f \dashv g$).
The composition of morphisms \pair{f}{g} and \pair{h}{k} is \pair{h \circ f}{g \circ k}. \end{defin} By the adjoint functor theorem, $f$ preserves arbitrary sups. The lattices in question are obtained by taking entailment-closed sets of formulas, or ``theories''. Since this hyperdoctrine arises by composing the previous hyperdoctrine with a functor, the Beck conditions and Frobenius reciprocity hold automatically. We can define a functor, \filt{}, from {\bf meet-Preord} to {\bf CAL} as follows. In the next few paragraphs let $A$ and $B$ be meet-preorders and $m:A\rightarrow B$ a monotone function between them. \begin{defin} A {\em filter} in $A$ is an upwards-closed set that is also closed under the formation of binary (and hence all finitary) meets. \end{defin} \begin{defin} The lattice \filt{A} is defined to be the set of filters of $A$ ordered by inclusion. \end{defin} \begin{convention} We use letters like $u,v,w$ to stand for elements of the complete algebraic lattices generated in this way. \end{convention} \begin{notation} Given $\phi$ an element of $A$, we write $(\phi)\uparrow$ for the principal filter generated by $\phi$. For any subset $A'$ of $A$ we write $(A')\uparrow$ for $\bigcup_{\phi\in A'}(\phi)\uparrow$. \end{notation} Note that with our choice of ordering we have $\phi\leq\psi\Rightarrow\up{\psi}\sqsubseteq\up{\phi}$. \begin{notation} Given any set $X$, a function $f$ from $X$ to $X'$, and a subset $Y$ of $X$, we write $\image{f}{Y}$ for the direct image $\{ f(y)\mid y\in Y\}$. \end{notation} \begin{defin} The arrow part of the functor \filt{} is given by mapping the monotone function $m$ to the pair \pair{\forw{m}}{m^{-1}}, where \forw{m} from \filt{A} to \filt{B} is given by $u\mapsto \up{\image{m}{u}}$ for $u\in\filt{A}$, and $m^{-1}$ is inverse image. \end{defin} \begin{proposition} If $A$ is a meet-preorder then \filt{A} is a complete algebraic lattice, with meets given by intersections and joins given by the filters generated by unions. If $m$ is a monotone function from $A$ to $B$ then \pair{\forw{m}}{m^{-1}} is an adjunction pair from \filt{A} to \filt{B}. \filt{} defined in this way is a functor. \end{proposition} In fact we can think of \filt{} as going from adjunction pairs to adjunction pairs. In view of this we have the following theorem. \begin{theorem} The composition of the functors \filt{} and \ppp{} produces another hyperdoctrinal constraint system on \base{} with $(p^*)^{-1}$ {\bf right} adjoint to \forw{p^*}. \end{theorem} A simple calculation shows that $(p^*)^{-1}$ is in fact \forw{\exists_p}. We will often write $\pair{\forw{p^*}}{\forw{\exists_p}}$ rather than $\pair{\forw{p^*}}{(p^*)^{-1}}$, to emphasize the role of existential quantification. We consider closure operators over the lattices produced in the last hyperdoctrine. The formal definition of closure operators and some basic facts about them~\cite{Scott76,Compendium} appear in the appendix. \begin{notation} We write \clopf{L} for the set of closure operators on a lattice $L$. \end{notation} Now we define a new hyperdoctrine by taking the closure operators on the lattices that form the fibres of the previous hyperdoctrine. Once again we will define the new hyperdoctrine as the functorial image of the preceding hyperdoctrine. We recall the following technique, due to Dana Scott, for lifting adjunctions between posets to adjunctions between their endofunction spaces.
\begin{proposition} Suppose that \pair{f}{g} is a morphism\footnote{We only use the fact that they form an adjunction.} in \caladj{} from $L$ to $M$. Then \pair{f'}{g'} is a morphism from \self{L} to \self{M}, where $f' = \lambda h:\self{L}.\,f\circ h\circ g$ and $g' = \lambda k:\self{M}.\,g\circ k\circ f$. \end{proposition} The basic idea is to try to define an endofunctor from \caladj{} to \caladj{} as follows. We map an object $L$ (a complete algebraic lattice) to the set of all closure operators on $L$ ordered pointwise. For the arrow part of the functor, we map a morphism \pair{f}{g} from $L$ to $M$ to \pair{f'}{g'} as defined in the last proposition. Since the closure operators on $L$ are a sublattice of \self{L} we should have another adjoint pair and hence a morphism of \caladj{}. Unfortunately, however, given $h$, a closure operator on $L$, the function $f\circ h\circ g$ need not be a closure operator on $M$. In order to get around the above difficulty, we use a remarkable function, $V$, discussed by Scott~\cite{Scott76}. The function $V$ maps monotone functions in \self{D} to closure operators on $D$. It turns out that $V$ is itself a closure operator on \self{D} (and hence an element of \self{\self{D}}) whose fixed points are exactly the closure operators on $D$. In our discussion we use the characterization of closure operators in terms of their sets of fixed points. \begin{lemma} If $f$ is any monotone function in \self{D}, where $D$ is any poset, then $\{ x\in D\mid f(x)\leq x\}$ is closed under the formation of meets insofar as they exist. \end{lemma} \begin{defin} Suppose that $f$ is a monotone function in \self{D} where $D$ is a complete lattice. We define the function $V$ in \self{\self{D}} by $V(f) = clo(\{ x\in D\mid f(x)\leq x\})$. \end{defin} The lemma tells us that the definition just above makes sense. Thus, using $V$, we may take any monotone endofunction and convert it into a closure operator by applying $V$ to it. \begin{notation} Suppose that $f$ is a monotone function in \self{D}; we write \close{f} for $V(f)$. \end{notation} \begin{comment} Note that \close{f} is the least closure operator extensionally greater than $f$. \end{comment} \begin{lemma} If \pair{f}{g} is a morphism in \caladj{} from $L$ to $M$ then \pair{f'}{g'} is a morphism in \caladj{} from \clopf{L} to \clopf{M}, where $f' = \lambda h:\clopf{L}.\close{f\circ h\circ g}$ and $g' = \lambda k:\clopf{M}.\close{g\circ k\circ f}$. \end{lemma} We can now define the promised endofunctor on \caladj{}. \begin{defin} The functor \clopf{-} from \caladj{} to \caladj{} is defined as follows. It takes a complete algebraic lattice $L$ to the complete algebraic lattice of closure operators on $L$ ordered extensionally (reverse inclusion of the sets of fixed points). It takes a morphism \pair{f}{g} from $L$ to $M$ to the morphism \pair{f'}{g'} where $f' = \lambda h:\clopf{L}.\close{f\circ h\circ g}$ and $g' = \lambda k:\clopf{M}.\close{g\circ k\circ f}$. \end{defin} In order to prove that this indeed defines a functor, however, it is much easier to describe the arrow part in terms of its effect on sets of fixed points. \begin{lemma} \label{image} Suppose that \pair{f}{g} is a morphism from $L$ to $M$ with $h\in\clopf{L}$ and $k\in\clopf{M}$. Then $fix(\close{g\circ k\circ f}) = \image{g}{fix(k)}$ and $fix(\close{f\circ h\circ g}) = g^{-1}(fix(h))$. \end{lemma} With this in hand the proof of the following is easy. \begin{lemma} $\clopf{\pair{f_1}{g_1}}\circ\clopf{\pair{f_2}{g_2}} = \clopf{\pair{f_1\circ f_2}{g_1\circ g_2}}$.
\end{lemma} These lemmas allow us to establish the following theorem: \begin{theorem} The composition of the functors $\clopf{-}\circ\filt{-}\circ\ppp{-}$ produces a hyperdoctrinal constraint system. \end{theorem} Since we have defined \clopf{-} as acting on adjoint pairs, it is evident that the hyperdoctrinal structure on \caladj{} is carried over to the closure operators. One can now inspect Table~\ref{den-sem-table}, the denotational semantics of the concurrent constraint language, and see that the definition of the construct $\newin{X}{A}$ uses the concrete version of the existential quantification operation in the last hyperdoctrine. Parallel composition is modelled by the sup operation in these lattices, which of course comes ultimately from the meet operation (conjunction) in the first hyperdoctrine. \section{Conclusions} We view the results of this paper as the start of a larger investigation in the same spirit. The most important direction is to formulate a notion of higher-order constraint programming. Higher-order process calculi are starting to be studied in earnest, especially those related to the lambda calculus~\cite{Boudol89,Jagadeesan90}. Recently, Jagadeesan and Pingali~\cite{Jagadeesan92} have presented a higher-order version of Id, a functional language with logical variables, and Saraswat~\cite{Saraswat91u} has developed the notion of higher-order concurrent constraint programming. We expect to study properties of the category of constraint systems of~\cite{Saraswat91u} by extending the techniques of this paper. For example, we expect to obtain this category by applying some standard constructions, such as the Grothendieck fibration, to the hyperdoctrinal structures of Section 3. The other important direction to pursue is the study of other logical combinators and process combinators. In particular, we expect to explore the categorical implications of adding disjunction, negation, implication and universal quantification to the computational setting. \appendix \section{Closure operators} In order to avoid confusion in comparing our statements with those in the Compendium~\cite{Compendium}, we note that we say ``left adjoint'' where the Compendium would say ``lower adjoint'' and similarly we say ``right adjoint'' where the Compendium would say ``upper adjoint''. \begin{defin} Given a lattice $L$, a function $c$ from $L$ to $L$ is a {\em closure operator} if $c$ is monotone, idempotent (i.e.\ $c = c\circ c$), and increasing (extensive), i.e.\ $\forall x\in L.\ x\leq c(x)$. \end{defin} The following proposition, taken from pages 21 and 22 of~\cite{Compendium}, gives an equivalent characterization of closure operators. \begin{proposition} Let $f$ be a monotone function from $L$ to $L$. Let $f^{\circ}$ be the corestriction to the image, $f^{\circ}:L\rightarrow f(L)$, and let $f_{\circ}$ be the inclusion of the image of $f$ in $L$, $f_{\circ}: f(L)\rightarrow L$. Then $f$ is a closure operator iff $f^{\circ}$ is left adjoint to $f_{\circ}$. \end{proposition} The next two propositions relate a closure operator with its set of fixed points. \begin{proposition} The set of fixed points of a closure operator is closed under the formation of meets (infs) to the extent that they exist. \end{proposition} \begin{proposition} Let $L$ be a complete lattice. Let ${\bf Cl}(L)$ be the set of closure operators on $L$ ordered extensionally. Let ${\cal C}(L)$ be the set of meet-closed subsets of $L$ ordered by {\em reverse} inclusion.
Then the map $fix: {\bf Cl}(L)\rightarrow{\cal C}(L)$ that takes a closure operator to its image (which is its set of fixed points) is an order isomorphism. The inverse of $fix$ is the map $clo:{\cal C}(L)\rightarrow{\bf Cl}(L)$ given by $clo(S)(x) = \min(\up{x} \cap S) = \inf(\up{x} \cap S)$. \end{proposition} \begin{proposition} The collection of closure operators on a complete algebraic lattice, ordered extensionally, forms a complete algebraic lattice. \end{proposition} \begin{thebibliography}{10} \bibitem{Apt82} K.~R. Apt and M.~H. van Emden. \newblock Contributions to the theory of logic programming. \newblock {\em Journal of the ACM}, 29(3):841--862, 1982. \bibitem{Asperti90} A.~G. Asperti. \newblock {\em Categorical Topics in Computer Science}. \newblock Doctoral dissertation, Dipartimento di Informatica, University of Pisa. \bibitem{Boudol89} G.~Boudol. \newblock Towards a lambda-calculus for concurrent and communicating systems. \newblock In J.~Diaz, editor, {\em TAPSOFT 89, Lecture Notes in Computer Science 351}, pages 149--161. Springer-Verlag, 1989. \bibitem{Compendium} G.~Gierz, K.~H. Hofmann, K.~Keimel, J.~D. Lawson, M.~Mislove, and D.~S. Scott. \newblock {\em A Compendium of Continuous Lattices}. \newblock Springer-Verlag, 1980. \bibitem{Girard87} J.-Y. Girard. \newblock Linear logic. \newblock {\em Theoretical Computer Science}, 50:1--102, 1987. \bibitem{cylindric-algebra} Leon Henkin, J.~Donald Monk, and Alfred Tarski. \newblock {\em Cylindric Algebras (Part I)}. \newblock North Holland Publishing Company, 1971. \bibitem{Jagadeesan92} R.~Jagadeesan and K.~Pingali. \newblock Abstract semantics for a higher-order functional language with logic variables. \newblock In {\em Proceedings of the Nineteenth Annual ACM Symposium on Principles of Programming Languages}, 1992. \bibitem{Jagadeesan90} R.~Jagadeesan and P.~Panangaden. \newblock A domain-theoretic model of a higher-order process calculus. \newblock In M.~Paterson, editor, {\em Proceedings of the 17th International Colloquium on Automata, Languages and Programming}, pages 181--194. Springer-Verlag, 1990. \newblock Lecture Notes in Computer Science 443. \bibitem{Jagadeesan89} R.~Jagadeesan, P.~Panangaden, and K.~Pingali. \newblock A fully abstract semantics for a functional language with logic variables. \newblock In {\em Proceedings of the IEEE Symposium on Logic in Computer Science}, pages 294--303, 1989. \bibitem{Kowalski76} R.~A. Kowalski and M.~H. van Emden. \newblock The semantics of predicate logic as a programming language. \newblock {\em Journal of the ACM}, 23(4):733--742, 1976. \bibitem{Lawvere63} F.~W. Lawvere. \newblock Functorial semantics of algebraic theories. \newblock {\em Proc. Nat. Acad. Sci. U.S.A.}, 50:869--872, 1963. \bibitem{Saraswat90} Vijay Saraswat and Martin Rinard. \newblock Concurrent constraint programming. \newblock In {\em Proceedings of the Seventeenth Annual ACM Symposium on Principles of Programming Languages}, pages 232--245, 1990. \bibitem{Saraswat91} Vijay Saraswat, Martin Rinard, and Prakash Panangaden. \newblock Semantic foundations of concurrent constraint programming. \newblock In {\em Proceedings of the Eighteenth Annual ACM Symposium on Principles of Programming Languages}, 1991. \bibitem{Saraswat91u} Vijay Saraswat. \newblock Concurrency in the $\lambda$-calculus: higher-order concurrent constraint programming. \newblock Unpublished. \bibitem{saraswat-thesis-book} Vijay~A. Saraswat. \newblock {\em Concurrent Constraint Programming Languages}.
\newblock PhD thesis, Carnegie-Mellon University, January 1989. \newblock To appear, Doctoral Dissertation Award and Logic Programming Series, MIT Press, 1990. \bibitem{Scott76} D.~Scott. \newblock Data types as lattices. \newblock {\em SIAM Journal on Computing}, 5(3):522--587, 1976. \bibitem{scott-info-sys} D.~S. Scott. \newblock Domains for denotational semantics. \newblock In {\em Ninth International Colloquium on Automata, Languages and Programming}. Springer-Verlag, 1982. \newblock Lecture Notes in Computer Science 140. \bibitem{Seely83} R.~A.~G. Seely. \newblock Hyperdoctrines, natural deduction and the Beck conditions. \newblock {\em Zeitschr. f. math. Logik und Grundlagen d. Math.}, 29:505--542, 1983. \end{thebibliography} \end{document}