Algebraic Separation Logic
Abstract
We present an algebraic approach to separation logic. In particular, we give an algebraic characterisation for assertions
of separation logic, discuss different classes of assertions and prove abstract laws fully algebraically. After that, we use
our algebraic framework to give a relational semantics of the commands of a simple programming language associated
with separation logic. On this basis we prove the frame rule in an abstract and concise way, parametric in the operator
of separating conjunction, of which two particular variants are discussed. In this we also show how to algebraically
formulate the requirement that a command preserves certain variables. The algebraic view not only yields new
insights into separation logic but also shortens proofs due to a point-free representation. It is largely first-order and
hence enables the use of off-the-shelf automated theorem provers for verifying properties at an abstract level.
Keywords: formal semantics, separation logic, frame rule, algebra, semirings, quantales
1. Introduction
Two prominent formal methods for reasoning about the correctness of programs are Hoare logic [20] and Dijk-
stra’s wp-calculus [18]. These approaches, although foundational, lack expressiveness for shared mutable data struc-
tures, i.e., structures where updatable fields can be referenced from more than one point. To overcome this deficiency,
Reynolds, O’Hearn and others have developed separation logic for reasoning about complex and shared data struc-
tures [42, 46]. Their approach extends Hoare logic by a “spatial conjunction” and adds assertions to express separation
between memory regions. In particular, for arbitrary assertions p and q the conjunction p ∗ q asserts that p and q both
hold, but each for a separate part of the storage. This allows expressing aspects of locality, e.g., that mutation of a
single cell in the part that satisfies p will not affect the part that satisfies q. Hence, when reasoning about a program,
one may concentrate on the memory locations that are actually touched by its execution and then embed them into a
larger memory context.
The basic idea of the spatial conjunction is related to early work of Burstall [7] which was then explicitly described
in an intuitionistic logic by Reynolds and others. The classical version of separation logic was extended by Reynolds
with a command language that allows altering separate ranges and includes pointer arithmetic. O’Hearn extended this
language to concurrent programs that work on shared mutable data structures [41].
In this paper we present an abstract algebraic approach to separation logic. Our approach is based on quantales [39].
These structures, also called standard Kleene algebras [10], are a special case of the fundamental algebraic structure
of idempotent semirings, which have been used in various applications ranging from concurrency control [9, 21, 22]
to program analysis [30] and semantics [36]. In particular, there are already algebraic characterisations for Hoare
logic [28, 37] and the wp-calculus of Dijkstra [38], which serve as the basis of our current approach.
The algebraic approach achieves several goals. First, the view becomes more abstract, which leads to a consid-
erable reduction of detail and hence allows simpler and more concise proofs. On some occasions also additional
precision is gained. Second, the algebraic abstraction places the topic into a more general context and therefore allows
re-use of a large body of existing theory. Last but not least, since it is largely formulated in pure first-order logic, the approach enables the use of off-the-shelf automated theorem provers for verifying properties at an abstract level.
2. Basic Definitions
Separation logic, as an extension of Hoare logic, does not only allow reasoning about explicitly named program
variables, but also about anonymous variables in dynamically allocated storage. Therefore a program state in separa-
tion logic consists of a store and a heap. In the remainder we consistently write s for stores and h for heaps.
To simplify the formal treatment, one defines values and addresses as integers, stores and heaps as partial functions
from variables or addresses to values and states as pairs of stores and heaps:
Values = Z ,
{nil} ∪· Addresses ⊆ Values ,
Stores = V ⇀ Values ,
Heaps = Addresses ⇀ Values ,
States = Stores × Heaps ,
where V is the set of program variables, ∪· denotes the disjoint union on sets and M ⇀ N denotes the set of partial
functions between M and N. The constant nil is a value for pointers that denotes an improper reference like null in
programming languages like Java or C; by the above definitions, nil is not an address and hence heaps do not assign
values to nil.
As usual we denote the domain of a relation (or partial function) R by dom(R):
dom(R) =df {x : ∃ y : (x, y) ∈ R} .
In particular, the domain of a store dom(s) denotes all currently used program variables and dom(h) is the set of all
currently allocated addresses on a heap h.
As in [33] and for later definitions we also need an update operator. It is used to model changes in stores and
heaps. Let f1 and f2 be partial functions. Then we define
f1 | f2 =df f1 ∪ {(x, y) : (x, y) ∈ f2 ∧ x ∉ dom( f1 )} . (1)
The function f1 updates the function f2 with all possible pairs of f1 in such a way that f1 | f2 is again a partial function.
The domain of the right argument of ∪ is disjoint from that of f1 . In particular, f1 | f2 can be seen as an extension
of f1 to dom( f1 ) ∪ dom( f2 ). In later definitions we abbreviate an update {(x, y)} | f on a single variable or address by
omitting the set-braces and simply writing (x, y) | f instead.
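As an illustration only, the update operator can be prototyped on dictionaries; the following Python sketch (the helper name update is ours and not part of the formal development) mirrors Equation (1).

```python
def update(f1, f2):
    """The update f1 | f2 of Equation (1): keep all pairs of f1 and only those
    pairs of f2 whose argument is not in dom(f1)."""
    result = dict(f2)      # start from f2 ...
    result.update(f1)      # ... and let f1 override it on dom(f1)
    return result

s = {"x": 1, "y": 2}
# a singleton update (x, 7) | s as used later in the text:
print(update({"x": 7}, s))   # {'x': 7, 'y': 2}
```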
Expressions are used to denote values or Boolean conditions on stores and are independent of the heap, i.e., they
only need the store component of a given state for their evaluation. Informally, exp-expressions are simple arithmetical
expressions over variables and values, while bexp-expressions are Boolean expressions over simple comparisons and
true, false. Their syntax follows the usual grammars of arithmetic expressions over variables and integer constants and of Boolean expressions built from comparisons of such arithmetic expressions and the constants true and false.
The semantics eˢ of an expression e w.r.t. a store s is straightforward (assuming that all variables occurring in e
are contained in dom(s)). For example,
zˢ = z for all z ∈ Values = Z ,   trueˢ = true and falseˢ = false .
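A store can likewise be represented as a dictionary and expressions evaluated against it. The following sketch uses a small tuple encoding of expressions that is ours, not the paper's; it only illustrates that evaluation needs nothing but the store component of a state.

```python
# exp: an int (value), a str (variable), or ("+", e1, e2) etc.;
# bexp: a bool or ("=", e1, e2), ("<", e1, e2). The encoding is hypothetical.
def eval_exp(e, s):
    if isinstance(e, int):
        return e                               # a constant denotes itself
    if isinstance(e, str):
        return s[e]                            # assumes e is in dom(s)
    op, e1, e2 = e
    v1, v2 = eval_exp(e1, s), eval_exp(e2, s)
    return {"+": v1 + v2, "-": v1 - v2, "*": v1 * v2}[op]

def eval_bexp(b, s):
    if isinstance(b, bool):
        return b                               # true and false denote themselves
    op, e1, e2 = b
    v1, v2 = eval_exp(e1, s), eval_exp(e2, s)
    return {"=": v1 == v2, "<": v1 < v2}[op]

print(eval_exp(("+", "x", 1), {"x": 4}))       # 5
print(eval_bexp(("<", "x", 10), {"x": 4}))     # True
```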
3. Assertions
Assertions play an important rôle in separation logic. They are used as predicates to describe properties of heaps
and stores and as pre- or postconditions in programs, like in Hoare logic:
assert ::= bexp | ¬ assert | assert ∨ assert | ∀ var. assert |
emp | exp ↦ exp | assert ∗ assert | assert −∗ assert .
In the remainder we consistently write p, q and r for assertions of separation logic. Assertions are split into two
parts: the “classical” ones from predicate logic and four new ones that express properties of the heap. The former
are supplemented by the logical connectives ∧ , → and ∃ that are defined, as usual, by p ∧ q =df ¬ (¬ p ∨ ¬ q),
p → q =df ¬ p ∨ q and ∃ v : p =df ¬∀ v : ¬ p .
The semantics of assertions is given by the relation s, h |= p of satisfaction. Informally, s, h |= p holds iff the
state (s, h) satisfies the assertion p ; an assertion p is called valid iff p holds in every state and, finally, p is satisfiable
iff there exists a state (s, h) which satisfies p. The semantics is defined inductively as follows (e.g. [46]).
s, h |= b ⇔df bˢ = true
s, h |= ¬ p ⇔df s, h ⊭ p
s, h |= p ∨ q ⇔df s, h |= p or s, h |= q
s, h |= ∀ v : p ⇔df ∀ x ∈ Z : (v, x) | s, h |= p
s, h |= emp ⇔df h = ∅
s, h |= e1 ↦ e2 ⇔df h = {(e1ˢ , e2ˢ )}
s, h |= p ∗ q ⇔df ∃ h1 , h2 ∈ Heaps : dom(h1 ) ∩ dom(h2 ) = ∅ and
                  h = h1 ∪ h2 and s, h1 |= p and s, h2 |= q
s, h |= p −∗ q ⇔df ∀ h′ ∈ Heaps : (dom(h′ ) ∩ dom(h) = ∅ and s, h′ |= p)
                  implies s, h′ ∪ h |= q .
Here, b is a bexp-expression, p, q are assertions and e1 , e2 are exp-expressions. The first four clauses do not consider
the heap; they are well known from predicate logic or Hoare logic [20]. The remaining lines describe the new parts
in separation logic: For an arbitrary state (s, h), emp ensures that the heap h is empty and contains no addressable
cells. An assertion e1 ↦ e2 characterises states with the singleton heap that has exactly one cell at the address e1ˢ
with the value e2ˢ . To reason about more complex heaps, the separating conjunction ∗ is used. It allows expressing
properties of heaps that result from merging smaller disjoint heaps, i.e., heaps with disjoint domains. Note that there
is no separating operator for stores. Later, in Section 8, we will introduce an operator ∗s for splitting stores and heaps
simultaneously.
A state (s, h) satisfies the separating implication p −∗ q iff, whenever its heap h is extended with a disjoint heap h′
satisfying p , the state with identical store s and the combined heap h ∪ h′ is guaranteed to satisfy q (cf. Figure 1). The
diagram for h ∪ h′ might suggest that the heaps are adjacent after the join. But the intention is only to illustrate that
the united heap satisfies q.
[Figure 1: the separating implication p −∗ q; extending the current heap h with a disjoint heap h′ satisfying p yields a combined heap h ∪ h′ satisfying q.]
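The heap-specific clauses of the satisfaction relation can be prototyped directly. In the following sketch (our own encoding; assertions are Python predicates on a pair of dictionaries, and all helper names are hypothetical) ∗ enumerates heap splittings and −∗ enumerates disjoint heap extensions over a small finite universe.

```python
from itertools import combinations, product

emp = lambda s, h: h == {}

def points_to(x, v):
    """x |-> v for a program variable x and a fixed value v (a simple special case)."""
    return lambda s, h: h == {s[x]: v}

def star(p, q):
    """p * q: the heap splits into two disjoint parts satisfying p and q."""
    def sat(s, h):
        cells = list(h.items())
        for r in range(len(cells) + 1):
            for part in combinations(cells, r):
                h1 = dict(part)
                h2 = {a: v for a, v in cells if a not in h1}
                if p(s, h1) and q(s, h2):
                    return True
        return False
    return sat

def wand(p, q, addresses=(1, 2, 3), values=(0, 1)):
    """p -* q, checked against all disjoint extensions over a finite universe."""
    def sat(s, h):
        free = [a for a in addresses if a not in h]
        for r in range(len(free) + 1):
            for dom in combinations(free, r):
                for vals in product(values, repeat=r):
                    h1 = dict(zip(dom, vals))
                    if p(s, h1) and not q(s, {**h, **h1}):
                        return False
        return True
    return sat

s = {"x": 1}
print(wand(points_to("x", 1), points_to("x", 1))(s, {}))      # True: the empty heap
print(wand(points_to("x", 1), points_to("x", 1))(s, {2: 5}))  # False: cell 2 remains
```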
4. Quantales
To present our algebraic semantics of separation logic in the next section, we now prepare the algebraic background
by defining the main algebraic structure for assertions.
Definition 4.1.
(a) A quantale [39, 47] is a structure (S , ≤, ·, 1) where (S , ≤) is a complete lattice, (S , ·, 1) is a monoid and multipli-
cation · distributes over arbitrary suprema: for a ∈ S and T ⊆ S ,
a · (⊔T ) = ⊔{a · b : b ∈ T } and (⊔T ) · a = ⊔{b · a : b ∈ T } , (2)
where, for U ⊆ S , the element ⊔U is the supremum of U. The least and greatest element of S are denoted by
0 and >, resp. The infimum and supremum of two elements a, b ∈ S are denoted by a u b and a + b, resp. We
assume for the rest of this paper that · binds tighter than u and +. The definition implies that · is strict, i.e., we
have 0 · a = 0 = a · 0 for all a ∈ S . The notion of a quantale is equivalent to that of a standard Kleene algebra [10]
and a special case of the notion of an idempotent semiring.
(b) A quantale is called commutative iff a · b = b · a for all a, b ∈ S .
(c) A quantale is called Boolean iff its underlying lattice is distributive and complemented, whence a Boolean algebra.
In such a structure, the complement of an element a is denoted by a̅ . Moreover, we define the greatest element > as 0̅ .
An important Boolean quantale is REL, the algebra of binary relations over a set under set inclusion ⊆ and relation
composition ; .
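The REL quantale is easy to experiment with on a small finite base set; the following sketch (our own helper names, a spot check rather than a proof) implements relational composition and checks the unit and distributivity laws of Definition 4.1 on random relations.

```python
import random
from itertools import product

U = range(3)                                   # small base set
I = {(x, x) for x in U}                        # identity relation, the unit
TOP = set(product(U, U))                       # greatest element

def compose(R, S):
    """Relational composition R ; S."""
    return {(x, z) for (x, y) in R for (y2, z) in S if y == y2}

def sup(rels):
    """Supremum of a family of relations, i.e., their union."""
    return set().union(*rels)

random.seed(0)
A, *T = [{p for p in TOP if random.random() < 0.4} for _ in range(4)]

assert compose(A, I) == A == compose(I, A)                       # unit laws
assert compose(A, sup(T)) == sup([compose(A, B) for B in T])     # distributivity (2)
assert compose(sup(T), A) == sup([compose(B, A) for B in T])     # distributivity (2)
print("quantale laws hold on this sample")
```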
A useful property in a Boolean quantale is shunting:
a u b ≤ c ⇔ b ≤ a̅ + c . (shu)
In particular, a u b ≤ 0 ⇔ b ≤ a̅ .
Definition 4.2.
(a) In any quantale, the right residual a\b [4] exists and is characterised by the Galois connection
x ≤ a\b ⇔ a · x ≤ b .
This specifies a\b as the greatest solution of the inequation a · x ≤ b. Therefore, \ is a pseudo-inverse to composition, which is useful in many circumstances.
(b) Symmetrically, the left residual b/a can be defined as the greatest solution of the inequation x · a ≤ b. If the underlying quantale is commutative then both residuals coincide, i.e., a\b = b/a. In REL, one has R1\R2 = ¬(R1˘ ; ¬R2) and R1/R2 = ¬(¬R1 ; R2˘), where ˘ denotes relational converse and ¬ complementation.
(c) In a Boolean quantale, the right detachment a⌊b can be defined based on the left residual as the complement of a̅/b. By de Morgan's laws, the Galois connection for / transforms into the exchange law
a⌊b ≤ x ⇔ x̅ · b ≤ a̅ (exc)
that generalises the Schröder rule of relational calculus. An important consequence is the Dedekind rule [27]
a u (b · c) ≤ (a⌊c u b) · c . (Ded)
In a Boolean quantale, for each a the element a u 1 is a test with complement ¬(a u 1) = a̅ u 1; in particular, every
element below 1 is a test. Moreover, according to [35] for a test t and arbitrary elements a, b ∈ S
t · (a u b) = t · a u b = t · a u t · b. (testdist)
And as a direct consequence, the equation t1 · a u t2 · a = t1 · t2 · a holds for tests t1 , t2 and an arbitrary element a.
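For completeness, here is a short derivation (ours) of that consequence, using only (testdist), commutativity and idempotence of the meet (written u in the text, $\sqcap$ below):

```latex
\begin{align*}
  t_1 \cdot a \sqcap t_2 \cdot a
    &= t_1 \cdot (a \sqcap t_2 \cdot a)      && \text{(testdist) with } t = t_1 \\
    &= t_1 \cdot (t_2 \cdot a \sqcap a)      && \text{commutativity of the meet} \\
    &= t_1 \cdot t_2 \cdot (a \sqcap a)      && \text{(testdist) with } t = t_2 \\
    &= t_1 \cdot t_2 \cdot a                 && \text{idempotence of the meet}
\end{align*}
```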
Algebraic structures in general are suitable for automated theorem proving. However, quantales are not easy to
encode and perform rather badly [12]. This is mainly due to the two distributivity laws (2). If one only uses the
distributivity laws of the form a · (b + c) = a · b + a · c and (b + c) · a = b · a + c · a one can use any first-order automated
theorem proving system (ATP system). Examples are Prover9 [32] and Waldmeister [6]. This simpler structure—
an idempotent semiring—is particularly suitable for ATP systems [24, 25]. For the purpose of this paper, we have
proven most of the theorems using ATP systems. Most of the input files can be found at a web page [23]. However, to
demonstrate the simplicity of our algebra, we will also present most of the proofs within the paper.
In this section we give an algebraic interpretation for the semantics of separation logic. The main idea is to switch
from the satisfaction-based semantics for single states to an equivalent set-based pointfree one where every assertion
is associated with the set of all states satisfying it. This simplifies proofs considerably. For an arbitrary assertion p we
therefore define its set-based semantics as
[[ p ]] =df {(s, h) : s, h |= p} .
Sets of states will be the elements of our algebra, which later will be abstracted to an arbitrary Boolean quantale. For
the standard Boolean connectives we obtain
[[ ¬ p ]] = {(s, h) : s, h ⊭ p} = States − [[ p ]] ,
[[ p ∨ q ]] = [[ p ]] ∪ [[ q ]] ,
[[ p ∧ q ]] = [[ p ]] ∩ [[ q ]] ,   [[ p → q ]] = (States − [[ p ]]) ∪ [[ q ]] ,
[[ ∀ v : p ]] = {(s, h) : ∀ x ∈ Z : (v, x) | s, h |= p } = ⋂_{x ∈ Z} {(s, h) : ((v, x) | s, h) ∈ [[ p ]]} ,
[[ ∃ v : p ]] = States − [[ ∀ v : ¬ p ]] = {(s, h) : ∃ x ∈ Z : (v, x) | s, h |= p } = ⋃_{x ∈ Z} {(s, h) : ((v, x) | s, h) ∈ [[ p ]]} ,
[[ emp ]] = {(s, h) : h = ∅} ,
[[ e1 ↦ e2 ]] = {(s, h) : h = {(e1ˢ , e2ˢ )}} .
Theorem 5.1. The structure AS =df (P(States), ⊆ , ∪· , [[ emp ]]) is a commutative and Boolean quantale with P+Q =
P ∪ Q.
The proof is by straightforward calculations; it can be found in [11]. It is easy to show that [[true]] is the greatest element
in the above quantale, i.e., [[true]] = >, since every state satisfies the assertion true. This implies immediately that
[[true]] is the neutral element for u. However, in contrast to addition ∪, multiplication ∪· is in general not idempotent.
Example 5.2. In AS, the set [[ x ↦ 1 ]] consists of all states (s, h) that have the single-cell heap {(s(x), 1)}. We calculate
[[ (x ↦ 1) ∗ (x ↦ 1) ]]
= [[ x ↦ 1 ]] ∪· [[ x ↦ 1 ]]
= {(s, h ∪ h′ ) : (s, h), (s, h′ ) ∈ [[ x ↦ 1 ]] ∧ dom(h) ∩ dom(h′ ) = ∅}
= ∅ .
In the last step, the states (s, h) and (s, h′ ) would both have to consist of that particular single-cell heap. Hence the domains of the heaps would not be disjoint and no splitting exists, so that the result is the empty set. □
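Example 5.2 can also be replayed at the level of sets of states. The sketch below (our own encoding over a deliberately tiny universe of addresses, values and a single store) computes the lifted operation ∪· and confirms that [[x ↦ 1]] ∪· [[x ↦ 1]] is empty whereas [[x ↦ 1]] ∪· [[emp]] is [[x ↦ 1]].

```python
from itertools import product

ADDRESSES, VALUES = (1, 2), (0, 1)
STORE = frozenset({("x", 1)})                  # one fixed store s with s(x) = 1

def heaps():
    """All heaps over the tiny universe, encoded as frozensets of (address, value) pairs."""
    for bits in product((False, True), repeat=len(ADDRESSES)):
        dom = [a for a, b in zip(ADDRESSES, bits) if b]
        for vals in product(VALUES, repeat=len(dom)):
            yield frozenset(zip(dom, vals))

STATES = {(STORE, h) for h in heaps()}

def sem(pred):
    """[[ p ]]: all states of the universe satisfying the predicate pred."""
    return {(s, h) for (s, h) in STATES if pred(dict(s), dict(h))}

def sep(P, Q):
    """The lifted separating conjunction P ∪· Q on sets of states."""
    return {(s, h1 | h2)
            for (s, h1) in P for (t, h2) in Q
            if s == t and not {a for a, _ in h1} & {a for a, _ in h2}}

x_to_1 = sem(lambda s, h: h == {s["x"]: 1})
emp    = sem(lambda s, h: h == {})

print(sep(x_to_1, x_to_1) == set())   # True: both parts would need the same cell
print(sep(x_to_1, emp) == x_to_1)     # True
```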
The second property denotes isotony of separating conjunction. More laws and examples can be found in [11].
For the separating implication the set-based semantics extracted from the definition in Section 3 is
[[ p −∗ q ]] = {(s, h) : ∀ h′ ∈ Heaps : (dom(h′ ) ∩ dom(h) = ∅ and (s, h′ ) ∈ [[ p ]]) implies (s, h′ ∪ h) ∈ [[ q ]]} .
In Reynolds's works it is often only stated that these laws follow directly from the definition, although many authors
use these laws as an equivalent definition and refer to Reynolds. In fact the separating implication was first defined
in [26] and used there as a particular instantiation of a Boolean BI algebra [45], i.e., a Boolean algebra with an
additional residuated commutative monoid structure. Hence Lemma 5.4 expresses exactly the expected equalities.
As a further example we prove the algebraic counterpart of the inference rule
q ∗ (q −∗ p) ⇒ p .
Lemma 5.6. Let S be a quantale. For a, b ∈ S the inequality b · (b\a) ≤ a holds. And therefore also (a/b) · b ≤ a.
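A one-step argument (ours) from the Galois connection of Definition 4.2: instantiating x := b\a turns the trivial inequality on the left into the claim on the right.

```latex
b\backslash a \;\le\; b\backslash a
  \quad\Longleftrightarrow\quad
  b \cdot (b\backslash a) \;\le\; a
  \qquad\text{(Galois connection with } x := b\backslash a\text{)}
```

The second inequality of the lemma follows symmetrically from the Galois connection for the left residual.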
The proof of Lemma 5.8 as well as other proofs are deferred to Appendix A.
In the appendix of [13], a couple of properties for septraction are listed. They can all easily be verified (for example
using off-the-shelf ATP systems).
So far, we have derived an algebraic structure for assertions and have presented the correspondence between the
operations of separation logic and the algebra. We sum up this correspondence in Table 1.
In separation logic one distinguishes different classes of assertions [46]. We now give algebraic characterisations
for the most important classes of assertions, namely intuitionistic, pure, precise and supported assertions. Intuitionistic
assertions do not describe the domain of a heap exactly. Hence, when using these assertions one does not know whether
the heap contains additional anonymous cells. This is often the case when pointer references to some portions of a
heap are lost. Pure assertions are independent of the heap and therefore only express conditions on store variables.
In contrast to intuitionistic assertions, the precise ones point out a unique subheap which is relevant to its predicate.
Finally, supported assertions guarantee that any two compatible heaps satisfying the assertion have a common subheap for which it already
holds. This class is e.g. used in [46] when reasoning about directed acyclic graphs.
Intuitively, if a heap h satisfies an intuitionistic assertion p then every larger heap, i.e., h extended by arbitrary
cells, still satisfies p; formally, ∀ s, h, h′ : (h ⊆ h′ ∧ s, h |= p) ⇒ s, h′ |= p. In AS this is equivalent to the inequation
[[ p ]] ∪· [[ true ]] ⊆ [[ p ]] .
Proof.
∀ s, h, h′ : (h ⊆ h′ ∧ s, h |= p) ⇒ s, h′ |= p
⇔ {[ Definition of true ]}
∀ s, h, h′ : (h ⊆ h′ ∧ s, h |= p ∧ s, (h′ − h) |= true) ⇒ s, h′ |= p
⇔ {[ set theory ]}
∀ s, h, h′ : (s, h |= p ∧ s, (h′ − h) |= true ∧ dom(h) ∩ dom(h′ − h) = ∅ ∧ h′ = h ∪ (h′ − h))
⇒ s, h′ |= p
⇔ {[ ⇒ : h″ = h′ − h, ⇐ : dom(h) ∩ dom(h″ ) = ∅ ∧ h′ = h ∪ h″ ⇒ h″ = h′ − h ]}
∀ s, h, h′ : (∃ h″ : s, h |= p ∧ s, h″ |= true ∧ dom(h) ∩ dom(h″ ) = ∅ ∧ h′ = h ∪ h″ ) ⇒ s, h′ |= p
⇔ {[ logic ]}
∀ s, h′ : (∃ h, h″ : s, h |= p ∧ s, h″ |= true ∧ dom(h) ∩ dom(h″ ) = ∅ ∧ h ∪ h″ = h′ ) ⇒ s, h′ |= p
⇔ {[ Definition of ∗ ]}
∀ s, h′ : s, h′ |= p ∗ true ⇒ s, h′ |= p
□
Lifting this to an abstract level motivates the following definition.
Definition 6.2. In an arbitrary Boolean quantale S an element a is called intuitionistic iff it satisfies
a·>≤a. (5)
This inequation can be strengthened to an equation since its converse holds for arbitrary Boolean quantales. Elements
of the form a · > are also called vectors or ideals. Those elements are well known, and therefore we obtain many
properties for free (e.g. [48, 31]). We only list some of them to show again the advantages of the algebra.
In particular, we focus on laws that describe the interaction of · and u using intuitionistic assertions.
Lemma 6.3. Consider a commutative Boolean quantale S , intuitionistic elements a, a′ ∈ S and arbitrary elements
b, c ∈ S . Then
(a) (a u b) · > ≤ a ;
(b) a · b ≤ a u (b · >) ;
(c) (a u b) · c ≤ a u (b · c) ;
(d) a · a′ ≤ a u a′ .
Proof. To show Part (a) we calculate (a u b) · > ≤ a · > ≤ a. For a proof of Part (b) we know a · b ≤ a · > ≤ a and
a · b ≤ b · > by isotony of · and the assumption. The Laws (c) and (d) can be proved analogously. □
Using the quantale AS, it is easy to see that none of these inequations can be strengthened to an equation. In
particular, multiplication (separating conjunction) and meet need not coincide.
Example 6.4. Consider a =df a′ =df [[ x ↦ 1 ∗ true ]] = [[ x ↦ 1 ]] ∪· [[ true ]]. By this definition it is obvious that
a and a′ are intuitionistic. The definitions of Section 3 then immediately imply
a ∩ a′ = [[ x ↦ 1 ]] ∪· [[ true ]] ,
a ∪· a′ = [[ x ↦ 1 ]] ∪· [[ true ]] ∪· [[ x ↦ 1 ]] ∪· [[ true ]] = ∅ .
The next class will be a proper subset of intuitionistic assertions where these operations indeed coincide.
Theorem 6.5. In AS an element [[ p ]] is pure iff it satisfies, for all [[ q ]] and [[ r ]],
[[ p ]] ∪· [[ true ]] ⊆ [[ p ]] and [[ p ]] ∩ ([[ q ]] ∪· [[ r ]]) ⊆ ([[ p ]] ∩ [[ q ]]) ∪· ([[ p ]] ∩ [[ r ]]) .
For the proof—which is presented later—we need a number of auxiliary laws. The above theorem motivates the
following definition.
Definition 6.6. In an arbitrary Boolean quantale S an element a is called pure iff it satisfies, for all b, c ∈ S ,
a·>≤a, (7)
a u (b · c) ≤ (a u b) · (a u c) . (8)
Lemma 6.9. In a commutative Boolean quantale, Property (8) is equivalent to a⌊> ≤ a, where a⌊> forms the downward closure of a.
Lemma 6.10. In a commutative Boolean quantale, an element a is pure iff one of the following equivalent properties
is satisfied.
(a) a · > ≤ a and a̅ · > ≤ a̅ .
(b) a = (a u 1) · > .
(c) (a u b) · c = a u b · c .
A proof can be found in Appendix A. Part (a) also holds for non-commutative quantales; Part (b) characterises
pure elements as fixed points. Part (c) is according to [3]. Since the underlying quantale is commutative, there is also
the dual of Part (c), namely b · (a u c) = a u b · c.
With these characterisations we can now prove the equivalence between the formulation in separation logic and
the algebraic one.
Proof of Theorem 6.5. By Lemma 6.10(b) and the definition of the elements of AS it is sufficient to show that the
following formulas are equivalent in separation logic:
∀ h, h′ ∈ Heaps : (s, h |= p ⇔ s, h′ |= p) , (9)
∀ h ∈ Heaps : (s, h |= p ⇔ s, h |= (p ∧ emp ) ∗ true) .
Since both assertions are universally quantified over states we omit that quantification in the remainder and only keep
the quantifiers on heaps. Before proving this equivalence we simplify s, h |= (p ∧ emp ) ∗ true. Using the definitions
of Section 3, we get for all h ∈ Heaps
s, h |= (p ∧ emp ) ∗ true
⇔ ∃ h1 , h2 ∈ Heaps : dom(h1 ) ∩ dom(h2 ) = ∅ and h = h1 ∪ h2
and s, h1 |= p and s, h1 |= emp and s, h2 |= true
⇔ ∃ h1 , h2 ∈ Heaps : dom(h1 ) ∩ dom(h2 ) = ∅ and h = h1 ∪ h2
and s, h1 |= p and h1 = ∅
⇔ ∃ h2 ∈ Heaps : h = h2 and s, ∅ |= p
⇔ s, ∅ |= p .
Instantiating Equation (9) and using this result we obtain
∀ h, h′ ∈ Heaps : (s, h |= p ⇔ s, h′ |= p)
⇒ ∀ h ∈ Heaps : (s, h |= p ⇔ s, ∅ |= p)
⇔ ∀ h ∈ Heaps : (s, h |= p ⇔ s, h |= (p ∧ emp ) ∗ true) .
Corollary 6.11. Pure elements form a Boolean lattice, i.e., they are closed under +, u and complementation. Moreover the lattice is
complete.
The following lemma shows that in the complete lattice of pure elements meet and join coincide with composition
and sum, respectively.
Lemma 6.12. Consider a commutative Boolean quantale S , pure elements a, a′ ∈ S and arbitrary elements b, c ∈ S .
Then
(a) a · b = a u b · > ;
(b) a · a′ = a u a′ ; in particular a · a = a and a · a̅ = 0.
Proof. For a proof of Part (a) we calculate, using Lemma 6.10(c), a u b · > = a u > · b = (a u >) · b = a · b. To show
Part (b), we use again Lemma 6.10(c) and neutrality of 1 w.r.t. · to obtain a · a′ = a u a′ · 1 = a u a′ . □
Many further properties, in particular for the interaction of pure assertions with residuals and detachments, can be
found in the appendix of [13]. In some cases we have situations analogous to the ∗-operator in Lemma 6.10(c),
where pure assertions can be pulled out of each argument of residuals and detachments.
6.3. Precise Assertions
Next we focus on the class of precise assertions. They play a major rôle in characterising best local-actions in [8].
Intuitively they point out precise portions of heaps that satisfy the predicate. An assertion p is called precise iff for all
states (s, h), there is at most one subheap h′ of h for which s, h′ |= p, i.e.,
∀ s, h, h1 , h2 : (s, h1 |= p ∧ s, h2 |= p ∧ h1 ⊆ h ∧ h2 ⊆ h) ⇒ h1 = h2 .
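On a finite universe the pointwise definition can be checked mechanically. The sketch below (our own helpers, with heaps as dictionaries) confirms, for instance, that x ↦ 1 is precise while x ↦ 1 ∗ true is not.

```python
from itertools import combinations

def subheaps(h):
    """All subheaps of a finite heap h."""
    cells = list(h.items())
    for r in range(len(cells) + 1):
        for part in combinations(cells, r):
            yield dict(part)

def is_precise(pred, stores, heaps):
    """For every state there is at most one subheap satisfying pred."""
    for s in stores:
        for h in heaps:
            sats = {frozenset(h1.items()) for h1 in subheaps(h) if pred(s, h1)}
            if len(sats) > 1:
                return False
    return True

stores = [{"x": 1}]
heaps  = [{}, {1: 1}, {1: 1, 2: 0}]

x_to_1          = lambda s, h: h == {s["x"]: 1}    # x |-> 1
x_to_1_and_more = lambda s, h: h.get(s["x"]) == 1  # x |-> 1 * true

print(is_precise(x_to_1, stores, heaps))           # True
print(is_precise(x_to_1_and_more, stores, heaps))  # False
```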
Theorem 6.13. In AS an element [[ p ]] is precise iff it satisfies, for all [[ q ]] and [[ r ]],
([[ p ]] ∪· [[ q ]]) ∩ ([[ p ]] ∪· [[ r ]]) ⊆ [[ p ]] ∪· ([[ q ]] ∩ [[ r ]]) .
Definition 6.14. In an arbitrary Boolean quantale S an element a is called precise iff it satisfies, for all b, c ∈ S ,
(a · b) u (a · c) ≤ a · (b u c) . (11)
Inequation (11) can be strengthened to an equation since a · (b u c) ≤ (a · b) u (a · c) holds by isotony. Next we give
some closure properties for this assertion class which again can be proved fully algebraically.
Lemma 6.15. If a and a′ are precise then so is a · a′ , i.e., precise assertions are closed under multiplication.
Proof. The proof is by straightforward calculation. For arbitrary elements b, c and precise elements a, a′ , we have
(a · a′ · b) u (a · a′ · c) ≤ a · ((a′ · b) u (a′ · c)) ≤ a · a′ · (b u c) ,
using preciseness of a and then of a′ . □
Lemma 6.16. If a is precise and a′ ≤ a then a′ is precise, i.e., precise assertions are downward closed.
A further class consists of assertions all of whose satisfying heaps cover the full set of addresses, i.e., ∀ s, h : s, h |= p ⇒ dom(h) = Addresses. In AS such an element [[ p ]] is characterised by
[[ p ]] ∪· [[ ¬ emp ]] ⊆ ∅ .
Proof.
∀ s, h : s, h |= p ⇒ dom(h) = Addresses
⇔ ∀ s, h : (s, h |= p ⇒ (∀ h′ : h ⊆ h′ ⇒ h′ ⊆ h))
⇔ ∀ s, h, h′ : (s, h |= p ⇒ (h ⊆ h′ ⇒ h′ ⊆ h))
⇔ ∀ s, h, h′ : (s, h |= p ⇒ ¬(h ⊆ h′ ∧ h′ − h ≠ ∅))
⇔ ∀ s, h′ : ¬(∃ h : s, h |= p ∧ h ⊆ h′ ∧ h′ − h ≠ ∅)
⇔ ∀ s, h′ : ¬(∃ h, h″ : s, h |= p ∧ h″ ≠ ∅ ∧ dom(h) ∩ dom(h″ ) = ∅ ∧ h′ = h ∪ h″ )
⇔ ∀ s, h′ : s, h′ |= p ∗ ¬ emp ⇒ false
□
Consequently in our algebraic setting we characterise this class as follows.
Definition 6.19. In an arbitrary Boolean quantale S an element a is called fully allocated iff
a · 1̅ ≤ 0 . (13)
Proof. We calculate
a · b = a · ((b u 1) + (b u 1̅ )) = a · (b u 1) + a · (b u 1̅ ) = a · (b u 1) ,
Proof of 6.21. For a fully allocated element a and arbitrary b and c, we then get
Again this holds by Theorem 6.22, (testdist), isotony of ·, u and the fact that · coincides with u on test elements. □
An assertion p is called supported iff any two compatible heaps satisfying it have a common subheap that also satisfies it, i.e.,
∀ s, h1 , h2 : (h1 , h2 are compatible ∧ s, h1 |= p ∧ s, h2 |= p)
⇒ ∃ h′ : h′ ⊆ h1 ∧ h′ ⊆ h2 ∧ s, h′ |= p .
By the assumption that h1 and h2 are compatible it is meant that they agree on their intersection, i.e., h1 ∪ h2 is again a
function. However, the store s is fixed in the definition. Later on, we will introduce an operator that also allows
decomposition of the store.
The following characterisation of supported elements is novel.
Theorem 6.23. In AS an element [[ p ]] is supported iff it satisfies, for all [[ q ]] and [[ r ]],
([[ p ]] ∪· [[ q ]]) ∩ ([[ p ]] ∪· [[ r ]]) ⊆ [[ p ]] ∪· (([[ q ]] ∪· [[ true ]]) ∩ ([[ r ]] ∪· [[ true ]])) .
The key idea to prove Theorem 6.23 is to use predicates p which precisely describe one heap and one store, i.e.,
the set [[p]] is a singleton set that contains only a single state. Therefore we state directly [[s, h]] = {(s, h)}. Before
showing the proof we need two auxiliary lemmas. The first is simple set theory — therefore we skip the proof.
Lemma 6.24. We assume three arbitrary sets A, B and C. If B ⊆ C then we have C − A ⊆ C − B ⇔ B ⊆ A.
Hence, if A ⊆ C and B ⊆ C then C − A = C − B ⇔ A = B.
Next we give some simple properties of [[s, h]].
The lengthy, but straightforward proof can be found in Appendix A. Now we give a proof of the above theorem.
⇔ {[ definition of ∗ ]}
s, h |= p ∗ (q ∗ true ∧ r ∗ true)
Next we show the other direction. For that we assume, for all q,r,
⇔ {[ definition of ∗ ]}
∃ h′ : h′ ⊆ h1 ∪ h2 ∧ s, h′ |= p ∧
⇔ {[ set theory ]}
∃ h′ : s, h′ |= p ∧ h′ ⊆ h1 ∧ h′ ⊆ h2
□
As before, this can be lifted to the abstract level of quantales.
Definition 6.26. In an arbitrary Boolean quantale S an element a is supported iff it satisfies for arbitrary b,c
a · b u a · c ≤ a · (b · > u c · >) .
Following this characterisation of supported assertions we now give properties for this class of assertion which
can be completely derived algebraically. As before, we refer to the appendix of [13] for further properties.
Proof. By Lemma 6.12(a), associativity, commutativity and idempotence of u, isotony, and Lemma 6.12(a) again:
a·bua·c
= (a u b · >) u (a u c · >)
= a u (b · > u c · >)
≤ a u (b · > u c · >) · >
= a · (b · > u c · >)
□
Proof. a · b u a · c ≤ a · (b u c) ≤ a · (b · > u c · >). This holds by the definition of precise elements and isotony. □
Proof. For supported elements a and a′ we calculate, using the definition of supported elements and isotony,
7. Commands
After dealing with the assertions of separation logic, we now turn to the commands in the simple imperative
language associated with it.
Let us explain these constructs informally, where we skip the well-known first three lines. For a (boolean) expression e we denote its value w.r.t. a store s by eˢ .
In a given state (s, h) the command v := cons (e1 , ..., en ) allocates n cells with eiˢ as the contents of the i-th cell.
The cells have to form an unused contiguous region somewhere on the heap; the concrete allocation is chosen non-
deterministically [50].
The address of the first cell is stored in v while the rest of the cells can be addressed indirectly via the start address.
A dereferencing assignment v := [e] assumes that e is an exp-expression and the value eˢ (corresponding to *e in
C) is an allocated address on the heap, i.e., eˢ ∈ dom(h) for the current heap h. In particular, after its execution, the
value of v is the contents of a dereferenced heap cell.
Conversely, an execution of [e1 ] := e2 assigns the value of the expression on the right hand side to the cell whose
address is the value of the left hand side.
Finally, the command dispose e is used for deallocating the heap cell with address eˢ . After its execution the
disposed cell is not valid any more, i.e., dereferencing that cell would cause a fault in the program execution.
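To make the relational reading of these commands concrete before the formal definitions, here is a small sketch (ours; commands are modelled as functions from a state to a list of successor states, with None indicating a faulting run, and expressions are passed as functions from stores to values).

```python
def mutate(e1, e2):
    """[e1] := e2 : overwrite the allocated cell at address e1^s with the value e2^s."""
    def step(s, h):
        a = e1(s)
        if a not in h:                 # mutating an unallocated address faults
            return None
        return [(s, {**h, a: e2(s)})]
    return step

def dispose(e):
    """dispose e : deallocate the cell at address e^s."""
    def step(s, h):
        a = e(s)
        if a not in h:
            return None
        return [(s, {k: v for k, v in h.items() if k != a})]
    return step

s, h = {"x": 1}, {1: 7, 2: 0}
print(mutate(lambda st: st["x"], lambda st: 5)(s, h))   # [({'x': 1}, {1: 5, 2: 0})]
print(dispose(lambda st: st["x"])(s, h))                # [({'x': 1}, {2: 0})]
```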
We now define a formal semantics for this language by inductively assigning to every program P formed according
to the above grammar a command [[P]]c ∈ Cmds in the sense of the previous section. In particular, [[skip]]c =df I. For
all language constructs we also specify the correctness conditions, like that the free variables of all program parts are
within the domains of the stores involved and that only allocated heap cells may be accessed.
As auxiliaries we define the functions FV and MV that assign to every program P the set of its free and modified
variables, resp., where in the case of FV we assume that for (boolean) expressions a corresponding function with the
same name is predefined.
To abbreviate the semantic definition, we use a convention similar to that of the refinement calculus (e.g. [1]). We
characterise relations by formulas linking the input states (s, h) and output states (s′ , h′ ). If F is such a formula then
R ≙ F abbreviates the clause (s, h) R (s′ , h′ ) ⇔df F. For use in the definition of if and while we also assign a semantics
to boolean expressions b viewed as assertion commands.
We now discuss inference rules for commands. By the above abstraction these rules become simple consequences
of the well established general relational view of commands. We only sketch the rule for mutation.
Theorem 7.4. Let p be an assertion and e, e′ be exp-expressions. Then valid triples for mutation commands are
An important ingredient of the separation calculus is the frame rule [42]. It describes the use of the separating con-
junction in pre- and postconditions for commands and allows local reasoning. For assertions p, q and r and command
C it reads
{p} C {q}
───────────────────
{p ∗ r} C {q ∗ r}
The premise ensures that starting the execution of C in a state satisfying p ends in a state satisfying q. Furthermore the
conclusion says that extending the initial and final heaps consistently with disjoint heaps will not invalidate the triple
in the premise. Hence a “local” proof of {p} C {q} will extend to a “more global” one in the additional heap context
r. A standard proof [50] uses three assumptions: a side condition stating that no variable modified by C occurs free in r,
safety monotonicity and the frame property. Safety monotonicity guarantees that if C is executable from a state, it can
also run on a state with a larger heap; the frame property states that every execution of C can be tracked back to an
execution of C running on states with a possibly smaller heap.
We will present a modified, more generally applicable rule using algebraic counterparts for the conditions involved.
Definition 8.1. A separation algebra is a partial commutative monoid (Σ, •, u). A partial commutative monoid is
given by a partial binary operation for which the commutativity, associativity and unit laws hold w.r.t. the equality
that holds iff both sides are defined and equal, or both are undefined. The induced combinability relation # is given by
σ0 # σ1 ⇔df σ0 • σ1 is defined
An example for a separation algebra is (States, ∗, [[emp]]c ). Later we will use this particular algebra to illustrate the
frame rule.
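A minimal sketch (ours) of the partial operation for this concrete separation algebra: two states combine iff their stores agree and their heaps are disjoint.

```python
def combine(state1, state2):
    """The partial operation on states: defined iff the stores agree and the heaps
    are disjoint; returns None when the combination is undefined."""
    (s1, h1), (s2, h2) = state1, state2
    if s1 != s2 or set(h1) & set(h2):
        return None
    return (s1, {**h1, **h2})

def combinable(state1, state2):
    """The induced combinability relation sigma0 # sigma1."""
    return combine(state1, state2) is not None

sigma1 = ({"x": 1}, {1: 7})
sigma2 = ({"x": 1}, {2: 0})
print(combine(sigma1, sigma2))      # ({'x': 1}, {1: 7, 2: 0})
print(combinable(sigma1, sigma1))   # False: the heaps overlap
```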
Over every separation algebra we can define the split and join relations:
Definition 8.2.
(a) The split relation • ⊆ Σ × (Σ × Σ) w.r.t. • is given by
σ • (σ1 , σ2 ) ⇔df σ1 # σ2 ∧ σ = σ1 • σ2 .
(b) Symmetrically, the join relation • ⊆ (Σ × Σ) × Σ is given by
(σ1 , σ2 ) • σ ⇔df σ1 # σ2 ∧ σ = σ1 • σ2 .
We introduce a special symbol for it, rather than writing • ˘, to ease reading.
(d) For relations R1 , R2 ⊆ Σ × Σ the operation • is lifted by
R1 • R2 =df • ; (R1 × R2 ) ; • .
By associativity and commutativity of • on Σ, the lifted operation is associative and commutative, too. For tests
p, q the • composition p • q is a test again, irrespective of the underlying separation algebra.
(e) The partial identity # ⊆ (Σ × Σ) × (Σ × Σ) characterises those pairs of states which are combinable:
(σ1 , σ2 ) # (τ1 , τ2 ) ⇔df σ1 # σ2 ∧ (σ1 , σ2 ) = (τ1 , τ2 ) .
It is well known that × and ; satisfy an exchange law:
(R1 × R2 ) ; (S1 × S2 ) = (R1 ; S1 ) × (R2 ; S2 ) . (15)
Moreover, by the definitions,
• = • ; # , • = # ; • . (16)
Finally, from the definition and the unit property of u it is clear that • ; • = I.
In the concrete separation algebra (States, ∗, [[emp]]c ) the split relation ∗ means that a state (s, h) is split into
two states (s, h1 ) and (s, h2 ) with h = h1 ∪ h2 ; more precisely, only the heap is split into disjoint parts. By this, the
separating conjunction p ∪· q of tests p, q in the command quantale over States coincides with p ∗ q = ∗ ; (p × q) ; ∗ .
Lemma 8.3.
Proof.
(a) First, σ (> ; • ) (ρ1 , ρ2 ) ⇔ ∃ ρ : σ > ρ ∧ ρ1 # ρ2 ∧ ρ = ρ1 • ρ2 ⇔ ρ1 # ρ2 ∧ ∃ ρ : ρ = ρ1 • ρ2 ⇔ ρ1 # ρ2 .
Second,
σ ( • ; (> × >) ; #) (ρ1 , ρ2 )
⇔ ∃ σ1 , σ2 : σ1 # σ2 ∧ σ = σ1 • σ2 ∧ σ1 > ρ1 ∧ σ2 > ρ2 ∧ ρ1 # ρ2
⇔ ∃ σ1 , σ2 : σ1 # σ2 ∧ σ = σ1 • σ2 ∧ ρ1 # ρ2
⇔ ρ1 # ρ2 ,
since we can choose σ1 = σ and σ2 = u.
(b) Straightforward from Part (a), exchange (15) and the definition of • on relations. □
Definition 8.4. Let H be the command that preserves all heaps while being liberal about the stores:
H ≙ h′ = h .
Definition 8.5. Command C has the ∗-frame-property iff
(dom(C) × I) ; ∗ ; C ⊆ (C × H) ; ∗ .
Prefixing the join operator with dom(C) × I on the left-hand side ensures that the property need only hold in those
cases where C can indeed act on a smaller state. Pointwise, for arbitrary s, s′ ∈ Stores and h0 , h1 , h′ ∈ Heaps, this
definition spells out as
((s, h0 ) ∈ dom(C) ∧ dom(h0 ) ∩ dom(h1 ) = ∅ ∧ (s, h0 ∪ h1 ) C (s′ , h′ ))
⇒ ∃ h0′ : (s, h0 ) C (s′ , h0′ ) ∧ dom(h0′ ) ∩ dom(h1 ) = ∅ ∧ h′ = h0′ ∪ h1 .
Informally, this describes that the heap storage not being used by C will remain unchanged while C may change
certain variables of the starting state (s, h1 ).
As a check for adequacy we show exemplarily that the mutation command [e1 ] := e2 satisfies the frame property.
To ease reading, we denote the heap of a state σ by hσ . Moreover, none of the mutation commands change the stores
and ∗ assumes the stores of the considered states to be equal, hence all stores of all states that occur in the proof are
the same, denoted by s. Since I ⊆ H, it suffices to show the frame property with ([[[e1 ] := e2 ]]c × I) ; ∗ instead of
([[[e1 ] := e2 ]]c × H) ; ∗ on the right hand side.
The localisation property states that, for partial functions, f1 | ( f2 ∪ f3 ) = ( f1 | f2 ) ∪ f3 provided dom( f1 ) ∩ dom( f3 ) = ∅.
A proof can be found in [33].
8.3. Preservation of Variables
It remains to express algebraically the requirement that a command preserves certain variables. In this, we would
like to avoid an explicit mention of syntax and free variables and find a suitable purely algebraic condition instead.
In the proof of the frame rule it will be applied after the command under consideration has been placed in parallel
composition with H after application of the frame property. This “context” is taken care of by the following definition,
where > is again the universal relation.
Definition 8.6. A command C ∗-preserves a test r iff
∗ ; (C × (r ; H)) ; # ⊆ > ; ∗ ; (I × r) .
The informal explanation is straightforward: when C is run in parallel with something starting in r, every re-
assembled result state must contain an r-part, too. Equivalently, we can replace the right-hand side by > ; • ; (I × r) ; #.
Proof. The assumption implies that C cannot change the store part of any state, but only heap parts. Formally,
∀ σ, τ : σ C τ ⇒ sσ = sτ . (19)
Now we calculate
(σ1 , σ2 ) # ; (C × (r ; H)) ; # (τ1 , τ2 )
⇔ {[ definitions ]}
σ1 # σ2 ∧ σ1 C τ1 ∧ σ2 (r ; H) τ2 ∧ τ1 # τ2
⇔ {[ definition of ; and r is subidentity ]}
σ1 # σ2 ∧ σ1 C τ1 ∧ σ2 ∈ r ∧ σ2 H τ2 ∧ τ1 # τ2
⇔ {[ by τ1 # τ2 ∧ σ1 # σ2 , assumption (19) implies sσ2 = sτ2 ]}
σ1 # σ2 ∧ σ1 C τ1 ∧ σ2 ∈ r ∧ σ2 H τ2 ∧ sσ2 = sτ2 ∧ τ1 # τ2
⇒ {[ the definition of H and sσ2 = sτ2 imply σ2 = τ2 ]}
σ1 # σ2 ∧ σ1 C τ1 ∧ σ2 ∈ r ∧ σ2 = τ2 ∧ τ1 # τ2
⇒ {[ logic and omitting conjuncts ]}
σ1 # σ2 ∧ τ2 ∈ r ∧ τ1 # τ2
⇔ {[ definition of > ]}
σ1 # σ2 ∧ σ1 > τ1 ∧ σ2 > τ2 ∧ τ2 ∈ r ∧ τ1 # τ2
⇔ {[ definitions ]}
(σ1 , σ2 ) # ; (> × (> ; r)) ; # (τ1 , τ2 ) .
□
A similar treatment of the general condition MV(C) ∩ FV(r) = ∅ is possible; we omit the technicalities.
8.4. A General Frame Rule
In this section we give a completely algebraic soundness proof of the standard frame rule, generalised to separation
algebras. In particular, the relation H will be replaced by a generic one that again compensates for incongruities,
possibly both on store and heap.
Definition 8.8. Given a separation algebra (Σ, •, u), a relation K ⊆ Σ × Σ is called a compensator iff it satisfies the
following properties:
(a) cod(K) ⊆ dom(K) ⊆ K,
(b) # ; (I × K) ; # = #,
(c) K ; K = K.
Requirement (a) ensures that arbitrarily long sequences of commands can be “accompanied” by equally long
compensator sequences. Requirement (b) enforces that K does not deviate too far from simple equality. This and
Requirement (c) will be used right below and in Section 8.5 for establishing the frame property and preservation for
some concrete (classes of) commands.
The command H is a compensator for the separation algebra (States, ∗, [[emp]]c ).
We can now give generalised definitions of the frame and preservation properties.
Definition 8.9. Let K be a compensator. A command C has the K-frame-property iff
(dom(C) × dom(K)) ; • ; C ⊆ (C × K) ; • .
It K-preserves a test r iff
• ; (C × (r ; K)) ; # ⊆ > ; • ; (I × r) .
As an indication that this definition is reasonable, we show some properties about I, both as a command and a test.
Corollary 8.10. I has the frame property and preserves every test. Moreover, every command preserves I.
Proof. First, (dom(I) × dom(K)) ; • ; I = (I × dom(K)) ; • ⊆ (I × K) ; • , by neutrality of I, Requirement (a) of Definition 8.8 and isotony, so that I has the K-frame property.
Second,
# ; (I × (r ; K)) ; #
= {[ neutrality of I and exchange (15) ]}
# ; (I × r) ; (I × K) ; #
= {[ # and (I × r) are partial identities and hence are idempotent and commute ]}
# ; (I × r) ; # ; (I × K) ; #
= {[ definition of compensators (Def. 8.8(b)) ]}
# ; (I × r) ; #
⊆ {[ isotony ]}
# ; (> × (> ; r)) ; #
Third, that every command preserves I is immediate from (20) and neutrality of I. □
We derive some further consequences from the definitions.
Lemma 8.11.
(a) Suppose command C has the K-frame-property. Then for all tests p, r we have
p • r ⊆ (p ; dom(C)) • (r ; dom(K)) ⇒ (p • r) ; C ⊆ (p ; C) • (r ; K) .
(b) Suppose command C K-preserves test r. Then for all tests q we have
(C ; q) • (r ; K) ⊆ > ; (q • r) .
The proof can be found in the Appendix.
The generalised frame rule now reads as follows.
Theorem 8.12. Assume a compensator K, a command C and a test r such that C has the K-frame-property and
K-preserves r. Then the following inference rule is valid:
p • r ⊆ (p ; dom(C)) • (r ; dom(K))        {p} C {q}
────────────────────────────────────────────
{p • r} C {q • r}
Proof. We calculate, assuming p • r ⊆ (p ; dom(C)) • (r ; dom(K)),
(p • r) ; C
⊆ {[ by Lemma 8.11(a) ]}
(p ; C) • (r ; K)
⊆ {[ by {p} C {q}, i.e., p ; C ⊆ C ; q, and isotony ]}
(C ; q) • (r ; K)
⊆ {[ by Lemma 8.11(b) ]}
> ; (q • r)
By the second characterisation of the Hoare triple presented in Lemma 7.3 we have therefore established {p•r} C {q•r}.
□
For the special separation algebra (States, ∗, [[emp]]c ) we obtain the ∗ frame rule. Notice that dom(H) = I.
Corollary 8.13. Consider a command C and a test r such that C has the ∗-frame-property and ∗-preserves r. Then
the following inference rule is valid:
p ∗ r ⊆ (p ; dom(C)) ∗ r        {p} C {q}
──────────────────────────────────
{p ∗ r} C {q ∗ r}
The condition p ∗ r ⊆ (p ; dom(C)) ∗ r is a relaxation of the implicit condition p ⊆ dom(C) that is part of Reynolds’s
definition of validity of a Hoare triple (which differs from the standard one). For that reason, no notion of safety
or safety monotonicity of a command enters our treatment.
# ; ((C ; D) × K) ; # = # ; (C × K) ; # ; (D × K) ; # (21)
The proof can be found in the Appendix. The condition # ; ((C ; D) × K) ; # = # ; (C × K) ; # ; (D × K) ; # means that
the “local” intermediate state of the composition C ; D induces a “global” intermediate state; it means a “modular”
way of composition.
Let us briefly comment on the assumption cod(C) ⊆ dom(D) in this lemma. This is needed since we want to
argue about compositions in isolation. In contrast, the related paper [50] talks about complete, non-blocking runs of
programs; therefore for all compositions that make up such a run the above assumption about codomain and domain
of successive commands is automatically always satisfied.
The second condition that enters closure of preservation under composition reflects the following observations in
algebraic terms. If one wants to prove a frame law {p • r} C ; D {s • r} for a composition C ; D one may proceed in the
Hoare logic style as follows. First one shows, with a suitable intermediate assertion q, that {p} C {q} and {q} D {s} hold.
Then, assuming the frame property and r-preservation for C and D, one infers {p • r} C {q • r} and {q • r} D {s • r}, from
which the claim follows by the sequencing rule. The existence of q corresponds to the insertion of an intermediate #
relation in (21).
Corollary 8.15. The K-frame property and K-preservation propagate from commands C and D to if p then C else D
for arbitrary test p.
Corollary 8.16. The structure (States, ∗S , {(∅, ∅)}) is a separation algebra and I is a compensator. Therefore the gen-
eralised frame rule is valid for it.
We have presented an algebraic treatment of separation logic. For assertions we have introduced a model based
on sets of states. By this, separating implication coincides with a residual and most of the inference rules of [46]
are simple consequences of standard residual laws. For intuitionistic, pure, precise and supported assertions we have
given algebraic characterisations. Furthermore we have defined a class of assertions which are both intuitionistic and
precise.
As a next step we embedded the command part of separation logic into a relational algebraic structure. There, we
have derived algebraic characterisations of properties on which the frame rule relies. In particular, we have expressed
the side condition that certain variables must not be changed by a command in a purely semantic way without appeal-
ing to the syntax. Moreover, we have formulated a more liberal version of the frame rule that rests on simpler side
conditions and have proved it purely algebraically. Finally we have shown that, under mild conditions, the commands
for which the frame rule is sound are closed under union and composition.
To underpin our approach we have algebraically verified one of the standard examples — an in-place list reversal
algorithm. The details can be found in [11]. The term in-place means that there is no copying of whole structures, i.e.,
the reversal is done by simple pointer modifications.
Due to our relational embedding we can, as a next step, derive inference rules for if-statements and for the while-
loop. This has been done for classical Hoare logic (see [37]); hence it should be straightforward to extend this into the
setting of separation logic.
So far we have not analysed situations where data structures share parts of their cells (cf. Figure 2). First steps
towards an algebraic handling of such situations are given in [34, 19]. In future work, we will adapt these approaches
to algebraic separation logic.
[Figure 2: two linked data structures starting at x and y that share part of their cells.]
Our algebraic approach to separation logic also paves the way for verifying properties with off-the-shelf theorem
provers. Boolean semirings have proved to be reasonably well suited to automated theorem provers [24]. Therefore
the first-order part of our approach can easily be shown by automated theorem provers. If one needs the full power
of quantales, the situation is a bit different: there are encodings for quantales for higher-order theorem provers [12].
However, at the moment higher-order systems can only verify simple theorems fully automatically. Looking at the
development of first-order provers in the past, we expect a rapid development in automated higher-order provers.
Hence one of the next plans for future work is to analyse the power of such systems for reasoning with separation logic.
A long-term perspective is to incorporate reasoning about concurrent programs with shared linked data structures
along the lines of [41]. One central property mentioned in that paper is the rule of disjoint concurrency which reads
{p1 } C1 {q1 }        {p2 } C2 {q2 }
──────────────────────────────
{p1 ∗ p2 } C1 ∥ C2 {q1 ∗ q2 }
Acknowledgements: We are grateful to the anonymous referees for their careful scrutiny of the manuscript and their
many helpful remarks. The material in Section 8 benefited greatly from discussions with C.A.R. Hoare and P. O’Hearn.
References
[1] R.-J. Back and J. von Wright. Refinement Calculus: A Systematic Introduction. Graduate Texts in Computer Science. Springer, 1998.
[2] R. C. Backhouse and J. van der Woude. Demonic operators and monotype factors. Mathematical Structures in Computer Science, 3(4):417–
433, 1993.
[3] B. Biering, L. Birkedal, and N. Torp-Smith. BI-hyperdoctrines, Higher-order Separation Logic, and Abstraction. ACM Transactions on
Programming Languages and Systems, 29(5):24, 2007.
[4] G. Birkhoff. Lattice Theory, volume XXV of Colloquium Publications. Annals of Mathematics Studies, 3rd edition, 1967.
[5] T. Blyth and M. Janowitz. Residuation theory. Pergamon Press, 1972.
[6] A. Buch, T. Hillenbrand, and R. Fettig. Waldmeister: High Performance Equational Theorem Proving. In J. Calmet and C. Limongelli,
editors, Proceedings of the International Symposium on Design and Implementation of Symbolic Computation Systems, number 1128 in
Lecture Notes in Computer Science, pages 63–64. Springer, 1996.
[7] R. Burstall. Some techniques for proving correctness of programs which alter data structures. Machine Intelligence, 7:23–50, 1972.
[8] C. Calcagno, P. O’Hearn, and H. Yang. Local action and abstract separation logic. In LICS ’07: Proceedings of the 22nd Annual IEEE
Symposium on Logic in Computer Science, pages 366–378. IEEE Press, 2007.
[9] E. Cohen. Using Kleene algebra to reason about concurrency control. Technical report, Telcordia, 1994.
[10] J. H. Conway. Regular Algebra and Finite Machines. Chapman & Hall, 1971.
[11] H.-H. Dang. Algebraic aspects of separation logic. Technical Report 2009-01, Institut für Informatik, 2009.
[12] H.-H. Dang and P. Höfner. Automated Higher-Order Reasoning about Quantales. In B. Konev, R. Schmidt, and S. Schulz, editors, Workshop
on Practical Aspects of Automated Reasoning (PAAR-2010), pages 40–49, 2010.
[13] H.-H. Dang, P. Höfner, and B. Möller. Algebraic Separation Logic. Technical Report 2010-06, Institute of Computer Science, University of
Augsburg, 2010.
[14] J. Desharnais, N. Belkhiter, S. Sghaier, F. Tchier, A. Jaoua, A. Mili, and N. Zaguia. Embedding a demonic semilattice in a relational algebra.
Theoretical Computer Science, 149(2):333–360, 1995.
[15] J. Desharnais, A. Mili, and T. Nguyen. Refinement and demonic semantics. In C. Brink, W. Kahl, and G. Schmidt, editors, Relational
Methods in Computer Science, pages 166–183. Springer, 1997.
[16] J. Desharnais and B. Möller. Characterizing Determinacy in Kleene Algebras. Information Sciences, 139:253–273, 2001.
[17] J. Desharnais, B. Möller, and F. Tchier. Kleene under a modal demonic star. Journal of Logic and Algebraic Programming, 66(2):127–160,
2006.
[18] E. Dijkstra. A discipline of programming. Prentice Hall, 1976.
[19] T. Ehm. Pointer Kleene algebra. In R. Berghammer, B. Möller, and G. Struth, editors, Relational and Kleene-Algebraic Methods in Computer
Science, volume 3051 of Lecture Notes in Computer Science, pages 99–111. Springer, 2004.
[20] C. A. R. Hoare. An axiomatic basis for computer programming. Communications of the ACM, 12(10):576–580, 1969.
[21] C. A. R. Hoare, B. Möller, G. Struth, and I. Wehrman. Concurrent Kleene algebra. In M. Bravetti and G. Zavattaro, editors, CONCUR 09 —
Concurrency Theory, volume 5710 of Lecture Notes in Computer Science, pages 399–414. Springer, 2009.
[22] C. A. R. Hoare, B. Möller, G. Struth, and I. Wehrman. Foundations of concurrent Kleene algebra. In R. Berghammer, A. Jaoua, and B. Möller,
editors, Relations and Kleene Algebra in Computer Science, volume 5827 of Lecture Notes in Computer Science. Springer, 2009.
[23] P. Höfner. Database for automated proofs of Kleene algebra. http://www.kleenealgebra.de (accessed March 31, 2011).
[24] P. Höfner and G. Struth. Automated reasoning in Kleene algebra. In F. Pfennig, editor, Automated Deduction, volume 4603 of Lecture Notes
in Artificial Intelligence, pages 279–294. Springer, 2007.
[25] P. Höfner, G. Struth, and G. Sutcliffe. Automated verification of refinement laws. Annals of Mathematics and Artificial Intelligence, Special
Issue on First-order Theorem Proving, pages 35–62, 2008.
[26] S. Ishtiaq and P. O’Hearn. BI as an assertion language for mutable data structures. ACM SIGPLAN Notices, 36:14–26, 2001.
[27] B. Jónsson and A. Tarski. Boolean algebras with operators, Part I. American Journal of Mathematics, 73, 1951.
[28] D. Kozen. On Hoare logic and Kleene algebra with tests. ACM Transactions on Computational Logic, 1(1):60–76, 2000.
[29] D. Kozen. On Hoare logic, Kleene algebra, and types. In P. Gärdenfors, J. Woleński, and K. Kijania-Placek, editors, In the Scope of Logic,
Methodology, and Philosophy of Science: Volume One of the 11th Int. Congress Logic, Methodology and Philosophy of Science, Cracow,
August 1999, volume 315 of Studies in Epistemology, Logic, Methodology, and Philosophy of Science, pages 119–133. Kluwer, 2002.
[30] D. Kozen and M.-C. Patron. Certification of compiler optimizations using Kleene algebra with tests. In J. Lloyd, V. Dahl, U. Furbach,
M. Kerber, K.-K. Lau, C. Palamidessi, L. M. Pereira, Y. Sagiv, and P. J. Stuckey, editors, Proc. 1st Int. Conf. Computational Logic (CL2000),
volume 1861 of Lecture Notes in Computer Science, pages 568–582. Springer, 2000.
[31] R. Maddux. Relation Algebras, volume 150 of Studies in Logic and the Foundations of Mathematics. Elsevier, 2006.
[32] W. W. McCune. Prover9 and Mace4. <http://www.cs.unm.edu/∼mccune/prover9>. (accessed March 31, 2011).
[33] B. Möller. Towards pointer algebra. Science of Computer Programming, 21(1):57–90, 1993.
[34] B. Möller. Calculating with acyclic and cyclic lists. Information Sciences, 119(3-4):135–154, 1999.
[35] B. Möller. Kleene getting lazy. Science of Computer Programming, 65:195–214, 2007.
[36] B. Möller, P. Höfner, and G. Struth. Quantales and temporal logics. In M. Johnson and V. Vene, editors, Algebraic Methodology and Software
Technology, Proc. AMAST 2006, volume 4019 of Lecture Notes in Computer Science, pages 263–277. Springer, 2006.
[37] B. Möller and G. Struth. Algebras of modal operators and partial correctness. Theoretical Computer Science, 351(2):221–239, 2006.
[38] B. Möller and G. Struth. WP is WLP. In W. MacCaull, M. Winter, and I. Düntsch, editors, Relational Methods in Computer Science, volume
3929 of Lecture Notes in Computer Science, pages 200–211. Springer, 2006.
[39] C. Mulvey. &. Rendiconti del Circolo Matematico di Palermo, 12(2):99–104, 1986.
[40] T. T. Nguyen. A relational model of nondeterministic programs. International J. Foundations Comp. Sci., 2:101–131, 1991.
[41] P. O’Hearn. Resources, concurrency, and local reasoning. Theoretical Computer Science, 375:271–307, 2007.
[42] P. O’Hearn, J. C. Reynolds, and H. Yang. Local reasoning about programs that alter data structures. In L. Fribourg, editor, CSL ’01: 15th
International Workshop on Computer Science Logic, volume 2142 of Lecture Notes in Computer Science, pages 1–19. Springer, 2001.
[43] P. O’Hearn, J. C. Reynolds, and H. Yang. Separation and information hiding. ACM Trans. Program. Lang. Syst., 31(3):1–50, 2009.
[44] V. Preoteasa. Frame rule for mutually recursive procedures manipulating pointers. Theoretical Computer Science, 410(42):4216–4233, 2009.
[45] D. J. Pym, P. O’Hearn, and H. Yang. Possible worlds and resources: the semantics of BI. Theoretical Computer Science, 315(1):257–305,
2004. Mathematical Foundations of Programming Semantics.
[46] J. C. Reynolds. An introduction to separation logic. In M. Broy et al., editors, Engineering Methods and Tools for Software Safety and
Security, pages 285–310. IOS Press, 2009.
[47] K. Rosenthal. Quantales and their applications. Pitman Research Notes in Mathematics Series, 234, 1990.
[48] G. Schmidt and T. Ströhlein. Relations and Graphs: Discrete Mathematics for Computer Scientists. Springer, 1993.
[49] V. Vafeiadis and M. Parkinson. A marriage of rely/guarantee and separation logic. In 18th Int’l Conference on Concurrency Theory
(CONCUR’07), volume 4703 of Lecture Notes in Computer Science, pages 256–271. Springer, 2007.
[50] H. Yang and P. O’Hearn. A semantic basis for local reasoning. In M. Nielsen and U. Engberg, editors, Foundations of Software Science and
Computation Structures, Proc. FOSSACS 2002, volume 2303 of Lecture Notes in Computer Science, pages 402–416. Springer, 2002.
A. Deferred Proofs
Proof of Lemma 5.8. By Lemma 5.4 we have [[p −∗ q]] = [[q]]/[[p]]. Now it is easy to see that
s, h |= p −⊛ q
⇔ {[ definition of −⊛ ]}
s, h |= ¬(q −∗ (¬p))
⇔ {[ definition of −∗ ]}
¬(∀ h′ : ((dom(h′ ) ∩ dom(h) = ∅, s, h′ |= q) ⇒ s, h′ ∪ h |= ¬p))
⇔ {[ logic: ¬ over ∀ ]}
∃ h′ : ¬((dom(h′ ) ∩ dom(h) = ∅, s, h′ |= q) ⇒ s, h′ ∪ h |= ¬p)
⇔ {[ logic: ¬(A ⇒ B) ⇔ (A ∧ ¬B) ]}
∃ h′ : dom(h′ ) ∩ dom(h) = ∅, s, h′ |= q, s, h′ ∪ h ⊭ ¬p
⇔ {[ logic ]}
∃ h′ : dom(h′ ) ∩ dom(h) = ∅, s, h′ |= q, s, h′ ∪ h |= p
Proof of Lemma 6.9. (⇐): Using Equation (Ded), isotony and the assumption, we get
a u b · c ≤ (a⌊c u b) · c ≤ (a⌊> u b) · c ≤ (a u b) · c
and the symmetric formula a u b · c ≤ b · (a u c). From this the claim follows by
a u (b · c) = a u a u (b · c) ≤ a u ((a u b) · c) ≤ (a u b) · (a u c) .
(⇒): From (8) we obtain a u (a̅ · >) ≤ (a u a̅ ) · (a u >) = 0 · a = 0 and hence, by shunting (shu) and the exchange
law (exc), a⌊> ≤ a. □
(c) (a u b) · c = a u b · c.
We show the equivalence of the equation with Definition 6.6. First we prove the ⇒ -direction and split the proof
showing each inequation separately starting with ≥:
a u b · c ≤ (a u b) · (a u c) ≤ (a u b) · c .
This holds by Equation (8) and isotony. To prove the ≤-direction we know (a u b) · c ≤ a · c ≤ a · > ≤ a, which
follows from isotony and Equation (7). Now using (a u b) · c ≤ b · c we can immediately conclude (a u b) · c ≤ a u b · c.
Next we give a proof for the ⇐ -direction and assume (a u b) · c = a u b · c holds. Equation (7) follows by
a · > = (a u a) · > = a u a · > ≤ a. Furthermore we calculate
a u (b · c) = a u a u (b · c) = a u ((a u b) · c) = a u (c · (a u b)) = (a u b) · (a u c) ,
which holds by using idempotence of u, the assumption, commutativity of · and again the assumption. □
Proof of Lemma 6.25.
(a) Since [[s, h]] is a singleton set, this is obvious.
(b) By the definition of ∗, Part (a), by the assumption h ⊆ h′ and Lemma 6.24, and set theory:
s, h′ |= p ∗ [[s, h′ − h]]
⇔ ∃ ĥ : ĥ ⊆ h′ ∧ s, ĥ |= p ∧ s, h′ − ĥ |= [[s, h′ − h]]
⇔ ∃ ĥ : ĥ ⊆ h′ ∧ s, ĥ |= p ∧ h′ − ĥ = h′ − h
⇔ ∃ ĥ : ĥ ⊆ h′ ∧ s, ĥ |= p ∧ ĥ = h
⇔ s, h |= p
Proof of Lemma 8.11.
(a) (p • r) ; C
⊆ {[ assumption ]}
((p ; dom(C)) • (r ; dom(K))) ; C
= {[ definition of • on relations (Def. 8.2(d)) ]}
• ; ((p ; dom(C)) × (r ; dom(K))) ; • ; C
= {[ exchange (15) ]}
• ; (p × r) ; (dom(C) × dom(K)) ; • ; C
⊆ {[ C has the K-frame-property and exchange (15) ]}
• ; ((p ; C) × (r ; K)) ; •
= {[ definition of • on relations ]}
(p ; C) • (r ; K)
(b) (C ; q) • (r ; K)
= {[ definition of • on relations ]}
• ; ((C ; q) × (r ; K)) ; •
= {[ Equation (16) ]}
• ; (C × (r ; K)) ; (q × I) ; # ; •
⊆ {[ C K-preserves r ]}
> ; • ; (I × r) ; (q × I) ; •
= {[ definition of • on relations ]}
> ; (q • r) . □
Proof of Lemma 8.14. Closure under union is straightforward from distributivity of ; and × over ∪. We now show
closure under (pre-)composition.
(a) Assume that C has the K-frame property and that p is a test.
⊆ {[ isotony ]}
(p × I) ; (dom(C) × dom(K)) ; • ; C
⊆ {[ C has the K-frame-property ]}
(p × I) ; (C × K) ; •
= {[ exchange (15) ]}
((p ; C) × K) ; •
Assume now that C and D have the K-frame property and satisfy cod(C) ⊆ dom(D). Then
(dom(C ; D) × dom(K)) ; • ; C ; D
⊆ {[ definition of domain ]}
(dom(C) × dom(K)) ; • ; C ; D
⊆ {[ C has the K-frame-property ]}
(C × K) ; • ; D
= {[ R = R ; cod(R) for arbitrary R and exchange (15) ]}
(C × K) ; (cod(C) × cod(K)) ; • ; D
⊆ {[ cod(C) ⊆ dom(D), cod(K) ⊆ dom(K) and the K-frame-property of D ]}
(C × K) ; (D × K) ; •
= {[ exchange (15) ]}
((C ; D) × (K ; K)) ; •
= {[ definition of compensators ]}
((C ; D) × K) ; • .
• ; ((C ; D) × (r ; K)) ; #
= {[ neutrality of I and exchange (15) ]}
• ; (I × r) ; ((C ; D) × K) ; #
= {[ by Equation (16) ]}
• ; # ; (I × r) ; ((C ; D) × K) ; #
= {[ # and I × r are partial identities and hence are idempotent and commute ]}
• ; # ; (I × r) ; # ; ((C ; D) × K) ; #
= {[ by Equation (16) ]}
• ; (I × r) ; # ; ((C ; D) × K) ; #
= {[ by the assumption ]}
• ; (I × r) ; # ; (C × K) ; # ; (D × K) ; #
⊆ {[ # is a subidentity ]}
• ; (I × r) ; (C × K) ; # ; (D × K) ; #
⊆ {[ C K-preserves r ]}
> ; • ; (I × r) ; (D × K) ; #
= {[ exchange (15) and neutrality of I ]}
> ; • ; (D × (r ; K)) ; #
⊆ {[ D K-preserves r ]}
> ; > ; • ; (I × r)
⊆ {[ > largest element ]}
> ; • ; (I × r) . □