Archive for March, 2008

Productivity

Friday, March 14th, 2008 by Nils Anders Danielsson

In today’s FP Lunch I mentioned a problem with Coq’s guardedness checker.

Consider the following simple representation of languages (using Agda-like syntax):

codata L : Set where
  ε   : L
  sym : Char -> L
  _·_ : L -> L -> L
  _∣_ : L -> L -> L


Here ε is the language consisting of the empty string, sym c is the language containing only the character c, _·_ encodes sequencing, and _∣_ takes the union of two languages. I want to be able to represent languages with strings of arbitrary (but finite) length, so L is a coinductive definition.
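As an illustration (mine, not from the post): in a lazy language like Haskell, laziness plays the role of coinduction, so the infinite syntax tree for a language like a* is a perfectly good value, and finite strings can be tested against it with a continuation-passing matcher. All names here are my own sketch.

```haskell
-- Hypothetical Haskell rendering of L; laziness stands in for coinduction.
data L = Eps | Sym Char | Seq L L | Alt L L

-- a* = ε ∣ a · a*, defined corecursively as an infinite tree
star :: Char -> L
star c = Alt Eps (Seq (Sym c) (star c))

-- A continuation-passing matcher; it terminates on 'star' because the
-- Seq branch always consumes a character before recursing.
accepts :: L -> String -> Bool
accepts l s = go l s null
  where
    go Eps         cs k = k cs
    go (Sym c)     cs k = case cs of
                            (c' : cs') -> c == c' && k cs'
                            []         -> False
    go (Seq l1 l2) cs k = go l1 cs (\cs' -> go l2 cs' k)
    go (Alt l1 l2) cs k = go l1 cs k || go l2 cs k
```

Note that this matcher can loop on left-recursive language values; it is only a sketch of why the coinductive reading is useful, not a general decision procedure.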

Fair schedulers

Saturday, March 8th, 2008 by Hancock

From time to time, I try to report things in this blog. If someone (local) can tell me how to put tags in so stuff like ASCII “art” comes out nicely, I’d be grateful. It is beyond me. “code” doesn’t work.

This lunchtime was in two bits. First Hank spoke about what he called “functional operating systems”. He seemed to mean combinations of continuous stream-processing functions involving fair stream schedulers.

What is a fair stream scheduler? A function

m : Stream A -> Stream B -> Stream (A + B)

whose value always contains infinitely many inL’s and infinitely many inR’s.
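For a concrete (hypothetical) instance, with lazy Haskell lists standing in for streams and Either for the sum A + B, the strict alternator is the simplest fair scheduler:

```haskell
-- Strict alternation: on infinite inputs the output pattern L,R,L,R,...
-- trivially contains infinitely many Lefts (inL's) and Rights (inR's).
roundRobin :: [a] -> [b] -> [Either a b]
roundRobin (a : as) (b : bs) = Left a : Right b : roundRobin as bs
roundRobin _        _        = []  -- only reachable on finite inputs
```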

Hank claimed that any such function could be represented by a program of type:

M = nu ((A ->)* . (B ->)+ . (A ->))

where F* denotes the free monad over F, and F+ = F . F*.
He gave the constructors of (A ->)* B the names Ret and Rd:

Ret : B -> (A ->)* B
Rd  : (A -> (A ->)* B) -> (A ->)* B
The idea is as follows.

First, define

eat' : (A ->)* C -> Str A -> ([A], C, Str A)
eat' (Ret c) as       = ([], c, as)
eat' (Rd f)  (a : as) = let (al, c, as') = eat' (f a) as in (a : al, c, as')


Now, let’s define

schd : M -> Str A -> Str B -> Str (A + B)
schd m as bs = let
    t            = elim m
    (al, f, as') = eat' t as       : ([A], (B -> (B ->)* ((A ->) M)), Str A)
    t'           = f (hd bs)
    (bl, g, bs') = eat' t' (tl bs) : ([B], (A ->) M, Str B)
    m'           = g (hd as')
  in (map inL al)
     ++ [inR (hd bs)]
     ++ (map inR bl)
     ++ [inL (hd as')]
     ++ schd m' (tl as') bs'


In essence, m eats zero or more A’s, then one or more B’s, then an A, and then loops. As m reads things, we pass them through, appropriately tagged, to the output.
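The whole construction can be transcribed into Haskell (a sketch under my own naming, with lazy lists for Str, and Left/Right playing inL/inR):

```haskell
-- (A ->)*, the free monad over the functor (A ->)
data RdA a x = Ret x | Rd (a -> RdA a x)

-- M = nu ((A->)* . (B->)+ . (A->)), unrolled one step:
-- read some A's, then at least one B, then exactly one A, then loop.
newtype M a b = M (RdA a (b -> RdA b (a -> M a b)))

eat' :: RdA a c -> [a] -> ([a], c, [a])
eat' (Ret c) as       = ([], c, as)
eat' (Rd f)  (a : as) = let (al, c, as') = eat' (f a) as in (a : al, c, as')
eat' _       []       = error "finite stream"  -- streams are meant infinite

schd :: M a b -> [a] -> [b] -> [Either a b]
schd (M t) as bs =
  let (al, f, as') = eat' t as
      (bl, g, bs') = eat' (f (head bs)) (tail bs)
      m'           = g (head as')
  in  map Left al ++ [Right (head bs)] ++ map Right bl
      ++ [Left (head as')] ++ schd m' (tail as') bs'

-- The simplest inhabitant: no optional reads each round,
-- giving strict B/A alternation.
alternate :: M a b
alternate = M (Ret (\_ -> Ret (\_ -> alternate)))
```

Running schd alternate on two infinite streams produces the strictly alternating R,L,R,L,... interleaving, as the reading above predicts.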

Hank asserted, in a voice of thunder, that the output of such a function contained infinitely many inR’s and infinitely many inL’s; moreover, that any such interleaving was the output of such a function. He conjectured that
nu ((A ->)* . (B ->)+ . (A ->)) = nu ((B ->)* . (A ->)+ . (B ->))
and then began to talk rather obscurely about circuits:

      /|      _________      |\
     / |---->|    f    |---->| \
    /  |      ---------      |  \
+->    |      _________      | m  >-+
|   \  |---->|    g    |---->|  /   |
|    \ |      ---------      | /    |
|     \|---->out      in---->|/     |
|                                   V
+<----------------------------------+


which "worked properly" for any choice of scheduler m, suggesting that at any rate he had made sense of the "for any", and that "working properly" might be, minimally, something like continuity (from in to out). He said this idea might help in understanding delay-insensitive circuitry, or circuitry which is insensitive to buffering.

This all seemed to be in the context of the work Hank, Dirk and Neil had done some while ago on "stream eating", which represents continuous functions of type Str A -> Str B by nu ((A ->)* . (B ×)). There is a recent draft here.
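That representation also transcribes directly into Haskell (again a hypothetical sketch of mine): a stream processor alternately eats finitely many A's and emits one B.

```haskell
-- The free monad (A ->)*, renamed to keep this snippet self-contained
data Eat a x = Done x | Get (a -> Eat a x)

-- nu ((A ->)* . (B ×)): read finitely many A's, output one B, repeat
newtype SP a b = SP (Eat a (b, SP a b))

-- Interpret a stream processor as a function on (lazy-list) streams
run :: SP a b -> [a] -> [b]
run (SP (Done (b, sp))) as       = b : run sp as
run (SP (Get f))        (a : as) = run (SP (f a)) as
run _                   []       = error "finite stream"

-- Example: the continuous function summing consecutive pairs of inputs
sumPairs :: SP Int Int
sumPairs = SP (Get (\x -> Get (\y -> Done (x + y, sumPairs))))
```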

Order Preserving Embeddings

Friday, March 7th, 2008 by James Chapman

Last friday I talked about OPEs which Peter Morris and I have been thinking about recently. OPEs give you a nice way of implementing weakening for a syntax with de Bruijn variables and binding. Here’s one:

data Ty : Set where
  ι   : Ty
  _→_ : Ty -> Ty -> Ty

data Con : Set where
  ε   : Con
  _,_ : Con -> Ty -> Con

data Var : Con -> Ty -> Set where
  vZ : Var (Γ , σ) σ
  vS : (τ : Ty) -> Var Γ σ -> Var (Γ , τ) σ

data Tm : Con -> Ty -> Set where
  var : Var Γ σ -> Tm Γ σ
  λ   : Tm (Γ , σ) τ -> Tm Γ (σ → τ)
  _$_ : Tm Γ (σ → τ) -> Tm Γ σ -> Tm Γ τ

The simply-typed lambda calculus with one base type. If you throw away the types, your contexts become natural numbers, your variables become Fin, and your terms become the well-scoped terms. The operation we want is this one:

wk : (τ : Ty) -> Tm Γ σ -> Tm (Γ , τ) σ

But we cannot implement it by induction on the terms:

wk τ (var x) = var (vS τ x)
wk τ (t $ u) = wk τ t $ wk τ u
wk τ (λ t)   = λ ? -- blast!

The type of ? is Tm ((Γ , τ) , σ) ρ where σ and ρ are the domain and range of the lambda.

Clearly we need a more liberal kind of weakening where we can introduce new variables earlier in the context. We could introduce thinning like this:

th : forall Δ τ -> Tm (Γ + Δ) σ -> Tm ((Γ < τ) + Δ) σ

+ is context extension here. We can then define weakening as:

wk = th ε

This works ok until you have to prove properties about thinnings. For example:

wk ρ (th Γ' τ v) == th (Γ' < ρ) τ (wk ρ v)

To prove this you need a more general lemma where both operations are thinnings, which is difficult even to state, because associativity of + does not hold definitionally in type theory.

Ok, that's the problem. What's the solution? Yes: order preserving embeddings. The idea is to give an inductive description of exactly the types of functions from one context to another that we are interested in. We can then prove the properties we want in a category where the objects are contexts and the morphisms are OPEs, and lift our results to the category of terms.

What are OPEs then?

data OPE : Con -> Con -> Set where
  done : OPE ε ε
  keep : forall {Γ Δ σ} -> OPE Γ Δ -> OPE (Γ , σ) (Δ , σ)
  skip : forall {Γ Δ σ} -> OPE Γ Δ -> OPE (Γ , σ) Δ

This type classifies exactly the operations from Con to Con where all you can do is say which things to avoid in the target context (which is the first one, by the way).

We define identity and composition like so:

id : OPE Γ Γ
id {ε}     = done
id {Γ , σ} = keep id
_•_ : OPE B Γ -> OPE Γ Δ -> OPE B Δ
done   • done   = done
skip f • g      = skip (f • g)
keep f • keep g = keep (f • g)
keep f • skip g = skip (f • g)

and can easily show that they satisfy the left and right identity laws and associativity of composition if we feel so inclined.

Given identity it is easy to define the weakening OPE:

weak : forall τ -> OPE (Γ , τ) Γ
weak τ = skip id


and I can state my property easily:

weak ρ • f == (keep f) • (weak ρ)

The proof is now one line.
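As a sanity check (my own untyped Haskell model, not from the post): forgetting the types, an OPE is just its shape of keeps and skips, and the equation above can be verified by direct computation, along with the action of an OPE on de Bruijn indices.

```haskell
-- Untyped OPE shapes; the typed indices are forgotten
data OPE = Done | Keep OPE | Skip OPE deriving (Eq, Show)

-- Composition, clause for clause as in the post
comp :: OPE -> OPE -> OPE
comp Done     Done     = Done
comp (Skip f) g        = Skip (comp f g)
comp (Keep f) (Keep g) = Keep (comp f g)
comp (Keep f) (Skip g) = Skip (comp f g)
comp _        _        = error "ill-typed composition"

-- Identity on a context of length n
idOPE :: Int -> OPE
idOPE 0 = Done
idOPE n = Keep (idOPE (n - 1))

-- weak over a context of length n
weak :: Int -> OPE
weak n = Skip (idOPE n)

-- How an OPE acts on de Bruijn indices (the untyped vmap):
-- a variable of the second context is sent to its image in the first.
vmap :: OPE -> Int -> Int
vmap (Keep _) 0 = 0
vmap (Keep f) n = 1 + vmap f (n - 1)
vmap (Skip f) n = 1 + vmap f n
vmap Done     _ = error "variable out of scope"
```

For example, with f = Keep (Skip (Keep Done)) (an OPE from a length-3 context to a length-2 context), both weak • f and keep f • weak compute to Skip f.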

We define functors:

vmap : OPE Γ Δ -> Var Δ σ -> Var Γ σ
tmap : OPE Γ Δ -> Tm Δ σ -> Tm Γ σ

Now we can define weakening:

wk τ = tmap (weak τ)

And that's it. Ok, so we did have to do some work, but the structure is just the categorical structure you'd expect. In fact, for a dependently-typed version of this (using induction-recursion) you are forced to prove the laws about identity, composition and map as you define the operations. The structure is forced upon you.

A more liberal version of the above, called Order Preserving Functions, is mentioned in Conor's excellent AFP notes (well worth a read for any budding (wilting?) Epigram or Agda programmer). Peter and I plan to keep digging on this subject.