Discussion: Strong lambda reduction using Scheme?

Nomen Nescio
2008-12-22 09:30:06 UTC
In [1], Torben Mogensen describes using Scheme to reduce to normal
form certain pure lambda calculus terms. But the lambda terms require
strong reduction to reach normal form, and this cannot ordinarily be
done with Scheme's weak head reduction.

How did Mogensen coax strong reduction out of a weak head reducer?

Example: One of Mogensen's terms computes Ackermann's function on a
Church numeral. But weak head reduction fails already in taking the
successor of Church zero. The successor function is

S = (lambda (m f x) (f (m f x))).

Church zero is

0 = (lambda (f x) x)

With strong reduction, (S 0) becomes (lambda (f x) (f x)), but with weak
head reduction (S 0) stops at (lambda (f x) (f ((lambda (f x) x) f x))).
Normal form is never reached.
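The failure is easy to observe in Scheme itself. Here is a minimal
sketch, using curried encodings (the definitions below are mine, not
Mogensen's): (S zero) evaluates only to an opaque closure, since Scheme
never reduces under a lambda. Applying the closure shows that it is
extensionally the numeral one, but its body is never rebuilt into the
normal form.

```scheme
;; Curried Church zero and successor. Scheme's weak head reduction
;; stops at the outermost lambda, so (S zero) remains an opaque
;; closure whose body stays unreduced.
(define zero (lambda (f) (lambda (x) x)))
(define S (lambda (m) (lambda (f) (lambda (x) (f ((m f) x))))))

(define one (S zero))  ;; a closure; Scheme cannot show us its body

;; We can only observe the result extensionally, by applying it:
((one (lambda (k) (+ k 1))) 0)  ;; => 1
```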

[1] @Article{mogensen94,
      author = {Torben {\AE}. Mogensen},
      title  = {Efficient Self-Interpretation in Lambda Calculus},
      note   = {ftp://ftp.diku.dk/diku/semantics/papers/D-128.dvi.Z},
      year   = 1994
    }
Chung-chieh Shan
2008-12-25 05:47:11 UTC
Post by Nomen Nescio
How did Mogensen coax strong reduction out of a weak head reducer?
You can implement any programming language in Scheme, by representing
the programs of the implemented language as values of Scheme.
http://www.brics.dk/~hosc/local/HOSC-11-4-pp363-397.pdf

Mogensen's representation is nice, but it's not necessary to answer your
question.
Nomen Nescio
2008-12-29 09:00:13 UTC
Post by Chung-chieh Shan
Post by Nomen Nescio
How did Mogensen coax strong reduction out of a weak head reducer?
You can implement any programming language in Scheme, by
representing the programs of the implemented language as values of
Scheme.
Yes, Scheme is Turing-complete. My impression, though, is that
Mogensen (and others) are using some "trick" to gain strong reduction
rather directly from Scheme's weak head reducer.

Remember that Mogensen was presenting performance figures for these
strong reductions. It seems unlikely that he would not have mentioned
a strong reduction programming language implemented in Scheme if one
were in use, considering that such an interpreter typically slows
computation by an order of magnitude or more.
Post by Chung-chieh Shan
http://www.brics.dk/~hosc/local/HOSC-11-4-pp363-397.pdf
Thanks for this reference.
Chung-chieh Shan
2009-01-01 06:48:08 UTC
Post by Nomen Nescio
My impression, though, is that
Mogensen (and others) are using some "trick" to gain strong reduction
rather directly from Scheme's weak head reducer.
Depending on what you consider normal for an interpreter, you might
consider different parts of Mogensen's representation `tricky'. Perhaps
one tricky bit is his use of higher-order abstract syntax: bindings in
the object language are represented by bindings in the metalanguage,
which makes it easy and efficient to perform substitutions. For
example, the Scheme program

  (define (reify-function f)
    (let ((x (gensym)))
      `(lambda (,x) ,(f x))))

  (reify-function (lambda (x) ((lambda (y) y) x)))

evaluates to the S-expression (lambda (g43) g43) or something like it.
The reify-function technique (namely, reduce inside a lambda abstraction
by applying it to a symbol) is also used to great effect in Danvy's
type-directed partial evaluation (POPL 1996).
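The same idea can be pushed one level deeper, reifying at the
Church-numeral type ((a -> a) -> a -> a): feed the numeral a
syntax-building "f" and a fresh symbol for "x", then wrap the resulting
body in two lambdas. A sketch (the names my-gensym and reify-church,
the deterministic gensym, and the curried encodings are my own, not
Mogensen's or Danvy's):

```scheme
;; A deterministic gensym, defined locally for self-containedness.
(define counter 0)
(define (my-gensym)
  (set! counter (+ counter 1))
  (string->symbol (string-append "g" (number->string counter))))

;; Curried Church zero and successor:
(define zero (lambda (f) (lambda (x) x)))
(define (succ m) (lambda (f) (lambda (x) (f ((m f) x)))))

;; Reduce under both lambdas: apply the numeral to a function that
;; builds application syntax and to a fresh symbol, then wrap the
;; resulting body back in lambda syntax.
(define (reify-church n)
  (let ((f (my-gensym)) (x (my-gensym)))
    `(lambda (,f)
       (lambda (,x)
         ,((n (lambda (e) (list f e))) x)))))

(reify-church (succ zero))
;; => (lambda (g1) (lambda (g2) (g1 g2)))
```

This is type-directed reification in miniature: the shape of the
reifier follows the type of the value being reified.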
Post by Nomen Nescio
Remember that Mogensen was presenting performance figures for these
strong reductions. It seems unlikely that he would not have mentioned
a strong reduction programming language implemented in Scheme if one
were in use, considering that such an interpreter typically slows
computation by an order of magnitude or more.
Indeed, Mogensen mentions "a call-by-need lambda calculus reducer
implemented in Chez Scheme", in other words a strong reduction
programming language implemented in Scheme.
Nomen Nescio
2009-01-02 12:20:03 UTC
Post by Chung-chieh Shan
For example, the Scheme program

  (define (reify-function f)
    (let ((x (gensym)))
      `(lambda (,x) ,(f x))))

  (reify-function (lambda (x) ((lambda (y) y) x)))

evaluates to the S-expression (lambda (g43) g43) or something like it.
Very pretty. It also works for SKK. Thank you.
Post by Chung-chieh Shan
Indeed, Mogensen mentions "a call-by-need lambda calculus reducer
implemented in Chez Scheme", in other words a strong reduction
programming language implemented in Scheme.
I did not interpret "call-by-need" to mean "strong". Maybe that's
what he meant.
Post by Chung-chieh Shan
The reify-function technique (namely, reduce inside a lambda
abstraction by applying it to a symbol) is also used to great effect
in Danvy's type-directed partial evaluation (POPL 1996).
Ironically, this paper has been archived here since it first appeared,
but I had not read it until now. Thank you.

A challenge to his technique of partial evaluation is the function

F(n,x) = <I,<I,...,<I,x>>>,

where "<.,.>" denotes the usual pairing, there are n copies of I =
\x.x, and "\" denotes lambda. In lambda calculus, this function can be
expressed as

F = \nx.Y(\hn.Znx<I,h(Pn)>)n,

where Y is a fix-point operator, Z is a zero-test, and Pn is the
predecessor of a numeral n.

I chose this function because it is "deep": the de Bruijn index of x
increases with n. It is a simple result that weak head reduction
cannot deepen a term, so weak head reduction cannot normalize this
term (assuming shallow numerals), even after suitable recoding of the
recursion.
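The depth growth is easy to see if one builds F(n,x) directly as an
S-expression, with pairs as conses and I left symbolic. This is only a
sketch outside the lambda calculus, and build-F is my own name:

```scheme
;; F(n, x) = <I, <I, ..., <I, x>>> built as nested conses, with I
;; left as a symbol; each increment of n nests the occurrence of x
;; one level deeper.
(define (build-F n x)
  (if (zero? n)
      x
      (cons 'I (build-F (- n 1) x))))

(build-F 3 'x)  ;; => (I I I . x)
```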

So I want to see what Danvy's partial evaluation technique does with
this function F(n,x) for given n. Studying intensively now. Can you
help?
Nomen Nescio
2009-01-07 22:20:09 UTC
Post by Nomen Nescio
Post by Chung-chieh Shan
The reify-function technique (namely, reduce inside a lambda
abstraction by applying it to a symbol) is also used to great effect
in Danvy's type-directed partial evaluation (POPL 1996).
A challenge to his technique of partial evaluation is the function
F(n,x) = <I,<I,...,<I,x>>>,
where "<.,.>" denotes the usual pairing, there are n copies of I =
\x.x, and "\" denotes lambda. In lambda calculus, this function can be
expressed as
F = \nx.Y(\hn.Znx<I,h(Pn)>)n,
where Y is a fix-point operator, Z is a zero-test, and Pn is the
predecessor of a numeral n.
I chose this function because it is "deep": the de Bruijn index of
x increases with n. It is a simple result that weak head reduction
cannot deepen a term, so weak head reduction cannot normalize this
term (assuming shallow numerals), even after suitable recoding of
the recursion.
So I want to see what Danvy's partial evaluation technique does with
this function F(n,x) for given n.
It turns out that it reifies the correct answer just fine, once a
correct type is supplied -- very nice.

However, this does not constitute strong reduction. An eval of the
reified result would be required to get the strongly reduced lambda
term. This is outside of the lambda calculus fragment of Scheme.
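Concretely, assuming an R5RS-style eval, the round trip would look
something like the sketch below (the binding names are mine):

```scheme
;; The reified term is an S-expression, like the one earlier in the
;; thread.
(define reified '(lambda (g43) g43))

;; Stepping outside the lambda calculus fragment: eval turns the
;; S-expression back into an opaque Scheme closure.
(define back-to-closure (eval reified (interaction-environment)))

(back-to-closure 42)  ;; => 42
```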

I wonder if, for arbitrary given n, Mogensen's "call-by-need lambda
calculus reducer implemented in Chez Scheme" would reduce Fn to strong
normal form.
