Discussion:
Lambda calculus for dummies?
kj
2009-11-04 11:28:02 UTC
I've taught myself many programming languages before, but I'm having
a pretty hard time with OCaml. I think the reason for this is that
to learn OCaml I need to learn a whole new way of thinking about
computation.

I recently read a very brief historical sketch that shed some light
on my difficulties. My understanding from it is that Alonzo Church
and Alan Turing independently came up with ways to formalize
computation. Both systems were in some deep sense equivalent (i.e.
neither could compute something that the other one couldn't), but
led to different ways of construing programs. Turing's way led
to "von Neumann machines", and to languages like FORTRAN, Pascal,
and C. Church's way led to languages like ML and Haskell.

I surmise that at least part of my difficulty with learning OCaml
is that I never got a solid foundation on Church's model of
computation, whereas I did get a pretty thorough foundation on the
Turing/von Neumann model, and now my overtrained Turing/von Neumann
brain keeps tripping over a radically different conceptual topography.
So, in a sense, I'm worse off than if I were completely new to
programming...

I know that there are plenty of books out there on Church's lambda
calculus, but what I would *really* like to read is a detailed,
rigorous comparison of Church's and Turing's formalizations of
computation, showing not only their equivalence, but hopefully also
their implications for language design.

Does such exist?

TIA!

kynn
Dirk Thierbach
2009-11-04 14:35:50 UTC
Post by kj
I've taught myself many programming languages before, but I'm having
a pretty hard time with OCaml. I think the reason for this is that
to learn OCaml I need to learn a whole new way of thinking about
computation.
Yes :-) That's something everyone who switches from imperative to
functional programming has experienced.
Post by kj
I recently read a very brief historical sketch that shed some light
on my difficulties. My understanding from it is that Alonzo Church
and Alan Turing independently came up with ways to formalize
computation. Both systems were in some deep sense equivalent (i.e.
neither could compute something that the other one couldn't),
It's not so deep: one usually shows the equivalence of models of
computation by simulating one model in the other. That often involves
a bit of handwaving and skipping over details, i.e. one shows
that the more sophisticated programming features can be implemented
in the simpler model, and then essentially writes a simulator in
a high-level language.
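To make the lambda-calculus side of such a simulation concrete, here is a minimal sketch in OCaml of the standard Church-numeral encoding, where natural numbers and addition are built from nothing but functions (the names `zero`, `succ`, `add`, and `to_int` are just illustrative choices for this example):

```ocaml
(* Church numerals: a number n is encoded as the function
   that applies f to x exactly n times. *)
let zero = fun _f x -> x
let succ n = fun f x -> f (n f x)

(* Addition: apply f m times, then n more times. *)
let add m n = fun f x -> m f (n f x)

(* Convert a Church numeral back to an ordinary int for inspection. *)
let to_int n = n (fun k -> k + 1) 0

let () =
  let two = succ (succ zero) in
  let three = succ two in
  Printf.printf "%d\n" (to_int (add two three))  (* prints 5 *)
```

Nothing here uses numbers, state, or control flow beyond function application, which is the whole point: the pure lambda calculus already computes.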
Post by kj
but led to different ways of construing programs. Turing's way
led to "von Neumann machines", and to languages like FORTRAN,
Pascal, and C. Church's way led to languages like ML and Haskell.
I surmise that at least part of my difficulty with learning OCaml
is that I never got a solid foundation on Church's model of
computation, whereas I did get a pretty thorough foundation on the
Turing/von Neumann model,
I don't know how you learned to program, but in my experience, people
rarely first learn to program a Turing machine and then use
that knowledge to learn an imperative programming language. It's the
other way round: they first learn one or several imperative programming
languages and then learn about TMs.

And it's perfectly fine to do it the same way with respect to functional
programming and lambda calculus.
Post by kj
and now my overtrained Turing/von Neumann brain keeps tripping over
a radically different conceptual topography. So, in a sense, I'm
worse off than if I were completely new to programming...
Again, that's something many people experience. What you need to re-learn
is to avoid state, to use functions as basic building blocks and means
of abstraction, to use algebraic data types, and to think in terms
of whole operations on data instead of the step-by-step
approach of imperative programs. Lambda calculus isn't going to help
you here.
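That shift can be illustrated with a small OCaml sketch (the type and function names here are made up for the example): an algebraic data type, no mutable state, and one operation over a whole list instead of an index-driven loop with an accumulator variable:

```ocaml
(* An algebraic data type: the structure of the data is in the type. *)
type shape =
  | Circle of float           (* radius *)
  | Rect of float * float     (* width, height *)

(* Pattern matching handles each case; no flags, no mutation. *)
let area = function
  | Circle r -> Float.pi *. r *. r
  | Rect (w, h) -> w *. h

(* A whole-list operation: one fold expresses the entire computation. *)
let total_area shapes =
  List.fold_left (fun acc s -> acc +. area s) 0.0 shapes

let () =
  Printf.printf "%.2f\n" (total_area [ Circle 1.0; Rect (2.0, 3.0) ])
```

The compiler will also warn if a `match` forgets a case of `shape`, which is part of what makes algebraic data types such a useful building block.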
Post by kj
I know that there are plenty of books out there on Church's lambda
calculus, but what I would *really* like to read is a detailed,
rigorous comparison of Church's and Turing's formalizations of
computation, showing not only their equivalence, but hopefully also
their implications for language design.
Does such exist?
I don't think such a comparison would make much sense. You can treat
those two in separation, and the implications for language design
are somewhat more indirect.

What I can recommend is the Turing Award lecture paper "Can Programming
Be Liberated From the von Neumann Style?" by John W. Backus, especially
the beginning. It's somewhat dated, but still relevant, and should give
you a few ideas how to think differently. Google should be able to
find a copy.

For OCaml itself, I can't comment on OCaml books, because I haven't
read any :-) I learned the language, like every language I learned,
by writing programs in it. So pick some set of exercises (e.g. Project
Euler), and just start writing. You'll get a feel for it eventually,
though it will take some time.

- Dirk
Benjamin L. Russell
2009-11-09 11:25:24 UTC
Post by kj
[...]
I surmise that at least part of my difficulty with learning OCaml
is that I never got a solid foundation on Church's model of
computation, whereas I did get a pretty thorough foundation on the
Turing/von Neumann model, and now my overtrained Turing/von Neumann
brain keeps tripping over a radically different conceptual topography.
So, in a sense, I'm worse off than if I were completely new to
programming...
I actually experienced a similar problem, albeit inversely: I had
difficulty in trying to learn Smalltalk after learning Scheme and some
Haskell (I'm slightly comfortable with Scheme, but am actually still
learning Haskell and Smalltalk).

Eventually, I decided that the only way to learn Smalltalk comfortably
after much exposure to Scheme was somehow to start off again with a
blank slate. While a completely blank slate was not feasible, I was
able mentally to trace back my learning path to before I entered
college, before I was first exposed to Scheme and Haskell, and to
remember what programming was like before I ever learned those
languages, using (to express the idea semi-facetiously) a mental
metaphor of a Scheme continuation: I restored the
self-learning-programming-continuation at the point just before
entering college, before any exposure to Scheme, Haskell, or the
lambda calculus.

I was fortunate enough to be living in the same town as I had lived
from 1985 to 1989, before journeying to college. Since I still did
not feel any different about programming, to deepen the trance, so to
speak, I went further and started eating the same foods, drinking the
same kinds of drinks, and walking the same roads as then. When I
managed to find one hidden road that I had often walked in my teenage
years whenever I had felt unhappy and wanted to take a break from the
rest of the world, something clicked mentally, and I was able to
remember what life felt like back then: I returned to who I was back
then.

Then I resumed programming as I would have done in 1989, in N-80 BASIC
(a dialect of line BASIC that I had used on the NEC PC-8001 mkII
computer circa 1982). I quickly discovered that I could no longer
stand programming in that language: any program that would not
fit on the screen would quickly turn into spaghetti code and require
reams of printouts to follow, and I simply didn't have the patience to
trace reams of printouts of code when I had a computer sitting in
front of me idle.

Using my 1989 mindset, before exposure to the lambda calculus and
before I had conquered math phobia, I then searched around to discover
which language I would have chosen had I never gone to college
nor conquered math phobia.

I discovered that I didn't like Scheme in that mindset because the
thinking involved for most introductory books to computer science
using Scheme was too mathematical, and because I could not find enough
examples of multimedia-related programs (in 1989, I had been very
interested in the FM TOWNS, a Fujitsu personal computer with
relatively advanced (for its time) sound and color graphics
abilities).

To my own surprise, I did find Haskell interesting, but only because I
came across a copy of _The Haskell School of Expression,_ a book by
Paul Hudak that used multimedia examples to illustrate Haskell.
However, I didn't like having to review trigonometry (which I had long
since forgotten) in order to solve some of the exercises therein, and
looked for something that did not require reviewing trigonometry.

Then I found Smalltalk. I discovered that one implementation (Squeak)
actually allowed coding within a mini-OS, complete with color graphics
and sound. This environment reminded me of the environment that I had
seen on the FM TOWNS in 1989. No need to review trigonometry. No
need to (re-)conquer math phobia. No need to trace reams and reams of
printouts manually, either.

That was enough finally to get me started with Smalltalk. Eventually
I recovered my interests in Haskell and, to a certain extent, Scheme
(although my Scheme interests took a dramatically different path:
I soon diverged from PLT Scheme because I now felt differently
about Scheme, and because of an argument with one person
on comp.lang.scheme about a bug in the file explorer of DrScheme that
appeared only on the Japanese version of Windows XP (the other person
claimed that the bug appeared only on my computer, and I claimed that
it appeared on most computers running the Japanese version of Windows
XP), and I wound up switching to Gauche and Chez Scheme and learning
continuations--in a sense, approaching Scheme from the other side of
the coin--but that is a different story).

Actually, I summarized this experience in a short blog entry back
then; the reference is as follows:

Paradigm Shift: Back to the Past, and No Small Talk About Smalltalk
http://dekudekuplex.wordpress.com/2009/08/25/paradigm-shift-back-to-the-past-and-no-small-talk-about-smalltalk/
Post by kj
I know that there are plenty of books out there on Church's lambda
calculus, but what I would *really* like to read is a detailed,
rigorous comparison of Church's and Turing's formalizations of
computation, showing not only their equivalence, but hopefully also
their implications for language design.
I don't think that would help. At first, I tried to compare the
functional and message-passing paradigms, but that did nothing to help
me feel more comfortable with the latter. What you probably really
need is less information, not more.

Try to remember how you felt before you first started programming, and
pretend that none of the imperative programming languages exist. Start
out with something ridiculously inconvenient, such as line BASIC, and
discover how incredibly tedious tracing line numbers becomes when
there is no structure, when each line of code requires a different
line number to be jumped to, when no line numbers are allowed past
line number 65529, and when every time you make a significant change,
you need to renumber your line numbers and watch out that they don't
go past 65529. After 3 days, your mind will start squirming in agony,
and you'll search for something, anything, less tedious, anything that
will allow you to focus on what you really want to do, instead of
every single petty detail about how to do it without shooting yourself
in the foot. Then, go look at such languages as Scheme, Haskell, and
O'Caml. Believe me, you'll feel very happy to be able to program in
functional programming languages then.

-- Benjamin L. Russell
--
Benjamin L. Russell / DekuDekuplex at Yahoo dot com
http://dekudekuplex.wordpress.com/
Translator/Interpreter / Mobile: +011 81 80-3603-6725
"Furuike ya, kawazu tobikomu mizu no oto."
-- Matsuo Bashō