Kaz Kylheku <643-408-1753@kylheku.com> writes:
... The code is from just before McCarthy invented the ternary IF,
as a shorthand for a one clause cond: ...
I don't think that McCarthy invented IF as an abbreviation for COND, but
I could be wrong. He certainly _could_ have invented it, but if he did,
it was forgotten by the mid 1970s. Defining IF as a _personal_ macro was common practice for 1970s Lisp programmers, and I don't think IF was
provided as standard by _any_ major Lisp implementation at that time.
(Maybe InterLisp did? I don't recall, and I no longer have an InterLisp manual on my bookshelf.) Scheme had IF _instead_ of COND, but nobody
was using Scheme for anything serious yet. And I don't think the T
project had started at that point.
In fact, when the Lisp Machine group decided to introduce a standard IF
macro into Lisp Machine Lisp, we were a bit worried that another
significant body of Lisp code was already using a slightly different
syntax for IF than the one we wanted to use. Multics Emacs (written in Multics MacLisp) had popularized an IF macro that used an ELSE keyword somehow. We negotiated a deal with the Multics guys (Bernie Greenberg)
where they agreed to change their IF to guarantee that an IF with no
ELSE keyword would work exactly the same way ours did.
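(For readers who haven't seen the relationship: an IF of this kind is just shorthand for a one- or two-clause COND. A rough sketch of the expansion, modeled here in Python with s-expressions as nested lists -- purely illustrative, not any actual Lisp Machine or Multics implementation:)

```python
# Hypothetical sketch: how an IF macro might expand into COND.
# S-expressions are modeled as nested Python lists.

def expand_if(test, then_form, else_form=None):
    """Expand (IF test then [else]) into an equivalent COND form."""
    clauses = [[test, then_form]]
    if else_form is not None:
        # A catch-all T clause supplies the else branch.
        clauses.append(["T", else_form])
    return ["COND"] + clauses

# (IF (NULL X) 0 (CAR X))  =>  (COND ((NULL X) 0) (T (CAR X)))
print(expand_if(["NULL", "X"], 0, ["CAR", "X"]))
```

An IF with no else branch simply expands to a one-clause COND, which is why the "IF with no ELSE keyword" compatibility deal described above was possible.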
This was perhaps the very beginning of the road that eventually led us
to Common Lisp. A bunch of Lisp implementations (Lisp Machine Lisp, NIL
and both branches of MacLisp) all introduced a bunch of new syntax (IF,
LET, DEFSTRUCT, SETF, backquote, sharpsign, etc.) into their standard environment so that individual programmers and programs no longer had to define them themselves.
But I'd love to see evidence that McCarthy arrived at ternary IF before anyone else. He certainly invented COND, where the interesting thing
about COND was that it was an _expression_ rather than a statement, so
we often say that "McCarthy invented the conditional expression". And sometimes that gets shortened to "McCarthy invented the if-expression".
But I don't think that he ever literally wrote "(IF ...)" rather than
"(COND ...)".
I'm not even sure that McCarthy would have seen the advantage of IF over
COND given that what he originally wanted us all to write was actually
"[expr -> expr; expr]"!
Lisp doesn't really have statements _because_ McCarthy invented the conditional expression. That's kind of the point. Other programming languages at the time (e.g. FORTRAN and ALGOL) only had conditional statements. McCarthy invented the conditional expression and thus
created the first expression-only programming language.
Kaz Kylheku <643-408-1753@kylheku.com> writes:
On 2024-04-03, Alan Bawden <alan@csail.mit.edu> wrote:
> Lisp doesn't really have statements _because_ McCarthy invented the
> conditional expression. That's kind of the point. Other programming
> languages at the time (e.g. FORTRAN and ALGOL) only had conditional
> statements. McCarthy invented the conditional expression and thus
> created the first expression-only programming language.
However, conditional expressions ultimately come from math. E.g., for
specifying a discontinuous function:
f(x) = { x, if x >= 0
       { 0, if x < 0
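(In a modern language that piecewise definition reads directly as a conditional expression; a Python rendering, for illustration:)

```python
def f(x):
    # f(x) = x if x >= 0, else 0 -- the piecewise definition as an expression
    return x if x >= 0 else 0

print(f(3), f(-3))  # 3 0
```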
If you think about it, it's actually kind of ignorant to invent a
programming language with imperative if statements, but in which
the math conditional is missing.
If you think like a historian, you don't describe this as "ignorant".
It's just not something that was above the horizon in the mind set of
the time. After all, that mathematical notation you are referring to
isn't something that mathematicians get very formal about.
On 2024-04-03, Alan Bawden <alan@csail.mit.edu> wrote:
Kaz Kylheku <643-408-1753@kylheku.com> writes:
If you think about it, it's actually kind of ignorant to invent a
programming language with imperative if statements, but in which
the math conditional is missing.
If you think like a historian, you don't describe this as "ignorant".
It's just not something that was above the horizon in the mind set of
the time. After all, that mathematical notation you are referring to
isn't something that mathematicians get very formal about.
That is a fair observation; roughly speaking, higher languages first
evolved from that of the machine. Why we have an if /statement/ is that
the machine has testing and branching in its instruction set, which are
also statements. The imperative language that works by jumping around
and shuffling mainly word-sized quantities inside a Von Neumann machine
is an abstraction of machine language, not an abstraction of functions.
The abstraction of machine language isn't ignorant, it's just different.
Paul Rubin <no.email@nospam.invalid> writes:
Alan Bawden <alan@csail.mit.edu> writes:
> McCarthy invented the conditional expression and thus created the
> first expression-only programming language.
I think Church's lambda calculus also had this.
https://en.wikipedia.org/wiki/Church_encoding#Church_Booleans
Yeah, I thought about that while I was composing my previous message, but
my life is finite, so I didn't go there. But this is Usenet, so we are forced to explore every side issue until we all drop dead of exhaustion.
So does Lambda Calculus qualify as a programming language? Well _today_
we would all say yes, because we have examples of purely functional programming languages where evaluation is lazy and so we don't need a
special operator for conditional evaluation. But when McCarthy was
inventing Lisp, nobody thought that pure Lambda Calculus was anything
like a programming language. It's only since then that we've done the
work to make that practical.
So I _could_ argue that in 1958 Lambda Calculus was _not_ a programming language, but today in 2024 Lambda Calculus _is_ a programming language,
even though Lambda Calculus didn't change in any way!
Kaz Kylheku <643-408-1753@kylheku.com> writes:
On 2024-04-03, Alan Bawden <alan@csail.mit.edu> wrote:
> Lisp doesn't really have statements _because_ McCarthy invented the
> conditional expression. That's kind of the point. Other programming
> languages at the time (e.g. FORTRAN and ALGOL) only had conditional
> statements. McCarthy invented the conditional expression and thus
> created the first expression-only programming language.
However, conditional expressions ultimately come from math. E.g., for
specifying a discontinuous function:
f(x) = { x, if x >= 0
       { 0, if x < 0
If you think about it, it's actually kind of ignorant to invent a
programming language with imperative if statements, but in which
the math conditional is missing.
If you think like a historian, you don't describe this as "ignorant".
It's just not something that was above the horizon in the mind set of
the time. After all, that mathematical notation you are referring to
isn't something that mathematicians get very formal about.
It has a status that's midway between being able to define something
using a simple equation, and having to resort to a definition in the
form of a paragraph of words. Realizing that you could tighten that
notation up and use it in a programming language _as a kind of
expression_ is actually a bit of a leap.
Other programming languages at the time (e.g. FORTRAN and ALGOL) only
had conditional statements.
I think lazy evaluation is not needed since the
input to the Church boolean (I hope I have the jargon right) is two
lambda abstractions. The boolean selects one of them, and the result is applied to whatever the next thing is.
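(Paul's point can be sketched in a few lines of Python -- an illustration of Church booleans, not anything from the Lisps under discussion. Because each branch is wrapped in a zero-argument lambda, no lazy evaluation is needed:)

```python
# Church booleans: a boolean is a function that selects one of two arguments.
TRUE  = lambda a: lambda b: a
FALSE = lambda a: lambda b: b

# Conditional evaluation without laziness: pass thunks (zero-argument
# lambdas) and call only the one the boolean selects.
def if_(boolean, then_thunk, else_thunk):
    return boolean(then_thunk)(else_thunk)()

# The else thunk (which would divide by zero) is selected away, never run.
print(if_(TRUE, lambda: "yes", lambda: 1 // 0))
```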
On 2024-04-03, Kaz Kylheku <643-408-1753@kylheku.com> wrote:
On 2024-04-03, Alan Bawden <alan@csail.mit.edu> wrote:
Kaz Kylheku <643-408-1753@kylheku.com> writes:
If you think about it, it's actually kind of ignorant to invent a
programming language with imperative if statements, but in which
the math conditional is missing.
If you think like a historian, you don't describe this as "ignorant".
It's just not something that was above the horizon in the mind set of
the time. After all, that mathematical notation you are referring to
isn't something that mathematicians get very formal about.
That is a fair observation; roughly speaking, higher languages first
evolved from that of the machine. Why we have an if /statement/ is that
the machine has testing and branching in its instruction set, which are
also statements. The imperative language that works by jumping around
and shuffling mainly word-sized quantities inside a Von Neumann machine
is an abstraction of machine language, not an abstraction of functions.
The abstraction of machine language isn't ignorant, it's just different.
But, right, okay; I lost a thought I had some hours earlier about this.
By the time we have a higher level language inspired by math formulas in which you can do A * B + C, and define math-like functions, you would
think that the right synapse would fire between the right two brain
cells, so that a value-yielding conditional would be supplied. When you
have translation of arithmetic formulas to machine language, the scene
is ripe for such an operator. So maybe ignorance is a strong word, but
there is a margin for disappointment.
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
On Wed, 03 Apr 2024 14:15:14 -0400, Alan Bawden wrote:
> Other programming languages at the time (e.g. FORTRAN and ALGOL) only
> had conditional statements.
Algol60 had if-expressions, e.g.
a := if b then c else d
Ah yes, the history is slightly more complicated than I remembered.
Algol 58 did not have conditional expressions. But McCarthy then joined
the Algol committee and he suggested that they add conditional
expressions. And so they do appear in Algol 60.
Kaz Kylheku <643-408-1753@kylheku.com> writes:
... The code is from just before McCarthy invented the ternary IF,
as a shorthand for a one clause cond: ...
I don't think that McCarthy invented IF as an abbreviation for COND, but
I could be wrong.
I use this in Python, to try to avoid that abortion that is the Python conditional expression. Instead of, say,
a = x / y if y != 0 else 0
I would do
a = (lambda : 0, lambda : x / y)[y != 0]()
I am also surprised that True "equals" 1 in this context and False
0 (rather un-C-like).
Could you please give some pointers about the rationale (or further documentation) here?
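(One pointer on the rationale: in Python, bool is a subclass of int, with True behaving as 1 and False as 0, so a boolean works anywhere an index does. A quick check:)

```python
# In Python, bool is a subclass of int: True behaves as 1, False as 0.
print(isinstance(True, int))   # True
print(True == 1, False == 0)   # True True
print(("no", "yes")[3 > 2])    # yes -- the boolean indexes the tuple
```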
On Wed, 03 Apr 2024 15:29:52 -0700, Paul Rubin wrote:
I think lazy evaluation is not needed since the
input to the Church boolean (I hope I have the jargon right) is two
lambda abstractions. The boolean selects one of them, and the result is
applied to whatever the next thing is.
I use this in Python, to try to avoid that abortion that is the Python conditional expression. Instead of, say,
a = x / y if y != 0 else 0
I would do
a = (lambda : 0, lambda : x / y)[y != 0]()
This is from McCarthy's "History of Lisp", describing FORTRAN-era practice:

... and it was natural to invent a function XIF(M,N1,N2) whose value
was N1 or N2 according to whether the expression M was zero or not.
The function shortened many programs and made them easier to
understand, but it had to be used sparingly, because all three
arguments had to be evaluated before XIF was entered, since XIF was
called as an ordinary FORTRAN function though written in machine
language.
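(The hazard described here -- all three arguments evaluated before XIF is entered -- is easy to reproduce in any eager language. A Python illustration, keeping XIF's zero/nonzero convention; the code is of course not McCarthy's:)

```python
def xif(m, n1, n2):
    # Like FORTRAN's XIF: an ordinary function, so by the time we get
    # here, m, n1, and n2 have ALL already been evaluated by the caller.
    return n1 if m == 0 else n2

y = 0
# xif(y, 0, 1 / y) would raise ZeroDivisionError: 1 / y is evaluated
# eagerly even though the m == 0 branch would have been taken.
# A true conditional expression evaluates only the chosen branch:
a = 0 if y == 0 else 1 / y
print(a)  # 0
```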
It's a common misconception that McCarthy was trying to turn Lambda
Calculus into a programming language. ... He added LAMBDA (and LABEL) because he needed LAMBDA in order to define recursive functions, but
as he himself often admitted, he didn't really understand Lambda
Calculus, he just needed the notation.
Paul Rubin <no.email@nospam.invalid> writes:
Alan Bawden <alan@csail.mit.edu> writes:
It's a common misconception that McCarthy was trying to turn Lambda
Calculus into a programming language. ... He added LAMBDA (and LABEL)
because he needed LAMBDA in order to define recursive functions, but
as he himself often admitted, he didn't really understand Lambda
Calculus, he just needed the notation.
I see, yes, and this is confirmed by his History of Lisp article. His
Wikipedia biography also surprised me a bit. For some reason I had
thought of him as an academic mathematical logician who later somehow
got involved with computers, but it was more like the other way around.
Thanks.
His PhD was in mathematics---differential equations. He would think up things like a simple function that is continuous but nowhere
differentiable on the real line [1]. He was a mathematician by all
accounts. It's pretty hard to remove mathematics from computer science.
The culture seems to be that if a mathematician contributes more to the
field of computer science, he is called a computer scientist.
I also agree that he was a logician: he worked on a mathematical basis
for computer science. A mathematical basis for computer science must be classified as logic. He was interested in proving programs were
correct. His idea of a conditional expression is precisely to write mathematical functions in high precision. Mathematics in high precision---that's a logician.
The creation of LISP by John McCarthy was surely not, at first, with
the intention of making a programming language. In fact, it was Steve
Russell, his student at the time, who first had the idea of implementing
EVAL, and did it. I believe McCarthy was even somewhat surprised
because, at the time, he did not think of LISP as having that kind of
purpose.
Julieta Shem <jshem@yaxenu.org> writes:
I also agree that he was a logician: he worked on a mathematical basis
for computer science. A mathematical basis for computer science must be
classified as logic.
Mathematical logic is an area that deals with topics like proof theory.
Not to diminish McCarthy in any way, but it sounds like he didn't work
in that particular area.
I also remember McCarthy in the late 1960s being obsessed with logic
and philosophy and randomly spouting off discussions such as how
processing of
Sir Walter Raleigh was the author of "The Lie"
Did the Queen know that Raleigh wrote "The Lie"?
might produce
Did the Queen know TRUE?
Jeff Barnett <jbb@notatt.com> writes:
might produce
Did the Queen know TRUE?
This is a topic in philosophy of language (I don't recall what it is called). Mathematical logic is completely different. By that, I mean you could
go to any university math library in the 1960s (when McCarthy was
active) or today, and find shelves full of textbooks with "mathematical logic" in their titles. Those would all be about facets of the same
subject. Noted authors in it were people like Church (inventor of
lambda calculus), Curry (currying and the Haskell language are named
after him), Tarski, and so on. McCarthy's work was in tangentially
related areas.
McCarthy was super smart and probably could have read those textbooks
and gotten to understand the subject fairly easily, but not that many mathematicians were into it back then. So I don't think he spent his
time that way. That's what I mean by saying he worked in different
areas.
I only met McCarthy once, and unfortunately, the only topic I remember
from the conversation had something to do with Chinese food.
I met him in the early/middle 1960s when DARPA gave us a contract to do
a thing called Lisp 2 - a Lisp system with extended language facilities
and borrows from Algol.
[Lisp 2] That project was considered a “failure”, but I wonder why?
Did it turn out that getting rid of the (ahem) quirky Lisp syntax in
fact got rid of some of its expressive power, too?
SPITBOL (Speedy Implementation of SNOBOL, where SNOBOL was StriNg
Oriented symBOLic language) was a quite amazing 1970s(?) implementation
of a language that could be seen as an antecedent of something like
Perl:
https://github.com/spitbol
Had a quick look at that. I would say the whole SNOBOL family has been
left in the dust by Perl.
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
Had a quick look at that. I would say the whole SNOBOL family has been
left in the dust by Perl.
SNOBOL/SPITBOL in the 1970s were sort of like Perl in the 1990s, I
think. But, they are mostly of historical interest now. I think by the
time Spitbol ran on anything resembling a personal computer or
workstation, it was already history. It ran on old mainframe OS's and
there wasn't much overlap.
... so there was a lot of interest in the
1960s in including some sort of pattern match and reconstruction
primitives into programming languages.
This stuff simply exacerbated our limited memory problems on old
machines, no matter how interesting it might be.
Was it a matter of timing, then? Perl came along at just the point where
the hardware was powerful enough to take the complexities of regular expressions in its stride, so that’s when the whole idea really took off.
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
Was it a matter of timing, then? Perl came along at just the point
where the hardware was powerful enough to take the complexities of
regular expressions in its stride, so that’s when the whole idea
really took off.
Unix had regular expressions because Thompson's QED editor on some
weird old GE(?) minicomputer had had them. It compiled the regexes
into machine code, iirc. Perl was sort of Awk on steroids and Awk
also had regexes. I think regexes per se were never very cpu or
memory hungry.
Snobol and Spitbol didn't have regexes. They did pattern matching by
brute force backtracking. By that era though, computers had much more
memory than they did when Lisp 2 was happening.
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
Was it a matter of timing, then? Perl came along at just the point where
the hardware was powerful enough to take the complexities of regular
expressions in its stride, so that’s when the whole idea really took off.
Unix had regular expressions because Thompson's QED editor on some weird
old GE(?) minicomputer had had them. It compiled the regexes into
machine code, iirc. Perl was sort of Awk on steroids and Awk also had regexes. I think regexes per se were never very cpu or memory hungry.
Snobol and Spitbol didn't have regexes. They did pattern matching by
brute force backtracking. By that era though, computers had much more
memory than they did when Lisp 2 was happening.
SNOBOL's patterns were much more "procedural" than REs. For example,
you could implement Russell's paradox in SNOBOL: a pattern that matches
only those patterns that don't match themselves.
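(The contrast with regexes can be sketched: a toy matcher in Python that, like SNOBOL, tries alternatives left to right by brute-force backtracking rather than compiling a regex. Purely illustrative -- nothing like real SNOBOL patterns:)

```python
# Toy SNOBOL-flavored pattern matching by brute-force backtracking.
# A pattern is a list of elements; each element is either a literal
# string or a tuple of alternatives, tried left to right.

def match(subject, pattern, pos=0):
    """Return True if pattern matches subject starting at pos."""
    if not pattern:
        return True
    first, rest = pattern[0], pattern[1:]
    alternatives = first if isinstance(first, tuple) else (first,)
    for alt in alternatives:
        if subject.startswith(alt, pos):
            # Tentatively accept this alternative; if the rest of the
            # pattern fails, fall through and try the next alternative.
            if match(subject, rest, pos + len(alt)):
                return True
    return False

# "b" matches first but leaves "arely", so the matcher backtracks to "ba".
print(match("barely", [("b", "ba"), "rely"]))  # True
```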
* Paul Rubin <875xwgzal9.fsf@nightsong.com> :
Wrote on Wed, 17 Apr 2024 00:56:18 -0700:
Snobol and Spitbol didn't have regexes. They did pattern matching by
brute force backtracking. By that era though, computers had much more
memory than they did when Lisp 2 was happening.
Languages in the next decade didn't seem to mind not having regexes either: REXX, ICON, REBOL (this one is much later).
Mathematicians seem to be scared of paradoxes. But if you realize that
they are just equivalent to endless loops in a computation, which is something we deal with all the time in Comp Sci, then they are no longer
so scary.
Nowadays, say anything about “string processing”, and regexes are automatically assumed to be part of the mix.