Why Wolfram (Mathematica) did not use Lisp

A Usenet post by Kent M Pitman on comp.lang.lisp - Fri, 8 Nov 2002 23:29:04 GMT

Subject: Re: mathematica [Did Wolfram know Macsyma and/or Lisp?]


o...@cs.yorku.ca (ozan s. yigit) writes:

> [fateman's interesting note on wolfram, smp, mathematica and their relation
> (or lack of) to macsyma not repeated]

> thanks for the historic details. it certainly clarifies some things.
> [next time i see hugh, i'll ask him about SMP. i would have thought
> he would fix any issues with numeric representations. he knows better]

I'm not sure this is precisely the forum in which to log this fact,
but since Fateman is telling historical stories I wanted to add one.
I was in Pasadena at one point, visiting a friend at Caltech, and
popped in to see Wolfram around the time he was gearing up to write
SMP, I think.  If I recall, he was 19 at the time.  People around me
informed me that though he was very young, or maybe because of it, he
was on track to win a Nobel prize of some sort.  I myself worked for
the MIT Macsyma group at the time as an undergrad, perhaps my first
senior year, so I think I must have been a year or two older than him.

He told me that Lisp was "inherently" (I'm pretty sure even after all
this time that this was his exact word) 100 times slower than C and
therefore an unsuitable vehicle.  I tried to explain to him that this
was implausible: that he could probably construct an argument for a
factor of 2-5 that he could at least defend in some prima facie way,
but that 100 was ridiculous.  (This was in the heyday of Maclisp,
when it had been shown to trump Fortran's speed, so probably even 2-5
could be refuted, but at least taking a position in that range would
have left him with some defenses in a debate.)  He didn't cite
anything credible that I recall to back up this factor-of-100 problem.

I tried to explain why, and it was not clear to me why a person smart
enough to "maybe win a Nobel prize" couldn't entertain a discussion of
the simple set of concepts involved, whether or not he was schooled in
computation.  It was quite frustrating, and he seemed impatient.

He in fact did not purport to be adequately competent on the matter of
computation at the time, but he pointed to a stack (literally) of
books (I'd say about a foot high) including the Knuth books, the
compiler book with the dragon on it, and a number of other really
standard texts.  He then said, "I'm going to read these and then I'll
know as much as you."  (Again, I'm pretty sure even now that this is
pretty close to an exact quote.  But whether it's exact or not, what
struck me was the incredible arrogance of the remark.)  The point
seemed debatable, but I didn't bother to debate it.  He seemed dead
set on his goal, and once he had reached the point where he felt he
could use books he had not yet read as a credential, there seemed to
be no deflecting him.

My real concern, of course, was not that he was using optimized data
structures so much as that he seemed on target to reintroduce
numerical error back into a world that we had worked hard to make
'exact' (Macsyma used bignums from Lisp) or at least 'arbitrarily
exact' (Macsyma had a derived type called 'bigfloat' that was
internally a pair of bignums, acting more or less as a ratio but with
lots of other hidden bits to assure that any decimalization had enough
bits to be precise to a given number of digits).  Stephen's aim seemed
to be to sacrifice correctness for speed.
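
(For readers who want the distinction made concrete, here is a minimal
sketch in modern Common Lisp, standing in for the Maclisp of that era;
it is purely illustrative and not Macsyma's actual bigfloat code:

    ;; Exact rational arithmetic: bignum-backed integers and ratios
    ;; never silently lose bits, no matter how large the values grow.
    (+ 1/3 1/3 1/3)          ; => 1 (exactly)
    (expt 10 25)             ; => 10000000000000000000000000, a bignum
    (= (+ 1/10 2/10) 3/10)   ; => T, because the ratios are exact

    ;; Hardware floating point trades that exactness for speed.
    (+ 0.1d0 0.2d0)          ; => 0.30000000000000004d0

A bigfloat in the sense described above decimalizes such an exact
value only on output, carrying enough guard bits to be precise to the
requested number of digits.)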

He seemed clear that the error was not a problem for him.  I'm not
a domain expert and I can only assume that he did know what he was
doing when he accepted an approximation over an exact value.  Probably
HE had a good reason.  But I worried for others, and recall telling
him so.  My specific recollection of this part of the discussion is
less clear, but I'm pretty sure his response was that what tools
others chose to use or not use was not his concern.

There's a fine ethical line here between simply making a tool and
actively promoting it, but I'll not expound on that in detail.
Rather, I'll just say that this line concerned me.  The problem I
have, and had then, is that other users, not him, might NOT understand
that this trade-off had been made and so might not be making an
informed choice.  People tend to say "well, if it's good enough for
him" rather than "well, if it's good enough for the purpose he had in
mind," since presumably there are other applications Stephen or anyone
could have come up with that would not have tolerated error.

But ah well.  History marched on and what happened happened.

My remarks are just personal memories and opinions, offered just as a
way to add historical perspective for those who care about such
things.  I think it's relevant to the Lisp community because it
relates to Stephen's departure from Macsyma (and implicitly from
Lisp)--he had previously been a Macsyma user, and I'm pretty sure he
was more annoyed with some specific Macsyma program (one that probably
used some inefficient data structures that the underlying Lisp had
required it to use)... He seemed annoyed at Lisp, but I didn't have
the sense that he'd done enough of a survey of other Lisp programs or
of the language itself to really have a believable opinion on this.  Then
again, I'm sure he's not the only person in Lisp history to see one
bad program and then overgeneralize...

(Source and copyright: http://groups.google.com/group/comp.lang.lisp/msg/f3b93140c2f2e922)