Re: Mathematica and Lisp
On Saturday, February 2, 2013 1:15:33 AM UTC-5, Richard Fateman wrote:
> On 1/31/2013 5:46 PM, John Doty wrote:
> > ....
> >
> > >> (RJF WROTE)
> > >> I wrote a parser for it (in Lisp) some time ago, and in my experience
> > >> one of the best ways to really learn a language is to "implement" it.
> >
> > > (JD WROTE) That's crazy. Implementing a language teaches you nothing
> > > about its true strengths and weaknesses. To learn that, you must
> > > *use* it for real-world problems, not the toy problems of theoretical
> > > computer science.
> >
> > You miss the point. Let me try to clarify this. Mathematica consists
> > of several parts. Among these:
> >
> > a specialized language and implementation of a graphics package.
> > an implementation of routines for Solve, Reduce, Eliminate.
> > an expression "simplifier" e.g. FullSimplify and friends.
> > an implementation of significance arithmetic (bigfloats).
> > ....
> >
> > There is also a programming language, which appears to have no
> > separable name and so it too seems to be called Mathematica. One can
> > imagine this language stripped of all the application stuff specific
> > to mathematics. Apparently Stephen Wolfram has thought about how this
> > could be separately sold, though has refrained, so far as I know,
> > from actually doing this.
> >
> > This programming language has features, none of which would be called
> > theoretical computer science by a computer scientist. Of these
> > features there are many that the vast majority of users are either
> > unaware of, or misunderstand. This is probably typical for
> > programming languages and naive users.
> >
> > In addition to the mysteries of the math application and its
> > implementation, some of Mathematica's LANGUAGE features are
> > non-obvious and the misunderstanding is promoted by overly-simplified
> > "explanations" in the documentation.
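[As an aside for readers unfamiliar with the scoping constructs debated below: Mathematica's Module introduces fresh, lexically scoped local symbols, while Block temporarily rebinds an existing symbol for everything evaluated during its body (dynamic scope). A rough Python analogy of that distinction — illustrative only, not Mathematica itself, and the function names are invented:]

```python
# Rough analogy: Module is like a fresh lexical local; Block is like
# temporarily rebinding a global (dynamic scope), visible to every
# function evaluated in between.

x = 10  # a "global symbol"

def downstream():
    # Reads the global x, whatever it currently is.
    return x + 1

def module_style():
    # Module-like: a fresh local name; downstream() still sees x == 10.
    x_local = 99
    return x_local, downstream()

def block_style():
    # Block-like: temporarily rebind the global, restore on exit.
    global x
    saved = x
    x = 99
    try:
        return downstream()  # now sees x == 99
    finally:
        x = saved

# module_style() -> (99, 11); block_style() -> 100; afterwards x == 10
```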
> > In some cases the obvious and plausible understanding for an expert
> > in programming languages, based on an understanding of what other
> > languages do, is wrong. Now such an expert might venture to say that
> > such "features" were bugs and should be corrected. But one learns
> > that sometimes putting a hat and a beard on a bug makes it into a
> > feature.
> >
> > As an example, compare Module and Block. Do you think you really
> > understand Hold, Defer, Evaluate, UpSetDelayed? Do you think that I
> > do, after implementing them? (Actually, I did not implement Defer,
> > which was introduced in version 5.0)

Well, I know from private communication that you do *not* understand
UpSetDelayed in any useful way, since you don't understand what to use
it for. Implementing it has taught you nothing. But if you were to read
the documentation...

> > For purposes of discussion here I would include in the programming
> > language the surface syntax and internal representation of programs,
> > the fundamental parts of naming, binding, function evaluation,
> > matching, and integer arithmetic.
> >
> > I expect that few programmers, even if they have written pages of
> > code, understand all the binding rules and evaluation orders for
> > Rules and Functions and matching and such.
> >
> > Often it does not matter if you have an unlimited store of different
> > names, and never use the same name twice. But sometimes it does
> > matter, and people write to this newsgroup with mysterious code.
> >
> > <snip>
> > RJF reports a bug, in version 8, not version 9.
> >
> > >> returns False.
> >
> > > (JD writes)
> > > Discovering and reporting this kind of bug is useful, but you're
> > > still in the realm of toy problems. Such bugs exist in many useful
> > > codes, and are only a minor source of error.
> >
> > Well, there are 3 ways of reading this.
> >
> > 1. You are unlikely to encounter this bug.
> > or
> > 2.
> > If you encounter this bug, it will bother you only in a minor way.
> > or
> > 3. Most programs have bugs and it doesn't matter.
> >
> > regarding 1. Read about the Therac-25, in which a bad software
> > design was implemented. It hardly affected anyone who was treated
> > with that X-ray machine. It only killed 3 people.
> >
> > regarding 2. Read about how arithmetic failure, specifically
> > the conversion of a 64-bit floating point quantity to a 16-bit
> > signed integer caused the crash of an Ariane 5 rocket, losing
> > $500 million. Fortunately unmanned.

Considered as software failures, both of these occurred in application
code and were not the result of programming language deficiencies. Quite
different from your nitpicking. But seeing these as software failures
obscures the real issues. All non-trivial software has bugs. In this
light, both of these can be seen fundamentally as systems engineering
failures.

In the Therac-25 case, the engineers had removed a hardware safety
interlock that prevented earlier models from operating when dangerously
misconfigured. There were other engineering errors, but that was the
critical one. Even if the software could be made perfect, a device like
the Therac-25 needs such hardware interlocks because computers
themselves are not perfectly reliable.

At least the Ariane engineers recognized that their computer hardware
was not perfectly trustworthy. So, they arranged that if the guidance
computer detected a fault in its execution it would halt. Then, hardware
would pass control to the backup, which was an identical unit running
identical software with identical inputs. Oops. This approach guaranteed
that any fault caused by a software bug or hardware design error would
automatically shut down all guidance. It makes little sense in this case
to blame the specific software bug that triggered the failure: the
system was designed to fail.
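[For the curious: the Ariane 5 failure mode was a narrowing conversion, a 64-bit float stuffed into a 16-bit signed integer. A minimal Python sketch of the hazard and the defensive alternative; the function names here are invented for illustration:]

```python
# Narrowing a 64-bit float to a 16-bit signed integer, two ways.

INT16_MIN, INT16_MAX = -2**15, 2**15 - 1

def unchecked_narrow(v: float) -> int:
    # Mimics a blind cast: keep only the low 16 bits, as a C-style
    # truncating conversion would on a typical machine.
    n = int(v) & 0xFFFF
    return n - 0x10000 if n > INT16_MAX else n

def checked_narrow(v: float) -> int:
    # Defensive version: refuse values outside the int16 range.
    n = int(v)
    if not (INT16_MIN <= n <= INT16_MAX):
        raise OverflowError(f"{v} does not fit in a signed 16-bit integer")
    return n

# unchecked_narrow(40000.0) silently yields negative garbage (-25536),
# while checked_narrow(40000.0) raises instead of flying on bad data.
```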
Contrast that with the Apollo LM computer, where software or hardware
upset would prompt a restart, with just enough state maintained in
hardware to allow the computer to continue controlling the descent.
Good, robust engineering: it saved the Apollo 11 landing. In that case,
the "bug" was actually in a checklist, but from a systems engineering
perspective that's part of the software.

> > regarding 3. Programming languages with mysterious and undocumented
> > semantics (as well as poor debugging features) are likely to make
> > validation more difficult than otherwise.

To you, Mathematica is mysterious, because you fight it rather than
using it. And you haven't even read the documentation you complain about
(if you had, you'd understand what UpSetDelayed is for). I don't find
Mathematica to be especially mysterious relative to its capabilities.

> > >> (RJF) If I were using a computer to do something that required
> > >> correct answers for, say, life safety, like building a bridge, I
> > >> would follow WRI's advice and not use Mathematica.
> >
> > > (JD) I use Mathematica in the creation of designs for space flight
> > > hardware. But, of course, I don't *only* use Mathematica. It's
> > > most useful for exploring ideas ahead of detailed analysis with
> > > more specialized software. But in my business counting on
> > > unverified calculation, regardless of the source, is asking for
> > > trouble.
> >
> > If you find Mathematica useful, fine. If you were using it to (for
> > example) generate code for real-time embedded processors for space
> > flight controllers, I would hope you would be very aware of the
> > Ariane 5 and similar disasters.

I am aware of those, as well as many that you've never heard of. And I
have multiple viewpoints from which to understand them. Preventing these
is often a matter of several layers of "even-if". "We've verified X, but
even if we're wrong, there's Y to prevent disaster. And even if Y fails,
Z can take over."
> > There is, however, an underlying issue here. That is, it is somehow
> > OK for Mathematica to have bugs because -- its result would be
> > independently verified, when it matters.

Not all bugs are of equal importance. An error of 5.5E-79 in a Bessel
function is very unlikely to cause trouble in a practical application.
I've been using Mathematica to do practical work since version 1, and
I've never encountered a bug in its numerics.

> > Imagine someone doing some speculative computation in Mathematica,
> > and exploring some actually non-existent physical phenomenon which
> > was predicted because of numerical errors.

Crazy results from numerical codes are a normal occurrence, so the
consumer of the results must beware. However, these are rarely due to
the programming language or library, whether the code is in Mathematica
or some other language. Every non-trivial numerical model depends on
incompletely understood assumptions and approximations. Even for a
venerable and relatively well-understood code like SPICE3, tuning the
integrator parameters is a bit of a black art, and insane results are to
be expected if you get those wrong for your circuit. I don't find
Mathematica to be unusually hazardous here.

> > This is pretty far afield from the original question which I think
> > was somehow... is Mathematica somehow Lisp-like.... should I learn
> > Lisp...

I'm unusual in that I write practical engineering code in Scheme (a Lisp
dialect), along with code for both science and engineering in
Mathematica. I actually *use* both languages, rather than fighting them,
so I think I have a fair amount of insight here. From that viewpoint, I
would tell the OP that the resemblance of Mathematica and Lisp is
superficial: Mathematica is actually closer to "sed". And perhaps that's
the root of your problem as well. Mathematica is not Lisp: Lisp design
prejudices are irrelevant.

> > Anyone can obviously use whatever tools float your boat/rocket ship.
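[The "closer to sed" remark above refers to rule-based rewriting: like sed, Mathematica works by repeatedly applying pattern -> replacement rules to an expression until nothing more matches (compare Mathematica's `expr //. rules`). A toy Python sketch of that idea, using textual regex rules; the names are invented for illustration:]

```python
import re

def rewrite_fixpoint(expr: str, rules: list[tuple[str, str]]) -> str:
    # Apply each (pattern, replacement) regex rule repeatedly until the
    # expression stops changing -- a fixpoint, like sed in a loop or
    # Mathematica's ReplaceRepeated.
    while True:
        new = expr
        for pat, rep in rules:
            new = re.sub(pat, rep, new)
        if new == expr:
            return expr
        expr = new

# Simple algebraic cleanups expressed as purely textual rules:
rules = [
    (r"\+0\b", ""),   # x+0 -> x
    (r"\b1\*", ""),   # 1*y -> y
]
# rewrite_fixpoint("x+0+1*y", rules) -> "x+y"
```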
> > To make a plausible case that you are using an especially good
> > "programming language" and should/should not learn another seems to
> > call for some comparative evaluation from people who have
> > appropriate expertise (in programming languages).

Programming language specialists have many interesting ideas, but lousy
judgement when it comes to which ones are important in any particular
application context. Four decades ago I recall hearing "C's too crude,
PL/I is the future". Yeah, right. But "the proof of the pudding is in
the eating". If you want to understand the applicability of a language
to, say, bridge design, you need to talk to bridge designers who've
used it.