MathGroup Archive 2013


Re: Mathematica and Lisp

  • To: mathgroup at
  • Subject: [mg129852] Re: Mathematica and Lisp
  • From: "djmpark" <djmpark at>
  • Date: Mon, 18 Feb 2013 06:04:43 -0500 (EST)
  • References: <kcqkv4$lq5$> <kct7fj$sgo$> <kfkm72$j97$> <24729974.35095.1361092362942.JavaMail.root@m06>

For critical applications, isn't one of the best methods to have a Team A
that writes the code and a Team B that tries to break it by throwing input at
it? It really helps if they hate each other's guts, and if Team B includes
skeptical people who will actually be using the application.

David Park
djmpark at 

From: David Bailey [mailto:dave at] 

On 15/02/2013 06:56, John Doty wrote:

>> hard to debug programming language (assembler).
> There are always bugs in non-trivial software.
> There are always layers of misunderstanding:
> 1. The engineers (hardware and software) never fully understand the
> application, and are usually too stubborn to admit this.
> 2. The programmers never fully understand the hardware, and are usually
> too stubborn to admit this.
> 3. The operators never fully understand the machine, and are usually too
> stubborn to admit this.
> Hardware is not perfectly reliable, especially in radiology 
> departments where there's much EMI and possibly other problems like 
> stray neutrons (I know from experience that unhardened 
> microcontrollers misbehave near the MGH cyclotron in Boston, even in 
> "shielded" areas). Operators are often distracted, tired, and 
> pressured. And misspelling of silly made-up words is common, too ;-)
> One must therefore assume that if the hardware can be put into a fatal
> configuration, it will be at some point. When it actually happens, the
> retrospective details of how it happened are misleading. The fundamental
> engineering issue is that one must design so that the ordinary, routine
> failures do not cascade to fatality. By removing the hardware interlock,
> the Therac engineers had designed the system to fail.

I would really like to endorse that. I feel that some people like to scoff
at software developers and their supposedly inadequate methods without
proposing a viable alternative. For example, program proving seems an
impossible dream for serious programs, and would in any case require a
formal specification that might itself contain bugs.

All the most complex artifacts we have are either software, or contain large
amounts of software. Software engineers are routinely required to deliver a
level of complexity unheard of, say, 50 years ago - yet some people like to
scoff when they sometimes fail.

Anything that is extremely complex is susceptible to mistakes - particularly
if it can't really be tested until it is finished. Take, for example, the
Mars probe that was lost because of a mix-up over physical units. Clearly
such a trivial mistake would be unthinkable in a simpler project - I presume
it got overlooked because it was hidden among vast amounts of other detail.

Anyone using Mathematica (or any other software) for a serious task has to
take responsibility for the results he/she uses, and even then, there are
still some risks involved. So, for a very trivial example, suppose you
decide to check

In[6]:= Integrate[Exp[ax]x,x]

Out[6]= (E^ax x^2)/2

by doing:

In[7]:= D[%,x]

Out[7]= E^ax x

Your check will return the original expression, and may lead you to believe
you have the answer you wanted! The trouble, of course, is that without a
space "ax" is parsed as a single symbol rather than the product a*x, so the
integral and its derivative are perfectly consistent with each other - and
consistently not what you intended. If you recognise that you are prone to
that type of mistake, you should examine anything important in FullForm -
but ultimately the user has to be responsible.
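
As a sketch of such a check (continuing the In[]/Out[] numbering above),
looking at the FullForm of the input exposes the parsing mistake directly:

In[8]:= FullForm[Exp[ax] x]

Out[8]//FullForm= Times[Power[E,ax],x]

In[9]:= FullForm[Exp[a x] x]

Out[9]//FullForm= Times[Power[E,Times[a,x]],x]

In the first form "ax" appears as a single Symbol, whereas the intended
product shows up as Times[a,x] in the second - a difference that is
invisible in the usual typeset output.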

David Bailey
