MathGroup Archive 2013


Re: Mathematica and Lisp

  • To: mathgroup at smc.vnet.net
  • Subject: [mg129481] Re: Mathematica and Lisp
  • From: David Bailey <dave at removedbailey.co.uk>
  • Date: Wed, 16 Jan 2013 23:15:53 -0500 (EST)
  • References: <kcqkv4$lq5$1@smc.vnet.net> <kct7fj$sgo$1@smc.vnet.net> <kd03ej$6dl$1@smc.vnet.net> <20130115043105.21DD56958@smc.vnet.net> <kd5huk$jk6$1@smc.vnet.net>

On 16/01/2013 06:39, Murray Eisenberg wrote:
>
>
>
> It all depends on just what you want somebody to accomplish when learning his/her first programming language.
This is, of course, the key. However, I would hope that at least some 
students will continue to emerge with an understanding of what goes on 
at the bottom level - otherwise we are getting cut off from the roots of 
a vital technology.
>
> I've thought about that a lot over the some 30 years that I taught FORTRAN, Pascal, APL, J, and Mathematica in university math courses. (My very first programming language was Assembly for a Univac I, followed shortly by octal-coded machine instructions for a paper-tape input computer,  and then FORTRAN II  -- all while I was an undergraduate. A decade later, BASIC, APL, Pascal; subsequently, J and Mathematica. Even later, a smattering of C, Perl, Python, Java. I've even dabbled a tiny bit with Forth and Snobol.)
>
Wow - you sound a bit like me. I didn't touch Perl, Forth, APL, or 
Python (except to discover I hated them), but I did a lot with LISP and 
Prolog, and a few more obscure languages - Algol 68 and Coral!

> So what is essential for a first exposure to programming? To me, the essentials are:
>
> (0) Understanding carefully what the problem is, including what is given and what is to be found.
>
> (1) Identifying the objects (data) and what is to be done with them (operations, functions, procedures).
>
> (2) Breaking up a larger problem into its constituent parts.
>
> (3) Isolating the big ideas from the smaller technical points.
>
> (4) Expressing things within the constraints of a precise syntax.
>
> (5) Suitably modularizing the code in accordance with (2) and (3).
>
> (6) Making the code readable through judicious choice of names along with sufficient but non-redundant comments.
>
> (7) Making the code maintainable -- by the original author or others.
>
> (8) Being able to test and debug the code.
>
> (9) For numerical work, understanding and coping with roundoff and other errors due to the limitations of finite precision.
>
> (10) Recognizing when and knowing how to code an operation repetitively -- whether explicitly (iteratively or recursively) or implicitly (functionally).
>
> (11) Recognizing when and knowing how to express conditionals (whether via an explicit If, Which, etc., or instead as is possible in Mathematica, separate definitions of the same function for different cases).
>
> Must the first-language learner get closer to what s/he is "asking the computer to do" than might be the case with Mathematica?  If so, how should you reasonably decide how deep to descend? (If you don't know what the ultimate actual binary code is, how could you know what the computer is _actually_ doing?)
>
> My experience, in fact, is that the higher the language level -- such as that possible with Mathematica or APL or J -- then the easier it is to master these essentials.  All too often I have seen students unable to ascend to effective programming with a higher-level language if their minds were rotted by the first exposure being to too low-level a language. (And that is despite my personal learning path that ascended from the ridiculous to the sublime.)
>
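(An aside on Murray's point (11), for anyone who has not seen the 
Mathematica style he describes: the following is a minimal sketch of my 
own, not from his post, showing an explicit If alongside an equivalent 
pair of separate definitions for the same function.)

    (* explicit conditional *)
    step1[x_] := If[x < 0, -x, x^2]

    (* the same behaviour as separate definitions, one per case *)
    step2[x_ /; x < 0] := -x
    step2[x_] := x^2

    step1[-3] == step2[-3]  (* True: both give 3 *)
    step1[2] == step2[2]    (* True: both give 4 *)
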
I used to work in compiler development, so I have been right down to the 
bit level in a very serious way. However, even that isn't the end 
nowadays, because computers contain hidden layers of microcode, etc. For 
example, the 32-bit Intel instruction set has remained largely the same 
across many processors over the years, while the internals have changed 
radically.

Your list is fine, but it doesn't seem to address the issue of code 
efficiency. For some areas this is vital (and for others it is 
irrelevant), and it is not uncommon for someone to test out an idea on a 
small dataset without any realisation that it will not scale to 
real-world problems. Sometimes that is a question of the choice of 
algorithm, but Mathematica has a lot of subtle issues (such as the cost 
of repeatedly appending to a long list) that don't arise in other 
languages. Also, rules such as "functional constructs are more 
efficient" are only true because of the way Mathematica is implemented - 
they are not universal truths.
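
To make the appending point concrete, here is a rough sketch (my own 
example; exact timings will vary by machine and version) comparing 
growing a list element by element against building it in one step:

    n = 100000;

    (* grows the list one element at a time; each AppendTo copies the
       whole list, so the total cost grows roughly quadratically *)
    AbsoluteTiming[
      lst = {};
      Do[AppendTo[lst, i^2], {i, n}];
    ]

    (* builds the same list in a single functional step *)
    AbsoluteTiming[
      lst2 = Table[i^2, {i, n}];
    ]

    lst === lst2  (* True - same result, very different running time *)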

David Bailey
http://www.dbaileyconsultancy.co.uk


