Re: What should Mma be, part II
- To: mathgroup at yoda.physics.unc.edu
- Subject: Re: What should Mma be, part II
- From: jacobson at cello.hpl.hp.com
- Date: Fri, 11 Dec 92 14:42:47 -0800
Warren Wiscombe waxes eloquent regarding the debate about the future of Mathematica, or more precisely, the non-necessity of Mathematica providing quality output:

> ...
> My concern now is one of deja vu. Long ago, in the early 1970s, Kernighan,
> Plauger, and the other driving forces behind Unix began developing the
> philosophy that a collection of software tools was better and more
> versatile than a single giant application [I'm sure you can see where this
> is leading, but hear me out]. This philosophy gradually spread and took
> root all over the world, in spite of monumental holdouts like IBM. Most of
> us who care deeply about computing consider this a watershed, one we are
> ecstatic we have crossed, and we would not for anything in the world go
> back.
>
> Nevertheless, eternal vigilance is necessary to prevent being dragged back
> to the days of monolithic applications. There will always be those who
> don't have the energy, the desire, or the experience to put the custom
> collection of tools together to do the job right, but rather prefer a
> giant application that does everything--but nothing very well. There are
> many examples I might cite, but perhaps one from my own long experience
> with PCs has at least the virtue of being first-person and not anecdotal.
> In the early days of PCs, I tried tools like WordStar, and I tried
> monumental applications like Symphony that 'integrated' a word processor,
> a spreadsheet, and God knows what else. The tool philosophy quickly won my
> allegiance, and that of the vast majority of PC users, while applications
> like Symphony are now history. Walk into any software store and see the
> winning philosophy. Beginners still flock to applications like Microsoft
> Works, but the minute they acquire any experience whatsoever, they leave
> them to work with a collection of custom tools.
>
> The principle of using a collection of sharp tools rather than one blunt
> sledgehammer is a triumph of modern computing philosophy. To roll back the
> clock, and ask Mma to be the Symphony of the 1990s, is to betray
> everything we have learned about computing.
>
> ----------------------------------------------
> Dr. Warren J. Wiscombe, DAAC Project Scientist
> NASA Goddard, Code 913, Greenbelt, MD 20771

Well, the reason Unix worked so well is that all the tools used a common language: a byte stream composed of a list of lines. And since awk, sed, grep, and all the rest were oriented toward manipulating lines of text, it all worked fine.

The problem is that life is not so simple with graphics. I can't just "pipe" a graphics object into my favorite graphics tool and get good results. If it is a simple Plot object, then with not too much work I can extract the underlying list, convert it into a list of coordinates, and suck the result into GnuPlot; the sketch below shows the idea. But what if it's got captions? What about 3D plots, density plots, polygons, or a graphic that contains color?
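To make that concrete, here is roughly what the extraction looks like. This is only a sketch, written in modern Wolfram Language terms rather than 1992 syntax, and it assumes a bare line plot; the file name "sin.dat" is just illustrative.

    g = Plot[Sin[x], {x, 0, 2 Pi}];

    (* Normal resolves any GraphicsComplex so the Line primitives carry
       explicit {x, y} pairs; Cases then digs those pairs out of the
       expression, and Join splices the segments into one list. *)
    pts = Join @@ Cases[Normal[g], Line[l_] :> l, Infinity];

    (* Write whitespace-separated columns that GnuPlot can read with,
       say:  plot "sin.dat" with lines  *)
    Export["sin.dat", pts, "Table"]

Note that this works only because a bare Plot boils down to Line primitives. The moment the graphic picks up a caption, a color function, or a third dimension, there is no single list of coordinates to pull out.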
Some people have proposed going to a lower level and using PostScript as the exchange language. That level doesn't understand graphics, but it gives you a lot more control. Well, it gives YOU AS A HUMAN a lot more control, and with the right tools you can edit it visually. But it's not at all clear that there is a set of tools that will take Mathematica graphics represented as PostScript and do interesting transformations on them.

To see what I mean, think up an interesting graphics problem. Say you want to eliminate the grid lines in a 3D plot. Now, what transformation tool lets you do that to the PostScript file in a way similar to the way that awk '{$2 = 1/$2; print $0}' lets you take the reciprocal of the second column of a table?

So I think the analogy with Unix breaks down: there is no common language for graphics objects at a level where the object is understood.

-- David Jacobson