Re: Numerical accuracy/precision - this is a bug or a feature?
On 7/14/2011 6:23 PM, Noqsi wrote:
> On Jul 14, 2:23 am, Richard Fateman <fate... at cs.berkeley.edu> wrote:
>> On 7/13/2011 12:11 AM, Noqsi wrote:
>> ..
>>>> see e.g.
>>>> <http://www.av8n.com/physics/uncertainty.htm>.
>> ..
>> Learning mathematics from a physicist is hazardous.
>
> In some ways. But if you want to relate mathematics to reality, you
> might do well to consider the viewpoints of physicists.
>
>> Learning computer science from a physicist is hazardous too.
>
> Learning computer science from computer scientists is in some ways
> worse, unless your interest is pointless bit pushing. Matthew 23:24 is
> very relevant (some things never change).

I suppose it depends on what you mean by computer science.

> In extracting useful results from computers in the real world,
> ignorance of the application domain is more debilitating than
> ignorance of computer science. One can muddle through the latter, but
> not the former.

I think you may be confusing "computer science" with "applications of
computers to whatever [e.g. physics, engineering, business]", which is
usually taught in the department of 'whatever'.

> Consider the howler you yourself committed a little while ago,
> discussing the role of computers in the Apollo moon landings:
>
>> you've got to wonder how the Russians, probably WITHOUT much in the way
>> of computers, put up an artificial satellite.
>
> This illustrates a profound ignorance of the problem. If you don't
> care what orbit you go into, the computations are readily performed by
> slide rule ahead of time. But Apollo had to perform a series of
> precise maneuvers using imprecise rockets, with repeated
> re-computation of the trajectories. For the early missions, they
> didn't even know the lunar gravity very well, so actual orbits
> diverged rapidly from predictions even when the initial conditions
> were known. Apollo's indirect "lunar orbital rendezvous" approach was
> thus a triumph of computation.
> Even the Saturn V was not big enough to
> support the less computationally intensive direct approach.

Do you know for a fact that the Russians didn't care what orbit the
first manned satellite had?

> Perhaps the "programmer" I learned the most from in a long career was
> a physicist whose code was littered with silly (from a CS point of
> view) constructions like:
>
>    TA=(COS(EA)-EC)/(1.-EC*COS(EA))
>    IF(ABS(TA).GT.1.) TA=SIGN(.99999,TA)
>    TA=ACOS(TA)
>
> What was so great about his code? It's that every program he wrote was
> an illuminating exercise in extracting important knowledge from
> measurable information. The sloppy technique didn't matter so much. He
> put rigor into the place it really counted: serving the needs of his
> research. There were several much better technical programmers in that
> research group, but they were not as good at conceptualizing how to
> actually *use* the computer, rather than simply programming it.

I think you are confusing application knowledge with computer science.
It is fairly clear that computer scientists cannot be held responsible
for the content of all programs.

>> Numbers in a computer are different from experimental measurements.
>
> But the experimental measurements relate much better to reality.

The computation deals with representations in the computer. Mapping
those representations to the external world is a separate matter that
deals with sensors and actuators (robots, displays, computer-controlled
instruments, sound boards, digital cameras, scanners, microphones...).
These are usually parts of some other engineering discipline.

>> nevertheless, I like this article. It says, among other things,
>>
>>   The technique of propagating the uncertainty from step to step
>>   throughout the calculation is a very bad technique. It might
>>   sometimes work for super-simple "textbook" problems but it is
>>   unlikely to work for real-world problems.
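(An aside on the Fortran snippet quoted earlier: the clamp before ACOS
guards against roundoff pushing the argument just outside [-1, 1]. A
minimal Python sketch of my own -- the name `safe_acos` is hypothetical,
not from that program -- showing both the crash the clamp prevents and
the small error it accepts:)

```python
import math

def safe_acos(t):
    # Mirror of the quoted Fortran idiom: if roundoff pushed t just
    # outside acos's domain [-1, 1], clamp it back toward the boundary
    # (the SIGN(.99999, TA) trick).
    if abs(t) > 1.0:
        t = math.copysign(0.99999, t)
    return math.acos(t)

# A mathematically exact 1.0 that lands at 1.0000000000000002 in floats:
t = math.sqrt(2.0) ** 2 / 2.0
# math.acos(t) would raise ValueError ("math domain error");
# safe_acos(t) instead returns about 0.00447 rather than the true 0.0 --
# exactly the "sloppy but serviceable" trade-off being described.
print(safe_acos(t))
```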
> Except that error propagation techniques are used successfully in many
> fields.

A simple problem perhaps.

> Your cell phone works because engineers found a good balance
> of power consumption and radio sensitivity from error propagation
> methods, rather than the impractical method of tracking each electron
> through the circuits.

Of course there are many methods that can be programmed. You are
assuming that analysis of signals is done by some kind of particle
tracking? I assume that programs are designed by persons familiar with
differential equations and electromagnetic radiation, as well as more
seat-of-the-pants stuff like antenna design and sun spots. Also I
assume that cell phones use signal strength and feedback, and do not
need great accuracy. Though maybe GPS stuff is tricky if you have few
triangulation points. Using 10 points, maybe not so tricky. Not
something I've cared to look at. Anyway, after a few billion
computations, significance arithmetic tends to lose.

> Getting back to orbits, one extremely useful
> application of error propagation is to use it "backwards" to determine
> which observations would best improve knowledge of an orbit.
>
> There is no universal method for tracking uncertainty that is accurate
> and practical.

Ah, so you are saying that Mathematica is not accurate and practical??

> Your own ideologically favored method, interval
> arithmetic, yields unrealistically large estimates of error in many
> cases, and that can be a very bad thing. Or it can be useful to have
> an upper bound. What's good depends on what the *application* needs,
> not some ivory tower ideology.

It's not my favorite. I point it out as a method that has been widely
studied. It does not provide estimates of error; it provides bounds on
error, and those bounds may be very pessimistic.

> I am *really* tired of your smug, patronizing attitude. You're a blind
> man attempting to explain a rainbow.
> Why not, instead of whining all
> the time that Mathematica doesn't conform to your profoundly narrow
> notions of what computation is, spend some time with it actually
> computing something of relevance to the real world?

My concern is that someone attempting to compute something of relevance
will fall into a real-world hole.

> If you were
> actually interested in applications, you would *rejoice* in the fact
> that there are a variety of approaches available. But instead, you
> obviously see Mathematica as a threat to your narrow ideology.

I am primarily interested in building systems appropriate for a range
of applications. (That's more computer science.) From that perspective
I think that Mathematica falls short. I don't see that as a threat, but
I am inclined to object to statements that claim (in my view
incorrectly) that Mathematica (arithmetically speaking) is the best, or
even the only, way to do floating-point calculations.

In some ways the Mathematica system is just fine, if you use its
library routines for special functions for arbitrary precision, and you
are within the appropriate ranges where they actually deliver what is
promised. (I have mixed experiences near singular points...) But I
think I've stated my perspective pretty clearly, even if patronizingly.

RJF
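P.S. To make the point about interval bounds concrete, a minimal sketch
of my own in Python (this is textbook interval arithmetic, not
Mathematica's significance arithmetic): the bounds are rigorous, but
because every occurrence of a variable is treated as independent,
x - x comes out as a wide interval instead of exactly zero -- the
classic "dependency problem" behind the pessimism.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Worst-case bound: assumes self and other vary independently,
        # even when they are the same quantity.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def width(self):
        return self.hi - self.lo

x = Interval(0.9, 1.1)   # a value known only to within +/- 0.1
d = x - x                # mathematically exactly 0
print(d.lo, d.hi)        # roughly [-0.2, 0.2]: a guaranteed bound,
                         # but 0.4 wide for a quantity that is 0
```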