Re: Numerical accuracy/precision - this is a bug or a feature?
On Jul 14, 2:23 am, Richard Fateman <fate... at cs.berkeley.edu> wrote:

> On 7/13/2011 12:11 AM, Noqsi wrote:
> ..
>
> >> see e.g.
> >> <http://www.av8n.com/physics/uncertainty.htm>.
>
> ..
> Learning mathematics from a physicist is hazardous.

In some ways. But if you want to relate mathematics to reality, you might do well to consider the viewpoints of physicists.

> Learning computer science from a physicist is hazardous too.

Learning computer science from computer scientists is in some ways worse, unless your interest is pointless bit pushing. Matthew 23:24 is very relevant (some things never change). In extracting useful results from computers in the real world, ignorance of the application domain is more debilitating than ignorance of computer science. One can muddle through the latter, but not the former.

Consider the howler you yourself committed a little while ago, discussing the role of computers in the Apollo moon landings:

> you've got to wonder how the Russians, probably WITHOUT much in the way
> of computers, put up an artificial satellite.

This illustrates a profound ignorance of the problem. If you don't care what orbit you go into, the computations are readily performed by slide rule ahead of time. But Apollo had to perform a series of precise maneuvers using imprecise rockets, with repeated re-computation of the trajectories. For the early missions, they didn't even know the lunar gravity very well, so actual orbits diverged rapidly from predictions even when the initial conditions were known. Apollo's indirect "lunar orbit rendezvous" approach was thus a triumph of computation. Even the Saturn V was not big enough to support the less computationally intensive direct approach.

Perhaps the "programmer" I learned the most from in a long career was a physicist whose code was littered with silly (from a CS point of view) constructions like:

      TA=(COS(EA)-EC)/(1.-EC*COS(EA))
      IF(ABS(TA).GT.1.) TA=SIGN(.99999,TA)
      TA=ACOS(TA)

What was so great about his code? It's that every program he wrote was an illuminating exercise in extracting important knowledge from measurable information. The sloppy technique didn't matter so much. He put rigor into the place it really counted: serving the needs of his research. There were several much better technical programmers in that research group, but they were not as good at conceptualizing how to actually *use* the computer, rather than simply programming it.

> Numbers in a computer are different from experimental measurements.

But the experimental measurements relate much better to reality.

> nevertheless, I like this article. It says, among other things,
>
>    The technique of propagating the uncertainty from step to step
>    throughout the calculation is a very bad technique. It might
>    sometimes work for super-simple "textbook" problems but it is
>    unlikely to work for real-world problems.

Except that error propagation techniques are used successfully in many fields. Your cell phone works because engineers found a good balance of power consumption and radio sensitivity from error propagation methods, rather than the impractical method of tracking each electron through the circuits.

Getting back to orbits, one extremely useful application of error propagation is to use it "backwards" to determine which observations would best improve knowledge of an orbit.

There is no universal method for tracking uncertainty that is both accurate and practical. Your own ideologically favored method, interval arithmetic, yields unrealistically large estimates of error in many cases, and that can be a very bad thing. Or it can be useful to have an upper bound. What's good depends on what the *application* needs, not some ivory tower ideology.

I am *really* tired of your smug, patronizing attitude. You're a blind man attempting to explain a rainbow.
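For readers who don't speak FORTRAN: the quoted snippet computes the cosine of the true anomaly from the eccentric anomaly EA and eccentricity EC, then clamps it before calling ACOS, because floating-point roundoff can push the value just outside [-1, 1] and blow up the inverse cosine. A minimal Python sketch of the same guard (the function name is mine; I clamp to 1.0 where the original clamped to .99999):

```python
import math

def true_anomaly(EA, EC):
    """True anomaly from eccentric anomaly EA (radians) and
    eccentricity EC, guarding acos() against roundoff."""
    ta = (math.cos(EA) - EC) / (1.0 - EC * math.cos(EA))
    # Roundoff can leave ta marginally outside [-1, 1], where
    # acos() raises a domain error; clamp it back (the quoted
    # Fortran used SIGN(.99999, TA) for the same purpose).
    if abs(ta) > 1.0:
        ta = math.copysign(1.0, ta)
    return math.acos(ta)
```

"Silly" from a CS point of view, perhaps, but it encodes a physical fact: the quantity is a cosine, so any excursion past 1 is pure roundoff and may be safely squashed.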
Why not, instead of whining all the time that Mathematica doesn't conform to your profoundly narrow notions of what computation is, spend some time with it actually computing something of relevance to the real world? If you were actually interested in applications, you would *rejoice* in the fact that there are a variety of approaches available. But instead, you obviously see Mathematica as a threat to your narrow ideology.
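P.S. The overestimation I attribute to interval arithmetic above is the classic "dependency problem," and it is easy to demonstrate. A toy sketch (my own minimal Interval class, not any particular library): evaluating x*(1-x) on x = [0.4, 0.6] interval-wise treats the two occurrences of x as independent, so the bound is valid but far wider than the true range.

```python
class Interval:
    """Toy closed-interval arithmetic, for illustration only."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = (self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))

x = Interval(0.4, 0.6)
one = Interval(1.0, 1.0)
f = x * (one - x)   # interval evaluation of x*(1-x)
# f is roughly [0.16, 0.36], yet the true range of x*(1-x)
# over [0.4, 0.6] is only [0.24, 0.25]: the two x's are
# treated as independent, inflating the width about 20-fold.
```

A rigorous bound, yes; a realistic error estimate, no. Which of those you need depends on the application.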