Re: Numerical accuracy/precision - this is a bug or a feature?
On Jul 15, 1:10 am, Richard Fateman <fate... at cs.berkeley.edu> wrote:

>> In extracting useful results from computers in the real world
>> ignorance of the application domain is more debilitating than
>> ignorance of computer science. One can muddle through the latter, but
>> not the former.
>
> I think you may be confusing "computer science" with "applications of
> computers to whatever [e.g. physics, engineering, business]" which is
> usually taught in the department of 'whatever'.

So, in your view, the substance is all in the departments of 'whatever'?
Because without applications, what's left? Computers are not natural
objects: they are artificial, so without applications the whole field is
just a made-up story. A true science needs the discipline of relating to
*something* in the real world. You're evading this discipline. Your view
makes computer science indistinguishable from religious scholasticism.

>> Consider the howler you yourself committed a little while ago,
>> discussing the role of computers in the Apollo moon landings:
>>
>>> you've got to wonder how the Russians, probably WITHOUT much in the way
>>> of computers, put up an artificial satellite.
>>
>> This illustrates a profound ignorance of the problem. If you don't
>> care what orbit you go into, the computations are readily performed by
>> slide rule ahead of time. But Apollo had to perform a series of
>> precise maneuvers using imprecise rockets, with repeated
>> recomputation of the trajectories. For the early missions, they didn't
>> even know the lunar gravity very well, so actual orbits diverged
>> rapidly from predictions even when the initial conditions were known.
>> Apollo's indirect "lunar orbital rendezvous" approach was thus a
>> triumph of computation. Even the Saturn V was not big enough to
>> support the less computationally intensive direct approach.
>
> Do you know for a fact that the Russians didn't care what orbit the
> first manned satellite had?
I know the consequences of getting it wrong. Basically, they knew that if
they got the perigee high enough to go around once, nothing really bad
could result from the other orbital parameters. The rocket wasn't
powerful enough to do something silly like put Gagarin into an escape
trajectory. The limitations of the rocket, combined with very basic
orbital mechanics, guaranteed that about an hour and a half after launch
the spacecraft would return to a point nearly overhead of where the
launch site had been. Allowing for the rotation of the Earth, all they
had to do was fire the retrorocket in roughly the right direction at
roughly the right time, and Gagarin was guaranteed to come down about 23
degrees west of where he was launched. The Soviet Union was a huge place:
they didn't need to do this accurately at all. Satisfying a single
inequality is enormously easier than rendezvous in orbit around a body
with poorly known gravity, where you must satisfy six equations to high
precision.

> I think you are confusing application knowledge with computer science.

Without applications, computer science is vacuous.

> Of course there are many methods that can be programmed. You are
> assuming that analysis of signals is done by some kind of particle
> tracking? I assume that programs are designed by persons familiar with
> differential equations and electromagnetic radiation, as well as more
> seat-of-the-pants stuff like antenna design and sun spots.
>
> Also I assume that cell phones use signal strength and feedback, and do
> not need great accuracy.

How do you determine how large to make the transistors when
manufacturing the cell phone? That requires calculation. How would you
do it?

> Though maybe GPS stuff is tricky if you have few triangulation points.
> Using 10 points, maybe not so tricky. Not something I've cared to look
> at.

If von Neumann were alive, he'd understand the issues completely before
you'd even finished describing the problem.
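(As a sanity check on the "about 23 degrees west" figure above: the Earth
turns through roughly 15 degrees per hour, and a low orbit takes about an
hour and a half, so the ground track shifts west by roughly 22.5 degrees
per revolution. A few lines of Python make the arithmetic explicit --
round numbers for illustration, not actual mission data:

```python
# Back-of-the-envelope check of the "about 23 degrees west" claim.
# Assumed round figures, not Vostok mission data.

SIDEREAL_DAY_HOURS = 23.93   # Earth's rotation period relative to the stars
ORBIT_PERIOD_HOURS = 1.5     # a typical ~90-minute low Earth orbit

# Earth's eastward rotation rate, in degrees per hour
rotation_rate = 360.0 / SIDEREAL_DAY_HOURS

# While the spacecraft completes one orbit, the launch site moves east,
# so the ground track (and the landing point) shifts west by this much:
westward_shift = rotation_rate * ORBIT_PERIOD_HOURS
print(f"{westward_shift:.1f} degrees")   # prints 22.6 degrees
```

Coming down anywhere within a couple of thousand kilometers of the
predicted point satisfies the single inequality; no onboard computation
needed.)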
He was a *real* computer scientist.

>> There is no universal method for tracking uncertainty that is accurate
>> and practical.
>
> Ah, so you are saying that Mathematica is not accurate and practical??

Not universally. Given a specific problem, however, it is often the tool
of choice.

> I am primarily interested in building systems appropriate for a range
> of applications. (That's more computer science).

Without knowledge of applications, you have no foundation to stand on
here. And that's Wolfram's advantage: he and his people *do* understand
a very wide range of applications.

> From that perspective I think that Mathematica falls short.

Since many of us find Mathematica a very effective tool in real
applications, this judgement is obviously based on nothing but ideology.
You have repeatedly demonstrated your complete lack of any useful
perspective here, and your unwillingness to do the studying needed to
gain that perspective. Instead, you carefully define "computer science"
in a way that excuses you from studying anything you don't wish to
study.

> I don't see that as a threat, but I am inclined to object to statements
> that claim (in my view incorrectly) that Mathematica (arithmetically
> speaking) is the best, or even the only way to do floating-point
> calculations.

There is no "best": it depends on what you're doing. Mathematica is very
effective over a wide range of applications, but it is not the right
tool for every application. You need the application knowledge to
understand that.
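(To illustrate the "no universal method for tracking uncertainty" point:
the most obvious tracking scheme, naive interval arithmetic, is safe but
suffers from the dependency problem -- it treats every operand as
independent, so even x - x comes out as a wide interval instead of
exactly zero. A minimal sketch, using a toy Interval class written for
this post rather than any particular library's API:

```python
# Toy interval arithmetic illustrating the dependency problem:
# subtracting a quantity from itself should give exactly 0, but
# worst-case bounds cannot know the two operands are the same value.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        # Worst-case bounds: assumes self and other vary independently
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(0.9, 1.1)   # a value known only to within +/- 0.1
print(x - x)             # an interval of width 0.4, though the true answer is 0
```

Every practical scheme, including Mathematica's significance arithmetic,
makes some compromise of this sort, and those trade-offs are exactly
what this thread is arguing about.)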