Re: Re: FourierTransform
- To: mathgroup at smc.vnet.net
- Subject: [mg96253] Re: [mg96190] Re: FourierTransform
- From: Andrzej Kozlowski <akoz at mimuw.edu.pl>
- Date: Tue, 10 Feb 2009 05:51:47 -0500 (EST)
- References: <gm1dks$3nk$1@smc.vnet.net> <gm3r8h$mev$1@smc.vnet.net> <gm6kvu$a39$1@smc.vnet.net> <gm99v7$95$1@smc.vnet.net> <200902041021.FAA18709@smc.vnet.net> <gmecf2$adc$1@smc.vnet.net> <200902091032.FAA12234@smc.vnet.net>
I don't see anything in your post that I particularly disagree with, but perhaps a few clarifications may be useful. I think I probably share your view that the "axiomatic method" is only a method in mathematics, and not even a particularly useful one. By the "axiomatic method" I mean the idea that all of mathematics should be deduced from some finite (preferably small) set of axioms. As I have already mentioned in another thread, I think it has now been conclusively shown that such an approach is neither feasible nor does it really carry any particular advantages. However, I have a completely different view of another thing, which some people mistakenly identify with the axiomatic method and which I think should correctly be called "the method of proof". In my view this method is so essential to mathematics that nothing that does not rely on it can properly be called mathematics. The difference between the two is that the method of proof allows one to start with whatever one believes to be solidly grounded "truth" and then proceed deductively, using steps that are again "solidly grounded" mathematical or logical truths. Of course, when we do so we are guided by "intuition" (more about that below), but intuition itself cannot be used as justification for any mathematics. The reason why we must proceed in this way is not that it guarantees we can never fail, but that when we happen to arrive at conclusions that contradict either other established "truths" or our "intuition", we will know what assumptions we should re-examine (which could even be the "logical steps" themselves; again, more about that below). A "counter-intuitive" result may, but need not, be wrong. A good example is the so-called "Banach-Tarski paradox", which played a big role in our understanding of basic measure theory. It could never have been arrived at by means of "intuition" alone, and even if it had been, we would not have known what to do with it.
In case someone thinks that this example is too "purely mathematical", there are equally good ones that have very direct "physical relevance". My favorite concerns the violation of Bell's inequality. Henry Stapp called this the most profound discovery in all of science, and I completely agree. I think the implications of this discovery are so staggering that there is nothing else to compare with them (if Bernard d'Espagnat is right and Bell's theorem implies that there is no such thing as "objective reality", then all "scientific controversies", such as evolution vs. intelligent design, become essentially irrelevant). But Bell's theorem could never have been discovered by any approach based on intuition. Indeed, its conclusions are in stark contrast with every kind of intuition we have about the physical world. It is only by adopting a rigorously "logical" and deductive approach that Bell's theorem could be "proved", even though it is a truth of physics rather than of mathematics. Not surprisingly, physicists seem to have given up on it in frustration and now it seems to be studied by logicians (e.g. see http://www.uni-bonn.de/~tmuelle2/philo/tmtp_lmps99.pdf). Of course intuition plays a crucial role in mathematics. In fact, in addition to the sort of "intuition" that certain physicists have about certain kinds of mathematics, there is also a purely mathematical intuition. Mathematicians often refer to it as "geometry", even in areas such as algebra (algebraic geometry) or number theory, where conventional geometric imagination is not of much help. Without intuition mathematical discovery would not be possible, since the method of proof only helps one with the verification of mathematical truths, not with their discovery.
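(Editorial aside, not part of the original post.) The claim that Bell's theorem defies intuition but yields to a purely deductive argument can be illustrated with a small Python sketch of the CHSH form of the inequality. The angle choice below is the standard one for the singlet state, where the quantum correlation of measurements along directions a and b is E(a,b) = -cos(a-b); all names here are illustrative, not from the thread.

```python
import itertools
import math

def E_quantum(a, b):
    # Singlet-state spin correlation for measurement angles a and b.
    return -math.cos(a - b)

# Standard CHSH angle choice (an assumption made for this illustration).
a, ap, b, bp = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4

def chsh(E):
    # The CHSH combination of the four correlations.
    return E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)

S_quantum = chsh(E_quantum)

# A local deterministic model assigns fixed outcomes A, A', B, B' in {-1, +1}
# before measurement; the best it can possibly do is the maximum over all
# 16 such assignments, which is the CHSH bound of 2.
S_classical = max(
    abs(A * B - A * Bp + Ap * B + Ap * Bp)
    for A, Ap, B, Bp in itertools.product([-1, 1], repeat=4)
)

print(abs(S_quantum))   # 2*sqrt(2) ~ 2.828: quantum mechanics violates the bound
print(S_classical)      # 2: the bound no local hidden-variable model can exceed
```

The exhaustive enumeration is the deductive step: no appeal to intuition is made, yet the gap between 2 and 2√2 falls out of the arithmetic.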
But the reason why we cannot base mathematics on "intuition" is that on the whole intuition is a very subjective thing: different people have completely different intuitions, and what is "intuitively obvious" to one person may not be at all so to another (and quite often the second one is right). This is just as true of "purely mathematical" intuition as of mathematical intuition derived from insight into physics. One of the many amazing phenomena concerning the relationship between mathematics and physics is the fact that certain mathematical physicists, most notably Edward Witten (the only physicist ever to receive a Fields Medal, and the only Fields Medal winner who, it is said, has never proved a theorem) and Roger Penrose, have an uncanny intuition about mathematical truth which seems to be based on a very deep insight derived from physics. Mathematicians have gotten used to the fact that when Witten tells them that certain mathematical statements must be true, they infallibly turn out to be so. This is indeed fantastic, but the problem is that this intuition is something possessed by Witten as an individual and not shared by other physicists. If we were to rely on this kind of intuition, that would be no different from relying on the authority of one individual: Witten says so, so it must be true; we don't need a proof. What is needed is something that does not rely on the insight or genius of one or several individuals. In mathematics the method of proof provides this. The collective insights of even a large number of less-than-genius physicists is not a substitute. Of course, and here may be the main difference between us, I am interested here only in establishing what is "mathematical truth". If you are really arguing that "what works in practice" is good enough to establish "what works in practice", then you are almost certainly right, but then our whole argument has been a complete waste of time.
What I have been claiming is that no amount of evidence from the practice of physicists can establish the validity of a mathematical procedure or its truth. The difference is illustrated by the following joke that I heard as an undergraduate student many years ago in the UK. An astronomer, a physicist and an engineer were given the task of proving that all numbers greater than 1 are prime. The astronomer argued: 2 is a prime and 3 is a prime. We have checked one even number and it turned out to be prime, and one odd number and that was also prime. This provides overwhelming evidence that all numbers are prime. The physicist was more careful: 2 and 3 are primes, but 4 is not a prime. However, 5 is also a prime. From this we deduce that the claim that all numbers are prime is probably false, but about 75% of all numbers greater than 1 are prime. I have thought it wiser to leave out the part of the joke concerning the engineer...

Andrzej Kozlowski

On 9 Feb 2009, at 10:32, John Doty wrote:

> Andrzej Kozlowski wrote:
>> On 4 Feb 2009, at 11:21, John Doty wrote:
>>
>>> Jens-Peer Kuska wrote:
>>>> Hi,
>>>>
>>>> this is called a distribution or generalized function
>>>> and not a function, and it is only defined
>>>> inside of an integral, as my Vladimirov
>>>>
>>>> http://www.amazon.de/Methods-Generalized-Functions-Analytical-Special/dp/0415273560/ref=sr_1_33?ie=UTF8&s=books-intl-de&qid=1233576829&sr=8-33
>>>>
>>>> says.
>>> That restriction is Vladimirov's. We who actually apply generalized
>>> functions to physics and engineering problems are not shy about using
>>> them outside of integrals. This is the approach that Mathematica
>>> implements, as you can see below.
>>>
>>> A better reference is Bracewell:
>>> www.amazon.com/Fourier-Transform-Its-Applications/dp/0073039381,
>>> especially applicable to this question.
>>>
>>> By the way, the term "distribution" seems designed to confuse the
>>> innocent.
>>> Many applications of generalized functions also involve
>>> probability, where the term "distribution" has a different and far
>>> more familiar meaning.
>>>
>>> Most physicists and engineers will drop the "generalized" and simply
>>> consider things like delta functions to be functions. They often have
>>> the right properties to represent the behavior of real world objects,
>>> when other notions of "function" don't.
>>
>> Sorry, but the last paragraph isn't that much of a recommendation.
>> There is hardly any (pseudo)-mathematical nonsense that has not been
>> believed to be true or valid by some engineer, occasionally with
>> lamentable consequences. I remember that when I was a math
>> undergraduate, a "popular" mathematics journal published a long list
>> of examples (with detailed references) of mathematical nonsense
>> perpetrated by engineers, economists and some others (some quite
>> hilarious).
>
> Well, the classic of this genre is Berkeley's mocking demolition of
> calculus. Should physicists and engineers have abandoned calculus after
> Berkeley demonstrated that it was pseudo-mathematical nonsense?
>
> One I recall from my undergraduate days was a university computer center
> director who advocated automated scanning of student Fortran jobs on the
> mainframe to detect time-wasting "infinite loops". He was ridiculed as a
> disgrace to his profession: "doesn't he know about the halting
> problem?". But the mockers got it wrong: he wasn't asking for a perfect
> decision procedure, merely a practical one that could weed out the
> easily decidable cases.
>
> Of course, mathematicians perpetrate mathematical howlers, too. Euler's
> assertion that a divergent series may be replaced by its genesis formula
> is an example. The idea that mathematics advances by stepping from truth
> to truth by proving theorems is easily seen to be a myth if you know
> some history.
> Mathematics is much more interesting, creative, and useful
> than that.
>
> Jens' authority, Vladimirov, asserts that a delta function is only
> defined inside an integral. But I've also heard a mathematician complain
> "that's not what we mean by integration". And Jens earlier insisted on
> "square integrable" functions. Different notions of "function" and
> "integral" lead to different ideas about what is allowable.
>
> Calculus, Fourier analysis, delta functions, renormalization, all
> recognized as "mathematical nonsense" in their time, yet they proved to
> be indispensable tools for effectively applying mathematics to real
> world problems.
>
>> I think some of it might have involved the use (or misuse)
>> of generalized functions, though I am no longer sure. (I guess I could
>> still trace the list, if you really wanted to see it ...but you could
>> instead just search the archives of this forum ;-))
>>
>> The most intuitive yet rigorous theory of generalized functions that
>> allows them to be treated as "functions" was created by Jean Francois
>> Colombeau. He wrote a beautifully clear and quite short exposition of
>> his theory entitled "Elementary Introduction to New Generalized
>> Functions" (North-Holland 1985).
>> The Colombeau generalized functions have values at all points, but they
>> are "generalized numbers" (which include ordinary numbers). Thus the
>> Dirac Delta is a function defined on R, whose value is 0 for x!=0 and
>> a generalized number at x==0. Colombeau generalized functions have
>> derivatives of all orders (which are themselves generalized functions)
>> and classical distributions are precisely those generalized functions
>> which, in a neighborhood of each point, are partial derivatives of
>> continuous functions.
>> Colombeau theory was the first one that satisfactorily solved the old
>> problem of giving a definition of multiplication of generalized
>> functions (something that before Colombeau was considered impossible
>> by many). Colombeau theory justified much of the heuristics
>> that had been previously known to physicists but it also helped to
>> uncover nonsense where there was nonsense to uncover.
>
> The physical design of the communication network you used to broadcast
> this assertion involved heavy use of applied mathematics results
> obtained earlier than this using generalized functions. What is "white
> noise" anyway? Mathematical (at least at the time its use became common)
> and even physical nonsense, but an indispensable concept.
>
> Mathematical objects are products of human imagination. That we can
> obtain reliable knowledge of their properties is a profound mystery.
> That they can effectively model real world objects is another profound
> mystery. But they don't do so perfectly: any particular mathematical
> model of reality will have (often poorly understood) limits to its
> applicability. The justification for any applied mathematics model is
> that you can verify it gets correct answers. The scientific method. The
> problem with most bad applied math in my experience is the dogmatic use
> of inapplicable methods.
>
>> Last but not least, Mathematica's notion of a generalized function is
>> based on Colombeau. To convince yourself look at the documentation for
>> HeavisideTheta, in the section "Possible Issues". I quote:
>>
>> Products of distributions with coincident singular support cannot be
>> defined (no Colombeau algebra interpretation).
>
> I would guess that this sort of formalization is more important to a CAS
> than a human. Physical intuition is reasonably effective at weeding out
> the nonsense here, otherwise these techniques would never have gained a
> foothold. But a CAS has no physical intuition.
>
> It is nice that Mathematica 7 seems to have improved here.
>
>> I get the impression that not all engineers believe in this even now.
>
> Many engineers are suspicious of mathematical abstraction. Mathematical
> notions of "correct" and "true" don't map completely reliably into real
> world situations. So, understanding the connection between the
> abstraction and the physical situation is essential.
>
>> Andrzej Kozlowski
>>
>>>> Regards
>>>> Jens
>>>>
>>>> John Doty wrote:
>>>>> Jens-Peer Kuska wrote:
>>>>>> Hi,
>>>>>>
>>>>>> the Fourier transform over the interval x in (-Infinity,Infinity)
>>>>>> converges only for square integrable functions, i.e., functions
>>>>>> where Integrate[Conjugate[f[x]]*f[x],{x,-Infinity,Infinity}] <
>>>>>> Infinity.
>>>>>>
>>>>>> This is not the case for Cosh[x], and so no Fourier transform
>>>>>> exists.
>>>>> Depends on what you mean by "function". Mathematica tries in its
>>>>> pragmatic way to do what you might want here:
>>>>>
>>>>> In[1]:= FourierTransform[t^2,t,w]
>>>>>
>>>>> Out[1]= -(Sqrt[2 Pi] DiracDelta''[w])
>>>>>
>>>>> t^2 is certainly not square integrable, but this is the kind of
>>>>> useful result scientists and engineers want.
>>>>>
>>>>> Mathematica's support for "generalized functions" still has room for
>>>>> improvement, but it has come a long way. The bizarre problems I saw
>>>>> in the past trying Fourier methods to perform fractional
>>>>> differentiation and integration
>>>>> (http://forums.wolfram.com/mathgroup/archive/2000/Apr/msg00043.html)
>>>>> seem no longer to be with us in Mathematica 7.
>>>
>>> --
>>> John Doty, Noqsi Aerospace, Ltd.
>>> http://www.noqsi.com/
>>> --
>>> The axiomatic method of mathematics is one of the great achievements
>>> of our culture. However, it is only a method.
>>> Whereas the facts of
>>> mathematics once discovered will never change, the method by which
>>> these facts are verified has changed many times in the past, and it
>>> would be foolhardy to expect that changes will not occur again at some
>>> future date. - Gian-Carlo Rota
>
> --
> John Doty, Noqsi Aerospace, Ltd.
> http://www.noqsi.com/
> --
> In theory there is no difference between theory and practice. In
> practice there is. -Yogi Berra
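(Editorial aside, not part of the original thread.) The disagreement quoted above, whether a delta function is "only defined inside an integral" or may be used pointwise, can be made concrete numerically. The sketch below (Python with NumPy rather than Mathematica; all names are illustrative) models the delta as a narrow Gaussian of width eps: its pointwise peak value diverges as eps shrinks, yet its pairing with a smooth test function under an integral converges to f(0), which is exactly the behavior both camps rely on.

```python
import numpy as np

def nascent_delta(x, eps):
    # A narrow Gaussian that integrates to 1; as eps -> 0 it models DiracDelta.
    return np.exp(-x**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))

f = np.cos  # a smooth test function with f(0) = 1

# Dense grid on [-1, 1] for a simple Riemann-sum approximation of the integral.
x, dx = np.linspace(-1.0, 1.0, 400001, retstep=True)

for eps in (0.1, 0.01, 0.001):
    # Pointwise, the value at 0 grows without bound as eps -> 0 ...
    peak = nascent_delta(0.0, eps)
    # ... but the pairing <delta_eps, f> = integral of delta_eps(x) f(x) dx
    # converges to f(0) = 1.
    pairing = np.sum(nascent_delta(x, eps) * f(x)) * dx
    print(f"eps={eps}: peak={peak:.1f}, integral against cos={pairing:.6f}")
```

Seen this way, Vladimirov's restriction and the engineers' pointwise habit are two views of the same limiting object: the pointwise values of the approximants are not stable, but every integral against a smooth test function is.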
- References:
- Re: FourierTransform
- From: John Doty <jpd@whispertel.LoseTheH.net>
- Re: FourierTransform
- From: John Doty <jpd@whispertel.LoseTheH.net>
- Re: FourierTransform