Re: Re: function of a function
*To*: mathgroup at smc.vnet.net
*Subject*: [mg62682] Re: [mg62650] Re: function of a function
*From*: Daniel Lichtblau <danl at wolfram.com>
*Date*: Wed, 30 Nov 2005 22:08:45 -0500 (EST)
*References*: <dmha20$932$1@smc.vnet.net><dmhfhd$bit$1@smc.vnet.net> <200511300507.AAA00115@smc.vnet.net>
*Sender*: owner-wri-mathgroup at wolfram.com
Narasimham wrote:
> Jens-Peer Kuska wrote:
>
>
>>it can't work: with f[0]==1 given, your differential equation
>>implies f'[0]==f[Cos[0]]==f[1], and NDSolve[] can't find the value
>>of f[1] until it has integrated the equation.
>
>
> ???
>
>
>>The nested dependence is equivalent to an infinite
>>system of ordinary differential equations, and it seems
>>hard to handle this on a finite computer.
>
>
> I cannot understand this. In the following two examples the first one
> works, not the second.
>
> Clear[x,f,EQ];
> EQ={f'[x] == f[Cos[x]],f[0]== 1};
> NDSolve[EQ,f,{x,0,4}];
> f[x_]=f[x]/.First[%];
> Plot[f[x],{x,0,4}];
>
> Clear[x,f,EQ];
> EQ={f'[x] == Cos[f[x]],f[0]== 1};
> NDSolve[EQ,f,{x,0,4}];
> f[x_]=f[x]/.First[%];
> Plot[f[x],{x,0,4}];
>
> It appears (to me) the power of programming with functions in
> Mathematica has not been used to the full.
>
> Regards
> Narasimham
> [...]
The second one gives a result, the first does not. This is a consequence
of mathematics, not of Mathematica. A first-order ODE has the form
dy/dx = F(y,x), where F is some "reasonable" function, e.g. piecewise
analytic. Your second equation conforms to this description; your first
does not, because its right-hand side evaluates f at Cos[x] rather than
at x.
My guess is one might approach your first equation using an ansatz from
integral-equation methods. One such would be to expand f as a power
series with unknown coefficients and look for constraints that give
linear equations in those coefficients.
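A rough sketch of that series idea, in case it helps. This is my own illustration, not tested code from the original post; the names m, fSeries, and approx are mine, and whether the truncated series actually converges on the whole interval [0,4] is a separate question.

```mathematica
(* Truncated power-series ansatz for f'[x] == f[Cos[x]], f[0] == 1.
   The condition f[0] == 1 fixes the constant term. *)
m = 8;  (* truncation order; increase to refine *)
fSeries[x_] = 1 + Sum[a[k] x^k, {k, 1, m}];

(* Expand the residual of the equation to order m-1 in x; it is
   linear in the unknown coefficients a[k]. *)
residual = Normal[Series[fSeries'[x] - fSeries[Cos[x]], {x, 0, m - 1}]];

(* Setting each power of x to zero gives m linear equations in the
   m unknowns a[1], ..., a[m]. *)
eqs = Thread[CoefficientList[residual, x] == 0];
coeffs = Solve[eqs, Table[a[k], {k, 1, m}]];
approx[x_] = fSeries[x] /. First[coeffs];
```

One could then compare approx against the iterative scheme below, or plot the residual approx'[x] - approx[Cos[x]] to judge the truncation error.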
Another method might be to form a sequence of functions where
f'[n,x] == f[n-1,Cos[x]]. Make f[0,x] something reasonable (perhaps
constant) and iterate the process a few times in the hope of getting
f[n,x] to converge for all x of interest. The code below may give an
idea of what I have in mind.
f[0][x_] := 1
deq[n_] := {f[n]'[x] == f[n - 1][Cos[x]], f[n][0] == 1};
(* Solve the nth (now ordinary) ODE, cache the resulting
   InterpolatingFunction, and define f[n] at arbitrary arguments. *)
sol[n_] := (f[n][x] = f[n][x] /. First[NDSolve[deq[n], f[n][x], {x, 0, 4}]];
  f[n][t_] := f[n][x] /. x -> t)
For example, if I now do
Do[sol[j], {j, 100}]
then I get a bunch of messages telling me that extrapolation was needed
to go outside the domain. But when I then evaluate
Table[f[j][1.8], {j, 100}]
I see that it is converging to about 10.5214; similarly, f[j][2.8]
appears to be converging to about 5.08. Let's check the differential
condition.
Max[Abs[Table[f[100]'[j] - f[100][Cos[j]], {j, 0, 4, .1}]]]
also gives a bunch of extrapolation messages, and a value of around
0.035, which I think is not too bad. Going to 500 iterations brings the
residual down to 0.0039. So this appears to be a viable approach. It may
also be a bit naive, as I'm never certain when such a method is really
justified; it probably requires a bit of analysis to decide whether we
have some sort of contraction principle telling us to expect a fixed
point. Also, those extrapolations might be problematic, and evaluations
of f[n][2.8] do not seem to converge terribly well, so it may be that
the range needs to be expanded or contracted to get reliable results.
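One quick way to probe for a contraction, following the same idea: after running the iteration, look at the sup-norm of successive differences on a grid. This is my own addition, not from the original post; the name diffs is mine, and it assumes Do[sol[j], {j, 100}] has already been run.

```mathematica
(* Sup-norm distance between consecutive iterates on a grid over [0, 4].
   Roughly geometric decay would suggest a contraction mapping, and
   hence a fixed point to converge to. *)
diffs = Table[
   Max[Table[Abs[f[n][t] - f[n - 1][t]], {t, 0., 4., 0.1}]], {n, 2, 100}];
ListLogPlot[diffs]
```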
Daniel Lichtblau
Wolfram Research