Re: functional code
*To*: mathgroup at smc.vnet.net
*Subject*: [mg4815] Re: functional code
*From*: danl (Daniel Lichtblau)
*Date*: Fri, 20 Sep 1996 01:12:38 -0400
*Organization*: Wolfram Research, Inc.
*Sender*: owner-wri-mathgroup at wolfram.com
In article <51c52l$b2u at ralph.vnet.net> gaylord at ux1.cso.uiuc.edu (richard
j. gaylord) writes:
> i received the following query from someone and thought i'd post the
> problem and a solution i've come up with:
>
> query:
>
> Here is my non-functional solution to this problem:
>
> Given a list of numbers row={18,19,1,11,25,12,22,14}
> Select the numbers from the list by taking the largest number
> from the ends of the list until the list is empty.
>
> row={18,19,1,11,25,12,22,14};
> <code deleted>
> result
>
> {18,19,14,22,12,25,11,1}
> <...>
I have seen several solutions to this problem. I would not want to run any
of them on a list of 100K elements. If you first run them on
row = Table[Random[Integer, 1000], {500}];
and then on
row = Table[Random[Integer, 1000], {1000}];
you will see that each is O(n^2). But the problem can clearly be done in
O(n). I will summarize some of the timings for the case where row has 1000
elements. These were run on a NeXT turbo.
rowtemp = row;
p = Length[rowtemp];
result = {};
Timing[
  Do[
    If[First[rowtemp] >= Last[rowtemp],
      AppendTo[result, First[rowtemp]]; rowtemp = Rest[rowtemp],
      AppendTo[result, Last[rowtemp]]; rowtemp = Drop[rowtemp, -1]],
    {p}];
] (* 7 sec *)
Nest[
  Function[y,
    ({Join[y[[1]], {#}], DeleteCases[y[[2]], #]}) &[
      Max[First[y[[2]]], Last[y[[2]]]]]],
  {{}, row}, Length[row]][[1]] // Timing (* 14.4 sec, and error messages *)
Timing[{{},row}//.{{d___},{a_,b___,c_}}:>
{{d,Max[a,c]},If[a<c,{a,b},{b,c}]}//Flatten] (* 13.9 sec *)
{{},row}//.{
{{d___},{a_,b___,c_}}/;a<c :> {{d,c},{a,b}},
{{d___},{a_,b___,c_}}/;a>=c :> {{d,a},{b,c}}} //
Flatten //Timing (* 11.5 sec *)
Apply[
(#1 - #2)&,
Partition[
(Plus @@ #)& /@ NestList[
If[First[#] >= Last[#], Rest[#], Drop[#, -1]]&,
row,
Length[row]
],
2,
1
],
1
] // Timing (* 15 sec *)
The problem these solutions have with large inputs (that is, the cause of
the O(n^2) behavior) is twofold. First, several use AppendTo[list, element]
rather than list = {list, element} followed, after the loop, by
list = Flatten[list]. AppendTo must copy the entire list on each call,
rather than just adding a reference to the new element; hence the O(n^2)
cost. This is easily remedied.
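The cost difference is easy to see in any language with array lists. Here is a hypothetical Python timing sketch (my addition, not part of the original post) contrasting copy-on-append, which mimics AppendTo, with in-place accumulation, which plays the role of the {list, element} / Flatten idiom:

```python
import timeit

def build_by_copy(n):
    # Analogue of AppendTo: every step copies the whole list -> O(n^2) total.
    result = []
    for i in range(n):
        result = result + [i]   # allocates a fresh list of length i+1
    return result

def build_in_place(n):
    # Analogue of result = {result, elem} ... Flatten: no full copies -> O(n).
    result = []
    for i in range(n):
        result.append(i)        # amortized O(1) per element
    return result

t_copy  = timeit.timeit(lambda: build_by_copy(2000), number=5)
t_place = timeit.timeit(lambda: build_in_place(2000), number=5)
print(t_copy, t_place)          # copy-on-append is dramatically slower
```

The {list, element} trick works because Mathematica only adds a reference into the nested structure at each step; the single Flatten at the end is O(n).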
The second typical problem is that each new list of candidates is formed by
dropping an element from the old one. This, too, copies O(n) elements O(n)
times. I do not see an obvious functional-programming workaround that makes
this O(n). My conclusion is that one should either use procedural code for
a problem that is best solved by procedural means, or else find a
functional solution that avoids all the list copying. The latter is beyond
my capability; for the former, I give an O(n) procedural solution below.
takeLargestFromEnds[input_List] := Module[
  {j = 1, k = Length[input], result = {}},
  While[j <= k,
    If[input[[j]] > input[[k]],
      result = {result, input[[j]]}; j++,
      result = {result, input[[k]]}; k--];
  ];
  Flatten[result]
]
(tl1 = takeLargestFromEnds[row]); // Timing (* 1.25 sec *)
row = Table[Random[Integer, 1000], {10000}];
(tl2 = takeLargestFromEnds[row]); // Timing (* 12.5 sec *)
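For readers outside Mathematica, the same two-pointer idea can be sketched in Python (an editor's translation, not part of the original post): walk one index in from each end and repeatedly take the larger of the two end elements.

```python
def take_largest_from_ends(row):
    # O(n) two-pointer version: repeatedly take the larger of the two ends.
    j, k = 0, len(row) - 1
    result = []
    while j <= k:
        if row[j] > row[k]:
            result.append(row[j])
            j += 1
        else:
            result.append(row[k])
            k -= 1
    return result

print(take_largest_from_ends([18, 19, 1, 11, 25, 12, 22, 14]))
# → [18, 19, 14, 22, 12, 25, 11, 1]
```

Because each step does O(1) work and removes one element from consideration, the whole pass is linear, matching the 1.25 s / 12.5 s scaling seen above for 1000 vs 10000 elements.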
Daniel Lichtblau
Wolfram Research, Inc.
danl at wolfram.com