Re: Appending to Lists
- To: mathgroup at smc.vnet.net
- Subject: [mg27063] Re: [mg27045] Appending to Lists
- From: Matt.Johnson at autolivasp.com
- Date: Sat, 3 Feb 2001 04:58:57 -0500 (EST)
- Sender: owner-wri-mathgroup at wolfram.com
James- This is a modified form of a function Allan Hayes sent me a while back. It takes a list, a starting value, and an increment, and divides the list into sublists based on the first element of each entry:

bin2[data_, start_, di_] :=
  Part[data, #[[All, 2]]] & /@
    Split[Sort[Transpose[{(data[[All, 1]] + start)/di // Floor,
        Range[Length[data]]}]],
      First[#1] == First[#2] &];

Using this function, we can get the proper groups and then add the necessary zeros:

fun[b_, c_] := Module[{bins},
  bins = #[[All, 2]] & /@ bin2[Transpose[{c, b}], 0.5, 1];
  Map[PadRight[#, Max[Length /@ bins]] &, bins]]

A short test for accuracy:

In[176]:= b = Table[Random[Real, {-2, 2}], {10}];
          c = Table[Random[Integer, {1, 3}], {10}];
          Sort[Transpose[{c, b}]]

Out[178]= {{1, -1.75734}, {1, -0.517697}, {1, 0.772872}, {1, 1.80808},
           {2, -1.99263}, {2, -1.39474}, {2, -1.02134}, {2, 0.2556},
           {3, 1.51904}, {3, 1.83166}}

In[179]:= fun[b, c]

Out[179]= {{0.772872, -1.75734, 1.80808, -0.517697},
           {0.2556, -1.99263, -1.02134, -1.39474},
           {1.83166, 1.51904, 0, 0}}

And the grand timing tests:

In[281]:= b = Table[Random[Real, {-20, 20}], {100000}];
          c = Table[Random[Integer, {1, 1000}], {100000}];

In[283]:= xf1 = fun[b, c]; // Timing

Out[283]= {5.758 Second, Null}

In[293]:= dofun[b_, c_] := Module[{xfinal = {}},
            Do[AppendTo[xfinal,
                Flatten[Part[b, #] & /@ Position[c, i]]],
              {i, 1, Max[c]}];
            long = Sort[xfinal, Length[#1] > Length[#2] &];
            table = Table[0.0, {Length[First[long]]}];
            Do[If[Length[Part[xfinal, i]] < Length[First[long]],
                AppendTo[Part[xfinal, i],
                  Drop[table, Length[Part[xfinal, i]]]]],
              {i, 1, Length[xfinal]}];
            xf2 = Map[Flatten[#, 1] &, xfinal];];

In[294]:= dofun[b, c]; // Timing

Out[294]= {133.983 Second, Null}

-matt
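A note on the approach: since the keys in c here are already the group labels, the intermediate index bookkeeping in bin2 can also be skipped by sorting and splitting the {key, value} pairs directly. A minimal sketch along the same lines (groupPad is a hypothetical name; like fun, it only returns groups for keys that actually occur in c):

groupPad[b_, c_] := Module[{groups},
  (* sort the {key, value} pairs so equal keys are adjacent,
     then split wherever the key changes *)
  groups = #[[All, 2]] & /@
    Split[Sort[Transpose[{c, b}]], First[#1] == First[#2] &];
  (* pad every group with zeros up to the length of the longest group *)
  PadRight[#, Max[Length /@ groups]] & /@ groups]

On the test data above, groupPad[b, c] gives the same groups as fun[b, c], up to the order of elements within each group (which, per the original question, does not matter).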
"James Jones" <j.k.jones at dl.ac.uk> on 02/01/2001 01:00:32 AM

Subject: [mg27063] [mg27045] Appending to Lists

Hi,

I have a function that creates a list (a) from another list (b). The elements are regrouped in the new list according to a third list (c). A Position command is applied to list (c) for an element, and with this output list (a) is built from list (b) at the positions given by that element's position data in list (c). This is repeated for the large number of elements in the original lists. The Position command is necessary because different elements appear in the list different numbers of times. However, with the large number of elements in the lists (approx. 50,000 for a simple list), this method is _very_ slow. If anyone can help me speed this process up, I would be very grateful.

The data sets would look like this:

     b     c
   0.2     1
   0.6     2
   1.2     3
  -0.2     1
   0.5     2
   0.3     1
   0.7     2
  -0.2     1
  -0.6     1

A list (the list (a)) would then be created from this data, containing vectors for 1, 2, and 3. The data in (b) is not important, and the order in which elements in (c) drop out is not set. In this case the (a) list should look like

a = {{0.2, -0.2, -0.2, -0.6}, {0.6, 0.5, 0.7}, {1.2}}

My current function looks like this:

Do[AppendTo[xfinal,
    Flatten[Part[X, #] & /@ Position[Global`PARTICLE, i]]],
  {i, 1, Max[PARTICLE]}];

where xfinal is an (a) list, i.e. the list to be created; X is the (b) list, i.e. the list to be addressed; and PARTICLE is the (c) list, referenced by number. And it is very slow!

Also, after producing this list, the different vector elements need to be made the same length, so 0.0s are added to the ends of all vector elements shorter than the longest. My current function for doing this looks like:

table = Table[0.0, {Length[First[long]]}];
Print["Table Created!"];
Do[If[Length[Part[xfinal, i]] < Length[First[long]],
    AppendTo[Part[xfinal, i],
      Drop[table, Length[Part[xfinal, i]]]]],
  {i, 2, Length[xfinal]}];

where the list (long) is just the elements of xfinal sorted by length. This function is also very slow, and I was wondering, again, whether anyone knows a faster way of implementing it. Is creating a table once and then dropping pieces off and appending the fastest method? Of course, this needs to be done tens of thousands of times per set of data, so any small speed increase would be very helpful ;->

Again, any help is much appreciated,

James Jones
Daresbury Laboratory
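For reference, the zero-padding step described above is exactly the part that PadRight collapses to a single map in Matt's fun. Note also that AppendTo[Part[xfinal, i], Drop[table, ...]] nests the padding as one element rather than splicing it in, which is why Matt's dofun ends with a Flatten map. A minimal sketch of the PadRight version, assuming xfinal already holds the grouped vectors (maxLen and padded are hypothetical names):

(* pad every vector in xfinal with 0.0 up to the length of the longest one *)
maxLen = Max[Length /@ xfinal];
padded = PadRight[#, maxLen, 0.0] & /@ xfinal;

This avoids the repeated AppendTo, which copies the list it grows on every call and so becomes expensive inside a long Do loop.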