Re: Rearranging a data array containing calendrical as well as data entries.
- To: mathgroup at smc.vnet.net
- Subject: [mg54867] Re: Rearranging a data array containing calendrical as well as data entries.
- From: Mark Fisher <mark at markfisher.net>
- Date: Fri, 4 Mar 2005 05:08:02 -0500 (EST)
- Organization: BellSouth Internet Service
- References: <d0614i$krg$1@smc.vnet.net>
- Sender: owner-wri-mathgroup at wolfram.com
The following code does part of what you want: it reformats the data into
date/value pairs such as

{{{1999, 1, 1}, 489.82}, {{1999, 1, 2}, 489.82}, {{1999, 1, 3}, 495.01}, ...}

(* using Version 5.1 *)
stringlines =
  Import["http://www.gilmarlily.netfirms.com/download/flow.dat", "Lines"];

(* convert strings to numbers: the first 4 characters of each line hold
   the year; StringCases pulls the remaining numbers from the rest *)
lines = ToExpression[
   Flatten /@ Transpose[{StringTake[stringlines, 4],
      StringCases[StringDrop[stringlines, 4], NumberString]}]];

(* group rows into months: keep the {year, month} prefix, drop the row
   number, and Split on runs with the same {year, month} *)
monthgroups = {#[[1, 1]], Flatten[#[[All, 2]]]} & /@
   Split[{Take[#, 2], Drop[#, 3]} & /@ lines, #1[[1]] == #2[[1]] &];

(* reorganize into {{year, month, day}, value} pairs by appending the
   running day-of-month index to each month's {year, month} *)
pairs = Flatten[
   Transpose[{Function[x, Append[#[[1]], x]] /@ Range[Length[#[[2]]]],
      #[[2]]}] & /@ monthgroups, 1];

--Mark

Gilmar wrote:
> Dear Mathematica User Friends:
>
> I have a file containing flow data from the USGS, in the following
> format:
>
> 1999 1 1 489.82 489.82 495.01 495.01 495.01 495.01 495.01 490.51
> 1999 1 2 490.51 490.51 490.51 490.51 490.38 490.38 490.38 490.38
> 1999 1 3 490.38 510.38 510.38 510.38 510.38 510.38 528.66 528.66
> 1999 1 4 528.66 528.66 528.66 501.68 501.68 501.68 501.68
> 1999 2 1 501.68 496.44 496.44 496.44 496.44 496.44 478.72 478.72
> 1999 2 2 478.72 478.72 478.72 452.82 452.82 452.82 452.82 452.82
> 1999 2 3 450.19 450.19 450.19 450.19 450.19 443.98 443.98 443.98
> 1999 2 4 443.98 443.98 440.14 440.14
> 1999 3 1 440.14 440.14 440.14 453.64 453.64 453.64 453.64 453.64
> 1999 3 2 503.98 503.98 503.98 503.98 503.98 500.84 500.84 500.84
> 1999 3 3 500.84 500.84 473.48 473.48 473.48 473.48 473.48 463.19
> 1999 3 4 463.19 463.19 463.19 463.19 457.54 457.54 457.54
>
> This format is used by the USGS to compress their data records.
>
> Each row contains:
> Year, Month Number (1 to 12), Row Number (1 to 4), and data entries.
>
> The first row:
> 1999 1 1 489.82 489.82 495.01 495.01 495.01 495.01 495.01 490.51
> contains flow values corresponding to January 1 to January 8
> of the year 1999.
>
> The second row:
> 1999 1 2 490.51 490.51 490.51 490.51 490.38 490.38 490.38 490.38
> contains flow values corresponding to January 9 to January 16
> of the year 1999.
>
> The third row:
> 1999 1 3 490.38 510.38 510.38 510.38 510.38 510.38 528.66 528.66
> contains flow values corresponding to January 17 to January 24
> of the year 1999.
>
> The fourth row:
> 1999 1 4 528.66 528.66 528.66 501.68 501.68 501.68 501.68
> contains flow values corresponding to January 25 to January 31
> of the year 1999.
>
> I think you get the picture of how this data set is assembled.
>
> What I need is a program that can turn the above-mentioned horizontal
> array into a simple vertical array containing two columns: the first
> column contains the dates when the data was collected, and the second
> column contains the flow values, i.e.
>
> 01Jan1999 489.82
> 02Jan1999 489.82
> 03Jan1999 495.01
> etc.
>
> If I give the program a starting date and an ending date for an
> arbitrary record, the program should be able to allocate two
> arrays to:
>
> (1.) put the dates between the starting date and the ending date
> into the first column of the vertical array, and
>
> (2.) correctly match those dates with the data appearing in
> the second column of the vertical array.
>
> The program should distinguish between regular years and leap
> years. Those of you who are still using FORTRAN, and have
> experienced how difficult it is to deal with date functions
> there, might sympathize with my request.
>
> P.S. To get a larger set of USGS flow data to test your program,
> please download the following file:
>
> http://www.gilmarlily.netfirms.com/download/flow.dat
>
> Thank you for your help!
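
For the parts of the request the code above leaves open -- restricting the
output to a starting and ending date, and printing dates in the 01Jan1999
style -- something along the following lines should work. This is only a
sketch built on the pairs list from above; the helper names selectRange and
formatPair are just for illustration, and DateString exists only in
Mathematica 6 and later, so under the Version 5.1 used above the month name
would need a small lookup table instead.

(* keep the pairs whose {year, month, day} lies between start and end;
   equal-length numeric lists compare element by element in canonical
   order, so OrderedQ gives the usual date ordering *)
selectRange[data_, start_, end_] :=
  Select[data, OrderedQ[{start, #[[1]], end}] &]

(* format one pair, e.g. {{1999, 1, 2}, 489.82} -> {"02Jan1999", 489.82};
   DateString requires Version 6 or later *)
formatPair[{date_, value_}] :=
  {DateString[date, {"Day", "MonthNameShort", "Year"}], value}

formatPair /@ selectRange[pairs, {1999, 1, 1}, {1999, 1, 31}]

No leap-year logic is needed anywhere: the day numbers come from
Range[Length[...]] over the values actually present in each month, so a
29-day February simply yields days 1 through 29.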