Re: Import files on accessible URL and save in local machine.
- To: mathgroup at smc.vnet.net
- Subject: [mg122730] Re: Import files on accessible URL and save in local machine.
- From: DrMajorBob <btreat1 at austin.rr.com>
- Date: Wed, 9 Nov 2011 06:22:34 -0500 (EST)
- Delivered-to: l-mathgroup@mail-archive0.wolfram.com
- References: <201111050945.EAA10302@smc.vnet.net>
- Reply-to: drmajorbob at yahoo.com
> If you want to choose a specific location and filename for the
> downloaded file, you pass it as the second argument:

For WHAT downloaded file, one instantly wonders? Is this e-mail the sum
total of documentation for FetchURL?

Bobby

On Tue, 08 Nov 2011 06:14:50 -0600, Todd Gayley <tgayley at wolfram.com>
wrote:

> At 03:45 AM 11/5/2011, Gy Peng wrote:
>> Dear Mathematica friends,
>>
>> I want to Import some files stored on the internet by:
>>
>> Import["http://url", ...]
>>
>> This allows me to see the file in a Mathematica notebook. However,
>> what I want to do is download and save the files on my local
>> machine. And since I may have hundreds or thousands of files to
>> download and save, is there any way in Mathematica to do this in an
>> optimized, fast way?
>>
>> Can I also read the file names as a single list in Mathematica, like:
>>
>> {file1, file2, file3, file4, ...}
>>
>> How could I write a loop in Mathematica to download and save all of
>> them to the same directory on my local machine?
>>
>> I would thank you for all your kind replies and help!
>
> The FetchURL function, which is used internally by Import, will do
> what you want:
>
> In[10]:= << Utilities`URLTools`
>
> In[11]:= FetchURL["http://www.wolfram.com"]
>
> Out[11]=
> "C:\\Users\\tgayley\\AppData\\Local\\Temp\\Temp65400___www.wolfram.com"
>
> If you want to choose a specific location and filename for the
> downloaded file, you pass it as the second argument:
>
> In[12]:= FetchURL["http://www.wolfram.com", "c:\\foo.txt"]
>
> Out[12]= "c:\\foo.txt"
>
> If you want a program that downloads a list of URLs to a specific
> directory, this will do the trick.
>
> listOfURLs = {"http://www.wolfram.com", "http://www.apple.com",
>     "http://www.microsoft.com"};
> downloadDir = "c:\\foo"; (* I assume it exists *)
> n = 1;
> {#, FetchURL[#, ToFileName[downloadDir, "file" <> ToString[n++]]]} & /@
>     listOfURLs
>
> This returns a list of {url, filename} pairs so you can match up each
> URL with the corresponding file. You could also create filenames
> based on the URL so that it is immediately obvious which file is the
> product of which URL.
>
>
> Todd Gayley
> Wolfram Research

--
DrMajorBob at yahoo.com
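[Editor's note: as a sketch of Todd's closing suggestion (deriving each
filename from its URL instead of using a counter), something along these
lines should work with the same Utilities`URLTools` package. The helper
name urlToName and the particular StringReplace mapping are illustrative
choices, not part of the original post:]

```mathematica
<< Utilities`URLTools`

listOfURLs = {"http://www.wolfram.com", "http://www.apple.com",
    "http://www.microsoft.com"};
downloadDir = "c:\\foo"; (* assumed to exist, as in the original *)

(* Hypothetical helper: strip the scheme and replace path separators,
   so "http://www.wolfram.com" becomes "www.wolfram.com" *)
urlToName[url_String] :=
  StringReplace[url, {"http://" -> "", "https://" -> "", "/" -> "_"}]

(* Same {url, filename} pairing as Todd's version, but the filename
   now makes the source URL obvious at a glance *)
{#, FetchURL[#, ToFileName[downloadDir, urlToName[#]]]} & /@ listOfURLs
```

[Note that URLs containing characters illegal in filenames on the target
platform would need additional replacements in urlToName.]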
- References:
- Import files on accessible URL and save in local machine.
- From: Gy Peng <hitphyopt@gmail.com>