MathGroup Archive 2011

Re: Import files on accessible URL and save in

  • To: mathgroup at smc.vnet.net
  • Subject: [mg122731] Re: Import files on accessible URL and save in
  • From: DrMajorBob <btreat1 at austin.rr.com>
  • Date: Wed, 9 Nov 2011 06:22:45 -0500 (EST)
  • Delivered-to: l-mathgroup@mail-archive0.wolfram.com
  • References: <201111050945.EAA10302@smc.vnet.net>
  • Reply-to: drmajorbob at yahoo.com

<<Utilities`URLTools`
FetchURL["http://www.wolfram.com";]
"/private/var/folders/kY/kYL-hxebGaefEKwlddWc3++++TM/-Tmp-/Temp7483389857745393361___www.wolfram.com"

Good job! Now, then... where is that file, and how do I remove it?
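
A minimal sketch of one way to clean it up, assuming the string FetchURL
returns really is the full path to the temporary copy (the path above looks
like it sits under the system temporary directory, $TemporaryDirectory):

    tmp = FetchURL["http://www.wolfram.com"];
    FileExistsQ[tmp]   (* True while the downloaded copy is still on disk *)
    DeleteFile[tmp]    (* remove it; FileExistsQ[tmp] gives False afterwards *)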

Bobby

On Tue, 08 Nov 2011 06:14:50 -0600, Todd Gayley <tgayley at wolfram.com> wrote:

> At 03:45 AM 11/5/2011, Gy Peng wrote:
>> Dear Mathematica friends,
>>
>> I want to Import some files stored on the internet by:
>>
>> Import["http://url", ...]
>>
>> This allows me to see the file in a Mathematica notebook. However, what
>> I want to do is download and save the files on my local machine. And
>> since I may have hundreds or thousands of files that need to be
>> downloaded and saved, is there any way in Mathematica to do it in an
>> optimized and fast way?
>>
>> Can I read the file names into a single list in Mathematica, like:
>>
>> { file1, file2, file3, file4, ...}
>>
>> How could I make a loop in Mathematica to download and save all of
>> them into the same directory on my local machine?
>>
>> Thank you for all your kind replies and help!
>
>
> The FetchURL function, which is used internally by Import, will do
> what you want:
>
>     In[10]:= << Utilities`URLTools`
>
>     In[11]:= FetchURL["http://www.wolfram.com"]
>
>     Out[11]= "C:\\Users\\tgayley\\AppData\\Local\\Temp\\Temp65400___www.wolfram.com"
>
> If you want to choose a specific location and filename for the
> downloaded file, you pass it as the second argument:
>
>     In[12]:= FetchURL["http://www.wolfram.com", "c:\\foo.txt"]
>
>     Out[12]= "c:\\foo.txt"
>
> If you want a program that downloads a list of URLs to a specific
> directory, this will do the trick.
>
>     listOfURLs = {"http://www.wolfram.com", "http://www.apple.com",
>        "http://www.microsoft.com"};
>     downloadDir = "c:\\foo";  (* I assume it exists *)
>     n = 1;
>     {#, FetchURL[#, ToFileName[downloadDir, "file" <> ToString[n++]]]} & /@ listOfURLs
>
> This returns a list of {url, filename} pairs so you can match up each
> URL with the corresponding file. You could also create filenames
> based on the URL so that it is immediately obvious which file came
> from which URL.
>
>
> Todd Gayley
> Wolfram Research
>
>
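
Todd's closing suggestion, naming each downloaded file after its URL, might be
sketched as below. This is only a sketch: urlToFileName is a hypothetical
helper (not part of Utilities`URLTools`), and downloadDir is assumed to exist,
as in Todd's example.

    << Utilities`URLTools`

    downloadDir = "c:\\foo";

    (* turn a URL into a legal, recognizable filename *)
    urlToFileName[url_String] :=
      StringReplace[url, {"http://" -> "", "https://" -> "", "/" -> "_", ":" -> "_"}]

    (* download each URL and return {url, local file} pairs *)
    {#, FetchURL[#, ToFileName[downloadDir, urlToFileName[#]]]} & /@
      {"http://www.wolfram.com", "http://www.apple.com"}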


-- 
DrMajorBob at yahoo.com


