MathGroup Archive 2011

Re: Import files on accessible URL and save in

  • To: mathgroup at smc.vnet.net
  • Subject: [mg122758] Re: Import files on accessible URL and save in
  • From: DrMajorBob <btreat1 at austin.rr.com>
  • Date: Thu, 10 Nov 2011 06:49:50 -0500 (EST)
  • Delivered-to: l-mathgroup@mail-archive0.wolfram.com
  • References: <201111050945.EAA10302@smc.vnet.net>
  • Reply-to: drmajorbob at yahoo.com

But this also fails:

"/private/var/folders/kY/kYL-hxebGaefEKwlddWc3++++TM/-Tmp-/\
Temp7483389857745393361___www.wolfram.com" // DeleteFile

$Failed

DeleteFile::nffil: File not found during  
DeleteFile[/private/var/folders/kY/kYL-hxebGaefEKwlddWc3++++TM/-Tmp-/Temp7483389857745393361___www.wolfram.com].  
>>

It strikes me as a strange default location for imported files (on a Mac),
one where a non-Unix expert can't easily deal with them.
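
For cleanup, something along these lines should work (a minimal sketch, assuming the file still exists when DeleteFile runs and that FetchURL writes into $TemporaryDirectory, as the path above suggests; the "Temp*___*" pattern is only a guess at the naming scheme):

    << Utilities`URLTools`

    (* capture the returned path so the exact file can be deleted later *)
    file = FetchURL["http://www.wolfram.com"];

    (* delete only if the file is still there; the kernel may have
       cleaned it up already, which would explain the nffil message *)
    If[FileExistsQ[file], DeleteFile[file]]

    (* list any leftover FetchURL downloads in the temp directory;
       "Temp*___*" is an assumed pattern based on the names above *)
    FileNames["Temp*___*", $TemporaryDirectory]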

Bobby

On Wed, 09 Nov 2011 05:24:33 -0600, Todd Gayley <tgayley at wolfram.com>  
wrote:

> At 04:24 PM 11/8/2011, DrMajorBob wrote:
>> This fails, and I wonder why?
>>
>> DeleteDirectory["/private/var/folders/kY", DeleteContents -> True]
>>
>> $Failed
>>
>> DeleteDirectory::dirne: Directory /private/var/folders/kY not empty. >>
>
>
> I would guess it is because other processes have files open in that
> directory. Why are you trying to delete an entire directory (two
> levels up, no less), when FetchURL just created one file?
>
> As for your question about documentation, Utilities`URLTools`FetchURL
> is not an officially documented part of the system, which could be
> for a number of reasons. However, it has been a stable part of
> Mathematica for long enough (it is used internally by Import) that I
> am comfortable recommending it to users who need its functionality.
>
>
> Todd Gayley
> Wolfram Research
>
>
>> On Tue, 08 Nov 2011 16:08:46 -0600, DrMajorBob <btreat1 at austin.rr.com>
>> wrote:
>>
>>> <<Utilities`URLTools`
>>> FetchURL["http://www.wolfram.com"]
>>> "/private/var/folders/kY/kYL-hxebGaefEKwlddWc3++++TM/-Tmp-/Temp7483389857745393361___www.wolfram.com"
>>>
>>> Good job! Now, then... where is that file, and how do I remove it?
>>>
>>> Bobby
>>>
>>> On Tue, 08 Nov 2011 06:14:50 -0600, Todd Gayley <tgayley at wolfram.com>
>>> wrote:
>>>
>>>> At 03:45 AM 11/5/2011, Gy Peng wrote:
>>>>> Dear Mathematica friends,
>>>>>
>>>>> I want to Import some files stored on internet by:
>>>>>
>>>>> Import["http://url", ...]
>>>>>
>>>>> This allows me to see the file in a Mathematica notebook. However,
>>>>> what I want to do is download and save the files on my local
>>>>> machine, and I may have hundreds or thousands of files that need
>>>>> to be downloaded and saved. Is there a fast, well-optimized way to
>>>>> do this in Mathematica?
>>>>>
>>>>> Can the file names be given as a single list in Mathematica, like:
>>>>>
>>>>> { file1, file2, file3, file4, ...}
>>>>>
>>>>> How could I write a loop in Mathematica to download and save all of
>>>>> them to the same directory on my local machine?
>>>>>
>>>>> Thank you all for your kind replies and help!
>>>>
>>>>
>>>> The FetchURL function, which is used internally by Import, will do
>>>> what you want:
>>>>
>>>>     In[10]:= << Utilities`URLTools`
>>>>
>>>>     In[11]:= FetchURL["http://www.wolfram.com"]
>>>>
>>>>     Out[11]= "C:\\Users\\tgayley\\AppData\\Local\\Temp\\Temp65400___www.wolfram.com"
>>>>
>>>> If you want to choose a specific location and filename for the
>>>> downloaded file, you pass it as the second argument:
>>>>
>>>>     In[12]:= FetchURL["http://www.wolfram.com", "c:\\foo.txt"]
>>>>
>>>>     Out[12]= "c:\\foo.txt"
>>>>
>>>> If you want a program that downloads a list of URLs to a specific
>>>> directory, this will do the trick.
>>>>
>>>>     listOfURLs = {"http://www.wolfram.com", "http://www.apple.com",
>>>>        "http://www.microsoft.com"};
>>>>     downloadDir = "c:\\foo";  (* I assume it exists *)
>>>>     n = 1;
>>>>     {#, FetchURL[#, ToFileName[downloadDir, "file" <> ToString[n++]]]} & /@
>>>>        listOfURLs
>>>>
>>>> This returns a list of {url, filename} pairs so you can match up each
>>>> URL with the corresponding file. You could also create filenames
>>>> based on the URL, so that it is immediately obvious which file is
>>>> the product of which URL (sketched after the quoted thread below).
>>>>
>>>>
>>>> Todd Gayley
>>>> Wolfram Research
>>>>
>>>
>>
>>
>> --
>> DrMajorBob at yahoo.com
>
>
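
Todd's closing idea above, sketched out (illustrative only, not code from the thread: urlToFileName and its replacement rules are invented here, and Check simply records $Failed for any URL that fails to download, which matters with "hundreds or thousands" of files):

    << Utilities`URLTools`

    (* hypothetical helper: turn a URL into a filename-safe string *)
    urlToFileName[url_String] :=
      StringReplace[url, {"http://" -> "", "/" -> "_", ":" -> "_"}]

    listOfURLs = {"http://www.wolfram.com", "http://www.apple.com",
       "http://www.microsoft.com"};
    downloadDir = "c:\\foo";  (* assumed to exist, as in Todd's example *)

    (* {url, file} pairs; Check yields $Failed instead of aborting *)
    {#, Check[FetchURL[#, ToFileName[downloadDir, urlToFileName[#]]],
        $Failed]} & /@ listOfURLs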


-- 
DrMajorBob at yahoo.com


