Re: Import files on accessible URL and save in
*To*: mathgroup at smc.vnet.net
*Subject*: [mg122799] Re: Import files on accessible URL and save in
*From*: Patrick Scheibe <pscheibe at trm.uni-leipzig.de>
*Date*: Thu, 10 Nov 2011 06:57:14 -0500 (EST)
*Delivered-to*: l-mathgroup@mail-archive0.wolfram.com
*References*: <201111050945.EAA10302@smc.vnet.net>
Hi,
I have written a small application which uses FetchURL to download and
install a Mathematica package of mine from within Mathematica, without
external tools like wget, tar, etc.
There I came across this behaviour too. On Windows, and only on
Windows, I was not able to remove my temporary "package installer"
file, although I had closed every file in use.
If I remember right, I could only remove it *after* killing the kernel,
which is a bit useless if you want to download, install, and remove the
temporary files in one function call and leave the user with a freshly
installed package.
Cheers
Patrick
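
For what it's worth, the pattern I describe above looks roughly like the
following sketch (the function name, package URL, and install location are
made up for illustration; the final DeleteFile is the step that can fail
on Windows until the kernel lets go of the file):

```mathematica
<< Utilities`URLTools`

(* Hypothetical sketch: download a package file to a temp location,
   copy it into the user's Applications directory, then try to clean
   up. On Windows the DeleteFile may fail until the kernel is quit. *)
installPackage[url_String] :=
  Module[{tmp},
    tmp = FetchURL[url];  (* downloads to an auto-generated temp file *)
    CopyFile[tmp,
      FileNameJoin[{$UserBaseDirectory, "Applications",
        FileNameTake[tmp]}]];
    Quiet[DeleteFile[tmp]]  (* best effort; may leave the temp file behind *)
  ]
```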
On Wed, 2011-11-09 at 06:24 -0500, Todd Gayley wrote:
> At 04:24 PM 11/8/2011, DrMajorBob wrote:
> >This fails, and I wonder why?
> >
> >DeleteDirectory["/private/var/folders/kY", DeleteContents -> True]
> >
> >$Failed
> >
> >DeleteDirectory::dirne: Directory /private/var/folders/kY not empty. >>
>
>
> I would guess it is because other processes have files open in that
> directory. Why are you trying to delete an entire directory (two
> levels up, no less), when FetchURL just created one file?
>
> As for your question about documentation, Utilities`URLTools`FetchURL
> is not an officially documented part of the system, which could be
> for a number of reasons. However, it has been a stable part of
> Mathematica for long enough (it is used internally by Import) that I
> am comfortable recommending it to users who need its functionality.
>
>
> Todd Gayley
> Wolfram Research
>
>
> >On Tue, 08 Nov 2011 16:08:46 -0600, DrMajorBob <btreat1 at austin.rr.com>
> >wrote:
> >
> >><<Utilities`URLTools`
> >>FetchURL["http://www.wolfram.com"]
> >>"/private/var/folders/kY/kYL-hxebGaefEKwlddWc3++++TM/-Tmp-/Temp7483389857745393361___www.wolfram.com"
> >>
> >>Good job! Now, then... where is that file, and how do I remove it?
> >>
> >>Bobby
> >>
> >>On Tue, 08 Nov 2011 06:14:50 -0600, Todd Gayley <tgayley at wolfram.com>
> >>wrote:
> >>
> >>>At 03:45 AM 11/5/2011, Gy Peng wrote:
> >>>>Dear Mathematica friends,
> >>>>
> >>>>I want to Import some files stored on internet by:
> >>>>
> >>>>Import["http://url", ...]
> >>>>
> >>>>This allows me to see the file in a Mathematica notebook. However,
> >>>>what I want to do is download and save the files on my local
> >>>>machine. And since I may have hundreds or thousands of files to
> >>>>download and save, is there any way in Mathematica to do this in an
> >>>>optimized and fast way?
> >>>>
> >>>>And can I read the file names into a single list in Mathematica, like:
> >>>>
> >>>>{ file1, file2, file3, file4, ...}
> >>>>
> >>>>How could I make a loop in Mathematica to download and save all of
> >>>>them to the same directory on my local machine?
> >>>>
> >>>>Thank you for all your kind replies and help!
> >>>
> >>>
> >>>The FetchURL function, which is used internally by Import, will do
> >>>what you want:
> >>>
> >>> In[10]:= << Utilities`URLTools`
> >>>
> >>> In[11]:= FetchURL["http://www.wolfram.com"]
> >>>
> >>> Out[11]=
> >>>"C:\\Users\\tgayley\\AppData\\Local\\Temp\\Temp65400___www.wolfram.com"
> >>>
> >>>If you want to choose a specific location and filename for the
> >>>downloaded file, you pass it as the second argument:
> >>>
> >>> In[12]:= FetchURL["http://www.wolfram.com", "c:\\foo.txt"]
> >>>
> >>> Out[12]= "c:\\foo.txt"
> >>>
> >>>If you want a program that downloads a list of URLs to a specific
> >>>directory, this will do the trick.
> >>>
> >>> listOfURLs = {"http://www.wolfram.com", "http://www.apple.com",
> >>>"http://www.microsoft.com"};
> >>> downloadDir = "c:\\foo"; (* I assume it exists *)
> >>> n = 1;
> >>> {#, FetchURL[#, ToFileName[downloadDir, "file" <>
> >>>ToString[n++]]]} & /@ listOfURLs
> >>>
> >>>This returns a list of {url, filename} pairs so you can match up each
> >>>URL with the corresponding file. You could also create filenames
> >>>based on the URL so that it was immediately obvious which file was
> >>>the product of which URL.
> >>>
> >>>
> >>>Todd Gayley
> >>>Wolfram Research
> >>>
> >>
> >
> >
> >--
> >DrMajorBob at yahoo.com
>
>
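Following up on Todd's suggestion of deriving filenames from the URLs
themselves, here is one possible sketch. The helper name urlToFileName is
made up, and it assumes downloadDir already exists; it simply turns
characters that are illegal in filenames into underscores:

```mathematica
<< Utilities`URLTools`

listOfURLs = {"http://www.wolfram.com", "http://www.apple.com"};
downloadDir = "c:\\foo";  (* assumed to exist *)

(* Hypothetical helper: derive a filesystem-safe name from a URL *)
urlToFileName[url_String] :=
  StringReplace[url, {"http://" -> "", "/" -> "_", ":" -> "_"}]

(* Returns {url, savedFile} pairs, so each file is traceable to its URL *)
{#, FetchURL[#, ToFileName[downloadDir, urlToFileName[#]]]} & /@ listOfURLs
```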