using FileNames on large, deep directories
- To: mathgroup at smc.vnet.net
- Subject: [mg99372] using FileNames on large, deep directories
- From: Michael <michael2718 at gmail.com>
- Date: Mon, 4 May 2009 06:00:08 -0400 (EDT)
I ran into a problem today using FileNames and was wondering if
anybody else has encountered it before, and if so, how they solved it.
The problem is that FileNames can potentially take a very long time to
run; in my case it took about 2 hours. This is because it was reading
the file structure of a DVD, and one particular sub-directory had an
enormous number of files scattered across about 80 directories. A
simple option to exclude that directory would have saved about 1.9
hours of run-time.
I'm assuming I could solve the problem by manually recursing through the
directories and building up the file list by calling FileNames on each
directory separately, but it seems like it would be a lot cleaner if
FileNames had additional options similar to those of the 'find' command
(e.g. -prune).
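For what it's worth, here is a minimal sketch of that manual-recursion
workaround. The names prunedFileNames and pruneQ are mine, not part of
the built-in FileNames; the idea is just to descend one level at a time
and skip any directory the predicate matches:

```mathematica
(* Sketch: recurse manually, skipping directories for which pruneQ
   returns True. prunedFileNames and pruneQ are hypothetical names. *)
prunedFileNames[dir_String, pruneQ_] :=
 Module[{entries = FileNames["*", dir]},
  Join[
   (* plain files at this level *)
   Select[entries, ! DirectoryQ[#] &],
   (* recurse only into sub-directories that are not pruned *)
   Flatten[
    prunedFileNames[#, pruneQ] & /@
     Select[entries, DirectoryQ[#] && ! pruneQ[#] &], 1]]]
```

For example, to skip the huge directory on the DVD you might call
something like prunedFileNames["E:", StringMatchQ[#, "*VIDEO_TS"] &]
(the path and pattern here are placeholders). This visits each directory
exactly once, so pruning a subtree avoids all of its I/O cost, which is
where the bulk of the run-time went in the case described above.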