more on vector unions
*To*: mathgroup at smc.vnet.net
*Subject*: [mg23236] more on vector unions
*From*: Russell Towle <rustybel at foothill.net>
*Date*: Sat, 29 Apr 2000 22:04:45 -0400 (EDT)
*Sender*: owner-wri-mathgroup at wolfram.com
David Park supplied a new "vunion" (for "vector union") function which
vastly improved upon my own, and incorporated Carl Woll's curious
OrderedUnion function. Park's vunion, though, operated upon a list of
Round-ed proxies for the real-valued vectors.
This distressed me a little, since if my polyhedra were to grow to a
million miles across, well, Round might lead to problems.
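The worry is easy to demonstrate (my own sketch, not Park's actual code): two points lying within any tolerance of one another can still straddle a half-integer, and so Round to different proxies:

```mathematica
(* nearly identical points on either side of a half-integer *)
p1 = {0.4999999, 1., 1.};
p2 = {0.5000001, 1., 1.};
{Round[p1], Round[p2]}
(* -> {{0, 1, 1}, {1, 1, 1}}: different proxies, so a proxy-based
   union would wrongly treat p1 and p2 as distinct *)
```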
Carl Woll was kind enough to devise a vector union function too. Here is
what he wrote:
>The vunion function I have in mind would declare two points to be the same
>if the maximum of the difference of the coordinates were less than some
>tolerance, which I'll abbreviate by tol. First, I'll define a vunion
>function which carries out the above:
>
>
>tol=0.000001;
>
>vunion[v_] := Union[v, SameTest ->
> (Max[Abs[#1 - #2]] < tol &)]
>
>For a large set of points, this vunion function is very slow, since given
>a set of N points, it goes like O(N^2). This is because the current Union
>function works by comparing every possible pair of points to determine if
>they are the same. My idea for speeding up this function is to first split
>up the data set into a disjoint union of smaller sets where the first
>coordinates of these smaller sets are basically (but not really) a
>distance tol apart, which is an O(N log N) process. The "not really" in the
>previous sentence is there because, if the tolerance is large enough, the
>splitting may not break up the set into smaller pieces after all. For
>example, with a tolerance of .1, the following data set will not be broken
>into smaller pieces:
>
>{{.05,1,1},{.1,1,1},{.15,1,1},{.2,1,1},{.25,1,1}}
>
>even though the first point and the last point are clearly farther than a
>tolerance apart. At any rate, after splitting up into smaller subsets,
>apply the vunion function to each of these smaller subsets. Hence, my
>modified vunion function is
>
>vunion1[v_] := Module[{r},
> r = Split[Sort[v], (Abs[#1[[1]] - #2[[1]]] < tol &)];
> Flatten[vunion /@ r, 1]]
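Woll's chaining caveat is easy to check with his own example data (a sketch, using the tol = 0.1 from his note):

```mathematica
tol = 0.1;
data = {{.05, 1, 1}, {.1, 1, 1}, {.15, 1, 1}, {.2, 1, 1}, {.25, 1, 1}};
Split[Sort[data], (Abs[#1[[1]] - #2[[1]]] < tol &)]
(* every adjacent pair is within tol, so Split returns a single
   piece even though the first and last points differ by 0.2 *)
```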
Initial tests of this vunion1 function suggest that it is not only accurate
but very fast; it found the union of a list of 8448 3-vectors in 2.95
seconds. However, I myself believe it should be modified to Sort a Chop-ped
list:
vunion1[v_] := Module[{r},
  r = Split[Sort[ Chop[v, tol] ], (Abs[#1[[1]] - #2[[1]]] < tol &)];
  Flatten[vunion /@ r, 1]]
I think this is crucial, because a typical list of vectors (vertices of
polytopes, actually) with many duplicates will have many components close
to zero--which actually should be zero--but which differ in sign. Chop
fixes this and Sort proceeds as it should... right?
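A small illustration of the point (my own example, not from the posts above): two copies of a vertex whose near-zero component picked up opposite signs from floating-point round-off become exactly equal after Chop:

```mathematica
tol = 0.000001;
pts = {{1.*10^-12, 1., 1.}, {-1.*10^-12, 1., 1.}};
Chop[pts, tol]
(* -> {{0, 1., 1.}, {0, 1., 1.}}: the tiny components collapse to
   exact 0, so Sort and Split now see the duplicates as identical *)
```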
Russell Towle
Box 141
Dutch Flat, CA 95714
(530) 389-2872