Re: 0/1 knapsack-like minimalization problem and file
*To*: mathgroup at smc.vnet.net
*Subject*: [mg108842] Re: 0/1 knapsack-like minimalization problem and file
*From*: Daniel Lichtblau <danl at wolfram.com>
*Date*: Sat, 3 Apr 2010 06:08:30 -0500 (EST)
Peter Sisak wrote:
> Hello,
>
> Thank you for your intense work so far on the subject,
> I am very much impressed by it. I am honestly getting the
> impression that going this deep, the exploration and
> comparison of all the strategies employed so far would
> already amount to a paper - since as you yourself also
> pointed out, it is an NP-complete problem as a form of the
> "subset sum" problem, and it might have a number of other
> uses.
There are papers related to this. Work that outlines the approach in
Mathematica internals may be found here.
http://library.wolfram.com/infocenter/Conferences/7517/
The interesting [read: unsettling] thing, though, is that everything there
appears to be wrong when applied to this particular problem. That is to
say, the "naive" branch-and-bound I showed in the last note works vastly
better than the more sophisticated preprocessing (discussed in material
at the URL above) with lattice reduction. The reason is that the
variable bounds get significantly worse. At this time I do not know why
that is, or what might be done to recognize, let alone remedy, these
situations. Such knowledge might in fact make for useful R&D. I am
reminded that a fair amount of progress begins with a question like "Why
does this [insert derogatory adjective] thing behave so badly?"
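The "naive" branch-and-bound mentioned above can be sketched as follows. This is an illustrative reconstruction, not the Mathematica code from the earlier reply: it treats the problem in its "fill the disc" form, choosing a subset of sizes whose total is as large as possible without exceeding a capacity, pruning any branch whose remaining items cannot beat the incumbent.

```python
# Hypothetical sketch (not the original Mathematica code) of a naive
# branch-and-bound for the subset-sum-style problem discussed here:
# pick a subset of file sizes maximizing the total without exceeding
# the capacity.

def branch_and_bound(sizes, capacity):
    """Return (best_total, chosen_indices)."""
    # Decide large items first; this tends to tighten the bound early.
    order = sorted(range(len(sizes)), key=lambda i: -sizes[i])
    # suffix[k] = total size of all items from position k onward.
    suffix = [0] * (len(sizes) + 1)
    for k in range(len(sizes) - 1, -1, -1):
        suffix[k] = suffix[k + 1] + sizes[order[k]]

    best = [0, []]  # incumbent: (best total, indices achieving it)

    def walk(k, total, chosen):
        if total > capacity:
            return  # infeasible branch
        if total > best[0]:
            best[0], best[1] = total, chosen[:]
        if k == len(order) or total + suffix[k] <= best[0]:
            return  # bound: this branch cannot beat the incumbent
        walk(k + 1, total + sizes[order[k]], chosen + [order[k]])  # take item
        walk(k + 1, total, chosen)                                 # skip item

    walk(0, 0, [])
    return best[0], sorted(best[1])
```

The bound `total + suffix[k] <= best[0]` is the whole trick: the sum of everything not yet decided is an upper estimate on what the branch can still gain, and branches that cannot improve on the incumbent are cut. The preprocessing discussed at the URL above would first transform the problem (e.g. via lattice reduction) before any such search; the observation in this thread is that, for this particular instance, the plain search above wins.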
> Furthermore, one additional thing occurred to me, for
> this particular problem at least: since the sector size on
> standard CDs and DVDs is almost exclusively 2048 bytes nowadays,
> it would be safe to add a "divide by 2048, then round up" step;
> perhaps it would help make the optimisation quicker
> (smaller numbers to handle, add, compare, etc.). Just an idea to test.
>
> Best regards
> Peter Sisak
I did something like this in my third reply, though using less
convenient divisors of 10^4 and 10^6. But yes, one can get the problem
under discussion into a tractable size range by using this approach, and
furthermore it readily allows for a bit of "safety" padding, which is
probably useful from a practical perspective.
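A minimal sketch of that reduction, with illustrative names (the divisor 2048 comes from the sector size mentioned above; the padding parameter is my addition for the "safety" idea):

```python
# Express each file size in 2048-byte sectors, rounding up just as the
# filesystem itself would, with optional per-file safety padding.
# Illustrative sketch; names are not from the original thread.

SECTOR = 2048

def to_sectors(size_bytes, pad_sectors=0):
    """Ceiling division by the sector size, plus optional padding."""
    return -(-size_bytes // SECTOR) + pad_sectors

sizes_bytes = [10_000_123, 4_096, 2_047]
sizes_in_sectors = [to_sectors(s) for s in sizes_bytes]
```

Rounding up means the reduced problem never overestimates free space, so any packing found with sector counts is also feasible with the true byte sizes; the numbers involved shrink by roughly three orders of magnitude, which is what brings the search into a tractable range.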
Offhand I do not know how well this would compete with the various
heuristics seen in other replies. Also this does not mean that related
problems would come into a tractable range using this idea. But the
example you showed would at least behave better, and I suspect many
cases of the problem you have at hand would likewise be tractable.
Daniel Lichtblau
Wolfram Research