In:
Random Structures & Algorithms, Wiley, Vol. 24, No. 3 (2004-05), pp. 315-380
Abstract:
We consider the problem of partitioning n integers into two subsets of given cardinalities such that the discrepancy, the absolute value of the difference of their sums, is minimized. The integers are i.i.d. random variables chosen uniformly from the set {1, …, M}. We study how the typical behavior of the optimal partition depends on n, M, and the bias s, the difference between the cardinalities of the two subsets in the partition. In particular, we rigorously establish this typical behavior as a function of the two parameters κ := n⁻¹ log₂ M and b := |s|/n by proving the existence of three distinct “phases” in the κb-plane, characterized by the value of the discrepancy and the number of optimal solutions: a “perfect phase” with exponentially many optimal solutions with discrepancy 0 or 1; a “hard phase” with minimal discrepancy of order M e^(−Θ(n)); and a “sorted phase” with a unique optimal partition with discrepancy of order Mn, obtained by putting the (s + n)/2 smallest integers in one subset. Our phase diagram covers all but a relatively small region in the κb-plane. We also show that the three phases can be alternatively characterized by the number of basis solutions of the associated linear programming problem, and by the fraction of these basis solutions whose ±1-valued components form optimal integer partitions of the subproblem with the corresponding weights. We show in particular that this fraction is one in the sorted phase, exponentially small in both the perfect and hard phases, and strictly exponentially smaller in the hard phase than in the perfect phase. Open problems are discussed, and numerical experiments are presented. © 2004 Wiley Periodicals, Inc. Random Struct. Alg., 2004
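The abstract's “sorted phase” construction is concrete enough to sketch: place the (s + n)/2 smallest of the n integers in one subset, the rest in the other, and compute the discrepancy as the absolute difference of the subset sums. The following minimal Python sketch is illustrative only (the function name and interface are not from the paper); it assumes n + s is even so the cardinalities are integers.

```python
def sorted_partition_discrepancy(weights, s):
    """Build the 'sorted' partition of the given integers for bias s:
    the (n + s)/2 smallest weights go into subset A, the rest into
    subset B. Returns the discrepancy |sum(A) - sum(B)|.
    (Illustrative sketch; not code from the paper.)
    """
    n = len(weights)
    assert (n + s) % 2 == 0, "n + s must be even for integer cardinalities"
    w = sorted(weights)
    k = (n + s) // 2          # cardinality of the subset of smallest weights
    subset_a, subset_b = w[:k], w[k:]
    return abs(sum(subset_a) - sum(subset_b))
```

For example, with weights {1, 2, 3, 4} and bias s = 2, the three smallest integers {1, 2, 3} form one subset and {4} the other, giving discrepancy |6 − 4| = 2. For i.i.d. uniform weights on {1, …, M} this partition is optimal only in the sorted phase of the paper's diagram; in the perfect and hard phases far better partitions exist.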
Type of Medium:
Online Resource
ISSN:
1042-9832, 1098-2418
Language:
English
Publisher:
Wiley
Publication Date:
2004