Randomized rounding yields a Lagrangian-relaxation algorithm for Multicommodity Flow.
Applying the method of conditional probabilities to a sample-and-increment scheme for Multicommodity Flow yields a Lagrangian-relaxation algorithm with an approximation ratio comparable to that of the algorithm of Plotkin, Shmoys, and Tardos [2]. The existence proof uses a standard Chernoff bound, which leads to exponential penalties (“multiplicative weights”) in the resulting algorithm.
Rounding scheme for Multicommodity Flow
input: instance {\cal I}=(\mbox{graph } G, \mbox{set of paths } P, \mbox{capacity constraints } (C,c)) | |
output: integer flow \tilde f for {\cal I} | |
0. | Compute a maximum-size fractional multicommodity flow f^*. |
1. | Initialize \tilde{f_ p}=0 for every path p\in P. |
2. | Repeat until some edge is saturated (\exists e\in C.~ \tilde f(e) = c_ e): |
3. | Choose path p\in P randomly from the distribution f^*/|f^*|. |
4. | Increase \tilde{f_ p} by 1. |
5. | Return \tilde f. |
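In code, the rounding scheme above might look like the following sketch. The representation is illustrative (not from the source): paths are explicit tuples of edge names, `f_star` holds the fractional path flows f^*_ p, and `cap` maps each capacity-constrained edge to c_ e.

```python
import random

def round_flow(paths, f_star, cap):
    """Randomized rounding sketch for multicommodity flow.

    paths: list of paths, each a tuple of edge names
    f_star: fractional flow f*_p for each path (same order as paths)
    cap: dict edge -> capacity c_e for the capacity-constrained edges
    Returns a dict: path index -> integer flow on that path.
    """
    f_int = {i: 0 for i in range(len(paths))}   # step 1: tilde f_p = 0
    load = {e: 0 for e in cap}                  # tilde f(e) per constrained edge
    # step 2: repeat until some edge is saturated (load == capacity)
    while all(load[e] < cap[e] for e in cap):
        # step 3: sample a path with probability f*_p / |f*|
        i = random.choices(range(len(paths)), weights=f_star, k=1)[0]
        f_int[i] += 1                           # step 4: raise tilde f_p by 1
        for e in paths[i]:
            if e in cap:
                load[e] += 1
    return f_int                                # step 5
```

Note that the returned flow is feasible by construction: loads increase by 1 per step and the loop stops as soon as any edge reaches its capacity, so no capacity is ever exceeded.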
If the minimum edge capacity is sufficiently large, then the rounding scheme works with positive probability:
Let m be the number of edges with capacity constraints. For any \varepsilon \in (0,1) such that \min _ e c_ e \ge 3(1+\varepsilon )\ln (m)/\varepsilon ^2, with positive probability, the flow \tilde f returned by the rounding scheme is feasible and has size more than T=\lfloor |f^*|/(1+\varepsilon )\rfloor .
The proof idea is that f^* sends total flow at most c_ e on the paths crossing any given edge e, so, in any given iteration the probability that the flow \tilde f(e) along e increases (by 1) is at most c_ e/|f^*|. Applying Chernoff, a calculation shows that event “\tilde f(e) \ge c_ e at the end of iteration T” has probability less than 1/m. Hence, the expected number of edges saturated at the end of iteration T is less than 1.
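As a numeric sanity check (a sketch, not part of the proof): one standard form of the Chernoff bound, Pr[X \ge (1+\varepsilon )\mu ] \le \exp (-\varepsilon ^2\mu /3) for \varepsilon \in (0,1], applied with \mu = c_ e/(1+\varepsilon ), hits the 1/m threshold exactly at the stated capacity lower bound c_ e = 3(1+\varepsilon )\ln (m)/\varepsilon ^2.

```python
import math

def chernoff_bound(c_e, eps):
    # exp(-eps^2 * mu / 3) with mu = c_e / (1 + eps)
    return math.exp(-eps**2 * c_e / (3 * (1 + eps)))

m, eps = 1000, 0.5
c_min = 3 * (1 + eps) * math.log(m) / eps**2
# At c_e = c_min the bound equals 1/m; larger capacities only improve it.
for c_e in [c_min, 2 * c_min, 10 * c_min]:
    assert chernoff_bound(c_e, eps) <= 1 / m + 1e-12
```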
Next, applying the method of conditional probabilities to this existence proof yields the algorithm below.
For the derivation, we need a pessimistic estimator \phi _ t for the expected number of edges saturated by iteration T, conditioned on the current flow \tilde f at the end of a given iteration t \le T. The proof of the Chernoff bound gives a pessimistic estimator for each edge. Summing over the edges gives the following pessimistic estimator (the middle factor bounds the expected per-step growth, using that each step raises \tilde f(e) with probability at most c_ e/|f^*|): \phi _ t = \sum _{e\in C} (1+\varepsilon )^{\tilde f(e)}\, (1+\varepsilon c_ e/|f^*|)^{T-t} / (1+\varepsilon )^{c_ e}.
Multicommodity Flow via Chernoff
input: instance {\cal I}=(G,P,C,c), accuracy \varepsilon \in (0,1) | |
output: integer flow \tilde f for {\cal I} | |
1. | Initialize \tilde{f_ p}=0 for every path p\in P. Set \tilde c= 3(1+\varepsilon )\ln (m)/\varepsilon ^2. |
2. | Repeat until some edge is saturated (\exists e\in C.~ \tilde f(e) = c_ e): |
3. | Choose a path p\in P minimizing \sum _{e\in p} w_ e/c_ e, where w_ e = (1+\varepsilon )^{\tilde f(e) \tilde c/c_ e}. (To choose p, compute a shortest path in P w.r.t. edge weights w_ e/c_ e.) |
4. | Increase \tilde{f_ p} by 1. |
5. | Return \tilde f. |
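The algorithm above can be sketched as follows. Since the sketch takes P as an explicit list of paths, step 3 scans P for the minimizer directly instead of running a shortest-path computation; the function name and representation are illustrative, not from the source.

```python
import math

def mcf_via_chernoff(paths, cap, eps):
    """Derandomized rounding for multicommodity flow (sketch).

    paths: list of paths, each a tuple of edge names
    cap: dict edge -> capacity c_e for the capacity-constrained edges
    eps: accuracy parameter in (0, 1)
    Returns a dict: path index -> integer flow on that path.
    """
    m = len(cap)                                       # edges with constraints
    c_tilde = 3 * (1 + eps) * math.log(m) / eps**2     # step 1
    f_int = {i: 0 for i in range(len(paths))}
    load = {e: 0 for e in cap}
    while all(load[e] < cap[e] for e in cap):          # step 2
        # step 3: choose p minimizing sum_{e in p} w_e / c_e,
        # where w_e = (1+eps)^(load(e) * c_tilde / c_e)
        def cost(p):
            return sum((1 + eps) ** (load[e] * c_tilde / cap[e]) / cap[e]
                       for e in p if e in cap)
        i = min(range(len(paths)), key=lambda i: cost(paths[i]))
        f_int[i] += 1                                  # step 4
        for e in paths[i]:
            if e in cap:
                load[e] += 1
    return f_int                                       # step 5
```

As in the randomized scheme, feasibility is automatic: the loop stops the moment any edge reaches its capacity.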
Unfortunately, to minimize the pessimistic estimator \phi _ t, the algorithm would need to know |f^*|.
To fix this, normalize each capacity constraint “\tilde f(e) \le c_ e” into the equivalent “\tilde f(e) \tilde c/c_ e \le \tilde c”, where \tilde c = 3(1+\varepsilon )\ln (m)/\varepsilon ^2 is the lower bound on \min _ e c_ e. Apply Chernoff to the now-normalized constraints. The proof still goes through, but now gives pessimistic estimator \phi '_ t = (1+\varepsilon \tilde c/|f^*|)^{T-t} \sum _{e\in C} (1+\varepsilon )^{\tilde f(e)\tilde c/c_ e} / (1+\varepsilon )^{\tilde c}.
The factors in \phi '_ t that depend on f^* are independent of e (and \tilde f), so can be ignored. The algorithm makes each choice to minimize the pessimistic estimator (more precisely, an upper bound on it; see the verification of \phi '_ t below). Hence, it guarantees a successful outcome.
Let m=|C| be the number of edges with capacity constraints. For any \varepsilon \in (0,1) such that \min _ e c_ e \ge 3(1+\varepsilon )\ln (m)/\varepsilon ^2, given the graph G, the set of paths P, the capacity constraints (C,c), and \varepsilon , the algorithm above returns a feasible integer flow \tilde f whose size is at least |f^*| / (1+\varepsilon ), where f^* is any maximum-size feasible flow.
Algorithms with comparable performance guarantees were given (for example) by Plotkin, Shmoys, and Tardos [2] and by Grigoriadis and Khachiyan [1].
Discussion
Many technical questions remain. How do we find approximately optimal fractional solutions? How do we obtain a fast (strictly polynomial-time) algorithm? Can we generalize to other packing and covering problems?
Fractional solutions can be obtained simply by scaling the capacities (given any \varepsilon \in (0,1), scale so \min _ e c_ e \ge 3(1+\varepsilon )\ln (m)/\varepsilon ^2, then the algorithm will give a (1+\varepsilon )-approximate solution). Polynomial time can be ensured using Garg and Könemann’s non-uniform increments. And, yes, the approach generalizes to other problems. We address these questions in detail in other entries.
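For instance, the uniform scaling factor can be computed directly. This is a sketch; `scale_factor` is an illustrative helper, not from the source.

```python
import math

def scale_factor(cap, eps):
    """Smallest uniform multiplier making every scaled capacity meet the
    theorem's lower bound 3(1+eps)ln(m)/eps^2 (a sketch)."""
    m = len(cap)
    c_tilde = 3 * (1 + eps) * math.log(m) / eps**2
    return max(1.0, c_tilde / min(cap.values()))
```

Running the algorithm on the scaled instance and dividing the resulting flow by the factor then yields a (1+\varepsilon )-approximate fractional solution to the original instance.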
Questions for later
Can one formalize the notion of shifting slack into a pessimistic estimator to make it maximally oblivious?