
Submodular Optimization with Submodular Cover and Submodular Knapsack Constraints.

NIPS, (2013): 2436-2444


Abstract

We investigate two new optimization problems -- minimizing a submodular function subject to a submodular lower bound constraint (submodular cover) and maximizing a submodular function subject to a submodular upper bound constraint (submodular knapsack). We are motivated by a number of real-world applications in machine learning including…

Introduction
• A set function f : 2^V → ℝ is said to be submodular if for all subsets S, T ⊆ V, it holds that f(S) + f(T) ≥ f(S ∪ T) + f(S ∩ T).
• The surrogate functions for g in Algorithm 1 can be the ellipsoidal approximations above, and the multiplicative bounds transform into approximation guarantees for these problems.
• When f and g are polymatroid functions, the authors can provide bounded approximation guarantees for both problems, as shown.
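The submodularity inequality above can be checked by brute force on a small ground set. A minimal Python sketch, where the coverage function and all names are illustrative and not from the paper:

```python
from itertools import chain, combinations

def powerset(V):
    """All subsets of V, as frozensets."""
    return [frozenset(c) for c in
            chain.from_iterable(combinations(V, r) for r in range(len(V) + 1))]

def is_submodular(f, V):
    """Brute-force check of f(S) + f(T) >= f(S | T) + f(S & T) for all S, T."""
    subsets = powerset(V)
    return all(f(S) + f(T) >= f(S | T) + f(S & T)
               for S in subsets for T in subsets)

# A coverage function f(S) = |union of the sets indexed by S| is submodular.
cover = {1: {'a', 'b'}, 2: {'b', 'c'}, 3: {'c', 'd'}}
f = lambda S: len(set().union(*[cover[i] for i in S]))
print(is_submodular(f, {1, 2, 3}))                       # True
print(is_submodular(lambda S: len(S) ** 2, {1, 2, 3}))   # False (supermodular)
```

This is exponential in |V| and only meant to make the definition concrete; the paper's algorithms never enumerate subsets.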
Highlights
• A set function f : 2^V → ℝ is said to be submodular if for all subsets S, T ⊆ V, it holds that f(S) + f(T) ≥ f(S ∪ T) + f(S ∩ T)
• We study a new class of discrete optimization problems that have the following form: Problem 1 (SCSC): min{f(X) | g(X) ≥ c}, and Problem 2 (SCSK): max{g(X) | f(X) ≤ b}, where f and g are monotone non-decreasing submodular functions that, w.l.o.g., are normalized (f(∅) = g(∅) = 0), and where b and c refer to budget and cover parameters respectively
• We provide a framework of combinatorial algorithms based on optimizing, sometimes iteratively, subproblems that are easy to solve
• This problem occurs naturally in a number of problems related to active/online learning and summarization [21, 22]. This problem was first investigated by Wolsey, wherein he showed that a simple greedy algorithm achieves bounded approximation guarantees. We show that this greedy algorithm can naturally be viewed in the framework of our Algorithm 1 by choosing appropriate surrogate functions ft and gt
• Iterated Submodular Set Cover (ISSC): We investigate an algorithm for the general SCSC problem when both f and g are submodular
• We provide a number of iterative algorithms which are very practical and scalable, and algorithms like Ellipsoidal Approximation based Submodular Set Cover (EASSC) and Ellipsoidal Approximation based Submodular Cost Knapsack (EASK), which, though more computationally intensive, obtain tight approximation bounds
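Wolsey's greedy algorithm for the special case where the cost f is modular (the submodular set cover problem mentioned above) is short enough to sketch. This is an illustrative implementation under our own naming, not the paper's code, and the coverage instance is made up:

```python
def greedy_set_cover(cost, g, V, c):
    """Greedy for submodular set cover: minimize a modular (per-element) cost
    subject to g(X) >= c, where g is monotone submodular.  At each step, take
    the element with the best marginal-gain-to-cost ratio."""
    X = set()
    while g(X) < c:
        candidates = [v for v in V - X if g(X | {v}) > g(X)]
        if not candidates:
            break  # cover level c is unreachable
        best = max(candidates, key=lambda v: (g(X | {v}) - g(X)) / cost[v])
        X.add(best)
    return X

# Hypothetical instance: cover all four items, unit cost per element.
cover = {1: {'a', 'b'}, 2: {'b', 'c'}, 3: {'c', 'd'}}
g = lambda S: len(set().union(*[cover[i] for i in S]))
X = greedy_set_cover({1: 1, 2: 1, 3: 1}, g, {1, 2, 3}, c=4)
print(g(X) >= 4)  # True
```

The paper's more general algorithms replace f and/or g by surrogate functions and repeatedly reduce to subproblems of this greedy-solvable form.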
Results
• The authors consider several algorithms for Problems 1 and 2, which can all be characterized by the framework of Algorithm 1, using the surrogate functions of the form of upper/lower bounds or approximations.
• This problem was first investigated by Wolsey, wherein he showed that a simple greedy algorithm achieves bounded approximation guarantees.
• The idea here is to iteratively solve the submodular set cover problem which can be done by replacing f by a modular upper bound at every iteration.
• The authors recover the approximation guarantee of the submodular set cover problem.
• SCSK turns into the SK problem, for which the greedy algorithm with partial enumeration provides a 1 − e⁻¹ approximation.
• Choosing the surrogate functions ft as f and gt as hπ (with π defined in eqn (5)) in Algorithm 1, with appropriate initialization, obtains a guarantee of 1 − 1/e for SK.
• Iterated Submodular Cost Knapsack (ISK): Here, the authors choose ft(X) as a modular upper bound of f, tight at Xt. Let gt = g.
• Ellipsoidal Approximation based Submodular Cost Knapsack (EASK): Choosing the Ellipsoidal Approximation f ea of f as a surrogate function, the authors obtain a simpler problem:
• The authors can directly use the ellipsoidal approximation of f, which has the form √(wf(X)) for a modular weight function wf, and solve the surrogate problem: max{g(X) : wf(X) ≤ b²}.
• This surrogate problem is a submodular cost knapsack problem, which the authors can solve using the greedy algorithm.
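The modular upper bounds used in these iterated algorithms can be written down explicitly. A sketch of one standard tight bound, m_X(Y) = f(X) − Σ_{j∈X∖Y} f(j | X∖{j}) + Σ_{j∈Y∖X} f(j | ∅), with an illustrative coverage function; the function and variable names are ours, not the paper's:

```python
def modular_upper_bound(f, X, V):
    """A modular function m_X with m_X(Y) >= f(Y) for all Y and m_X(X) = f(X),
    valid for submodular f.  Returned as per-element weights plus a constant,
    so that m_X(Y) = const + sum_{j in Y} w[j]."""
    w = {j: (f(X) - f(X - {j})) if j in X else (f({j}) - f(set()))
         for j in V}
    const = f(X) - sum(w[j] for j in X)
    return w, const

# Illustrative submodular coverage function.
cover = {1: {'a', 'b'}, 2: {'b', 'c'}, 3: {'c', 'd'}}
f = lambda S: len(set().union(*[cover[i] for i in S]))

w, const = modular_upper_bound(f, X={1, 3}, V={1, 2, 3})
m = lambda Y: const + sum(w[j] for j in Y)
print(m({1, 3}) == f({1, 3}))        # True: tight at X
print(m({1, 2, 3}) >= f({1, 2, 3}))  # True: upper-bounds f
```

Replacing f by m_X at each iteration turns the remaining subproblem into one with a modular cost, which the greedy algorithms above can handle.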
Conclusion
• These include multiple covering and knapsack constraints – i.e., min{f (X)|gi(X) ≥ ci, i = 1, 2, · · · k} and max{g(X)|fi(X) ≤ bi, i = 1, 2, · · · k}, and robust optimization problems like max{mini gi(X)|f (X) ≤ b}, where the functions f, g, fi’s and gi’s are submodular.
• For any κ > 0, there exist submodular functions with curvature κ such that no polynomial-time algorithm for Problems 1 and 2 achieves a bi-criterion factor better than n^(1/2−ε) / (1 + (n^(1/2−ε) − 1)(1 − κ))
• The authors provide a number of iterative algorithms which are very practical and scalable, and algorithms like EASSC and EASK, which, though more computationally intensive, obtain tight approximation bounds.
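Assuming the standard form of this curvature-dependent lower bound, n^(1/2−ε) / (1 + (n^(1/2−ε) − 1)(1 − κ)), its behavior at the two extremes can be checked numerically; the function name here is ours:

```python
def hardness_factor(n, kappa, eps=0.0):
    """Curvature-dependent bi-criterion lower bound (assumed form):
    n^(1/2 - eps) / (1 + (n^(1/2 - eps) - 1) * (1 - kappa))."""
    r = n ** (0.5 - eps)
    return r / (1 + (r - 1) * (1 - kappa))

# Fully curved (kappa = 1): the factor is n^(1/2 - eps), i.e. polynomially hard.
print(hardness_factor(10_000, kappa=1.0))  # 100.0
# Modular-like (kappa -> 0): the factor collapses to 1, i.e. the problem is easy.
print(hardness_factor(10_000, kappa=0.0))  # 1.0
```

This matches the intuition that curvature interpolates between the easy modular case and the hard fully submodular case.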
Funding
• This material is based upon work supported by the National Science Foundation under Grant No. IIS-1162606, a Google award, a Microsoft award, and the Intel Science and Technology Center for Pervasive Computing.