
Submodular Optimization with Submodular Cover and Submodular Knapsack Constraints.

NIPS, 2013: 2436–2444


Abstract

We investigate two new optimization problems -- minimizing a submodular function subject to a submodular lower bound constraint (submodular cover) and maximizing a submodular function subject to a submodular upper bound constraint (submodular knapsack). We are motivated by a number of real-world applications in machine learning including ...

Introduction
  • A set function f : 2^V → R is said to be submodular [4] if for all subsets S, T ⊆ V, it holds that f(S) + f(T) ≥ f(S ∪ T) + f(S ∩ T) (a small illustrative check of this inequality on a toy coverage function is sketched after this list).
  • The surrogate functions for g in Algorithm 1 can be the ellipsoidal approximations above, and the multiplicative bounds transform into approximation guarantees for these problems.
  • When f and g are polymatroid functions, the authors can provide bounded approximation guarantees for both problems.
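As referenced in the first bullet above, the following is a minimal sketch that checks the submodularity inequality f(S) + f(T) ≥ f(S ∪ T) + f(S ∩ T) on a toy coverage function. The ground set, the covers map, and the function names are made-up illustrations, not data or code from the paper.

    # Illustrative check of f(S) + f(T) >= f(S | T) + f(S & T)
    # on a toy coverage function (hypothetical example, not from the paper).
    from itertools import chain, combinations

    # Each element of the ground set V "covers" a small set of items.
    covers = {"a": {1, 2}, "b": {2, 3}, "c": {3, 4}, "d": {4, 5}}
    V = set(covers)

    def coverage(S):
        """f(S) = number of items covered by S; monotone and submodular."""
        return len(set().union(*(covers[v] for v in S))) if S else 0

    def all_subsets(ground):
        xs = list(ground)
        return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

    # Exhaustive verification over all pairs of subsets of this small ground set.
    for S in map(set, all_subsets(V)):
        for T in map(set, all_subsets(V)):
            assert coverage(S) + coverage(T) >= coverage(S | T) + coverage(S & T)
    print("submodularity inequality holds for all pairs of subsets")
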
Highlights
  • A set function f : 2^V → R is said to be submodular [4] if for all subsets S, T ⊆ V, it holds that f(S) + f(T) ≥ f(S ∪ T) + f(S ∩ T)
  • We study a new class of discrete optimization problems that have the following form: Problem 1 (SCSC): min{f(X) | g(X) ≥ c}, and Problem 2 (SCSK): max{g(X) | f(X) ≤ b}, where f and g are monotone non-decreasing submodular functions that, w.l.o.g., are normalized (f(∅) = g(∅) = 0), and where b and c refer to the budget and cover parameters, respectively
  • We provide a framework of combinatorial algorithms based on optimizing, sometimes iteratively, subproblems that are easy to solve
  • This problem arises naturally in a number of applications related to active/online learning [7] and summarization [21, 22]. It was first investigated by Wolsey [29], who showed that a simple greedy algorithm achieves bounded approximation guarantees. We show that this greedy algorithm can naturally be viewed in the framework of our Algorithm 1 by choosing appropriate surrogate functions f_t and g_t (a hedged sketch of the cost-benefit greedy appears after this list)
  • Iterated Submodular Set Cover (ISSC): We investigate an algorithm for the general SCSC problem when both f and g are submodular
  • We provide a number of iterative algorithms which are very practical and scalable, and algorithms like Ellipsoidal Approximation based Submodular Set Cover (EASSC) and Ellipsoidal Approximation based Submodular Cost Knapsack (EASK) which, though more computationally intensive, obtain tight approximation bounds
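As flagged in the Wolsey bullet above, here is a hedged sketch of the cost-benefit greedy for the set-cover special case min{w(X) | g(X) ≥ c}, where w is a modular (additive) cost and g is monotone submodular. It illustrates the kind of greedy step the guarantee refers to, but the function name, the truncation of g at c, and the stopping rule are illustrative choices, not the authors' implementation of Algorithm 1.

    def greedy_submodular_set_cover(V, g, w, c):
        """Greedy sketch for min{ w(X) : g(X) >= c } with a modular cost w
        (dict of per-element costs) and a monotone submodular g.
        Repeatedly add the element with the best marginal-coverage-per-cost
        ratio until the cover constraint is met (illustrative sketch only)."""
        X, remaining = set(), set(V)
        while g(X) < c and remaining:
            best, best_ratio = None, 0.0
            for v in remaining:
                gain = min(g(X | {v}), c) - g(X)   # truncate g at the target c
                if w[v] > 0 and gain / w[v] > best_ratio:
                    best, best_ratio = v, gain / w[v]
            if best is None:   # no element improves coverage: instance is infeasible
                break
            X.add(best)
            remaining.remove(best)
        return X

    # Example usage, reusing the toy coverage function from the previous sketch:
    # costs = {v: 1.0 for v in V}
    # print(greedy_submodular_set_cover(V, coverage, costs, c=4))
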
Results
  • The authors consider several algorithms for Problems 1 and 2, which can all be characterized within the framework of Algorithm 1, using surrogate functions in the form of upper/lower bounds or approximations.
  • This problem was first investigated by Wolsey [29], wherein he showed that a simple greedy algorithm achieves bounded approximation guarantees.
  • The idea here is to iteratively solve the submodular set cover problem, which can be done by replacing f with a modular upper bound at every iteration.
  • The authors recover the approximation guarantee of the submodular set cover problem.
  • When f is modular, SCSK turns into the SK problem, for which the greedy algorithm with partial enumeration provides a 1 − 1/e approximation [28].
  • Choosing the surrogate functions f_t as f and g_t as h_π (with π defined in eqn. (5)) in Algorithm 1, with appropriate initialization, obtains a guarantee of 1 − 1/e for SK.
  • Iterated Submodular Cost Knapsack (ISK): Here, the authors choose f_t(X) as a modular upper bound of f, tight at X_t, and let g_t = g (a hedged sketch of this iteration appears after this list).
  • Ellipsoidal Approximation based Submodular Cost Knapsack (EASK): Choosing the Ellipsoidal Approximation f^ea of f as a surrogate function, the authors obtain a simpler problem.
  • The authors can directly use the ellipsoidal approximation √(w_f(X)) of f and solve the surrogate problem: max{g(X) : w_f(X) ≤ b^2}.
  • This surrogate problem is a submodular cost knapsack problem, which the authors can solve using the greedy algorithm.
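The ISK bullet above refers to iteratively replacing f with a modular upper bound that is tight at the current solution X_t and re-solving a submodular cost knapsack subproblem. Below is a hedged sketch of that iteration. The particular modular upper bound is one common choice from the authors' semidifferential-based framework [13], written here as per-element weights plus a constant offset; solve_sk is an assumed subroutine (for example, a cost-benefit greedy for the knapsack subproblem), and all names are illustrative rather than the paper's code.

    def modular_upper_bound(f, V, X):
        """One common modular upper bound of a submodular f, tight at X
        (a standard superdifferential-style bound; the paper may use a
        different tight bound). Returns per-element weights and a constant
        offset such that m(Y) = offset + sum(weights[j] for j in Y) satisfies
        m(Y) >= f(Y) for all Y and m(X) = f(X), assuming f is normalized
        (f(empty set) = 0)."""
        fX = f(X)
        weights, offset = {}, fX
        for j in V:
            if j in X:
                gain = fX - f(X - {j})   # marginal value f(j | X \ {j})
                weights[j] = gain
                offset -= gain
            else:
                weights[j] = f({j})      # marginal value f(j | empty set)
        return weights, offset

    def iterated_submodular_cost_knapsack(f, g, V, b, solve_sk, max_iters=20):
        """Sketch of the ISK iteration for SCSK, max{ g(X) : f(X) <= b }:
        replace f by a modular upper bound tight at the current solution and
        solve the resulting submodular cost knapsack subproblem with
        solve_sk(g, weights, budget, V)."""
        X = set()
        for _ in range(max_iters):
            w, offset = modular_upper_bound(f, V, X)
            # Subproblem: max{ g(Y) : offset + sum(w[j] for j in Y) <= b }.
            X_new = solve_sk(g, w, b - offset, V)
            if X_new == X:   # fixed point reached
                break
            X = X_new
        return X

Because the modular surrogate upper-bounds f, every iterate satisfies f(X) ≤ b, so feasibility for the original constraint is preserved across iterations.
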
Conclusion
  • These extensions include multiple covering and knapsack constraints, i.e., min{f(X) | g_i(X) ≥ c_i, i = 1, 2, ..., k} and max{g(X) | f_i(X) ≤ b_i, i = 1, 2, ..., k}, and robust optimization problems like max{min_i g_i(X) | f(X) ≤ b}, where the functions f, g, f_i, and g_i are submodular.
  • For any κ > 0 and any ε > 0, there exist submodular functions with curvature κ such that no polynomial-time algorithm for Problems 1 and 2 achieves a bi-criterion factor better than σ = ρ = n^{1/2−ε} / (1 + (n^{1/2−ε} − 1)(1 − κ)) (a cleaned-up LaTeX rendering of this factor is given after this list)
  • The authors provide a number of iterative algorithms which are very practical and scalable, and algorithms like EASSC and EASK which, though more computationally intensive, obtain tight approximation bounds.
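For readability, here is the curvature-dependent hardness factor from the bullet above, rewritten as a LaTeX math block, together with the standard definition of total curvature that such statements rely on; this reconstruction is my reading of the garbled expression and should be checked against the extended arXiv version [11].

    % Bi-criterion hardness factor (reconstructed; \epsilon > 0 is the usual slack parameter).
    \[
      \sigma \;=\; \rho \;=\; \frac{n^{1/2-\epsilon}}{1 + \bigl(n^{1/2-\epsilon} - 1\bigr)\,(1-\kappa)},
    \]
    % where \kappa denotes the total curvature of the submodular function, with the standard definition
    \[
      \kappa_f \;=\; 1 - \min_{j \in V} \frac{f(j \mid V \setminus \{j\})}{f(\{j\})}.
    \]
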
Funding
  • This material is based upon work supported by the National Science Foundation under Grant No. IIS-1162606, a Google award, a Microsoft award, and the Intel Science and Technology Center for Pervasive Computing.
References
  • A. Atamturk and V. Narayanan. The submodular knapsack polytope. Discrete Optimization, 2009.
  • M. Conforti and G. Cornuejols. Submodular set functions, matroids and the greedy algorithm: tight worst-case bounds and some generalizations of the Rado-Edmonds theorem. Discrete Applied Mathematics, 7(3):251–274, 1984.
  • U. Feige. A threshold of ln n for approximating set cover. Journal of the ACM (JACM), 1998.
  • S. Fujishige. Submodular Functions and Optimization, volume 58. Elsevier Science, 2005.
  • J. Garofolo, L. F. Lamel, W. Fisher, J. Fiscus, D. Pallett, and N. Dahlgren. TIMIT, acoustic-phonetic continuous speech corpus. In DARPA, 1993.
  • M. Goemans, N. Harvey, S. Iwata, and V. Mirrokni. Approximating submodular functions everywhere. In SODA, pages 535–544, 2009.
  • A. Guillory and J. Bilmes. Interactive submodular set cover. In ICML, 2010.
  • A. Guillory and J. Bilmes. Simultaneous learning and covering with adversarial noise. In ICML, 2011.
  • R. Iyer and J. Bilmes. Algorithms for approximate minimization of the difference between submodular functions, with applications. In UAI, 2012.
  • R. Iyer and J. Bilmes. The submodular Bregman and Lovász-Bregman divergences with applications. In NIPS, 2012.
  • R. Iyer and J. Bilmes. Submodular optimization with submodular cover and submodular knapsack constraints: Extended arXiv version, 2013.
  • R. Iyer, S. Jegelka, and J. Bilmes. Curvature and optimal algorithms for learning and minimizing submodular functions. In NIPS, 2013.
  • R. Iyer, S. Jegelka, and J. Bilmes. Fast semidifferential-based submodular function optimization. In ICML, 2013.
  • S. Jegelka and J. A. Bilmes. Submodularity beyond submodular energies: coupling edges in graph cuts. In CVPR, 2011.
  • Y. Kawahara and T. Washio. Prismatic algorithm for discrete DC programming problems. In NIPS, 2011.
  • H. Kellerer, U. Pferschy, and D. Pisinger. Knapsack Problems. Springer Verlag, 2004.
  • A. Krause and C. Guestrin. A note on the budgeted maximization of submodular functions. Technical Report CMU-CALD-05-103, Carnegie Mellon University, 2005.
  • A. Krause, B. McMahan, C. Guestrin, and A. Gupta. Robust submodular observation selection. Journal of Machine Learning Research (JMLR), 9:2761–2801, 2008.
  • A. Krause, A. Singh, and C. Guestrin. Near-optimal sensor placements in Gaussian processes: Theory, efficient algorithms and empirical studies. JMLR, 9:235–284, 2008.
  • H. Lin and J. Bilmes. How to select a good training-data subset for transcription: Submodular active selection for sequences. In Interspeech, 2009.
  • H. Lin and J. Bilmes. Multi-document summarization via budgeted maximization of submodular functions. In NAACL, 2010.
  • H. Lin and J. Bilmes. A class of submodular functions for document summarization. In The 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies (ACL/HLT 2011), Portland, OR, June 2011.
  • H. Lin and J. Bilmes. Optimal selection of limited vocabulary speech corpora. In Interspeech, 2011.
  • R. C. Moore and W. Lewis. Intelligent selection of language model training data. In Proceedings of the ACL 2010 Conference Short Papers, pages 220–224. Association for Computational Linguistics, 2010.
  • M. Narasimhan and J. Bilmes. A submodular-supermodular procedure with applications to discriminative structure learning. In UAI, 2005.
  • E. Nikolova. Approximation algorithms for offline risk-averse combinatorial optimization, 2010.
  • J. Rousu and J. Shawe-Taylor. Efficient computation of gapped substring kernels on large alphabets. Journal of Machine Learning Research, 6(2):1323, 2006.
  • M. Sviridenko. A note on maximizing a submodular set function subject to a knapsack constraint. Operations Research Letters, 32(1):41–43, 2004.
  • L. A. Wolsey. An analysis of the greedy algorithm for the submodular set covering problem. Combinatorica, 2(4):385–393, 1982.