
Approaching Optimality for Solving SDD Linear Systems

Proceedings of the 51st Annual IEEE Symposium on Foundations of Computer Science (FOCS 2010), pages 235–244

Cited by: 280 | Views: 197 | Indexed in: EI, WOS

Abstract

We present an algorithm that, on input of an n-vertex m-edge weighted graph G and a value k, produces an incremental sparsifier Ĝ with n − 1 + m/k edges, such that the condition number of G with Ĝ is bounded above by Õ(k log² n), with probability 1 − p. The algorithm runs in time Õ((m log n + n log² n) log(1/p)). As a result, we obtain an algorithm …


Introduction
  • Fast algorithms for solving linear systems, and for the related problem of computing a few fundamental eigenvectors, are possibly among the most important problems in algorithm design.
  • Symmetric diagonally dominant systems are linear-time reducible to linear systems whose matrix is the Laplacian of a weighted graph, via a construction known as the double cover that only doubles the number of non-zero entries in the system [GMZ95, Gre96], as sketched below.
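A minimal sketch of this double-cover reduction, assuming dense numpy arrays for clarity (the paper and [Gre96] work with sparse representations; the function name, the example system, and the recovery step below are illustrative, not the authors' code):

```python
import numpy as np

def gremban_double_cover(A):
    """Reduce an SDD matrix A to a graph Laplacian of twice the size.

    If L_hat is the returned 2n x 2n Laplacian, then solving
    L_hat z = [b; -b] and taking x = (z[:n] - z[n:]) / 2 solves A x = b.
    Sketch only: dense arithmetic, no input validation.
    """
    off = A - np.diag(np.diag(A))
    A_neg = np.minimum(off, 0.0)              # negative off-diagonal entries
    A_pos = np.maximum(off, 0.0)              # positive off-diagonal entries
    row_abs = np.abs(off).sum(axis=1)         # sum_{j != i} |A_ij|
    D1 = np.diag((np.diag(A) + row_abs) / 2)
    D2 = np.diag((row_abs - np.diag(A)) / 2)  # entrywise <= 0 by diagonal dominance
    B = D1 + A_neg
    C = D2 - A_pos
    # Symmetric, zero row sums, non-positive off-diagonals: a Laplacian.
    return np.block([[B, C], [C, B]])

# Usage: solve a small SDD system through the reduction.
A = np.array([[4.0, -1.0, 2.0],
              [-1.0, 5.0, -1.0],
              [2.0, -1.0, 4.0]])
b = np.array([1.0, 0.0, -1.0])
L_hat = gremban_double_cover(A)
z, *_ = np.linalg.lstsq(L_hat, np.concatenate([b, -b]), rcond=None)
x = (z[:3] - z[3:]) / 2
assert np.allclose(A @ x, b)
```

Each off-diagonal entry of A contributes two entries to the constructed Laplacian, which is why the reduction only roughly doubles the number of non-zeros.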
Highlights
  • Fast algorithms for solving linear systems, and for the related problem of computing a few fundamental eigenvectors, are possibly among the most important problems in algorithm design
  • Substantial progress has been made in the case of symmetric and diagonally dominant (SDD) systems, where A_ii ≥ Σ_{j≠i} |A_ij| (these notions are written out after this list)
  • Spielman and Teng showed that symmetric and diagonally dominant systems can be solved in nearly-linear time [ST04, EEST05, ST06]
  • In Section 4 we present a high level description of our approach and discuss implications of our solver for the graph sparsification problem
  • The major new notion introduced by Spielman and Teng [ST04] in their nearly-linear time algorithm was that of a spectral sparsifier, i.e. a graph with a nearly-linear number of edges that α-approximates a given graph for a constant α
  • The only known nearly-linear time algorithm that produces a spectral sparsifier with O(n log n) edges is due to Spielman and Srivastava [SS08], and it is based on O(log n) calls to a symmetric and diagonally dominant linear system solver
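For reference, the spectral notions these highlights rely on can be written out as follows (a standard formulation; the paper's exact normalization of the condition number may differ slightly):

```latex
% SDD: the diagonal dominates the absolute off-diagonal row sums.
\[ A_{ii} \;\ge\; \sum_{j \ne i} |A_{ij}| \qquad \text{for all } i. \]

% H \alpha-approximates G (with Laplacians L_H, L_G) when
\[ x^{\top} L_H\, x \;\le\; x^{\top} L_G\, x \;\le\; \alpha\, x^{\top} L_H\, x
   \qquad \text{for all } x \in \mathbb{R}^{n}, \]
% equivalently L_H \preceq L_G \preceq \alpha L_H, so the relative
% condition number \kappa(L_G, L_H) is at most \alpha.

% A (k, h)-ultrasparsifier of G is a graph with n - 1 + n/k edges
% that h-approximates G.
```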
Results
  • Abraham, Bartal, and Neiman [ABN08] presented a nearly tight construction of low-stretch spanning trees, giving an O(m log n + n log² n) time algorithm that, on input a graph G, produces a spanning tree of total stretch O(m log n).
  • The major new notion introduced by Spielman and Teng [ST04] in their nearly-linear time algorithm was that of a spectral sparsifier, i.e. a graph with a nearly-linear number of edges that α-approximates a given graph for a constant α.
  • The spectral sparsifier is combined with the O(m log² n) total-stretch spanning trees of [EEST05] to produce a (k, O(k log^c n)) ultrasparsifier, i.e. a graph Ĝ with n − 1 + n/k edges which O(k log^c n)-approximates the given graph, for some c > 25.
  • Spielman and Srivastava [SS08] showed how to construct a much stronger spectral sparsifier with O(n log n) edges, by sampling edges with probabilities proportional to their effective resistances when the graph is viewed as an electrical network (a sketch of this sampling follows the list).
  • The only known nearly-linear time algorithm that produces a spectral sparsifier with O(n log n) edges is due to Spielman and Srivastava [SS08], and it is based on O(log n) calls to an SDD linear system solver.
  • It is interesting that this algebraic approach matches, up to log log n factors, the running time bound of the purely combinatorial algorithm of Benczur and Karger [BK96] for the computation of the cut-preserving sparsifier.
  • Sparsifying once with the Spielman and Srivastava algorithm and then applying the incremental sparsifier gives a (k, O(k log³ n)) ultrasparsifier, computed in O(m log³ n) randomized time.
  • In the special case where the input graph has O(n) edges, the incremental sparsifier is a (k, O(k log² n)) ultrasparsifier.
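A minimal dense sketch of the effective-resistance sampling of [SS08] referred to above, computing exact resistances from a pseudo-inverse for clarity; the nearly-linear time version instead approximates them with an SDD solver and random projections. Function and variable names here are illustrative:

```python
import numpy as np

def sample_by_effective_resistance(L, edges, weights, q, rng=None):
    """Return the Laplacian of a sparsifier built by sampling q edges
    with replacement, with probability proportional to w_e * R_eff(e),
    and reweighting each sample by 1 / (q * p_e).  Dense sketch only.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = L.shape[0]
    Lp = np.linalg.pinv(L)                      # exact resistances, O(n^3): for illustration
    p = np.empty(len(edges))
    for i, (u, v) in enumerate(edges):
        chi = np.zeros(n)
        chi[u], chi[v] = 1.0, -1.0
        p[i] = weights[i] * (chi @ Lp @ chi)    # w_e * effective resistance of (u, v)
    p /= p.sum()
    L_tilde = np.zeros_like(L)
    for i in rng.choice(len(edges), size=q, p=p):
        u, v = edges[i]
        w = weights[i] / (q * p[i])             # importance-sampling reweighting
        L_tilde[u, u] += w; L_tilde[v, v] += w
        L_tilde[u, v] -= w; L_tilde[v, u] -= w
    return L_tilde
```

Taking the number of samples q on the order of n log n (up to the constant and accuracy factors in [SS08]) is what yields the O(n log n) edge count quoted above.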
Conclusion
  • The authors' key idea is to scale up the low-stretch tree by a factor of κ, incurring a condition number of κ but allowing them to sample the non-tree edges aggressively, using the upper bounds on their effective resistances given by the tree (see the sketch after this list).
  • Unraveling the analysis of the bound on the condition number of the incremental sparsifier, it can be seen that one log n factor is due to the number of samples required by the Rudelson and Vershynin theorem.
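A rough sketch of that idea under simplifying assumptions: the low-stretch tree is given, the tree-path resistance is used as the upper bound on an off-tree edge's effective resistance, and each off-tree edge is kept independently with the resulting probability. The paper's actual procedure instead samples with replacement, with the number of samples dictated by the Rudelson and Vershynin theorem; all names below are illustrative:

```python
import numpy as np

def incremental_sparsifier_sketch(n, tree_edges, offtree_edges, kappa, rng=None):
    """Scale the spanning tree by kappa and sample off-tree edges using
    tree-path resistances as upper bounds on their effective resistances.

    tree_edges, offtree_edges: lists of (u, v, weight) with vertices 0..n-1.
    Returns an edge list for the incremental sparsifier (sketch only).
    """
    rng = np.random.default_rng() if rng is None else rng

    # Root the tree at vertex 0 with a BFS and record parent edges.
    adj = {i: [] for i in range(n)}
    for u, v, w in tree_edges:
        adj[u].append((v, w))
        adj[v].append((u, w))
    parent = {0: (None, None)}
    order = [0]
    for x in order:
        for y, w in adj[x]:
            if y not in parent:
                parent[y] = (x, w)
                order.append(y)

    def tree_resistance(u, v):
        """Sum of 1/w along the tree path from u to v (naive walk to the LCA)."""
        dist_u = {}
        d, x = 0.0, u
        while x is not None:
            dist_u[x] = d
            p, w = parent[x]
            if p is not None:
                d += 1.0 / w
            x = p
        d, x = 0.0, v
        while x not in dist_u:
            p, w = parent[x]
            d += 1.0 / w
            x = p
        return dist_u[x] + d

    sparsifier = [(u, v, kappa * w) for u, v, w in tree_edges]   # kappa-scaled tree
    for u, v, w in offtree_edges:
        # Stretch of the edge over the kappa-scaled tree: an upper bound on
        # w_e times its effective resistance in the scaled graph.
        p_e = min(1.0, w * tree_resistance(u, v) / kappa)
        if rng.random() < p_e:
            sparsifier.append((u, v, w / p_e))   # reweight kept edges by 1/p_e
    return sparsifier
```

Scaling the tree by κ divides each off-tree edge's sampling probability by κ, which is what allows the aggressive sampling, at the cost of a κ factor in the condition number.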
Funding
  • Partially supported by the National Science Foundation under grant number CCF-0635257. Partially supported by Microsoft Research through the Center for Computational Thinking at CMU. Partially supported by the Natural Sciences and Engineering Research Council of Canada (NSERC) under grant number M-377343-2009. The Õ() notation is used to hide a factor of at most (log log n)^4.
References
  • [ABN08] Ittai Abraham, Yair Bartal, and Ofer Neiman. Nearly tight low stretch spanning trees. In Proceedings of the 49th Annual IEEE Symposium on Foundations of Computer Science, pages 781–790, 2008.
  • Reid Andersen, Fan Chung, and Kevin Lang. Local graph partitioning using PageRank vectors. In Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science, pages 475–486, 2006.
  • [AKPW95] Noga Alon, Richard Karp, David Peleg, and Douglas West. A graph-theoretic game and its application to the k-server problem. SIAM J. Comput., 24(1):78–100, 1995.
  • [Axe94] Owe Axelsson. Iterative Solution Methods. Cambridge University Press, New York, NY, 1994.
  • [BGH+05] Marshall Bern, John R. Gilbert, Bruce Hendrickson, Nhat Nguyen, and Sivan Toledo. Support-graph preconditioners. SIAM J. Matrix Anal. Appl., 27:930–951, 2006.
  • Erik G. Boman and Bruce Hendrickson. Support theory for preconditioning. SIAM J. Matrix Anal. Appl., 25(3):694–717, 2003.
  • [BHV04] Erik G. Boman, Bruce Hendrickson, and Stephen A. Vavasis. Solving elliptic finite element systems in near-linear time with support preconditioners. CoRR, cs.NA/0407022, 2004.
  • [BK96] András A. Benczúr and David R. Karger. Approximating s-t minimum cuts in Õ(n²) time. In Proceedings of the 28th Annual ACM Symposium on Theory of Computing, pages 47–55, 1996.
  • Joshua D. Batson, Daniel A. Spielman, and Nikhil Srivastava. Twice-Ramanujan sparsifiers. In Proceedings of the 41st Annual ACM Symposium on Theory of Computing, pages 255–262, 2009.
  • F. R. K. Chung. Spectral Graph Theory, volume 92 of Regional Conference Series in Mathematics. American Mathematical Society, 1997.
  • Peter G. Doyle and J. Laurie Snell. Random Walks and Electric Networks, 2000.
  • [EEST05] Michael Elkin, Yuval Emek, Daniel A. Spielman, and Shang-Hua Teng. Lower-stretch spanning trees. In Proceedings of the 37th Annual ACM Symposium on Theory of Computing, pages 494–503, 2005.
  • Miroslav Fiedler. Algebraic connectivity of graphs. Czechoslovak Math. J., 23(98):298–305, 1973.
  • Alan George. Nested dissection of a regular finite element mesh. SIAM Journal on Numerical Analysis, 10:345–363, 1973.
  • [GMZ95] K. D. Gremban, Gary L. Miller, and M. Zagha. Performance evaluation of a parallel preconditioner. In 9th International Parallel Processing Symposium, pages 65–69, Santa Barbara, April 1995. IEEE.
  • [Gre96] Keith Gremban. Combinatorial Preconditioners for Sparse, Symmetric, Diagonally Dominant Linear Systems. PhD thesis, Carnegie Mellon University, Pittsburgh, October 1996. CMU CS Tech Report CMU-CS-96-123.
  • Harold N. Gabow and Robert Endre Tarjan. A linear-time algorithm for a special case of disjoint set union. In Proceedings of the 15th Annual ACM Symposium on Theory of Computing, pages 246–251, 1983.
  • Ramesh Hariharan and Debmalya Panigrahi. A general framework for graph sparsification. CoRR, abs/1004.4080, 2010.
  • [JMD+07] Pushkar Joshi, Mark Meyer, Tony DeRose, Brian Green, and Tom Sanocki. Harmonic coordinates for character articulation. ACM Trans. Graph., 26(3):71, 2007.
  • Anil Joshi. Topics in Optimization and Sparse Linear Systems. PhD thesis, University of Illinois at Urbana-Champaign, 1997.
  • Ioannis Koutis and Gary L. Miller. A linear work, O(n^{1/6}) time, parallel algorithm for solving planar Laplacians. In Proceedings of the 18th ACM-SIAM Symposium on Discrete Algorithms (SODA 2007), 2007.
  • Ioannis Koutis and Gary L. Miller. Graph partitioning into isolated, high conductance clusters: Theory, computation and applications to preconditioning. In Symposium on Parallel Algorithms and Architectures (SPAA), 2008.
  • Jonathan A. Kelner and Aleksander Madry. Faster generation of random spanning trees. In Proceedings of the 50th Annual IEEE Symposium on Foundations of Computer Science, pages 13–21, 2009.
  • [KMST09a] Alexandra Kolla, Yury Makarychev, Amin Saberi, and Shang-Hua Teng. Subgraph sparsification and nearly optimal ultrasparsifiers. CoRR, abs/0912.1623, 2009.
  • [KMST09b] Ioannis Koutis, Gary L. Miller, Ali Sinop, and David Tolliver. Combinatorial preconditioners and multilevel solvers for problems in computer vision and image processing. Technical report, CMU, 2009.
  • [KMT09] Ioannis Koutis, Gary L. Miller, and David Tolliver. Combinatorial preconditioners and multilevel solvers for problems in computer vision and image processing. In International Symposium on Visual Computing, pages 1067–1078, 2009.
  • R. J. Lipton, D. Rose, and R. E. Tarjan. Generalized nested dissection. SIAM Journal on Numerical Analysis, 16:346–358, 1979.
  • James McCann and Nancy S. Pollard. Real-time gradient-domain painting. ACM Trans. Graph., 27(3):1–7, 2008.
  • Chris Godsil and Gordon Royle. Algebraic Graph Theory. Graduate Texts in Mathematics. Springer-Verlag, 2001.
  • Mark Rudelson and Roman Vershynin. Sampling from large matrices: An approach through geometric functional analysis. J. ACM, 54(4):21, 2007.
  • Samuel I. Daitch and Daniel A. Spielman. Faster approximate lossy generalized flow via interior point algorithms. In Proceedings of the 40th Annual ACM Symposium on Theory of Computing, May 2008.
  • Daniel A. Spielman. Algorithms, graph theory, and linear equations in Laplacian matrices. In Proceedings of the International Congress of Mathematicians, 2010.
  • [SS08] Daniel A. Spielman and Nikhil Srivastava. Graph sparsification by effective resistances. In Proceedings of the 40th Annual ACM Symposium on Theory of Computing, 2008.
  • Daniel A. Spielman and Shang-Hua Teng. Spectral partitioning works: Planar graphs and finite element meshes. In Proceedings of the 37th Annual IEEE Symposium on Foundations of Computer Science, pages 96–105, 1996.
  • Daniel A. Spielman and Shang-Hua Teng. Solving sparse, symmetric, diagonally-dominant linear systems in time O(m^{1.31}). In Proceedings of the 44th Annual IEEE Symposium on Foundations of Computer Science, page 416, 2003.
  • [ST04] Daniel A. Spielman and Shang-Hua Teng. Nearly-linear time algorithms for graph partitioning, graph sparsification, and solving linear systems. In Proceedings of the 36th Annual ACM Symposium on Theory of Computing, pages 81–90, June 2004.
  • [ST06] Daniel A. Spielman and Shang-Hua Teng. Nearly-linear time algorithms for preconditioning and solving symmetric, diagonally dominant linear systems. CoRR, abs/cs/0607105, 2006.
  • Robert Endre Tarjan. Applications of path compression on balanced trees. J. ACM, 26(4):690–715, 1979.
  • Shang-Hua Teng. The Laplacian paradigm: Emerging algorithms for massive graphs. In Theory and Applications of Models of Computation, pages 2–14, 2010.
  • P. M. Vaidya. Solving linear equations with symmetric diagonally dominant matrices by constructing good preconditioners. Unpublished manuscript; a talk based on the manuscript was given in October 1991.