Learning to Accelerate Heuristic Searching for Large-Scale Maximum Weighted b-Matching Problems in Online Advertising

IJCAI, pp. 3437-3443, 2020.

Keywords:
combinatorial optimization, bipartite b-matching, multichannel graph neural network, economic market, ad allocation

Abstract:

Bipartite b-matching is fundamental in algorithm design and has been widely applied in economic markets, labor markets, etc. These practical problems usually exhibit two distinct features, large scale and dynamics, which require the matching algorithm to be repeatedly executed at regular intervals. However, existing exact and approxi...

Introduction
  • Bipartite b-matching is one of the fundamental problems in computer science and operations research.
  • Typical applications include resource allocation problems, such as job/server allocation in cloud computing, product recommendation [De Francisci Morales et al., 2011], and advertisement allocation [Agrawal et al., 2018] in economic markets.
  • It has been utilized as an algorithmic tool in a variety of domains, including document clustering [Dhillon, 2001], computer vision [Zanfir and Sminchisescu, 2018], and as a subroutine in machine learning algorithms.
  • The goal of the ad allocation is to search for a maximum weighted b-matching: selecting a subset of edges with the maximum total score while satisfying the cardinality constraints; a standard integer-programming formulation is sketched below.
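For concreteness, the block below gives a standard integer-programming formulation of maximum weighted b-matching on a bipartite graph G = (U, V, E); the symbols w_{uv} (edge score) and b_u, b_v (cardinality budgets) are generic notation chosen here and are not necessarily the paper's.

```latex
\begin{align}
\max_{x}\quad & \sum_{(u,v)\in E} w_{uv}\, x_{uv} \\
\text{s.t.}\quad & \sum_{v:(u,v)\in E} x_{uv} \le b_u \quad \forall u \in U \\
& \sum_{u:(u,v)\in E} x_{uv} \le b_v \quad \forall v \in V \\
& x_{uv} \in \{0,1\} \quad \forall (u,v)\in E
\end{align}
```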
Highlights
  • Bipartite b-matching is one of the fundamental problems in computer science and operations research
  • A bipartite graph connects a large set of consumers and a large set of ads
  • We investigate whether we can leverage the representation capability of neural networks to transfer the knowledge learned from previously solved instances to accelerate solution computing on similar new instances
  • We propose a parallelizable and scalable learning-based framework, NeuSearcher, to accelerate solution computing for large-scale b-matching problems
  • Our NeuSearcher, with the designed multichannel graph neural network, computes the same solutions at the fastest speed, reducing computing time by more than 50% (an illustrative sketch of such a layer follows this list)
  • Our NeuSearcher transfers knowledge learned from previously solved instances to save more than 50% of the computing time
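The multichannel graph neural network itself is not detailed on this summary page. Purely as an illustration of the general idea, the sketch below shows one way a message-passing layer over a bipartite consumer-ad graph with multi-channel edge features could be written in PyTorch; the layer structure, channel semantics, and mean aggregation are assumptions made for this sketch, not the authors' architecture.

```python
import torch
import torch.nn as nn

class BipartiteMultiChannelLayer(nn.Module):
    """Illustrative message-passing layer for a consumer-ad bipartite graph.

    Hypothetical sketch: each edge carries C feature channels (e.g. click
    score, budget ratio, ...); consumer messages are mean-aggregated per ad.
    """

    def __init__(self, node_dim: int, num_channels: int):
        super().__init__()
        self.msg = nn.Linear(node_dim + num_channels, node_dim)
        self.upd = nn.Linear(2 * node_dim, node_dim)

    def forward(self, h_consumer, h_ad, edge_index, edge_feat):
        # edge_index: (2, E) rows = [consumer_id, ad_id]; edge_feat: (E, C)
        src, dst = edge_index
        m = torch.relu(self.msg(torch.cat([h_consumer[src], edge_feat], dim=-1)))
        agg = torch.zeros_like(h_ad).index_add_(0, dst, m)       # sum messages per ad
        deg = torch.zeros(h_ad.size(0), 1).index_add_(
            0, dst, torch.ones(src.size(0), 1))                   # in-degree per ad
        agg = agg / deg.clamp(min=1.0)                            # mean aggregation
        return torch.relu(self.upd(torch.cat([h_ad, agg], dim=-1)))
```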
Methods
  • GreedyMR is one of the fastest parallel algorithms for computing b-matchings; a simplified serial greedy sketch is given after this list.
  • The authors evaluate NeuSearcher on both open and industrial datasets.
  • Due to the memory limit (128 GB), the authors cannot calculate the exact solution using the Gurobi optimizer for the first 7 datasets.
  • The authors compare the matching quality of the approximate algorithms relative to the exact solution on the other 3 open datasets (Amazon review data [He and McAuley, 2016] and MovieLens data [Harper and Konstan, 2016]).
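As context for the greedy baseline, the following is a minimal serial sketch of weight-ordered greedy selection for bipartite b-matching; it illustrates the greedy idea behind such baselines but is not the MapReduce implementation of GreedyMR [De Francisci Morales et al., 2011].

```python
from typing import Dict, List, Tuple

def greedy_b_matching(edges: List[Tuple[int, int, float]],
                      b_left: Dict[int, int],
                      b_right: Dict[int, int]) -> List[Tuple[int, int, float]]:
    """Serial greedy for weighted bipartite b-matching (illustration only).

    edges:   (left_node, right_node, weight) triples
    b_left:  remaining capacity of each left node (e.g. consumer)
    b_right: remaining capacity of each right node (e.g. ad)
    """
    matched = []
    # Scan edges in non-increasing weight order; keep an edge whenever both
    # endpoints still have spare capacity.
    for u, v, w in sorted(edges, key=lambda e: e[2], reverse=True):
        if b_left.get(u, 0) > 0 and b_right.get(v, 0) > 0:
            matched.append((u, v, w))
            b_left[u] -= 1
            b_right[v] -= 1
    return matched

# Tiny hypothetical example:
# greedy_b_matching([(0, 0, 0.9), (0, 1, 0.4), (1, 0, 0.7)], {0: 1, 1: 1}, {0: 1, 1: 2})
```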
Results
  • For problems with larger sizes, Gurobi fails to compute an optimal solution due to the memory limit (128 GB).
  • This indicates that faster approximate approaches are good alternatives for solving large-scale b-matching problems, and NeuSearcher achieves state-of-the-art solution quality.
  • The authors' NeuSearcher transfers knowledge learned from previously solved instances to save more than 50% of the computing time
Conclusion
  • To the best of their knowledge, the authors are the first to integrate deep learning methods to accelerate solving practical large-scale b-matching problems.
  • The authors' NeuSearcher transfers knowledge learned from previously solved instances to save more than 50% of the computing time.
  • The authors design a parallel heuristic search algorithm to ensure that the solution quality is exactly the same as that of the state-of-the-art approximation algorithms (a hedged illustration of such a search step follows this list).
  • Experiments on open and real-world large-scale datasets show NeuSearcher can compute nearly optimal solutions much faster than state-of-the-art methods
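The precise form of the parallel heuristic search is not reproduced on this summary page. Purely as a hypothetical illustration of a per-ad refinement step that such a search could perform, and which parallelizes trivially across ads, the sketch below adjusts a predicted score threshold so that an ad's capacity is exactly respected; the function name and refinement rule are invented for this sketch and are not the authors' algorithm.

```python
from typing import List

def refine_threshold(scores: List[float], capacity: int,
                     predicted_threshold: float) -> float:
    """Snap a predicted per-ad score threshold to an exact cut-off.

    Hypothetical illustration: if the predicted threshold admits the wrong
    number of candidate edges, move it to the score of the capacity-th best
    edge. A good prediction means only a few scores near the predicted value
    need to be examined; each ad can be refined independently (in parallel).
    """
    if capacity <= 0:
        return float("inf")                 # no edge may be selected
    if capacity >= len(scores):
        return float("-inf")                # every candidate edge fits
    admitted = sum(s >= predicted_threshold for s in scores)
    if admitted == capacity:
        return predicted_threshold          # prediction already consistent
    return sorted(scores, reverse=True)[capacity - 1]

# refine_threshold([0.9, 0.8, 0.3, 0.1], capacity=2, predicted_threshold=0.2)  # -> 0.8
```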
Tables
  • Table1: The structural properties of the datasets
  • Table2: The solution quality comparison (best in bold)
  • Table3: The runtimes (in seconds) of b-matching computation, where lower values are better (best in bold)
Funding
  • The work is supported by the Alibaba Group through the Alibaba Innovative Research Program, the National Natural Science Foundation of China (Grant Nos. 61702362 and U1836214), and the New Generation of Artificial Intelligence Science and Technology Major Project of Tianjin under grant 19ZXZNGX00010
References
  • [Agrawal et al., 2018] Shipra Agrawal, Morteza Zadimoghaddam, and Vahab Mirrokni. Proportional allocation: Simple, distributed, and diverse matching with high entropy. In International Conference on Machine Learning, pages 99–108, 2018.
  • [Avis, 1983] David Avis. A survey of heuristics for the weighted matching problem. Networks, 13(4):475–493, 1983.
  • [Bayati et al., 2011] Mohsen Bayati, Christian Borgs, Jennifer Chayes, and Riccardo Zecchina. Belief propagation for weighted b-matchings on arbitrary graphs and its relation to linear programs with integer solutions. SIAM Journal on Discrete Mathematics, 25(2):989–1011, 2011.
  • [Chen and Tian, 2019] Xinyun Chen and Yuandong Tian. Learning to perform local rewriting for combinatorial optimization. In Advances in Neural Information Processing Systems, pages 6278–6289, 2019.
  • [De Francisci Morales et al., 2011] Gianmarco De Francisci Morales, Aristides Gionis, and Mauro Sozio. Social content matching in MapReduce. Proceedings of the VLDB Endowment, 4(7):460–469, 2011.
  • [Dhillon, 2001] Inderjit S. Dhillon. Co-clustering documents and words using bipartite spectral graph partitioning. In Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 269–274. ACM, 2001.
  • [Ding et al., 2019] Jian-Ya Ding, Chao Zhang, Lei Shen, Shengyin Li, Bing Wang, Yinghui Xu, and Le Song. Optimal solution predictions for mixed integer programs. arXiv preprint arXiv:1906.09575, 2019.
  • [Edmonds, 1965] Jack Edmonds. Maximum matching and a polyhedron with 0,1-vertices. Journal of Research of the National Bureau of Standards B, 69:125–130, 1965.
  • [Grötschel and Holland, 1985] Martin Grötschel and Olaf Holland. Solving matching problems with linear programming. Mathematical Programming, 33(3):243–259, 1985.
  • [Gurobi, 2014] Gurobi Optimization, Inc. Gurobi Optimizer Reference Manual. http://www.gurobi.com, 2014.
  • [Harper and Konstan, 2016] F. Maxwell Harper and Joseph A. Konstan. The MovieLens datasets: History and context. ACM Transactions on Interactive Intelligent Systems (TiiS), 5(4):19, 2016.
  • [He and McAuley, 2016] Ruining He and Julian McAuley. Ups and downs: Modeling the visual evolution of fashion trends with one-class collaborative filtering. In Proceedings of the 25th International Conference on World Wide Web, pages 507–517, 2016.
  • [He et al., 2014] He He, Hal Daumé III, and Jason M. Eisner. Learning to search in branch and bound algorithms. In Advances in Neural Information Processing Systems, pages 3293–3301, 2014.
  • [Hougardy, 2009] Stefan Hougardy. Linear time approximation algorithms for degree constrained subgraph problems. In Research Trends in Combinatorial Optimization, pages 185–200, 2009.
  • [Khalil et al., 2017] Elias Khalil, Hanjun Dai, Yuyu Zhang, Bistra Dilkina, and Le Song. Learning combinatorial optimization algorithms over graphs. In Advances in Neural Information Processing Systems, pages 6348–6358, 2017.
  • [Khan et al., 2016] Arif Khan, Alex Pothen, Md Mostofa Ali Patwary, Nadathur Rajagopalan Satish, Narayanan Sundaram, Fredrik Manne, Mahantesh Halappanavar, and Pradeep Dubey. Efficient approximation algorithms for weighted b-matching. SIAM Journal on Scientific Computing, 38(5):S593–S619, 2016.
  • [Li et al., 2018] Zhuwen Li, Qifeng Chen, and Vladlen Koltun. Combinatorial optimization with graph convolutional networks and guided tree search. In Advances in Neural Information Processing Systems, pages 539–548, 2018.
  • [Müller-Hannemann and Schwartz, 2000] Matthias Müller-Hannemann and Alexander Schwartz. Implementing weighted b-matching algorithms: Insights from a computational study. Journal of Experimental Algorithmics (JEA), 5:8, 2000.
  • [Naim and Manne, 2018] Md Naim and Fredrik Manne. Scalable b-matching on GPUs. In 2018 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), pages 637–646. IEEE, 2018.
  • [Padberg and Rao, 1982] Manfred W. Padberg and M. Ram Rao. Odd minimum cut-sets and b-matchings. Mathematics of Operations Research, 7(1):67–80, 1982.
  • [Preis, 1999] Robert Preis. Linear time 1/2-approximation algorithm for maximum weighted matching in general graphs. In Annual Symposium on Theoretical Aspects of Computer Science, pages 259–269, 1999.
  • [Vinyals et al., 2015] Oriol Vinyals, Meire Fortunato, and Navdeep Jaitly. Pointer networks. In Advances in Neural Information Processing Systems, pages 2692–2700, 2015.
  • [Wu et al., 2019] Zonghan Wu, Shirui Pan, Fengwen Chen, Guodong Long, Chengqi Zhang, and Philip S. Yu. A comprehensive survey on graph neural networks. arXiv preprint arXiv:1901.00596, 2019.
  • [Xu et al., 2018] Keyulu Xu, Weihua Hu, Jure Leskovec, and Stefanie Jegelka. How powerful are graph neural networks? arXiv preprint arXiv:1810.00826, 2018.
  • [Zanfir and Sminchisescu, 2018] Andrei Zanfir and Cristian Sminchisescu. Deep learning of graph matching. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 2684–2693, 2018.