Vcwc: A Versioning Competition Workflow Compiler

LPNMR 2013: Proceedings of the 12th International Conference on Logic Programming and Nonmonotonic Reasoning, Volume 8148 (2013)

Abstract
System competitions evaluate solvers and compare state-of-the-art implementations on benchmark sets in a dedicated, controlled computing environment, usually comprising multiple machines. Recent initiatives such as [6] aim at establishing best practices for computer science evaluations, in particular identifying measures that ensure repeatability, exclude common pitfalls, and introduce appropriate tools. For instance, Asparagus [1] focuses on maintaining benchmarks and their instances. Other well-known tools such as Runlim (http://fmv.jku.at/runlim/) and Runsolver [12] help to limit resources and to measure CPU time and memory usage of solver runs. Other systems are tailored to the specific needs of particular communities: the ASP Competition evaluation platform for the 3rd ASP Competition 2011 [4], which is not publicly accessible, implements a framework for running an ASP competition. A more general platform is StarExec [13], which aims at providing a generic framework for competition maintainers. The last two systems are similar in spirit, but each has restrictions that limit its general applicability: StarExec does not support generic solver input and has no scripting support, while the ASP Competition evaluation platform has no support for fault-tolerant execution of instance runs. Moreover, benchmark statistics and rankings can only be computed after all solver runs for all benchmark instances have completed.
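
To illustrate the kind of resource control that tools such as Runlim and Runsolver provide, the following is a minimal sketch of a resource-limited solver run in Python on a POSIX system. The solver command, instance file, and limit values are hypothetical placeholders for illustration; this is not the implementation of any of the tools named above.

import resource
import subprocess

# Hypothetical limits, chosen only for illustration.
CPU_LIMIT_S = 600          # CPU-time limit in seconds
MEM_LIMIT_B = 4 * 2**30    # address-space limit in bytes (4 GiB)

def set_limits():
    # Runs in the child process just before exec: hard caps on CPU time
    # and virtual memory, enforced by the operating system.
    resource.setrlimit(resource.RLIMIT_CPU, (CPU_LIMIT_S, CPU_LIMIT_S))
    resource.setrlimit(resource.RLIMIT_AS, (MEM_LIMIT_B, MEM_LIMIT_B))

def run_solver(cmd, instance_path):
    # Feed the benchmark instance on stdin and capture the solver output.
    with open(instance_path) as instance:
        proc = subprocess.run(
            cmd, stdin=instance,
            capture_output=True, text=True,
            preexec_fn=set_limits,
        )
    # getrusage(RUSAGE_CHILDREN) aggregates over all waited-for children,
    # so per-run accounting needs care if several runs share this process.
    usage = resource.getrusage(resource.RUSAGE_CHILDREN)
    cpu_time = usage.ru_utime + usage.ru_stime
    return proc.returncode, cpu_time

if __name__ == "__main__":
    # Placeholder solver command and instance file.
    rc, cpu = run_solver(["clingo", "--quiet=1"], "instance.lp")
    print(f"exit={rc} cpu={cpu:.2f}s")
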
Keywords
Benchmark Server, International Planning Competition, Script Support, Competition Track, Versioning Competition