Expose: Experimental Performance Evaluation Of Stream Processing Engines Made Easy

Performance Evaluation and Benchmarking (TPCTC 2020), 2021

Abstract
Experimental performance evaluation of stream processing engines (SPEs) can be a great challenge, and aiming for fair comparisons of different SPEs raises the bar even higher. One important reason for this is that these systems often use concepts that require expert knowledge of each SPE. To address this issue, we present Expose, a distributed performance evaluation framework for SPEs that enables a user, through a declarative approach, to specify experiments and conduct them on multiple SPEs fairly and with low effort. Experimenters with few technical skills can define and execute distributed experiments that can easily be replicated. We demonstrate Expose by defining a set of experiments based on the existing NEXMark benchmark and conduct a performance evaluation of Flink, Beam with the Flink runner, Siddhi, T-Rex, and Esper on both powerful and resource-constrained hardware.
Keywords
Performance evaluation, Stream processing, Distributed experiments