Abstract
Spark is a widely used computing platform for big data processing, and the reasonable allocation of cluster resources plays an important role in optimizing the performance of Spark jobs. Performance prediction is the basis and key of cluster resource allocation optimization, so this paper proposes a Spark performance prediction model. Job execution time is selected as the performance metric, and the concept of the key Stage of a Spark job is introduced. By running small sample datasets, the relationship between the running time of the key Stages and the amount of job input data is obtained, from which the performance prediction model is built. Experimental results show that the model is effective.
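To make the prediction idea concrete, here is a minimal sketch of the approach the abstract describes: fit the relationship between a key Stage's runtime and the input data size on small samples, then extrapolate to the full workload. The linear functional form, the sample numbers, and all names below are illustrative assumptions, not the authors' exact model.

```python
import numpy as np

# Hypothetical measured runtimes (seconds) of one key Stage on small
# sample datasets of increasing size (GB); these values are made up.
sample_sizes_gb = np.array([0.5, 1.0, 2.0, 4.0])
stage_runtimes_s = np.array([12.0, 21.5, 40.8, 79.3])

# Fit runtime as a function of input size for this key Stage,
# assuming a linear relationship T = a * size + b.
a, b = np.polyfit(sample_sizes_gb, stage_runtimes_s, deg=1)

def predict_stage_runtime(input_size_gb: float) -> float:
    """Extrapolate the key Stage's runtime to a full-size input."""
    return a * input_size_gb + b

# The predicted job time would be the sum of the predicted runtimes
# of all key Stages; only one Stage is shown here.
full_input_gb = 100.0
print(f"Predicted key-Stage runtime at {full_input_gb} GB: "
      f"{predict_stage_runtime(full_input_gb):.1f} s")
```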
作者
葛庆宝
陶耀东
高岑
田月
孟祥茹
GE Qing-Bao; TAO Yao-Dong; GAO Cen; TIAN Yue; MENG Xiang-Ru (University of Chinese Academy of Sciences, Beijing 100049, China; Shenyang Institute of Computing Technology, Chinese Academy of Sciences, Shenyang 110168, China)
Source
《计算机系统应用》
2018, No. 8, pp. 232-236 (5 pages)
Computer Systems & Applications