
The spark-submit Job Submission Process

Posted: 2015-03-05 08:06:06


1. Job submission methods and parameters

Let's first look at how jobs are submitted with spark-submit. The following examples are excerpted from the official documentation.
# Run application locally on 8 cores
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master local[8] \
  /path/to/examples.jar \
  100

# Run on a Spark standalone cluster
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar \
  1000

# Run on a YARN cluster (use `--master yarn-client` instead for client mode)
export HADOOP_CONF_DIR=XXX
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn-cluster \
  --executor-memory 20G \
  --num-executors 50 \
  /path/to/examples.jar \
  1000

# Run a Python application on a cluster
./bin/spark-submit \
  --master spark://207.184.161.138:7077 \
  examples/src/main/python/pi.py \
  1000
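
For context, the Python application submitted in the last example (examples/src/main/python/pi.py) is just an ordinary PySpark script; spark-submit runs it against whichever master is given on the command line. A minimal sketch in that style, assuming the standard pyspark API (not the exact file shipped with Spark), looks like this:

# pi.py-style sketch: Monte Carlo estimate of pi with PySpark
import sys
from random import random
from operator import add

from pyspark import SparkContext

if __name__ == "__main__":
    sc = SparkContext(appName="PythonPi")
    # The trailing command-line argument (e.g. the 1000 above) sets the partition count
    partitions = int(sys.argv[1]) if len(sys.argv) > 1 else 2
    n = 100000 * partitions

    def inside(_):
        # Sample a random point in the square [-1, 1] x [-1, 1];
        # return 1 if it falls inside the unit circle
        x, y = random() * 2 - 1, random() * 2 - 1
        return 1 if x * x + y * y < 1 else 0

    # Scale the fraction of points inside the circle by 4 to estimate pi
    count = sc.parallelize(range(1, n + 1), partitions).map(inside).reduce(add)
    print("Pi is roughly %f" % (4.0 * count / n))
    sc.stop()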

 


Original post: http://www.cnblogs.com/gaopeng527/p/4314308.html
