1. Determine whether the script is running under Cygwin
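A minimal sketch of this check, assuming the uname-based detection commonly used in the 0.9.0 scripts:

    # Detect Cygwin so paths can be converted later if needed
    cygwin=false
    case "`uname`" in
        CYGWIN*) cygwin=true;;
    esac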
2. Set SCALA_VERSION
3. Set SPARK_HOME
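A sketch of how steps 2 and 3 might look; FWDIR as an intermediate variable for the resolved Spark root is an assumption, so check your copy of the script:

    SCALA_VERSION=2.10
    # Resolve the Spark root relative to this script's own location
    # (FWDIR is an assumed helper name)
    FWDIR="$(cd `dirname $0`/..; pwd)"
    export SPARK_HOME="$FWDIR"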
4. Source conf/spark-env.sh
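The sourcing is presumably guarded so a missing config file is not fatal; a sketch, assuming the file sits under $FWDIR/conf:

    # Pull in user-defined environment settings if present
    if [ -e "$FWDIR/conf/spark-env.sh" ] ; then
      . $FWDIR/conf/spark-env.sh
    fi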
5. If the class to run is org.apache.spark.deploy.master.Master or org.apache.spark.deploy.worker.Worker, the options are based on SPARK_DAEMON_JAVA_OPTS (the exact values are listed in step 6); otherwise, set
OUR_JAVA_OPTS="$SPARK_JAVA_OPTS"
6. Append class-specific options, as sketched after this list:
1) org.apache.spark.deploy.master.Master:
OUR_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS $SPARK_MASTER_OPTS"
2) org.apache.spark.deploy.worker.Worker:
OUR_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS $SPARK_WORKER_OPTS"
3) org.apache.spark.executor.CoarseGrainedExecutorBackend:
OUR_JAVA_OPTS="$SPARK_JAVA_OPTS $SPARK_EXECUTOR_OPTS"
4) org.apache.spark.executor.MesosExecutorBackend:
same as 3)
5) org.apache.spark.repl.Main
OUR_JAVA_OPTS="$SPARK_JAVA_OPTS $SPARK_REPL_OPTS"
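Steps 5 and 6 boil down to a single case on the class name passed as the first argument; a condensed sketch built from the assignments listed above (the actual 0.9.0 script may order or word the branches differently):

    case "$1" in
      'org.apache.spark.deploy.master.Master')
        OUR_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS $SPARK_MASTER_OPTS"
        ;;
      'org.apache.spark.deploy.worker.Worker')
        OUR_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS $SPARK_WORKER_OPTS"
        ;;
      'org.apache.spark.executor.CoarseGrainedExecutorBackend' | \
      'org.apache.spark.executor.MesosExecutorBackend')
        OUR_JAVA_OPTS="$SPARK_JAVA_OPTS $SPARK_EXECUTOR_OPTS"
        ;;
      'org.apache.spark.repl.Main')
        OUR_JAVA_OPTS="$SPARK_JAVA_OPTS $SPARK_REPL_OPTS"
        ;;
      *)
        # Default branch from step 5
        OUR_JAVA_OPTS="$SPARK_JAVA_OPTS"
        ;;
    esac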
7. Locate java: try $JAVA_HOME/bin/java first, then the java command on the PATH, otherwise exit with an error
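A sketch of that lookup order (the RUNNER variable name and the error message text are assumptions):

    if [ -n "${JAVA_HOME}" ]; then
      RUNNER="${JAVA_HOME}/bin/java"
    else
      if [ `command -v java` ]; then
        RUNNER="java"
      else
        # assumed wording; the real script's message may differ
        echo "JAVA_HOME is not set" >&2
        exit 1
      fi
    fi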
8. SPARK_MEM=${SPARK_MEM:-512m}
TODO...
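Presumably the default feeds straight into the JVM heap flags; a sketch, assuming SPARK_MEM sets both the initial and maximum heap:

    # Default to 512m, then size the heap from it (assumed behavior)
    SPARK_MEM=${SPARK_MEM:-512m}
    JAVA_OPTS="$OUR_JAVA_OPTS -Xms$SPARK_MEM -Xmx$SPARK_MEM"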
9. Execute java -cp $CLASSPATH $JAVA_OPTS "$@"
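A sketch of the final launch, assuming CLASSPATH is produced by the bin/compute-classpath.sh helper and RUNNER comes from step 7:

    # Build the classpath, then replace this shell with the JVM
    # (helper path is an assumption; verify against your script)
    CLASSPATH=`$FWDIR/bin/compute-classpath.sh`
    exec "$RUNNER" -cp "$CLASSPATH" $JAVA_OPTS "$@"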
Original: http://www.cnblogs.com/hujunfei/p/3624592.html