Yep, you can submit an app to a Spark cluster via the spark-submit command, e.g.
spark-submit --master spark://gzsw-02:7077 --class org.apache.spark.examples.JavaWordCount --verbose --deploy-mode client ~/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-examples-1.4.1-hadoop2.4.0.jar spark/spark-1.4.1-bin-hadoop2.4/RELEASE,spark/spark-1.4.1-bin-hadoop2.4/README.md
But you can also spawn an app from within spark-shell. Start the shell like this:
spark-shell --jars lib/spark-examples-1.4.1-hadoop2.4.0-my.jar --master spark://gzsw-02:7077 --executor-memory 600m --total-executor-cores 16
Note: the --total-executor-cores param is a must here, else you will get a java.lang.OutOfMemoryError: Java heap space exception after this log line:
15/11/25 12:08:17 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.13:56889/user/Executor#-1124709965]) with ID 0
After that I checked the master UI and found that the app's cores were shown as the max integer value, so add this param to set the number of cores actually used.
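(As a side note, a sketch of an alternative: --total-executor-cores corresponds, to my knowledge, to the spark.cores.max property, so you could also pin it in conf/spark-defaults.conf instead of on the command line; the value 16 just mirrors the command above.)
spark.cores.max 16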
Then you can import the entry class of your app inside the shell, e.g. for the JavaWordCount example above:
import org.apache.spark.examples.JavaWordCount
followed by invoking its entry method:
val arr = ... // params for running this app
JavaWordCount.main(arr)
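For instance, reusing the input files from the spark-submit command above (a sketch; JavaWordCount takes the input path as its first argument, and textFile accepts a comma-separated list of paths, so both files are counted):
val arr = Array("spark/spark-1.4.1-bin-hadoop2.4/RELEASE,spark/spark-1.4.1-bin-hadoop2.4/README.md")
JavaWordCount.main(arr)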
But note: in this demo the app will create a new SparkContext, which differs from the default one already initiated by spark-shell, so the launch will fail with this complaint:
WARN SparkContext: Multiple running SparkContexts detected in the same JVM!
So you can add the property below to spark-defaults.conf to allow multiple contexts:
spark.driver.allowMultipleContexts=true
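Alternatively, you can pass the same property just for this session via --conf when starting the shell (same flags as before):
spark-shell --jars lib/spark-examples-1.4.1-hadoop2.4.0-my.jar --master spark://gzsw-02:7077 --executor-memory 600m --total-executor-cores 16 --conf spark.driver.allowMultipleContexts=true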
After all that, you will see the app launched just as with the spark-submit command.
ref:
spark - spawn an app via spark-shell VS spark-submit
Original post: http://leibnitz.iteye.com/blog/2259069