1. On the node where the Master runs, add the following configuration to conf/spark-env.sh:
export SPARK_MASTER_OPTS="$SPARK_MASTER_OPTS -Xdebug -server -Xrunjdwp:transport=dt_socket,address=5005,server=y,suspend=y"
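A note on these options (my reading, not from the original text): `-Xrunjdwp` loads the JDWP debug agent, `transport=dt_socket` makes it listen on a TCP socket, `address=5005` is the listen port, `server=y` means the JVM acts as the debug server and waits for a debugger to connect, and `suspend=y` blocks the Master at startup until a debugger actually attaches. If you instead want the Master to start normally and attach a debugger later, a variant with `suspend=n` can be used:

```shell
# Variant of the line above: same debug agent on port 5005, but with
# suspend=n the Master JVM starts immediately instead of blocking
# until a debugger attaches.
export SPARK_MASTER_OPTS="$SPARK_MASTER_OPTS -Xdebug -server -Xrunjdwp:transport=dt_socket,address=5005,server=y,suspend=n"
```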
2. Stop the Master:
root@master:/usr/local/spark-1.6.1-bin-hadoop2.6/sbin# stop-master.sh
3. Start the Master:
root@master:/usr/local/spark-1.6.1-bin-hadoop2.6/sbin# start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark-1.6.1-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
4. Check the log:
root@master:/usr/local/spark-1.6.1-bin-hadoop2.6/sbin# cat /usr/local/spark-1.6.1-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
Spark Command: /usr/local/jdk1.8.0_60/bin
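Because the configuration above uses `suspend=y`, the Master JVM is now suspended, listening on port 5005 and waiting for a debugger. As an illustration (not part of the original steps), you can attach the JDK's command-line debugger jdb; the hostname `master` is taken from the shell prompt above:

```shell
# Attach jdb (bundled with the JDK) to the suspended Master JVM.
# The SocketAttach connector matches the dt_socket transport configured above.
jdb -connect com.sun.jdi.SocketAttach:hostname=master,port=5005

# Typical commands once attached (typed at the jdb prompt):
#   stop in org.apache.spark.deploy.master.Master.main   # break at the Master entry point
#   resume                                               # let the suspended JVM continue
```

In an IDE such as IntelliJ IDEA, the equivalent is a "Remote" debug run configuration attaching to master:5005, after which you can set breakpoints in `Master.main` (or `Worker.main` for a Worker) and step through the startup.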
IMF Spark source-code release customization class, preview course: Debugging the Spark framework source code (2) — debugging from the Master/Worker main entry points