1. Download the new release, spark-1.6.1-bin-hadoop2.6.tgz
root@master:/usr/local/setup_tools# tar -zxvf spark-1.6.1-bin-hadoop2.6.tgz
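The archive is unpacked under /usr/local/setup_tools, while the SPARK_HOME set in step 2 points to /usr/local, so the extracted directory presumably still has to be moved there. A minimal sketch, assuming /usr/local is the intended install location:
# assumed destination, matching the SPARK_HOME value set in step 2
mv /usr/local/setup_tools/spark-1.6.1-bin-hadoop2.6 /usr/local/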
2. Edit /etc/profile
vi /etc/profile
export SPARK_HOME=/usr/local/spark-1.6.0-bin-hadoop2.6
and change it to
export SPARK_HOME=/usr/local/spark-1.6.1-bin-hadoop2.6
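For reference, the same replacement can be made non-interactively; a sketch, assuming /etc/profile contains exactly the export line shown above:
# in-place edit of /etc/profile (back it up first); swaps 1.6.0 for 1.6.1 in SPARK_HOME
sed -i 's|spark-1.6.0-bin-hadoop2.6|spark-1.6.1-bin-hadoop2.6|' /etc/profile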
3. Make the change take effect
root@master:/usr/local# source /etc/profile
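A quick check (not in the original) that the new value is active in the current shell:
# should print the 1.6.1 path set above
echo $SPARK_HOME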
4. Back up the Spark 1.6.1 configuration files
cp -R /usr/local/spark-1.6.1-bin-hadoop2.6/conf/. /usr/local/spark-1.6.1-bin-hadoop2.6/conf.161.bak
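Optional sanity check (not in the original) that the backup directory now holds the stock 1.6.1 files:
# list the backed-up default configuration
ls /usr/local/spark-1.6.1-bin-hadoop2.6/conf.161.bak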
5. Copy over the original 1.6.0 configuration files
cp -R /usr/local/spark-1.6.0-bin-hadoop2.6/conf/. /usr/local/spark-1.6.1-bin-hadoop2.6/conf
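After this copy, the carried-over 1.6.0 settings live in the 1.6.1 conf directory. An optional check (not in the original) is to diff against the backup from step 4 to see exactly which defaults were replaced:
# compare the stock 1.6.1 defaults with the carried-over 1.6.0 settings
diff -r /usr/local/spark-1.6.1-bin-hadoop2.6/conf.161.bak /usr/local/spark-1.6.1-bin-hadoop2.6/conf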
6. Review the 1.6.1 configuration
vi spark-env.sh
export SCALA_HOME=/usr/local/scala-2.10.4
export JAVA_HOME=/usr/local/jdk1.8.0_60
expor
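The listing is cut off after "expor"; the remaining exports are not shown here. As an illustrative sketch only (standard Spark 1.6 spark-env.sh variables with placeholder values, not this cluster's actual settings), a standalone master/worker setup typically also includes lines such as:
# illustrative only -- actual values for this cluster are not shown in the original
export SPARK_MASTER_IP=master            # hostname/IP of the master node (assumption)
export SPARK_WORKER_MEMORY=2g            # memory available to each worker (assumption)
export HADOOP_CONF_DIR=/usr/local/hadoop-2.6.0/etc/hadoop   # assumed Hadoop config path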