
1. Configure the slaves file

mv slaves.template slaves

Add the worker hostnames to slaves:

hadoop.slave01
hadoop.slave02
hadoop.slave03
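The same file can be produced from a script, which helps keep node lists in one place. A minimal sketch; defaulting SPARK_CONF_DIR to the current directory is an assumption, point it at $SPARK_HOME/conf on a real install:

```shell
# Sketch: write the worker list that sbin/start-slaves.sh reads, one host per line.
# SPARK_CONF_DIR defaulting to "." is an assumption for this illustration.
set -e
conf_dir="${SPARK_CONF_DIR:-.}"
printf '%s\n' hadoop.slave01 hadoop.slave02 hadoop.slave03 > "$conf_dir/slaves"
cat "$conf_dir/slaves"
```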


2、

cp spark-env.sh.template spark-env.sh 
spark-env.sh 文件添加
SPARK_MASTER_HOST=hadoop.slave01
SPARK_MASTER_PORT=7077
export JAVA_HOME=/usr/java/jdk1.8.0_201
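Both slaves and spark-env.sh must be identical on every node. A hedged sketch of pushing them to the other workers; passwordless ssh and /opt/spark as the shared install path are assumptions, and the loop only echoes the commands (a dry run):

```shell
set -e
spark_home="${SPARK_HOME:-/opt/spark}"   # assumed install path, same on every node
for host in hadoop.slave02 hadoop.slave03; do
  # dry run: print each copy command; remove 'echo' to actually push the files
  echo scp "$spark_home/conf/slaves" "$spark_home/conf/spark-env.sh" "$host:$spark_home/conf/"
done > push_conf.cmds
cat push_conf.cmds
```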


3. Configure the JobHistoryServer

Edit spark-defaults.conf (the shipped template is spark-defaults.conf.template; note the plural):

spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://hadoop.slave01:9000/directory

Edit spark-env.sh and add:

export SPARK_HISTORY_OPTS="-Dspark.history.ui.port=18080 -Dspark.history.retainedApplications=30 -Dspark.history.fs.logDirectory=hdfs://hadoop.slave01:9000/directory"
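Here spark.history.ui.port sets the history server's web UI port, spark.history.retainedApplications caps how many applications' UI data it keeps in memory, and spark.history.fs.logDirectory must point at the same HDFS path as spark.eventLog.dir. All three ride in one quoted environment variable that the launcher expands into separate -D Java system properties; a quick sanity check of the quoting:

```shell
# Sanity check: the quoted string should split into exactly three -D properties.
export SPARK_HISTORY_OPTS="-Dspark.history.ui.port=18080 -Dspark.history.retainedApplications=30 -Dspark.history.fs.logDirectory=hdfs://hadoop.slave01:9000/directory"
echo "$SPARK_HISTORY_OPTS" | tr ' ' '\n' | grep -c '^-D' | tee dprop_count.txt
# prints 3
```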


4. Create the event-log directory on HDFS

hadoop fs -mkdir /directory
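With the directory in place, a plausible launch sequence run from hadoop.slave01 looks like the following. The sbin scripts ship with Spark; $SPARK_HOME is an assumed install path, and the commands need a live HDFS and the configuration above, so treat this as a sketch rather than something to paste blindly:

```shell
hadoop fs -mkdir -p /directory              # -p: succeed even if it already exists
"$SPARK_HOME"/sbin/start-all.sh             # starts the master plus every worker listed in slaves
"$SPARK_HOME"/sbin/start-history-server.sh  # history UI at http://hadoop.slave01:18080
```

After submitting a job, its event log lands under /directory and shows up in the history UI once the application finishes.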


Reposted from: https://www.cnblogs.com/Jomini/p/11609805.html