I. Spark Installation and Deployment

1. Download and Installation

Download: https://archive.apache.org/dist/spark/

Official documentation: https://spark.apache.org/docs/3.0.0/

  1. Extract the archive: tar -zxvf /opt/software/spark-3.0.0-bin-hadoop3.2.tgz -C /opt/module/
  2. Rename the directory: mv /opt/module/spark-3.0.0-bin-hadoop3.2 /opt/module/spark
  3. Configure the environment variables and apply them (see the verification sketch after the block below): vim /etc/profile.d/my_env.sh
#SPARK_HOME
export SPARK_HOME=/opt/module/spark
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
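After saving my_env.sh, load it into the current shell and check that the Spark binaries resolve on the PATH. A minimal verification, assuming the installation path above:

source /etc/profile.d/my_env.sh
spark-submit --version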

II. Submitting Jobs

Submit a job: bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://hadoop102:7077 ./examples/jars/spark-examples_2.12-3.0.0.jar 10

Submit a job with explicit executor resources: bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://hadoop102:7077 --executor-memory 1G --total-executor-cores 2 examples/jars/spark-examples_2.12-3.0.0.jar 10
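Both commands assume a standalone master is already running at hadoop102:7077 and that they are run from /opt/module/spark. For readability, the second submission can be written across multiple lines; this is a sketch of the same command with the flags annotated (host name and paths come from the steps above):

# --class: fully qualified entry class inside the application jar
# --master: URL of the standalone master
# --executor-memory: memory allocated to each executor
# --total-executor-cores: total CPU cores across all executors
# trailing 10: argument passed to SparkPi (the number of slices to compute over)
bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://hadoop102:7077 \
  --executor-memory 1G \
  --total-executor-cores 2 \
  examples/jars/spark-examples_2.12-3.0.0.jar \
  10

On success, the driver output includes a line like "Pi is roughly 3.14...".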