
Commit ee571d7

yaooqinn authored and HyukjinKwon committed
[SPARK-22466][SPARK SUBMIT] export SPARK_CONF_DIR while conf is default
## What changes were proposed in this pull request?

We use SPARK_CONF_DIR to switch the Spark configuration directory, and it is visible to user code when we explicitly export it in spark-env.sh, but with the default settings it is not set at all. This PR exports SPARK_CONF_DIR when it falls back to the default location.

### Before

```
KentKentsMacBookPro  ~/Documents/spark-packages/spark-2.3.0-SNAPSHOT-bin-master  bin/spark-shell --master local
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/11/08 10:28:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/11/08 10:28:45 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
Spark context Web UI available at http://169.254.168.63:4041
Spark context available as 'sc' (master = local, app id = local-1510108125770).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0-SNAPSHOT
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_65)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sys.env.get("SPARK_CONF_DIR")
res0: Option[String] = None
```

### After

```
scala> sys.env.get("SPARK_CONF_DIR")
res0: Option[String] = Some(/Users/Kent/Documents/spark/conf)
```

## How was this patch tested?

vanzin

Author: Kent Yao <[email protected]>

Closes apache#19688 from yaooqinn/SPARK-22466.
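For context, a minimal sketch of how the two cases can be exercised from a shell. The `/etc/spark/conf` path is hypothetical; the expected outputs follow the Before/After transcripts above:

```sh
# Explicit override: worked before this patch and still works after it.
export SPARK_CONF_DIR=/etc/spark/conf   # hypothetical alternate conf dir
bin/spark-shell --master local
#   scala> sys.env.get("SPARK_CONF_DIR")
#   res0: Option[String] = Some(/etc/spark/conf)

# Default case, fixed by this patch: with SPARK_CONF_DIR unset,
# bin/load-spark-env.sh now exports the fallback ${SPARK_HOME}/conf,
# so the shell above sees Some(<SPARK_HOME>/conf) instead of None.
unset SPARK_CONF_DIR
bin/spark-shell --master local
```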
1 parent 6447d7b commit ee571d7

File tree

3 files changed, +10 -14 lines changed


bin/load-spark-env.cmd

+5 -7
```diff
@@ -19,15 +19,13 @@ rem
 
 rem This script loads spark-env.cmd if it exists, and ensures it is only loaded once.
 rem spark-env.cmd is loaded from SPARK_CONF_DIR if set, or within the current directory's
-rem conf/ subdirectory.
+rem conf\ subdirectory.
 
 if [%SPARK_ENV_LOADED%] == [] (
   set SPARK_ENV_LOADED=1
 
-  if not [%SPARK_CONF_DIR%] == [] (
-    set user_conf_dir=%SPARK_CONF_DIR%
-  ) else (
-    set user_conf_dir=..\conf
+  if [%SPARK_CONF_DIR%] == [] (
+    set SPARK_CONF_DIR=%~dp0..\conf
   )
 
   call :LoadSparkEnv
@@ -54,6 +52,6 @@ if [%SPARK_SCALA_VERSION%] == [] (
 exit /b 0
 
 :LoadSparkEnv
-if exist "%user_conf_dir%\spark-env.cmd" (
-  call "%user_conf_dir%\spark-env.cmd"
+if exist "%SPARK_CONF_DIR%\spark-env.cmd" (
+  call "%SPARK_CONF_DIR%\spark-env.cmd"
 )
```
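Two things change on the Windows side: the script now sets SPARK_CONF_DIR itself instead of the script-local `user_conf_dir`, and the fallback is `%~dp0..\conf` rather than the bare relative path `..\conf`. In batch, `%~dp0` expands to the drive and directory of the running script (with a trailing backslash), so the default now resolves to the `conf\` directory next to `bin\` no matter where the script is invoked from; the old relative default was resolved against the caller's current directory.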

bin/load-spark-env.sh

+3 -6
```diff
@@ -29,15 +29,12 @@ fi
 if [ -z "$SPARK_ENV_LOADED" ]; then
   export SPARK_ENV_LOADED=1
 
-  # Returns the parent of the directory this script lives in.
-  parent_dir="${SPARK_HOME}"
+  export SPARK_CONF_DIR="${SPARK_CONF_DIR:-"${SPARK_HOME}"/conf}"
 
-  user_conf_dir="${SPARK_CONF_DIR:-"$parent_dir"/conf}"
-
-  if [ -f "${user_conf_dir}/spark-env.sh" ]; then
+  if [ -f "${SPARK_CONF_DIR}/spark-env.sh" ]; then
     # Promote all variable declarations to environment (exported) variables
     set -a
-    . "${user_conf_dir}/spark-env.sh"
+    . "${SPARK_CONF_DIR}/spark-env.sh"
     set +a
   fi
 fi
```
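The new export relies on standard POSIX parameter expansion: `${SPARK_CONF_DIR:-default}` evaluates to the variable's value when it is set and non-empty, and to the default otherwise. A self-contained sketch of the same fallback logic (the `/opt/spark` path is assumed for illustration):

```sh
#!/bin/sh
# Mirrors the fallback in bin/load-spark-env.sh; illustrative only.
SPARK_HOME="/opt/spark"   # assumed install location for this sketch

# A set, non-empty SPARK_CONF_DIR wins; otherwise fall back to
# ${SPARK_HOME}/conf. Exporting makes the resolved value visible
# to child processes, which is the point of this patch.
export SPARK_CONF_DIR="${SPARK_CONF_DIR:-"${SPARK_HOME}"/conf}"

echo "Using conf dir: ${SPARK_CONF_DIR}"
#   ./sketch.sh                            -> Using conf dir: /opt/spark/conf
#   SPARK_CONF_DIR=/etc/spark ./sketch.sh  -> Using conf dir: /etc/spark
```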

conf/spark-env.sh.template

+2 -1
```diff
@@ -32,7 +32,8 @@
 # - SPARK_LOCAL_DIRS, storage directories to use on this node for shuffle and RDD data
 # - MESOS_NATIVE_JAVA_LIBRARY, to point to your libmesos.so if you use Mesos
 
-# Options read in YARN client mode
+# Options read in YARN client/cluster mode
+# - SPARK_CONF_DIR, Alternate conf dir. (Default: ${SPARK_HOME}/conf)
 # - HADOOP_CONF_DIR, to point Spark towards Hadoop configuration files
 # - YARN_CONF_DIR, to point Spark towards YARN configuration files when you use YARN
 # - SPARK_EXECUTOR_CORES, Number of cores for the executors (Default: 1).
```