From caf903392617396f17478115c59e46b8b8aa745a Mon Sep 17 00:00:00 2001
From: Chris Lu
Date: Mon, 28 Sep 2020 21:12:48 -0700
Subject: [PATCH] Updated run Spark on SeaweedFS (markdown)

---
 run-Spark-on-SeaweedFS.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/run-Spark-on-SeaweedFS.md b/run-Spark-on-SeaweedFS.md
index 4598409..f469ac3 100644
--- a/run-Spark-on-SeaweedFS.md
+++ b/run-Spark-on-SeaweedFS.md
@@ -15,8 +15,8 @@ Copy the seaweedfs-hadoop2-client-x.x.x.jar to all executor machines.
 Add the following to spark/conf/spark-defaults.conf on every node running Spark
 
 ```
-spark.driver.extraClassPath /path/to/seaweedfs-hadoop2-client-x.x.x.jar
-spark.executor.extraClassPath /path/to/seaweedfs-hadoop2-client-x.x.x.jar
+spark.driver.extraClassPath=/path/to/seaweedfs-hadoop2-client-1.4.8.jar
+spark.executor.extraClassPath=/path/to/seaweedfs-hadoop2-client-1.4.8.jar
 ```
 And modify the configuration at runntime:
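As a usage sketch (not part of the patch itself): the same classpath properties, and the runtime configuration the hunk's trailing context line refers to, can also be passed per job via `spark-submit --conf`. The filer address `localhost:8888` and the application file name below are assumptions for illustration.

```shell
# Sketch only: pass the SeaweedFS client jar and Hadoop filesystem settings
# at submit time instead of editing spark-defaults.conf on every node.
# Assumed: the filer listens on localhost:8888; your_app.py is a placeholder.
spark-submit \
  --conf spark.driver.extraClassPath=/path/to/seaweedfs-hadoop2-client-1.4.8.jar \
  --conf spark.executor.extraClassPath=/path/to/seaweedfs-hadoop2-client-1.4.8.jar \
  --conf spark.hadoop.fs.seaweedfs.impl=seaweed.hdfs.SeaweedFileSystem \
  --conf spark.hadoop.fs.defaultFS=seaweedfs://localhost:8888 \
  your_app.py
```

Note that `spark.hadoop.*` properties are forwarded by Spark into the Hadoop `Configuration`, which is how the SeaweedFS filesystem implementation gets picked up without touching `core-site.xml`.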