Updated run Spark on SeaweedFS (markdown)

Chris Lu 2020-10-18 14:19:18 -07:00
parent b4a6e86656
commit 4579555f4e

@@ -60,8 +60,9 @@ file:///usr/local/spark/examples/jars/spark-examples_2.12-3.0.0.jar
# A Full Example
-Here is my local example switching everything to SeaweedFS.
-1. this is my local spark-defaults.conf
+Here is my local example switching everything to SeaweedFS. The settings go in the `/usr/local/spark/conf/spark-defaults.conf` file.
+1. this is my local `/usr/local/spark/conf/spark-defaults.conf`:
```
spark.eventLog.enabled=true
spark.sql.hive.convertMetastoreOrc=true
@@ -87,7 +88,7 @@ spark.hadoop.fs.defaultFS=seaweedfs://localhost:8888
```
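As an aside (not part of the original walkthrough), any of these entries can also be supplied per session with `--conf` instead of editing `spark-defaults.conf`. The sketch below assumes `spark-shell` is on the PATH and that the SeaweedFS Hadoop client jar is already on Spark's classpath.
```
# Hypothetical one-off session pointing Spark at the SeaweedFS filer,
# equivalent to the spark.hadoop.fs.defaultFS entry in spark-defaults.conf.
$ spark-shell --conf spark.hadoop.fs.defaultFS=seaweedfs://localhost:8888
```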
2. create the spark history folder
```
-$ curl -X POST http://192.168.2.3:8888/spark2-history/
+$ curl -X POST http://192.168.2.4:8888/spark2-history/
```
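To double-check that the folder exists before starting any jobs, the filer can list it over the same HTTP endpoint. This is an added sanity check, not one of the original steps, and it assumes the filer's JSON directory listing (requested via the `Accept: application/json` header) is available at the address used above.
```
# Optional: ask the filer for a JSON listing of the newly created folder.
$ curl -H "Accept: application/json" http://192.168.2.4:8888/spark2-history/
```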
3. Run a spark shell
```