diff --git a/Hadoop-Benchmark.md b/Hadoop-Benchmark.md
index 5ab2e07..795c1f3 100644
--- a/Hadoop-Benchmark.md
+++ b/Hadoop-Benchmark.md
@@ -26,7 +26,7 @@ Then get the seaweedfs hadoop client jar.
```
cd share/hadoop/common/lib/
-wget https://oss.sonatype.org/service/local/repositories/releases/content/com/github/chrislusf/seaweedfs-hadoop2-client/1.4.8/seaweedfs-hadoop2-client-1.4.8.jar
+wget https://oss.sonatype.org/service/local/repositories/releases/content/com/github/chrislusf/seaweedfs-hadoop2-client/1.4.9/seaweedfs-hadoop2-client-1.4.9.jar
```
# TestDFSIO Benchmark
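For reference (not part of this change), a typical TestDFSIO run against SeaweedFS could look like the sketch below. The filer address `localhost:8888` and the use of generic `-D` options are assumptions; only `fs.seaweedfs.impl=seaweed.hdfs.SeaweedFileSystem` is taken from the configuration shown elsewhere in these pages.

```
# Illustrative sketch: run TestDFSIO through the SeaweedFS Hadoop client.
# Assumes a SeaweedFS filer at localhost:8888 (adjust to your deployment).
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-*-tests.jar \
  TestDFSIO \
  -D fs.defaultFS=seaweedfs://localhost:8888 \
  -D fs.seaweedfs.impl=seaweed.hdfs.SeaweedFileSystem \
  -write -nrFiles 10 -fileSize 128MB
```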
diff --git a/Hadoop-Compatible-File-System.md b/Hadoop-Compatible-File-System.md
index 682062b..e1f7582 100644
--- a/Hadoop-Compatible-File-System.md
+++ b/Hadoop-Compatible-File-System.md
@@ -23,7 +23,7 @@ Maven
<dependency>
  <groupId>com.github.chrislusf</groupId>
  <artifactId>seaweedfs-hadoop3-client</artifactId>
- <version>1.4.8</version>
+ <version>1.4.9</version>
</dependency>
or
@@ -31,16 +31,16 @@ or
<dependency>
  <groupId>com.github.chrislusf</groupId>
  <artifactId>seaweedfs-hadoop2-client</artifactId>
- <version>1.4.8</version>
+ <version>1.4.9</version>
</dependency>
```
Or you can download the latest version from Maven Central:
* https://mvnrepository.com/artifact/com.github.chrislusf/seaweedfs-hadoop2-client
- * [seaweedfs-hadoop2-client-1.4.8.jar](https://oss.sonatype.org/service/local/repositories/releases/content/com/github/chrislusf/seaweedfs-hadoop2-client/1.4.8/seaweedfs-hadoop2-client-1.4.8.jar)
+ * [seaweedfs-hadoop2-client-1.4.9.jar](https://oss.sonatype.org/service/local/repositories/releases/content/com/github/chrislusf/seaweedfs-hadoop2-client/1.4.9/seaweedfs-hadoop2-client-1.4.9.jar)
* https://mvnrepository.com/artifact/com.github.chrislusf/seaweedfs-hadoop3-client
- * [seaweedfs-hadoop3-client-1.4.8.jar](https://oss.sonatype.org/service/local/repositories/releases/content/com/github/chrislusf/seaweedfs-hadoop3-client/1.4.8/seaweedfs-hadoop3-client-1.4.8.jar)
+ * [seaweedfs-hadoop3-client-1.4.9.jar](https://oss.sonatype.org/service/local/repositories/releases/content/com/github/chrislusf/seaweedfs-hadoop3-client/1.4.9/seaweedfs-hadoop3-client-1.4.9.jar)
# Test SeaweedFS on Hadoop
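Not part of this change, but as a quick sanity check after dropping in the jar: the client is hooked into Hadoop by registering the `seaweedfs://` scheme in `core-site.xml`. The sketch below assumes a filer at `localhost:8888`; the `fs.seaweedfs.impl=seaweed.hdfs.SeaweedFileSystem` value matches the Spark page in this same change, the rest is illustrative.

```
<!-- Illustrative core-site.xml snippet; adjust the filer host/port to your deployment. -->
<configuration>
  <property>
    <name>fs.seaweedfs.impl</name>
    <value>seaweed.hdfs.SeaweedFileSystem</value>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>seaweedfs://localhost:8888</value>
  </property>
</configuration>
```

With that in place, `hdfs dfs -ls /` should list the filer's root directory through the SeaweedFS client.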
diff --git a/run-Spark-on-SeaweedFS.md b/run-Spark-on-SeaweedFS.md
index 00e99a2..6f8081c 100644
--- a/run-Spark-on-SeaweedFS.md
+++ b/run-Spark-on-SeaweedFS.md
@@ -11,12 +11,12 @@ To make these files visible to Spark, set HADOOP_CONF_DIR in $SPARK_HOME/conf/sp
## Installation not inheriting from Hadoop cluster configuration
-Copy the seaweedfs-hadoop2-client-1.4.8.jar to all executor machines.
+Copy the seaweedfs-hadoop2-client-1.4.9.jar to all executor machines.
Add the following to spark/conf/spark-defaults.conf on every node running Spark:
```
-spark.driver.extraClassPath=/path/to/seaweedfs-hadoop2-client-1.4.8.jar
-spark.executor.extraClassPath=/path/to/seaweedfs-hadoop2-client-1.4.8.jar
+spark.driver.extraClassPath=/path/to/seaweedfs-hadoop2-client-1.4.9.jar
+spark.executor.extraClassPath=/path/to/seaweedfs-hadoop2-client-1.4.9.jar
```
And modify the configuration at runtime:
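A minimal sketch of such a runtime override: SeaweedFS properties can be passed as `--conf` options prefixed with `spark.hadoop.`, which Spark forwards to the Hadoop configuration. The filer address below is an assumption; only the `fs.seaweedfs.impl` value comes from this page.

```
# Illustrative: pass the SeaweedFS settings per invocation instead of editing config files.
# Assumes a SeaweedFS filer at localhost:8888.
spark-shell \
  --conf spark.hadoop.fs.seaweedfs.impl=seaweed.hdfs.SeaweedFileSystem \
  --conf spark.hadoop.fs.defaultFS=seaweedfs://localhost:8888
```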
@@ -37,8 +37,8 @@ And modify the configuration at runtime:
1. Change spark-defaults.conf:
```
-spark.driver.extraClassPath=/Users/chris/go/src/github.com/chrislusf/seaweedfs/other/java/hdfs2/target/seaweedfs-hadoop2-client-1.4.8.jar
-spark.executor.extraClassPath=/Users/chris/go/src/github.com/chrislusf/seaweedfs/other/java/hdfs2/target/seaweedfs-hadoop2-client-1.4.8.jar
+spark.driver.extraClassPath=/Users/chris/go/src/github.com/chrislusf/seaweedfs/other/java/hdfs2/target/seaweedfs-hadoop2-client-1.4.9.jar
+spark.executor.extraClassPath=/Users/chris/go/src/github.com/chrislusf/seaweedfs/other/java/hdfs2/target/seaweedfs-hadoop2-client-1.4.9.jar
spark.hadoop.fs.seaweedfs.impl=seaweed.hdfs.SeaweedFileSystem
```
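As a smoke test (illustrative, not from the original page): once the jar is on the driver and executor classpaths and `fs.seaweedfs.impl` is set as above, any `seaweedfs://` URI should resolve through the SeaweedFS client. The filer address and path below are assumptions.

```
# Write and read a small dataset through SeaweedFS from the Spark shell.
spark-shell
scala> spark.range(10).write.parquet("seaweedfs://localhost:8888/test/spark-smoke")
scala> spark.read.parquet("seaweedfs://localhost:8888/test/spark-smoke").count()
```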