Updated Cache Remote Storage (markdown)

Chris Lu 2021-08-21 02:18:55 -07:00
parent ed25296bb3
commit 1333d60cf5

After [[Mount Remote Storage]], you can already read or write files in the mounted directory.
However, sometimes you may want a command to warm up the file content directly, instead of relying on the lazy cache, where each file is only cached when it is first read.
And sometimes you may want to `uncache` the file content to reduce local storage usage.
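
For context, this assumes the setup from [[Mount Remote Storage]] has already been done. As a minimal sketch in `weed shell`, with placeholder credentials and the same remote name and mount directory as the examples below:

```
# one-time setup: define the remote and mount a bucket (credentials are placeholders)
> remote.configure -name=s3_1 -type=s3 -s3.access_key=xxx -s3.secret_key=yyy
> remote.mount -dir=/xxx -remote=s3_1/bucket
```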

In `weed shell`, check out `remote.cache` and `remote.uncache`:

```
remote.mount -dir=/xxx -remote=s3_1/bucket
# after mount, run one of these commands to cache the content of the files
remote.cache -dir=/xxx
remote.cache -dir=/xxx/some/sub/dir
remote.cache -dir=/xxx/some/sub/dir -include=*.pdf
remote.cache -dir=/xxx/some/sub/dir -exclude=*.txt
remote.cache -maxSize=1024000 # cache files smaller than 1024000 bytes (~1MB)
remote.cache -maxAge=3600 # cache files less than 1 hour old
```

This is designed to run regularly, so you can add it to a cronjob.
If a file is already synchronized with the remote copy, the file will be skipped to avoid unnecessary copying.
The actual data copying goes through volume servers in parallel.
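
As a sketch of that cronjob idea: `weed shell` can read commands from stdin, so a crontab entry can pipe `remote.cache` in on a schedule. The schedule and directory here are illustrative, not prescriptive.

```
# hypothetical crontab entry: warm up the mounted directory at the top of every hour
0 * * * * echo "remote.cache -dir=/xxx" | weed shell
# if your setup requires the exclusive shell lock, wrap the command:
# 0 * * * * echo "lock; remote.cache -dir=/xxx; unlock" | weed shell
```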

```
> help remote.uncache
remote.uncache # keep the metadata but remove the cached file content for mounted directories or files
remote.uncache -dir=/xxx
remote.uncache -dir=/xxx/some/sub/dir
remote.uncache -dir=/xxx/some/sub/dir -include=*.pdf
remote.uncache -dir=/xxx/some/sub/dir -exclude=*.txt
remote.uncache -minSize=1024000 # uncache files larger than 1024000 bytes (~1MB)
remote.uncache -minAge=3600 # uncache files older than 1 hour
```

This is designed to run regularly, so you can add it to a cronjob.
If a file is not synchronized with the remote copy, the file will be skipped to avoid loss of data.
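
Combining the two, one plausible pattern is a nightly pass that warms up small, recent files and evicts large, stale ones. The thresholds and schedule below are illustrative, and it assumes `-dir` can be combined with the size/age filters:

```
# hypothetical nightly tiering pass, piped into weed shell from cron:
# keep small, fresh files cached locally; evict large files untouched for a day
0 2 * * * echo "remote.cache -dir=/xxx -maxSize=1024000 -maxAge=3600; remote.uncache -dir=/xxx -minSize=1024000 -minAge=86400" | weed shell
```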