From ad6738db12053375ab25a09efeebc85298a15f91 Mon Sep 17 00:00:00 2001
From: Chris Lu
Date: Fri, 13 Nov 2020 13:47:09 -0800
Subject: [PATCH] Updated Filer Change Data Capture (markdown)

---
 Filer-Change-Data-Capture.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/Filer-Change-Data-Capture.md b/Filer-Change-Data-Capture.md
index a72e5b1..1273959 100644
--- a/Filer-Change-Data-Capture.md
+++ b/Filer-Change-Data-Capture.md
@@ -38,4 +38,4 @@ This is basically stream processing or event processing for files. The possible
 * A **job queue**: upload files to a folder, and processing new files as soon as possible, and delete the processed files.
 * Do-it-yourself **Data Replication or Backup**.
 * **Batch processing**: streaming data is cool, but sometimes batching is more efficient. To combine streaming and batching, you can put one batch of new data as a file and trigger the batch processing on that file.
-* Folder size statistics.
+* Folder size statistics and monitoring.