The archiving feature copies all incoming logs into an S3 bucket that you own. As log data enters Logz.io's pipeline, it is batched, compressed, and copied to your bucket. This supports several use cases:
1. Extended retention
You can save data for as long as you need it with minimal cost. For example, you might need to search logs older than one year to find a security incident. You can use the data in the bucket to find those specific events. You can even re-ingest this data back to Logz.io by copying the relevant data set to a new S3 bucket and adding it as a "Log Shipping" bucket.
2. Backup
It's always good to have a copy of your data for a rainy day. You can set up a lifecycle policy in S3 that transitions older data to Glacier to lower storage costs.
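As a minimal sketch of the Glacier transition mentioned above, the following builds an S3 lifecycle configuration that moves objects to Glacier after 90 days. The bucket name, prefix, and 90-day threshold are placeholders, not values prescribed by Logz.io; applying the rule requires boto3 and AWS credentials.

```python
# Hypothetical lifecycle rule: transition archived log objects to Glacier
# after 90 days. The prefix "logzio/" and the day count are assumptions --
# adjust them to match how your archive bucket is laid out.
lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-logs-to-glacier",
            "Filter": {"Prefix": "logzio/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
        }
    ]
}

# To apply it (requires boto3 and valid AWS credentials, not run here):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="my-log-archive-bucket",          # placeholder bucket name
#     LifecycleConfiguration=lifecycle_config,
# )
```

Because Glacier retrievals take time and incur fees, choose a transition window longer than the period you are likely to re-ingest data from.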