Are there any stats including price? Say something like a TB of logs a day, retention for a year? Obviously the older they get, the less likely they are to ever be looked at.
The costs depend heavily on the shape of the ingested logs and the types of queries run over them.
For example, VictoriaLogs compresses typical logs from Kubernetes containers by up to 50 times, so 1 terabyte of raw Kubernetes logs needs only 20GB of persistent storage. A terabyte of HDD-based persistent storage costs about $40/month at Google Cloud, which means you can keep 50TB of Kubernetes logs in VictoriaLogs for $40/month. On top of that, you'll need to pay a few hundred bucks per month for the CPU and RAM required for efficient data ingestion and querying.
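To make the arithmetic concrete for the "1TB/day, one-year retention" scenario from the question, here is a back-of-the-envelope sketch. The 50x compression ratio and the ~$40/TB/month HDD price are the assumptions stated above; actual numbers will vary with log shape and cloud provider.

```python
# Rough storage-cost estimate for keeping Kubernetes logs in VictoriaLogs.
# Assumptions (from the discussion above, not guarantees):
#   - ~50x compression for typical Kubernetes container logs
#   - ~$40/month per TB of HDD-based persistent storage at Google Cloud
COMPRESSION_RATIO = 50          # raw bytes : stored bytes
HDD_PRICE_PER_TB_MONTH = 40.0   # USD per TB per month (approximate)

def monthly_storage_cost(raw_tb_per_day: float, retention_days: int) -> float:
    """Estimated monthly HDD cost to retain the given raw log volume."""
    raw_tb_retained = raw_tb_per_day * retention_days
    stored_tb = raw_tb_retained / COMPRESSION_RATIO
    return stored_tb * HDD_PRICE_PER_TB_MONTH

# Example: ingest 1 TB/day of raw logs, retain for a year.
cost = monthly_storage_cost(raw_tb_per_day=1.0, retention_days=365)
stored = 1.0 * 365 / COMPRESSION_RATIO
print(f"~{stored:.1f} TB on disk, ~${cost:.0f}/month for storage")
```

Note this covers storage only; CPU and RAM for ingestion and querying come on top, as mentioned above.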