They'd still need to query their stuff, I guess, so you'd need to throw in something to aggregate the logs and pull out the metrics they're tracking - which can totally be done in streaming for most metrics, without having to go through the logs every time. A rough sketch of what that looks like is below.
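A minimal sketch of that kind of streaming aggregation in Python, assuming a made-up one-JSON-event-per-line log format with hypothetical `status` and `latency_ms` fields - the point is just that metrics get updated as lines flow past and the raw logs are never kept:

```python
# Streaming aggregation sketch: update metrics per line, never retain raw logs.
# The log format and field names here are assumptions, not a real schema.
import sys
import json
from collections import Counter

status_counts = Counter()   # e.g. requests per HTTP status
latency_sum = 0.0           # running sum for mean latency
latency_n = 0

for line in sys.stdin:
    try:
        event = json.loads(line)      # assume one JSON event per line
    except json.JSONDecodeError:
        continue                      # skip malformed lines
    status_counts[event.get("status")] += 1
    if "latency_ms" in event:
        latency_sum += float(event["latency_ms"])
        latency_n += 1

# Only the aggregates survive; the raw stream is discarded as it goes by.
print(dict(status_counts))
if latency_n:
    print("mean latency (ms):", latency_sum / latency_n)
```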
With that amount of data, streaming and only saving aggregated data is the only sane way. At 1.2TB/hour there's a limit to how much historical data can be kept anyway, and we're talking roughly 30% utilization of a 10Gbps network interface, so it's beyond what a single machine can handle for most use cases.
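Quick back-of-envelope check on that 30% figure (assuming decimal TB):

```python
# 1.2 TB/hour -> bytes/s -> bits/s, compared against a 10 Gbit/s NIC.
tb_per_hour = 1.2
bytes_per_s = tb_per_hour * 1e12 / 3600      # ~333 MB/s
gbits_per_s = bytes_per_s * 8 / 1e9          # ~2.7 Gbit/s
print(f"{gbits_per_s:.1f} Gbit/s, {gbits_per_s / 10:.0%} of a 10 Gbit/s NIC")
```

That works out to about 2.7 Gbit/s sustained, so a bit under 30%, and that's before any replication or fan-out.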