
Redis streams have been a phenomenal addition to my toolbelt in designing realtime ETL/ELT pipelines. Before, I had to make do with a far more complicated Pub/Sub + job queue (Tasktiger) setup. That all became redundant thanks to Redis streams.
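For anyone curious what that replacement looks like in practice, here's a minimal sketch (not from the thread) of the producer + consumer-group pattern that can stand in for a Pub/Sub + job-queue combo. It assumes redis-py; the stream, group, and field names are made up for illustration.

    import redis

    r = redis.Redis()

    STREAM = "etl:events"   # hypothetical stream name
    GROUP = "etl-workers"   # hypothetical consumer group

    # Create the consumer group once; ignore the error if it already exists.
    try:
        r.xgroup_create(STREAM, GROUP, id="0", mkstream=True)
    except redis.exceptions.ResponseError:
        pass

    # Producer side: append an event to the stream.
    r.xadd(STREAM, {"source": "orders", "payload": '{"id": 42}'})

    # Consumer side: read new entries for this worker, process, then acknowledge.
    entries = r.xreadgroup(GROUP, "worker-1", {STREAM: ">"}, count=10, block=5000)
    for _stream, messages in entries:
        for msg_id, fields in messages:
            # ... transform/load the record here ...
            r.xack(STREAM, GROUP, msg_id)

Unlike plain Pub/Sub, entries stay in the stream and unacked messages can be claimed by another worker, which is what removes the need for a separate job queue.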

Thank you!

It would really be awesome if there were a built-in way to attach/spill/persist individual streams to external data volumes (older/busy streams could otherwise run out of memory) and have it support hot swapping.
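(No such spill-to-disk feature exists natively; streams live in memory like any other Redis key. The usual stopgap, sketched here with redis-py and a hypothetical stream name, is to cap stream length and archive older entries yourself.)

    import redis

    r = redis.Redis()

    # Cap the stream to roughly 100k entries on write; approximate
    # trimming ("~") is cheaper than an exact cap.
    r.xadd("etl:events", {"source": "orders"}, maxlen=100_000, approximate=True)

    # Or trim an existing stream explicitly.
    r.xtrim("etl:events", maxlen=100_000, approximate=True)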

> Btw this Kafka monster is a total of 2775 lines of code, including comments. 1731 lines of code without comments. In other systems this is the prelude in the copyright notice.

Hilarious and informative :D Upvoted!



Great to hear you're enjoying streams - you got my upvote :)


Do you mind sharing more about how you use Redis streams for ETL pipelines?


Happy to talk shop anytime, feel free to reach out.

In short - I like to have auditable dataflows in my pipelines. Streams are inherently timestamped and provide a great way to see how data changed over time. For one, if you have a bug or regression in the pipeline, you can precisely track down the impacted time window - no guessing needed.
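To make the time-window point concrete: stream entry IDs start with a millisecond timestamp, so XRANGE can pull exactly the entries written during a suspect window. A rough sketch with redis-py (stream name and dates are hypothetical):

    import datetime
    import redis

    r = redis.Redis()

    def to_id(dt: datetime.datetime) -> str:
        # Stream IDs look like "<ms-timestamp>-<seq>", so a bare ms value selects by time.
        return f"{int(dt.timestamp() * 1000)}-0"

    # Entries ingested while, say, a buggy deploy was live.
    lo = to_id(datetime.datetime(2023, 4, 1, 12, 0))
    hi = to_id(datetime.datetime(2023, 4, 1, 14, 30))

    affected = r.xrange("etl:events", min=lo, max=hi)
    for entry_id, fields in affected:
        ...  # inspect or re-process the impacted records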



