
For some purposes, awk+xargs can replace hours of work spent writing a tool to automate some process. It's my go-to for ops work that I don't expect to live very long and just needs to _happen_.
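The pattern usually looks something like this: awk filters a listing down to the items you care about, and xargs fans each one out to a command. A minimal sketch with made-up data and service names, where `echo` stands in for the real command (e.g. `systemctl restart`):

```shell
# Hypothetical service status listing (contents invented for the example).
cat > status.txt <<'EOF'
nginx    running
redis    failed
postgres running
worker   failed
EOF

# awk selects the names of failed services; xargs runs a command per name.
awk '$2 == "failed" { print $1 }' status.txt | xargs -n1 echo restarting
```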

Also, happy 1337 karma day :).



> awk+xargs can replace hours of work

Including machine hours of work.

Wasn't there a famous story of replacing a Hadoop cluster with an awk script (which was a couple orders of magnitude faster)?

Oh yes, there was: https://news.ycombinator.com/item?id=17135841


In fairness, it's xargs that provides the command parallelization, not awk, but I agree the two combined are a good match.
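A sketch of that division of labor: the upstream pipeline produces the work items, and xargs's `-P` flag (supported by GNU and BSD xargs, though not required by POSIX) runs them concurrently. The item names here are placeholders, and output order is nondeterministic under `-P`:

```shell
# Up to 4 concurrent processes; each {} is replaced with one work item.
printf '%s\n' job1 job2 job3 job4 job5 |
  xargs -P4 -I{} sh -c 'echo "done {}"'
```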


If one takes the idea of map reduce to be distilling a data set down to the subset that is relevant, I've used tons of simple tools to do that, and never Hadoop.

I think parsing logs to find pain areas or potential exploit/exfil is a map reduce job, for instance, and grep or awk can manage that just fine.
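That kind of log job might look like the sketch below, over a synthetic access log: the "map" step pulls out the status-code field, the "reduce" step counts occurrences per code. The field index (`$8` here) depends on your log format:

```shell
# Invented log lines for illustration.
cat > access.log <<'EOF'
10.0.0.1 - - [01/Jan/2024] "GET / HTTP/1.1" 200 512
10.0.0.2 - - [01/Jan/2024] "GET /admin HTTP/1.1" 403 128
10.0.0.1 - - [01/Jan/2024] "GET /x HTTP/1.1" 200 256
EOF

# map: extract status code; reduce: tally per code.
awk '{ count[$8]++ } END { for (c in count) print c, count[c] }' access.log
```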



