Hacker News

A couple tons spread across 400 million people, each with per-capita emissions of 5 tons per year, is in the noise. If we're at the point of hyper-optimizing, there are far more meaningful targets than pipe throughput.
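A quick back-of-envelope check makes the "in the noise" claim concrete (a sketch; the "couple tons" is taken here as 2 tons total, per the comment's own rough figure):

```python
# Rough per-capita impact of the claimed savings (illustrative numbers).
total_saved_tons = 2            # total emissions avoided per year ("a couple tons")
population = 400_000_000        # people the savings are spread across
per_capita_emissions = 5        # tons emitted per person per year

saved_per_person = total_saved_tons / population          # tons per person
fraction_of_footprint = saved_per_person / per_capita_emissions

print(f"{saved_per_person:.1e} tons/person, "
      f"{fraction_of_footprint:.1e} of each person's annual footprint")
```

That works out to about a billionth of each person's yearly footprint, i.e. genuinely in the noise.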


You are arguing against the concept of "division of labor".

You are a few logical layers removed, but fundamentally that is at the heart of this. It isn't just about what you think can or can't be leveraged. Reducing waste in a centralized fashion is excellent because it enables other waste to be reduced in a self-reinforcing cycle, as long as experts in each domain keep getting the benefits of the other experts' work. The chip experts make better instructions, so the library experts make better software libs; they add their 2% and now it is more than 4%. The application experts then get more than 4% extra throughput and can buy 4% fewer servers, or spend far less than that on optimizing, and add their own 2%, and now we are at more than 6%. The end users can run their business slightly better, and so on in a chain that spans all of society. Sometimes those gains are muted. Sometimes that speed turns into error checking, power saving, or more throughput, with everyone trying to do their best to do more with less.
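The reason each step ends up "more than" the simple sum is that the layers compound multiplicatively, not additively. A minimal sketch (the layer names and the 2% figure are illustrative, taken from the comment above):

```python
# Stacked percentage gains compound multiplicatively across layers.
layers = ["chip", "library", "application"]
gain_per_layer = 0.02  # each layer of experts contributes its own 2%

combined = 1.0
for layer in layers:
    combined *= 1 + gain_per_layer  # each gain applies on top of the others

improvement = combined - 1
print(f"combined improvement: {improvement:.4%}")
# 1.02**3 - 1 ≈ 6.12%, slightly more than the naive 2% + 2% + 2% = 6%
```

The gap over the naive sum is small with 2% steps, but it grows with the number of layers, which is the self-reinforcing cycle the comment describes.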


Absolutely, if your focus is saving emissions, don't optimize pipes. But if you optimize an interface people use, it's a good thing either way, right?




