std_throwaway's comments

Money spent = emissions generated (directly, or indirectly through further spending)

If you want to save CO2, then save as much money as you can in paper form. Don't spend, don't invest. Instead keep everything as cash. This also has indirect effects by slowing down the economy which reduces money flow for other people, too.


You can spend your money on digital goods, which have a very small environmental impact. This doesn't prevent other people from spending that money, but it is at least sustainable if everyone does it.


Indeed. We keep chasing our own tail. It's largely pointless.


"Responsibility" is an abstraction needed for enforcement of the law. It doesn't say much about how things actually are.


What is wrong with those facts? Did they keep copies of those files in other locations? Did they explain the unaccounted money later on?

EDIT: According to other comments that explain what's wrong, the "facts" are wrong and belong in the realm of "911truther" propaganda.


Money continued to disappear and in fact keeps disappearing, and the Pentagon just says “oops, we don’t know where it went, we’ll just permanently offset the books to balance them,” with no further accountability than that (across three administrations). They are basically untouchable even with records physically around; they do not need to resort to destruction of physical evidence so long as the military industrial complex is alive and well.


As mentioned elsewhere, the part of the Pentagon that was destroyed was not "the accounting wing", which is not even a concept that makes sense.

Pretty much every 9/11 truther cites "facts" but then has trouble when the facts turn out to be not true or are willful, limited misinterpretations of reality.


To my knowledge the hard part is writing the input for the software. While in text you have a bit of leeway to gloss over some details, with a computer program you must, indeed, write every last relevant detail down, and you must get it right. This is very heavy work.


But it's once-off work. Thereafter the computers can take over.


There is a big grey area between "not 100% correct" and "wrong and harmful". Like a faulty computer program that produces correct results in 99.999% of cases, a wrong mathematical statement can still yield useful results in practice. You just don't want to run into those pesky edge cases where it indeed is wrong.

Having a database of all mathematical proofs where all of them are checked for validity, however, is a very useful tool to actually find those edge cases where a proof is wrong and give future mathematicians a solid foundation.
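
A toy illustration of what such checking looks like (a hypothetical Lean 4 snippet, not taken from any actual proof database): the checker only accepts a statement once every step is spelled out, and an "almost right" claim simply fails to check.

    -- A correct statement with a complete, machine-checked proof.
    theorem add_comm_example (a b : Nat) : a + b = b + a :=
      Nat.add_comm a b

    -- An "almost right" claim is rejected outright; there is no 99.999%.
    -- theorem bogus (a : Nat) : a + 1 = a := rfl   -- does not type-check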

Using AI as an extension of human capabilities is a given to me. As with heavy machinery, we don't want to do the heavy lifting ourselves. We want to steer it properly to gain the most benefit.


> a wrong mathematical statement can still yield useful results in practice

Yes and no. For some basic conjectures this might be the case, but one of the greatest utilities of certain theorems is how they can be applied to completely new fields. If a theorem used in this manner is not actually true, then the proof is invalid -- and it might be the case that (several proofs down the line) the conjecture you've ended up proving is almost entirely invalid.

There's also the slippery slope of "Principle of Explosion"-type problems with accepting "mostly right" proofs -- it means you will be able to prove any given statement (even ones which are outright false).
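
For concreteness, the Principle of Explosion is easy to state in a proof assistant (a minimal Lean 4 sketch, assuming nothing beyond the core library): once a single contradiction is accepted, every proposition follows from it.

    -- From False, any proposition P can be derived.
    theorem explosion (P : Prop) (h : False) : P :=
      False.elim h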


It does not change your existing integers or compilation time. Only where you use the new types does the compiler have more work to do. What would be the alternative in those places? Manual code? Preprocessor macros? Assertions everywhere? A theorem prover?

To me the implementation seems to be a good solution given that you don't want to change the core language.
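
To make the comparison concrete, here is a minimal sketch (hypothetical code, not the implementation being discussed) of two of those alternatives: assertions repeated at every call site versus a wrapper type that carries the check with it, so only code that opts in pays any extra cost.

    #include <cassert>
    #include <cstdio>
    #include <limits>

    // Alternative 1: manual assertions everywhere an overflow could occur.
    int add_checked_manually(int a, int b) {
        assert(!(b > 0 && a > std::numeric_limits<int>::max() - b));
        assert(!(b < 0 && a < std::numeric_limits<int>::min() - b));
        return a + b;
    }

    // Alternative 2: a type that bundles the same check, leaving plain int
    // (and all the code that uses it) completely untouched.
    struct CheckedInt {
        int value;
        CheckedInt operator+(CheckedInt other) const {
            return CheckedInt{add_checked_manually(value, other.value)};
        }
    };

    int main() {
        CheckedInt a{1}, b{2};
        CheckedInt c = a + b;          // checked addition
        std::printf("%d\n", c.value);  // prints 3
        return 0;
    }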


> What would be the alternative in those places?

Fix the damn language.


As in, adding more features to C++? And then people cry about how complex C++ is.


Stop worrying about 9-bit bytes, logical-only right shift for signed integers, ones'-complement and sign-magnitude integers, etc., and define this stuff in the standard. The PDP-11 was a long time ago. C++20 is heading in the right direction by mandating two's complement. The next good step would be declaring signed overflow as wrapping (e.g. -fwrapv).
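
For illustration, a small example (my own sketch, not from the thread) of what that change would mean: today this addition is undefined behaviour in standard C++, while compiling with -fwrapv on GCC or Clang defines it to wrap in two's complement.

    #include <climits>
    #include <cstdio>

    int main() {
        int x = INT_MAX;
        // Undefined behaviour in standard C++; built with -fwrapv
        // (e.g. g++ -fwrapv) the result is defined to wrap to INT_MIN.
        int y = x + 1;
        std::printf("%d\n", y);
        return 0;
    }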


I'm 40 now and I know nothing.


I'm 21 and I know everything. Or at least I think I do...


You are already data. Energy is essentially bits. Matter is an arrangement of information. Biological life and your conscious mind are temporal phenomena that arise on the gradient of density.


But the term "under-dosed" suggests an error on the doctor's part. We can't accept that.


Yes, if the doctor admits to a mistake, they are open to getting sued.


Doctors shouldn't ever get sued. The idea of an "individual doctor" is such a 19th century concept. They're gears in a machine: the hospital. An ideal hospital would never get sued because it performs very well. Instead, hospitals try to never get sued but do not perform very well, because it's cheaper to avoid litigation than it is to improve performance. Improving performance could be incentivized by open transparency, pricing, and competition, but they have a local monopoly, so there's no point.

It's a ridiculous legacy system.


So then who should be responsible when a doctor messes up a surgery?


Mistakes are always going to happen in medicine. Who do you propose gets sued if not the doctor and if not the hospital?


"The devils you know vs. the devils you don't." It's been keeping C alive for decades. Now go is an old boy, too. Rust is the new kid that seasoned developers don't trust. It's got not enough features and yet too many already.

