that's exactly the point: the _lack of_ humans during its evolution is precisely what it has to do with us. a mushroom may be poisonous to the species it evolved around while at the same time not being poisonous to humans
consumer cards don't share dies with datacenter cards, but they do share dies with workstation cards (the former Quadro line); e.g. the GB202 die is used by both the RTX PRO 5000/6000 Blackwell and the RTX 5090
it's worth noting as well that the Windows API has been a thing since 1985, _before_ C was standardized; versions of C that old were barely typed
No. Where BCPL doesn't have types, C does have types. They're not very good types; we're not going to win any prizes in a language design class, they're basically all the machine integers wearing fancy dress. But they are types.
You know how in C you can write "int k" or "char v"? That's types. BCPL doesn't have that. Yes, that's in the original K&R book; no, you didn't need to wait until 1989 to have types in C, they were obligatory from the outset.
Always possible; I haven't seen any legit mining malware in a long time. Mostly because it's so easy to spot from the CPU usage, and because CPU-based mining, even on someone else's machine, isn't worth the trouble any more. But perhaps my circle/context is not that of a lot of people.
that's the altered part -- those more restrictive terms of the VVVVVV license only apply to the _assets_, and the license for the source code itself is far more liberal:
if your data's sequential, creating an iterator in C++ is as simple as returning a begin and end pointer, and it will be optimized away at any optimization level other than -O0
But iterating over pointers is, once again, riddled with undefined behavior at the corners. So you are replacing one source of undefined behavior with another.
Replacing undefined behavior at the program-level with undefined behavior written and tested as part of the standard library, usually vendored and distributed in concert with the compiler, seems like an obvious net-positive to me.
Of course a basic iteration between begin()/end() will never touch out-of-range elements, but neither will a valid increment between two integers. No need for iterators in that case either.
Say I want to do something fancy, like getting element number 11 from an array.
With an integer index I can pass 11, with random access iterators I can use begin() + 11.
Now my array only has five elements. So I check.
11 < 5? Comparison valid, successfully avoided a crash.
begin() + 11 < end()? How were the rules for pointer comparison again? Something about staying within an allocation, plus one past the end?
> something about within an allocation and one past the end?
Yeah, I forgot about that. So I agree there is some subtlety which is likely to catch beginners.
Your example could safely be:
if (std::distance(begin, end) > 11)
Another approach I would recommend is to write a `guarded_advance` which takes an integer and the end pointer.
Also note that the situation you are describing is still a little unusual, because the baseline assumption for iterators is that advancing by more than one increment takes linear time.
> but neither will valid increment between two integers. No need for iterators in that case either.
The purpose of an iterator is to abstract data structure access. The coordinate inside a complex data structure may not be representable by an integer.
It's sort of this. Although it would've been nice if the size of the IP address weren't fixed, so that one day we could add an optional segment on top and connect the whole Milky Way, or something.