Screw promotion. I just want a job that provides intrinsic motivation (meaningful, inspiring work; Flow), and pays enough for me to make ends meet and to save reasonably.
There are three problems:
- many companies pay like crap, so if (God forbid) you want to save some money, a promotion is required (the only way to increase benefits is to get promoted);
- meaningful work is a unicorn in its own right;
- most annoyingly, at an American corporation, a worker who is in their comfort zone and has been delivering consistently well in their role will inevitably be forced to "grow" and "develop their career", or else be labeled a "straggler".
Consistent excellence at a certain level is not "stagnation", it may just as well be deliberate stability. Infinite growth (or at least, infinite perturbation), in the personal context, is an unfathomable mania of American corporations.
Sorry for being a party pooper, but it didn't take me 5 minutes to find an integer overflow in this code (which I've never seen before), as of commit 2443ff581ccd.
The public function nsfb_set_geometry() takes "width" and "height" as "int" values. Assume those are positive. Then we pass them to nsfb->surface_rtns->geometry().
Assume our surface is implemented by "surface/ram.c"; thus the call is made to ram_set_geometry(). There we store the passed-in "int" params into fields of "nsfb" (also ints). Then we do an unchecked multiplication between signed integers (nsfb->width * nsfb->height). Not only can that overflow and yield a bogus result; if it does, the behavior is undefined.
It sounds like your concern is that if the caller of the public function nsfb_set_geometry() passes in certain arguments, they can invoke undefined behavior, because the function doesn't correctly handle the case where the desired framebuffer is larger than 256 mebibytes (assuming 32-bit ints), and it may suffer an integer arithmetic overflow.
That doesn't seem very surprising; nearly any function in C can be crashed by passing it invalid arguments. For example, printf((char *)37), free((char *)main), or memcpy(argv[0], argv[1], 10485760).
Perhaps your concern is that the 256-mebibyte limitation isn't documented? libnsfb in general has very little documentation; this is the only documentation I see for nsfb_set_geometry():
/** Alter the geometry of a surface
*
* @param nsfb The context to alter.
* @param width The new display width.
* @param height The new display height.
* @param format The desired surface format.
*/
int nsfb_set_geometry(nsfb_t *nsfb, int width, int height, enum nsfb_format_e format);
The README says:
API documentation
-----------------
Currently, there is none. However, the code is well commented and the
public API may be found in the "include" directory. The testcase sources
may also be of use in working out how to use it.
So I would say that, if you're concerned about documentation, there are much greater deficiencies in documentation than the documentation of this particular limitation.
Perhaps your concern is that 256 mebibytes is actually a reasonable size for a framebuffer, not an invalid argument? With the 32-bit formats all modern displays seem to use, that would be 8192 × 8192. That seems like a colorable argument; I've worked with some images larger than that since last millennium. But it still seems serviceable for most purposes.
With 16-bit ints, it could fail if the framebuffer was larger than 4096 bytes, which seems like a more serious problem, but I don't know if libnsfb can be built on 16-bit and 8-bit platforms.
Please note that you should ensure that overflow doesn't happen, not detect when it happens. Once you let it happen, it's undefined behavior.
But you don't need to check each operation to ensure that none of them overflow. If you know that b and c are supposed to be bounded between -10 and +10, for example, the above line can't overflow. So just check that your supposition holds. In most cases, that boils down to a check on the inputs at the entry of the function.
> Do you prove that every line of arithmetic in your program will not overflow
My point is the analysis takes time, training, and is easy to regress. In practice programs operate within a reasonable N and if you push the limits they will fail. Or the devs wait for a bug report, and then set a pessimistic limit on user input.
Also, undefined != crash. Your compiler has options for what should happen on signed overflow (e.g. GCC and Clang offer -fwrapv, -ftrapv, and -fsanitize=signed-integer-overflow).
I don't know why we are having this discussion. The question was whether I (and, I extrapolate, every programmer) should prove that not a single line can overflow, and the answer is yes, usually by proving that some variables are bounded.
Of course we can speculate if there are ways to mitigate the effects of, instead of avoiding, overflows, if they are free or come with performance penalties, and if the final result is a crash, a graceful exit, or a continued run as if nothing happened, and which is worse.
For the record, I think the answers are: there are, they're not free, and the latter is the worst.
AmigaOS, I believe from 2.0 onwards, could do that[0], Width and Height being 16-bit attributes of the struct for the requested screen.
I haven't seen an actual monitor able to display the whole thing at once, but that's fine, because you can make your screen gigantic yet set the video output to a much lower resolution.
Intuition will let you scroll through it by moving the mouse pointer past the edge.
I mean, the integer overflow is the least of your problems. If you try to create a 64000*64000 texture, most drivers are going to bark at you anyway in the best case.
There's this diagram, and there's David Graeber's book Bullshit Jobs.
"What you love" and "What you are good at" certainly have a non-empty intersection, but that's mostly a distinct set from "what you can be paid for". "What you are good at" and "What you can be paid for" also have a non-empty intersection, but that set is again (mostly) distinct from "what you love". In brief, you can enjoy work, but then it will pay shit, or you can make money, but you'll hate it.
The most interesting part however is the right hand side. "What you can be paid for" and "What the world needs" have a practically empty intersection. Regardless of both personal skill and drive, there is effectively zero money available for the sorest needs of society. (Public healthcare (including mental health), public education, public infrastructure, etc.)
The obvious question to ask about this purported "pick two" triad is, why must that be so?
- If one is good at their job, why does that imply that either they won't be paid well, or they'll hate it?
- If one enjoys their job, why does that imply they must be paid poorly or suck at it?
- If one is paid well, why does that imply they will be eaten alive by work or terrible at their job?
The assertions such diagrams make just don't stand up to scrutiny when viewed in reverse. They should stand up to symmetry, and clearly do not; the veneer of logic peels away, revealing the underlying issue: such diagrams serve only to dress up a cynical outlook.
Perhaps, I'll admit, there is presently a general shortage of opportunities to work for the public good. But I'm reluctant to give even an inch on that, because it lends itself to a cynical belief system that the statement alone does not imply: it does not follow that, if there is a shortage now, there always will be; nor that, if one wants such a job, they will never get it and had best give up early.
Don't let cynicism take you. It will take, and take, and take, and leave you only table scraps of joy.
> The obvious question to ask about this purported "pick two" triad is, why must that be so?
It is not a law of the universe, so the answer to your question is "it isn't necessarily". But even if it isn't always true, it's usually true. And thus it's a useful metric to keep in mind. Being lucky enough to get all three qualities in your job is rare, and you can't expect that it'll happen.
> The problem is that vendors of hardware [...] do not want the OS to have full control over the hardware
I agree. At least the first half of the presentation blames the sordid status quo on Linux, all the while it is actually the responsibility of the hardware vendors. Linux not being the boot loader, Linux not being the firmware, Linux not being the secure firmware, etc etc etc is all the fault of the hardware vendors. They keep everything closed; even on totally mainstream architectures. On x86, whatever runs in SMM, whatever initializes the RAM chips, etc is all highly guarded intellectual property. On the handful select boards where everything is open (Raptor Talos II?), or reverse engineered, you get LinuxBoot, Coreboot, ... Whoever owns the lowest levels of the architecture, dictates everything; for example where Linux may run.
> Meanwhile, companies like Apple who integrate everything can have full control
Yes. Conway's law. As long as your SoC "congeals" from parts from a bunch of vendors, your operating system (in the broad sense the presenter uses the term in) is going to be a hodge-podge too. At best, you will have formal interfaces / specifications between components, and open source code for each component, but the whole will still lack an overarching design.
Edited to add: systems are incredibly overcomplicated too; they're perverse. To me, they've lost all appeal. They're unapproachable. I wish I had started my professional career twenty years earlier, when C (leading up to C89) still closely matched the hardware. (But I would have had to be born twenty years earlier for that :/)
Edit#2: the suggestion to build our own hardware is completely impractical. That only makes the barrier to entry higher. (IIRC, Linus Torvalds at one point wrote that ARM64 in Linux wasn't getting many contributions because there were simply no ARM64 workstations and laptops for interested individuals to buy and play with.)
While I largely agree, I think this is inaccurate:
> Whoever owns the lowest levels of the architecture, dictates everything
I think in IT, the people who can create most complexity for others, while keeping things relatively simpler for themselves, can dictate. Because these people then can sell the expertise, since they "produce" it cheaper than everyone else.
Using HW barriers, or simply closed-sourcing things, happens to be quite an effective way to make things complex for others and simple for yourself. Another way is to create your own language, standard, or API. Yet another is a network barrier plus data ownership (aka SaaS).
My point is, it's possible to dictate on any level, not just the lowest.
Another area of the operating system that could be open and cooperative is network controllers: most have an offload engine of some kind, but you can't extend what it does or fix bugs in it.
> So what does all the IT optimization bring? Just more wealth for the owners [...] It is time people in IT got to understand this
I understand it alright, but I'm trapped. Closer to 50 than to 40, I've got a family to run. I could be interested in another profession, but our daily lives & savings would tank if I stopped working to learn another profession. Also, there's no other profession that I could realistically learn that would let me take nearly the same amount of money home every month. If someone lives alone, they can adjust their standard of living (-> downwards, of course); how do you do that for a family?
Furthermore, there is no switchover between "soulless software job for $$$" and "inspiring software job for $". There are only soulless jobs, only the $ varies. Work sucks absolutely everywhere; the only variable is compensation -- at best we can get one that "sucks less".
When I was a teenager, I could have never dreamt that programming would devolve into such a cruel daily grind for me. Mid-life crisis does change how we look at things, doesn't it. We want more meaning to our work (society has extremely decoupled livelihood from meaning), but there's just no way out. Responsibilities, real or imaginary, keep us trapped. I'd love to reboot my professional life, but the risks are extreme.
FWIW, I still appreciate interesting tasks at work; diving into the details lets me forget, at least for a while, how meaningless it all is.