Thank you for actually posting an example that people can look at; I think most other responders misunderstood the post as asking for more pointless anecdotes filled with superlatives and "trust me bro" sentiments.
People thought they were doubling their productivity, and then actual studies showed they were in fact slower. These kinds of claims have to be taken with entire quarries of salt at this point.
No, I wouldn't say it's super complex. I make custom 3D engines. It's just that you and I were probably never in any real competition anyway, because it's not super common to do what I do.
I will add that LLMs are very mediocre, bordering on bad, at any challenging or interesting 3D engine work. They're pretty decent at answering questions about surface-level API stuff and a bit about the APIs' structure, though. (Inexplicably, they're really shit at OpenGL, which is odd because it has way more code out there written in it than any other API.)
I really don't know how effective LLMs are at that, but it also puts you in an extremely narrow niche of development, so you should keep that in mind when making much more general claims about how useful they are.
My bigger point was that not everyone who is skeptical about supposed productivity gains and their veracity is in competition with you. I think any inference you made beyond that is a mistake on your part.
(I did do web development and distributed systems for quite some time, though, and I suspect that while LLMs are probably good at tutorial-level stuff in those areas, they fall apart quite fast once you leave the kiddy pool.)
P.S.:
I think it's very ironic that you say one should be careful not to speak in general terms about things that depend heavily on context, when you were clearly under the impression that all developers must see the same kind of (perceived) productivity gains you have.
You discount the value of being intimately familiar with every line of code, with the design decisions and the tradeoffs, precisely because one wrote the bloody thing oneself.
It is negative value for me to have a mediocre machine do that job: I still have to maintain the result, yet I've learned absolutely nothing from the experience of building it.
This to me seems like saying you can learn nothing from a book unless you yourself have written it. You can read the code the LLM writes the same way you can read the code your colleagues write. Moreover, you have to tell it pretty explicitly what to write for it to be very useful. You're still designing what it does; you just don't have to write every line.
"Reading is the creative center of a writer’s life.” — Stephen King, On Writing
You need to design the code in order to tell the LLM how to write it. The LLM can help with this, but generally it's better to have a full plan in place to give it beforehand. I've said it before elsewhere, but I think this argument will eventually sound like the people arguing you don't truly know how to code unless you're writing assembly for everything. I mean, sure, assembly is better/more efficient in every way, but who has the time to bother in a post-compiler world?
Good point! You should generate a website for them with "why ai is not good" articles. Have it explore all possible angles. Make it detective style story with appealing characters.
I would also take those studies with a grain of salt at this point, or at least take into consideration that results from a model even a few months old might differ significantly from what the current frontier models produce.
And in my personal experience it definitely helps with some tasks, and as someone who doesn't particularly enjoy the actual coding part, it also adds some joy to the job.
Recently I've also been using it to write design docs, which is another aspect of the job that I somewhat dreaded.
I think the bigger takeaway from those studies was that whatever productivity coefficient people were imagining back then was a figment of their imagination, so it's useful to carry that lesson forward. If people are saying they're twice as productive with LLMs, it's still likely that a large part of that is hyperbole, whatever model they're working with.
It's the psychology of it that's important, not the tool itself; people are very bad at understanding where they're spending their time and cannot accurately assess the rate at which they work because of it.
I like coming up with the system design and the low-level pseudocode, but actually translating it into the specific programming language and remembering the exact syntax and whatnot is something I find pretty uninspiring.
Same with design docs, more or less: translating my thoughts into proper, professional English adds a layer I don't really enjoy (since I'm not exactly great at it), as does stuff like formatting, generating a nice-looking diagram, etc.
Just today I wrote a pretty decent design doc that took me two hours instead of the usual week+ slog/procrastination, and it was actually fairly enjoyable.
As far as "programming skill" goes, writing "good and idiomatic" Python is pretty bottom of the barrel. I don't think the GP is all that off, most people who are famous for some programming-adjacent skill (or even programming) aren't good at programming.
>As far as "programming skill" goes, writing "good and idiomatic" Python is pretty bottom of the barrel.
Complete bullshit. Beginning programmers writing good and idiomatic Python isn't "bottom of the barrel", or did you think I was recommending his videos to seasoned pros with 20 years of experience to improve their coding?
Some people on this site need to check their arrogance and humble themselves a bit before opening their mouths.
YC companies have pretty much always been overhyped trivial bullshit. I'm not surprised it's even worse nowadays, but it's never been more than a dog and pony show for bullshit.
> To the point where I recommend people who are dabbling in GPU work grab a Mac (Apple Silicon often required) since it's such a better learning and experimentation environment.
I don't know, buying a ridiculously overpriced computer with the least relevant OS on it just to debug graphics code written in an API not usable anywhere else doesn't seem like a good idea to me.
For anyone who seriously does want to get into this stuff, just go with Windows (or Linux if you're tired of what Microsoft is turning Windows into; there you can still write Win32 applications and just use VK for your rendering, or even DX12 and have it be translated, though then you end up debugging VK code while writing DX12), learn DX12 or Vulkan, and use RenderDoc to help you out. It's not nearly as difficult as people make it seem.
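(If you go the Vulkan route, the first thing worth turning on is the Khronos validation layer, since that's where most of the hand-holding lives before you even reach for RenderDoc. Rough sketch of what that looks like at instance creation, assuming the Vulkan SDK is installed so the layer is actually present; the function name is just for illustration:)

    #include <vulkan/vulkan.h>

    /* Sketch: create an instance with the Khronos validation layer enabled.
       Assumes the Vulkan SDK is installed so the layer can be loaded. */
    VkInstance create_instance_with_validation(void)
    {
        const char *layers[] = { "VK_LAYER_KHRONOS_validation" };

        VkApplicationInfo app = {0};
        app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        app.apiVersion = VK_API_VERSION_1_3;

        VkInstanceCreateInfo info = {0};
        info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        info.pApplicationInfo = &app;
        info.enabledLayerCount = 1;
        info.ppEnabledLayerNames = layers;

        VkInstance instance = VK_NULL_HANDLE;
        if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS)
            return VK_NULL_HANDLE; /* layer missing or loader problem */
        return instance;
    }

With that enabled, the validation layer will complain loudly about most API misuse, which covers a lot of the day-to-day debugging on its own.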
If you've got time you can learn OpenGL (4.6) with DSA to get a bit of perspective on why people might feel the lower-level APIs are tedious, but if you just want to get into graphics programming, learn DX12/VK and move on. It's a lower-level endeavor, and that will help you in the long run anyway since you get more control, better validation, and the drivers have less of a say in how things happen (trust me, you don't want the driver vendors deciding how things happen, especially Intel).
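(To make the DSA point concrete, here's a rough sketch of creating a texture both ways, assuming a current GL 4.5+ context and a loader like glad are already set up; w, h and pixels are placeholders:)

    #include <glad/glad.h> /* assuming glad, or any GL 4.5+ loader */

    /* Classic bind-to-edit style: you mutate whatever is currently bound. */
    GLuint make_texture_classic(int w, int h, const void *pixels)
    {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        return tex;
    }

    /* DSA style (GL 4.5+): the object is named directly, no binding involved. */
    GLuint make_texture_dsa(int w, int h, const void *pixels)
    {
        GLuint tex;
        glCreateTextures(GL_TEXTURE_2D, 1, &tex);
        glTextureParameteri(tex, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTextureStorage2D(tex, 1, GL_RGBA8, w, h); /* immutable storage */
        glTextureSubImage2D(tex, 0, 0, 0, w, h,
                            GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        return tex;
    }

Once you've seen why DSA exists, the explicit object model in VK/DX12 feels a lot less alien.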
P.S.: I like Metal as an API; I think it's the closest any modern API has come to OpenGL while still being acceptable in other ways (though I think its API validation is pretty meh). The problem is really that Apple never made it available outside their own platforms, so it's useless on the platforms that actually matter for games and real interactive graphics experiences.
> I'm having to pick up some perl now, and while I don't interact with the community, it surely _feels_ like it was written by wizards, for wizards. Obscure, non-intuitive oneliners, syntax that feels like it was intentionally written to be complicated, and a few other things that feel impossible to understand without reading the docs.
Perl 5 is to me a classic scripting language (as opposed to an actual programming language), for both good and bad. I've always viewed Perl scripts with exactly that perspective and I find them fine/good. In contrast, I find Python to be a mediocre scripting language, an okay-ish programming language from a syntax perspective and a bottom-5 programming language in pretty much every other regard.
That's interesting; I feel like it's the opposite: what used to be great work is basically unfathomable today, and what used to be regular productivity is seen as almost superhuman. People get almost nothing done nowadays, and I've never felt like expectations were really at the level they ought to be, especially given how much money people are getting.
> Good at self-promotion == just good in most cases for most practical purposes whether it's factual or not, arguably.
This does not seem true to me. Most popular programming YouTubers are demonstrably great at self-promotion but tend to be mediocre or bad programmers who know very little, even about the topics they talk about.
If anything we have plenty of examples of where being good at self-promotion correlates inversely with actual skill and knowledge.
With that said, I wouldn't classify Brendan Gregg as being good at self-promotion.
In terms of their compensation, though, it functionally doesn't really matter, and that's somewhat true for being a professional as well; it's usually only important how many people think you're good enough. A job is often as political as it is technical, if not more so.
While I understand that people might downvote the parent post because it seems in bad taste and touches on a culturally sensitive thing, haven't we all wondered this? Why is it that the poor give relatively more generously than the rich?
It's such an interesting phenomenon that so many ultra-rich people are essentially just hoarding wealth beyond what they could reasonably use across multiple generations. Worse, some of them simply cannot seem to get enough and will literally commit crimes and/or do indisputably morally wrong things to get even more.
I would personally never ask anyone this, and I wouldn't expect anyone who could answer it to actually answer it, but I think what komali2 asked is one of the most interesting questions out there.
I think it might be because I'm autistic but can you help me understand why it's in bad taste to ask it? I see YouTube videos of people talking about how they became really wealthy or showing off their houses or cars, and this person was talking about his bank account directly and has mentioned the 3 comma thing before, so I'm a bit confused why it's not ok to ask more about it.
You did mention something I hadn't thought of, which is lifetimes: I guess if someone wanted to guarantee an ultra-wealthy lifestyle for all generations of their kids and grandkids forever, that would be a reason to hoard wealth into the hundreds of millions.
I had this issue a few years ago with certain applications and came to find out it was specifically because they used GTK. I googled for it and found the fix, and after applying it the same apps started practically instantly. Could this be what you're running into?
(I haven't used ghostty so I wouldn't know whether it's actually fast to start up, but what you wrote reminded me about this particular issue.)