Hacker News | swatcoder's comments

That's not true, as mainstream people became aware when the Zizian murders made the news.

People in crisis are finding their existential grounding in these emerging belief systems about "the simulation" and other scifi-inspired ideas, are taking them exactly as seriously as novel belief systems from other eras, and are assembling into communities that collectively reinforce their beliefs.

I don't think you can really prevent that from happening, as we see it throughout history and geography, sometimes just frothing up in small communities over a generation or two and sometimes sweeping across whole societies in a profound way. It's ultimately common for "speculating existence" to be both casual parlour talk for some people and an emerging dogmatic belief for others. And that's exactly where we're at with this stuff now.


Excellent points, and I believe we all live and have always lived with poorly examined belief systems. But I think the cynical assumption that folks have a slot for magical fantasies that they'll drive hard during crises is unfair. Not untrue, but unfair. It tends to ignore the amount of effort that went into maintaining and enshrining these intricate ontologies of the past. It tends to blow past how much of a vehicle they were for the other values of the societies they reigned in. It underestimates the social function of having a common perspective play out, especially during times of crisis. And most of all, it forgets that for a lot of people, there wasn't much evidence to the contrary. I'm not saying that you need modern science to see the magical aspect of some past system of beliefs, but they were not exactly like the religious subjects of the modern age.

I suppose neither you nor GP claims that most people have this "slot for magical thinking", but I find that the properties of religious thinking I listed above can be fulfilled by perspectives that aren't necessarily magical. And I dare say such perspectives have supplanted them in general. I guess I'm replying to the adage "everyone has some religion." My reply being, "are you sure that's a religion?" Forgive the hazy brain rant.


It certainly does feel like all the right ingredients are there for a new, never-before-seen religion to emerge out of America, from the elements you describe and in the way you describe it, right about now.


> backed by a SaaS service and VC money

While it's valid to distinguish this from "FOSS volunteers working for fun" in a narrow sense, I hope most here recognize by now that this is a very big red flag in exactly the same way.

A highly ambitious business soliciting VC funds will not be prioritizing the stable, long-tail support for the boring little users that most of us represent.

By necessity, in their best times, they'll be chasing rocket launch opportunities that may rapidly pivot strategy, and in their worst times, they'll find themselves hunting for costly efforts to prune.

The former invites radical rearchitecture with deprecations and breaking changes, and the latter is how things just become dusty and abandoned (or -- if a backing service was involved -- wholly inoperable).

If you want your original code to hold up for 3 and 5 and 10 years with zero/light maintenance so you can focus on emerging opportunities of your own, rather than endless maintenance churn (it's reasonable that you might not need this), these "SaaS businesses with VC money" need to be seen as the pied piper luring you into the dark wood.


Pretty much everything with strict resource requirements or other reasons for aggressive resource use optimization: embedded, automotive, industrial control systems, games, operating systems, audio and other (soft) realtime media applications, HFT, tools and frameworks for all these things, etc.

But you'll note that most of these things are very sensitive to quality of work. That is not something that people find easily on freelancing platforms like Upwork, and so people seeking contract help for these things don't post their work there.

Instead, buyers generally reach out through established connections ("word of mouth", past coworkers, approved vendor lists) or fall back to hunting through Google and LinkedIn for firms/individuals that look impressive, hoping that the impression holds up during a requirements gathering, bidding, and estimation process.

Given where it sounds like you are in your career, your best bet might be to find work inside a contracting firm that can feed you gigs (they'll do sales, they may do project management, they may staff a team alongside you, they'll take a large cut). If you make good impressions on the clients and colleagues you work with there, they will become the personal connections that feed you independent work some years down the road.

Without taking the time for that, it can be pretty hard to break in, and much of the work you may be able to find will be miserable table scraps for poor clients who barely know what they're doing themselves.


So reasonable freelancing is for noob Python, JS, C# guys. I understood your thoughts.


There is a huge difference between freelancing and contracting.

With freelancing, you go to sites like Upwork and seek out work.

With contracting you engage in professional networking activities, you identify the organizations that might need your C/C++ skills and then pitch to them. You build your reputation and portfolio of successfully completed projects.

Obviously, contracting requires more effort, but it also pays far better. For most of my career I actually had paid work for 5-7 months a year. Much of that time involved working 60-80 hour weeks with quite a bit of travel on top of that. During the "bench time" I would chill and network.


While it's an encouraging development that many consumers seem to be holding Tesla responsible for its CEO's overt political actions, which seems like fair tit for tat, Tesla is more than just that one guy.

People working for them -- for the engineering challenges, for the ecological vision, for the opportunity to secure a living for their family, for whatever reason -- deserve the opportunity to weather the storm for a while.

It may soon be in the company's interest to boot Musk, but that doesn't mean that all the people invested in their vision should be abandoning ship just yet. Thousands of people have done good, earnest work there and needn't be held to immediate account for one person's controversial behavior.

Choosing to buy a different brand of car is a whole different thing than choosing to quit a job and uproot one's life. Give it a minute, maybe.


The psychology is getting weird in a Girardian way. The perception of Elon's power is reaching supernatural levels. He controls the government and the stock market, the future of technology, etc.

So Tesla is becoming a voodoo doll for accessing Elon.


While a single talented developer can conceptually complete a whole project (of some kind) on their own, the concrete reality is that they're often being tasked to do so in the context of some time and resource bounded opportunity.

There's only so much code they can write per unit time, only so many designs they can consider, only so many meetings they can attend, only so many demonstrations they can perform, only so many regressions they can debug, and really, only so many domains they can master.

Solo projects written by excellent engineers can be stunning works of craft. Many of us prefer to work that way, and accept the compromises of scale or time that are associated with it.

But most projects that you're familiar with need a team to produce them in a way that meets their real-world time and resource requirements. That's where the sports analogy comes in.

(And the same is true for the blacksmith and tailor. One master blacksmith or tailor might do stunning work, but they can't outfit an army or dress a court ball on their own. They need support, and that support often needs to be of a different level of mastery than their own, if for no other reason than to facilitate needed coordination and deference.)


Nobody is seriously claiming that all work on earth can be done by individuals. Obviously the master blacksmith who designs and leads the outfitting of the entire king's army with superior weaponry thanks to his personal leadership and expertise and care ... is not hammering every sword himself. But to call it a team sport is also highly misleading, and to say that he hasn't been a 10x is also just flat-out wrong.


There's truth to what you're saying, but just as in sports teams, you ultimately need a certain number of players to play the game and exceptional people characteristically have an ego that only allows so many of them in the locker room. If you have too many, they get starved for the individual recognition and validation they're used to receiving, leading to crises and clashes and quittings.

Unless your project's scale is reasonably small and focused -- representing the equivalent of a true solo or duo sport like tennis -- you need committed, professional "normal" team members to flesh out the team or you'll just never have enough resources to get done everything that needs to get done.


Comforting yet false assertions. Great engineers tire of working with “normal” engineers, they want to work with others who they respect. A team of only great engineers can have a completely different culture than a team of “normal engineers”. Teams of great engineers are magnets that attract others. Building a great team weirdly does not become harder over time, as your project is derisked you get access to larger and larger pools of talent. Talent concentration pays an enormous dividend and is a worthy investment. Maybe things change when you hit like Facebook/google scale, but at that point… we’ve won anyway, wouldn’t be arguing online anymore.


Even before you hit big scale, there's a lot of boring work that great engineers won't want to do. And what really makes someone a great engineer is the ability to transform a hard problem into something regular engineers can handle the rest of. So I agree that 10x engineers are real and it's often 2 out of 12, but all-star teams don't work, which is why those people often get moved to run new teams/projects instead.


The reason people aren't motivated to do boring work is a poor culture of ownership; it has nothing to do with skill or "10x" stuff. Having a team of only great people allows a much deeper culture of ownership, and it's much easier to get people to work on the boring stuff. All-star teams absolutely work and are the best way to work.


Uhh, I think it's because boring stuff is boring. If you start to do repetitive plumbing, you aren't much "greater" than the ones working assembly lines, no?

People who describe themselves as "great" feel such work is beneath them and know there are only so many hours on earth. Easier to pay a grunt to do that instead. And hence power dynamics are established.

That ego alone is why a team of only "great" engineers is bound to fail. You have a bunch of strong but negative polarity magnets trying to stick together. It simply won't be allowed.


Every team doesn't need to have "great" engineers; I don't want "clever" solutions to my bog standard business application, just people to write sane, clean and maintainable code.


This might be a definitions issue, but my assertion is not that a "great engineer" is someone who can complete leetcode hards in 15 minutes for 8 hours in a row without stopping. My assertion is that about 1 in 5 people have 5-10x the business impact of the median software developer, and if you are recruiting or managing a team you should have the goal of having your team be entirely composed of these top quintile folks. The article specifically says that you should not have this goal, and I extremely strongly disagree with that assertion in the article.


Some years ago, Google published a paper whose conclusion was that high-trust teams were the most productive - not the ones with the 10x developers. This obsession with the "great man" theory as applied to software is harmful to software engineering.


Computers are all about synergy and understanding the hardware you're working with. No one algorithm will work optimally on every chip. One of the harder problems is in fact getting optimal hardware use by coordinating multiple threads, chips, or entire clusters to work on the same problem together.

It's a shame some can't apply such metaphors to humans, and instead think "no, sure, there are single processors that outperform entire distributed networks, but we're different".
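For what it's worth, the "coordinate many workers on one problem" idea above is easy to sketch. This is just an illustrative Python toy (not from the thread): one summation split across a thread pool, with the partial results combined at the end.

```python
# Illustrative sketch: split one problem (a big sum) across several workers,
# then combine their partial results. The chunking scheme is arbitrary.
from concurrent.futures import ThreadPoolExecutor

data = list(range(1_000_000))
chunks = [data[i::4] for i in range(4)]  # four interleaved slices covering all elements

with ThreadPoolExecutor(max_workers=4) as pool:
    partial_sums = list(pool.map(sum, chunks))  # each worker sums its slice

total = sum(partial_sums)  # combine: same answer as a single-threaded sum
```

For CPU-bound Python work a process pool would be the more realistic choice; the coordination pattern (partition, fan out, combine) is the same either way.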


You don't need a former fighter jet pilot to fly an airliner, but a former fighter jet pilot would probably do a very good job flying an airliner.

The guy who can roll his own probably also has a better idea than most what easy off the shelf solutions are out there.


A great engineer isn't the one writing the most "brilliant" code; it's the one who understands the problem, picks the simplest solution that works, and makes life easier for the next person who touches it.


In my experience, the person you're describing is hardly ever the one perceived as having "5-10x business impact". Specifically, "making life easier for the next person who touches it" is seen as an unproductive use of company time.

Which is why I have learned to stay away from people who use that metric.


Yeah, “business impact” is measured entirely on a short-term basis.

In 3-4 years when you need to make a drastic change, that’s when the actual business impact comes into play, but this is never measured.

In my experience with some groups, waiting another 2-3 months to do things correctly would have saved years in future work.


> Comforting yet false assertions. Great engineers tire of working with “normal” engineers, they want to work with others who they respect

What a great environment to train juniors. Does not sound toxic at all.


There are great and normal juniors as well.

Junior does not mean incompetent, it means inexperienced.


Wanting to work with ambitious people that match your level is not toxic.


No, what's toxic is building an environment like 99% of companies where juniors are told that everybody is the same and there's no point doing anything other than copy pasting whatever dogshit the "senior" next to them is typing into VSCode.


No, the field grew tremendously and you can see a clear generational bias -- by years of experience, not age -- where the cohort from the last 10-15 years has a completely different understanding of what the craft is [software engineering vs business development] and how to approach it [optimal solution vs soonest deliverable].

You can also trace personal backgrounds and you'll see a much higher representation in the newer cohort coming from upper middle class backgrounds with families in careers like finance, consulting, medicine/dentistry whereas more in the older cohort came from more modest middle class backgrounds in engineering, academia, or even working class trades.

Of course, there were always some of all of these people in the industry, but the balance shifted dramatically during the last couple of booms, tracking the atypically high compensation standards set by the FAANGs since 2010 or so.


This is what happens anytime a field gets large in terms of job applications. Replace software engineer with anything else to that measure and you see the same thing, with wealthy families being overrepresented in the cohort because they always have an edge in getting the best credentials: not having to work any part-time jobs, and having mom and dad (or even a paid advisor) actively working on their behalf to vet potential internships and other opportunities. You are essentially outnumbered 3:1 or even 4:1 or more, and you can't compete at full capacity anyhow due to the aforementioned other obligations life has saddled on you.


I doubt that has anything to do with getting "large in terms of job applications" alone. It's a correlation, alright, because wealthy families have it easier to get high-status and high-paying jobs for their kids, and if such a field grows, wealthy people flock to it like everyone else. But I sincerely doubt you'll find the wealthy over-represented in physical labor / blue collar jobs, regardless of the ups and downs in the labor market for those occupations.

The way I see it, it's like 'swatcoder and 'lovich said upthread: the field became a money printer, and attracted - not revealed, attracted - a different kind of people, with a different mindset. I too saw this change happening. Applicant pool size? That's a spurious correlation - it's just driven by the same factors that make software industry a money printer.


> It's a correlation, alright, because wealthy families have it easier to get high-status and high-paying jobs for their kids

You have just written down a partial solution to this dilemma: make these high-paying jobs less attractive in terms of status for these wealthy families. :-)


I completely agree. Software engineering is just the most recent field I can think of (unless you consider data science a distinct enough portion of software to carve off as a separate field) that has had this pattern occur.

Well, that and the fact that this is a forum for a lot of tech people, which means a good number of software engineers are here.


Speaking as an old school basement nerd (coding since middle school, 90’s): If I can do cool things with code _and_ get paid, I’m gonna go do that. Business constraints make it feel much more interesting than writing code in a vacuum.

Also money is nice.


> Business constraints make it feel much more interesting than writing code in a vacuum.

You never write code in a vacuum do you? You always have some kind of goal.


I don’t know dude, this one time I wrote a LOLCODE compiler into a Babel macro.

https://swizec.com/blog/lolcodetojavascript-compiler-babel-m...

It was pretty fun.

Also this other time I wrote a nodejs script to keep my computer at a specific temperature because our office fridge kept freezing my carrots.

https://swizec.com/blog/i-built-a-node-app-to-thaw-my-favori...


That's quite nice! You may wish to look into implementing a PID controller, so as to avoid overshoot (your carrots become too thawed initially) and unnecessary oscillation about the setpoint (meaning you are wasting energy on cooling and heating cycles that in the end cancel each other out, where you could have kept the temperature nearly constant during that time). I loved juicing carrots so much my face turned orange from the beta-carotene.
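A minimal PID sketch, in Python for illustration. The gains, setpoint, and the toy "fridge shelf" model below are all made-up assumptions, not anything from the linked post; the point is just how the proportional, integral, and derivative terms combine to settle on a temperature without endless overshoot.

```python
# Minimal PID controller sketch (illustrative gains and plant model).
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # Positive output means "apply heat", negative means "let it cool".
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy simulation: a shelf that drifts toward -2 °C unless heated; target 4 °C.
pid = PID(kp=2.0, ki=0.1, kd=0.5, setpoint=4.0)
temp = -2.0
for _ in range(200):
    power = pid.update(temp, dt=1.0)
    temp += 0.05 * power - 0.02 * (temp - (-2.0))  # heater effect + fridge pull
```

The integral term is what holds the output at the steady heating level the fridge demands; the derivative term damps the oscillation around the setpoint.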


> because our office fridge kept freezing my carrots.

That sounds like a goal to me


I should hope not. Even if your body could withstand the low pressure, you'd suffocate very quickly.


> No, the field grew tremendously and you can see a clear generational bias -- by years of experience, not age -- where the cohort from the last 10-15 years has a completely different understanding of what the craft is

I bet a lot of people 10-15 years older than you would say the same thing - except they'd say it about you and your generation.

I'm not that old, but I've been around long enough to hear people of every age over about 30 claim that everything was better back in their day until the new generation came along and ruined it.


> I bet a lot of people 10-15 years older than you would say the same thing - except they'd say it about you and your generation.

And they’d probably be right!

I remember the grognards giving me shit about memory management and me giving it right back by explaining that what they considered a large chunk of memory would be worth pennies next year because of Moore's law, and that I wasn't going to waste time considering something that I literally couldn't learn faster than it became obsolete knowledge.

Quantitative differences can create qualitative differences and I don’t think it’s surprising that we’re in a different age of software engineering than we were 10-15 years ago for any given X year


>I remember the grognards giving me shit about memory management and me giving it right back by explaining that what they considered a large chunk of memory would be worth pennies next year because of Moore's law, and that I wasn't going to waste time considering something that I literally couldn't learn faster than it became obsolete knowledge.

And that's why all applications are laggy as shit these days.


You both are right.

But ignore memory at your peril. I have one project with a 256GB instance for a fairly boring CRUD app. I am asking a lot of questions, as apparently we are having the yearly 'we need more memory' conversation. Questions that are leading to speedups, just by using less memory. At the bottom of that stack is an L1 cache with less than a hundred KB. It doesn't matter right up until it does.

I have seen huge classes with 300+ string fields that needed maybe 10 of them. They threw it all in 'just because there is enough'. Yet something has to fill in those fields. Something has to generate all the code for those fields. The memory managers have to keep track of all of that junk. Oh, and all of that is in a pipeline of a cascade of applications, so that 300+ field class is copied 10 times. Plus the cost to keep it on disk and shove it through the network.
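The 300-field point is cheap to demonstrate. Here's a rough Python illustration (the field names and counts are hypothetical stand-ins, not the actual system described above) comparing a catch-all record against one trimmed to what a single pipeline stage actually uses:

```python
# Illustrative comparison: a "grab everything" record vs. a record trimmed
# to the handful of fields one stage needs. Field names are made up.
import sys

def fat_record():
    # Stand-in for a record carrying hundreds of mostly-unused string fields.
    return {f"field_{i}": "" for i in range(300)}

class TrimmedRecord:
    __slots__ = ("order_id", "customer_id", "amount")  # only what this stage uses
    def __init__(self, order_id, customer_id, amount):
        self.order_id = order_id
        self.customer_id = customer_id
        self.amount = amount

fat = fat_record()
slim = TrimmedRecord(1, 2, 3.0)

fat_bytes = sys.getsizeof(fat)    # container overhead alone is kilobytes
slim_bytes = sys.getsizeof(slim)  # a slotted object is tens of bytes
```

And since the thread's point is that the record gets copied at every pipeline stage, whatever the per-record difference is, multiply it by ten copies, plus disk and network.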


On the other hand, I've seen developers who don't know about things like that start up a project on a small instance and wonder why everything is running at turtle speed.

People that stopped running tests because they were configured to make 10,000 API calls in one minute and it crippled the app until everything was restarted.

"Add some more memory to your database instance....poof"


I definitely agree with the greybeards, and I think we see the results of not listening to them. We have these processors, buses, networks, and all sorts of things that are magnitudes faster and more powerful than what they began on, but many things are quite slow today. Worse, it seems to be getting slower. There is a lot of value in learning about things like caching and memory management. A lot of monetary value.

It's amazing to me that these days your average undergraduate isn't coming out of a computer science degree well versed and comfortable writing parallelized code, given that is how the hardware has moved. It is amazing to me that we don't normalize caching, considering a big change driven from the mobile computing side and adopted into our desktop and laptop environments is to fill RAM because you might as well. It is crazy to me that we have these games that cost hundreds of millions of dollars to develop that are buggy as shit, hog all the resources of your machine, and can barely run at 4k60. Where you can hit a bug and go "yep, I know what's causing that memory error".

Honestly, I think so much of this comes from the belief of needing to move fast. Because why? That would require direction. I think the money motivation is motivating speed, but we've lost a lot of vision. Moving fast is great for learning, but when you break things you've got to clean them up. The problem is that once these tech giants formed, they continued to act like a scrappy developer: not going back to fix all the mess because we gotta go fast, we gotta go forward. But with no real vision forward. And you can't have that vision unless you understand the failures.

We have so many low-hanging fruits that I can't figure out why they aren't being solved: deduplicating calendar entries, automatically turning off captioning when a video has embedded captions so you don't overlay text on top of text, searching email, or even setting defaults in entry fields based on browser data (e.g. if you ask for the user's country, put the one the browser is telling you at the top of the fucking list!). These are all things you would think about if you were working in a space where you needed to consider optimization, if you were resource constrained. But we don't, and so we let it slide.

The issue is death by a thousand cuts. It isn't so bad in a few cases, but these things add up. And the great irony of it all is that scale is what made tech so powerful and wealthy in the first place. But no one stops to ask if we're also scaling up shit. If you're printing gold but 1% of your gold is shit, you're still making a ton of shit. The little things matter because the little things add up. You're forced to deal with that when you think about memory management, but now we just don't.
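The caching point above is cheap to demonstrate. This is just an illustrative Python sketch (the country-prefix lookup is a made-up stand-in, not anything from the thread): memoize a repeated computation instead of redoing it a thousand times.

```python
# Illustrative memoization sketch: cache a repeated expensive lookup.
from functools import lru_cache

calls = 0

@lru_cache(maxsize=None)
def lookup_country(prefix: str) -> str:
    # Stand-in for an expensive operation (network call, parse, etc.).
    global calls
    calls += 1
    return {"+1": "US", "+44": "GB", "+49": "DE"}.get(prefix, "??")

for _ in range(1000):
    lookup_country("+44")  # 1000 requests, but the underlying work runs once
```

Real caches need an eviction and invalidation story, of course, which is where the actual engineering lives; the point is only that the easy win is sitting right there.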


As the base reality of computers and the inflated reality of software have diverged more and more, education and culture has tracked the software story and led to runaway irresponsibility. Forget not optimizing for performance; I think a lot of software today straight up fails to actually serve users some way or another. And those are paying users at that!


I agree. There are just too many obvious low hanging fruits. So I'm just trying to inspire people to take action and fix stuff. Ask not for permission, just fix it. Ask for forgiveness later.


As a fun anecdote, I think this same rationale -- "next year's hardware is so much better" -- is why so much desktop software from the 90s to the 00s became slow -- "meh, you don't have to care about performance, next year's CPU is going to be so much faster anyway".

Then suddenly single-threaded speedups didn't happen anymore (and people realized that even though CPU speeds had grown, it was not directly related to Moore's law).

Ofc your rationale used Moore's law correctly, while the "CPU infinite speed growth rah rah rah" people didn't.



Seeing this multiple times a day, for multiple articulable reasons, in the mid-2010s (if my memory doesn't fail me):

    FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
    1: node::Abort() [/usr/bin/node]

Is what made me decide it might finally be worth the time to actually learn how memory worked


You're not wrong but this is overly reductive. In other words this is more of a sliding scale and not a step function centered on 10-15 years ago.

For instance I was a CS undergrad in the mid/late-90s. There was an enormous difference demographically between my incoming freshman class and the incoming freshman class by the time I graduated. And the talking points were exactly the same ones we see in this thread.


This post is so weird on so many levels. I'll focus on this part:

> You can also trace personal backgrounds and you'll see a much higher representation in the newer cohort coming from upper middle class backgrounds with families in careers like finance, consulting, medicine/dentistry whereas more in the older cohort came from more modest middle class backgrounds in engineering, academia, or even working class trades.

So, you reach for class warfare? Sheesh. Is it anyone's fault that they are born into an upper middle class family? Are people from lower economic circumstances somehow superior, as you imply? This is just bizarre.

As a reminder: Bill Gates, who is certainly old school tech, was born and raised in an objectively wealthy, well-connected family, then went to Harvard. This is nearly made-for-TV silver spoon stuff.


It is telling that you considered their post to be about class warfare rather than different values.

The original focus of this thread was on technical precision vs. market efficiency, and how quality was sacrificed for faster conversion to sales.

That shift compromises products for everyone by creating a race to the bottom toward the minimum viable product and safety standards. When the consequences eventually hit, the aggregate responsibility and emergent effects lose direct attribution...but they exist all the same.


As the sibling comment noted, I think you might be projecting value judgment onto value distinction.

The most salient values of the later cohort are different than those in the prior ones, and those values do track with the values we associate with those different class backgrounds.

But there's no ranking being made there. They're just different values.

The values of the new cohort have earned the industry a great deal of political, economic, and cultural influence on an international level.

The values of the old cohort didn't do that, except insofar as they built a stage for the new one. They made software differently. They designed products differently. They operated businesses on different scales. They hired differently.

Indeed, some of us from the old cohort don't personally savor all the spotlight and influence and cultural drama that Silicon Valley collectively bears now, and miss the way things were. And others love it. But that's just personal preference, not class warfare.


> So, you reach for class warfare? Sheesh. Is it anyone's fault that they are born into an upper middle class family? Are people from lower economic circumstances somehow superior, as you imply?

To be fair, I don't see any value judgements in the post you're replying to. He doesn't say whether it's a good or bad thing; it's just a thing. But what I think this means is that the field became more popular, entry filters became more competitive, and families with fewer resources to invest in their offspring were filtered out.

There's nothing good or bad about it.


That's why the Bill Gates story got so much public attention. It's surprising. Their Harvard kid is doing what?


I suppose it may have been surprising to anyone completely out of touch with what was happening at the time.

I'm only a little younger than Gates, and it seemed like, what else would you do? PCs were revolutionizing the world.


A slight addition to this topic. A lot of jobs also became software, even if your intention in signing up for the jobs was different to begin with. PCs were revolutionizing the world.

For about a decade I worked as an engineer in a field where the expectation (at least starting) was that metal gets cut, stuff gets built, and there's physical hardware.

Those existed. May have actually had more hardware interaction than many in engineering. Yet much of the day to day rapidly became computer simulations of the metal that might get cut someday.

In many fields, the organizational choice decrement on anything involving capital expenditure or purchase was so severe that usually the obvious choice was to run a computer model, and simulate what might occur. What else would you do?


> decrement on anything involving capital expenditure

Boy is this true. I don’t think we ever recovered. Imagine trying to start a capital intensive business like mining in 2025.


Frankly a shame, since there's been a lot of development in mining technologies over the years.

Even for the folks that have an ecological focus, there are quite a few methods developed with limited degradation of the landscape, and reclamation of the mining sites into alternative uses (park, forestry, entertainment, tourism). The Wieliczka salt mine in Poland is an especially impressive example [1]

[1] https://www.wieliczka-saltmine.com/individual-tourist/touris...

And these days, there's also a huge number of resources in terms of mineral identification and site mapping. The EMIT Imaging Spectrometer from NASA is a cool example that does remote satellite mineral identification from orbit. [2]

[2] https://earth.jpl.nasa.gov/emit/instrument/overview/


That's not obvious at all.

People refresh all day here, often on the new posts and comments feeds specifically, and many earnest users who take pride in maintaining a certain character to the site are quick to flag anything that's likely to devolve into noise and vitriol, like most political topics.

It can be frustrating to have them do that when you really wish you could commiserate, explore, or debate the community on these topics, but bots aren't at play. Many people just don't see this as the right place to have these discussions and work to keep it that way.

While almost all of this is political in some way, and most of us are tracking these same events, we don't all want to talk about it here.


> AFAIK, 10g of sugar is 10g of sugar, regardless of if it's fructose, glucose, sucrose or lactose.

Remember that we rely on very noisy abstractions in many sciences, and especially in biology, nutrition, and public health. In terms of the (abstract) model of calories-as-fuel, different sugars mostly all have the same effect of providing about 4 calories per gram. And in terms of the (abstract) model of glucose-as-blood sugar, they all get more or less converted to glucose eventually.

But behind those useful high level abstractions, these are all different molecules and bodies have different ways of processing them: different gut absorption and biome effects, different organs, different catalysts, different pathways, different rates of processing, etc.

So from certain views, yes, "sugar is sugar" but from other views each compound that we call sugar is a different molecule and this implies critical and sometimes quite impactful differences in how any body may respond to it at any given time.
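A toy sketch in Python of the point above: at the calories-as-fuel level of abstraction, all these sweeteners look interchangeable (roughly 4 kcal per gram), while the fructose/glucose split, which the lower-level models care about, differs per compound. The fractions below are approximate and for illustration only.

```python
# At the "calories-as-fuel" abstraction, 10g of any sugar is ~40 kcal.
# Behind that abstraction, the fructose/glucose makeup differs.
KCAL_PER_GRAM = 4  # typical value for carbohydrates

# Approximate fructose fraction of total sugars (illustrative values)
fructose_fraction = {
    "sucrose (cane sugar)": 0.50,  # hydrolyzes to ~50/50 glucose/fructose
    "HFCS-55": 0.55,
    "glucose": 0.0,
    "fructose": 1.0,
}

grams = 10
for sugar, frac in fructose_fraction.items():
    kcal = grams * KCAL_PER_GRAM  # same for every entry
    print(f"{sugar}: {kcal} kcal, ~{frac * grams:.1f} g fructose")
```

The interesting part is that the kcal column is constant across rows while the fructose column is not, which is exactly the gap between the high-level model and the molecular one.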


This is all true, but HFCS-55 is only a little higher in fructose than other sugars considered more healthy, like cane sugar[1] and honey[2].

[1] https://kansasfarmfoodconnection.org/spotlights/which-is-bet...

[2] https://en.wikipedia.org/wiki/Fructose#Carbohydrate_content_...


Lots of words to say absolutely nothing about why HFCS has a meaningfully worse impact on you than other sugars.


I took the GP for asking why there might even be differences in the first place, and answered that question directly.

The question of whom it might impact differently, and in what way, takes a critical survey of the research, of which much has been done. Since the output of that research is contentious and hotly debated, I have no interest in trying to summarize it here, even if that were what I read the GP to have asked.


Those words also clearly didn't mean to, so unless you have reading comprehension issues, I'm not sure why you're taking issue with that.


> As a senior engineer your job must not be to stop the use of LLMs, but create opportunities to build newer and bigger products.

I think you just hit the core point that splits people in these discussions.

For many senior engineers, we see our job as building better and more lasting products. Correctness, robustness, maintainability, consistency, clarity, efficiency, extensibility, adaptability. We're trying to build things that best serve our users, outperform our competition, enable effective maintenance, and include the design foresight that lets our efforts turn on a dime when conditions change while maintaining all these other benefits.

I have never considered myself striving towards "newer and bigger" projects and I don't think any of the people I choose to work with would be able to say that either. What kind of goal is that? At best, that sounds like the prototyping effort of a confused startup that's desperately looking to catch a wave it might ride, and at worst it sounds like spam.

I assure you, little of the software that you appreciate in your life has been built by senior engineers with that vision. It might have had some people involved at some stage who pushed for it, because that sort of vision can effectively kick a struggling project out of a local minimum (albeit sometimes to worse places), but it's unlikely to have been a seasoned "senior engineer" being the one making that push and (if they were) they surely weren't wearing that particular hat in doing so.


I don't get this idea that to build a stable product you must make your life as hard as possible.

One can use AI AND build stable products at the same time. These are not exactly opposing goals, and beyond that, the assumption that AI will always generate bad code is itself wrong.

Very likely people will build more stable and larger products using AI than ever before.

I understand and empathise with you. Moving on is hard, especially when these kinds of huge paradigm-changing events arrive, and especially when you are no longer on the upswing of life. But the arguments you are making are very similar to those made by boomers about desktops, the internet, and even mobile phones. People have argued endlessly that the old way was better, but things only get better with newer technology that automates more things than ever before.


I don't feel like you read my comment in context. It was quite specifically responding to the GP's point of pursuing "newer and bigger" software, which just isn't something most senior engineers would claim to pursue.

I completely agree with you that "one can use ai AND build stable products at the same time", even in the context of the conversation we're having in the other reply chain.

But I think we greatly disagree about having encountered a "paradigm changing event" yet. As you can see throughout the comments here, many senior engineers recognize the tools we've seen so far for what they are, they've explored their capabilities, and they've come to understand where they fit into the work they do. And they're simply not compelling for many of us yet. They don't work for the problems we'd need them to work for yet, and are often found to be clumsy and anti-productive for the problems they can address.

It's cute and dramatic to talk about "moving on is hard" and "luddism" and some emotional reaction to a big scary imminent threat, but you're mostly talking to exceedingly practical and often very lazy people who are always looking for tools to make their work more effective. Broadly, we're open to and even excited about tools that could be revolutionary and paradigm-changing, and many of us even spend our days trying to discover or build those tools. A more accurate read of what they're saying in these conversations is that we're disappointed with these tools in many cases and just find that they don't nearly deliver on their promise yet.

