
In comments on this post, and elsewhere on other posts about AI, I see a lot of people worrying that many kinds of jobs will be heavily impacted by this technology.

I feel like people are often referring to 'coding' when they express these worries. You know, actually writing code against a given spec, and perhaps also participating in code review, writing tests, all the usual engineering stuff.

My question is, amongst the HN crowd, what kinds of roles or areas do we think might be somewhat immune to this effect? The first things that occur to me are security, infrastructure & ops, and networking. And of course the requirements-gathering stage of software development. It is already the case that a lot of senior devs probably don't write much code, spending more time instead on communication between stakeholders and on overseeing whoever (or whatever) is writing the code.

Anyone else been thinking about this? What tech roles might thrive in the face of AI?




In my day job I work with TypeScript and React, and if I'm honest, I'm not worried at all. Why should I be? Automation is all around us, and has been for years.

The way I personally see it, AI such as ChatGPT is another tool in our arsenal, and we as developers will have to figure out how it fits into our workflow. I think long term it will help us write better code and in general be more productive. For example, less time trying to find answers hidden deep in Stack Overflow, since we'll be able to get that information directly from ChatGPT.
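As a minimal sketch of what that might look like in TypeScript, assuming the openai npm package's chat completions API (the model name and the "ask" helper are illustrative, not a prescribed setup):

    import OpenAI from "openai";

    // Reads OPENAI_API_KEY from the environment.
    const client = new OpenAI();

    // Hypothetical helper: ask a one-off question and get the answer back,
    // instead of digging through Stack Overflow threads.
    async function ask(question: string): Promise<string> {
      const response = await client.chat.completions.create({
        model: "gpt-4o-mini", // illustrative; use whatever model fits
        messages: [{ role: "user", content: question }],
      });
      return response.choices[0].message.content ?? "";
    }

    ask("In TypeScript, how do I narrow a discriminated union?").then(console.log);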

I can completely see that some smaller places might not require a developer and instead use ChatGPT to write code, but that code still has to be verified, and someone still has to handle all the other processes around making it "live", etc.

If anything I'd be more worried if I were a copywriter, as I think that's an underappreciated skill and companies may think they can get away with ChatGPT and a quick glance over the copy.

Either way, I'm positive about it and will look for new ways to help me become a more productive and well-rounded programmer.


I'm not at all worried either, because we already went through this. Companies tried to outsource their work to India, the Philippines, and elsewhere, and it ended in tears and expensive reversals of strategy. They discovered that someone has to check all the work.

No one wants to pay for code reviews. They don't show up on the agile board and we all collectively pretend they are "freebies" the devs give out to the company. Just like unit tests and all the other shit that we just expect devs to do in their spare time between 3-hour meetings and feature work.

I can see ChatGPT being an augmentation. But that only works because you have a human dev who does the merging and can take the blame if the code goes wrong. Remove that human and you're staring into the abyss.


I agree with this take, in the main. I think the level of FUD elsewhere is probably unwarranted, though I fully subscribe to the idea that it is very disruptive tech.

In the context of software engineering I tend to see it as another layer of abstraction. Once upon a time there was perhaps not much above machine code and assembly, but now you can have quite a few layers of abstraction over that, ending up perhaps at JavaScript, or maybe low-code tools. For me, AI sits somewhere vaguely in that category (though with a much higher level of sophistication).


> what kinds of roles or areas do we think might be somewhat immune to this effect?

On what time-frame?

Permanently immune? You'd have to postulate that there's something a human can do that a computer system just can't. Given the progress we have already seen, I see no strong reason to imagine there is such a thing. More precisely, it becomes a metaphysical question about what it means to be human: "What are people for?"

- - - -

In the medium term (and this may only be a few years) the role that will come to the fore is that of the human-computer psychologist, so to speak. We already talk about "prompt engineering". There are two questions: goal and context. What do you want the computer to do, and how do you know when it's doing it successfully? And what are the side effects, the "ecology", of the selected solutions, especially otherwise unforeseen ones?
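To make that concrete, here's a hypothetical sketch (in TypeScript, with made-up names, not any particular library) of a prompt built around those two questions, goal and context, plus an explicit success check:

    // Hypothetical structure capturing the two questions:
    // what should the computer do, in what context, and
    // how do we know when it's doing it successfully?
    interface PromptSpec {
      goal: string;         // what do you want the computer to do?
      context: string;      // constraints and environment, the "ecology"
      successCheck: string; // how do you know it worked?
    }

    function buildPrompt(spec: PromptSpec): string {
      return [
        "Goal: " + spec.goal,
        "Context: " + spec.context,
        "State how the result can be verified: " + spec.successCheck,
      ].join("\n");
    }

    const prompt = buildPrompt({
      goal: "Add a deleted_at column to the users table.",
      context: "PostgreSQL 15; ~10M rows, so avoid a long table lock.",
      successCheck: "The migration runs without blocking writes for long.",
    });

The success check being part of the prompt is the point: the human-computer psychologist's job is deciding, up front, what counts as the computer doing it successfully.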



