Hacker News

How do I avoid the angst about this stuff as a student in computer science? I love this field but frankly I've been at a loss since the rapid development of these models.


LLMs are the new compilers.

As a student, you should continue to focus on fundamentals, but also adapt LLMs into your workflow where you can.

Skip writing the assembly (now curly braces and semicolons), and focus on what the software you’re building actually does, who it serves, and how it works.

Programming is both changing a lot, and not at all. The mechanics may look different, but the purpose is still the same: effectively telling computers what to do.


LLMs are actually the new computers. Compilation is only one program they can run.


LLMs are the way computers were always supposed to work!


> LLMs are the new compilers.

This shows a grave misunderstanding of what compilers and LLMs are. They're fundamentally opposite concepts.

Compilers are about optimizing abstract code down to the most efficient representation possible for some hardware. LLMs are about wasting petaflops (made possible by compiler engineers) to produce random statements that don't have any static guarantees.


How can you trust that the compiler has written the most efficient assembly if you're not double-checking it by hand?

Jokes aside, I understand your point.

In the history of computing, LLMs and compilers are closer than one might think.

Compilers weren’t first created to optimize “abstract code down to the most efficient” assembly as possible, even if that is the goal of a compiler writer today.

Compilers were created to enable the use of higher-level languages. Abstraction, efficiency, portability, error reduction, and most importantly: saving time.

They allowed humans to create more software, faster.


- a coping software engineer


As a former prof: what you should be learning from any STEM degree (and many other degrees as well) is to think clearly, rigorously, creatively, and with discipline. You also need to learn the skill of picking up new content and skills quickly.

The specific contents or skills of your degree don't matter that much. In pretty much any STEM field, over the last 100ish years, whatever you learned in your undergraduate degree was mostly irrelevant by the time you retired.

Everyone got by by staying on top of the new developments in the field and applying them. With AI, the particular skills needed to use the power of computers to do things in the world have changed. Just learn those skills.


It's either over, or giving a lot of idiots false confidence — I meet people somewhat regularly who believe they don't really need to know what they're doing any more. This is probably an arbitrage.


There are at least two things here.

One, about the field itself. So far, I have been a know-it-all, and I dabbled in management too, besides that. This worked for me, because no matter how the field and my opportunities shifted, I always had a card up my sleeve. This is highly personal though.

Two, about managing angst. Whatever you experience now, you will in the future too. Circumstances won't matter at all; your brain will convert whatever it perceives around you into these feelings that you generally experience. You can be at your highest high, or the lowest low, and you will always gravitate back towards these familiar feelings of yours. So, what you can do to have a nicer experience is to be a good partner to yourself, and learn how to live with these specific feelings that you have.


For all the value that they bring, there is still a good dose of parlour tricks and toy examples around, and they need an intelligent guiding hand to get the best out of them. As a meat brain, you can bring big picture design skills that the bots don't have, keeping them on track to deliver a coherent codebase, and fixing the inevitable hallucinations. Think of it like having a team of optimistic code monkeys with terrible memory, and you as the coordinator. I would focus on building skills in things like software design/architecture, requirements gathering (what do people want and how do you design software to deliver it?), in-depth hardware knowledge (how to get the best out of your platform), good API design, debugging, etc. Leave the CRUD to the robots and be the brain.


You can ask them this question and all your fears will be washed away, for now...

"Here's a riddle for you - a surgeon, who is the boy's father says, "I cannot operate on this boy, he's my son!" Who is the surgeon to the boy?"

But seriously - AI in the hands of someone well-educated in their field is going to be a lot more powerful than some random person. Knowledge is still going to be valuable, and there are still people out there who don't know how to Google things and figure things out for themselves - so there'll be plenty of people who don't realise the potential of LLMs and won't use them.


Angst?

It just means you're less likely to be fixing someone else's "mistakenly used _mm512_store_si512 instead of _mm512_storeu_si512" error, because AI fixed it for you, and you can focus on other parts of computer science. Computer science surely isn't just fixing _mm512_store_si512.


The cost of developing software is quickly dropping thanks to these models, and the demand for software is about to go way up because of this. LLMs will just be power tools to software builders. Learn to pop up a level.




