foxmoss's comments | Hacker News

Yeah, not awesome; heavy irony in this paragraph. I've been looking at some other providers recently with comparable prices. My infrastructure isn't too complicated to migrate, I just haven't had the chance to make the jump.

Eventually I always get to a problem I can't solve by just throwing an LLM at it and have to go in and properly debug things. At that point knowing the code base helps a hell of a lot, and I would've been better off writing the entire thing by hand.

Because the question almost always comes with an undertone of “Can this replace me?” If it’s just code search and debugging, the answer is no, because a non-developer won’t have the skills or experience to put it all together.


That undertone is overt in the statements of CEOs and managers who salivate at “reducing headcount.”

The people who should fear AI the most right now are the offshore shops. They’re the most replaceable because the only reason they exist is the desire to carve off low skill work and do it cheaply.

But all of this is overblown anyway, because I don’t see the appetite for new software getting satiated anytime soon, even if we made everyone 2x as productive.


https://foxmoss.com & https://foxmoss.com/blog/

I write about a really wide variety of topics.


I was drafting a reply when you sent this; this is the correct interpretation and why I did it.


Yeah, as I've dabbled with AI models more and more, it's become clear to me how valuable my mental model is to the programming process. It's easier to debug code I wrote myself than to comb through an AI's mistakes when it eventually hits a problem too hard for the model to debug.


> Otherwise I’m already freaked out by treating a 32 bit field as a pointer… even if you extend it to first.

The cast from a 32-bit field to a 64-bit pointer is in fact an eBPF oddity. What's happening here is that the virtual machine gives us a fake memory address to use in the program, and when the read actually needs to happen the kernel rewrites the virtual addresses to the real ones. I'm assuming this is a byproduct of the memory separation eBPF does to prevent filters from accidentally reading kernel memory.

Also, yes, the double cast is just to keep the compiler from throwing a warning.
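
To make it concrete, here's a rough sketch of the pattern (a TC classifier over struct __sk_buff; the program and names are just illustrative, not taken from the original post):

    #include <linux/bpf.h>
    #include <linux/if_ether.h>
    #include <linux/pkt_cls.h>
    #include <bpf/bpf_helpers.h>
    #include <bpf/bpf_endian.h>

    SEC("tc")
    int drop_non_ipv4(struct __sk_buff *skb)
    {
        /* skb->data and skb->data_end are declared as __u32 in the UAPI
         * header, but inside the eBPF VM they stand in for packet pointers.
         * The double cast (to long, then to void *) widens the 32-bit field
         * to pointer size without the "cast to pointer from integer of
         * different size" warning; the verifier/JIT rewrites these virtual
         * addresses into real ones when the program runs. */
        void *data     = (void *)(long)skb->data;
        void *data_end = (void *)(long)skb->data_end;

        struct ethhdr *eth = data;
        /* Bounds check the verifier requires before any dereference. */
        if ((void *)(eth + 1) > data_end)
            return TC_ACT_OK;

        /* Keep IPv4 frames, drop everything else. */
        return eth->h_proto == bpf_htons(ETH_P_IP) ? TC_ACT_OK : TC_ACT_SHOT;
    }

    char LICENSE[] SEC("license") = "GPL";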


I hadn't heard of T9 before starting the project and getting interested; I'm too young to have owned a pre-touchscreen phone. I don't know if the average HN reader knows what T9 is, so I went with a term I was fairly certain most people would be familiar with. Is that so people engage with my work? I certainly found the project fascinating, and I made the library to share that fascination. If I can get more people to implement and use T9 and similar systems, I think my work has been a success.


Nice! I clearly made an assumption that you were an Old like me :D


Please submit an issue on the GitHub repo! This is a bug; it should automatically show up with words as you type. Include platform details, console logs, etc. Sadly, I'm unable to test every platform on my own.


I mean, thinking about students as actors of pure bad faith, a student could easily copy and paste any instructions given to them into an LLM and bypass any required training data. Even if an AI company respects the license and the source doesn't end up in the training set, model knowledge tends to generalize within a given area. The only way I could see this working is making a language that is intentionally obtuse to write in (Brainfuck or really any other esolang seems to fit), but that fails at being a good introductory programming language.


Yeah, I think that's a good point. Making something hard for an AI to learn makes it hard for humans too.

My argument was basically "how would you even prove that it trained on it, bro?" and I think that's kind of a hard thing to do as well.

