Hacker News | thewebguyd's comments

> protect the text from AI training

Hasn't training already been ruled to be fair use in the recent lawsuits against Meta and Anthropic? The rulings held that works must be legally acquired, yes, but that training itself was fair use.


No one uses SharePoint because it's good or was ever good; everyone knows it sucks. They use it precisely because it's a) already bundled with 365 and b) already very well integrated with the rest of the entire ecosystem. No developer time is needed to get automations, external sharing with encryption, or any of its numerous other features.

So sure, one wouldn't build another SharePoint on its own, but there's still room to build an entire package like M365 and do it right, and integrate solutions that don't suck.

But no one does, because it's expensive as hell. Good luck building something just as comprehensive and integrated and selling it for a measly $22/user/month.


I agree, doubly so when you consider different packaging formats and package managers also, along with different release models.

I've been preaching for a long time that "distro" is the wrong term. Each "distro" is absolutely its own standalone operating system. There is no universal "Linux OS" beyond the kernel. Even the userland can be swapped out.


This sums up my feelings almost exactly.

I don't want LLMs, AI, and eventually robots to take over the fun stuff. I want them to do the mundane, physical tasks like laundry and dishes, and leave me the fun, creative stuff.

But as things are progressing right now, the hype machine is pushing AI to take over art, photography, video, coding, etc., all the stuff I would rather be doing. Where's my house-cleaning robot?


I would like to go even further and say: those things, art, photography, video, coding... they are forms of craft, of human expression, of creativity. They are part of what makes life interesting. So we are in the process of eliminating the interesting and creative parts, in the name of profit and productivity-maxing (if any!). Maybe soon we can create the 100th online platform for the same thing 10x faster! Wow!

Of course this is a bit too black-and-white. There can still be a creative human being introducing nuance and differences, trying to get the automated tools to do things differently in the details or in some aspects. The question is: losing all those creative jobs (in absolute numbers of people doing them), what will we as a society, or we as humanity, become? What's the ETA on UBI, so that we can reap the benefits of what we automated away instead of filling the pockets of a few?


> without seeing just how effective it can be once you zoom in.

The love/hate flame war continues because the LLM companies aren't selling you on this. The hype is all "this tech will enable non-experts to do things they couldn't do before," not "this tech will help already-existing experts in their specific niche," hence the disconnect between the sales hype and reality.

If OpenAI, Anthropic, Google, etc. were all honest and tempered their own hype and misleading marketing, I doubt there would even be a flame war. The marketing hype is "this will replace employees" without the required fine print of "this tool still needs to be operated by an expert in the field and not your average non technical manager."


The number of GUIs I've vibe-coded works against your claim.

As we speak, my macOS menubar has an iStat Menus replacement, a Wispr Flow replacement (global hotkey for speech-to-text), and a logs visualizer for the `blocky` dns filtering program -- all of which I built without reading code aside from where I was curious.

It was so vibe-coded that there was no reason to use SwiftUI nor set them up in Xcode -- just AppKit Swift files compiled into macOS apps when I nix rebuild.

The only effort it required was the energy to QA the LLM's progress and tell it where to improve, maybe click and drag a screenshot into claude code chat if I'm feeling excessive.

Where do my 20 years of software dev experience fit into this, beyond imparting my aesthetic preferences?

In fact, insisting on writing code yourself is becoming a liability in an interesting way: you're going to make trade-offs for DX that the LLM doesn't have to make, like reaching for Python or Electron when the LLM can bypass abstractions that exist only for human brains.


You making a couple of small GUIs that could have been made with a drag-and-drop editor 10 years ago doesn't work against his claim as much as you think. You're just telling on yourself and your "20 years" of supposed dev experience.

Dragging UI components into a WYSIWYG editor is <1% of building an app.

Else Visual Basic and Dreamweaver would have killed software engineering in the 90s.

Also, I didn't make them. A clanker did. I can see this topic brings out the claws. Honestly I used to have the same reaction, and in a large way I still hate it.


It's not bringing out claws, it's just causing certain developers to out themselves.

Outs me as what, exactly?

I'm not sure you're interacting with a single claim I've made so far.


Love that you are disagreeing with the parent by saying you built software all on your own, and that you only had 20 years of software experience.

Isn't that the point they are making?


Maybe I didn't make it clear, but I didn't build the software in my comment. A clanker did.

Vibe-coding is a claude code <-> QA loop on the end result that anyone can do (the non-experts in his claim).

An example of a cycle looks like "now add an Options tab that lets me customize the global hotkey" where I'm only an end-user.

Once again, where do my 20 years of software experience come up in a process where I don't even read code?


> An example of a cycle looks like "now add an Options tab that lets me customize the global hotkey" where I'm only an end-user

Which is a prompt that someone with experience would write. Your average, non-technical person isn't going to prompt something like that, they are going to say "make it so I can change the settings" or something else super vague and struggle. We all know how difficult it is to define software requirements.

Just because an LLM wrote the actual code doesn't mean your prompts weren't more effective because of your experience and expertise in building software.

Sit someone down in front of an LLM with zero development or UI experience at all and they will get very different results. Chances are they won't even specify "macOS menu bar app" in the prompt and the LLM will end up trying to make them a webapp.

Your vibe coding experience just proves my initial point, that these tools are useful for those who already have experience and can lean on that to craft effective prompts. Someone non-technical isn't going to make effective use of an LLM to make software.


Here's how I look at it as a roboticist:

The LLM prompt space is an N-dimensional space where you can start at any point; the LLM then carves a path through that space for so many tokens using the instructions you provided, until it stops and asks for another direction. This frames LLM prompt coding as a sort of navigation task.

The problem is difficult because at every decision point, there's an infinite number of things you could say that could lead to better or worse results in the future.

Think of a robot going down the sidewalk. It controls itself autonomously, but it stops at every intersection and asks "where to next, boss?" You can tell it to cross the street, or to drive directly into traffic, or to do any number of other things that could take it closer to its destination, further away, or even obliterate it.

In the concrete world, it's easy to direct this robot, and to direct it such that it avoids bad outcomes, and to see that it's achieving good outcomes -- it's physically getting closer to the destination.

But when prompting in an abstract sense, it's hard to see where the robot is going unless you're an expert in that abstract field. As an expert, you know the right way to go is across the street. As a novice, you might tell the LLM to just drive into traffic, and it will happily oblige.

The other problem is feedback. When you direct the physical robot to drive into traffic, you witness its demise, its fate is catastrophic, and if you didn't realize it before, you'd see the danger then. The robot also becomes incapacitated, and it can't report falsely about its continued progress.

But in the abstract case, the LLM isn't obliterated; it continues to report on progress that isn't real, and as a non-expert, you can't tell it's been flattened into a pancake. The whole output chain is now completely and thoroughly off the rails, but you can't see the smoldering ruins of your navigation instructions, because it's told you "Exactly, you're absolutely right!"
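Read as pseudocode, the navigation framing above might be sketched like this (a toy Python loop; all names and numbers are hypothetical, with "distance" standing in for how far the current output is from what you actually want):

```python
# Toy sketch of the prompt-as-navigation framing. Everything here is
# hypothetical; 'distance' models how far the output is from the goal.

GOOD, BAD = "cross the street", "drive into traffic"

def agent_step(distance, instruction):
    """Stand-in for the LLM: it happily obliges either instruction."""
    return distance - 1 if instruction == GOOD else distance + 10

def navigate(distance, choose_instruction, steps=5):
    """One prompting session: at each decision point an overseer picks a direction."""
    for _ in range(steps):
        distance = agent_step(distance, choose_instruction(distance))
    return distance

# An expert can read the abstract state, so they always steer correctly.
expert = navigate(5, lambda d: GOOD)

# A novice can't evaluate progress, so some directions are disastrous,
# and the agent keeps reporting success either way.
novice = navigate(5, lambda d: GOOD if d % 2 == 0 else BAD)

print(expert, novice)
```

The point of the sketch is only that the agent applies either instruction with equal confidence; the outcome depends entirely on whether the overseer can evaluate the abstract state.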


Counterpoint: https://news.ycombinator.com/item?id=46234943

Your original claim:

> The hype is all about "this tech will enable non-experts to do things they couldn't do before"

Are you saying that a prompt like "make a macOS weather app for me" and "make an options menu that lets me set my location" are only something an expert can do?

I need to know what you think their expertise is in.


But "anyone" didn't do it... you, an expert in software development, did it.

I would hazard a guess that your knowledge led to better prompts, better approach... heck even understanding how to build a status bar menu on Mac OS is slightly expert knowledge.

You are illustrating the GP's point, not negating it.


> I would hazard a guess that your knowledge led to better prompts, better approach... heck even understanding how to build a status bar menu on Mac OS is slightly expert knowledge.

You're imagining that I'm giving Claude technical advice, but that is the point I'm trying to make: I am not.

This is what "vibe-coding" tries to specify.

I am only giving Claude UX feedback from using the app it makes. "Add a dropdown that lets me change the girth".

Now, I do have a natural taste for UX as a software user, and through that I can drive Claude to make a pretty good app. But my software engineering skills are not utilized... except for that one time I told Claude to use an AGDT because I fancy them.


My mother wouldn't be able to do what you did. She wouldn't even know where to start despite using LLMs all the time. Half of my CS students wouldn't know where to start either. None of my freshman would. My grad students can do this but not all of them.

Your 20 years are assisting you in ways you don't realize; you're so experienced you no longer know what it means to be inexperienced. Now, it's true you probably don't need 20 years to do what you did, but you need some experience. It's not that the task you posed to the LLM is trivial for everyone because of the LLM; it's that it's trivial for you because you have 20 years of experience. For people with experience, the LLM makes moderate tasks trivial, hard tasks moderate, and impossible tasks technically doable.

For example, my MS students can vibe-code a UI, but they can't vibe-code a complete bytecode compiler. They can use AI to assist them, but it's not a trivial task at all; they will have to spend a lot of time on it, and if they don't have the background knowledge they will end up mired.


The person at the top of the thread only made a claim about "non-experts".

Your mom wouldn't vibe-code software that she wants, not because she's not a software engineer, but because she doesn't engage with software as a user at the level where she'd care to do that.

Consider these two vibe-coded examples of waybar apps in r/omarchy where the OP admits he has zero software experience:

- Weather app: https://www.reddit.com/r/waybar/comments/1p6rv12/an_update_t...

- Activity monitor app: https://www.reddit.com/r/omarchy/comments/1p3hpfq/another_on...

That is a direct refutation of OP's claim. LLM enabled a non-expert to build something they couldn't before.

Unless you too think there exists a necessary expertise in coming up with these prompts:

- "I want a menubar app that shows me the current weather"

- "Now make it show weather in my current location"

- "Color the temperatures based on hot vs cold"

- "It's broken please find out why"

Is "menubar" too much expertise for you? I just asked claude "what is that bar at the top of my screen with all the icons" and it told me that it's macOS' menubar.


Your best examples of non-experts are two Linux power users?

I didn't make clear I was responding to your question:

"Where do my 20 years of software dev experience fit into this, beyond imparting my aesthetic preferences?"

Anyway, I think you kind of unintentionally proved my point. These two examples are pretty trivial as far as software goes, and the LLM enabled someone with a little technical experience to implement them where before they couldn't have.

They work well because:

a) the full implementations of these apps don't even fill up the LLM's context window, so it's easy to keep the LLM on task.

b) it's a tutorial-style app that people often write as "babby's first UI widget," so there are thousands of examples of exactly this kind of thing online; the LLM therefore has little trouble summoning the correct code in its entirety.

But still, someone with zero technical experience is going to be immediately thwarted by the prompts you provided.

Take the first one "I want a menubar app that shows me the current weather".

https://chatgpt.com/share/693b20ac-dcec-8001-8ca8-50c612b074...

ChatGPT response: "Nice — here's a ready-to-run macOS menubar app you can drop into Xcode..."

She's already out of her depth by word 11. You expect your mom to use Xcode? Mine certainly can't. Even I have trouble with Xcode, and I use it for work. Almost every single word in that response would need to be explained to her; it might as well be a foreign language.

Now, the LLM could help explain it to her, and that's what's great about them. But by the time she knows enough to find the original response actionable, she would have gained... knowledge and experience, enough to operate the LLM just to the level of writing that particular weather app. And even having done that, it's still unreasonable to believe she could then use the LLM to write a bytecode compiler just because people with a Ph.D. in CS can. The LLM doesn't level the playing field; it's still lopsided toward the Ph.D.s and senior devs with 20 years of experience.


> for some, it's just a paycheck. I am not sure what has happened over the decades regarding actually being proud of the work you produce.

Hard to be proud of the work you produce when you have no ownership over it, and companies show less and less loyalty and investment in their employees. When, at any random time, you can be subject to the next round of layoffs no matter how much value you contributed, it's hard to care.

So yeah, for most it's just a paycheck unless you are working for yourself, or drank a gallon of the koolaid and seriously believe in whatever the company's mission is/what it's doing.

I'm proud of my own work and the projects I do for myself, tech or otherwise, and I put great care into them. At $dayjob I do exactly what I'm paid to do, nothing more, nothing less, to conserve my mental energy for my own time. I'm not saying I output poor work; I just do exactly what's expected of me. The company isn't going to get anything extra without paying for it.

Didn't used to be that way, but I've been burned far too many times by going "above and beyond" for someone else.

If employees had more ownership and stake in the companies they work for, I think the attitudes would change. Likewise, if companies went back to investing in training and retention, loyalty could go both ways again.


> where will the ones who are supposed to refund the banks get the money?

Liquidating other assets. The point of the banks using SRTs is to push the default risk off of the bank and onto investors.

So now, instead of banks failing, private credit gets to bear the risk of the bubble popping. Since they can't sell the (now bad) AI debt, they will need to liquidate all of their other assets to pay the banks.

That's why a potential AI bubble burst can cause the markets to enter a death spiral and bring down a bunch of other, unrelated markets.

If private credit can't cover the losses by liquidating everything else, well, then they fail, and we either let it all crumble or do bailouts again.
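As a back-of-the-envelope illustration of the loss waterfall described above (all numbers here are made up, not from the source):

```python
# Toy numbers (entirely hypothetical) for the SRT risk transfer described above.

loan_book = 100.0     # bank's AI-related loans
first_loss = 20.0     # slice of default risk sold to private credit via an SRT
defaults = 15.0       # realized losses when the loans go bad

# Private credit absorbs losses up to the tranche it bought;
# the bank only eats whatever exceeds that.
investor_loss = min(defaults, first_loss)
bank_loss = max(0.0, defaults - first_loss)

# Since the now-bad AI debt itself can't be sold, investors must raise the
# cash by liquidating *other* holdings -- the contagion channel.
forced_sales = investor_loss

print(investor_loss, bank_loss, forced_sales)
```

With these numbers the bank loses nothing, while private credit has to dump $15 of unrelated assets to cover the hit, which is the spillover mechanism the comment describes.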


You forget the single greatest source of money for investors: loans. Sorry, "margin." With the banks. And margin is non-negotiable, because the whole point of the stock market is to massively increase loaned money; that is the real advantage to the economy it provides.

Plus the problem of 2008: you cannot offload risk if everything is synchronized. The math still works, but it doesn't take "either everything crashes or nothing does" into account.


> "they can't all default at once!"

Narrator: As it turns out, they can.

The difference now is instead of banks holding the risk, they are now the safest portion of the loans. The risk is now moved to private credit, so if this bubble bursts, they will panic sell other assets to cover the AI losses, which will crash unrelated sectors as well.

Since the now-bad AI loans can't be sold, they need liquidity from elsewhere to cover. An AI burst means other S&P 500 stocks, treasuries, gold, crypto, and commercial real estate will all go down with it.


I wonder how much other bad private credit there is. If you want liquid funds, rolling it all over time and time again might stop working... Maybe it is really time to clean it all up.

> Apple will revive Boot Camp for Windows and deem it useful to include Linux this time?

If Apple wanted to, they could already do that right now. Windows runs on ARM just fine. Heck, Windows on ARM in a Parallels VM runs better on my MacBook Pro than it does natively on an x86 laptop.

If Apple would make some drivers, even just for Windows, I bet they'd sell more Macs. But it would seem Apple has either calculated that ecosystem/services lock-in is worth far more to them than a potential boost in hardware sales from alternative OSes, or they're really reluctant to make drivers for Apple Silicon available elsewhere out of fear it'll expose trade secrets, something they didn't have to worry about when they used Intel.


> If Apple would make some drivers, even just for Windows, I bet they'd sell more Macs.

The incremental bump in sales would be very small.

Even when Apple did provide Boot Camp drivers to run Windows on older Macs, very few people used it as their daily Windows computer. I'm sure Apple has a better estimate of the market for people who bought Macs to use with alternative OSes back when they supported it, but they've calculated that it's not worth the effort.


The problem is indeed Windows. Could you point me to where you can legally buy a Windows for ARM licence?

You can buy it from the Microsoft Store inside Windows once it's installed. That's how it works with Parallels, or any other Windows-on-ARM device (say, for upgrading from Home to Pro).

Definitely a nightmare.

Where I work, we just acquired new property and are deploying a new site. It took 9 months from the date of first contact before the ISP could come out, bore under the road, and run fiber to our building from two poles away. And that's just a short, ~500-foot underground run.

I couldn't imagine the amount of permitting and logistics involved in trying to bury an entire run across town.

