I"m almost entirely dotnet these days, with a smattering of Go here and there.
I work in ops though, so I'm not building consumer-facing products but mostly IT glue code and internal tooling (mostly Go), dashboards, business report generators, gluing SaaS together, etc. (mostly dotnet/C#).
None of these laws are actually about protecting children. That's not the real goal. The real goal is the complete elimination of anonymity on the web, where both private companies and the state can keep tabs on everything you do.
Not being able to be at least pseudo-anonymous has a real chilling effect on speech and expression. Even if there are laws in place protecting such rights, people will self-censor when they know they are being watched.
It's how freedom of speech and expression dies without actually scratching that part off of the bill of rights.
It's a mix. I'm sure there are some people really trying to protect kids. There are other people that just want all porn off the Internet. And there are bad actors that want total surveillance. And they are all on the same side of this issue.
Yea. People who "cured" their children with lobotomies also thought they were doing something good. These useful idiots are in some sense worse than the perpetrators themselves, because they are the primary enablers of such behavior, simply out of naivety or, worse, ignorance.
Not idiots necessarily, sometimes just long-time observers who have finally become cynical. People that were pro-guns for decades may watch several years of failure to adopt basic and uncontroversial gun-control regulation, then eventually become anti-gun. People that were in favor of regulating it once may suddenly become fearful for their safety, and want no regulations at all in case that regulation puts them out in the cold. Since both PR campaigns and any action on policy tends to cater to extremes, there's always pressure that is shrinking the middle
The comment "useful idiots" is more a play on the russian KGB strategy.
They use assets to influence people and achieve certain goals. In this case
here, terrorism or child pornography is used as cop-out rationale for censorship,
surveillance and so forth. It's never about those topics really, perhaps 5% at best,
the rest is just sugar-coated decoy to restrict people and keep them as slaves and pets.
> Since both PR campaigns and any action on policy tends to cater to extremes, there's always pressure that is shrinking the middle
This only works on people who are susceptible to this. I understand how propaganda works so I am never fooled by "this is because of terrorists". This is also why I am for 100% transparency at all times.
See, when you cut out the part about "because of terrorists," that sounds like a patently laughable claim. I would tend to agree with the poster on the grounds that some propaganda is very, very easily spotted:
- anything that mentions "terrorists" (or the nouveau "narco-terrorists")
- "think of the children" / "we must protect the children"
- "we need to create jobs" / "job creators"
- "they're turning the frogs gay"
- "we need to protect America"
tbh if you're fooled by any of that (and there's no delicate way to say this) you're dumb. Even a cursory glance at history would reveal the obvious deception and it's on you that you haven't bothered.
> The comment "useful idiots" is more a play on the Russian KGB strategy.
Oh, I'm familiar with the phrase, but I'm specifically disputing how applicable it really is to people that are self-aware about the situation they are facing. Useful idiots are ones that are tricked, especially ones that are evangelical about tricking others. People forced to choose between 2 extremes where both choices are very bad are called.. normal citizens participating in the democratic process.
> This only works on people who are susceptible to this. I understand how propaganda works
What? You can see through propaganda, but you can't just pencil in your own policy options. Unfortunately and by design, the things you can ultimately vote for are "all or nothing" flavored. Censor everything, censor nothing. Track everybody, track nobody. Tons of parents who totally understand the surveillance state probably got flipped by Meta's memo about chatbots being "sensual" with children. They'd rather vote to force corporations to be good citizens, but they can't. So they'll vote for an age-gated internet as the best of the bad options. I wouldn't assume all those people are naive, confused, or duped.. they've simply switched from a principled/abstract stance to a convenience-based calculus after they were forced into it. Meta wins either way, as planned. Either they get to build a more addictive platform, or they track more info about more people.
People forced to choose between 2 extreme evils, one (debatably) lesser, are not called "normal", they are called unfree.
The process of making sure people are always in one such situation or another is not called "governance", it's called driving insane.
>I wouldn't assume all those people are naive, confused, or duped.. they've simply switched from a principled/abstract stance to a convenience-based calculus after they were forced into it.
Forced into it under threat of violence, or under threat of denied sustenance and shelter, or "forced" by catering to their naivete, by confusing and duping them, by silently extorting them by enclosure of the commons?
Switching from "principle-based stance" to "convenience-based stance" is not called "being sensible", it's called... cowardice.
>Unfortunately and by design, the things you can ultimately vote for are "all or nothing" flavored. Censor everything, censor nothing. Track everybody, track nobody.
If voting changed anything they'd ban it.
>Tons of parents who totally understand the surveillance state
If you truly understood how the surveillance state feeds on human life, you would deny it sustenance by, yes, refusing to breed in captivity.
That's one of the few meaningful political actions available to the individual. At least until advances in reproductive medicine get turned on us, same way it happened with the mind-bicycles. A society with the technical capacity to go Gattaca might rather go all-in on Plato's Republic.
Type of beat like yall can have the world to yourselves if yall want it that bad, but believe me, you will choke on it.
I think in this case many of these people are "useful idiots" in the sense that they lack a strong technical understanding of how the internet and the web are architected. This can cause them to accept erroneous concepts like "tracking the identity of all internet users is the only way to protect the children," while alternatives like the one proposed at the beginning of this thread can easily be glossed over as techno mumbo jumbo.
> The U.K. Online Safety Act was (avowedly, as revealed in a recent High Court case) “not primarily aimed at protecting children” but at regulating “services that have a significant influence over public discourse.”
Thanks, this was good info.
As an aside, I read the original source. I found the writing completely impenetrable and realized I know nothing about the British legislative process.
But this did, nonetheless, convince me that British legislators are interested in using this bill to regulate the internet.
> It genuinely doesn't seem like any more of a threat than age-gating Playboy at the bookstore
If it was really like that, I would have no problem. Simple ID check, in-person only, that's never stored anywhere.
I've proposed this several times. Age-gated websites (social media, random forums, adult websites) should require a one-time use code or token that expires once a year. The token should only be available for purchase at liquor stores or tobacco stores - someplace they check your ID on pain of losing their license. It should be reasonably priced.
Sometimes someone might resell a token they purchased to a minor. Those people should be actively hunted with sting operations and prosecuted.
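Mechanically, the site-side check could be dead simple. A rough sketch in Go (the token format, issuer key, and names here are all hypothetical, just to show the shape, not any real scheme): the code carries an expiry, is HMAC-signed by the issuer, and is marked redeemed on first use, with nothing stored about the buyer.

```go
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/base64"
	"errors"
	"fmt"
	"strconv"
	"strings"
	"time"
)

// Hypothetical issuer key: whoever prints the codes sold at licensed
// stores would hold this; sites verifying tokens would need it too
// (a real design would likely use asymmetric signatures instead).
var issuerSecret = []byte("hypothetical-issuer-key")

// Codes already bound to an account, so a resold token can't be
// redeemed twice -- the "one-time use" part of the proposal.
var redeemed = map[string]bool{}

// Illustrative token format: "<id>.<expiryUnix>.<base64(HMAC(id.expiry))>"
func sign(id string, expiry time.Time) string {
	payload := id + "." + strconv.FormatInt(expiry.Unix(), 10)
	mac := hmac.New(sha256.New, issuerSecret)
	mac.Write([]byte(payload))
	return payload + "." + base64.RawURLEncoding.EncodeToString(mac.Sum(nil))
}

func validate(token string) error {
	parts := strings.Split(token, ".")
	if len(parts) != 3 {
		return errors.New("malformed token")
	}
	mac := hmac.New(sha256.New, issuerSecret)
	mac.Write([]byte(parts[0] + "." + parts[1]))
	want := base64.RawURLEncoding.EncodeToString(mac.Sum(nil))
	if !hmac.Equal([]byte(want), []byte(parts[2])) {
		return errors.New("bad signature")
	}
	expiry, err := strconv.ParseInt(parts[1], 10, 64)
	if err != nil || time.Now().Unix() > expiry {
		return errors.New("expired")
	}
	if redeemed[parts[0]] {
		return errors.New("already redeemed")
	}
	redeemed[parts[0]] = true // bind it; nothing about the buyer is recorded
	return nil
}

func main() {
	t := sign("code-0001", time.Now().AddDate(1, 0, 0)) // valid for one year
	fmt.Println(validate(t))                            // <nil>
	fmt.Println(validate(t))                            // already redeemed
}
```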
There's no good reason to make age verification on the Internet more stringent than age verification to buy alcohol or tobacco. Alcohol and tobacco kill far more people.
I've never had my ID scanned. The sales clerk glances at it. These days they don't even ask :-D
If they scan your ID for alcohol or tobacco purchases where you live it might be time to fix that with legislation too. Insurance companies would love that data.
I went to check my Social Security Administration account like 4 years ago - I forget why. To access it, I had to have an actual video face-to-face conversation with people from some Real ID company.
I'll never look at that account again in my fucking life.
I don't understand the downvotes. If you have this question then so do others and it ought to be part of the discourse. Anyhow...
From what I've seen, the current wave of ID-gating the internet is a wedge for opening the door to much broader censorship. Specifically, some jurisdictions (Wisconsin, Minnesota, and the UK) are using recently-passed legislation to argue that we need to make VPNs illegal [0 1 2].
Speaking for my own beliefs, banning the use of VPNs is a huge problem, and it seems like basically anybody who understands the technology would be against it.
I have no problem with banning or age gating pornography at all. Personally it seems weird to me that that's the red line for people.
But this is a good point, which is that lawmakers who don't have a clue what they're regulating will see VPNs as undermining the laws they've made. Thanks for this
> Not idiots necessarily, sometimes just long-time observers who have finally become cynical.
This doesn't explain why they would support privacy-invasive ID requirements instead of the RTA header.
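(For anyone unfamiliar: the RTA label is a fixed, well-known string an adult site declares so that parental-control filters can block it, with no identification of the visitor involved. If memory serves, it's advertised as a meta tag and/or a "Rating" response header; a minimal Go sketch of a site opting in:)

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// The fixed "Restricted To Adults" label; client-side or network-side
// filters match on this string and block the page for minors.
const rtaLabel = "RTA-5042-1996-1400-1577-RTA"

func adultHandler(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Rating", rtaLabel) // advertised as a response header...
	// ...and as a meta tag in the page itself.
	fmt.Fprintf(w, `<html><head><meta name="rating" content="%s"></head><body>age-restricted content</body></html>`, rtaLabel)
}

func main() {
	http.HandleFunc("/", adultHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

The point of the comparison is that this costs a site one line and a filter one string match, versus collecting IDs from every visitor.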
> People that were pro-guns for decades may watch several years of failure to adopt basic and uncontroversial gun-control regulation, then eventually become anti-gun.
I want to call this a bad example because the only people who call the rules that don't pass "basic and uncontroversial" are the people who were on the other side to begin with, but maybe it's a good example because the analogy lines up so well with exactly the same scenario:
People who are anti-X propose rules with low effectiveness against actual harms, but which impose significant burdens on innocent people who are pro-X. They persistently insist that their proposal is fine and supported by everyone, even as it demonstrably lacks enough support to pass, and then point to the period of nothing being done to try to garner enough support from independents to squeak over the line, instead of considering less burdensome alternatives, because burdening the pro-X people is the point. And then the people who fall for it are the useful idiots.
Unfortunately, "keeping kids and teenagers off of algorithmic social media" is one of the most worthy goals one can pursue right now; so is keeping them off infinite porn.
No, I believe the term is "parents don't want 8 year olds getting access to tits, violence and gore"
Given that kids need a device for school in a lot of areas (mine included), and the tools for stopping kids from either getting access to or being bombarded by such stuff are either shit, require deep technical knowledge, or are predatory, I can see why people are asking for it.
I presently hate the current system of handing over biometric data in exchange for tits. I don't want some shady startup having my biometrics so that when they go bust, pivot or get hacked, they can be used to steal my stuff.
The middle ground is a system that _normal_ people can use to make sure kids who have access to devices can't easily access nefarious shit.
None of that is useful idiots.
Where it gets fun is with the all-or-nothing crowd. The internet is going to be age gated, whether you like it or not. If you continue to go "INTERNET MUST BE FREEEEEEEE" without accepting that the tools the populace _want_ don't exist, you get porn bans, or worse.
I think there's probably a middle way without going as far as "biometric data in exchange for tits"
I'm in the UK and so far the only age-related thing I've noticed is Reddit asking me for a webcam selfie, which could easily have been faked by a kid with an accomplice. But if the aim of this is to stop actual vulnerable kids, that kind of thing is maybe enough. If they are with it enough to use VPNs and stuff, they are probably old enough to see porn etc.
Like in the old days, people used to keep the kids from looking at porn by putting the porno mags on a high shelf so they couldn't reach them. I don't think you need passport control level ID for this kind of thing.
> I don't think you need passport control level ID for this kind of thing.
I 100% whole heartedly agree.
For UK mobile ISPs there is already a system that stops most of the nasty stuff from getting through. It was pretty difficult to circumvent, hence why I turned it off for myself. If that could have been rolled out wider, with an account password for turning it off, that would have been more than enough.
Can you explain to me what is being exploited here? I had to do KYC for Hetzner, for anything crypto related in the last decade, and a number of other things.
Age-gating porn doesn't seem problematic to me at all. In fact it's far less worrisome than any of the former, which are kind of important for commerce. What am I missing?
Once there is a record of what porn you looked at, people, governments, and employers won't hire you. It could be based on the fact that you looked at porn at all, or that you looked at the wrong kind. Wrong = whatever fetish you're into that your employer/government/health insurer doesn't like.
Let's just hope there's no government that wants to criminalize certain sexualities and genders, because then all these KYC logs for every little social thing will be very dangerous.
But personally, I'm much more concerned about it in regular commerce.
A huge swath of the population thinks that porn is inherently harmful. An even bigger swath thinks that it should be completely separated from both. I agree with both of these things.
I'm also strongly against censorship, so I'm trying to figure out how people are worried this is being used. I do not, at all, consider age-gating Playboy at the gas station to be censorship.
If you think your porn habits are not already being logged and tracked by intelligence agencies, I think you are fully delusional.
The issue isn't age-gating Playboy, but that to begin censoring requires a line to be drawn, and there's no guarantee that educational material regarding LGBTQ topics won't be considered "adult" or "pornographic".
The whole "know it when you see it" doesn't work when there's a significant group out there who would love to see queer people at large go away from society. With this, you now have teenagers being blocked from actual educational material because Carol from the "burn everyone but me" church down the street believes anything regarding sexuality is "adult" material.
The thing with the porn habits being logged by intelligence agencies is that the data has a large risk-reward tradeoff for actually being used. They wouldn't burn the secret of their capabilities for something small. Most of the metadata wouldn't be admissible in court, assuming courts don't go full kangaroo. The usage of the metadata is general intelligence to point investigations, or parallel construction to get warrants for someone they don't actually have anything on, but want to search.
Doing KYC American style for porn/adult content means mass data leaks are a matter of "when", because there's no consumer protection and this data will be retained indefinitely because ads make money. The leak means real people are put in real danger.
I believe the term for them is evangelicals. I'm going to guess that a Venn diagram of deeply religious people and people pushing for "protecting" the kids is just a circle.
"useful idiots" was a Stalinist term for people willing to cover up for the murder of millions on the grounds that communism was good and would never do the holodomor.
I really don’t care about what’s on the internet, until my kids get exposed to it. How grownups talk to other grownups in private isn’t my concern.
But when kids - and I mean my kids - enter the loop it becomes my business, and ideological concerns go out the window.
I’ve ranted and raved about how terrible filtering software is, and how school provided computers contain massive workarounds.
The real concern isn’t porn sites — the real concern is poorly moderated social media sites. Ones where kids post things other kids see. And guess what the kids post?
But a lot of the nasty content shared on these poorly moderated sites gets its start elsewhere.
I’m cynical about any law, but my bias toward legal action is only increasing as the online situation is only getting worse.
Can't you do MAC filtering on your router at the very least?
Why not install root certs on all your kids' devices and then force them through your home proxy so you can run content classification and proactively block and get reports of what you've blocked? A little privacy-invasive, but if your kids are young enough, it makes sense to get alerts when they've attempted to access boobs or gore so you can have a convo about it.
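If full TLS interception and content classification is more than you want to maintain, even a tiny forward proxy that refuses CONNECTs to a blocklist (and logs the attempt, which gives you the alert) covers a lot of ground, since you see the destination hostname without decrypting anything. A rough Go sketch, with hypothetical blocklist entries; you'd still block direct outbound traffic at the router so it can't simply be bypassed:

```go
package main

import (
	"io"
	"log"
	"net"
	"net/http"
	"net/http/httputil"
	"strings"
)

// Hypothetical blocklist; real setups would load a category feed
// or a hosts-style file instead of hardcoding names.
var blocked = map[string]bool{
	"example-adult-site.test": true,
	"example-gore-site.test":  true,
}

func isBlocked(hostport string) bool {
	host := hostport
	if h, _, err := net.SplitHostPort(hostport); err == nil {
		host = h
	}
	host = strings.ToLower(host)
	for b := range blocked {
		if host == b || strings.HasSuffix(host, "."+b) {
			return true
		}
	}
	return false
}

// Minimal forward proxy: HTTPS is tunneled via CONNECT (we only ever
// see the hostname), plain HTTP is proxied directly.
func handler(w http.ResponseWriter, r *http.Request) {
	if isBlocked(r.Host) {
		log.Printf("BLOCKED %s -> %s", r.RemoteAddr, r.Host) // your "report"
		http.Error(w, "blocked by home proxy", http.StatusForbidden)
		return
	}
	if r.Method == http.MethodConnect {
		tunnel(w, r)
		return
	}
	(&httputil.ReverseProxy{Director: func(req *http.Request) {
		req.URL.Scheme = "http"
		req.URL.Host = r.Host
	}}).ServeHTTP(w, r)
}

func tunnel(w http.ResponseWriter, r *http.Request) {
	upstream, err := net.Dial("tcp", r.Host)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadGateway)
		return
	}
	hj, ok := w.(http.Hijacker)
	if !ok {
		upstream.Close()
		http.Error(w, "hijacking unsupported", http.StatusInternalServerError)
		return
	}
	client, _, err := hj.Hijack()
	if err != nil {
		upstream.Close()
		return
	}
	client.Write([]byte("HTTP/1.1 200 Connection Established\r\n\r\n"))
	go func() { io.Copy(upstream, client); upstream.Close() }()
	io.Copy(client, upstream)
	client.Close()
}

func main() {
	// Point the kids' devices at this proxy (e.g. via DHCP/WPAD).
	log.Fatal(http.ListenAndServe(":3128", http.HandlerFunc(handler)))
}
```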
The easiest route here, in my opinion, aside from DNS services that claim to block adult content, would be to use a Squid SSL Bump proxy. It's along the lines of what you are suggesting and requires installing a self-signed CA cert on the client, but gives you centralized management of which domains, URLs, file types, times of day, and URL patterns are allowed/denied, as well as a memory and disk cache to reduce bandwidth. This [1] is a really old example based on Squid 3.x, but the concept has improved a lot in Squid 6.x. Sites that still do public key pinning (there are a handful) will have to be added to Squid's SSL Bump exclusions. Ignore the term SSL; it's TLS, but they kept the name the same.
Given the negative responses, I'd like to strengthen the position against positive rights by stating that positive rights, in the extreme, require slavery, whereas negative rights do not.
If the government guarantees food for children (or anyone), the government must provide it. If nobody is willing to be a farmer at any rate the private or public sector affords, the government must force someone to be a farmer to produce food to fulfill their positive right grant.
You induce people to do things with payments, not slavery. Or you ask for a volunteer corps. Or you have the army do it if all else fails. I presume you are opposed to the existence of standing armies if you are opposed to slavery in all forms as well as wage slavery.
I'm less opposed to standing armies because a voluntary contract is entered by both parties. I am not a fan of the draft or compulsory service.
My example was an edge case. I expect the government could find a price that someone would take long before slavery was required. But the potential still stands. I prefer that all transactions be voluntary, even if that means you lose your country because nobody is willing to sign up for its defense.
Claiming support from "the vast majority" is clearly nonsense. There is little support for getting rid of Social Security, Medicare or Medicaid, or several other current wealth transfer programs.
The goal was to put Company A in between you and the web. Collecting data and selling it for profit. It’s never about what they say it’s about. Lobbyists have bought every aspect.
> None of these laws are actually about protecting children. That's not the real goal.
I fear that for 90% of the supporters of such laws (just like with chat control) this statement is wrong, and they truly do want to protect minors from harm. But that only makes it worse, because this type of argument completely misses the mark while the other 10% get to laugh up their sleeves while continuing to manipulate public opinion.
Technical people have been gleefully eliminating anonymity on the web for the last 20 years. Progressives should be the party on the side of maximal freedom but really in the US we have one neo-liberal party wearing two different disguises.
The problem is normies don't operate under assumed anonymity. So when the hordes of unwashed regular people joined the internet, they wanted their face everywhere. People were shamed out of their handles. Some people gave up their anonymity to make yet another faceless bullshit blog-as-a-resume. Look at most of the top karma farmers on HN. Most of them post their personal information in their bio. Pathetic.
> people will self-censor when they know they are being watched.
This has been happening both in public and on the internet for over a decade now.
> Not being able to be at least pseudo-anonymous has a real chilling effect on speech and expression.
The normies would argue you have nothing to hide if you aren't doing anything wrong. The average voter, regardless of party, will happily surrender every ounce of freedom for the thought of security. Hell, I remember sometime around 2007 DEFCON became a first-name-basis conference!
> bill of rights
It's more of a bill of privileges given NGOs and PACs are regularly paying to erode the core rights granted to citizens. Either through lawfare by circumventing the courts and suing companies into bankruptcy, or by directly purchasing congressmen via donation.
What I have found in general is that people who cry and complain about this kind of thing were, at one point, happy to have it happen to their political enemies. The laws that are paving the way for an age-gated, deanonymized internet were at one point used as a cudgel to beat their political enemies down. When the tables finally turn after the Nth "protect the children" bill, it's the other people left crying, and now suddenly it's a "problem".
You may not be old enough to remember Edward Snowden or Mark Klein (who went unnoticed), but there never was anonymity.
My pet theory is that this requirement is part of a mob war, and porn and whatever else the MindGeek people are involved with is being attacked for much the same reasons Ukraine attacks Russian oil refineries.
> The real goal is the complete elimination of anonymity on the web...
I'm ok with running this experiment (not sure how it really turns out) BUT only if everyone participates. Governments and businesses get to watch me... I get to watch them. If the death of anonymity is inevitable, as unpleasant as that sounds, then the goal to shoot for is universal application.
>Not being able to be at least pseudo-anonymous has a real chilling effect on speech and expression. Even if there are laws in place protecting such rights, people will self-censor when they know they are being watched.
This supposed golden era of communication lasted for a very short period. Why is it so important that freedom of speech also be anonymous? What you're asking for is the right to talk to anyone with all societal, cultural, and interpersonal contexts removed.
It is not only freedom of speech, but freedom of association that would also be jeopardized.
People long ago used to have to hide that they were gay, not only because they could be ostracized, but because people they associated with could also come under scrutiny.
Being able to track one's movements, or who they associate with, could reveal information that said person would want kept secret.
There was a podcast episode I listened to once, probably Darknet Diaries but maybe some other tech one, where the person being interviewed had been an active community member on some BBS back in the day. Everyone decided to meet up to play D&D, and he showed up as a 13-year-old kid when everyone else was 20+. They let him stay after clearing it with his mom.
This is one type of connection that would be unlikely to form if superficial anonymity is lost. That kid probably would be off in some "safe" walled garden.
This doesn't even touch on more obvious forms of discrimination like gender, religion, etc.
And political affiliation / speech isn't protected in the US, so an employer could terminate you at any time for a policy disagreement. Such a policy would destroy the exploration of ideas overnight, as outrage mobs would try to get any dissident sacked.
If those are your concerns, then why is it so important that this freedom of anonymous expression only happens on the internet? I think what you are really asking for is private, encrypted comms but only to a certain subset of people. Otherwise, you should also argue for freedom of anonymous expression over any other medium.
And of course freedom of speech has practical limits. It's that very tempering that stops non-virtual discourse from turning into a cesspool. I worked for a company that permitted anonymous comments to the leadership team, which they would then review in front of the company. It was a total shit show, and I attached my name to any comments I made.
If you are not happy filling in your workplace questionnaire unless it's anonymous, then something needs to change about your company (and something that probably can't be fixed with anonymous comments).
I'm not asking for anything, I was merely pointing out the advantages of anonymity. You don't need to consider a decision the best one to see its upsides.
I don't really get the rest of that argument. What other mediums are legally deanonymised? Privacy in mail and telephone was a commonly supported right; Watergate was a scandal for a reason.
>If you are not happy filling in your workplace questionnaire unless it's anonymous, then something needs to change
That's the point I was trying to make, that it is a shortcut, but an improvement. Preaching a 'good option' that doesn't survive the real world is a common failure of justice systems.
Example: 'Anonymous tip off for sexual abuse' is a very flawed system. Tell the victims 'no, see, what you need is proper handling of abuse by authorities'. Is that useful when we know for a fact that alternative never worked?
Shortcuts should only be removed _after_ the proper alternative is in place and working. Otherwise, you're just making people's lives worse.
> It's that very tempering that stops non-virtual discourse from turning into a cesspool.
Agreed, anonymity introduces many problems we haven't been able to solve properly yet. It can platform abusers. It can empower legitimately wrong behavior. It can make people less willing to take ownership of their actions, or less empathic.
Those are all legitimate points to consider and balance, I'm just not ok with pretending it's a no-brainer.
>I'm not asking for anything, I was merely pointing out the advantages of anonymity. You don't need to consider a decision the best one to see its upsides.
You haven't given any yet. You've pointed out that anonymous messages in some circumstances can be beneficial (which they can), but haven't given any advantages for a widespread, anonymous communications network with open access.
>Privacy in mail and telephone was a commonly supported right
It really wasn't. I don't know of any time or place where mail and wire tapping wasn't legal and/or practiced.
Diversity is relative. The difference between Irish and English ancestry created low trust in the mid-1800s USA but is fairly irrelevant today. Trust grew over time.
> I’m not sure how Microsoft has managed to escape market discipline for so long.
How would they? They are a monopoly, and partake in aggressive product bundling and price manipulation tactics. They juice their user numbers by enabling things in enterprise tenants by default.
If a product of theirs doesn't sell, they bundle it for "free" in the next tier up of license to drive adoption and upgrades. Case in point: the Intune suite (includes Entra ID P2, remote assistance, endpoint privilege management) will now be included in E5, and the price of E5 is going up (by $10/user/month, less than the now-bundled features cost when bought separately). People didn't buy it otherwise, so now there's an incentive to move customers off E3 and into E5.
Now their customers are in a place where Microsoft can check boxes, even if the products aren't good, so there's little incentive to switch.
Try to price out Google Workspace (and also an Office license, still, because someone will need Excel), identity, EDR, MDM for Windows, Mac, and mobile, Slack, VoIP, DLP, etc. You won't come close to Microsoft's bundled pricing by piecing together the whole M365 stack yourself.
So yeah, they escape market discipline because they are the only choice. Their customers are fully captive.
I also hope it's just a blip, but I don't actually think it is.
The democratization of technology was something that had the power to break down class barriers. Anyone could go get cheap, off the shelf hardware, a book, and write useful software & sell it. It became a way to take back the means of production.
Computing being accessible and affordable for everyone = working class power.
That is why it's backsliding. Those in power want the opposite, they want to keep control. So we don't get to have open devices, we get pushed to thin clients & locked boot loaders, and we lose access to hardware as it increasingly only gets sold B2B (or if they do still sell to consumers, they just raise prices until most are priced out).
When the wealthy want something, that something becomes unavailable to everyone else.
Yes, and that's a part of the appeal to companies like Google. They've climbed the ladder, and now they're pulling it up behind them so others can't climb up to catch them.
Definitely. And there's a tendency for individuals and particularly corporations to pull up the ladder behind them. They know that leaving things accessible means they could face major competition 5 years down the road. So they do what they can to prevent that.
Exactly! Apple wouldn't have existed without access to the MOS 6502 and other electronics, which allowed Woz to carry out his dream of building a personal computer. Microsoft might not have existed without the Altair 8800. Many 1990s and 2000s web startups got off the ground with affordable, available hardware, whether it's hand-me-down RISC workstations or commodity x86 PCs.
Granted, to be fair, many of today's startups and small businesses are made possible by AWS, Google Cloud, Microsoft Azure, and other cloud services. It is sometimes cheaper to rent a server than to own one, and there are fewer system administration chores. However, there's something to be said for owning your own infrastructure rather than renting it, and I think a major risk of compute power being concentrated in just a few major players is the terms of computation being increasingly dictated by those players.
> Those in power want the opposite, they want to keep control. So we don't get to have open devices, we get pushed to thin clients & locked boot loaders
While it's undeniable that MAFIAA et al have been heavily lobbying for that crap... the problem is, there are lots of bad actors out there as well.
I 'member the 00s/10s, I made good money cleaning up people's computers after they fell for the wrong porn or warez site. Driver signatures and Secure Boot killed entire classes of malware persistence.
Is this not just "with freedom comes responsibility" applied to technology? Often the freedom to do something means that, when given that sovereignty and missing the requisite experience, that means you also end up with the freedom to harm yourself (whether through a misunderstanding of a danger or just simple error.)
Do we want to accept that as a potential consequence, or have someone else choose for us what consequences we are allowed to accept?
> Do we want to accept that as a potential consequence, or have someone else choose for us what consequences we are allowed to accept?
Unfortunately, I think the old guard here is dying out and the majority want someone else choosing for them, which is why all the age verification & chat control-like bills have broad bipartisan support.
I'm in the "with freedom comes responsibility" camp. Obviously we should build secure systems, but our devices shouldn't be impenetrable by their own user. The "security" we are getting now is just security against the user having the freedom to do as they wish with their devices and software.
The cultural zeitgeist surrounding internet and computing freedom has changed to be in favor of more control and censorship. Not sure how we can stop it.
I don't see how it can be a blip if AI actually turns out to be successful. They'll likely gobble up any loose hardware for their datacenters until only scraps are left, or until the AI bubble pops (if AGI isn't achieved in the next few years and stock values fall off a cliff).
> You can vibe-code a throwaway UI for investigating some complex data in less than 30 minutes. The code quality doesn't matter, and it will make your life much easier.
I think the throwaway part is important here and people are missing it, particularly for non-programmers.
There's a lot of roles in the business world that would make great use of ephemeral little apps like this to do a specific task, then throw it away. Usually just running locally on someone's machine, or at most shared with a couple other folks in your department.
Code doesn't have to be good, hell it doesn't even have to be secure, and certainly doesn't need to look pretty. It just needs to work.
There's not enough engineering staff or time to turn every manager's pet Excel-sheet project into a temporary app, so LLMs make perfect sense here.
I'd go as far to say more effort should be put into ephemeral apps as a use case for LLMs over focusing on trying to use them in areas where a more permanent, high quality solution is needed.
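For a sense of the scale involved: the whole "app" is often something like the sketch below, a local-only viewer for whatever file you're poking at, written or generated in minutes and deleted once the question is answered (data.csv and the port are obviously just placeholders). Nobody would ship this, and that's the point.

```go
package main

// Throwaway local viewer: renders ./data.csv as an HTML table at
// http://localhost:8090. No auth, no tests, no styling -- it only
// needs to live for an afternoon.

import (
	"encoding/csv"
	"html/template"
	"log"
	"net/http"
	"os"
)

var page = template.Must(template.New("t").Parse(`
<table border="1">
{{range .}}<tr>{{range .}}<td>{{.}}</td>{{end}}</tr>
{{end}}</table>`))

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		f, err := os.Open("data.csv")
		if err != nil {
			http.Error(w, err.Error(), 500)
			return
		}
		defer f.Close()
		rows, err := csv.NewReader(f).ReadAll()
		if err != nil {
			http.Error(w, err.Error(), 500)
			return
		}
		page.Execute(w, rows)
	})
	log.Fatal(http.ListenAndServe("localhost:8090", nil))
}
```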
> simply because the market has never really punished people for being less efficient at their jobs
In fact, it tends to be the opposite. You being more efficient just means you get "rewarded" with more work, typically without an appropriate increase in pay to match the additional work either.
Especially true in large, non-tech companies/bureaucratic enterprises where you are much better off not making waves, and being deliberately mediocre (assuming you're not a ladder climber and aren't trying to get promoted out of an IC role).
In a big team/org, your personal efficiency is irrelevant. The work can only move as fast as the slowest part of the system.
This is very true. So you can't just ask people to use AI and expect better output even if AI is all the hype. The bottlenecks are not how many lines of code you can produce in a typical big team/company.
I think this means a lot of big businesses are about to get "disrupted", because small teams can become more efficient: for them, the sheer generation of sometimes-boilerplate, low-quality code actually is a bottleneck.
Sadly, capitalism rewards scarcity at a macro level, which in some ways is the opposite of efficiency. It also grants "social status" to the scarce via more resources. As long as you aren't disrupted, and everyone in your industry does the same/colludes, restricting output and working less usually commands more money up to a certain point (prices are set more like a monopoly's in these markets). It's just that scarcity was in the past correlated with difficulty, which made it "somewhat fair" -> AI changes that.
It's why unions, associations, professional bodies, etc. exist, for example. This whole thread is an example -> the value gained from efficiency in SWE jobs doesn't seem to be accruing to the people with SWE skills.
It's sad, people have really been shitting on copyleft licenses the past few years, when they are critical to ensuring our computing freedoms are preserved.
Copyleft protects the user. The friction is, like you said, by design. It ensures that something that started free, stays free, and can't be rug pulled out from under you.
Big monied interests have been trying, and succeeding, in changing the discourse around free software away from free and to simply just "open source" and moving toward permissive licenses, specifically so community effort can be extracted and monetized without contributing back.
> what it can do and how it can transform how we live and work is also undeniable.
I’m still not so sure on that part. Maybe, eventually? But it feels like we are still trying to find a problem for it to solve.
Has there been any actual, life transformative use cases from an LLM outside of code generation? I can certainly sit here and say how impactful Claude Code has been for me, but I honestly can’t say the same thing for the other users where I work. In fact, the quality of emails has gone down since we unleashed Copilot to the world, and so far no one has reported any real productivity gains.
> Has there been any actual, life transformative use cases from an LLM outside of code generation?
Content analysis and summarization is a big win in my view.
Having also been around during the emergence of the personal computer revolution I'm reminded of how having a home computer could be helpful for keeping recipes and balancing checkbooks -- it was the promise of "someday" that fueled optimism. Then the killer apps of spreadsheets, word processing, and desktop publishing sealed the deal.
Following that analogy we're at the Apple ][ stage -- it works and shows capabilities but there's likely so much more ahead.