Hacker News

> The real impact of AI is likely to be subtler: AI tools will shift power away from workers and centralize it in the hands of a few companies.

I don’t understand how the article can say something like this, but then further down essentially contradict itself.

> But a containment approach is unlikely to be effective for AI. LLMs are orders of magnitude cheaper to build than nuclear weapons or cloning — and the cost is rapidly dropping. And the technical know-how to build LLMs is already widespread.

How can AI tools cause centralization if the cost is rapidly dropping, accessible to everyone, and the technical know-how is widespread?

In my opinion, AI is doing exactly the opposite. What was once only possible at large companies is now becoming possible to do by oneself. Video games, for example, used to require teams of artists and programmers. With code and art generation, a single individual has never had more capability to make something by themselves.



Low barriers-to-access for tooling absolutely democratize the creation of things with said tools.

But. It doesn’t meaningfully shift the underlying economic mechanics we live under, where people with access to capital can easily outcompete those without that access. It also erodes the relationship between the laboring class and the capital-deploying class (less interdependence), which has to have knock-on effects in terms of social cohesion and mobility.

I’m very much in favor of the tech being developed right now, but it’s hard to believe it’ll do anything but amplify the trends already present in our economy and culture.


Which is where UBI will need to come in, or a reallocation of workers to where they can self-sustain. Then it'll lead to some 86 -Eighty Six- world, where humans live in holding pens and wars are fought with crazy robots all out of sight, and no information about it reaches the public.


UBI was probably a ZIRP fantasy, but more importantly it's a US ZIRP fantasy, because the US controls the dollar and can thus fantasize about crazy things like UBI with little fear of repercussions.

Look at the recent UK Liz Truss crisis for what happens when a country tries to spend more than it should. And the UK is a very privileged economy that controls a reserve currency.

If you are just a normal country, attempting UBI will simply destroy your economy, nobody will buy your debt, and the IMF will come and make you stop it. Some countries may be able to fall back on natural resources, but many countries won’t.


> With code and art generation, a single individual has never had more capability to make something by themselves.

That is nice and all, but let me put it this way: if it costs you three cents in computing power to have AI write the necessary code for something or draw good art, you can't reasonably expect anyone to pay you more than three cents for it.

Your economic value as a worker matches the economically productive things you can do, and that's what is going to put food on your table. If an AI can do in ten seconds for ten cents a task that would take you an hour, congratulations, your maximum wage is ten cents. If/when the AI becomes better than you at everything, your work is worthless, your ideas are worthless, so it all comes down to the physical assets you possess.
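To put numbers on the wage-ceiling argument (a toy calculation using the comment's own assumed figures, not real market prices):

```python
# Toy model: if buyers can get a task done by an AI for some cost,
# that cost caps what they'll rationally pay a human for the same task.
# The human's implied hourly wage ceiling is then:
#   (AI cost per task) / (human hours per task)

ai_cost_per_task = 0.10    # dollars per task (assumed, from the comment)
human_hours_per_task = 1.0  # hours a human needs for the same task (assumed)

max_hourly_wage = ai_cost_per_task / human_hours_per_task
print(max_hourly_wage)  # 0.1  (i.e. ten cents per hour)
```

The same arithmetic scales either way: if the AI gets cheaper or the human gets slower, the ceiling only falls.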

And that's why it is centralizing: the most decentralized class of asset is human labour. AI has the potential to render it worthless, and whatever remains (land, natural resources) is already more centralized. There is a small number of people who may be uniquely talented to leverage AI to their benefit, but it is unclear how long that period might last.


Bingo.

That's also why I think lots of these folks have no qualms doing this research so long as they are paid handsomely...

And why other rich folks are happy to pay it.


"Pay these 100 people 10x more than we would normally, so we can pay 10,000 people 1000x less than we do already"


> if it costs you three cents in computing power to have AI write the necessary code for something or draw good art, you can't reasonably expect anyone to pay you more than three cents for it

I don't think that's true at all. Haven't you ever heard the expression, "the whole is greater than the sum of its parts"?


Sometimes. But it's not magic. You still have to determine what the added value is.

First, most jobs are not creative. If you have a job where you are told what to do (that is most jobs) and an AI could do it for cheaper, it is a net win for your employer to fire you and run the AI themselves. Either you do the job yourself, but less efficiently, or you delegate it to the AI, in which case you are nothing more than a middleman.

In order for AI to empower you, it needs to work for you and help you accomplish a goal that you set yourself. So if you have a great idea, it could in theory help you implement it without requiring a lot of resources.

Well, that's the theory. The problems are: a) very few people actually have good or original ideas; b) AI will eventually have better ideas than most people; and c) most people care about their own ideas, not yours. You have a great idea for a movie. You use AI to make it and then you watch it. I have a great idea for a movie. I use AI to make it and then I watch it. Are we going to watch each other's "creation"? Sometimes, maybe. Most of the time, probably not. Ultimately, execution is worth a lot more than ideas, so if you manage to automate execution, what's left is worth peanuts.

Note that we are talking about the endgame, here. While the technology matures, it may empower a lot of individuals (as well as a lot of corporations). The point is that it is an unstable equilibrium and there may come a point where some people have literally nothing of value to offer because AI is better than them at everything. And then there may come a point where that is true of every human.


Perfectly said. The genie is out of the bottle, but we can at least share the wealth with everyone before a revolution happens.


It's not a contradiction.

Even if many people have good models close to SotA, it will still reduce the importance of workers, since the models compete with certain kinds of knowledge workers. This will lead to power and wealth moving from workers and ordinary people to company owners. That is centralisation.


Maybe in the arts but in manufacturing, for instance, AI driven machinery still requires said machinery. AI driven warehouse robots still require a warehouse full of goods. AI designed chips would still require fabrication, etc.


First movers in the AI world who have amassed capital from other endeavours can leverage that capital to amass even more capital by taking advantage of AI before other people can. This will further tilt the scale in favour of large corporations at the expense of workers.

As the techniques improve and the cost to implement them decreases, third parties will be able to develop similar technologies, preventing any sort of complete containment from being successful, but not before other people have been able to consolidate power.


> Video games, for example, used to require teams of artists and programmers. With code and art generation, a single individual has never had more capability to make something by themselves.

That may be so, but precisely because of this, a lot of us are gonna be left up shit creek if our indie game doesn’t go viral. The big companies don’t need so many of us any more. Inequality is going to increase dramatically unless we start taxing and redistributing what we all contributed to these massive AI models.

ChatGPT was trained on the contributions of all of us. But it will be used to make many of us redundant at our jobs. MidJourney will likewise increase inequality even as it has the potential to greatly enrich all of us.

If we continue down our current neoliberal path, refusing to give people universal health care, or basic income for anyone not disabled or over 65, and keep calling any redistributive efforts “sOcIaLiSm”, our society will lurch towards fascism as people decide genocide is the only way to make sure they have enough resources for them and their family.


Here is Tyler Cowen's take, which I think clarifies things in terms of the three factors of production (land, labor, capital). Even if the LLMs themselves are not monopolized by a few companies, land and capital may still become more expensive relative to labor. https://www.bloomberg.com/opinion/articles/2023-03-28/how-ai... (paywalled)


I think the author assumes LLM development is driven only by the likes of OpenAI and Google's Bard, and is ignoring (or ignorant of) open-source LLMs.



