
I do wonder if OpenAI is built on a house of cards. They aren’t a nonprofit, aren’t open, and stole a huge quantity of copyrighted material to get started.

But, by moving fast and scaling quickly, are they at the Too Big to Fail stage already? The attempted board coup makes me think so.



When people say "too big to fail", they normally mean companies whose failure would bring down other important parts of society's infrastructure (think the biggest banks), so someone (the government) changes the rules at the last minute to ensure they don't fail.

If OpenAI fails, absolutely nothing happens other than its shareholders losing their paper money. So no, they're not too big to fail.


OpenAI doesn't even have shareholders, so the company would just go bankrupt and a few hundred people would be out of jobs.

Microsoft would probably hire them into some AI shop, since Microsoft is the one actually deploying the stuff. And Microsoft has the rights to use the models and the code, so to them OpenAI is only a research partner.

Maybe research would get slower.


If they fail, other entities with little to no American oversight/control potentially become the leading edge in artificial intelligence.


I find your lack of faith in America disturbing.


OpenAI isn't even close to too big to fail. If Bank of America fails, the entire banking system collapses and the real economy grinds to a halt. If GM fails, hundreds of thousands lose their jobs and entire supply chains collapse. If power utilities fail, people start actually dying within hours or days.

If OpenAI fails nothing actually important happens.


When their product is embedded at the OS level in every Windows 11 computer, would that be too big to fail?


Not yet. But we are getting close to an event horizon, once enough orgs become dependent on their models.

Open source models are actually potentially worse in this regard. Even if OAI itself is not TBTF because of the competition, we end up in a scenario where the AGI sector as a whole becomes TBTF, and too big to halt.


I mean, there are about a hundred thousand startups built on top of their API. I'm sure most could switch to another model if they really needed to, but if copyright is the issue, I'm not sure that would help.


If you've plugged your whole business into OAI's snake oil, you're an early adopter of technology and you'll likely be able to update the codebase appropriately.

The sooner SCOTUS rules that training on copyrighted material is infringement, the better.


> you'll likely be able to update the codebase appropriately

Update the codebase to what, exactly? Are there generative AI companies not training on copyrighted material that achieve anything even close to the results of GPT-4? I'm not aware of any.


You cannot erase that much value and say "nothing important happens"; market cap is largely a rough proxy for the amount of disruption if something went under.


I do not think the situation is remotely comparable to the possibility of the banking system collapsing. Banks and other financial institutions exert leverage far beyond their market caps.


But they are also extremely substitutable because they deal in the most fungible commodities ever made (deposits and dollars).


The good thing about AI is that it is substitutable by humans.


"Whose" money matters here. It's VC money, mostly: well-capitalized, sophisticated investors, not voters and pension funds.

If Microsoft loses 30 billion dollars, it ain't great, but they have more than that sitting in the bank. If Sequoia or Y Combinator goes bankrupt, it's not great for lots of startups, but they can probably find other investors if they have a worthwhile business. If Elon loses a billion dollars, nobody cares.


It is VC money pricing in the value of this enterprise to the rest of society.

Moreover, if capital markets suddenly become ways to just lose tons of money, that hurts capital investment everywhere, which hurts people everywhere.

People like to imagine the economy as super siloed and not interconnected but that is wrong, especially when it comes to capital markets.


In the case of OpenAI it's potential value that investors are assessing, not realized value. If they folded today, society would not care.

And as for the whole idea of "company value equals value to society", I see monopolies and rent seeking as heavy qualifiers on that front.


I agree with both of those points; it is a very rough proxy (edited my original). Future value is still important, though.


Why would OpenAI be "too big to fail"? They seemed pretty close to failing just some months ago.


I think that is actually quite illustrative of the opposite point.


What about the CEO drama indicates OAI is "too big to fail"? They're completely orthogonal. No one came to bail OAI out of a budget crisis like the banks or auto industry. I fail to see how it's related at all.


Right, and then a bunch of unclearly identified forces came in and swept it all under the rug.


Yeah. The failed "coup" was really shady, in that it failed even though they had actually fired Sam Altman.


> built on a house of cards

The "house of cards" is outperforming everyone else.

It would have to come out that the slow generation times for GPT-4 are due to a sweatshop in Egypt getting tired of typing.

Either that, or something inconceivable like that board coup firing the CEO as a material event triggering code and IP escrow to be released to Microsoft...

PS. "Too big to fail" is generally used to mean a government+economy+sector ecosystem will step in and fund the failed enterprise rather than risk harm to the ecosystem. That's not this. Arguably not Tesla or even Google either. That said, Satya's quote in this filing suggests Microsoft already legally contracted for that eventuality: if this legal entity fails, Microsoft keeps the model online.


"stole a huge quantity of copyrighted material" <- nobody stole anything; even if it's eventually determined that there was some form of copyright infringement, it wouldn't have been stealing.



