The shared "w" with Warp triggered some memories. I hope things go better, but after being burned once I don't see myself getting off of p10k ever, possibly for the worse.
This comes after selling the rights to the Transformers movie series earlier this year, an arguably larger hit to their ability to have mainstream impact. It seems very unlikely they wouldn't be looking for a buyer for WotC as well, and trimming down to just the IP may make it a better sale.
The company is struggling and likely pulling out all the stops to avoid bankruptcy; when you need liquidity, that can very much include cuts to otherwise profitable segments.
Well, why wouldn't they sell (license) the rights to make Transformers films (which as far as I know is just extending their existing contract with Paramount)?
They still own the underlying IP[1], so as long as the contract is a decent one, Paramount has to deal with actually making/distributing the film, and Hasbro just gets the money, plus a toy line off the back of the film. Feels like an easier setup than taking the risk on movie-making yourself (which they did attempt with eOne for other properties, but they seemingly decided it's probably not a good deal for them).
[1] yes, yes, it's a bit more complicated with Takara in the mix too, but you can essentially view it as a Hasbro-owned property
> they don't want to be in the legacy media licensing business.
Isn't that business basically free money? The way I see it, no capital investment is needed. You just need to keep a few accountants and lawyers around to handle occasional licensing paperwork. Am I missing something?
It may be "free money" as you frame it. But a cash stream that provides n dollars per year forever can be valued in today's dollars, assuming a discount rate of d, at n / (1-d). So it's reasonable to prefer cash now to revenue forever, at that exchange rate, depending on your corporate interests.
You have the right idea, but you got the formula wrong. That's evidenced in the source you link, but you can also reason it from first principles: a higher discount rate should make the cash stream less valuable, not more. The correct formula is n / d.
The discount rate is doing a lot of work here. There is a discount rate such that we're not talking about shortsightedness; getting it right is difficult. But as an example: how much would you pay for an investment that pays a hundred dollars, guaranteed, next year? Trivially, the discount rate includes at least the expected rate of inflation; a dollar a year from now isn't worth a dollar today.
For assets like IP you also have to factor in how risky the returns are, how much investment you'd have to make to realize them (e.g. making a movie), and overall strategy (do we want to be in that line of business?).
All this to say - if you have IP that pays 10 million a year, you can value future returns on that IP in today's dollars. If someone offers you more than that to buy it, you should take the deal; you come out ahead.
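If it helps, here's the arithmetic as a quick Python sketch, using the 10-million-a-year example above (the discount rates are made-up illustrative numbers):

    # Present value of a perpetuity paying n per year at discount rate d is n / d,
    # i.e. the limit of the geometric series sum_{t=1..inf} n / (1 + d)^t.

    def pv_perpetuity(n: float, d: float) -> float:
        """Value today of n dollars per year, forever, at discount rate d."""
        return n / d

    def pv_finite(n: float, d: float, years: int) -> float:
        """Brute-force the same sum over a finite horizon to watch it converge."""
        return sum(n / (1 + d) ** t for t in range(1, years + 1))

    n = 10_000_000  # the "IP that pays 10 million a year" example
    for d in (0.05, 0.08, 0.12):
        print(f"d={d:.0%}: formula={pv_perpetuity(n, d):,.0f}, "
              f"100-year sum={pv_finite(n, d, 100):,.0f}")

At a 5% rate that IP is worth about 200 million today; any offer above that and you come out ahead, which is the point.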
Well, consider the recent spate of music artists who've sold their back catalogues. They're selling that future income for a lump sum [while they can enjoy it].
Hasbro is doing the same: swap fifty years of slow income for instant liquidity. A company with as many irons in the fire as Hasbro should be able to use that to generate a lot more money than the legacy property would.
It is very much not free -- they apparently raised a lot of debt to buy eOne, and they are going to have to find a way to pay that off or roll it over in a much, much higher interest rate environment than 2019.
Does it matter? Hasbro probably has growth targets. They may have concluded that price hikes for legacy content matching their growth targets were unlikely to be feasible.
Just "terrible for desktop". I don't have any data to back it up so take it with a grain of salt, but I suspect for desktop Linux use, Ubuntu is far more prevalent than RHEL, and probably hacker types will otherwise be using Arch. So this change probably really only affects people that should have switched their desktop distro a long time ago.
For server use, where this isn't an issue at all for most use cases, putting aside personal preferences, RHEL is likely not a terrible choice, especially on AWS.
From what I understand, cars in Japan are all capped at 180 km/h, and cars capable of more often have a GPS check that unlocks the limiter automatically on a known race track.
180 is still quite high, but at least not as completely insane as some of the gangs were before that limit. A suitable number would need to be found for each country, state, etc., but the tech is probably ready.
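The geofence part, at least, is not rocket science. A minimal sketch of how an unlock check might work, assuming a simple radius-around-the-track model - the track list, coordinates, and function names here are all made up for illustration:

    from math import radians, sin, cos, asin, sqrt

    # (lat, lon, radius_km) of known race tracks -- hypothetical values
    KNOWN_TRACKS = [
        (34.845, 138.927, 3.0),
    ]

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371 * asin(sqrt(a))

    def speed_cap_kmh(lat, lon):
        """180 km/h on public roads; uncapped only inside a known track geofence."""
        on_track = any(haversine_km(lat, lon, t_lat, t_lon) <= r
                       for t_lat, t_lon, r in KNOWN_TRACKS)
        return float("inf") if on_track else 180.0

A real implementation would presumably need signed, updatable map data and some defense against GPS spoofing, which is the harder part.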
I had a similar experience with a Chinese scam seller on eBay with a $500 GPU. tl;dr: neither eBay nor PayPal would accept a claim because they had proof of receipt from DHL. Of course, a simple call to DHL could easily verify it was delivered to a different address, but even providing an email from DHL to this effect as evidence didn't change their line. Eventually I had to do a chargeback, and luckily in my case that went through after about three months of process - the phone call from the card company along the way was a nice confirmation that, unlike everyone else, they actually had a real person handling it.
Naturally that was the end of my eBay account; PayPal, unfortunately, I do still need from time to time.
I suppose this is just standard practice in the industry, but I wonder why it can be so consistently bad even with real evidence.
The article is specifically about AI. Don't most useful LLMs require too much RAM for consumer Nvidia cards, and also often need those newer features, making it irrelevant that a G80 could run some sort of CUDA code?
I'm not particularly optimistic that ecosystem support will ever pan out enough for AMD to be viable, but this seems to be giving Nvidia a bit too much credit; "democratizing AI development" is a stretch.
First of all, LLMs are not the only AI in existence. A lot of ML, stats, and compute can be run on consumer-grade GPUs, and there are plenty of problems where an LLM isn't even applicable.
Second, you absolutely can run and fine-tune many open source LLMs on one or more 3090s (sketch below).
But being able just to tinker, learn to write code, etc. on a consumer GPU is a gateway to the more compute-focused cards.
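As a concrete example, something like this is a minimal sketch of running a 7B open-weights model on a single 24 GB card with Hugging Face transformers (the model id is just an example; at fp16 a 7B model needs roughly 14 GB, plus headroom for the KV cache):

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "mistralai/Mistral-7B-v0.1"  # example; any ~7B causal LM fits similarly

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # halves memory vs fp32
        device_map="auto",          # needs `accelerate`; places weights on the GPU
    )

    inputs = tokenizer("Consumer GPUs are", return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=50)
    print(tokenizer.decode(out[0], skip_special_tokens=True))

Fine-tuning on the same card typically means parameter-efficient methods (LoRA/QLoRA via the peft library) rather than full fine-tuning, but it absolutely works.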
I suspect many developers, including OpenAI engineers, are using VSCode; maybe TypeScript for frontend engineers; and GitHub for sure, which had no real features like pull request approvers before Microsoft money came in. Less so at OpenAI most likely, but C# and .NET are going strong with bleeding-edge tech like .NET chiseled container support. And I've seen great support from their engineers working on OSS, among the best of FAANG.
Google created the meme that a company could be evil or not, but we're past that age. Let's focus on the experiences and, while not forgetting the past (yes, the 90s do not paint MS well), be forgiving. MS seems to continue to develop and even innovate on a lot of this tech; painting them as an evil dinosaur seems frankly ridiculous. Yes, as with any large company, there will be parts that are good and parts that are not.
Disclaimer: I have never worked at Microsoft and own no stock directly, though I do have ETFs. Through OSS I have some friends there whom I strongly respect and think are awesome, and I think of them any time I see hate talk towards MS, which is unfortunately common...
Most of OpenAI's tech stack is AI tooling, but basically everything you interact with is written in Python. Nobody uses C# for anything serious, because why would you?
It's not a bad language, it's just not a very good language either. Need efficiency? Rust/C++. Need an all-round language? Node or Python, which are less efficient but still powerful enough to power Instagram (and, well, OpenAI) as far as Python goes, and LEGO as far as TypeScript goes. Realistically you're looking to choose between C#, Java, and Go. Both Java and Go are miles ahead of C# in terms of concurrency - I mean, C# is still stuck with "await" after all - and while I guess you can have a lengthy debate on C# vs Java, 9 gazillion large companies use Java while 0 use C#.
It's not that C# is bad, like I said. It's never really been better than it is now; it's more a question of why you would ever use it. Even the C# developers at Microsoft admit to prototyping in Python because it's just so much faster to build things with, and while they do move things to C#, you have to wonder if they would if they weren't working for Microsoft.
Not sure I quite follow the point. Microsoft maintains Pylance and Pyright, tools that have had a large positive impact on Python development, including AI. Not sure about Live Share, but if there are feature gaps with paid IDEs, I'd say they're not that large - the functionality available for free in VSCode and its ecosystem (enabled by Microsoft with the help of many OSS developers) has helped democratize software engineering in a great way.
Yes, they have some problems in hiding their attempts at lock-in on the Windows side, but ignoring all the good and just labeling them terrible for that is troubling.
Most people in jail will have lost much more than just a blog. We really need some more perspective here; memories of when sites just disappeared completely all the time may help, though.