> or even a concept of religion in the original languages
IMO this and the sources it cites are wrong. A huge chunk of the Old Testament is about how God had to keep sending prophets to tell the Israelites to stop worshipping other deities. So while they may not have had a single word that was equivalent to 'religion,' they clearly possessed the same concept. They would just use the phrase "worshipping other gods."
There are many texts from Greek and Roman antiquity that compare the religions of the various nations known to their authors, i.e. that compare their beliefs about their "gods" and their methods of worship and prayer.
Entire books have been written about such subjects, e.g. "De natura deorum" ("On the Nature of the Gods") by Cicero.
Ancient peoples usually did not have a precise word with the definite meaning that "religion" has today, mainly because religious practices were intermingled with most daily activities, so there was no very clear separation between religion and everything else.
For example, a treatise on agriculture, besides explaining how to prepare the soil and how to select seeds for sowing, would also give the text of a prayer to be addressed to a certain god before or after the sowing, so that it would be successful. The same goes for any other activity where divine help was believed to be necessary.
Nevertheless, they had the concept of religion and were able to distinguish things related to the gods from things that were not.
But Nvidia has had high-profile industry partners for decades. Nintendo isn't "venture capital and hype," nor are PC gaming and HPC datacenter workloads.
But Nvidia wasn't able to compete with Apple for capacity on new process nodes with Nintendo volumes (the concept is laughable; compare Apple device unit volumes to game console unit volumes). What has changed in the semiconductor industry is overwhelming demand for AI-focused GPUs, and that is paid for largely with speculative VC money (at this point, at least; AI companies are starting to figure out monetization).
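To put rough numbers on that parenthetical, here is a minimal sketch; the unit figures are approximate public shipment numbers and should be treated as order-of-magnitude assumptions, not exact data:

    # Rough annual unit volumes; approximate, order-of-magnitude assumptions only.
    apple_devices_per_year = 220e6 + 50e6 + 25e6   # iPhones + iPads + Macs, roughly
    switch_consoles_per_year = 20e6                # Nintendo Switch in a strong year

    ratio = apple_devices_per_year / switch_consoles_per_year
    print(f"Apple ships roughly {ratio:.0f}x as many leading-edge SoCs per year")

Even with generous console numbers, Apple's volumes come out an order of magnitude larger, which is why it wins the fight for new-node capacity.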
Optionality has costs. If you live your life as though it's going to go astray, then you miss out on a lot of the upside if it doesn't go astray (such as being a stay-at-home mom, if that's what you actually want to do). The statistic that 50% of marriages end in divorce is often bandied about, but it also means that 50% don't, which means that going all-in on your marriage is a completely reasonable thing to do.
What you say is true. But consider: the "cost" of going back part time is not very big. It's not very stressful, and it _greatly_ reduces long-term risk.
Your take is a bit like saying in the year 2000, "I believe Apple is an amazing company, I'll go ALL IN with my life savings." If you're right, then you think you're a genius. But what if you were wrong? What if Apple had turned out like IBM? Then you'd look back and think, "How could I have been so stupid? So naive."
It's a really bad analogy. And the "cost" of working part time for someone who doesn't want or need to work is literally every single hour they spend working. If they're working 20 hours per week, that's 20 hours per week spent doing something they don't want or need to do. It's a huge cost.
> My wife is fairly unusual in that she runs her own full-time business. Many moms don’t like her, presumably because they gave up their careers to do this and are jealous that she does both.
FWIW, my experience is that the dynamic at play in these situations is that women who run their own businesses or otherwise have high-powered careers tend to have a constellation of personality traits that is significantly shifted relative to those of stay-at-home moms, and their daily lives are very different, so they don't really fit in. I say that without value judgement; it's just an observation.
> It's telling that ARM, Apple, and Qualcomm have all shipped designs that are physically smaller, faster, and consume way less power vs AMD and Intel.
These companies target different workloads. ARM, Apple, and Qualcomm are all making processors designed primarily for low-power applications like cell phones and laptops, whereas Intel and AMD are designing processors for servers and desktops.
> x86 is quickly becoming dead last, which shouldn't be possible if ISA doesn't matter at all, given AMD and Intel's budgets (AMD, for example, spends more on R&D than ARM's entire gross revenue).
My napkin math is that Apple’s transistor volumes are roughly comparable to the entire PC market combined, and they’re doing most of that on TSMC’s latest node. So at this point, I think it’s actually the ARM ecosystem that has the larger R&D budget.
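For what it's worth, the napkin math looks roughly like this; every figure below is an approximate public estimate and should be read as an assumption for illustration, not a measurement:

    # Napkin math: transistors shipped per year, Apple silicon vs. the PC market.
    # All figures are rough assumptions for illustration only.
    apple_socs_per_year = 250e6          # iPhone + iPad + Mac, roughly
    transistors_per_apple_soc = 19e9     # A17 Pro class, on TSMC's newest node

    pc_cpus_per_year = 260e6             # total PC shipments, roughly
    transistors_per_pc_cpu = 10e9        # typical desktop/laptop CPU, mixed older nodes

    apple_total = apple_socs_per_year * transistors_per_apple_soc    # ~4.8e18
    pc_total = pc_cpus_per_year * transistors_per_pc_cpu             # ~2.6e18

    print(f"Apple: ~{apple_total:.1e} transistors/year, PC market: ~{pc_total:.1e}")

Same order of magnitude, with Apple's share concentrated on the leading node.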
This hasn't been true for at least half a decade.
The latest generation of phone chips runs from 4.2 GHz all the way up to 4.6 GHz, with even a single core drawing 12-16 watts and multi-core loads hitting over 20 W.
Those cores are designed for desktops and happen to work in phones, but the smaller, energy-efficient M-cores and E-cores still dominate in phones because phones can't sustain the P-cores' power draw.
ARM's Neoverse cores are mostly just their normal P-cores with more validation and certification. Nuvia (designers of Qualcomm's cores) was founded because the M-series designers wanted to make a server-specific chip and Apple wasn't interested. Apple themselves have made mind-blowingly huge chips for their Max/Ultra designs.
"x86 cores are worse because they are server-grade" just isn't a valid rebuttal. A phone is much more constrained than a watercooled server in a datacenter. ARM chips are faster and consume less power and use less die area.
> So at this point, I think it’s actually the ARM ecosystem that has the larger R&D budget.
Apple doesn't design ARM's chips, and we know ARM's peak revenue and their R&D spending. ARM puts out several times more core designs per year, along with everything else you would need to make a chip (and they've announced they're actually making their own server chips). ARM does this with an R&D budget that is a small fraction of what AMD spends to do the same thing.
What is AMD's excuse? Either everybody at AMD and Intel sucks, or all the extra work to make x86 fast (and to validate all the weirdness around it) is a ball and chain slowing them down.
Minor thing, but I prefer ‘cultish’ to ‘cultic’ for your usage. In academia, ‘cultic’ means anything to do with worship and lacks the association with cults as discussed in this thread, whereas ‘cultish’ is how I usually see people adjectivize ‘cult’ in the way you are doing.