It doesn't matter how high-quality, convenient, or light they are: as long as wearing glasses isn't inherently cool, normal people aren't going to choose to wear them.
The parent was talking about people choosing to wear these. Today there might be reluctance to wear them because they're creepy or uncool. But that mirrors the reluctance for cool kids to wear bluetooth earpieces back when they were those chunky Borg-looking things. Then they got shrunk down. They got "high quality, convenient, [and] light".
When these types of glasses are virtually indistinguishable from regular sunglasses, and a critical mass of cool people wear them all the time, the reluctance from the rest of us will melt away.
Doubt. Apple doesn't see hardware sales as its primary revenue driver; rather, it's a rent-seeking company that makes money by being the iron-fisted middleman for the App Store. It sees no benefit in user freedom if that freedom makes it less money in the end.
One provision of the Sonny Bono Copyright Term Extension Act [0] (which expired 6 months after passage of the law) allowed an author's next of kin to revoke the sale of copyrights sold by the author, without recourse for the folks who paid for them. Allegedly, this was added by Disney in order to save hundreds of millions of dollars in a dispute over licensing Winnie the Pooh IP/rights [1].
Expect something similar when the next big author dies; my prediction: JK Rowling.
It's not a vocab problem. It's inherent to the human brain, which appears to be fundamentally wired to view the world in terms of stories, with heroes, villains, and a narrative arc.
You don't have to tell me - even Bill S: "and what's he then that says I play the villain?"
Unfortunately, the collective quality of our storytelling is waning. Most people watch lowest-common-denominator fare.
So now the greater human truth you allude to is being filtered through the streaming age mode of storytelling, and people have arcs, and bingo cards, and everything is reduced to water-cooler levels of urgency and relevance.
This isn't a new thing. Ancient stories like the Iliad or the Odyssey are loosely historical records of a particular region mixed in with the mythological foundations of a particular culture, but framed as the stories of Achilles ("Sing, O goddess, the anger of Achilles son of Peleus, that brought countless ills upon the Achaeans.") and Odysseus ("Speak, memory, of the cunning hero, the wanderer, blown off course time and again after he plundered Troy's sacred heights."). Likewise, ancient fables and parables are moral lessons couched in terms of stories with protagonists whose actions demonstrate the intended lesson, and this sort of thing is universal across every ancient culture for which we have records. Stories stick in the human mind, and they're what humans most prioritize transmitting forward through time.
Great person theory is hardly new either. I'm not sure what the point is. Really it's just "critical theory" for rich white men. I'm not convinced everyone, or even a majority, thinks this way innately. It's taught by people who want others to see them that way. Nothing here precludes that.
Depends on the bike. On some bikes the motor is mounted in the rear wheel, in which case there's no gearing between the motor and the wheel. On other bikes the motor is mounted between the pedals and its power is sent to the rear wheel via the chain, in which case shifting works as you'd expect. But the latter style (a.k.a. mid-drive) demands custom frames (because mid-drive motors are nonstandardized), which increases cost and decreases repairability. In contrast, rear-wheel motors can fit on virtually any frame, so they're much more accessible.
> would the same thing not have happened with the rise of high level languages?
Any argument that attempts to frame LLMs as analogous to compilers is too flawed to bother pursuing. It's not that compilers are deterministic (an LLM can also be deterministic if you have control over the seed), it's that the compiler as a translator from a high level language to machine code is a deductive logical process, whereas an LLM is inherently inductive rather than deductive. That's not to say that LLMs can't be useful as a way of generating high level code that is then fed into a compiler (an inductive process as a pipeline into a deductive process), but these are fundamentally different sorts of things, in the same way that math is fundamentally different from music (despite the fact that you can apply math to music in plenty of ways).
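To make the determinism aside concrete, here's a minimal sketch (with made-up logits, not any real model's output) showing that sampling from a token distribution is fully reproducible once the RNG seed is fixed. Determinism isn't what separates an LLM from a compiler:

```python
import numpy as np

def sample_tokens(logits, seed, n=5):
    """Sample n token ids from a softmax over logits, using a fixed seed."""
    rng = np.random.default_rng(seed)
    # Softmax with max-subtraction for numerical stability.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return [int(rng.choice(len(probs), p=probs)) for _ in range(n)]

# Toy 4-token vocabulary; the numbers are illustrative only.
logits = np.array([2.0, 1.0, 0.5, 0.1])

# Same seed, same "model" -> byte-identical output every time.
assert sample_tokens(logits, seed=42) == sample_tokens(logits, seed=42)
```

The point of the comment stands either way: reproducible sampling makes the *process* repeatable, but the mapping from prompt to output is still learned and inductive, not a deductive translation the way compilation is.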
Oh, I do frequently have biscuits and gravy topped with fried eggs, though the biscuits and gravy would definitely pull it further towards the flour end of the spectrum, maybe not quite in the dark region.
Also, hollandaise is pretty integral to eggs Benedict. I've had lots of variations, but the traditional one, with poached egg, Canadian bacon, English muffin, and hollandaise, is by far the best.