That's assuming that someone recognizes which strategy was responsible for failure. Blaming the competitors, or the employees, is probably much more comforting and easier for upper management.
> But, at least for me, the industry has done a decent job of competing with piracy on convenience.
It has, but the fragmentation that you mentioned is starting to seriously erode this progress. If the situation worsens on this front, I can definitely see a resurgence of piracy, again because of convenience.
Agreed. I'm paying for Amazon Prime, Netflix, and Hulu; I'm not paying for another service at this point. Not to mention UK content I can't get at any price. I also pay for a seedbox in another country to avoid local issues.
> This is why both android and ios are NOT the mobile os of the future.
I too would like a more open OS for mobiles, but you have to realize that the majority of users don't care about this. They are unconcerned about most privacy issues, never need to tinker at a low level with their devices, and they are the ones ultimately deciding what the future will be. What matters is the UI, and that the various features "just work".
You are in a minority, however vocal, that has little to no say in what the future of mobile OSes should be. My take on it is that whatever will replace android and ios will be even worse on these issues.
Most people don't realize how bad all sorts of things in their life are (dietary dangers, sedentary lifestyles, financial irresponsibility, etc.).
That doesn't make them less bad, or less concerning overall.
If you think things in the world should only change because of popular opinion & sales, then things like emancipation, environmental protection, worker protection, consumer protection laws, and other such things would never have happened.
> Most people don't realize how bad all sorts of things in their life are.
I'd say that most people are regularly bombarded with all these issues in the media. If they still don't realize, it's because they actively filter them out.
> If you think things in the world should only change because of popular opinion & sales, then things like emancipation, environmental protection, worker protection, consumer protection laws, and other such things would never have happened.
I never claimed that things shouldn't change, only that I very much doubt they will. It's on the opponents of Google/Apple to convince the majority that there are real issues and that measures should be taken. That's how democracy works, right? And I don't see that happening in the near future. Privacy? We have nothing to hide. Low-level customizability? We only need something that just works. Those are the popular answers that need to be countered.
I don't think it's only up to the market competitors to Google/Apple.
The market doesn't necessarily select against negative side effects, like privacy erosion or environmental harm. Shifts in these areas need to happen outside the market. Things like legislative protection of rights, the environment, and privacy are going to come from minority demands to claim those rights, not from hoping for majority market pressure against conveniences that sell well.
I wasn't talking about economic competitors, but about opponents of Google/Apple in the political arena (e.g. the EFF), who may lobby for changes. Hence my note about the democratic process. Minorities may make demands, but whether those demands pass does rest on the majority. Hence the need to sway popular opinion, which I don't see happening.
Most people I talk to have a sense that all of these big companies (and the government) are violating them, but they feel completely helpless and so ignore the issue. It's extremely similar to people in a third world country that know their foods are being adulterated but can't afford imports and still have to shop.
They only have a vague sense, but they don't understand the direct consequences of specific actions.
There's also a belief in the false idea that, "they already have all of my data, so there is nothing I can do about it."
It's arguable that one needs a mobile phone to participate in modern life (and the situation with mobile phones requires fixing as well). It is not arguable that people need to be able to talk to a box that makes them a coffee or turns on a light. I think that most people would not choose the IoT device if they really understood how it works.
The majority of users are not tech-aware security engineers. The majority of users (myself included) don't want to spend hours of our lives effectively specializing in android/ios under-the-hood design, and/or browsing shady blogs and forums looking for ways to root our phones.
Don't underestimate the domain knowledge that you've acquired, and the amount necessary to root a phone, let alone take control of linux embedded in a television. It's about more than just not knowing that we're being spied on. It simply isn't efficient to expect so many people, who have already specialized according to their professions and interests, to also take on this level of domain specific knowledge.
I agree -- people shouldn't have to learn all of that. If the system were designed better, then there would be no need. Right now, it's open season on people who don't understand how things work (and even on many people who do understand how things work). So the system needs fixing.
> If you want to advocate cryonics for the masses, think about what you would do if you had to live in a world with the reanimated masses of the 18th century.
Sounds like Riverworld. I think it could be quite interesting. But anyway, whether it's a potential mess or not is something for our future descendants to decide on.
Advocating cryonics today doesn't imply that our descendants, even if they have the technology, will accept the resurrection of masses. We have immigration laws today, I'm sure the same concept could be thought up for "people outside their own time", if it ever is an issue.
Or I could just accept my mortality and let my corpse go to some productive use, like organ donation or teaching medical students, rather than investing in some narcissistic notion of being resurrected.
As romantic as "coming to peace with mortality" might sound, I enjoy living life too much to idly accept that I will at some point feel the breeze in my hair for the last time, or never again smell a flower's perfume, or gaze on a beautiful view. Sure, if everything fails, I will have to accept it. But why not give my everything to the mission of taking life further, even if I can help only by an atomic amount? What if there were as many great minds working on solving aging as there are working on cancer research? Why not try and bring a contribution? What if everyone else capable had the same realisation? Why sit idly and wait to be carried by waves to nothing at all?
You could do both: organ donation is compatible with brain preservation.
Perhaps some are motivated by narcissism, but I'd like to think my fellow cryonicists are motivated by hope, and a belief that the future will be a warm and welcoming place. If the future is narcissistic, I wouldn't want to be revived anyways.
From my own experience, a few months ago, on Linux mpv was still the better choice (more performant, better image quality, some more polished/reliable features like subtitles) as long as you don't mind the minimal UI (or use a front-end).
It's based on Brain Endurance Training (BET), by Samuele Marcora of the University of Kent, who is quoted in the OP's article (for his definition of effort).
> Why dress up / acquire certain things, in order to conform to some random pre-designated style?
Why speak English and not invent your own language? Clothing and appearance are (also) communication. And to be understood, some of the signals must be common enough.
I belong to this community, to this generation, to this area/country, to this social class, etc... It can also project ideologies, affiliations and aspirations. Once upon a time for example, in Western Europe, some intellectuals adopted the Mao suit. It was a very strong political statement.
Moreover, in many cases I doubt your assertion that these styles are random. There is a history behind most of them, and that history is what imparts meaning.
I doubt clothing could be a full human language. A human language is a remarkable thing. Our language ability is one of our chief adaptations as a species; with it I can conjure the precise and bizarre notion of a purple bus-driver giraffe with those few mere phonemes.
I would imagine we must have some systems for interpreting information about other humans from their accoutrements, but I would also imagine that they surely cannot be as highly developed as our language, as the technology to have different 'styles', or even clothes at all, is relatively recent. As a bit of anecdotal evidence, I present that any healthy human growing up around at least one other human will develop a spoken language, or adopt the language of their peers. In contrast, there are examples of people growing up in societies with clothes and style who do not develop any particular style sense.
So I posit that this style language is relatively limited, and only certain pre-arranged messages determined by society can be communicated by it. It also seems extremely prone to misinterpretation: both parties must be aware of, and agree on, the pre-arranged signals for it to work.
Which is why I personally don't even consider it. With such a limited and error-prone language, how could I accurately communicate anything meaningful, or personal, or that I particularly value, about myself? And why would I want to blast out this signal to all and sundry?
> Which is why I personally don't even consider it.
So, assuming you're a man (bear with me if you aren't): are you so open-minded and rational that you do consider women's clothes in your shopping too? Do you look at women's watches and glasses too? Do you often wear pink?
Here is my guess: you don't. You may think you're completely disconnected from the social cues inherent in clothing/accessories, that you consider clothes as purely functional. But chances are, in practice, you really don't. Signalling your gender "to all and sundry" as you put it, is one of the major non-verbal cues that clothes provide. It's part of those "random pre-designated styles" and is taught from a young age, and understood by everyone. Your tastes reflect that, whether you consciously acknowledge it or not.
You don't see men in robes in modern Western cultures, for example; whatever their potential comfort/utility/beauty, they simply aren't considered for men in day-to-day life, even though there were periods and cultures where robes were perfectly acceptable menswear.
I'm not the person you were answering, but I subscribe to similar utilitarian views and must say: if it weren't shunned by society to the point of risking a beating, I wouldn't mind wearing a skirt in summer. In fact, I already wear some plain beige skirts at home because I find them far more comfortable than pants, and the airy feeling is nice in hot climates.
It's far too dangerous to wear a skirt outside as a man, so I wouldn't do it. Personally I don't care much for clothing beyond physical well-being/comfort/utility, but society is very judgmental and at times very dangerous about certain choices.
Yeah I agree with all that.
In terms of what other people think, I'm just stoic about it; that is outside of my control. In terms of what I wear, I have always gotten more than enough clothes as Christmas presents from various relatives to keep me decent.
Though quite recently I started mountaineering and discovered the specialized clothing for that. It's the first time I've ever been interested in clothes; the feature list on them greatly surpasses anything else, e.g. a jacket which is water-resistant, breathable, wind-resistant, tear-resistant, very light, flexible and comfortable, affording a full range of motion, with lots of pockets and adjustable vents. Though I am a small man and actually bought a women's jacket because it fit better and was cheaper.
> I agree with you that keeping the file metadata in a separate fork is far superior to keeping the file metadata in a three character extension
Since we're talking about Win95, the 3-character limit doesn't apply (long filenames were a major new feature of this OS, after all). .jpeg and .html, for example, were relatively common at the time and worked fine.
I find the extension system still kludgy, but arguing that it was worse in part because of the limited pool is incorrect starting from Win95.
Yet for the same compatibility reasons most people stuck with .jpg and .htm, at least in the Windows world. Even now it's very unusual to see a .jpeg extension on a filename.
The metadata that Macs kept in the resource fork went way beyond file type and creator too. It included things like the file's icon, creation/modification information (so it would survive a trip over the Internet!), loads of stuff for applications (menus, graphics, sounds, etc.), formatting for plain-text documents (so they fall back to plain text on unsupported systems), and so much more.
Fun fact: NTFS supports the concept of a resource fork on files, but almost nothing in Windows uses it. I think I've seen more malware hiding stuff in there than legitimate uses in the wild. Worse, even in the obvious case of loading a Mac file on a Windows machine, it usually fails and falls back to creating a clunky separate directory instead.
NTFS does not have a resource fork in the MacOS sense, nor extended attributes in the Unix sense. Instead it allows a file to have multiple named contents that are accessible through the same file I/O API (in essence, the file can behave like a simplified directory). There is no distinction between data and metadata stored this way. In the late 90s MS even intended not to use the OLE compound storage file format (i.e. what the Office 97/2000 formats are built on) on NTFS drives, and instead write the objects into separate streams (reportedly it was not implemented because Windows would then have to somehow transparently reconstruct the compound storage when you copy such a file to a non-NTFS drive or upload it to the internet). Today, apart from malware hiding, the only major use of multiple streams is the "this file was downloaded from the internet, are you sure you want to open it?" prompt, which stores the internet-ness of the file in a secondary stream.
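That "internet-ness" stream is named Zone.Identifier and holds a tiny INI-style payload. A minimal sketch, assuming the documented `[ZoneTransfer]`/`ZoneId` layout; the `parse_zone_id` and `read_zone_id` names are my own, and the actual stream read only works on NTFS under Windows:

```python
# Sketch of reading the Mark-of-the-Web from an NTFS alternate data stream.
# On Windows, a stream is opened by appending ":Zone.Identifier" to the
# file path; the parser itself works anywhere.

def parse_zone_id(stream_text):
    """Extract ZoneId from a Zone.Identifier stream (INI-like format)."""
    in_zone_transfer = False
    for line in stream_text.splitlines():
        line = line.strip()
        if line.startswith("["):
            in_zone_transfer = (line == "[ZoneTransfer]")
        elif in_zone_transfer and line.startswith("ZoneId="):
            return int(line.split("=", 1)[1])
    return None

def read_zone_id(path):
    """Windows/NTFS only: read the Zone.Identifier alternate data stream."""
    try:
        with open(path + ":Zone.Identifier", "r") as f:  # ADS path syntax
            return parse_zone_id(f.read())
    except OSError:
        return None  # no such stream, or not on NTFS/Windows

# ZoneId 3 means "Internet" -- the value that triggers the warning prompt.
print(parse_zone_id("[ZoneTransfer]\r\nZoneId=3\r\n"))  # -> 3
```

On other filesystems the stream simply doesn't exist, which is exactly why the marker silently disappears when a file is copied to FAT or uploaded somewhere.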
Conceptually the MacOS Resource Fork is basically a directory where all of your filenames have to be exactly 4 characters long. The only difference is that each "file" might be a stack of "files". So you might have a CODE resource that has multiple CODE segments in it.
One thing I loved about old MacOS apps is opening them up in ResEdit and seeing so much of how the thing was built.
> Conceptually the MacOS Resource Fork is basically a directory where all of your filenames have to be exactly 4 characters long. The only difference is that each "file" might be a stack of "files". So you might have a CODE resource that has multiple CODE segments in it.
Sort of. It would be more accurate to say that each filename was a 4-letter type code and a 16-bit ID. Each resource could also have a name, but that was less frequently used (and didn't have to be present, let alone unique).
More importantly, resource forks didn't exist in isolation. They were loaded into a chain of active resource files -- for instance, while working with a Hypercard stack, the resource chain would include the active stack, the Home stack, the Hypercard application, and the System suitcase. A stack could use resources (like icons or sounds) from any of those sources.
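The lookup described above can be sketched as a toy model: resources keyed by a (4-character type, 16-bit ID) pair, searched through a chain of open resource files, most recently opened first. The class and method names here are illustrative, not the actual Toolbox API (the real call was `GetResource`):

```python
# Toy model of the classic MacOS resource chain: each resource file maps
# a (4-char type code, 16-bit ID) pair to data, and lookups walk the chain
# of open files from most recently opened down to the System file.

class ResourceFile:
    def __init__(self, name):
        self.name = name
        self.resources = {}  # (type_code, res_id) -> bytes

    def add(self, type_code, res_id, data):
        assert len(type_code) == 4 and 0 <= res_id < 65536
        self.resources[(type_code, res_id)] = data

class ResourceChain:
    def __init__(self):
        self.chain = []  # most recently opened first

    def open(self, res_file):
        self.chain.insert(0, res_file)

    def get_resource(self, type_code, res_id):
        for rf in self.chain:  # first match wins, like the Toolbox search
            data = rf.resources.get((type_code, res_id))
            if data is not None:
                return data
        return None

# A HyperCard-like chain: stack -> application -> System.
system = ResourceFile("System")
system.add("snd ", 1, b"beep")
app = ResourceFile("HyperCard")
stack = ResourceFile("MyStack")
stack.add("ICON", 128, b"custom icon")

chain = ResourceChain()
for rf in [system, app, stack]:  # opened oldest first, searched newest first
    chain.open(rf)

print(chain.get_resource("ICON", 128))  # found in the stack itself
print(chain.get_resource("snd ", 1))    # falls through to System
```

This shadowing is what let a stack override a standard icon or sound just by carrying its own copy, while still inheriting everything else from further down the chain.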
The filename extension is the bare minimum of metadata for a file and not easy to extend.
Even then, Unix systems will skip that minimal metadata and force you to messily search for magic numbers at the start of the file and make a guess.
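That guessing game looks roughly like this. The signatures below are the well-known ones for PNG, JPEG, and gzip; real tools like file(1) carry databases of thousands, and the fallback to "unknown" is exactly the guess the comment above complains about:

```python
# Minimal file-type sniffing by magic numbers at the start of a file.

MAGIC = [
    (b"\x89PNG\r\n\x1a\n", "png"),   # 8-byte PNG signature
    (b"\xff\xd8\xff", "jpeg"),       # JPEG SOI marker plus next byte
    (b"\x1f\x8b", "gzip"),           # gzip two-byte signature
]

def sniff(header):
    """Guess a file type from its first bytes; 'unknown' means start guessing."""
    for signature, kind in MAGIC:
        if header.startswith(signature):
            return kind
    return "unknown"

print(sniff(b"\xff\xd8\xff\xe0" + b"\x00" * 16))  # jpeg
print(sniff(b"not a known header"))               # unknown
```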
http://archive.org/wayback/available?url=http://yorktownhist...
It returns JSON with, notably, the closest archived snapshot to the given timestamp.
https://archive.org/help/wayback_api.php
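A small sketch of using that API, with the request shape and JSON layout following the help page linked above. The function names are my own, and the response parsing is factored out so it can be exercised without a network call:

```python
# Query the Wayback availability API for the snapshot closest to a timestamp.
import json
import urllib.parse
import urllib.request

def closest_snapshot_url(response_text):
    """Pull the closest snapshot URL out of an availability API response."""
    data = json.loads(response_text)
    closest = data.get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"]
    return None  # nothing archived (the API returns an empty dict)

def query_wayback(url, timestamp=None):
    """Network call: ask archive.org for the closest snapshot of `url`."""
    params = {"url": url}
    if timestamp:
        params["timestamp"] = timestamp  # YYYYMMDDhhmmss, may be partial
    api = "https://archive.org/wayback/available?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(api) as resp:
        return closest_snapshot_url(resp.read().decode("utf-8"))

# A canned response in the documented shape, for offline demonstration.
sample = ('{"archived_snapshots": {"closest": {"available": true, '
          '"url": "http://web.archive.org/web/20060101/http://example.com/", '
          '"timestamp": "20060101000000", "status": "200"}}}')
print(closest_snapshot_url(sample))
```

A live call would be `query_wayback("example.com", "20060101")`, which returns the archived URL or `None`.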