Chrome ships new features fast, and some of them are very cool. Because of course! "Embrace, extend, extinguish" doesn't work unless the "extend" part is full of shiny, delicious candy. And platform makers can make lots and lots of shiny candy when freed from pesky things like standards.
It's hard to believe we're having this discussion again. Did we learn nothing from Internet Explorer 6? IE6 damn near ruined the web because it had 95% market share, was objectively terrible, and it encouraged lock-in with its nonstandard behavior and "convenient" proprietary features.
Now, of course Chrome isn't terrible like IE6. And that's both good and dangerous... because Microsoft might have "won" if IE6 had been any good at all. Luckily, IE6 was so terrible that Mozilla was able to win back a sizeable portion of the web with a far superior product, and FF's market share was big enough that people had to start paying attention to web standards again.
No such savior in the form of a "far superior product" is likely to exist this time around. Chrome's pretty darn nice, and as long as it doesn't suck as bad as IE6, we'll never see a direct competitor that eclipses Chrome in the way that FF1.0 surpassed the god-awful IE6.
That means we, the developers, are the only thing between an open web and a dangerous Google monoculture. Support web standards, not proprietary lock-in. Develop for the web, not for Chrome. Stick to the standards and it's pretty easy. And fun.
They blatantly ignore standards, the most obvious of which is their treatment of autocomplete=off in forms[1]. They broke the ability to disable autocomplete, and since then have been intentionally breaking workarounds people find to actually turn off autocomplete. This has been a major pain in the butt at work.
Before you yell at me about password managers or whatever, we don't use this on our login form: We make an app that collects some sensitive data that it is very pointless to autocomplete, and we've had user complaints about this very issue, but there's nothing we can do about it because Google unilaterally decided they know better than us.
> They blatantly ignore standards, the most obvious of which is their treatment of autocomplete=off in forms[1]. They broke the ability to disable autocomplete, and since then have been intentionally breaking workarounds people find to actually turn off autocomplete. This has been a major pain in the butt at work.
My solution to that is simple. I have a microservice that tells my services how to get autocomplete=off automatically. This microservice determines the required values and ids by scraping the search box on google.com every hour, and extracting the values of the tag matching input[name=q].
That way I can fully automate autocomplete=off, and ensure it works.
Indeed it is, but sometimes it’s necessary. I absolutely only use it for search boxes where I provide custom history for previous entries, and custom suggestions. Without autocomplete=off, these are horrible to use.
And this was the only way I could figure out that would guarantee that it would be disabled, even after browser updates.
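(For anyone facing the same problem: a more common, if equally fragile, workaround is to give each field a nonsense autocomplete token, since Chrome has historically skipped autofill for tokens it doesn't recognize. This is undocumented behavior that any update can break; the helper below is just a sketch, and the `nope` prefix is an arbitrary choice.)

```javascript
// Sketch of a common workaround: Chrome ignores autocomplete="off",
// but (at the time of writing) it tends not to autofill fields whose
// autocomplete value is an unrecognized token. Undocumented behavior;
// may break in any Chrome update.

// Generate a throwaway token like "nope-x7k2f9" so the browser's
// autofill heuristics find no matching field type.
function randomAutocompleteToken(prefix) {
  const suffix = Math.random().toString(36).slice(2, 8);
  return `${prefix}-${suffix}`;
}

// In a browser you would apply it to every input in the form, e.g.:
// document.querySelectorAll('#search-form input').forEach((input) => {
//   input.setAttribute('autocomplete', randomAutocompleteToken('nope'));
// });

console.log(randomAutocompleteToken('nope'));
```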
FWIW, the HTML spec says that autocomplete=off means either 1) the input shouldn't be autofilled or 2) the input is for sensitive information. It doesn't require the browser to handle it only as the first case.
> "When an element's autofill field name is "off", the user agent should not remember the control's data, and should not offer past values to the user."
Seems pretty clear to me. The only exception I see is:
> "A user agent may allow the user to override an element's autofill field name, e.g. to change it from "off" to "on" to allow values to be remembered and prefilled despite the page author's objections, or to always "off", never remembering values."
But that's user-initiated action, not something the browser should do for every field just because it feels like it.
And it's right to be wary of Apple. They have many of the same negative incentives as Microsoft had: if the web outshines their proprietary app platforms, then what good are those platforms?
However, out of the two behaviors ("embrace and extend" versus "slow standards adoption") I think that Chrome's "embrace and extend" is the one that's actually a threat - and to me, that's what really made IE dangerous: it was a threat to the web.
Safari's pace of standards adoption is merely annoying. I'm a developer too; I get it -- I want to use the cool new shit! But Safari's not going to break the web in the way that proprietary browser lock-in could break it.
I agree that Google's Chrome tactics can be perceived as evil; I won't discount that, because it causes issues when you are forced to use a particular browser.
But if your client runs their entire business on iOS, you can't even complain about Chrome to them. And if their browser doesn't properly support something as simple as offline HTML5 features, and you need this across multiple platforms (because they have a few laptops too), then you can't use a browser-based platform. Or if you do, you have to live with many compromises.
This is how I see Apple working. If the market didn't force them to upgrade Safari, I think they never would. They'd just say "browsers are for html, and the Apple App Store is for everything else".
But my client just wanted their stupid form to work offline. At least Chrome can do this; the extra stuff it adds doesn't _limit_ what is possible.
And I don't think many of these features are "new and shiny"; I think they are fundamental to the web now. If they were merely new and shiny, then Firefox would have had its ass handed to it by the big boys. But instead, we see MS and Apple falling behind a simple non-profit in the web standards support area.
Remember, Apple has more money in the bank than any company ever in the history of the world. It's a choice they are making not keeping up with web standards.
So the question has to be asked: what benefit does Apple get from keeping web standards support so weak on its locked-down platform? (any answer besides forcing devs into their store?)
When do you suppose this will end up in the Safari in iOS 10? I suspect never. Since my clients literally have thousands (maybe 10s of thousands) invested in devices that will never get any updates past iOS 10, they are forced into a hardware upgrade because Apple won't allow 3rd party browsers on their devices, and tie their browser version to their OS version.
If you see another way around this, I'd really like to know.
> When do you suppose this will end up in the Safari in iOS 10? I suspect never. Since my clients literally have thousands (maybe 10s of thousands) invested in devices that will never get any updates past iOS 10, they are forced into a hardware upgrade because Apple won't allow 3rd party browsers on their devices, and tie their browser version to their OS version.
I obviously don't know your situation, but the newest iPad where iOS 10.x is its last operating system is the 4th gen iPad, which was introduced October 2012 and was discontinued October 2014. It shipped with iOS 6, so it's been through 4 major upgrades.
You may not like it, but that's about the general lifetime of computing devices today regarding their upgradeability. It's not like there are 5-year-old Android tablets running the latest operating systems from Google either.
Apple’s modus operandi has been the same for the 10 years of iOS devices: all of the new features go into the latest version; the previous version only gets security updates and they don’t back-port those features to the previous operating system.
Mobile Safari 11 uses APIs and frameworks (like the machine learning for Intelligent Tracking Prevention) that only exist in iOS 11, which is why it won’t be ported to iOS 10.
In most mainstream work/production environments, 3-4 years is the useful lifetime for computers and the iPad is a computer. I did this for a living at MIT; I dealt with these exact issues for 14 years.
I get that it's convenient to blame Apple for not allowing 3rd-party web engines on iOS, but that's really the cover story, right?
The key issue here: iOS 10 was the last version to run on 32-bit A6 processors, which is what your iPads have. iPhones and iPads with 64-bit A7s (and newer) can run iOS 11. Unfortunately, you got caught in this hardware transition.
If these iPads are mission-critical for something, then there should have been some device lifecycle planning when the project started so you wouldn’t end up in a situation like the one you’re in.
BTW, both Google and Firefox stopped supporting 32-bit operating systems years ago, so even if iOS allowed 3rd-party browsers, you still wouldn’t have the option of running something that had today’s latest features like Service Worker anyway…
>It's not like there are 5-year old Android tablets running the latest operating systems from Google either.
Yah, you have a point with the hardware being old and 32-bit. I am not an Apple guy by nature, so I have older hardware.
My issue is not with the OS, it's that I am blocked from installing specific software on that OS. Imagine if Windows didn't allow Chrome to be installed, everyone would be up in arms.
>Imagine if Windows didn't allow Chrome to be installed, everyone would be up in arms.
No pun intended, but this isn’t an apples-to-apples comparison—you can’t compare a desktop operating system to one that runs on phones and tablets.
I’m a web developer—I have the release, beta and canary (nightly) of Chrome and Firefox in addition to Safari and Safari Tech Preview on my iMac, which is no big deal on macOS or any other desktop operating system.
It’s very different on phones and tablets, which are closer to embedded operating systems.
This also enables Apple and others to lock down security in ways that would be unacceptable for a desktop operating system. Given the unrelenting hacking from the kid down the street to nation states—China, N. Korea, Russia, Iran—more security isn’t a bad thing.
Apple isn’t getting a pass—at least not from normals.
It’s the supposedly aggrieved and paranoid libertarian techies’ narrative that Apple is getting away with something and their rights are being impinged upon.
Sure, I can accept that generally speaking there is a difference between phone/tablet and desktop. But how about Android vs iOS. I can install Firefox on Android, but not on iOS. (It "looks" like FF, but it's Safari with a skin, same with Chrome.)
Can you offer a valid explanation on why it's technically not possible for a real Firefox browser to work in iOS, but works fine on Android?
Short answer: Google and Apple have different priorities when it comes to security, privacy, performance and power consumption. Same thing with Mozilla but to a lesser degree.
It’s also not accurate to say Firefox and Chrome are just “skins” over Safari. Sure, they have to use WebKit via various system APIs but they also add their own features, some of which Safari doesn’t have like support for Google Assistant and a built-in QR code scanner for starters.
Heck, Google Chrome on iOS supports the Payment Request API while Safari doesn’t, which means there are web platform features Google implemented that WebKit doesn’t have. Why are you complaining again? ;-)
Same for Brave, which does lots of things Safari doesn’t.
Remember, Chrome and WebKit share a common ancestral codebase; the vast majority of a site’s HTML and CSS renders exactly the same anyway. So it’s not like Google or users like you are actually missing out on anything of substance, other than some misplaced sense of being restricted from shooting yourselves in the foot because you can’t use a browser engine that’s slower and consumes more power than what Apple ships—ditto for Mozilla.
For the overwhelming majority of the browsing anyone does, it makes no difference. It’s just a manufactured grievance of a vocal minority of Apple critics.
I think you are dodging the central claim against Apple: they prohibit competition on their platform. If Microsoft or Google did this, they'd be accused of anticompetitive behavior, or of being a monopoly.
In fact, MS actually did this with IE vs Netscape. Maybe you are too young to know or remember this. It's just astonishing that anyone could ignore the parallels.
Also, I asked if there was any "technical reasons" Apple couldn't allow real Firefox in iOS, and you ignored that. I suspect because you know the answer is "no". So, then it's purely for marketing reasons.
Please explain how it's better for end users to have only one choice of browser? If you want to say they already do, then you do not understand how browsers work. And if you want, I can do some googling for you to show you why FF and Chrome on iOS are _not_ any different at the core level than mobile Safari.
> In fact, MS actually did this with IE vs Netscape. Maybe you are too young to know or remember this. It's just astonishing that anyone could ignore the parallels.
I was doing IT at MIT when the Microsoft/Netscape thing went down—I’m not new to any of this.
This is going to be my last response on this topic, since this has devolved into a political and ideological thing—I can’t help you with that.
This post sums it up for me [1]: I think this thread is full of people who want Apple to be considered a monopoly more than they care about whether it actually is one.
> I think you are dodging the central claim against Apple: they prohibit competition on their platform. If Microsoft or Google did this, they'd be accused of anticompetitive behavior, or of being a monopoly.
Nope.
Again, this is an ideological argument, not a technical or legal argument.
They aren’t prohibiting competition—there are over two million apps on the App Store[2], including ones by every company that is considered a competitor like Microsoft, Google, Mozilla and others.
But here is a legal position: there’s nothing illegal about determining what they will and will not allow in their App Store and what they will and will not allow on their platform, especially when the companies agree to it when they sign the contract with Apple.
It’s only tech ideologues and free software zealots that think that their rights are being violated because Apple doesn’t permit other web rendering engines other than WebKit. There’s no legitimate technical or legal argument that can be made that users are somehow suffering due to this.
If that’s your deal, that’s fine; but don’t act like it’s the same thing as Microsoft/Netscape because it’s not.
I wrote about why this isn’t like Microsoft, IE and Netscape in the thread [3].
Here’s the simplest way to make this plain:
* Microsoft had 95% of the desktop operating system market back in the day. Apple has around 20% of the global cell phone market[4]
* Microsoft was accused (after signing a consent decree with the US government saying it wouldn’t do this) of using its natural monopoly in operating systems to affect emerging markets, such as the browser market
* Apple doesn’t have a monopoly—natural or otherwise; it hasn’t been under investigation by the Federal Trade Commission or the Department of Justice—Microsoft was
* Apple has direct competitors on the App Store in every category it has apps and services for: maps, music players and services, camera, file sharing
* Google makes more money from iOS than it does from Android [5],[6]
Last thing: like it or not, Safari is the fastest and most energy-efficient browser engine that could exist on iOS, because unless you’re Apple, there’s no way for a 3rd party to have the knowledge of the firmware, custom processors, GPUs, etc. that would be required to do what Apple is already doing with WebKit.
It’s not like the HTML, CSS and Javascript rendering is significantly different than what the other engines do, so there’s no compelling technical reason that there should be 3rd party engines, other than to satisfy their critics and zealots.
There might be some there there if Google and Mozilla boycotted iOS because they weren’t allowed to use their rendering engines and otherwise made a big deal about this. Because they must be on the most profitable mobile platform in the world, they’re admitting that they’re okay with the situation as it is, even if you guys are not.
So in reality, your issue is with Mozilla and Google, who’ve left you guys hanging and don’t have your back on this.
I know you have a lot of reasons to say this doesn't matter, but why is it a real world problem then for a lot of people? Are these people imagining the issues with Apple products and their browser?
Does Firefox suck up battery so badly on Android that it couldn't possibly be written to run on iOS properly? (technical issue, not ideology) Your arguments are not technically sound. I never mentioned "rights", you did. I merely pointed out that if others copied Apple, they'd be accused of violating rights. (not an issue for me personally)
You present red herrings and twisted some of my comments, but you are right, it's not worth continuing, as neither of us can change anything.
The same applies to Google. Let's not forget that the bad experience of web apps compared to Android native ones meant that it is now also possible to run Android native apps on ChromeOS, and that they took the effort to implement something like Java Web Start (Instant Apps) for Android.
I don't know how "understand everything there is to know about everyone" is substantially less evil, especially when entirely avoiding Google is effectively impossible.
I should have picked a better word! "Outwardly" probably wasn't the right word there. Google is less... bald-facedly evil, perhaps? Less overtly evil?
Microsoft's hardline "screw the open web, and buy a Windows and Office license every n years while we ensure we have no competitors by any means necessary" modus operandi in the 1990s was just so blatantly unfriendly.
IE literally refused to implement standards, rendered broken DOM elements, allowed proprietary native extensions, and had many different APIs for them. It had 90% of the market share, and there were no other viable platforms. This meant developers built things that exploited these broken features, causing huge compatibility issues everywhere the features didn't exist (every other browser).
A simple lack of features didn’t make IE... IE. Safari is a pain in the ass but it sure as fuck ain’t no internet explorer.
I think Chrome is propagating lazy developers who think everyone uses their browser, which means things only work in Chrome and nowhere else. It’s not as bad as the IE situation yet, and hopefully it won’t be, with Firefox kicking ass again.
Apple is doing similar stuff by refusing to implement the payment request API and instead implementing their proprietary Apple Pay API. So you can use one API to target Chrome and Firefox and another for Safari. This is starting to sound a bit like IE. They were invited to join the working group for Payment Request and decided not to take part.
Chrome doesn’t implement asm.js (EDIT: I had previously written WebAssembly, but that was a typo), concurrent JS, autocomplete=off, the Geolocation API over HTTP, etc.
> allowed for proprietary native extensions and had many different apis
NaCl, PNaCl. Try running http://earth.google.com/ in a browser other than Chrome. Or the Google Hangouts Video Chat.
> They had 90% of the market share and no other viable platforms
Chrome has reached >67% of the global market share.
> This meant developers built things which exploited these broken features causing huge compatibility issues when they aren’t (every other browser).
See above mentioned Google Earth, Google Hangouts, the early releases of Google Inbox, Google Allo, and WhatsApp Web, as well as the early releases of Signal Web.
> It’s not as bad as the IE situation yet
See above why it is just as bad as the IE situation.
Oh, and "IE was preinstalled" – Chrome runs malicious, misleading advertisements everywhere to get users to install Chrome, and when that wasn’t enough, they started paying companies to secretly install Chrome with their installers (same as what the Ask Toolbar or BonziBuddy did – except now it’s Google offering 30 million EUR to VLC to include it, and that project denying the offer and publishing that info).
Chrome is implementing a standard that pretty much all other modern browsers are also implementing. For example, service workers - implemented by Chrome initially and pretty heavily integrated into Android/PWAs - but Firefox and Safari are also shipping this feature currently. And also most features, such as service workers, can be effectively used with progressive enhancement, still providing a working experience to the user when they are using an older browser. Most sites I've seen that say "Must be used in Chrome" or whatever really have no reason to do that. There is more nuance to the progression of the modern web platform than the OP article gives off.
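As a concrete illustration of that progressive-enhancement point, registration can be gated on feature detection, so unsupported browsers simply fall back to a normal online experience. This is only a sketch: `/sw.js` is a placeholder path, and the `env` parameter stands in for the browser's `navigator` object so the decision logic can be exercised outside a browser.

```javascript
// Sketch of progressive enhancement with service workers: register
// the worker only when the browser supports it, so older browsers
// (e.g. pre-service-worker Safari) still get a working, just
// non-offline, experience. '/sw.js' is a placeholder for your own
// worker script.
function registerServiceWorker(env) {
  if (env && 'serviceWorker' in env) {
    env.serviceWorker.register('/sw.js').catch((err) => {
      // A registration failure should degrade gracefully, not crash.
      console.error('Service worker registration failed:', err);
    });
    return 'registered';
  }
  // No support: the page still works, just without offline caching.
  return 'unsupported';
}

// Simulate an older browser with no serviceWorker API:
console.log(registerServiceWorker({}));  // → "unsupported"
```

In a real page you would call `registerServiceWorker(navigator)` and keep all offline features behind the same check.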
Service workers are one of the most egregious failures on their part. And I believe it's been done on purpose: with service workers you get full control over building your JS-based app, something Apple does not want; they want everyone to go through their store.
Consider that the only way (moving forward) to create an offline JS app is through service workers, and Apple is only just starting to get it working, _maybe_ for the next version of Safari.
What does this mean? It means that it may take one or two more _years_ of _hardware_ updates to get iOS users able to use a javascript feature in their browser.
As far as I know, iOS and Safari are knitted together the same way MS tied Windows to IE. Try to get a new version of Safari that doesn't match the version of iOS you are running.
I could go on about the state of local storage and other HTML5 debacles on Safari, but it's a total loser and I am only going to make myself upset.
Yah, I keep tabs on that stuff. But most of my clients have been using Apple hardware for a few years now, and even if this update goes out tomorrow and all iOS 11 users get it, iOS 10 users and below are out in the cold, and that accounts for 99% of my users.
And you know what it's like trying to use a feature even when 70% of users can take advantage of it, it's still a no-go. So Apple totally blew it on this one.
Apple has unusually high adoption (compared to Android) for new versions of their OS, I am not too worried. You can ship a service worker and it will not degrade the experience for users on older browsers.
Since Service Workers were enabled in STP as of a couple of days ago, the next upgrade to Safari in a few months will likely have it—it’s not 1-2 years away as you suggested.
I’m pretty sure I’ll be running PWAs on my iPhone 5s the first half of 2018.
> IE literally refused to implement standards, rendered broken dom elements, allowed for proprietary native extensions and had many different apis for them. They had 90% of the market share and no other viable platforms.
All of those things are true of iOS Safari as well, except that on iOS you literally can't use any other engine so 100% of the browsers on Apple phones and tablets are restricted by whatever limitations or flaws Safari brings.
I sometimes feel like Chrome is the new Internet Explorer too! There have been a number of GSuite features from Google that don't work on Firefox. Google Hangouts for one! Also on that list was U2F auth and some others. Here we have Google products that only work on the Google browser. What does that sound like? Maybe Microsoft and all their ActiveX IE shenanigans in the past!
The moral of the story, which I think the article highlighted very well, is that the web platform is based on standards. Build to the standards. If you don't like the standards, advocate for new ones! It's hard work but it's how we got where we are and how we will continue to have a web 100 years from now.
That is backwards. Nearly everything in the web platform began as a proprietary extension, including JavaScript and XHR. Standardizing something that has never been used in real life is a bad idea. Convincing the other browser vendors to implement your new idea is usually impossible. So insisting that only standards ever be used would result in no progress at all.
Virtually all new major browser features are built according to standards, proposals, or explicit extensions platforms. See https://www.w3.org/TR/, where it's very easy to map recent browser features to standards or healthy proposals. Your proselytizing would be more appropriate in Redmond, Washington in 2007.
I'm not sure what gives you the impression that successful standards are developed in a vacuum. You're right that some are, but most of those are terrible technologies that were designed by the "hard work" you think got us where we are today.
The rest -- the good ones -- are battle-tested through gradual deployment and iterative improvement, eventually emerging nearly unrecognizable from their original vision, but genuinely useful and usable.
Sort of. It's true that Firefox doesn't enable U2F by default (yet), but that's actually unrelated to the reasons that it doesn't work on Google domains. The real reason that Google's U2F only works in Chrome is that they rely on non-standard implementation details: https://twitter.com/ManishEarth/status/931534674224062464
For contrast, you can use U2F on Fastmail and Github in Firefox, because they don't rely on non-standard behavior.
I believe this now works on FF, according to an HN post yesterday.
>If you don't like the standards, advocate for new ones!
I think the problem with this is it means that I can't provide something useful until the standards body updates, which could be a cadence of years. In that timeframe, a competitor will break the standard.
Every major browser vendor has a proprietary extensions platform. It's a good thing; it enables innovation and solution-building without needing to leave the web platform entirely.
Chrome, Firefox, and Edge have Native Messaging. Safari's extensions platform uses "native APIs and familiar web technologies."
In short, nobody is breaking web standards, because they didn't submit their changes to a standards body. Because it's not standardized, it doesn't count as breaking standards.
Now, if they did submit to a standards body, and it got approved, it would be standardized. Therefore, it doesn't count as breaking standards.
Hey, if you want every browser to be identical, that's your right. I happen to think that competition among browsers is a good thing, and that means that they'll always be at least a little bit different. Shouldn't it matter which browser you use?
No major browser vendor (in the modern era, at least) wants to break interoperability -- that's the bright line that separates innovation from fragmentation -- and you'll find that each works hard to stay on the right side of that line, often slowing their rate of innovation to do so.
(And no, saying extensions platforms don't count isn't just language-lawyering, as I think you're trying to suggest. Most extensions platforms, particularly ActiveX and old-style Firefox add-ons, are deeply dependent on the host OS or the specific user agent's implementation, or else they operate in contexts that just don't make sense in the web's origin-based security model. I wouldn't go so far as to say it would be impossible to standardize an extensions platform, but standardizing one would necessarily give rise to a separate platform (the extension to the extensions platform) that was host-OS-specific or user-agent-specific. As I said earlier, there are legitimate needs to get stuff done that can't and shouldn't be done on the web, but that don't require abandoning all web technologies. Maybe you're right that the intent of ActiveX was to kill the web. That doesn't mean that it didn't also solve real problems that the web couldn't solve at the time and still can't today.)
Yes, in terms of performance, user interface preferences, features like bookmark syncing, and so on.
But visiting any page should work in any browser. If proprietary extensions get used, then the pages are no longer able to interoperate. You now have a proprietary ecosystem.
If these proprietary extensions catch on, now you're back to the bad old "best viewed in IE6" days.
I'm not sure you're considering the fact that the web ecosystem itself is in competition with other platforms. The counter to your "If proprietary extensions get used" hypothetical is that if they don't get used, the world moves on, getting the job done with fully proprietary and/or platform-specific solutions -- Windows applications, Android apps, etc.
People don't sit back, put their needs on hold, and wait for the web platform to develop and implement new standards. They use available tools. Would you advise them to leave the web ecosystem entirely? Or use the lesser evil of extensions platforms, thereby solving their urgent problems and indirectly providing long-term direction to the web platform's evolution?
Ehh, I use Safari as my main browser (it feels more "native" than Chrome & FF, has seemingly lower CPU usage than Chrome, and is WAY better for the battery), and I rarely suffer for it. Once in a while there is a site that does something fancy that requires Chrome, but it is rare.
Precisely this. It’s not that the WebKit/Safari team doesn’t implement new features/standards (they usually do), it’s just on a different schedule.
I don’t know anybody on the Safari team but I suspect that they prioritize very differently than the Chrome team does. One is all about impressing web developers with the latest and greatest where the other is more interested in having resource efficient implementations.
Same experience here. Safari is my daily driver for casual browsing because it feels smoothest and integrates well with the OS, iPhone, etc. The only times I can remember it being a problem are when I see a bleeding edge tech demo posted somewhere like HN.
Flexbox isn't a bleeding-edge technology, and it's supposed to work properly on Safari, but it really doesn't. Annoying as hell, and garbage on their part, to be honest.
This doesn't really refute what the author is saying; the main thrust behind the website and the blog post the site references is that Safari doesn't adopt new features Chrome does, which is basically the exact thing which the author of the article we're commenting on is saying is a bad position to take.
I have no horse in this race, since I like Safari when I'm on Apple products and I will happily use Chrome or Firefox or IE/Edge depending on what's available, but I agree with the author of the primary article here; we shouldn't put one browser up on a pedestal just because of release cycles or cool projects it does and let it dictate how the web should look and behave.
That's my issue. I've basically learned (as a developer) to not use any new features in Safari until the version after they come out.
IndexedDB was broken so badly it literally wouldn't work; flexbox is much better now, but I wouldn't have called it "supported" by Safari for a while; and now, while their WebRTC APIs work, they are really flaky and have a lot of extra "restrictions" tacked on that aren't really explained anywhere (like they won't work in a WebView, or if the page is bookmarked on the home screen).
I like what they are trying to do, and I absolutely think there is a place for a browser that is fast, low on resource usage, and more stable, even if it means it's single-platform, slower to add features, and doesn't have as many customization options. But Safari is falling short of those goals while still having those downsides, and on one platform it's the only browser that is allowed.
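The defensive approach described above (don't trust a new Safari feature until it has proven itself) often boils down to feature detection with fallbacks. A minimal sketch follows, with the caveat that the mere presence of `indexedDB` doesn't prove it works (broken Safari builds still exposed the object), so real code should also probe by actually attempting an open. The backend names and the `env` parameter are illustrative only.

```javascript
// Sketch of picking a storage backend by feature detection rather
// than assuming IndexedDB works everywhere. `env` stands in for the
// browser's `window` object so the logic can be tested outside a
// browser; this presence check is the first gate, not a full probe.
function pickStorageBackend(env) {
  if (env && env.indexedDB) {
    return 'indexeddb';     // preferred: large, async, structured
  }
  if (env && env.localStorage) {
    return 'localstorage';  // smaller quota, but widely reliable
  }
  return 'memory';          // last resort: in-memory, lost on reload
}

console.log(pickStorageBackend({ localStorage: {} }));  // → "localstorage"
```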
Anyone who says this doesn't remember what made IE notorious back in the day.
None of the browsers now are anything like that.
Apple's approach with Safari is exactly what the author of the linked article said; they are slower to implement standards. When they do, I usually have no problems with them at all.
I really don't agree at all with this, Safari is the new Internet explorer ...
https://www.safari-is-the-new-ie.com/