Hacker News
Demoted (daringfireball.net)
125 points by joao on June 6, 2011 | hide | past | favorite | 105 comments


Gruber is repeating the obvious, but in a convoluted way.

Apple is about selling HW. Google is about selling ads.

That's it. Everything else follows from that.

Google's goal is to commoditize HW, and this is, at least theoretically, doable. Apple can't really commoditize ads, at least in no way I can think of.


The past is not always a good predictor of the future, but I think 10 years ago most people would have said that PC hardware was commoditized. You had your choice of a beige box running Windows from HP/Dell/Gateway. Apple has done a pretty good job of uncommoditizing it.


Exactly. A lot of Apple's big announcements today were about features that create lock-in. Your music is tied to a $25 iTunes subscription; your reading list, email, calendar, to-do list, and office documents are tied to iCloud; they even want your text messaging tied to your iOS devices. Then there are all those apps you bought on the Mac and iOS App Stores that you'll never be able to transfer to another system.

If you start using enough of these new features, switching away to Android, WebOS or Windows will be a nightmare.


  > Your music is tied to a $25 iTunes subscription
But it is not.

  > they even want your text messaging tied to your iOS
  > devices
Another way to look at it: your text messaging is freed from your carrier.

  > Then there's all those apps you bought on the Mac and iOS App
  > Stores that you'll never be able to transfer to another system.
How is that different from any other system?


Seriously? Before the unified App Store concept, cross-device upgrade pricing was very common. You get Word for Mac one year, and then you pay a fraction of the price to upgrade to Word for Windows the next; you get a medical dictionary for your Palm Treo one year, and then pay a fraction of the price to upgrade to the same app for your new BlackBerry the next... Apple wants to control the business of software sales, and in so doing control what those practices mean for its platform: developers are kept from seeing their customer lists, forcing everyone into a new kind of "lock-in" to the platform that has never before been common.


Your examples seem to be pretty small corner cases. I'll add one though: when I got the Adobe Lightroom beta, I got executables for Windows and Mac. And with my upgrades, I still get both executables. But the elephant in the room at the time was that the vast majority of software was being sold into Windows boxes, and the only upgrade path was up. The lateral transfer schemes were many and varied, but all trivial compared to the big pipe called Windows XP.


On mobile, these "corner cases" are the majority of sales: every year there is some "new king of the hill", from random Nokia or Motorola devices, through Palm, Windows Mobile, and BlackBerry.

And, even on desktop with our example: Windows itself was the lock-in, not the sale; if a competitor to Windows did show up, the apps would rapidly move (and were in fact already selling both executables). If the sales were caught up in a monolithic App Store, users would balk at developers' inability to give them upgrade discounts, and would be much more likely to stick to their platform.


I was just aiming for the cornerstone of your argument:

> cross-device upgrade pricing was very common

Perhaps from within the industry having a lateral path was a common denominator, something every company had, and in that definition, it may have been "common", but not from a consumer perspective. The opportunity to buy PC software for a Mac was quite unusual for a consumer. There are still plenty of full time Windows folks thinking about getting Macs and asking their friends "Will I be able to use <app> on it?". Maybe not in SF, but most of the people I work with (and we're talking doctors, military planners, etc) are still in a Windows world, at work and at home.


My argument mentioned Windows and Mac only because most people are more comfortable with that example. As end-to-end App Stores have only become popular recently on mobile, I only consider this conversation to currently apply there. However, the premise of the Mac App Store also /assumes/ a non-Windows-dominated world /by definition/, so I fail to see why you feel your comment attacks my argument... :(.


Are your reading list, email, calendar, to do list, and office documents not tied to the google cloud? If you use enough of them, isn't switching to yahoo/msn/apple going to be a nightmare? There doesn't appear to be much difference to me.

Also, minor point, your music is not tied to a subscription. The syncing is. Cancel the subscription, keep the files.


All those things you mention are exportable, and yahoo/msn/apple could write an importer against these APIs to make it easy.


The Data Liberation project is at least a token attempt to ward off accusations of Google lock-in:

http://www.dataliberation.org/


This is a striking point. Up until now, switching costs off of the iPhone were basically a matter of dealing with AT&T. Once the iCloud tentacles are in deep enough it will be nearly impossible to say goodbye to all the licensed content that I only can access through Apple devices.


How does iCloud make it harder to move away from iOS? All of my apps are already locked into iOS, regardless of iCloud. This is true of any OS though.


I'm not tethered to my apps... I can, if necessary, move to new apps. Once I did everything in Emacs, but I have moved on.

I AM, however, tethered to my data.


Though it depends on Apple's licensing terms, apps that work on multiple platforms have an incentive to allow you to export the data they sync with iCloud. They want to keep you as a customer if you jump ship to Android.


    > A lot of Apple's big announcements today were
    > about features that create lock-in.
Nonsense. It's only lock-in in that the user is locked into a feature that's really great and useful. In other words, no lock-in at all.

Their music and their content haven't gone anywhere. Their contacts and calendar events are easily shared and exported via standard formats. The only thing they lose by leaving is the great software.


Agreed, the key here is that everything is easily shared and exported via standard formats. If that's the case, there is no lock-in.

The fact that Windows/Android/Linux doesn't have a feature or tool you can import the data into also doesn't mean you have lock-in. It just means the market is a little inelastic, that's all.


  It's only lock-in in that the user is locked in [--] In other words, no lock-in at all.
I really don't understand what you're trying to say here. Vendor lock-in doesn't only apply when you don't like the product. Now, if there were no other products you could switch to, that would invalidate lock-in, but it's reasonable to assume that the product being initially good is essential for lock-in to work at all.


No one is forcing you to use Apple's messaging platform. You can pay for SMS instead, if you feel that's the "free" option.

No one is forcing you to upload your music to the iCloud. You can keep syncing it the same way you always have (afaict, anyway).

No one is forcing you to buy an iPhone, iPad, or any other Apple product. What kind of lock-in are you on about?

No one is even forcing you to upgrade to Lion. Keep running Snow Leopard if you want.

If you don't like it, don't buy it. There are plenty of other vendors. The only way in which you could claim that Apple was "locking people in" is that they offer an integrated experience, so that if you go all-Apple, things work better.

Well, damn them, they should be shot! How dare they try to offer an integrated user experience to their paying customers? What bastards!


  If you don't like it, don't buy it.
That's not what vendor lock-in is about. Lock-in means that after you bought it, you can't reasonably switch to something better, something that commonly you didn't know about or that didn't exist when you made the initial buy. For example, when offering integrated services, lock-in might mean that they won't allow a third party to integrate with their suite.

http://en.wikipedia.org/wiki/Vendor_lock-in


Well, I just bought car X, but then car Y came out and I like it better, but they won't allow me to switch without buying a whole new car!

I guess car makers practice vendor lock-in too by your definition.

In fact, almost anyone who sells something practices some form of lock-in.


No. For the car example, lock-in means you aren't able to use off-brand accessories, or service your car in an unlicensed shop. However, cars are mostly self-contained, so this mostly only comes up with aftermarket stereo systems and membership programs.

And yes, mild lock-in is very common. That doesn't make it a good practice from the consumer's point of view.


Considering how hard it is to get people to agree on interoperability standards even if they want to agree, I think lock-in is simply inevitable.


10 years after release, iPhoto still cannot export a complete set of user-created photo metadata. 10 years and counting.


It does title, keywords and location. What else does a casual photographer need?


This is a very narrow understanding of what businesses do. They don't think of things that lock in users. They think of features that people will want to use: solutions to people's problems, even ones they didn't know they had.

So if you're good enough at that, people have trouble giving up your platform or service. Calling it lock-in is just casting good features and service in a negative light.

"Oh, Apple does great to replace things when they break, but it's just lock in. It's not actually good service."


Creating switching costs is in fact a regular market strategy and is certainly more than just doing something so well that people will have trouble giving it up.

Facebook has switching costs because all your friends are on it, not on another network, plus you'd have to learn a whole new set of privacy settings, right? The Kindle has switching costs because all your books are in your Kindle library and you can't transfer them to the Nook. Moving has switching costs because you have to give people a new address, learn a new neighborhood, memorize new driving routes, etc.

Strong businesses often have very natural switching costs built in.

http://en.wikipedia.org/wiki/Switching_costs


There will always be a switching cost, even just remembering a new URL.

Surely companies do things to make the switching cost higher. Like as you said, locking ebooks to a particular device.

But creating a service or feature that is only on your platform just means you're doing something no one else is doing; making you more attractive to consumers (and creating a higher switching cost).


I think one of the biggest reasons you're being down-voted is that you're confusing the act of creating solutions to problems with what lock-in actually is.

Arguments can be made that because it costs more to switch from one platform to another, that is lock-in. IMO though, lock-in specifically refers to acts where a company intentionally tries to prevent people from moving away from their product or platform, most often manifesting as closed data formats or an inability to extract data that you yourself deposited in a system.


All the features Apple announced today are features that people have been demanding from Apple and any other vendor. What's the lock in?

Putting DRM on music so you can only play it on iPod is lock in.

Though I agree Apple is integrating services into iOS that are only available on iOS. So I imagine that gives them lock-in.

I don't want to start trying to guess at Apple's motivations, but I don't see them coming out with demanded features as intentional lock-in, as it appears to be their solution to the problem.


>> What's the lock in?

I never said there was lock-in with this new update, I was pointing out what I perceived to be one of the biggest reasons your comment was down-voted.

From the brief bits and pieces I've read about this, I've heard of nothing so far that smells like lock-in. And I'm by no stretch an Apple fanboy either.


I was responding to your comment while restating my point. Not arguing with your comment. :-)


Sure, thinking of good features is the first step. But then isn't the next step figuring out how to make money?


If you have all the features, you make the money on the hardware that carries them. But describing things only as "lock-in" is a pretty meaningless comment. It implies that all a feature does is increase switching costs without actually offering anything to the user.

But it would only increase switching costs if it WAS an attractive feature. Features that are not useful or interesting to users aren't going to affect switching costs, or lock-in.


Counterexample: Company X using a proprietary file format for their office suite. This is neither useful nor interesting to users, but definitely creates a lock-in.


Except that ads are not a core aspect of anything. Businesses don't need (many) ads if they can get targeted access to customers through other means (e.g. Facebook, Groupon). Applications don't need ads if they can get funding by other means (e.g. micropayments, app store dollars), and customers don't need ads, especially if they can become aware of interesting products in other ways (forums for the like-minded, such as reddit). Ads as Google sells them may be a declining market to dominate.


It almost sounds like your conjecture is that, in the long run, Apple is destined to lose this.


If Google wins, hardware will be secondary, and Apple--that thrives on selling hardware--will decline. And it's possible that that will happen...it's also possible that it won't. Apple isn't destined to lose anything. It's just that they have the potential to lose much more than Google does--even if everyone uses Apple hardware, Google can still serve ads to a certain extent.


If Google wins, Apple can still sell commoditized hardware to a certain extent.


No, Apple has never been able to sell commoditized hardware.


I'm sure if the iPhone ran Android they would receive a fraction of their current sales.


Apple's goal isn't to commoditize ads, but to commoditize the software that runs on their hardware. And they've done an excellent job of that with the app store - with hundreds of thousands of apps available.


I think that the point was that Google's revenue stream doesn't seem to be easily commoditizable, while Apple's (at least theoretically) is.

update: Downvotes? In my HN? I was just clarifying what I thought the original author was getting at, which wasn't what the parent post to mine was arguing against. If you have a disagreement with what I say, then argue against it, rather than just down-voting. Then maybe we can both learn something, instead of "I disagree with you, and I want to let people know that, but I don't want to expose my ideas to possible ridicule so I'll lurk in the background."


The point of commoditization is to commoditize your complements. It's clear that hardware and software are a complement to ads, and software is a complement to hardware. Are ads a complement to hardware?


It's not necessarily a comparison of how Apple and Google will 'attack' each other. Just a statement of their situations. Apple's primary goal could be commoditized out from under it, while Google's is less likely to see that happen.


I don't agree. It's not just about money. Both companies are more complex & emotional than that. Apple really cares about creating well-designed technology and Google really cares about indexing the world's information.

It's in their DNA.


True but there is one crucial difference: people have to want a) Apple products and b) products advertised by Google. No end user actually wants ads.


People don't want ads--but they do want hardware, and software, that's better than a mere commodity.


I think Google is more about collecting data. Ads are just a byproduct.


I think you're confusing the end with the means. Collecting data is a method Google uses to be able to sell (more expensive) ads. The data collection is not where the money comes from, the ads are.


It may be where most of the money comes from at the moment. But Google did not start out as an advertising company, and I don't think (from reading "In The Plex") they really want to be one, either. If they could kill ads, they would do it.


Hardware is already commoditized, but hardware+design is still Apple's monopoly. The thing that can change that is desktop manufacturing.


And Gruber is about selling Apple. grin


Google’s frame is the browser window. Apple’s frame is the screen. That’s what we’ll remember about today’s keynote ten years from now.

Yes, but for some reason I don't interpret that as an advantage for Apple. There will always be more ways to access a web page or web app, than devices/form factors that Apple will produce.


Having used the Cr48 and a 3G iPad for several months in the US and Japan, I think Apple's vision is more robust and ultimately better for me, the end user. When I'm mobile, whether I'm doing sysadmin, reading, or listening to music, I can do more of it, in more locations (especially offline), with better, more reliable apps, on an iPad.

edit: reposted here from a comment on the webian group:

===

I wrote this a few weeks ago, seems relevant as the mobile space heats up with WWDC announcements.

http://wherein.posterous.com/i-review-the-cr48

bottom line: don't forget about international travelers, spartan environments, and that whole, wacky "offline" idea. If you want to be relevant 50 years from now on Mars, where the light-speed round trip to Earth takes minutes, offline use remains fundamental.


> offline use remains fundamental

Absolutely agree. The ChromeOS platform is absolutely focused on offline capability as well. A huge number of apps within the Chrome Web Store work offline without any connectivity. Many regular web apps work offline. All of Google Apps will within a few months.

I think many people forget that you can take web apps offline, perhaps because launching a mobile browser while you have no connectivity feels so awkward. Anyway it's just a myth to be dispelled, and some remaining evangelism work to be done. (Disclaimer: I work on the Chrome team or whatever.)
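To make that concrete: the mechanism that took web apps offline in this era was the HTML5 application cache, where the page declares a manifest listing resources the browser should store locally. A minimal sketch (filenames are hypothetical, and the file must be served with MIME type text/cache-manifest):

```
CACHE MANIFEST
# app.appcache -- resources to store for offline use

CACHE:
index.html
app.js
style.css

NETWORK:
# everything not listed above requires connectivity
*
```

The page opts in with <html manifest="app.appcache">; after the first online visit, the browser serves the listed files from its cache whether or not there is connectivity.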


> A huge number of apps within the Chrome Web Store work offline without any connectivity. Many regular web apps work offline.

Could you name some for me? SourceKit died so many times at that conference I felt physically ill. Switching to qemacs was an easy decision. MagicScroll for my books never really worked.

Seriously, I would like to know because the Cr48 form factor, battery life, weight, temperature and speed would be great in Japan, but for me it was a paperweight, while my roommate brought his iPad2 and was loving life by comparison. I'm not talking about being offline for a few hours. Try weeks. Sure, the military provides me plenty of wired network access, for ports 80 and 443, for sites they believe are appropriate (like Fox News, but not NPR). But if I'm in the field, not at headquarters unit or near a WiFi hotspot, the Cr48 has only been truly useful for qemacs. In my experience.


I tested this for four days at a local orthopedics convention (well, local for me). With 30,000 users in a convention center, no wireless was truly reliable. I lost a lot of notes before turning to qemacs.


Do you mind posting some pictures of your Cr48?


So, if the next fight will be apps vs web (or screen vs browser), what does this mean for us hackers? And what about Facebook?

Already we create a web app, a mobile version, an API, a Chrome extension, a Firefox extension, a Facebook app, not to mention mobile and tablet apps for anywhere between 1 and 6 platforms.

Isn't this all getting particularly spaghetti-like? Weren't we supposed to have platform convergence, not platform divergence? There's a good opportunity here for "write once, deploy everywhere" code/platforms, and methodologies that maximise DRY while playing to each platform's strengths.


Divergence = competition = innovation.

The worst thing that could happen is One Platform To Rule Them All, if that platform is controlled by a company for its own interests.


> The worst thing that could happen is One Platform To Rule Them All, if that platform is controlled by a company for its own interests.

The web is not controlled by one company.

Edit: For the downvoters, the HTML5 APIs are made of contributions from Mozilla, Microsoft, Opera, Yahoo, Apple, Google and many more. How many companies participate in writing the iOS API?


Exactly. Web standards - or standards in general - are not a player in a competition; they are a field upon which to play. We want everyone playing on that field.


Thank you. I couldn't have stated it better.


There's a good opportunity here for "write once, deploy everywhere" code/platforms

Working on it... http://www.webmynd.com


The various platforms you mention really serve different purposes and users use them with very different intentions.


Apple’s is about native apps you run on devices. Apple is as committed to native apps — on the desktop, tablet, and handheld — as it has ever been.

Google’s frame is the browser window. Apple’s frame is the screen.

I don't see this conclusion following from his premise. Perhaps there was more in the keynote that he didn't include.

Or maybe I am unimaginative and think installing native apps on all of the devices I use regardless of who owns them, or where they are located, or where my discs are is harder than installing a browser based web app or using an HTML5 website.


I don't think this follows either. Google provides many native apps on Android. The real difference is that cloud-based computing is natural for Google but a fundamental shift for Apple.


Except that Android app sales don't pay out to Google, they pay out to Verizon whereas iPhone app sales pay out to Apple.

Google does support a ton of native apps (GMail, Calendar, Places, Goggles, Latitude, Maps, etc.), but their strategy going forward seems to be to push these things from apps into the browser, spending a ton of time supporting the mobile browser version of all of these things (except for Goggles).


I've never heard that before. I think Google gets the money. Verizon probably gets a cut just for the carrier billing option when used.

I just did a search and couldn't find anything saying Verizon gets the money for Android app payments.

Also, I can't think of a single Google Android app that has been discontinued and replaced with a web version. I think you're getting confused by Google also offering mobile web versions for platforms they can't get their apps on. They've even said they'd like to offer a better Maps app for iPhone but can't.


This was in October 2008:

http://android-developers.blogspot.com/2008/10/android-marke...

--------------------------------

"Developers will get 70% of the revenue from each purchase; the remaining amount goes to carriers and billing settlement fees — Google does not take a percentage. We believe this revenue model creates a fair and positive experience for users, developers, and carriers," notes Eric Chu.

--------------------------------

This statement was at the time interpreted to mean that Google was buying market share from the carriers.

I don't know if the sentence "Google does not take a percentage" still stands, or how the transaction fees are split today.


An issue of Wired from, I think, two months ago claimed that the current payout was 30% to Verizon and 70% to the developer, with Google getting no cut from app sales. Obviously their information could have been dated as well.


I'd phrase it as "Apple's frame is their screen" - those native apps run in a lot of places, but they don't run everywhere.


"Google’s frame is the browser window. Apple’s frame is the screen."

So by extension Google's applications are more cross platform, and Apple's aren't?


In theory this is restrictive and bad, but when every platform you want to use is Apple's it's a different story.

When Apple stops making what I consider to be the best computers, the best phones, the best tablets, and the best way to monetize my coding skills, then I'll probably be just as upset as all of the "openness" advocates.


I think the argument goes more like: When somebody else starts making better phones, computers, blah blah... You, and many other users will be so far locked in, you won't be able to switch.

Remember Microsoft?


  > When somebody else starts making better phones,
  > computers, blah blah...
What would be your guess: when will you be able to start editing a document on a Windows machine at work, tweak it some on your tablet at home, and then show it to someone on your phone, without hitting "save" and "export" at all?

Even if Dell starts making better computers than Apple, HTC better phones, and HP better tablets, Apple would still have a huge advantage: the seamless experience.


This has been possible for a long time with Google Docs. Saving is not necessary, updates are almost real-time, and it works on all modern platforms.


This is already possible, try Etherpad.


Native vs. web is a false choice: it's not like one wins and the other loses. They're both better in different ways. Do native for the things where it's a net win, and do web for the others where web is a net win. And in terms of capabilities, the line is blurring further, especially from the web side.


"Google’s frame is the browser window. Apple’s frame is the screen."

I don't think it's going to matter. Google is putting a lot of work into 'promoting' the browser frame to the whole screen.


You've been able to do that forever with full screen browser modes, of course. Flash can do full screen, too. And then of course Google has done the ultimate, with an OS that only fits a browser window.


    > Google’s frame is the browser window. Apple’s frame is the screen.
No, Apple's frame is Apple's screen. That makes a huge difference.


"Google’s frame is the browser window. Apple’s frame is the screen. That’s what we’ll remember about today’s keynote ten years from now."

So... what happens when you maximize the browser window?

The actual issue is Google wants to be accessible from any device, Apple wants to sell you their custom hardware.

"But Google’s vision is about software you run in a web browser. Apple’s is about native apps you run on devices."

Other than that Apple wants to sell you hardware, there are only technical differences between those two things.


> So... what happens when you maximize the browser window?

And what happens when you minimize it?

That is Gruber's point. Google wants to make the (their) browser the OS. Apple likes native applications. At least, that's John Gruber's larger narrative. According to him (and I agree), ChromeOS and the Chromebook feel more Google-y than Android.


I don't think Apple is about native so much as Apple is about making the best integrated services for devices and selling those devices. Apple isn't native-oriented so much as device oriented. It's just that native apps fill a large and indispensable niche right now.


This feels wrong. Why wouldn't you want to have your programs and data stored locally (with, eventually, a cloud backup)? This reminded me of http://jacquesmattheij.com/No+User+Serviceable+Parts+Inside , only now it applies to data and software as well.


I seem to have a lot of native apps on my Google phone.

Could somebody point to a summary of Google's cloud strategy? I don't know which services are supposed to be limited to browser windows.


Sometimes I wonder how much Apple is paying this Gruber guy.


Is it too much to ask to have a Gruber article without the token "If Gruber loves Apple so much, why doesn't he marry it?" comment?

We get it. Gruber is pro-Apple. Can we focus on what Gruber is actually saying now?


I'd like to focus on that, but it's so lost in his over-the-top attitude of adulation that it's difficult to discern what the actual message might be. Google makes apps for browsers, Apple makes native apps... so what? Can you explain it?


You don't have to give the preacher money to evangelize, there is a "greater" motivation involved.


"That’s what we’ll remember about today’s keynote ten years from now."

Classic Gruber. I really hope that no one that I know ever mentions a WWDC presentation from a decade before.


Oh, I don't know. The 2001 Macworld keynote that outlines the "digital hub" is still fascinating:

http://www.youtube.com/watch?v=9046oXrm7f8

It's remarkably prescient in a lot of ways. Take this quote:

Digital cameras now constitute 15% of all cameras sold in the US. Fifteen percent. It'll be 50% in a few years.

He also makes the point that a hub is necessary because of all these various devices can't talk to the Internet.


> Digital cameras now constitute 15% of all cameras sold in the US. Fifteen percent. It'll be 50% in a few years.

Really? That's the same thought I had the first time I saw a usable digital camera 10 years ago, and I don't think I'm special or that I was the only one...


Do you have a link to your keynote speech? The one where you committed your publicly-traded company to this strategy?


It's truly remarkable that a $60B company is able to make these sorts of bets and execute on them. It's easy for smart individuals to make offhand remarks, but it is different for a company to rally around a single idea like this.

To be fair, Apple isn't the only company that does this. Google is remarkable, as well. Virgin is another. I'm sure there are many others in many other industries that take big gambles for 'game-changing' or earth shattering results.


Go and read photography magazines from that era. There was much heated and vehement objection to including articles pertaining to digital photography and cameras. Many photographers at that time were convinced digital cameras were just toys and had no part in 'professional' magazines.

Of course, some thought the opposite.


Classic disruption. When the new tech isn't very good and gets a foothold in new markets, the professional users (correctly) view it as inferior, rubbish, a toy. The big picture question is whether it could improve enough to be usable by professional users. It's kind of odd that many are oblivious to the obvious trajectory of improvement.

EDIT: e.g. 1 (from Worse is Better): could AI become good enough? A tricky one, since they aren't competing with another product, but with us. OTOH, computers are constantly replacing people for simple tasks - the history of computation is the history of mental automation. e.g. 2: could PHP become good enough to replace Java in the Enterprise (apparently their target)?

Something that can stall the trajectory is the next wave of disruption breathing on their heels.


Great presentation foreshadowing the iPod.


Many of the Mac sites have been running retrospectives on past Stevenotes leading up to this one. I ended up re-watching a number of them, and in particular a Q&A session Steve did when he returned to Apple (when Gil was still CEO, and the Clone Saga was ongoing) was extremely interesting. He basically blasted the Newton's usability and described the problems the iPhone solves, and that was just about 5% of the total talk.

Link: http://www.youtube.com/watch?v=3LEXae1j6EY


What Apple didn't really address is the growing socializing of data and in what context it is consumed, not just by the data's owner, but by their social graph. Without enabling native apps to do this, cross-platform apps will have a lot of room to grow.


1 in 4 iOS apps is a game. Game Center just got some socially oriented enhancements.





