It's funny how all these things that have been in Android forever and were criticized for their battery-draining powers are suddenly the latest and greatest reason why Apple phones are the best. It's just the way it goes. Remember 7-inch tablets? Worst thing ever, and everyone listed reasons why. Until Apple made one. Market share means everything! Until Android stole that; now market share is a false signal. King of profits! Until Samsung took that too. Keep moving that goalpost.
You forget a few things. On Android can't the app just run forever and decide on its own how often to poll the servers? This is a different thing. The OS decides when to schedule things if the app supports it.
Also battery life has improved a great deal since the first Android or iPhone device came out. What may have been true years ago isn't necessarily an issue anymore.
You mean like using Android's SyncAdapter and calling setInexactRepeating()? Android has supported this basic type of background scheduling since the start of time. The main difference is that on Android you can do more - which can be abused. And their solution for abuse is a battery monitor that lets you see which app is using the most battery. In a way you can say Android is more powerful, or you can say that iOS is more limiting and thus potentially more user-friendly.
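For the unfamiliar: setInexactRepeating() lets the OS shift an alarm's firing time so alarms from many apps can be delivered in the same device wake-up. Here's a toy illustration of the bucketing idea in Python - my own simplification with a made-up bucket size, not Android's actual algorithm:

```python
def inexact_fire_times(requested_times, bucket=900):
    """Snap each requested alarm to the end of its 15-minute bucket so that
    alarms from different apps land on shared boundaries and the device
    wakes fewer times. Times are in seconds; the bucket size is an
    illustrative assumption."""
    return sorted({((t // bucket) + 1) * bucket for t in requested_times})

# Four alarms requested at slightly different times collapse onto two
# shared firing points: two wake-ups instead of four.
print(inexact_fire_times([100, 400, 850, 1000]))  # [900, 1800]
```

The tradeoff is obvious: an app gives up precise timing in exchange for the whole system spending less time awake.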
Unfortunately, I think many Android app developers either don't know or care about "API abuse". When my Android phone is sitting "idle" with the display off, adb logcat still shows tons of debug logging from apps doing more work than they should. I like the freedom that Android gives developers, but, as you suggest, I think iOS provides a better user experience.
I can think of many ways Google could fix some of this bad behavior by throttling apps running in the background.
Bad developers do have that power. But best practice is to use a SyncAdapter, which works very similarly.
Actually, one of the biggest savings comes from apps all syncing at about the same time... firing up the cell modem is expensive, cheaper to do a bunch of transfers at once than spread out over time.
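The saving is easy to see with a toy energy model: the radio pays a fixed "tail energy" penalty each time it powers up, plus a small per-transfer cost. The numbers below are made up purely for illustration - only the ratio matters:

```python
# Toy model of cellular radio cost. Illustrative numbers, not measurements.
RADIO_WAKE_COST = 10.0   # cost to spin the radio up and let it idle back down
TRANSFER_COST = 1.0      # cost per app's actual data transfer

def energy(sync_times, window=1.0):
    """Energy used when syncs within `window` seconds share one radio wake-up."""
    total, last_wake = 0.0, None
    for t in sorted(sync_times):
        if last_wake is None or t - last_wake > window:
            total += RADIO_WAKE_COST   # radio must power up again
            last_wake = t
        total += TRANSFER_COST
    return total

spread = energy([0, 300, 600, 900])   # four apps syncing 5 minutes apart
batched = energy([0, 0, 0, 0])        # four apps coalesced into one wake-up
print(spread, batched)                # 44.0 vs 14.0 with these numbers
```

Same four transfers, roughly a third of the energy, because the wake-up penalty is paid once instead of four times.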
iOS will actually disable these features if the battery is getting low. It will also group them together so that the wireless chips are on as briefly as possible.
Great - another option. I prefer my phone deciding when to do this automatically (and maybe an option to disable it so I can run it into the ground if absolutely necessary).
I prefer magic, too, if it works. I doubt it does in this case, though, and I get the impression Apple knows it hasn't nailed it yet, either. At least, elsewhere in this thread people note Apple's stuff can be overridden on a per-app basis.
I also get the impression that Apple's idea of the perfect balance between flexibility and "it just works" leans more towards the latter than Google's ideas about it, as witnessed by some remarks that Google's sync adapters, which app developers _can_ use, are exactly the same as Apple's solution, which app developers _must_ use.
While I do agree that iOS isn't nearly as flexible as Android, let's just say that 90% of the time you might want to go with a SyncAdapter versus rolling your own solution. Also, I believe that Apple has the best intentions at heart when it comes to implementing Background Fetching in its current spec. I mean, let's face it, it's way better to have a mobile-optimized OS manage resources than for you to do so, don't you agree?
Feature X is neither the worst thing in the world ("it's ridiculous that Android did it!"), nor an unalloyed good ("it's completely ridiculous that iOS didn't immediately do it!" - as whatever the opposite of Apple fanboys would have said; hateboys?).
Rather, it's a feature with some negative repercussions as well as benefits - some possible tradeoffs.
Apple, for better or for worse, these days, esp on iOS, generally chooses not to jump into such features right away. They wait until they can expend the developer time on them to consider all the trade-offs and all the choices and different decisions that could be made in implementing that feature. And then they spend that time, and try to implement the feature in a way that minimizes downsides, maximizes upsides, and minimizes technical debt too.
Doesn't mean they always get it right. And this strategy itself, even when gotten right, has its own pros and cons -- there are certainly reasons, especially for hackers, to prefer Android's wild west over Apple's walled garden.
But there are evaluations that are neither "Android was stupid to implement it 5 years ago" nor "Apple was stupid NOT to implement it 5 years ago."
Maybe you should reflect on why some people who are not fans of Apple fanboy commentary are _also_ not fans of comments that exist for no purpose except complaining about Apple fanboy commentary, in vague complaints about how "those guys always say this, but look, now they say that!"
We are not fans of apple fanboy commentary, AND we don't care.
What's really funny is how right you are, and all the comments made in response to you are proving that the reality distortion field is still in effect post-Jobs. When a competitor does it and Apple says they won't, it's the worst thing in the world. Once the competitor proves its worth and Apple does it, it's now the best thing ever (see xutopia). If you're right but with one hint of an arguable point, there's always some detail that means you're wrong and Apple is right (see simonh and nsxwolf). And if you're indisputably right, you're a fanboy whiner (see epo).
No one can possibly imagine that there could be an alternative (but still correct) choice if that wasn't the choice they themselves made.
I'm going to call you out on this one. You're dismissing perfectly viable criticism as "fanboyism".
The mechanisms are wildly different:
On Android the app developer chooses whether to be a good citizen or not, and can, if he puts in the effort, use the SyncAdapter (Bobz mentions this in the thread). The SyncAdapter afaik provides the same benefits as the iOS method.
On iOS the app developer can only provide the hooks for the OS to call and the OS decides when to call them. This is similar to how SyncAdapter works from what I can tell.
So Android provides developers with the capability to shoot themselves in the proverbial foot. We can argue whether that's a good or a bad thing, but calling people fanboys isn't conducive to a good discussion.
I didn't call anyone a fanboy, I was quoting epo during the only time I mentioned that word. What I brought up was the reality distortion field. What you're describing is Apple doing the same thing as Android but with the only difference being developers have to use the equivalent of SyncAdapter. That's not wildly different, that's literally been baked into Android since long before Apple decided to adopt that model as well. Apple's contribution is subtle refinement to make it fit Apple's model better.
Apple has a great method of taking existing ideas, refining them until they shine, and putting them into a beautiful product. I have nothing against that, it's helped the industry immensely. I hope Apple keeps making improvements year over year and pushing the market along. What I take offense with is Apple's audience pretending these features never existed before, or that they were bad ideas then but great ideas now. That's exactly what I'm seeing you and others in this thread doing.
Well, saying people are under a reality distortion field is de facto calling someone a fanboy, in my opinion. Personally, I agree with some of what you're saying, but it kind of sounds like you've taken a straw man and run with it.
"Apple's audience pretending these features never existed before or that they were bad ideas then but great ideas now." First off, that's a wild generalization. Who is saying that? Sounds like you saw some random comments on the internet and now that represents a whole group of people that you disagree with.
"That's exactly what I'm seeing you and others in this thread doing."
Really? Xutopia's comments seem pretty reasonable to me. He pointed out a minor, but key difference. It's true, it's not "wildly different", but it is a significant, non-negligible difference. I don't see how him pointing that out turns him into your current bogeyman.
I'd also venture that that key difference is where your "Apple audience" had an issue, not your claim that background updates as a whole was somehow seen as the devil. It could be abused by developers, and it was.
Arron61's comment more reasonably addresses how that was handled, and so yes, at times, you would have to do manual power management with certain unscrupulous apps. I think that aspect of it is/was annoying, but I've never thought the general idea of background tasks/multitasking as a whole was bad; that's just silly. And that's significantly different than the attribution you are making.
I should have known better than to get drawn into this discussion. You're right that no one in this thread specifically stated that these are brand new ideas, but you can't argue that people are stating that it was bad in Android but good in iOS. It all comes back to the user choice vs walled garden argument. There's the Android way of doing it, and the Apple way of doing it. You say it's a significant difference; the difference is simply in Apple not letting people do something they could do in Android. The parts that overlap are almost exactly the same. Think of how many people believe that the iPod was the first MP3 player, the iPhone was the first smartphone, and the iPad was the first tablet. Do you think that's by mistake, or because Apple really wants people to think that? How often on HN do I hear people making the claim that Apple beat Microsoft to the tablet and smartphone game? [1]
"On Android can't the app just run forever and decide on its own how often to poll the servers? This is a different thing. The OS decides when to schedule things if the app supports it. What may have been true years ago isn't necessarily an issue anymore."
vs
"When a competitor does it and Apple says they won't, it's the worst thing in the world. Once the competitor proves its worth and Apple does it, it's now the best thing ever"
Sure I'm using hyperbole, but I think I adequately summed up the argument. xutopia implied that Android's battery life suffered from having background jobs running, and that this was a bad thing. Since Apple is only letting the jobs run at predefined intervals and all at the same time to save battery life, this is a good thing. That is exactly "it was bad when Android was doing it, and it's good when iOS is doing it". I accept that perhaps he didn't know how Android's background scheduler works, but that still says to me "background tasks are a bad thing except when Apple does it."
You're also conveniently ignoring that I called out just about everyone who had responded at that point, not just xutopia. xutopia's comment was the least offensive of all of them, and I certainly understand his argument. Apple is doing it in one of the many correct ways. It's not, however, significantly different from how Android recommends developers accomplish the same task. The only difference is, on Android this is discretionary and on iOS this is enforced. Neither of these are bad things.
This is a discussion I participated in not that long ago where people were legitimately stating that Windows Mobile and Windows tablets don't count because the iPhone and Android phones look different and have faster processors. I just gave up.
As I understand it, it's not the same as Android though.
The app requests to wake up at intervals, and if the OS decides it's OK (based upon battery, network activity and how often you use the app) the app is granted that access.
Sure, except for minor details like, you know, Apple still making more money on phones than Samsung, the mini having around 40% greater screen area than the 7" Android tablets, Apple never having actually had or claimed to have a market share lead in smartphones. Plus the fact we don't know what the effects of iOS7 will be on battery life yet. You know, minor little niggles like that.
The two standard product classifications are "7 inch tablet" and "10 inch tablet". Those are standardized names, even though hardly any tablet is exactly 10" or exactly 7". The iPad is grouped in with 10" tablets even though its screen size is slightly less than 10".
Similar to how the iPhone 5 has a 4" screen, but it's much smaller than a normal 4" screen in width. That's the product category. "7 inch tablet" as a market segment doesn't mean exactly 7 inches. Just because Apple said they'd never make a 7" tablet doesn't mean their product doesn't compete in the 7 inch tablet market.
I would say that the iPad is not a tablet because it doesn't have a x86 processor nor does it run Windows Tablet Edition. The iPad is a larger smartphone, and the iPad Mini is a smaller larger cell phone.
I hope I don't sound ridiculous because someone miscategorized the iPad in the same category as Windows XP Tablet Edition.
Firefox OS already uses the same model, push notifications will fire up the application, the application can then decide to create a notification, to update the internal cache to reflect the latest changes; it's called the Simple Push API[1]. It can also wake up the application every X minutes to pull the data (Alarms API[2]) and do the same thing.
This seems like something Apple would want to patent. I wouldn't be surprised if we saw some new Apple vs. Android or Apple vs. Mozilla lawsuit about "background fetching".
Android has had "Sync Adapters" since the beginning, a very cool and powerful framework that allows apps to piggy back off each other's data connections, reducing the number of times the cell modem needs to be fired up for background fetches.
Not a new concept, I'm sure the patent battle lines are already well drawn.
Just what I want: every developer of the 200 apps I have on my phone thinking theirs is the most important one and trying to take over my entire phone every 20 minutes to background process.
The self-centered perspective of the article is even slightly disturbing, basically implying that their app needs to pre-load data because it's so important and can't make the user wait. A common perspective, sure, but this is the tragedy of the commons in app form.
Apps don't actually have any control over when they get woken up for background fetch (you can set a minimum interval, but it's very much a minimum), and the more work you do when you get woken up the less often it happens.
In practice for occasionally used apps it seems to happen once a day, and while you're on wifi and charging if possible.
Commenters have pointed out elsewhere in the thread that there will be user options to disable the feature, and that the OS will schedule such updates per-app based on usage. Which is all good stuff.
But even without knowing that, I wasn't worried, because I'm judicious about what apps I allow on my phone. And I would like it if my podcasts were downloaded without me having to explicitly open the app in the morning, if FB were already updated when I open it, or if the NY Times already had the news.
As has been pointed out elsewhere in this thread, it's not exactly the same: iOS is still much more strict about what runs and when. Apps get woken up only when the system decides to wake them up, and it decides based on current battery life/power state, other system activity level, and the user's usage patterns of the app. Then, once woken up, apps get a limited time to run.
To my understanding, Android offers something similar and also allows apps to wake up arbitrarily in the background. Reasonable people can disagree on whether the latter ought to be an option or not (or, perhaps even more reasonably, can be happy that both types of devices exist so people can choose which they'd prefer). But the complaint regarding Android was that it let poorly-behaved apps drain significant amounts of battery in the background and it's not true that Apple has now added that possibility.
So you know all the technical implementation details of both?
Or are you comparing random API doc material and actual technical details?
Apple says a lot of things, not all of them turn out to be technically accurate.
For example, do you know it actually will not allow poorly behaved apps to drain battery, or is this just an assumption based on what apple says will happen?
What always happens in these discussions is people say "apple's docs say x, so it must be like x". Quite often, when people actually go and look at how it operates, it isn't like x at all.
The discussion was around architecture. Of course it may not behave as intended when actually implemented because of bugs. Or because of Apple outright lying in their communications. Sure, those are possibilities, but they're irrelevant to architectural criticism.
But you have no real architecture details, only a small number of bullet points on how it's supposed to behave and a simple but not horribly descriptive API. That isn't an architecture, that's just marketing materials.
They are completely irrelevant to anything.
Given only that, you cannot possibly make informed commentary on how it will behave in practice, you are just parroting a story.
In particular, you said "it let poorly-behaved apps drain significant amounts of battery in the background and it's not true that Apple has now added that possibility."
You cannot possibly assert this with any real details to back it up, because you do not know how this architecture operates past "apple says they won't be woken up enough or run long enough to drain battery". If you have the actual details necessary to back this statement up, please add them.
Unless you are talking at such an abstract level where everything that matters is an implementation detail, in which case it's very easy to design perfect architectures that have no problems!
We don't believe people when they make crazy claims about crypto, without seeing the actual details and implementations.
We should not trust Apple or Google's marketing points about their architectures when trying to make "architectural criticism".
In the end, if you really believe "architectural criticism" is possible without actual detailed design info, carry on.
But to me, that's a worthless discussion based on what are essentially talking points.
In any case, all that matters in the end is performance in the field, so this entire discussion is mostly technical masturbation until real users have phones in hands.
I'm not sure why you think I have no real architectural details. What do you know about what I know?
Regardless, I think your pedantry is based on a statement I intended to be read in a less formal fashion than you're choosing to read it. When I said that it was not a possibility for poorly-behaved apps to drain significant amounts of battery under iOS 7, I didn't mean that it was absolutely impossible under any and all circumstances including system bugs, unforeseen architectural weaknesses, or gamma ray induced bitflips. People say "Linux uses memory protection to keep processes from corrupting the memory of other processes" and we all understand that they don't mean that it's literally impossible for memory problems to ever happen.
Likewise, my point was that it is generally true that iOS 7 doesn't allow apps to wake up as often as they want. That is a design goal behind its architecture. Doubtlessly, neither its architecture nor its implementation are perfect — I'd be surprised if it were utterly impossible for an app to end up running whenever it wants in the background. But it'd be boorish to belabor that point.
The original post that has led to this discussion was saying that Android has been criticized for being designed to allow apps to run in the background arbitrarily and now iOS has been redesigned to allow that too. That's not true. And the conclusion of hypocriticalness that was drawn from this faulty premise was untrue.
"I'm not sure why you think I have no real architectural details. What do you know about what I know?"
They do not appear as supporting details of your argument, so ...
Again, you are trying to make it seem like pedantry and that I am addressing only extreme cases or bugs, and my point is you have offered no details to support any part of your argument.
"Likewise, my point was that it is generally true that iOS 7 doesn't allow apps to wake up as often as they want. "
I quoted your statement about battery life, and said you have offered no details to show this to be the case.
Rather than offer details to refute that, you have now instead said "I said something different".
I'd appreciate it if you would stick on point and address my contention that you have not offered details about this statement: "it let poorly-behaved apps drain significant amounts of battery in the background and it's not true that Apple has now added that possibility."
Please offer details to back this up, i.e. what architectural details you know that you believe make it the case that Apple has not added the possibility of apps using large amounts of battery in the background.
If the details are "apps can't arbitrarily wake themselves up", then your statement about the possibility of background apps draining battery life is easily shown to be wrong, and I'll be happy to refute it for you.
If it is something else, I'd love to hear it, so I know exactly what argument I am addressing.
Let's stay with this one part of the argument, please.
> But you have no real architecture details, only a small
> number of bullet points on how it's supposed to behave
> and a simple but not horribly descriptive API.
iOS still doesn't allow apps to run in the background as they wish. The OS decides when to wake an app and for how long, based on usage patterns. The OS also gives the user the ability to block specific apps from ever using this system, and to turn it off system-wide.
I don't think they changed their mind, they made it less restrictive probably because newer processors can complete tasks faster and go to sleep faster which preserves battery life.
Nice passive voice, 'been told'. Who told you that? Do you have an example of an actual person telling you that 'for years', and that same person is now telling you different? If so, then, clearly, they changed their mind, sure. I doubt that 'everyone' did though.
Don't take this flippantly: have you checked to see how much data you're actually using?
I did when switching carriers and was shocked at how little non-wifi data I was really using. I use plenty of apps, get the typical corporate load of emails, upload photos, read stuff on the internet... and my cellular usage was in the low hundreds. Which made sense, thinking about it: wifi at home, wifi at work.
Yes, I actually do keep an eye on it, and have the limiter set on it. My phone (from the UK) doesn't play well with the networks here so I am restricted to EDGE :( Just about to buy a new phone so with LTE will get much faster speeds. I used up over 400mb this month on EDGE, with very restricted usage. Once I get full speed I know for sure I will use it more. My wifi usage is pretty high, 3-4GB per month. I would regularly use up 2-3GB month in UK due to YouTube and streaming music.
Presumably it's only for apps with push notifications. I don't know about you, but I find it fairly easy to get rid of (or disable) apps with push notifications I don't want.
I think this speaks to something really important that's rarely touched on directly; people hate waiting for software. I despise it. Many ordinary consumers use their phones as a way to solve "waiting" in general (just look what happens when you have a bunch of people in line: they take out their phones to pass the time).
To date, the way most developers solve this (aside from optimizing for performance where possible) is to use loading indicators of some kind, but that's a bandaid at best. Some work will definitely have to be done to manage the battery drain concerns that can arise from this, but more time spent in your app actually doing stuff, and less time waiting for the app to update, will be a huge gift of time back to users (basically 2 - 5 seconds every time you open an app is about to be given back to you). Add that up over many app opens every day over many days every year; it's substantial.
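The back-of-the-envelope arithmetic is striking. With made-up but plausible inputs:

```python
# Rough estimate of time returned to a user if launch-time waits disappear.
# All inputs are illustrative assumptions, not measurements.
seconds_saved_per_open = 3      # midpoint of the 2-5 second range above
opens_per_day = 50
days_per_year = 365

seconds_per_year = seconds_saved_per_open * opens_per_day * days_per_year
print(seconds_per_year)         # 54750 seconds
print(seconds_per_year / 3600)  # roughly 15 hours per year
```

Even halving every assumption still leaves several hours a year spent staring at spinners.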
People hate waiting for software, but they hate unexpectedly running out of data-plan capacity even more. Until now I've known that pretty much* the only time an app will be using data is when I'm looking at it.
*There are a few exceptions, such as Mail which the average user understands.
Agreed - definitely not real productive time. Most of the time it's just entertainment (which makes sense if you consider that no one ever has any idea how long they'll be waiting for whatever it is they're waiting for).
Apple’s guidelines have always said that developers should avoid splash screens ‘or other startup experience.’ That's nothing new, but many developers ignore it anyway.
Easier said than done. This is trivial for single-purpose apps like Calculator or Weather, where it's easy to predict what the post-launch UI will look like (hint: there's only one screen...)
For much larger apps the "fake screenshot" is harmful. Take the Facebook app for example - what should the "fake screenshot" be? iOS does not differentiate between a completely fresh launch vs. a simple restore from background, and will show the same image no matter what.
So now you're in a situation where you've presented your users with a fake/blank Facebook stream, but really they were restoring to a photo they were looking at. Oops.
Or hell, do you even know if your user's logged in? Would be a shitty experience to show them the fake/blank stream but suddenly pop them back to the signup/login page no? What if they are logged in? Would be a shitty experience to show them the signup/login page and suddenly yank that out from under them.
This whole business is a shitty solution to a shitty problem: apps take forever and a day to launch.
There may or may not have been an API method in iOS 6 developer previews that allowed the app to provide the system with a new screenshot for resuming from background. That API method may or may not have been removed by the time iOS 6 shipped, and may or may not be back in iOS 7.
<whistles>
As for slow app startups, Apple hates that too. It's fine to show a loading screen, if there is loading to be done. Showing marketing videos and other bullshit? No. Not fine.
Background fetch actually provides a solution to this exact use case. If the user uses your app enough for iOS to let you run in time, the app should be able to "get ready" before the user opens it.
How so? You can know all the state you want, but iOS is still going to display that one static image. If the user is returning from background into a photo gallery, it's still going to show your default image. If the user is opening the app directly into a conversation, it's still going to show your default image.
...which has always infuriated me. So many apps throw up a static PNG of what the interface is going to look like. That's a horrible user experience, because I start trying to use it and don't get any response.
If the app is loading, show a loading screen. Don't tell developers to lie to their users to make the platform look faster.
Developing for iOS without going through the UX guidelines is a pretty good way to make an app that will not endure the recklessness of the App Store business.
It's based on usage. For example, if I use an HN reader a lot, that will get frequent updates in the background. But the RSS reader I haven't opened in a week will get less/no background updates since the OS knows I'm not using it too much.
It will not. First, it doesn't do this by default. Second, it does it based on your patterns, and even then it will pool connectivity so that apps aren't just firing off requests whenever they see fit (in other words, keeping the radio spun down as much as possible). Lastly, you can disable it. Battery saved.
I can imagine scenarios where it might increase battery life - prefetching means that you won't be staring at the battery hungry screen while waiting on data to load, for example.
I'm sure data-sippers and battery-hawks can configure it so it only runs on wifi, or when charging (or both).
Given about 90%+ of mobile users have access to a wifi point + charger on a nightly basis, this seems reasonable.
I'd mainly use it for podcasts and Audible - keep my casts and books updated so I don't have to sit in the car for 5min downloading the daily selections.
The app autoupdate will do that on its own if you're not careful about disabling using 3g for downloads.
That said, I also played a ton of the radio last month, and I wish I could turn on 3G for just the radio stuff and not the rest, instead of toggling it all the time.
This sort of behavior is one of the primary causes of battery drain on Android systems, so yeah, it won't at all surprise me when certain apps start getting fingered as battery hogs.
Don't worry, HN is about to tell you how apple has magically solved this problem in exactly the same way android did, but how it's different because apple did it!
The background behavior in iOS 7 is a great improvement but can't really be described by those of us who are still under non disclosure agreement. Beyond that there are lots of goodies for developers that have been added to iOS APIs.
Forgive my complete ignorance on the topic: Can't you just push down the information the customer needs? I get that it might be more difficult to use the push API but could an app theoretically have achieved the same experience via push?
Also, I'm not quite sure API enhancements are what a release is "about". Admitting that the crazy 3D designs were over the top and adopting a design similar to Metro and Android Holo seems like a rather large shift for Apple.
Sort of. Yes and no. You can bundle whatever data you want on a push notification, but there are some restrictions:
- Size limits. It's a push notification, so the amount of payload you can attach is pretty limited. Not the whole conversation history of a chat, for example.
- Lack of guarantees or SLA means using this for time-ordered information is a bad idea. For example, assembling a chat history from a series of pushes is generally a recipe for awfulness. Pushes are not guaranteed to arrive at all, nor to arrive in a certain order.
- No sensitive information in pushes, since it's (mostly) transmitted in the clear. Apple recommends payloads be IDs and such and the human-readable component be general. "You have a new message!" rather than "You have a new naughty pic from Jane!".
Long story short, when your app wakes due to a push notification it's almost always smarter to fetch the canonical state over the network than to try to piece it together from the push payload. But that fetch incurs a pretty hefty user-visible delay and results in a shitty experience.
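For example, a minimal APNs payload following that advice might look something like this - the `aps` dictionary keys are Apple's, while `msg_id` is just an illustrative app-defined custom key:

```json
{
  "aps": {
    "alert": "You have a new message!",
    "badge": 1,
    "sound": "default"
  },
  "msg_id": 48151623
}
```

On wake, the app treats `msg_id` purely as a hint and fetches the actual message content from its own server over TLS; nothing sensitive ever rides in the push itself.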
Along with what everyone else has said about size limitations, before iOS7 every push notification resulted in a notification at the top of the screen, and your app would not run until the user tapped the notification.
"Apple’s guidelines for iOS7 developers actually demand we avoid splash screens ‘or other startup experience.’"
This was an interesting line. I had a debate about this with my designer friend, who pointed out that it isn't new. Turns out, the line about no splash screens is part of the Human Interface Guidelines today. Splash screens seem to have been a community-driven interface decision rather than an Apple one.
What you "could" have done, and what was actually done are two different things. Both J2ME and Android had this facility for a long time. Apple, like Microsoft, dragged their feet and badmouthed this feature, then finally ended up implementing it. Now we will be told it is amazing and magical. And of course, the usual excuse will be proffered "Apple held back on it until _it could be done right_".
Remember when 7-inch tablets would require you to file down your fingers to use them? Or "if you see a task manager, they blew it"? Lo and behold, a task manager appeared on iOS.
I'm all for Apple implementing these features, but I'm tired of them badmouthing things they don't have, only to trot them out as amazing once they do have them.
Microsoft used to badmouth HTML5 Canvas; then, when they finally got a great implementation in IE9, all of a sudden it's WebGL that sucks.
Don't badmouth features just because a competitor has them and you don't; badmouth them if they are indeed bad features.
A developer can't just set a background fetch interval in iOS7 and have the app fetch at that interval (e.g. fetch every hour). Rather, you mark your app as supporting background fetch, and when it actually fetches is at the mercy of the OS.
The OS tries to be smart about it, lumping fetches together when it can, and noticing things like when the user opens the app typically.
So say the user normally launches the GREAT WEATHER app at 8 am; the OS notices and performs a background fetch a few minutes beforehand (for example).
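The heuristic described above can be modeled with a toy scheduler. This is a sketch of the general idea only; Apple's actual scheduling algorithm is undocumented, and the class, names, and five-minute lead time here are all invented for illustration:

```python
from statistics import mean

class FetchScheduler:
    """Toy model of usage-based background fetch scheduling.
    Illustrates the heuristic described above; NOT Apple's
    actual (undocumented) algorithm."""

    def __init__(self, lead_minutes=5):
        self.launches = {}        # app name -> launch times (minutes since midnight)
        self.lead = lead_minutes  # fetch this many minutes before the typical launch

    def record_launch(self, app, minutes_since_midnight):
        self.launches.setdefault(app, []).append(minutes_since_midnight)

    def next_fetch_time(self, app):
        """Schedule a fetch shortly before the app's average launch time."""
        history = self.launches.get(app)
        if not history:
            return None  # no pattern yet; stay conservative and don't fetch
        return mean(history) - self.lead

sched = FetchScheduler()
for day in range(5):
    sched.record_launch("GREAT WEATHER", 8 * 60)  # user opens it at 8:00 am daily
print(sched.next_fetch_time("GREAT WEATHER"))     # a bit before 8:00 am (475 minutes)
```

A real implementation would also coalesce fetches across apps and weigh in battery and radio state, which is exactly why the OS, not the app, owns the schedule.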
There is a new settings screen in iOS7 called "Background App Refresh" (under General) which allows users to disable app refreshes altogether or to turn them off for specific apps.
I understand iOS7 will govern abuse/excessive background fetching. But seems unfortunate it can't cater for legitimate scheduled background fetching - e.g. updating all my podcasts every day at 8am. In this case my phone would be plugged in anyway, so the battery concern is moot - there are other cases where I'd accept a substantial battery drain to run a process I've approved.
From other comments it sounds like iOS might figure out I like to listen to podcasts en route to work and learn to pre-fetch. But I don't want to rely on it noticing this or wait for it to learn.
I have a mix of Android and iOS and it's for this use case that Android handles all my podcasts.
It's mentioned elsewhere in this thread, but I believe after you posted. Basically, the apps don't control when they can do their background fetching. It's up to the OS, which uses your usage patterns to determine how often backgrounding can occur. So, beyond the setting where you can disable this functionality, the intent is you use your apps like normal and it adjusts itself to your patterns.
Have you used it? Are you familiar with what it's doing or how it's doing it? Or just immediately dismissing it for no reason? For me, as a developer, it's fixing tons of things that were broken and allowing me to do lots of amazing new things with my apps that I haven't yet - while still letting the user dictate behavior and usage.
Yes, I am a developer as well and have been using iOS 7 on my personal device for about a month now. I am only speaking about the UI - sorry I didn't make that clear. I prefer the previous UI. Easier to read and navigate and buttons/controls stand out more, which is important for a small touchscreen device. Control center is a small improvement, but iOS 7 UI is a step backwards for overall usability. That's just my opinion after 1 month of heavy use.
I remember when someone writing for TechCrunch published a takedown of the iOS 4 beta for ruining the iPhone experience with distracting features such as folders and wallpaper.
The unlikely day that Apple takes the approach of "you don't fix what ain't broke" would be a sad, sad day indeed.
I would still question the value of folders. It makes the homescreen customization experience more confusing and frustrating for everyone in exchange for functionality that's only really useful for a small % of enthusiasts.
Almost everyone I talked to uses folders - especially in iOS where you can dynamically create folders by drag/drop. The alternative if you have 3 dozen or more apps is pretty daunting.
Where I do agree with you is that Springboard needs a major update - I want to see a grid of apps I've recently launched for example (the recents tray on double-tap home is way too small). Or ones that have updates/badges so I don't have to hunt over my 5-6 screens for updates. Perhaps the notifications enhancements (and the "today overview" a la Google Now) might alleviate this.