@tsycho is right. I want to provide a little more "context" to this comment. This is Hacker News after all.
As an aside and shout out: @willvarfar's demo reminds me of Jer Thorp's TEDx talk from a couple of years back: http://youtu.be/Q9wcvFkWpsM
Background: I have some experience with iOS location APIs too. My concept was called wrkstrm (here is a quick 1.5-minute demo I made for the AngelHack 2012 summer finals: http://youtu.be/U0adNGyXsuE).
Apple has been refining CoreLocation almost every year since iOS 2, with a focus on battery life. A huge addition came in iOS 5, which introduced the API changes that @tsycho mentioned: significant location monitoring and geofencing. Again, @tsycho was astute to note that this information was ALREADY being logged by the phone and provided to telcos (there was a huge scare about this a couple of years ago). All Apple did was give developers access to this data without having to hack (check out @willvarfar's demo and OpenPaths). Another aside: these two APIs are what prompted me to learn how to code. I was seriously disappointed when I found out that Apple was "cheating" in its implementation of these APIs. The resolution of these APIs is crap - something like football fields of disparity. This is what started my journey into finding a more accurate way to geofence and track locations.
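For anyone who hasn't used them, here's roughly what those two iOS 5-era additions look like from the developer's side. This is a minimal modern-Swift sketch; the class name, coordinates, and radius are made up for illustration, but the CLLocationManager calls are the real ones:

    import CoreLocation

    final class CoarseTracker: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        func start() {
            manager.delegate = self
            manager.requestAlwaysAuthorization()  // background monitoring needs "Always"

            // 1. Significant-change service: coarse, cell-tower-level updates, very cheap on battery.
            manager.startMonitoringSignificantLocationChanges()

            // 2. Geofencing: the system wakes the app when the user crosses the region boundary.
            //    The radius is a request, not a guarantee - in practice the resolution is far coarser.
            let home = CLCircularRegion(
                center: CLLocationCoordinate2D(latitude: 37.33, longitude: -122.03),
                radius: 100,  // metres
                identifier: "home")
            manager.startMonitoring(for: home)
        }

        func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
            print("significant change:", locations.last ?? "none")
        }

        func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
            print("entered region:", region.identifier)
        }
    }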
The screenshots @ladino linked to were generated using a different approach. This information is NEW. It was made possible by two sensor-driven user interface features that seem totally unrelated. First, in iOS 6 Apple started running the accelerometer all day for its Raise to Speak Siri feature (simply put the phone to your ear to activate Siri). Now, with iOS 7's parallax effect, Apple has a "constantly on" gyroscope as well. That means Apple can measure "stay" events - periods when the iPhone is not moving - for FREE using signal processing, without necessarily resorting to expensive / inaccurate geofencing. That is what provides the new location data @tsycho showed. Why am I so sure this is what Apple is doing? Apple was originally planning to go even further and provide step count data (similar to the Galaxy S4's S Health data) to developers. That is only possible by running the aforementioned sensors and applying signal processing. It was shown among the new iOS 7 technologies during the keynote and even appeared in the iOS 7 beta 1 documentation, but all mentions were abruptly removed in beta 2 and beyond. Look for it in iOS 7.1 or iOS 8.
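To make the "stay event from signal processing" idea concrete, here's a toy sketch of what I mean, written against the public CoreMotion accelerometer API. The window length and variance threshold are numbers I made up, and whatever Apple actually does presumably runs on dedicated hardware rather than in an app:

    import CoreMotion

    // Naive "stay" detector: if the acceleration magnitude barely varies over a
    // window of samples, assume the phone is sitting still.
    // (A real detector would debounce instead of firing on every quiet sample.)
    final class StayDetector {
        private let motion = CMMotionManager()
        private var window: [Double] = []
        private let windowSize = 300            // ~30 s of samples at 10 Hz
        private let varianceThreshold = 0.0005  // invented cutoff

        func start(onStay: @escaping () -> Void) {
            guard motion.isAccelerometerAvailable else { return }
            motion.accelerometerUpdateInterval = 0.1  // 10 Hz
            motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
                guard let self = self, let a = data?.acceleration else { return }
                let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
                self.window.append(magnitude)
                if self.window.count > self.windowSize { self.window.removeFirst() }
                guard self.window.count == self.windowSize else { return }
                let mean = self.window.reduce(0, +) / Double(self.windowSize)
                let variance = self.window.map { ($0 - mean) * ($0 - mean) }
                                          .reduce(0, +) / Double(self.windowSize)
                if variance < self.varianceThreshold { onStay() }
            }
        }
    }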
If you are interested, I've decided to elaborate on this over at my nascent blog: http://wrkstrm.postach.io/post/core-location-evolution
The proximity sensor is not enough to trigger "raise to talk"; you also need a little jerk of acceleration. If you have the feature enabled, try slowly raising the phone... it won't trigger.
Fair enough - however, there is no evidence of how far up the stack that data goes; accelerometers do sudden-movement sensing on-chip, not via the CPU.
My main point is that Apple is looking to push sensor data up the stack via new APIs in an energy-efficient manner by reusing / repurposing components.
True - I don't work for Apple. But I've sampled most "context aware" frameworks and talked to some of the developers who implemented them. Pretty much every "frequent location" tracking implementation works in a similar way. I'm not saying anything revolutionary here (look up sensor fusion).
My 2 cents is just pointing out how meticulous Apple is when it comes to battery drain. There are other examples of this in iOS 6. The best one is how the CoreLocation team used the new MapKit vector graphics to clip GPS coordinates to streets. Again, Apple didn't invent a new technology to improve GPS accuracy; it leveraged a new user interface innovation. To implement this, Apple released an activity type (driving, walking, etc.) API which lets CoreLocation know when it is appropriate to clip coordinates.
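Concretely, the hint looks something like this. The property names are the actual CoreLocation ones (activityType shipped with iOS 6); the rest of the setup is just illustrative:

    import CoreLocation

    let manager = CLLocationManager()
    manager.activityType = .automotiveNavigation   // driving: safe to snap fixes to roads
    manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
    manager.pausesLocationUpdatesAutomatically = true  // system can power down GPS during a long stop
    manager.startUpdatingLocation()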