
Siri uses the proximity sensor to implement raise to talk.

The gyro for parallax is only turned on when the screen is on.

The rest of your comment is pretty much tinfoil.



The proximity sensor alone is not enough to trigger "raise to talk"; you also need a small jerk of acceleration. If you have the feature enabled, try raising the phone slowly... it won't trigger.
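For intuition, here's a minimal Swift sketch of that two-signal gate, built from public APIs (the thresholds are guesses, not Apple's actual values):

    import UIKit
    import CoreMotion

    // Require BOTH the proximity sensor firing AND a recent burst of
    // acceleration, so slowly raising the phone never triggers.
    final class RaiseToTalkDetector {
        private let motion = CMMotionManager()
        private var recentPeak = 0.0  // decaying peak acceleration, in g

        func start(onRaise: @escaping () -> Void) {
            UIDevice.current.isProximityMonitoringEnabled = true
            NotificationCenter.default.addObserver(
                forName: UIDevice.proximityStateDidChangeNotification,
                object: nil, queue: .main
            ) { [weak self] _ in
                guard let self = self, UIDevice.current.proximityState else { return }
                // Proximity alone is not enough: demand a recent jerk too.
                if self.recentPeak > 1.3 { onRaise() }
            }
            motion.accelerometerUpdateInterval = 0.05  // 20 Hz
            motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
                guard let self = self, let a = data?.acceleration else { return }
                let g = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
                self.recentPeak = max(g, self.recentPeak * 0.9)  // decay old peaks
            }
        }
    }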


Fair enough - however, there is no evidence of how far up the stack that data goes. Accelerometers do sudden-movement sensing on-chip, not via the CPU.

My other points still stand.


My main point is that Apple is looking to push sensor data up the stack via new APIs in an energy-efficient manner, by reusing / repurposing components.
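For what it's worth, this is the direction Apple did take: Core Motion's activity manager (shipped later, in iOS 7, with the M7 coprocessor) hands pre-classified motion data up the stack instead of making every app chew raw accelerometer samples. Sketch:

    import CoreMotion

    // CMMotionActivityManager: the OS classifies motion off the main CPU
    // (on the M7 coprocessor) and delivers the result up the stack.
    let activityManager = CMMotionActivityManager()

    if CMMotionActivityManager.isActivityAvailable() {
        activityManager.startActivityUpdates(to: .main) { activity in
            guard let activity = activity else { return }
            if activity.automotive { print("driving") }
            else if activity.walking { print("walking") }
            else if activity.stationary { print("stationary") }
        }
    }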

True - I don't work for Apple. But I've sampled most "context-aware" frameworks and talked to some of the developers who implemented them. Pretty much every "frequent location" tracking implementation works in a similar way. I'm not saying anything revolutionary here (look up sensor fusion).
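As one concrete, public example of the pattern (not any particular vendor's internals): the low-power building block on iOS is significant-change monitoring, which wakes the app on roughly cell-tower-scale movement instead of keeping the GPS radio powered. Sketch in Swift:

    import CoreLocation

    // Significant-change monitoring: coarse fixes delivered only when the
    // device moves a large distance; the GPS stays off in between.
    final class PlaceLogger: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        func start() {
            manager.delegate = self
            manager.requestAlwaysAuthorization()  // needs Info.plist usage keys
            manager.startMonitoringSignificantLocationChanges()
        }

        func locationManager(_ manager: CLLocationManager,
                             didUpdateLocations locations: [CLLocation]) {
            // Feed coarse fixes into whatever clustering marks "frequent" places.
            if let c = locations.last?.coordinate {
                print("fix:", c.latitude, c.longitude)
            }
        }
    }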

My 2 cents is just to point out how meticulous Apple is when it comes to battery drain. There are other examples of this in iOS 6. The best one is how the CoreLocation team used the new MapKit vector graphics to clip GPS coordinates to streets. Again, Apple didn't invent a new technology to improve GPS accuracy; it leveraged a new user-interface innovation. To implement this, Apple released an activity-type API (driving, walking, etc.) that tells CoreLocation when it is appropriate to clip coordinates.
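That API is CLLocationManager's activityType (CLActivityType, new in iOS 6). A minimal Swift usage sketch:

    import CoreLocation

    // CLActivityType (iOS 6) hints what kind of movement to expect, so
    // CoreLocation can snap fixes to roads and pause GPS when stationary.
    let manager = CLLocationManager()
    manager.requestWhenInUseAuthorization()  // prompt required on modern iOS
    manager.activityType = .automotiveNavigation  // driving: OK to clip to streets
    manager.pausesLocationUpdatesAutomatically = true
    manager.startUpdatingLocation()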


Fair enough!



