My snarky reply: "It's too bad I have to pay for the privilege of being included in a data set used for product refinement."
Also: no. When I was working as an intern at St. Jude Medical we were training neural networks to recognize heart conditions and adjust other measurements. It was 2002. That work was later published and widely adopted. Modern medical hardware can account for it (inexpensive or older hardware often asks the technician to diagnose and calibrate the machine).
It's very frustrating that Apple continues to pretend economy of scale is breaking into new territory.
Adapting your model from 2002 is not a trivial operation. Apple and others are building machine learning platforms that make that work easier, but saying that because you worked on it, others should find it straightforward comes off as a little naive about software integration. Going through other people's code is not instant or trivial.
I'm saying Apple's system fails to catch edge cases that prior ML systems handled, systems using a tiny fraction of the computing power and ultimately based on animal data.
I think the job they're doing is unimpressive. I think it's doubly unimpressive to imply I have to buy a watch to help them. I don't see why they wouldn't pay me for my unique data. I don't need or want to help them build a shoddy diagnosis product.
Then: it just lies.