And in the same way a human resembles a butterfly. Didn't you learn from the case where algorithms matched white noise against copyrighted material? There are many iPhones and many images on each iPhone, so there will be false matches. Add on top of that that the database of hashes is secret, that it can be updated in secret, that the algorithm is secret, and that the "threshold" is also secret, and you have a lot of suspicion from people.
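To put some rough numbers on "there will be false matches" (these are made-up illustrative figures, not anything Apple has published): even a one-in-a-billion per-image collision rate turns into thousands of accidental flags once you multiply it across every photo on every iPhone.

    # Back-of-envelope sketch. Every number here is an assumption for
    # illustration, not a figure published by Apple.
    false_match_rate = 1e-9       # assumed chance one image collides with a hash in the database
    images_per_device = 5_000     # assumed average photo library size
    devices = 1_000_000_000       # assumed number of active iPhones

    expected_false_matches = false_match_rate * images_per_device * devices
    print(f"{expected_false_matches:,.0f}")   # 5,000 accidental matches across the fleet

And since the database, the algorithm and the threshold are all secret, nobody outside Apple can check whether the real rate is better or worse than that.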
No, not in the way a human resembles a butterfly. But for the sake of argument, fine: suppose some pictures of you look so much like pictures of a butterfly that they trigger the flagging threshold. These false positives would easily be caught during review and nothing would happen.
I do not trust the review people. They are not even Apple employees; they are cheap contractors that hire cheap labor. These people are treated like crap, so I can see them making mistakes. It's not like you've never heard of one Apple reviewer deciding X and another reviewer, checking the same case, deciding !X.
The main function of Facebook is to spy on people, the providers of the data. All data gathered is then sold to paying customers. They make all their money this way.
If they only had a graph of people and the relations between them, and did nothing with it other than show it to those same people, maybe we would not be having this discussion now.
The cynic in me would say: store all data on EU soil ;)
You can look at the 30% in two ways:
1/ the dev is paying
2/ the user is paying
I look at it from 2/: Apple charges me 30 on top of the 70, which is almost 43% extra for consuming a service via their distribution channel, for which I already paid a premium price.
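The "almost 43%" is just the 30 cut expressed as a markup on the 70 the dev actually keeps:

    apple_share = 0.30
    dev_share = 1.0 - apple_share            # 0.70 goes to the dev
    print(f"{apple_share / dev_share:.1%}")  # 42.9%, roughly 43% on top of what the dev receives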