First a tracking tag to facilitate stalking, now a software kit to help simulate the voice of a person who may or may not consent. All mass-produced. What could go wrong, and why isn't Apple concerned?
Apple’s press release specifically notes that the training lines are randomized. You’d need to get 15 minutes of audio of the person saying the exact lines that the training asks for. Doesn’t seem like a huge risk.
I doubt the majority of people would sit down and say "The pleasure of Busby's company is what I most enjoy. He put a tack on Miss Yancy's chair, when she called him a horrible boy." just to try out this feature.
15 minutes of randomized phrases. There are better tools for malicious voice cloning, so this seems pretty safe. And Apple has put more safeguards against malicious use into AirTags than any other maker of tracking devices. As to your last point, Apple is concerned: they're running it all on device and trying to make it as safe as possible while still being useful.