It will be painfully obvious within the decade that this is exactly what's happening right now. Corporations are AIs optimizing a single metric. They use humans only insofar as they need them to improve that metric. As they get better at optimizing, automation improves and humans are needed less and less. There's a reason so many people feel like cogs in a machine.
This already happened in the 19th century. That's why we need the state, which is another AI when you think about it, with some hardcoded rules and periodic review (elections).
People do a lot of things for short-term pleasure that risk heavy long-term consequences. Societies don't have the right to decide for others what their choices should be.
Maybe outlaw ice cream next? Or how about marijuana smoking? (Oops, we already tried that.) How about loud music (long-term hearing damage)? How about motorcycles?
> Societies don't have the right to decide for others what their choices should be.
If an AGI starts inventing new and wonderful ways in which we can destroy ourselves, and we are taken in by it, we will have to restrict that; it's not optional.
You can make the argument that, say, current hard drugs should be legal - but I don't think there's a way to defend the position that any possible future 'thing' should always be legal/permitted regardless of its negative effects.
An awful lot of harm has been done to people by groups who were sure they knew what was best for others, and who therefore felt justified in forcing it upon them.
In fact, likely much more harm than has been done by those with evil intent.
I tend to invest in companies whose products and behavior I like. It's not terribly surprising that they've done well - pleasing customers is good business. I've dumped stock in companies that adopted a business model of suing their customers to make money - and those companies (again unsurprisingly) tilted downward.
Buy companies that customers love; sell companies that customers do business with only because they have to.