This is another one of those "extreme tail risk" scenarios, like climate change and GMOs, that people have wildly different and contradictory reactions to.
Sure, the "legacy" intelligence / climate / food could also have extreme tail risks; it's just that it's been tested for hundreds of millennia... whereas new technology might be better in the average case or even at the 99th percentile, while the worst 1% (or 0.0001%) of outcomes is unknown and potentially much worse.
However, it seems to me that people resolve this question along ideological / political lines rather than through any kind of rational reasoning.
> "extreme tail risk" scenarios, like climate change
Climate change isn't a "tail risk". It is a hard wall our civilization is approaching fast. If we do not solve it, it will undo the conditions we depend on to live.