It's not an alignment problem; it's a distribution problem. Automated ports would acutely hurt a very small group of people and help everyone else a small amount.
Is our economy aligned to the benefit of people? Are we capable of aligning it to our benefit? Do we have any obligation to people we hurt through the decisions we make?
It's like asking whether we should install a manned toll booth that raises exactly enough money to pay the toll booth workers, or whether everyone should pay higher taxes to raise the social security benefits of a randomly selected group of people.
That's not an alignment issue, because it's not clear whether raising prices on everyone to support a few thousand workers is pro-worker or pro-human. You could just as easily argue (and I do) that lowering prices and freeing up man-hours is pro-worker and pro-human.
I disagree with the claim that alignment issues need to be clear. They don't.
Ambiguity is a reality of misalignment discussions, especially those involving AI; part of it is baked into the problem. For example, the fact that we can't be sure AI is aligned with humanity is one of the fundamental issues.
The fact that we can't be sure that the economy is aligned with human benefit is itself a huge problem given the scope of the economy. The fact that we've normalized this is disturbing.