From a design perspective, that's probably the case: the other objectives cannot otherwise be achieved.
From a moral primacy perspective, though, I'd say not. Serving the interests of the owner/operator first would be the initial moral requirement; the others follow from it.
I'd suggest that Aral might also want to consider another famous set of Three Laws, proposed by I.A. (not to be confused with A.I.).
There are some additional suggestions in the Mastodon thread: