Any value system in which "save one person" is not obviously and objectively better than "save zero people", absent any other tradeoffs or details, seems objectively incorrect.
tbf, real-world tradeoffs are usually more like "save one person with a high likelihood of success" vs "attempt to save two people with a significant likelihood of failure".
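With made-up numbers (the probabilities below are purely illustrative, not from the thread), that kind of tradeoff reduces to an expected-value comparison:

```python
# Expected-value sketch with hypothetical probabilities.
p_sure = 0.95   # assumed chance the single-person rescue succeeds
p_risky = 0.40  # assumed chance the two-person rescue succeeds

ev_sure = 1 * p_sure    # 0.95 expected lives saved
ev_risky = 2 * p_risky  # 0.80 expected lives saved

print(f"sure rescue:  {ev_sure:.2f} expected lives")
print(f"risky rescue: {ev_risky:.2f} expected lives")

# Under pure expected value, the risky attempt only wins when
# p_risky > p_sure / 2; risk aversion can shift that threshold.
```

So even under a simple expected-value rule the "attempt two" option isn't automatically better; it depends entirely on how the success probabilities compare.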