They decided against paternalistic meddling and let discourse happen naturally? That sounds best to me. I don't want Facebook to be a school teacher hovering over a lunch table to make sure nobody swears. People posting "divisive" content is far preferable to the alternative.
It's not people posting divisive content that is the big problem; the big problem is divisive content getting all the eyeballs, causing people (through completely normal human psychology) to believe that everyone is either completely against them or completely with them, with nothing in between.
Even considering nothing but mental and physical health, the consequences are significant and quite real.
No, they don't need to become the gatekeeper of all "bad things"(tm), the same way they protect us from accidentally gazing at a terrifying nipple; that would be preposterous. But they could probably try a little harder not to act completely opposite to their users' best interests as often as they do.
Especially when those users happen to be a significant fraction of all the people on Earth, that's probably not too big an ask?
If FB wanted to "let discourse happen naturally" and not be paternalistic, they wouldn't use an opaque, non-chronological algorithm to control who gets to see what in such a way that primarily benefits FB's bottom line.
Optimizing for engagement does not favor any particular viewpoint. The authors of this article are incensed that Facebook doesn't engage in more viewpoint-based adjustment of the conversation. Favoring or disfavoring a post based on the viewpoint it expresses is very different from optimizing an algorithm to give a user more of what he wants, whatever that is.
Optimizing for engagement tends to favour extreme, simplistic, and highly emotional viewpoints. In other words, it caters to human nature. This tendency is harmful to rational discourse, regardless of whether or not you happen to agree with any given viewpoint.
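To make that concrete, here's a minimal sketch of the dynamic. Everything here is hypothetical: the posts, the interaction counts, and the weights are made up for illustration, not taken from any real ranking system. The point is only that ranking purely by predicted engagement surfaces whatever draws the most reactions, with no regard for the viewpoint expressed.

```python
# Hypothetical posts with made-up interaction counts.
posts = [
    {"text": "Nuanced policy analysis", "likes": 12, "comments": 3, "shares": 1, "age_rank": 3},
    {"text": "THEY are destroying everything!!!", "likes": 400, "comments": 250, "shares": 180, "age_rank": 1},
    {"text": "Cute dog photo", "likes": 90, "comments": 10, "shares": 5, "age_rank": 2},
]

def engagement_score(post):
    # A crude proxy: weight active interactions (comments, shares) more
    # heavily than passive ones (likes). The weights are invented; the
    # qualitative outcome is the same for any positive weighting.
    return post["likes"] + 3 * post["comments"] + 5 * post["shares"]

# A chronological feed shows the newest post first, whatever it says.
chronological = sorted(posts, key=lambda p: p["age_rank"], reverse=True)

# An engagement-ranked feed puts the highest-scoring post first.
engagement_ranked = sorted(posts, key=engagement_score, reverse=True)

print([p["text"] for p in engagement_ranked])
# The outrage post wins on engagement; the nuanced post comes last.
```

Note that nothing in the scoring function inspects the content or viewpoint of a post; the emotional post wins simply because it provokes the most interaction, which is the point made above.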