That is the job of politics and the state, isn't it? They define and enforce the rules that will or will not align the goals of companies with those of society. And it is the people who decide - ideally - who is in charge and what policies they create.
Maybe in the typical case, where downstream effects are unclear. But if a company is considering a direction where the obligation is clear as day - one that could lead to the eventual destruction of that society, as with climate change - I'm pretty sure we could skip that step and require corporations to factor that kind of thing into their calculations.
We're programmers here, right? We want systems with direct feedback loops, preferably as short as possible, to achieve some objective as efficiently as possible. Routing back through the government in cases of clear fault adds nothing except unnecessary overhead and obfuscation.
I understand there can be murky lines here, but a legal obligation to consider only immediate shareholder value is clearly too unbalanced, given the original objectives behind establishing corporations.
Before you can try to achieve an objective, you need one. And here the trouble already starts: different people have different objectives, and some are contradictory. Not even everyone will agree that avoiding human extinction is a desirable goal. So we need some policy to resolve conflicts and weigh objectives. And people have to agree on this policy. If they don't, we need some policy that everyone agrees with to find the policy we then use. And this might repeat for some number of levels until we arrive at something that is accepted by all, or at least most. And then you need mechanisms to learn people's objectives, mechanisms to enforce policies, mechanisms to avoid exploitation of policies, ...
When you say we could just skip those steps, that is essentially some kind of dictatorship - no judgment implied - you pick some policy to save the planet and declare that this is what we do now. All who agree with your objectives have a good chance of seeing a better outcome than if we had used our gigantic machinery, but others might judge the outcome as worse, since they have different objectives. And who decides when we can bypass the normal machinery? Would we not need a policy for that?
It always seems so tempting and easy - just do the right thing - but upon closer inspection, the right thing is often just what you think the right thing is.
No, you're trying to muddy waters that are objectively clear. It is objectively clear that Exxon knew their activity was causing climate change. It is objectively clear that climate change, if left unchecked, would lead to the destruction of society and, in the most extreme cases, the extinction of humanity. If they didn't know that, why did they start a 50-year campaign to discredit such claims?
If a corporation is considering a direction where it knows ahead of time that this would be the case, there is no possible conflict or disagreement: that direction should not be taken, even if it would decrease shareholder value. Calling the notion that corporations should be obliged to consider whether their activities would lead to the destruction of society a dictatorship is frankly ridiculous.
If a corporation becomes aware of the destructiveness of its activities only after they are already entrenched in the economy, then it should have a legal obligation to report this fact to regulators immediately, regardless of the impact on shareholder value. Any obfuscation or delay should be met with severe punitive measures, up to and including dissolution of the corporation and nationalization of its assets.
That would properly realign corporate incentives toward transparency and benefiting society, at least in these extreme cases.
If a corporation is considering a direction where it knows ahead of time that this would be the case, there is no possible conflict or disagreement: that direction should not be taken, even if it would decrease shareholder value.
This is not true; you presuppose that people want to avoid the destruction of society or the extinction of humanity at some point in the future. Many - probably most - people would not want society to collapse or humans to go extinct during their lifetime, or that of their children or grandchildren. But the further those events lie in the future, the more likely many are to prefer some gain now over some loss far in the future that does not affect them personally.
You have decided for yourself that those catastrophic future costs must be avoided and that you are willing to incur some cost for that now, but that does not make it true for everyone. Your position is essentially that your utility function is more reasonable than that of people who prefer making some money now over decreasing the probability or severity of some future event, and that therefore everyone should adopt, or be forced to adopt, your utility function.
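To make that concrete, here is a toy sketch of exponential time discounting, the standard way economists model "gain now vs. loss later". This is a hypothetical illustration, not anything from the thread; all names and numbers are made up. It shows how two people who fully agree on the facts can still reach opposite rational conclusions, purely because they discount the future at different rates.

```python
# Illustrative sketch (hypothetical numbers): exponential time discounting.
# A future amount is worth less today, shrinking by a personal discount rate
# per year. Different rates -> different "rational" choices on the same facts.

def present_value(amount: float, years_ahead: float, discount_rate: float) -> float:
    """Value today of an amount received or paid `years_ahead` years from now."""
    return amount / (1 + discount_rate) ** years_ahead

gain_now = 1.0        # profit captured today (arbitrary units)
future_loss = 100.0   # catastrophic cost assumed to hit in 100 years
loss_in_years = 100

for rate in (0.01, 0.05, 0.10):  # three hypothetical personal discount rates
    discounted_loss = present_value(future_loss, loss_in_years, rate)
    verdict = "avoid the loss" if discounted_loss > gain_now else "take the gain"
    print(f"rate={rate:.2f}: discounted loss today = {discounted_loss:8.3f} -> {verdict}")
```

At a 1% discount rate the distant catastrophe still dominates (worth about 37 units today, far more than the gain of 1). At 5% or 10% the very same loss shrinks below 1 unit, so "take the gain now" becomes the utility-maximizing choice. No one in this toy model is being irrational; they just weigh the future differently, which is exactly the disagreement at issue.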