We are giving humans more credit than is due. We aren't capable of solving any arbitrary problem thrown our way. There are classes of problems we are good at solving simply because we are human and evolved to solve them, just as certain classes of problems are solvable by dogs. As used in this article, "common sense" is not some yet-to-be-determined skill set, but rather the set of problem classes humans can solve that AI has not yet tackled.
Perhaps it's a matter of the AI connectome not being vast enough. Perhaps it's a deeper issue of AI architecture. Either way, it seems that in this sense "common sense" and "dark matter" are indeed analogous: each is just a placeholder for phenomena yet to be understood (or at least emulated). Perhaps that was intentional by the author; I can't tell.