If you have 1000000 people, there's a very high risk at least one is crazy or illogical with respect to survival.
If you have a couple hundred thoroughly screened people who all genuinely want to be there, the odds are much better, and better still, problems can be contained once they're identified.
Given that mental illness is prevalent and that we misdiagnose it at a rate of >1%, there is already a high risk among a couple hundred people.
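To put a rough number on that, here is a back-of-the-envelope sketch. It assumes an independent 1% chance per person of an undetected issue; that figure is purely illustrative, not a clinical estimate.

    #include <stdio.h>
    #include <math.h>

    /* If screening misses a problem for each person independently with
     * probability p, the chance that at least one of n colonists slips
     * through is 1 - (1 - p)^n. */
    int main(void) {
        double p = 0.01;   /* assumed 1% per-person miss rate */
        int n = 200;       /* "a couple hundred" colonists */
        printf("P(at least one undetected case) = %.2f\n",
               1.0 - pow(1.0 - p, n));   /* ~0.87 */
        return 0;
    }

At those assumed numbers the probability of at least one missed case comes out around 87%, so "already a high risk" is not an exaggeration.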
Furthermore, some 500 healthy adults do not form a stable society. Being in a high-risk environment will flare tempers, 500 is on the low side for genetic diversity, there are no children yet, and there will be pregnancies...
Romantic prepper fantasies. With a couple hundred people, the bus factor of a technological society is hopelessly in the red: you'd need a miraculous supply of already-trained workers just to keep up with maintenance.
On Mars the whole thing would simply collapse and people would suffocate once parts shipments from Earth were disrupted for any substantial period.
Are there any instances in the history of the human race where that idea has worked? My impression is that it always fails due to corruption or incorrect evaluation metrics or some other fundamental flaw in the starry-eyed idea of separating "good" from "bad".
In theory, it's perfectly straightforward to write a flawless, unexploitable large program in C. In practice, we get hacked by plugging something into a Lightning port.
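As a purely illustrative sketch (not any real exploit), this is the kind of one-character mistake that compiles cleanly, passes casual testing, and still corrupts memory:

    #include <stdio.h>
    #include <string.h>

    /* Copies a user-supplied name into a fixed buffer.
     * The bug: the length check uses <= instead of <, so a 16-character
     * name writes its terminating '\0' one byte past the end of buf. */
    void greet(const char *name) {
        char buf[16];
        if (strlen(name) <= sizeof(buf)) {   /* off-by-one: should be < */
            strcpy(buf, name);
            printf("Hello, %s\n", buf);
        }
    }

    int main(void) {
        greet("exactly_16_chars");           /* 16 chars + '\0' = 17 bytes */
        return 0;
    }

"Flawless" in theory, exploitable in practice.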
This is how habitat ecosystems work anyway. A trivial example might be aquariums: the bigger they are, the easier it is to maintain a healthy equilibrium.