I want to see an incredibly accurate physics/chemistry simulator in distributed software.
This is the same first step needed to support automated materials-science testing at massive scale, which would unlock innovations in battery energy density, photovoltaic energy capture efficiency, construction materials, manufacturing efficiency, etc.
AI/ML needs some way to verify proposed designs, similar to OpenAI's Gym. A physics/chemistry simulator would save a massive amount of time by letting software automation run simulations massively in parallel.
I see this as a "human genome" sized project that unlocks massive amounts of basic science.
So, my research is focused on exactly this topic (high-performance computing is one of the terms you’re looking for, I believe), and unfortunately I think I’m obligated to tell you that from an engineering perspective it’s basically a non-starter. There are a few reasons why:
1) Very high-fidelity codes like the ones you’re describing, accurate at the molecular (atomic?) scale, are typically never usable in any meaningful sense for macroscopic scenarios. This is usually because they’re either too computationally expensive, taking too long to run even on huge supercomputers, or because we actually don’t know how to bridge the gap between the small-scale and large-scale physics behavior.
2) It’s unclear what you mean by “incredibly accurate”. Different fields have different definitions of what accuracy means (as well as different kinds of physics they’re trying to take into account), so your ultra-accurate simulator in one domain might be very far off in another. From reading your comments it seems you’re most interested in materials science. That’s not my direct field of expertise, but if it’s anything like every other field of science/engineering, there is a very strong division in the field over which kinds of simulations get used where.
3) This is something people usually don’t think about, but stuff like materials research is protected not just by IP law but by actual government export-control law. So not only would such a distributed system be looked at with raised eyebrows by corporations, but the US government might move in to shut it down if there’s suspicion that the distributed information is reaching nodes managed by foreign nationals.
In short, there are a plethora of issues related to the scope and objective of what you’re proposing. It would require a Herculean amount of effort for an unclear payoff, unfortunately.
That’s fair, the term is usually abused out of context. Let me give a bit more context: the amount of effort required would probably be up to a trillion dollars, and that’s being generous. You’re talking about making a generalized physics solver at scale, after all, and in a lot of domains we don’t even have a good model for how things work.

It’s dangerous to assume that the scientists working on these projects/simulations avoid generalized physics solvers just because they’re not ambitious. It’s almost universally because reality is messy and difficult to model in full, so approximations and compromises have to be made, otherwise you’ll never get anything done. I find that mathematicians and theoretical physics peeps tend to fetishize universal solvers in this way, but the long and the short of it is that they only do so because they live in a perfect-physics world where inconvenient factors are abstracted out to a coefficient and they just say “that’s an implementation problem”.

Making simulators which can actually be trusted is difficult, time consuming, and sometimes completely counter-intuitive. So I wouldn’t call what this person is suggesting a “moonshot” per se; I’d more call it “attempting to create God”.
What's wrong with all of the simulators that already exist? There are books upon books written about this subject. (I'm not saying that there's nothing wrong with them, just asking what's missing between what already exists and what you want to see.)
>What's wrong with all of the simulators that already exist?
They're displacing the real thing.
I like incredibly accurate physics/chemistry in reality.
There's so much room for improvement in some fundamentals that would lead to more ideal simulations; some of today's simulations seem a bit premature.
Unsure if you are familiar with the project, but Folding@home [0] might be an approximate blueprint for this. Large scale physical simulations of protein structure dynamics, distributed all over.
I've heard of it, among others. But they are inevitably very specialized applications of what I was hoping would be a general physics simulator (and more detailed at the molecular level than any of the modern game physics engines).
I think it's interesting that humans can participate in the process of finding relevant solutions, but I think it's more important that any agent can be used to find and test solutions to specific physics/chemistry problems.
My hope was to do for low-level physics what OpenAI does with video games -- using them as simulators for problems and creating competing agents to solve those problems.
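To make that concrete, here's a minimal, self-contained sketch of what a Gym-style interface over such a simulator could look like. Everything in it is hypothetical -- the MaterialsEnv name, the toy scoring function, and the random "agent" are stand-ins for a real simulator backend and a real learning algorithm:

```python
import random

class MaterialsEnv:
    """Gym-style wrapper around a (hypothetical) materials simulator."""

    def reset(self):
        # Start from a random candidate composition (toy 3-component mix).
        self.state = [random.random() for _ in range(3)]
        return list(self.state)

    def step(self, action):
        # action: small adjustments to the composition, proposed by an agent.
        self.state = [max(0.0, s + a) for s, a in zip(self.state, action)]
        reward = self._simulate(self.state)  # stand-in for a real physics run
        done = reward > 0.95                 # stop once a "good enough" design is found
        return list(self.state), reward, done, {}

    def _simulate(self, state):
        # Placeholder objective; a real backend would compute e.g. energy density.
        total = sum(state) or 1.0
        return 1.0 - abs(0.5 - state[0] / total)

env = MaterialsEnv()
obs = env.reset()
for _ in range(100):
    action = [random.uniform(-0.05, 0.05) for _ in range(3)]  # a "random search" agent
    obs, reward, done, _ = env.step(action)
    if done:
        break
print("candidate:", [round(x, 3) for x in obs], "score:", round(reward, 3))
```

Any agent -- human, script, or RL policy -- that can call reset() and step() could be pointed at the same environment, and many copies of it could run in parallel.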
If I were to do that, I'd use Julia for it. This is a great post about why it's important to have specialized programs -- you can make simplifying assumptions that speed up computation. Multiple dispatch in Julia lets you hide that complexity under a uniform front-end.
http://www.stochasticlifestyle.com/algorithm-efficiency-come...
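As a rough illustration of that uniform-front-end idea (in Python rather than Julia, and only single dispatch via functools.singledispatch instead of Julia's full multiple dispatch; the problem types and the simplifying assumptions inside each solver are made up for the example):

```python
import math
from dataclasses import dataclass
from functools import singledispatch

# Each problem type carries just enough information for a specialized solver.
@dataclass
class ProjectileProblem:
    v0: float      # launch speed, m/s
    angle: float   # launch angle, degrees

@dataclass
class OrbitProblem:
    altitude_km: float

@singledispatch
def simulate(problem):
    raise NotImplementedError(f"no solver registered for {type(problem).__name__}")

@simulate.register
def _(problem: ProjectileProblem):
    # Simplifying assumptions: uniform gravity, no drag.
    g = 9.81
    return problem.v0 ** 2 * math.sin(math.radians(2 * problem.angle)) / g  # range, m

@simulate.register
def _(problem: OrbitProblem):
    # Simplifying assumptions: circular orbit around a point-mass Earth.
    mu = 3.986e14                                # Earth's GM, m^3/s^2
    r = 6.371e6 + problem.altitude_km * 1e3
    return 2 * math.pi * math.sqrt(r ** 3 / mu)  # orbital period, s

print(simulate(ProjectileProblem(v0=100.0, angle=45.0)))  # ~1019 m
print(simulate(OrbitProblem(altitude_km=400.0)))          # ~5545 s (~92 min)
```

Julia's multiple dispatch generalizes this: the specialized method is chosen based on the types of all of the arguments, not just the first, so the specialized assumptions stay hidden behind one simulate() call.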
I think the generalized version is important for the most accurate (if slow) simulation specifically because it doesn't make the core assumptions that lots of existing specialized simulators make.
This is for the same reason you can't use almost any existing airplane simulator for calculating space travel far beyond the parameters of {gravity near the surface of the Earth, air friction far from STP at sea level, time elapsed as the velocity of the craft approaches the speed of light, heat as the aircraft approaches temperatures far beyond its rating}. A very narrow simulator can produce approximate values for these items at the cost of not being very flexible for a wide range of tests. Example: testing the properties of a ton of materials near 0 Kelvin or near their boiling points would be far cheaper and faster in a digital simulation than if we had to hire specialist workers to source the pure material and create the environment required to mimic those conditions.
Which brings me back to why I posted it here, in this moonshot thread and not just some fan article about a flight simulator. I think a general physics simulator is like basic science -- it's a foundation upon which all of the other layers of the stack of existence on Earth are built (ignoring temporarily the more complicated things like consciousness, gods, and other philosophies). If someone sinks the cost into the simulator, they could potentially take a royalty on all of the materials-science breakthroughs and new innovations found using it.
I think the key would be having a technique to discern when approximations were valid & specialize the right code for the right part of the problem. For instance, if the program mistakenly tried to run the airframe using DFT instead of FEM, it would never get anywhere, but DFT might be exactly what is needed to model the outcome of cosmic-ray bit flips in the nav computer's transistors. If the simulation is able to make judgement calls like that, it's pretty much an artificial general intelligence.
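A toy version of that judgement call (the thresholds here are invented, and a real dispatcher would need far more than a single length scale, which is exactly why this is so hard):

```python
def pick_model(length_scale_m):
    """Illustrative rule of thumb for choosing a fidelity level by length scale."""
    if length_scale_m < 1e-9:
        return "DFT"   # electronic structure: handfuls of atoms, e.g. a transistor defect
    if length_scale_m < 1e-6:
        return "MD"    # classical molecular dynamics: larger atomistic systems
    return "FEM"       # continuum mechanics: macroscopic parts like an airframe

print(pick_model(5e-10))  # -> DFT
print(pick_model(2.0))    # -> FEM
```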