I strongly disagree that the foundations of physics have not progressed for 40 years.
We have been looking really hard for answers to persistent questions. While we have not found affirmative answers, physicists have systematically ruled out option after option after option. We have not yet discovered a unified-theory-of-everything, but we know a whole lot more about what that theory is not.
Furthermore, the past 40 years have seen the emergence of precision cosmology (and the dark-matter/energy paradigm that it entails), the observation and confirmation of neutrino oscillation, the detection of gravitational waves (and the nuclear physics revolution that has begun with GW170817), SN1987A, and so much more.
In the coming decades we stand to learn much more, a lot of it from the stars. Gaia, LISA, upgraded terrestrial GW detectors, LSST/Rubin, TMT, SKA, and more are all poised to tell us much more about things we don't understand. Particle physics will move forward too, though it is uncertain how quickly. The right breakthrough in wakefield accelerators, though, could be transformative.
> I strongly disagree that the foundations of physics have not progressed for 40 years.
I think we are getting a peek at the politics of the practice of science when you look at the article and comments, like this one's parent.
A bunch of this is about the unobserved: the theoretical rather than the experimentally observed (think scientific method).
Whose theories get the funding to be looked into? Whose ideas are published and talked about in the popular places?
A post of Sabine's talked about how she thinks the crisis in physics isn't about physics [1].
Is it instead about the politics of money and popularity? Is it about the psychology of being wrong? I mean, different physicists contradict each other, and the truth isn't what's popular, it's what's observed, right? If people contradict each other they can't all be right.
What's most interesting, to me, is how this isn't about what's been observed but all those human system qualities that people bring to the table.
I think what people miss about the industry of Science is that, logically, if a paper says "A=Foo" then we'd expect it to be true. And if another comes out that says "A=Bar", then it's confusing: are both wrong? Are both right?
But instead, the way it works is a consensus is built. Researcher Bob - "We see that A=Foo." Researcher Sally -"Well, I see that A=Bar." Researcher Timmy - "Well, I see that A=Far." Researcher Kimmy - "Well, I see it as A=Fbar." Community over time - "Now the community has seen that A=Fbar is consistently correct and can be relied upon and used." 20 years pass... Researcher Jeff - "Well, I see that Ab=Fbar actually." Community - "That's bullshit." Researcher Betty - "Well, actually I see that too. But I see Ab=FbarC". etc etc
Since we don't know what we don't know, it isn't really a "We're done!" situation for someone to get 100% correct, it's a process of evolution in knowledge.
I think part of this is that scientists don't "see" as in observe these things. Instead they "think that" something is the case. And, they have math and ideas to back that up.
If we had repeatable observation of the things it would be much harder to make disagreeing arguments.
> A post of Sabine's talked about how she thinks the crisis in physics isn't about physics [1]. Is it instead about the politics of the money and popularity? Is it about the psychology of being wrong? I mean, different physicists contradict each other and the truth isn't what's popular it is what's observed, right? If people contradict each other they can't all be right.
A good book (or two) for this is Isabelle Stengers's Cosmopolitics I and II.
She goes into many of the physics debates of the 20th century, along with the political debates surrounding them.
To add to your point, scientists have been chasing inconsistencies exactly like Sabine said. They’re chasing dark matter, dark energy, quantum gravity, early universe cosmology, naturalness (arguable; more of a theoretical inconsistency), and quantum foundations (smaller effort).
So I don’t see the point of vague allusions to the philosophy of science. If anyone has a concrete proposal---inspired from philosophy or whatever else, doesn’t matter---the proposal will typically be treated on its merits. Modulo caveats about humans being humans and all that.
The things you mention may be useful to science and may even help in evolving the foundations of physics, but that hasn't happened yet. Hossenfelder talks about things such as the internal consistency of quantum methodology, and the consistency of quantum theory with general relativity and particle physics. There has been no breakthrough on these questions for decades.
That is possible; time will tell. It is quite possible (and at this point, even desirable) that a breakthrough will be made elsewhere in the foundations of physics.
While I mostly agree with you, this is not really a counterpoint to Hossenfelder's position. There is no denying that experimental physics has produced great results in the last 40 years, but the theories underlying these results have remained unchanged during that time.
Lee Smolin deals with this question in Einstein's Unfinished Revolution: The Search for What Lies Beyond the Quantum. He relies on the same facts, makes many of the same points you do, and he comes to the exact opposite conclusion (assuming, of course, that I actually understand your position fully, and his as well). That's interesting to me because I think it might point to a philosophical disagreement between your views and his, especially in epistemology.
Part of the difference may stem from the fact that I am an experimentalist, and he is a theorist. Theoretical progress has been slow, in large part because Nature has been unwilling to show any new experimental deviations from the Standard Model.
Experimentally, people continue to hammer away at the Standard Model (and gravity, my specialty), and the paradigm continues to hold. With few exceptions, most ideas from 40 years ago have been put to significant tests and turned out not to be how Nature operates.
From a theory standpoint, without new guidance from us, theorists are forced to attempt to out-think Nature, which is extremely hard to do. Smolin may be bummed, but theorists can take solace in the fact that the problem is extremely difficult.
In the particular case of quantum mechanics, the day that theorists divine a compelling way that one interpretation of QM makes a prediction that differs from that of another interpretation is the day that an experimentalist starts building a test to find out which one is true.
The original article, as I understood it, is making a case for more rigorous selection of experiments based on how much more information the experiments can yield.
With increasing costs in experimentation it makes sense to prioritize for impact.
In a way it is meta-science, because you need a methodology to evaluate which theory, when proven or disproven, has the most impact.
I would imagine that the current working method would be to look at how often theoretical physicists cite a specific theoretical result/paper and hence "popularity" is the prioritization mechanism.
Honestly why is popularity not a good mechanism to determine how we allocate experimental investment?
If a large cadre of highly educated people find a particular theoretical mechanism sufficiently compelling to dedicate their careers and time to it, then surely that's a good basis on which to develop experiments to try and confirm whether it's right? Even proving it wrong has a benefit: they'll reallocate their resources.
I get something like this from google translate:
Thirty spokes, a total of one hub, when it is not available, there is a car. I think that when there is no device, there is a use for it. The chisel owner thinks that when there is no room, there is room. Therefore, it is good to use it, but not to use it. "
Interesting that we get different results. Google's AI-based translation can get a little crazy sometimes[1] but I would have still expected everyone to get the same result for the same thing.
Edit: As someone else has now mentioned, it's the difference between leaving the quotes(「 and 」) on the ends or not. So it is just the translate AI being weird as usual.
"Thirty spokes, there is a hub. When it is free, it is used for cars. I think it is a device, when it is free, it is used for devices. Profit, useless. "
Thanks, I was wondering what the original quote was; that's often tricky to recover from an unsourced translation (edit: I missed that the grandparent sourced chapter 11). Btw, I love the parallelism used in Classical Chinese writing.
> We have been looking really hard for answers to persistent questions. While we have not found affirmative answers, physicists have systematically ruled out option after option after option. We have not yet discovered a unified-theory-of-everything, but we know a whole lot more about what that theory is not.
I feel this answer is a bit of a self-justifying cop-out. A lot of the value of physics, from the perspective of society, has been generating understanding about the world that actually translates into manipulating the world.
Reaching the moon, superfast internet, GPS, microwaves, etc., etc.
You are not wrong of course, but knowledge for the sake of knowledge is not always useful, and also not always worth funding imo.
Agreed, and how is string theory not being considered an attempt at: "resolving inconsistencies" by the author? In fact, imho, any attempt at finding a unified theory is an attempt at resolving inconsistencies.
Does anyone else feel like the abstractions and models in physics have gone past the point where a casual outsider (even a technically and scientifically minded one) can intuitively understand them?
Because that's how I feel. There are so many things I just don't understand now like:
1. I originally thought the Heisenberg Uncertainty principle was a natural consequence of using particles (photons) for measurement. Instead, however, it seems to be a fundamental property of the universe, which I only learned after finding out that most of the mass of hadrons comes from the relativistic motion of quarks, and that it explains why hadrons don't collapse to a point.
2. What does it even mean to create more space? The universe is expanding. Ok, I can accept that. But what does it mean?
3. I find the models for dark matter and dark energy to be... unsatisfying. I realize there's experimental evidence for unobservable mass but it feels like a fudge.
4. Of course we still have no quantum model for gravity.
5. I don't really understand what a fundamental force really is. Like why does electromagnetism have a repulsive opposite but gravity doesn't? When I tried to look into this I ended up down some rabbit hole of "gauge forces" and got completely lost. Why is the Higgs Field not a force?
6. Why are some predictions of the Standard Model so incredibly accurate (like the magnetic moment of an electron IIRC?) while others are so incredibly inaccurate (eg IIRC the QFT prediction of vacuum energy is off by 120 orders of magnitude).
7. Why are there exactly three generations of particles (ignoring the Higgs)? What does a generation even mean?
I could go on. I don't for a second mean to suggest any of these notions are wrong. It's just that the models have gotten so complex (it seems?) that it just feels like something huge is missing, something that will eventually seem obvious in hindsight. Or am I just a lemur trying to figure out how an airplane works?
> Or am I just a lemur trying to figure out how an airplane works?
Unfortunately yes, all of us are.
This is not a problem with physics or abstractions. It's a problem with our intuition. Our intuition is based on evolution and life experience, which is all formed based on mostly solid objects from 1cm to 100m moving at 1m/s to 100m/s.
The universe however does not care what meatbags can experience with our senses. It doesn't mean that our complex math abstractions are necessarily correct - but they are more correct than casual intuition. You can train that intuition with enough work with the maths.
For example, you can map most of basic electricity to water flow and pressure, and some electromagnetic waves to waves in water - but you need to make a small jump to abstraction to combine both, and a large jump to get a gut feeling for special relativity to "feel right".
The crazy part is really that mathematical abstractions exist for all these things at all. There seems to be no natural reason that physics should be describable by small elegant formulas at all, let alone our experience of throwing rocks into a pond. Why isn't particle physics as messy as organic chemistry?
It is not nature that describes itself in math, it's people who describe in math what their current understanding is of how nature works. The more mathematical simplicity in the formula, the better we understand it.
Organic chemistry is harder because our powers to observe it are computationally and experimentally limited at this point.
Human intuition is based on limited sensors, boundaries which we have overcome with science and applied science over time. Our intuition had to be collectively replaced with rigorous mathematical methodology to incorporate these foreign sensors.
Yes. First, intuition depends on your experience. If you never studied physics, most things will be nonintuitive (heavy bodies don't fall faster than light ones? really?)
Second, modern 20th-century physics education (courses, textbooks) suffered a sustained corruption of methodology by scientific authorities, in which the quest for understanding was renounced in favor of "modelling" and "prediction" (e.g. by the authors of orthodox quantum theory and their less bright pupils perpetuating that attitude), and later by an institutionalized system of university research that propels the tweaking and application of old ideas to the detriment of trying new ones or questioning past ideas that are too ingrained.
This leads to a large portion of theoretical physics publications being more and more about complex calculations, where most practitioners do not even try to understand "what is going on"; they just assume the same quantum methodology with some tweaks (i.e. different configuration spaces, more dimensions, different Lagrangians, new fields that fix problems of the previous ones, tricks with removing some ugly series terms, etc.).
Sometimes these tweaks get fancy names (superstrings, loops, dark matter) but they are really an additional concept that needs to be put in to save the edifice from those radicals who would like to try actually new and incompatible ideas.
When you study 20th century physics yourself from original sources, you'll find the stuff taught currently actually has a highly varying degree of credibility. Some of it is rock solid, such as relativity, molecular theory and chemistry, nuclear physics and solid state theory, and some is... well, more unfinished and less credible, such as the Standard Model, force unification, quantum gravity, dark matter, etc.
If you want to get some solid ground on which to build intuition, start with the rock-solid physics as known till 1905, then after that makes sense, learn about its problems (explanation of emission spectra, inconsistency of EM theory with Newtonian mechanics), then after that take a deep breath and read original papers on quantum theory and particle/nuclear physics.
This will take years to understand. The later theoretical stuff around Standard Model details (lepton generations, stability of particles, unification of gravity and QFT) is a decades old project that nobody knows how to finish. It is stuck for now, and has little relevance for understanding those previous things.
Hm. Comparing paradigm shifts with model-based realism is not really a valid criticism of either, yet understandable.
The folks who just added one more term to the old stuff to make it predict better (let's call them the old guard) necessarily did it to show that old models can become slightly new ones too, even if in the long run they will be seen as the evil holdouts.
Yet at the same time we know that just whipping up a new fancy maths model won't solve anything in itself. New models need to make new testable predictions.
And then even new models require lengthy fine tuning, which requires costly experiments.
Alas textbooks are very often terrible, but not because they emphasize predictions and models over "understanding" - but usually because they omit to elaborate on how to select the better of two models, how paradigm shifts happen, how anomalies are ever present - and thus make practical model selection even harder. Plus they regularly fuck up the math explanation part, exactly because they use terrible language and models.
Finally, it's always data that cleans up the mess. Either practical usefulness - engineering, applied science. Quantum experiments, q-bits, and so on. And on the high-energy end cosmology and astronomy.
Anyone harping on about how the crisis is about politics usually wants to allocate more money to theorists, so we will finally get breakthrough theories. Yeah, great, we already have a lot of those, but without data we don't know which one to take seriously.
Furthermore, pouring money into theory is a nice idea, and comparatively cheap (compared to a new collider), but it won't solve the very pragmatic employment question for the collider builders. (Who are out of luck anyway, because the era of building ever-bigger underground circles seems to be over. They would gladly build anything, but they won't make good theorists, even if pundits' articles imply there's a simple slider between theory and experimentation.)
Certainly the complexity of the experiments shows why progress is so hard. The neutron wasn't even observed until 1932, in an experiment that fits on a tabletop. Even into the 1940s you could put together a cyclotron in a lab and discover a new particle. Or the observations of the cosmic microwave background, which were made with a horn antenna in suburban New Jersey. Now most research requires a facility that only governments are willing to fund. If it were easy, it would have been discovered by now.
The casual observer hasn't been able to understand the forefront of physics for a very long time. Maxwell's equations have been known for 150 years[1], but the average high school physics student hasn't the foggiest idea of what a differential equation is. At best, they have a vague understanding that electricity and magnetism are more-or-less interchangeable.
Once you get into invisible forces acting across very small distances, nothing about how the world works is 'intuitive'. The most precise description of it is... A bunch of math, and not the kind of math that people learn in their K-12.
[1] As has entropy, and the laws of thermodynamics. Yet even educated people often have no idea of what the laws of thermodynamics actually imply! You'd figure that people would take a little bit of an interest in them, given that they live in a society powered by combustion engines...
Good point. I work on laser amplifiers, and I struggle to get clear on what Planck said in 1899. I'm definitely not caught up with where physics was in 1920.
Hearing casual observers talking about quarks or dark matter just makes me run the other way. I know I don't know.
A great deal of (19th century) thermodynamics has its roots in the desire to engineer better steam engines. Entropy, for example, was introduced as an abstract concept to get to engines with better efficiency. Even when statistical thermodynamics was suggested (which makes for a less abstract and more intuitive understanding of thermodynamics), it was strongly opposed by many of the pundits of the time.
> 2. What does it even mean to create more space? The universe is expanding. Ok, I can accept that. But what does it mean?
Consider several droplets of water on the surface of a latex birthday balloon. As the balloon is inflated, the droplets spread further apart, yet each droplet retains its shape and size. If you measure distances in droplet-widths, it appears that space has been created: more droplets could now fit in between those placed earlier.
We only have two comprehension tools to help in grasping the universe - math and intuition. The above is intuition, and you know better than me where to find math. There isn't more "meaning" to it than that, just the tools.
Another way to think about it is just that if you had two solid particles, and measured the distance between them, you'd find it had increased over time.
Space everywhere is expanding; it's just that all the other forces, over short distances, pull everything back together. The Big Rip - if dark energy is increasing in strength - is what happens when the rate of expansion exceeds the forces that hold things together at various scales, until even the strong nuclear force can't withstand it.
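The "distance between two particles increases over time" picture is just Hubble's law in miniature. A toy sketch (the value of H0 is approximate, and this ignores local forces that hold bound systems together):

```python
# Toy metric expansion: proper distance grows in proportion to distance itself.
H0 = 70.0  # Hubble constant in km/s/Mpc (approximate)

def recession_velocity(d_mpc, h0=H0):
    """Hubble's law: apparent recession speed (km/s) of an object d_mpc away."""
    return h0 * d_mpc

# Two "solid particles" (galaxies) at fixed comoving positions:
for d in (10, 100, 1000):  # distances in megaparsecs
    print(f"{d:5d} Mpc -> receding at ~{recession_velocity(d):8.0f} km/s")
```

The linearity is the point: there is no center, every pair of distant points sees the gap between them grow.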
Everything you touch on - except the Heisenberg uncertainty (which follows from a fundamental property of any finite signal, as shown by the Fourier transform) - is a big (really mind-boggling) anomaly of contemporary physics/cosmology.
Just the other day right here on HN was a link posted about dark energy, and how maybe it's just measurement calibration error. Poof, solved. Or not, we shall see.
Reading about the statistics and epistemology behind the experiments (Andrew Gelman's blog) and models (e.g. why hierarchical Bayesian models are preferable) will help a lot to cut through the problem of understanding and satisfaction. (Rarely can we have both for complex issues.)
Oh, and everything is a field. Fields are coupled (coupling constants, running couplings). Some fields have a non-zero ground-state energy. (So vacuum energy is non-zero. Maybe. It depends on how many and which fields your model deals with.)
Some kind of energy wiggle in one field translates to one/two/a-lot of wiggles in other fields (or the same field). (See Feynman diagrams, and how there is an infinite number of possible but decreasingly probable interactions between "particles".)
What is a force? It's just one well separated aspect of the whole model. Ultimately we think all of them are coupled into one big interaction that involves every field. See the electro-weak unification.
Why doesn't gravity seem to be able to repel? Because maybe it's not a field; maybe it's just the shape of spacetime warped by energy, and we haven't found negative energy (Einstein's general relativity). Or it's based on entropy, and thus a very strange emergent property of our universe ( https://en.m.wikipedia.org/wiki/Entropic_gravity - bonus, it explains dark matter too, so maybe it's simpler - but it's just the ugly MOND (modified Newtonian dynamics) in disguise, noooo!). Okay, so maybe back to higher dimensions and branes and loops and strings? But nobody understands that! So we wait.
Why three? Because so far we found three and thus our models reproduce exactly that many.
Re: "Does anyone else feel like the abstractions and models in physics have gone past the point where a casual outsider (even a technically and scientifically minded one) can intuitively understand them?"
That's how I feel when using Twitter Bootstrap compared to the WYSIWYG days of VB-classic and Delphi. Like Quantum Physics, getting Bootstrap right relies on probability and killing of cats, or at least hair follicles. (Our shop probably needs a dedicated UI coder, but office politics won't allow it.)
> 1. I originally thought the Heisenberg Uncertainty principle was a natural consequence of using particles (photons) for measurement. Instead however it seems to be a fundamental property of the universe, which I only learned after finding out most of the mass of hadrons comes from the relativistic motion of quarks and it explains why hadrons don't collapse to a point.
> 2. What does it even mean to create more space? The universe is expanding. Ok, I can accept that. But what does it mean?
IME both QM and relativity are much more easily understood if you start from the actual equations and/or an undergrad textbook. I don't think physics is - yet - beyond the technically and scientifically minded amateur who is willing to read and understand some equations. But it's probably gone beyond the ability of any science journalist to render into prose.
> 3. I find the models for dark matter and dark energy to be... unsatisfying. I realize there's experimental evidence for unobservable mass but it feels like a fudge.
I think working scientists would agree with you at least as far as dark energy goes. Dark matter is pretty indisputably just some kind of matter that we can't see (see e.g. the Bullet Cluster) - there's still something to be solved in terms of figuring out what it actually is, but I don't expect that to be major new physics. But as to dark energy: yeah, it's a fudge. Everyone knows it's a fudge. But it's still our best description of reality. Even if you knew there was something wrong with epicycles, they were still the best way to calculate planetary orbits at the time.
> I could go on. I don't for a second mean to suggest any of these notions are wrong. It's just that the models have gotten so complex (it seems?) that it just feels like something huge is missing, something that will eventually seem obvious in hindsight. Or am I just a lemur trying to figure out how an airplane works?
I think models aren't complex so much as unfamiliar. We're getting further and further away from everyday experience, and so more and more of the fundamentals of reality have to be understood from mathematical first principles rather than everyday intuition.
The models are actually really good though. Certainly once I understood QM it seemed so clear and simple that it couldn't possibly not be true.
> We're getting further and further away from everyday experience, and so more and more of the fundamentals of reality have to be understood from mathematical first principles rather than everyday intuition.
How much of that is an insistence on thinking of things in terms of fundamental particles and definitive properties instead of fields and packets of energy? I guess the question is that what the mathematical models are really about, and whether we're just having a hard time shrugging off Greek atomism and 19th century materialism when it comes to intuition?
Intuition could be built on top of a solid understanding of fields, backed by the math. The difficult part is connecting that to the classical world of our size that we experience.
> How much of that is an insistence on thinking of things in terms of fundamental particles and definitive properties instead of fields and packets of energy? I guess the question is that what the mathematical models are really about, and whether we're just having a hard time shrugging off Greek atomism and 19th century materialism when it comes to intuition?
I don't think it's "Greek atomism and 19th century materialism" to imagine a world made of persistent physical objects in well-defined positions. That's the entirety of everyday life.
Consider the competing ontological views: everything is made up of water, the five elements, everything is a sphere, the world is a simulation, everything is ideas being perceived by the mind, the world is matter formed by the ideal forms, etc.
Those views were prominent among certain intellectuals at different times and go against the grain of everyday experience. Since physicists are trying to understand the fundamental nature of the world, I take it they're used to not accepting appearance as a guide, and instead operate under whatever philosophical intuition is dominant at the time. Since 19th century physics confirmed the existence of atoms, and the periodic table of ordinary matter is built from atomic bonds, it makes sense that physicists have been influenced by that guiding intuition.
Regardless, fields are probably a better building block for intuition than particles.
I'll take a shot at trying to answer some of your questions:
1. I originally thought the Heisenberg Uncertainty principle was a natural consequence of using particles (photons) for measurement.
That's the observer effect. When quantum mechanics was first constructed, it was believed that the uncertainty principle could be explained that way. As you mentioned, today, it is considered a more fundamental feature of the theory. As an analogy, consider how a signal sharply located in the time domain gets smeared out in the frequency domain, and vice versa. It's kind of like that.
2. What does it even mean to create more space? The universe is expanding. Ok, I can accept that. But what does it mean?
Specific volume (the average volume occupied by a unit of mass) increases. If the universe is spatially finite, total volume increases as well.
3. I find the models for dark matter and dark energy to be... unsatisfying. I realize there's experimental evidence for unobservable mass but it feels like a fudge.
Shrug. If dark matter turns out to in fact be just a bunch of yet-to-be-discovered particles, that wouldn't be too strange. Dark energy is more of an unknown.
4. Of course we still have no quantum model for gravity.
That's an issue.
5. I don't really understand what a fundamental force really is. Like why does electromagnetism have a repulsive opposite but gravity doesn't? When I tried to look into this I ended up down some rabbit hole of "gauge forces" and got completely lost.
For now, there's a certain amount of arbitrariness involved: we could easily construct universes that worked differently. Furthermore, gravity is special: at the classical level, it's not a regular force but a pseudo-force (in general relativity, free fall is just inertial motion). Also note that it can in fact manifest repulsively (e.g. via the cosmological constant). From the perspective of particle physics, it's also special because the hypothesized force carrier (the graviton) would have spin 2 instead of 1.
Why is the Higgs Field not a force?
To my knowledge, there should be an effect one could call a Higgs force. It would be tiny.
6. Why are some predictions of the Standard Model so incredibly accurate (like the magnetic moment of an electron IIRC?) while others are so incredibly inaccurate (eg IIRC the QFT prediction of vacuum energy is off by 120 orders of magnitude).
Quantum Electrodynamics can be solved perturbatively. We know how to do that. Things that can't be handled that way tend to be hard.
Regarding vacuum energy, you'd have to account for every fundamental field there is to get it right, so no surprise we get it wrong. Also, given that we have no quantum theory of gravity, it's not even clear to me that it is even the right approach.
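The "incredibly accurate" side of that contrast can be seen in one line: the leading perturbative QED correction to the electron's anomalous magnetic moment is Schwinger's 1948 result a = alpha/(2*pi), which already lands within about 0.15% of the measured value (the constants below are approximate):

```python
import math

# Schwinger's first-order QED correction to the electron's magnetic moment:
# a_e = (g - 2) / 2 ~ alpha / (2*pi)
alpha = 1 / 137.035999       # fine-structure constant (approximate)
a_first_order = alpha / (2 * math.pi)

a_measured = 0.00115965218   # experimental value of a_e (approximate)

print(f"first-order QED: {a_first_order:.8f}")
print(f"measured:        {a_measured:.8f}")
print(f"relative error:  {abs(a_first_order - a_measured) / a_measured:.2%}")
```

Higher-order diagrams close the remaining gap to around a part per trillion, which is why this particular prediction is held up as the most precisely verified in all of physics.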
7. Why are there exactly three generations of particles (ignoring the Higgs)?
Figure that out, get a Nobel prize. It could even be arbitrary, like how it's futile to ask why a particular snowflake looks different than another snowflake created in similar conditions: Random chance.
Personally I find it amusing Hossenfelder is now invoking the need for learning philosophy of science, given how hostile she's been to it before. See for instance
that begins with: "Philosophy isn’t useful for practicing physicists. On that, I am with Steven Weinberg and Lawrence Krauss who have expressed similar opinions."
Though to be fair, she clarifies that she wishes philosophy wasn't so useless, and that:
"Philosophers in that area are necessarily ahead of scientists. But they also never get the credit for actually answering a question, because for that they’ll first have to hand it over to scientists. Like a psychologist, thus, the philosopher of physics succeeds by eventually making themselves superfluous. It seems a thankless job. There’s a reason I preferred studying physics instead.
Many of the “bad philosophers” are those who aren’t quick enough to notice that a question they are thinking about has been taken over by scientists. That this failure to notice can evidently persist, in some cases, for decades is another institutionalized problem that originates in the lack of communication between both fields."
This is the sort of reasoning that got me reading Hossenfelder in the first place, not the conspiratorial posts she writes now... :(
I thought the article was not great. But the fact that she's changed her mind on the topic makes me like her a lot more. How can we learn more if we don't change our mind?
> How can we learn more if we don't change our mind?
I think it's also possible to learn without having strongly held beliefs about the subject beforehand. This alternative reminds me of Descartes (don't assimilate knowledge until you are sure of its veracity) and Bayes (keep track of degrees of belief about traditionally non-probabilistic things). Maybe such an approach would help avoid getting trapped in local optima. E.g., I'd imagine it would be hard to climb out of the theist energy well once your worldview was based on it.
I wasn’t very convinced by the 2016 text either, but the point is more that she’s gone from that clearly stated position to invoking the importance of "philosophy of science" in her attacks on current practice, without much evidence (that is, without writing about it) that there has been any real shift of conviction.
It looks more like Hossenfelder found something discarded in the shed and temporarily used it as a club for want of something better. A slight intellectual dishonesty.
The philosophy of science is different from philosophy.
The philosophy of science is really a meta-field for science, while philosophy itself encompasses things like the philosophy of religion and other matters core to the human experience.
When studying something of a scientific nature, the human experience (religion, art, and that kind of thing) is the domain of the humanities and entertainment, not of the universe in general.
Theoretical physicists' way of working is to put forward baseless mathematical models and build $40bn machines to prove them wrong. They should instead work on theoretical inconsistencies that have been known for a while.
The problem is that the theoretical inconsistencies are too small to be useful. For example, we know that there is probably a problem with the anomalous magnetic dipole moment of the muon (it's not certain, because it's only 3.5 sigma; it may be a fluke). Without more experiments, it's difficult to know how to modify the "Standard Model" to get the correct result. There are many alternatives, but without more precise experiments it's difficult to select the correct one. https://en.wikipedia.org/wiki/Anomalous_magnetic_dipole_mome...
So the alternative is to build a $40bn machine, look at the data and then try to imagine then how to fix the theoretical model. [There is a risk of overfitting the model, and finding patterns in the noise.]
Another alternative is to build a $40000bn (or more) machine and have enough precision to make the model obvious from the data. [I'm not sure this is possible, I guess with enough money, perhaps m000000re money.]
that's because experiment drives physics rather than mathematical consistency (no matter how much people pretend it's about "beauty"). plenty of mathematically consistent physical theories have been falsified by experiment and plenty of mathematically inconsistent physical theories have made precise and accurate predictions.
Observations of the cosmic microwave background, galactic rotation, gravitational lensing, and redshift led to the concepts of "dark matter" and "dark energy" that aren't yet explained by theory.
The cosmic microwave background was first detected in the 1960s, gravitational lensing was predicted by Einstein in the early 1900s, and redshift can be traced back to the latter half of the 1800s (as an extension of the Doppler effect)...
There was an article linked here just a week or two ago. Its claim was that the distance measurements used in cosmology may be wrong on larger scales. With the correction there is no need for MOND or dark energy. This needs confirmation of course.
I have yet to see the "expected" galactic rotation curves that are contradicted by observation and lead to ideas about dark MATTER. I mean, I've seen the curves but can't find the math behind them. You often see weak references to Kepler's law, which doesn't even apply, so that leaves me very skeptical.
> I mean, I've seen the curves but can't find the math behind them. You often see weak references to Kepler's law, which doesn't even apply, so that leaves me very skeptical.
Then it's wrong. I would need to see the derivation to find the error. I've seen indications of a couple of possible places it might be (based on simplifying assumptions people make incorrectly) but have not seen the actual derivation of the expected curve.
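For reference, the textbook expectation is just Newtonian circular-orbit balance, v(r) = sqrt(G·M(&lt;r)/r); Kepler's third law is the special case where essentially all the mass sits at the centre. A toy sketch (the 1e11-solar-mass figure and the assumption of spherical symmetry are illustrative simplifications, not a real galactic mass model):

```python
import math

G = 6.674e-11        # gravitational constant [m^3 kg^-1 s^-2]
M_SUN = 1.989e30     # solar mass [kg]
KPC = 3.086e19       # one kiloparsec [m]

def v_circ(r, m_enclosed):
    """Circular-orbit speed: G*M(<r)*m/r^2 = m*v^2/r  =>  v = sqrt(G*M(<r)/r),
    valid for a spherically symmetric mass distribution."""
    return math.sqrt(G * m_enclosed / r)

# Toy model: ~1e11 solar masses of visible matter, essentially all of it
# enclosed once you are well outside the luminous disk.
M_visible = 1e11 * M_SUN
for r_kpc in (5, 10, 20, 40):
    v = v_circ(r_kpc * KPC, M_visible)
    print(f"r = {r_kpc:2d} kpc: expected v ~ {v / 1000:.0f} km/s")
```

Once all the visible mass is enclosed this predicts v falling like 1/sqrt(r) (the Keplerian tail), whereas observed curves stay roughly flat out to large radii; that mismatch is the dark-matter inference. Real derivations integrate over the disk's surface-density profile rather than assuming spherical symmetry, which may be one of the simplifying steps the parent is worried about.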
Some examples:
Matter/anti-matter asymmetry, arrow of time direction, why CP violations, neutrino mass questions, why masses are what they are, dark matter and energy nature, what cancels out zero point energy, many structures on universe scale don't fit models, firewall paradox, why is gravity so weak, are there gravitons (other particles..), do magnetic monopoles exist (widely conjectured to work from models, none yet seen), why 3 generations of particles, proton radius discrepancies, exotic and pentaquark (and higher) particles, Navier-Stokes open problems, lots of superconductor and metamaterial questions results not explained theoretically, and so on...
Matter/anti-matter asymmetry is expected to be present. It is a basic feature of individual random walk instances using symmetrical laws that about half will be dominated by matter and the other half dominated by antimatter for long periods of time (although the "universe" may pass through pure energy states as it switches between the two). While there is a balance on average and in the long run, there is not for individual instances or points in time.
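Purely as an analogy for that "individual instances" point (this is not a model of baryogenesis), a symmetric random walk makes it concrete: even though the step law is perfectly balanced, a typical realization spends most of its time on one side of zero (the arcsine law):

```python
import random

def fraction_positive(steps, rng):
    """Fraction of time a symmetric +-1 random walk spends above zero."""
    pos, time_positive = 0, 0
    for _ in range(steps):
        pos += rng.choice((-1, 1))
        if pos > 0:
            time_positive += 1
    return time_positive / steps

rng = random.Random(0)                      # fixed seed for reproducibility
runs = [fraction_positive(10_000, rng) for _ in range(200)]

# The step law is symmetric, yet individual runs are typically lopsided:
lopsided = sum(1 for f in runs if f < 0.1 or f > 0.9)
print(f"mean fraction positive: {sum(runs) / len(runs):.2f}")
print(f"runs spending >90% of the time on one side: {lopsided}/200")
```

On average the walks are balanced (mean near 0.5), but a large fraction of individual runs spend over 90% of their time on a single side, which is the shape of the argument in the comment above.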
>Matter/anti-matter asymmetry is expected to be present. It is a basic feature of individual random walk
It may be the cause, but it is not known to be the cause.
And it's not a random walk; black holes accumulate charge from pairs, making future radiation not symmetric.
My understanding is that people have probed this for some time and it's still inconclusive if it can generate the observed imbalance. Here's [1] a 1979 paper on the idea, with hundreds of citations, in case you want to poke at the literature.
If I remember, Cosmology by Weinberg has a chapter on various theories of how the imbalance may happen, none of them known to be all of or even part of the answer.
It's likely though that in the case of superconductivity the problem is the complexity of the calculations, rather than a fundamental theoretical problem. It's like protein folding: we're nowhere near being able to do an ab initio calculation of the shape of a protein, but that doesn't mean that there is a fundamental problem with quantum mechanics.
I'd argue back to the author that "putting forward 'baseless' mathematical models" is "working on theoretical inconsistencies." When you're testing a black box for the content of the box without the ability to open the box, one may have little basis for an idea that might, maybe, could possibly provide useful results. That testing will definitely provide information, even if it's the basis of ruling out an entire class of tests.
And academia is failing too. I can’t remember which Nobel laureate mentioned that he could not have won one today, as most researchers are stuck having to produce papers for the sake of keeping their grants. What would be needed is a lot of free time and the freedom to think ...
This is a much bigger issue. Research has been distorted by irrelevant bureaucratic productivity metrics. So the illusion of regular activity is rewarded, while anyone who takes ten years to explore a truly original ground-breaking idea is punished and excluded.
Good essay. Yes, the sociology and politics of the way we do science is overtaking the reproducible learning aspect. Foundations for many things, like physics, are as solid as necessary for doing a lot of work, but by the time you get to the point where you should be testing, rearranging, and ferreting out flaws in the foundations, you're so indoctrinated into a culture that you don't have the mental tools necessary to do the required work. So instead you just chug along the way the last generation did, adding a decimal point here or there.
It's not wrong. It's just not changing over time. It is stagnant.
The nice thing about physics is that with new advances in astronomy and the lack of a unified theory, it keeps getting poked with reminders that there may be missing pieces. That's not true in many other fields.
> but by the time you get to the point where you should be testing, rearranging, and ferreting out flaws in the foundations, you're so indoctrinated into a culture that you don't have the mental tools necessary to do the required work.
Don't you rather believe that a much simpler explanation is that the incentives and terms for grants are at fault?
I believe I said that. You might be confusing the reason for various sciences to get stuck with the mechanism of how they actually get stuck. The reason is that groups of people have common characteristics over time. As for the mechanism, I'm with you. Always assume the simplest explanation unless there's some evidence otherwise. Missing the "mental tools required to do the work" can be as simple as not being popular in your field of study, or a character unable to get funding or provide the oversight of funding that is necessary for your science to advance. I believe if you understand the reason, you realize that you could very well end up playing whack-a-mole simply by trying to fix the various mechanisms. That would be a tragedy.
As an outsider, I don't know. There is a presumption that since things have changed over time, they will continue to change. This very well might not be true. I didn't want to get too Kuhnian, so I just used the word "stagnant" for effect.
Once again, as an outsider it doesn't seem to me that they are anywhere near "done", but they sure as heck look like a mature science. Physics and its children have given us amazing things. Spending a lot of time playing with math wasn't one of them. All sciences have one aspect in common: until you get to reproducibility, the conversation in the community tends towards groupthink over time. That's a human characteristic not related to any one field of study.
Mature implies that no further major improvement is required. That does not seem to be the case with the foundations of physics given some of the fundementally unresolved inconsistencies.
Seems there are two possible outcomes. The deluge of data leads to better correlation which smooths over the flaws in current models. And corrects errors with some minor fudge factor that contains no further significance.
From Dark Matter to Galaxies with Convolutional Networks
Or something deeply profound is discovered. The thing which cannot be ignored. And instead leads to an explosion of new physics. Recognizing patterns of the latter class will perhaps always be the domain of the human operator.
The role of this "era" may be in reformulating quantum physics and, separately, general relativity in new ways that make the ideas more accessible to more people, and earlier in their lives. The goal could be to make of modern physics... the new classical physics. That is, we start to let go the crutches we still teach because it is thought that day-to-day life is more readily explained by Newtonian physics. We are now in era where most advances (e.g. smartphones among them) could not exist in their present form without modern physics.
Once more people accept the concepts of modern physics as a way of life (perhaps intuitively?), we will be in fertile territory for any potential new revolution in physics.
These theories have a very precise mathematical formulation and very weird unintuitive consequences. If you try to teach them without math, you only keep the weird unintuitive part and it's more unintelligible.
For quantum mechanics you have to know eigenvalues and eigenvectors. This is studied in the first years of university in a technical degree; I'm not sure it can be taught much earlier.
For Special Relativity you have to know Minkowski space. It's not so difficult; it could be moved to the first years of university.
For General Relativity you have to know curved spaces. It's not impossible to learn, but you can get a Ph.D. in Math or Physics without ever studying curved spaces.
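A tiny illustration of why eigenvalues are the entry ticket, sketched with numpy (the physics here is just the standard Born rule): the possible outcomes of measuring spin along x are the eigenvalues of the Pauli matrix sigma_x, and probabilities come from projecting the state onto its eigenvectors.

```python
import numpy as np

# Pauli sigma_x: the observable for spin along the x axis (units of hbar/2)
sigma_x = np.array([[0, 1],
                    [1, 0]], dtype=complex)

# The only possible measurement outcomes are the eigenvalues: -1 and +1
eigenvalues, eigenvectors = np.linalg.eigh(sigma_x)
print(eigenvalues.real)                      # [-1.  1.]

# Born rule: probability of measuring +1 on a state prepared "spin up" along z
spin_up_z = np.array([1, 0], dtype=complex)
plus_x = eigenvectors[:, 1]                  # eigenvector with eigenvalue +1
prob = abs(np.vdot(plus_x, spin_up_z)) ** 2
print(f"P(+1) = {prob:.2f}")                 # P(+1) = 0.50
```

Ten lines of linear algebra cover the postulates that usually sound mystical in words, which is the parent's point about prerequisites.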
Linear algebra (with diagonalization, not just Gauss-Jordan elimination) could be pushed down to high school for motivated students, and is in some countries. The coordinate-system aspect of special relativity (the origin of time dilation and most of its "weird effects") only requires algebra. General relativity requires the full machinery of differential geometry, but advances in things like differential forms are pushing it down to the undergraduate level. Overall I would say it could be done, but you would have to leave the unmotivated students behind.
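A concrete version of the "only requires algebra" claim, for the curious: in a light clock moving at speed v, the light pulse of one tick traces the hypotenuse of a right triangle, so Pythagoras gives

    (c t')^2 = (v t')^2 + (c t)^2   =>   t' = t / sqrt(1 - v^2/c^2)

where t is the tick time in the clock's rest frame and t' is the time measured by the observer watching it fly past. That one line is the entire derivation of time dilation.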
Turtle Geometry gets as far as motion in curved spacetime using code in Logo. Dunno how many high schoolers have ever learned from it, but it's there. (It includes a nice concrete intro to vector algebra earlier, too.)
Re quantum mechanics without many prerequisites, I'm a fan of Feynman's book QED.
We can keep math, but switch to better theories, with plausible explanations.
A kind of Pilot Wave theory can explain quantum weirdness to laypeople with ease.
We can ditch relativity theory and calculate speeds relative to the CMB, which is much easier to understand.
We can ditch Big Bang theory and, instead, accept that light is not immortal, because it ages with time. IMHO, Dipole Repeller and Shapley Attractor are much more attractive and easier to explain than Big Bang.
All three examples you gave have problems or inconsistencies and this is why they are not used. You are being downvoted because you are suggesting teaching formalisms that are known to be insufficient simply because they fulfill your personal criteria of intuitiveness.
We have no perfect theory to explain everything, so it's just tradeoff, exchange of one set of inconsistencies for another set of inconsistencies, but with better intuition. I'm doing it here, in my country.
The problem with current theories is that I understand them only while I'm reading them. It's like a piece of complex code, or a book with complex but boring text, like a phonebook. I can follow it while I read it, but I cannot reproduce it once the book is closed.
Can we teach a phonebook to kids? Yep. Is it useful? Nope.
Recently, I did "quantum physics in one picture" experiment. Results are very good: lots of reposts, comments, interest in topic.
But it is not a tradeoff in the cases you picked, rather one set of formalisms has drastically more inconsistencies than the other. E.g. pilot waves: you gain having real numbers (which I personally see little value in) and you gain having a more mechanistic intuitive source of the interference (which is indeed interesting). However describing multiple interacting entangled particles becomes incredibly difficult, describing annihilation and second quantization which is needed for the quantum behavior of fields is not completely done yet, and (what I consider the most substantial problem) you can not work with finite level systems (i.e. anything but a spinless particle in a box is very difficult to describe by pilot wave theory).
In short, pilot waves were a worthwhile avenue of research, but we have seen they are incredibly cumbersome or even insufficient in many quantum mechanics problems.
Yep. Pilot Wave theory is an underdeveloped theory, but it helps to develop intuition. Walking droplets are even better for that. IMHO, it's better to use QM to solve QM problems in science, but to use walking droplets and Pilot Wave theory to develop intuition for everyone else. Walking droplets are easy to demonstrate. The double-slit experiment can be reproduced in a school lab. This way, quantum physics can be taught in school to children aged 12+, so they will be ready to solve much more complex problems by the time they are PhD students.
Entanglement is a hard problem for PWT. Photos of entangled photons[0] are intriguing, because they look similar to the behavior of walking droplets in some experiments (see the dotwave.org feed). I hope someone will be able to reproduce entanglement in macro. Currently, my top priority is to reproduce the Stern–Gerlach experiment in macro (I suspect that interference between the external field and the particle wave creates a channel which guides the particle into a spot, but it's better to see it once). Second priority is the creation of "photons" in macro. Entanglement will be third. IMHO, all of them require microgravity to reproduce in 3D.
With some caveats, I happily agree with the angle from this last comment! I agree PWT is a great way to get people hooked on quantum science, even if I consider it as a dead end for fixing the inconsistencies we have (semi-personal semi-professional opinion).
One problem is that physicists are not interested in lowering the bar to understanding advanced theories. Some say it's all fairly simple once you spend a decade learning some very advanced math. The art of teaching is in making the material more accessible, and at that I don't think much progress has been made.
>physicists are not interested in lowering the bar to understanding advanced theories
That is not true; geometric algebra is an example of a recent pedagogic improvement that is getting a lot of attention. The problem is that physics will never be easy enough for someone who is not prepared to think deeply, because it is one of the few areas where truly new ideas can be found. Virtually every area of learning involves repackaging concepts we have all known from childhood (people's motivations, stories, colors, that kind of thing) in specific ways. Major exceptions are physical tasks like learning to sew or play an instrument, and "esoteric" subjects like math and physics. In all of those cases you cannot learn by casually reading, because the neurons in your brain are simply not prepared for it.
I've been interested in GA for years now because it helps me visualise and understand otherwise inscrutable mathematics.
Nobody, literally nobody mired in the traditional mathematics of theoretical physics can explain why the Universe is best represented using matrices of complex numbers with constraints on them.
"Shut up and calculate" or some variant is the common response to such probing questions.
More often, it's some variant of "Well, I can understand it; you need to study more." This is usually stated just politely enough not to be outright insulting. But if you keep asking probing questions, it turns out that they don't really understand either; the "study" didn't help them. They only got better at pushing the symbols around on paper. They're dismissive of such questions because they're too proud to admit their own ignorance.
Geometric Algebra (GA) was my "lightbulb" moment where I finally understood where Dirac matrices, Pauli matrices, and the like come from and why they have the structure that they do.
My logical conclusion was that GA is the far more elegant, clear, understandable mathematical structure that brings a wide range of Physical phenomena under a unified formulation. So clearly, it should be used for pedagogy.
Nobody agrees with that. The attitude is "well, that's nice, but it's mathematically equivalent so there's no benefit." which is just the stupidest thing I've ever heard.
Imagine if you saw a function called "add_num(a,b)" that computed the sum of two integers using the full bit-by-bit adder digital logic circuit simulated in software using boolean logic. Absolutely bonkers, insane code, right? Clearly this ought to be scrubbed from the codebase and replaced with a simple "+" operator, because we're not maniacs. Physicists would argue "no", it's equivalent, it's "working", so shut up, leave it and just move on.
If you haven't used it already, Versor[0] is nice to play with. GA is simple enough that even normal-ish teenagers can understand it and produce useful results (my sons are using it in a game they're building). Math isn't even close to my strong suit, but Dual numbers and GA make sense to me, and have made it a lot easier for me to do (seemingly, to me anyway) advanced stuff. :-)
I 100% agree with you in all respects — I don't come from a physics background but I hear you loud and clear. I think the 'why' is deep and psycho-historical in nature:
- we're exiting the "industrial" mindset where everyone is the same making the same products, to a wider topology of knowledge and skills (more and wider horizontals, more and bigger verticals, 'average' profiles become 'scattered'). This clearly drives a need to "learn a little bit of a lot of things" even at expert level.
- The walls and denial you expose here is to me but a symptom of the disease that current academia will either have to heal or die of. Seeing how Khan (and thousands of Udemy's after them, indies) changed the landscape, my money is on a major paradigm shift incoming for academia (it's already done, they just don't seem to know it yet as institutions, most of them). Lots and lots of great teachers around the world almost freely sharing incredible hands-on knowledge and insight.
- Some applied domains with dramatic tension of the demand side (lots of positions to fill) don't have the luxury of elitism and massively adopt "pragmatic" approaches especially in learning. Software dev, programming and tech in general is much like that — the "one liner" installs and 1-page "getting started", all the intelligence solely put into making things intelligible and usable is, frankly, quite humbling and inspiring in that field. A very good side of the SV/Cali culture. So, examples of how to proceed next really do exist.
Now when I think back of topics that I hurt my head against for months or years, that a simple 20-minute video could 'unlock'... Why, why do we not make it a staple of "teaching" to at least consider 2-3 angles to make sure everyone's got a fair chance at getting at least 1?
- On the topic of hubris and laziness, this is where physics went astray, imho. Too much hubris and not enough laziness. That was back in the 1980s and it took 40 years to realize, probably 10-20 more to "fix", if ever before we build a new system (see above).
That being said,
> Geometric Algebra (GA) was my "lightbulb" moment where I finally understood where Dirac matrices, Pauli matrices, and the like come from and why they have the structure that they do.
YES, please! Geometric algebra seems like the thing that could blow my mind too. I am very visual, to a fault maybe.
Would you have a 'favorite' resource to share? (book, course, youtube, whatever?)
GA is just "strongly typed" vector algebra. It recognises and embraces the inalienable fact that areas and volumes are fundamentally different to vectors and scalars.
The reason Physics "went wrong" is that in 3D space (only!) the mathematics of areas and vectors is coincidentally isomorphic, so it's possible to cheat and use only vectors and scalars and then everything "works". Similarly, volumes and scalars are easily confused as well, and appear to work fine.
GA has no such restrictions and the same formulas work in all dimensions, including high-dimensional or with degenerate metrics. Problems from classical geometry such as finding tangent lines to circles can be trivially extended to finding tangent hyperplanes to hyperspheres, even for very complex problems.
The formalities of GA force you to include things like the square of the unit pseudoscalar in some physics formulas that were accidentally dropped in the traditional form because in 3D this is just "1" and hence easily overlooked. This makes some formulas weirdly difficult to extend to become relativistic, when in fact the problem was just the "weak typing" of vector algebra.
Vector calculus also inherently requires a basis, which is an easy way to get bogged down in the weeds and get confused by issues with the algebra itself instead of the truly "hard" aspects of the problem.
Generally, the "lightbulb" moment for me was that Geometric Algebra has various subsets that are also closed algebras in their own right. For example, the "even" subset of a 3D GA is isomorphic to Quaternions, and the even subset of a 2D GA is basically the same thing as a Complex number. The various "named matrices" are just other subsets of 3D or 4D GAs. Physicists tend to avoid the full general case and simplify their algebras down to the special subset cases, using the historical names and greek symbols. We have to keep the symbols, you see, because otherwise you wouldn't be able to read 2000-year-old ancient greek texts, or... something.
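That closed-subalgebra claim is easy to verify by hand. A minimal sketch with a hand-rolled geometric product for 2D GA (the `gp` helper is mine for illustration, not from any library): the basis is {1, e1, e2, e12} with e1² = e2² = 1 and e12² = -1.

```python
# Multivectors in 2D GA as coefficient tuples (scalar, e1, e2, e12),
# Euclidean metric: e1*e1 = e2*e2 = 1, e1*e2 = -e2*e1 = e12, e12*e12 = -1.
def gp(a, b):
    """Geometric product of two 2D multivectors."""
    s1, x1, y1, b1 = a
    s2, x2, y2, b2 = b
    return (
        s1*s2 + x1*x2 + y1*y2 - b1*b2,   # scalar part
        s1*x2 + x1*s2 - y1*b2 + b1*y2,   # e1 part
        s1*y2 + x1*b2 + y1*s2 - b1*x2,   # e2 part
        s1*b2 + x1*y2 - y1*x2 + b1*s2,   # e12 (bivector) part
    )

# The even subset {scalar + e12} is closed under gp and multiplies exactly
# like the complex numbers, with the unit bivector e12 playing the role of i:
z = (3, 0, 0, 4)     # "3 + 4i"
w = (1, 0, 0, 2)     # "1 + 2i"
print(gp(z, w))      # (-5, 0, 0, 10), matching (3+4j)*(1+2j) = (-5+10j)
```

The same exercise one dimension up, on the even subset of a 3D GA, reproduces the quaternion product, which is the isomorphism mentioned above.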
University Physics is actually a study of the History of Physical Philosophy. The computer science equivalent would be learning about abacuses for the entire first semester, then progressing to mechanical calculators in the second semester, vacuum tubes in the second year, and so forth, only to briefly touch on transistors by the end of the third year. Postgraduate research students would be finally told about modern silicon chips and software development, but by this point they're so used to wiring up breadboards manually that it's too late to teach them how to do anything properly.
Starting with something elegant like pure functional programming in the first year is how I studied Computer Science, but I only found out about Geometric Algebra existing after I graduated Physics. It's nuts.
Real industrial use is few and far between, but at least a few folk have discovered that GA is ideal for robotics. Unfortunately, not everyone got the message, and most robotics software libraries are firmly vector/matrix based and have all the usual issues like numerical instability and gimbal-lock. Fun stuff.
Hey there. I'm not sure you'll ever read this, but for the record. THANK YOU, so much.
So.. I've been dabbling with GA since we talked and it is an incredible framework!! I now understand your post loud and clear. It's a new dawn of math for me, I really mean that; Clifford is my new prophet (and I think this one's a keeper possibly for life, I don't know and can't imagine something better for the problem space). So much had not clicked with linear algebra for me, so much of matrices was obscure and had no representation in my mind... And GA's base objects and concepts are so, so elegant, and exquisitely intuitive.
Turns out he's an outstandingly good teacher. Strong recommend.
I'll probably take a more "serious" course/book (with problems!) next — if anyone has a recommendation, please do!
Then make progress by working on actual stuff (I guess Hestenes' reformulations are a great starting point, retracing some of these following his reasonning).
And the penultimate goal would be to reformulate stuff myself, if I could — haha, that would be so great. More realistically use GA for research in designing models and representations.
___
TL;DR: you brought Math back into my life. We were on a break (but kept calling each other..) for the last decade and a half. GA is really, really strong. Remind me again, why don't we teach children like that for a century? /s (sigh)
Wow, thanks so much for all this. I've yet to digest it fully but it's a terrific intro, I love how you worded some of this. You should consider teaching! :)
I can't elaborate much, so just a few "mind blown" moments for posterity:
> GA is just "strongly typed" vector algebra.
That's one hell of $1B slogan, at least around these parts! :) Shut up and take my money.
> in 3D space (only!) the mathematics of areas and vectors is coincidentally isomorphic, so it's possible to cheat and use only vectors and scalars and then everything "works".
I never realized that... there's indeed a lot of confusion in my mind between those concepts. I fail to see how "different" they're supposed to be; I guess I really need to go back to sane basics in that regard.
> Geometric Algebra has various subsets that are also closed algebras in their own right.
Just wow. I love this. I actually need this.
> GA has no such restrictions and the same formulas work in all dimensions, including high-dimensional or with degenerate metrics. Problems from classical geometry such as finding tangent lines to circles can be trivially extended to finding tangent hyperplanes to hyperspheres, even for very complex problems.
So that is the real kicker for me, because it fits my problem space so well. I'm exploring highly-dimensional models (basically letting complexity arise from the dimensionality of rather simple/elementary objects, rather than trying to shoehorn complex functions in low-dimensional space in hope of pretty much randomly finding "better fits" — it's a strong desire to not interpret the data before the fact, to remove bias from modeling itself).
There's interesting research around geometric deep learning as well, which seems largely informed by physics as well, and this is sort of the logical conclusion of that for big datasets.
I think industrial use may rise greatly based on this first take. But it's always a generational thing with culture — it takes ~25 years give or take for those who "grew up with it" to finally become the majority of the workforce and sway things their way. Same with politics — looking at you, academia. As you said, "but by this point they're so used to wiring up breadboards manually that it's too late to teach them how to do anything properly."
> It's nuts.
Yeah, it'll take time, never mind how infuriating in the meantime. But good on you, spreading the word about GA is exactly how we move forward, one post, one topic at a time. Eventually, we get there.
I'm trying to explain quantum physics using a single photo[0] (in Ukrainian, but you will get it). It has good adoption among regular people. It's based on a real physical experiment, just with labels added. BUT scientists go insane when they see it. They argue that quantum physics cannot be explained using a picture, because the only true way to explain quantum physics is with mathematics.
The historical attitude has been, "it doesn't matter, shut up and calculate". There's been quite a bit of push recently to try and nail down exactly what QM means rather than just what it calculates (See: Sean Carroll, etc)
I think that the article raises some interesting points, with some that I agree with and some that I do not.
I think it would have been helpful for the article to put the 40 years of no progress in perspective. Are we looking for progress on the scale of the theories of relativity and quantum mechanics, and so should we be comparing to the timescales between Newton and Einstein/Schrodinger? How should we think about the rate of progression in a ‘mature’ field such as physics? Should it be linear (big discovery every 40 years), faster (new discoveries are faster due to bootstrapping from other discoveries), or slower (diminishing returns)?
What actually is the foundation of physics? The observations or the theories?
We believe, as an assumption (or nearly as a matter of orthodoxy), that there are simple universal laws that govern consistent natural phenomena. One could argue that that is the foundation of our science of physics, in that if it is wrong, the whole thing falls down. But that has not “progressed” and really should not change... which seems consistent with the concept of a building foundation. Building foundations don’t move and shouldn’t move.
What about theories, which seem to be the focus of her blog post? Well, we should be careful to distinguish between our theories and the fundamental laws we think they describe—the map vs the territory and all that. I would really hesitate to call our theories a foundation of physics. For one thing, they are known to be provisional, intended to be changeable. That’s not how foundations usually work.
When observations contradict theories, the theories must move. From that perspective one could say that observations are more foundational than theories. Once a piece of evidence is properly observed, it doesn’t change.
And the thing is, we have collected major (I would argue foundational) observations in the last 40 years. We observed the Higgs boson and gravitational waves, and I would call both of those foundational.
That they agreed with existing theory is somehow being taken for a crisis? I guess it’s a crisis if your job is to come up with new theories and you’re lacking reasons to do so.
But there are plenty of mysterious observations yet to be explained. Many of the observations related to dark matter and dark energy fit within a retrospective 40-year time horizon. Call them astronomy if you like, but going back up to my second paragraph, we believe they should be explainable by our physical theories.
Basically you have to make some metaphysical assumptions before doing science can even get off the ground. If you believe that reality is an illusion (Buddhism? Hinduism?) then you're less likely to be interested in understanding the world's workings. If you think that things occur for capricious reasons (e.g., pagan gods being the cause of things), then there is no reason to ask "why?". If things happen not because of inherent properties but because of God's Will (Occasionalism), then who can understand the mind of God?
I've heard it argued that science mostly developed in (Western) Christendom because it brought together all of the above assumptions under its Aristotelian world view. If you look at the invention of the telescope in ~1600: it spread over the world within a couple of decades, but most cultures weren't really interested in it.
Another relevant segment of the Philosophy of science article is the realism/anti-realism dichotomy.
I find it an interesting line of reasoning that the current lack of progress is due to the default naturalistic approach whose sole purpose is finding "truth" vs. a more pragmatic, non-realist approach that would have a much more concrete purpose (e.g. solving particular problems). Truth for the sake of it with no practical experiments seems to have been a dead end.
The foundations of physics you are talking about sound more like foundations of the scientific method, and are not specific to physics. Most people rightly consider the theories and corresponding fundamental physical equations to be the foundations of physics.
The foundations are in the mathematical sciences, beginning with Galileo and Kepler, wrestled with by Descartes, until Newton came up with a brilliant work on mechanics. Its focus has been on the mechanical, i.e. numerical and geometrical, principles of motion and not on substance: studying not what it is but where it will be, giving us greater certainty of phenomena and thus greater self-determination in an unpredictable world.
This intellectual current is now upheld by the engineering sciences. The physicists are too glued to the Bohr Model and a particle universe to concern themselves with a new mechanics in light of the quantum wave phenomena discovered last century.
This is not a very original recommendation. In fact a major selling point of String Theory is that one can manage to derive both Einstein's Field equations and quantum field theory scattering amplitudes from its equations. This approach is currently the only one that can claim that for itself. Of course people like Hossenfelder never made an effort to understand String Theory in detail, so they can only make first order observations about the current state of the field.
It is also not true that no progress has been made in the understanding of String Theory in the last 40 years and it still seems like the best bet that could eventually generate a fundamental theory. What is missing is still a lot though:
- We don't seem to possess the correct mathematics to develop a non-perturbative formulation of String Theory and there are too many potential string backgrounds that we could expand around.
- It is also hard to derive the matter content of low energy effective actions from most brane configurations.
String theory provided major insight into non-perturbative quantum field theory as well. There are tons of examples; let me highlight one of them: the discovery of the Amplituhedron (Arkani-Hamed et al., 2013) was preceded by the discovery of the BCFW recursion relation (Britto et al., 2005), which in turn was motivated by a relationship between perturbative Yang-Mills theory and the instanton expansion of a certain string theory in twistor space (Witten, 2003).
This is so general and vague that it is useless, all the more so considering that the author has been saying the same thing for the last 10 years (the same timeframe in which her career progression stalled) without producing a single paper proposing a somewhat-valuable idea. I hate to sound like Lubos Motl (for the cognoscenti), but Sabine's criticism is trite.
It's not general and vague if you're a practicing physicist and know what those inconsistencies are. For example, the Standard Model assumes neutrinos don't have mass, but they do.
> the Standard Model assumes neutrinos don't have mass
No, it doesn't. The original Standard Model from the 1970s did, but then neutrino masses were discovered and the Standard Model was modified to include them.
I did not miss this statement; for me this does not constitute a recommendation. Indeed I think many of the researchers of whom she is critical could claim that this is what they're doing.
But that's like telling me "run faster" if I am complaining that I can't run 100m in 10sec. I would think that almost all theoretical work revolves around resolving inconsistencies (such as between quantum field theory and general relativity). This advice is too generic.
She provides one of her ideas immediately prior to that quote:
“I have said many times that looking at the history of physics teaches us that resolving inconsistencies has been a reliable path to breakthroughs, so that’s what we should focus on.”
And given that she has written at least one book on this general subject, I would guess that idea is elaborated in much greater detail there or on her blog.
In this case, though, the idea that we need to resolve inconsistencies is the correct diagnosis and shines a light on the incredible failure of the prevailing institutions.
Take for example the idea that a photon is both a particle and a wave. This dogma has been force fed to students for decades without the glaring inconsistency being resolved, or even pointed to as a thing that needs resolving — something is either a particle or it’s a wave; if we observe properties of both, then we need a physical explanation for how one becomes the other. Something can’t be two distinct things. Logically, all I’m pointing out is that “A is A”.
Yet my perspective that there is an unsolved inconsistency here is considered heretical. “Shut up and calculate” is the reigning dogma.
>> A photon isn't both a particle and a wave. It's something else that happens to share some features with both.
But physicists have given up on figuring out what it IS. They have decided that having math that can predict the outcome of experiments is enough. They're not wrong, but it feels rather unsatisfying. IMHO there have been a couple avenues worth exploring that are being largely ignored.
> But physicists have given up on figuring out what it IS.
It depends on what you mean by that. What they've given up on is trying to find some sort of anthropocentric analogy. It's understandable that this is unsatisfying, since analogies are fundamental to how we understand things. Unfortunately, there's no reason to believe that a good analogy will exist.
>> Unfortunately, there's no reason to believe that a good analogy will exist.
That's no reason not to try. If there IS an objective reality I think it deserves a better description than just the math which characterizes its behavior.
We shouldn't be looking for analogies, we should be looking for theories consisting of precise concepts that directly explain what is happening in the universe.
It's not satisfying because it's not intuitive. It's not intuitive because our billion years of evolution never exposed us to such phenomena. If we limit ourselves to only what's intuitive, we necessarily limit our ability to make new discoveries.
Let's grant the point that there may be some phenomena in the universe the human mind is not equipped to understand (which is a nuanced position but we can ignore those nuances for now):
Why should we assume that the contradictions in contemporary theories are of this special, inexplicable type? Every era has contradictions, before they are resolved... But they'll only ever be resolved if people are trying to resolve them, meaning that they haven't resigned themselves to the idea that our minds haven't been gifted with the capacity to make sense of our experience.
One of my favorite books is called "Architecture of Matter"; it's a history of ideas about matter. One early idea was that matter is made of little tiny bits of stuff, and that qualities of these little bits (such as being smooth or spiky) lead to macroscopic phenomena (like spiky bits being acidic).
The problem with this idea (and almost all others) is that it's just pushing the problem down a level: If matter is made out of little bits of matter, what are the little bits of matter made of?
FWIW, the wave-particle duality gets around this self-reflexive problem. Matter is made out of some other kind of stuff. But then, as you say, we still have the essential problem of "wtf is this stuff?" but we don't worry about that so much as long as our math describes the behavior of the stuff.
> They're not wrong, but it feels rather unsatisfying. IMHO there have been a couple avenues worth exploring that are being largely ignored.
What avenues? (Genuinely curious, not trolling.) It seems to me that the ultimate, existential question of what "stuff" actually is, is unanswerable (within the logical/scientific framework.)
Pilot wave theory is one. The other was a paper (forgive this explanation) that found an equivalence between particle physics and fluid dynamics, suggesting the objects of physics might be modeled as, say, vortices in some kind of fluid (aether). The equivalence was IIRC only to first order, but the work to get there must have been a lot. I'm sure there are others.
Fair point, so then let me reframe the inconsistency as: there's a phenomenon that appears to exhibit properties of both a particle and a wave, but it can't be both. So what is it?
And the criticism of mainstream thought in physics would be: it's wrongly dismissive of my question, and it ignores the important job of looking for the answer.
No, the mainstream physics reply is "It's a quantized excitation of the electromagnetic field." since that's a perfectly reasonable reply to a perfectly reasonable question.
Electromagnetism is a U(1) gauge theory. If you take the accompanying classical geometry at face value, there's hidden state at each point in space, like some sort of dial. The absolute position of the dial is irrelevant, but the gauge potential tells you how the dial turns as you move from place to place. The electromagnetic field strength is something called curvature of the corresponding 'connection'. The mathematics are a bit involved, but morally speaking, I would say it tells you how the turning of the dial varies across space.
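In symbols (a standard textbook summary of the above, not something the comment itself spells out), the "dial" picture corresponds to the gauge potential \(A_\mu\), its local redundancy, and the field strength as curvature:

```latex
% Rotating the dial locally by \theta(x) changes nothing physical,
% provided the gauge potential shifts to compensate:
\[
  A_\mu \;\longrightarrow\; A_\mu + \partial_\mu \theta(x)
\]
% The electromagnetic field strength is the curvature of this connection;
% it measures how the turning of the dial varies across space and is
% gauge invariant:
\[
  F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu
\]
```

The gauge invariance of \(F_{\mu\nu}\) is why only the field strength, never the absolute dial position, shows up in observable physics.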
So, the EM field describes the strength of the EM force at every point of the mathematical model. The nature of the EM force is unknown. Right?
Let's talk about the nature of the EM force.
Analogy: look at a typical tropical cyclone. It rotates. Is it rotating because of some unknown property of air molecules? No. It rotates because our planet rotates while the air molecules are just trying to keep their positions, i.e., it's the rotation of the planet plus the inertia of the molecules.
Is it possible that the EM force happens because our local space is moving through global space along a non-linear trajectory, so it's just the non-linear trajectory of local space plus the inertia of rotating and vibrating particles?
My point exactly - fields are real, they are not merely mathematical models. (It's just that some words acquire a more precise meaning, being formalized as part of a model, and while it's true that models may contain an additional "scaffolding" that has no analogue in reality, field is not one of those.)
A field is a mathematical abstraction, used in a mathematical model, to represent a physical thing.
Physical things are real. Mathematical abstractions are not. OpenGL is not real either, but it looks very real and accurately predicts reality. In OpenGL, a field is an array, e.g. "float[][][] field;".
How about "it's a thing that does not have an analog at the macro scale in which humans exist, therefore any explanation that does not use terminology from our daily lives will be unsatisfactory"
We have a pretty good and intuitive explanation of what a photon (or any other particle) is. It is an excitation of a quantum field, and quantum fields stem from the underlying symmetries of space and time. It is an incredibly powerful, incredibly simple, and incredibly illuminating idea, but it requires you to learn a couple of new words that humans did not need in their vocabulary when they were inventing agriculture. I am very comfortable claiming this is much simpler than any particle-related intuition, as it requires way fewer "axioms". The difference is that you arbitrarily happen to have a mental image of what a particle is, but that does not make particles simpler. As a mental exercise, try to rigorously define what a particle is according to your intuition, and explain why we should expect particles to even exist.
That inconsistency was subject to an immense amount of scrutiny by thousands of scientists for over a hundred years. Albert Einstein received a Nobel Prize for resolving the inconsistency. The theory that ultimately resolved it is called quantum mechanics. That's no longer an inconsistency.
Better examples would be the inconsistencies between general relativity, which demands curved spacetime, and quantum mechanics, which prohibits curved space. This is a very real inconsistency at the heart of modern physics. Many, many people are actively working on solutions for it. Search for "quantum gravity" or "theory of everything" for more information. Candidate theories include string theory and its derivatives, loop quantum gravity, etc. There are plenty more.
The problem is that designing experiments for these theories is Hard. Big-O Capital H Hard. My flight's about to board, maybe I can expand on it during my layover.
Dark matter is another example. Observations of the speeds and orbits of stars in galaxies and galaxies in galactic clusters are not consistent with our measurements of their masses. Plenty of candidates for dark matter have been and are continuing to be tested.
Candidate explanations include MOND, which supposes that our theories of gravity need to be modified when acceleration is astronomically low. We have designed experiments to support or disprove these theories, and most of the results have landed on the side of disproving them (search for "bullet cluster" or "dark-matter-less galaxies").
Another candidate is MACHOs. ("MAssive Compact Halo Object") Basically the universe is teeming with small black holes, brown dwarfs, loose planets unassociated with any star, basically a lot of stuff we can't see. We have designed and executed experiments to search for these, but the results have concluded that there are insufficient such objects to explain the inconsistency.
The third candidate with traction is WIMPs, or "weakly interacting massive particles": the theory that there are other types of unobserved particles that have mass but do not interact via the electromagnetic force, which makes them very difficult to observe. There was hope that neutrinos could explain all this, especially when it was demonstrated that neutrinos have mass. However, experiments trying to bound the mass of the neutrino have shown they are not nearly massive enough to explain the observations. Experiments are ongoing to find these particles, but have not yet discovered anything we can't already explain. However, there's a problem: maybe they're just too difficult for any experiment to observe. In that case, we might be SOL.
These are just two examples; the rest of science supplies all the others.
The idea that this person is the only person probing inconsistencies is pure hubris. It's not a useful starting point for a conversation, unless the point of the conversation is to talk about how awesome you are and how much everyone else sucks. Which is all I got from this article.
Personally I am not convinced a particle is a wave until collapsed. There is nothing in the double slit experiment that definitively proves that. It may be that light particles bounce off something and change trajectory, creating the illusion of there being a wave.
In the double slit experiment, if you rotate the slit 90 degrees, the interference pattern is rotated 90 degrees in the same direction. Doesn't this prove there is no wave involved?
If you make the slits larger, the pattern changes, and then vanishes.
If you make the slits circular, the interference pattern becomes circular.
If you make the slits triangular, the interference pattern becomes triangular itself.
All these things tell me that particles are not waves at any point; they just bump into something and take a different trajectory.
When, in the same experiment, a light detector is placed and the interference pattern disappears, this does not mean the wave goes away and there is a collapse to a single particle; it means the detector produces particles that don't bounce off something. What if the detector is placed near the particle beam emitter? Has it ever been tried? I don't know. If placing the detector near the emitter makes the interference pattern reappear, then we certainly have no collapsing of any wave.
What if we put the slits very close to the emitter? Do we get the same interference pattern? What if we put the slits further away from the emitter? Does the interference pattern change? If yes, then we certainly have no wave.
And something else regarding quantum entanglement: how can we be sure that the particles are not created with their properties in such a state that they merely appear entangled when they are measured? Why do we assume there is communication between the particles on the fly, rather than the two particles having related but not connected properties? We just assume that because of the other assumption, that particles are waves that collapse.
Finally, how do we know that matter attracts matter, and that it is not the void that pushes matter into clumps? How do we know that the actual distance between the furthest points in the universe is the same as it ever was, and that simply new positions are created within the same distance? And that these positions are what push matter to clump together?
I'd love to sit down with an honest physicist to research these types of questions, a physicist who cares more about answering them than about hunting for grants and fearing to go against the status quo, but it seems only crackpots are willing to do that.
I have become a bit more pessimistic about the state of discovery. Things have slowed despite the current generation having abundant access to overwhelming compute power and the internet, things that did not exist even 30 years ago. There has never been a better time to collaborate or prove out theoretical models, yet there has been a decrease in needle-moving discoveries.
More importantly, the most brilliant minds (and there are still a LOT of them) are working for large companies on unimportant problems instead of doing research. Academia has lost all of its prestige, and companies pay ridiculously more.
I've read several of Sabine's blogs over several months. I think she has very good ends in mind, has courage to push back on corporate/academic inertia ... such inertia comes with any human organization ... On the negative side she's big on complaining but small on alternatives. She's also a bit too blunt/dismissive of people -- this from a person who also dislikes corporate happy talk. As such it's not clear if she'd confer distinction if she had a large budget, an institution, and group of experimentalists. An Oppenheimer? No.
Aye, it’s easy to make criticisms (and Zeus knows the price of good science is eternal vigilance), but the easiest and most satisfyingly Ockhamite explanation of the slowdown in physics remains: all its low-hanging fruit is long since taken.
And while there’s no harm in pondering the philosophical origins of the scientific method while debating where to go next, we should take care not to go backwards either, as that way lies fractal navel fluff and bloody string theory.
It’s amusing you jocularly employ Zeus in this conversation. Galileo, Descartes, Newton, Maxwell, etc. did not have Zeus on their minds, maybe it would be wise for you not to either?
I've followed it for years, but I think lately (last ~year) she's become somewhat clickbait-y in some articles.
In particular the posts where she keeps highlighting the misguided fools who claim gravitational waves aren't real; she wrote a Forbes contribution that made a big deal of the LIGO people not responding to questions over Facebook, treated it as suspicious, and awarded points to the GW denialists as a consequence.
But maybe it's just that I now have a better understanding of what Hossenfelder is writing about, and can see for myself how thin the cases can be.
I believe in gravitational waves, but specifically with LIGO I wish they had done a better experiment with the multimodal observations. An easy one would be: for an [n]-month period, blind all of the results and mix them with 3X fake data. Then instruct traditional EM astronomers to search for EM phenomena corresponding to these signals. If only 25% of them correlate, we know the multimodal search is not working.
Eh? You can’t point a telescope at an event that happened (here) years ago! Also, if it ever came out that you’d wasted good night skies on fake events, preventing other astronomy from being carried out by interrupting it with targets of opportunity that are known to be fake?! Like non-transient observers get their time sniped enough already?!!! You wouldn’t have friends left in the funding agency, at best.
Anyway most events are BHBH mergers with no EM counterparts, so non-detection of those mean nothing.
There is no believing in them. Gravitational waves have been observed and you can't "believe" or "not believe" in them anymore than you can "believe" or "not believe" in electricity.
The observations are still subject to debate. The signals are far below the noise and can only be discerned with fancy statistics, which some groups have failed to reproduce. See Hossenfelder's detailed critique at https://backreaction.blogspot.com/2019/09/whats-up-with-ligo...
No, it is not a detailed critique; it is little more than a list of rumours. Hossenfelder is satisfied with describing the LIGO collaboration as suspicious and then backing off with “of course, I don’t believe this.”
If you believe that clickbait I feel bad for you, but it’s too low quality to spend time debunking.
I believe with exceptionally high degrees of confidence that the sun will rise tomorrow. I have this belief because the sun has been observed to rise every day for something like a few million times in a row and I've never once witnessed it not rise. This belief is so strong I could comfortably call myself certain of it, but it's still nevertheless a belief. For most practical purposes, 'certainty' means an exceptionally high degree of confidence, where the margin for doubt becomes too small to bother ourselves with. But true certainty seems like something that can only arise from pure mathematics, not through observations of the world. I am truly certain that triangles have three sides, whereas I am 'merely' exceptionally confident that gravity exists.
A person can certainly disbelieve electricity, they would just be foolish to do so.
While I think a person would also be foolish to disbelieve gravitational waves, I think it wouldn’t be quite as foolish, because while we clearly measure gravitational waves, we don’t exactly use them to do (as opposed to “look at”) stuff, so a person’s everyday life is less impacted by disbelieving GW than by disbelieving electricity.
Pardon, I meant given the current evidence. Newton didn’t have that evidence. I thought that was obvious.
And at first I had written “probably foolish” as a hedge, but thought “oh, I hedge the things I say too much, and it makes what I say less pleasant and more difficult to read. I’ll leave it out.”.
By “foolish” maybe I really meant something closer to “probably reasoning incorrectly”.
The exact point of my comment is that the evidence is not as strong as claimed in the popular conception, as there has not been independent confirmation via a multimodal technique. To date there has been a single, unreproduced, nonindependent, multimodal observation... A good start, to be sure, but still folks like yourself are saying that it would be foolhardy to merely believe GW.
"...mindless production of mathematical fiction..."
This derisive comment betrays the author's own hypocritical stance: she claims physicists are too close-minded while simultaneously ridiculing the role of advanced mathematics in formulating new physics hypotheses, arbitrarily declaring it mindless fiction.
Actually, Hossenfelder's great fight has been with the concept of "naturalness", a fight that has now been won by the LHC killing off all the theories based on that concept.
And it wasn't ever so simple as "chasing mathematical elegance instead of trying to explain observations", the problem has been that there was a theory that could explain almost perfectly everything within a certain region of physics, but can't easily be extended.
Thus you work on crazy schemes to extend the existing theory (all the sensible ones already having failed), or you are forced to make an entirely new framework, and that takes a lot of work before it's finished enough to even reproduce the results of the limited theory.
If you take the second route, you are very vulnerable to the "chasing mathematical elegance" slander, but it's not like the other guys are doing any better: there aren't actually any unexpected observations that need explaining within the reach of the existing theory.
>"the problem has been that there was a theory that could explain almost perfectly everything within a certain region of physics, but can't easily be extended."
Compare the classical physics formula for momentum, p = mv, with the relativistic formula, p = γmv. γ is almost exactly 1 at low velocities; it only starts shooting up toward infinity as v approaches c.
The point being that the classical formula is pretty good in its zone of low velocities, but as soon as you get too far outside the implicit term's "constraints", the formula breaks down and you need to add more to it to get it working for both low and high velocities. Which doesn't sound easy.
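As a quick numerical sketch of that point (my own illustration, not from the comment):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def gamma(v: float) -> float:
    """Lorentz factor: gamma = 1 / sqrt(1 - v^2 / c^2)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def momentum(m: float, v: float) -> float:
    """Relativistic momentum p = gamma * m * v; for v << c this is ~ m * v."""
    return gamma(v) * m * v

# At everyday speeds the classical formula p = m*v is essentially exact...
print(gamma(30.0))        # a car at 30 m/s: gamma differs from 1 only far past
                          # the tenth decimal place
# ...but the correction blows up as v approaches c.
print(gamma(0.99 * C))    # roughly 7.09
```

Nothing in the low-velocity data hints that the correction factor exists, which is exactly why extending a theory that "works almost perfectly" in its own zone is so hard.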
It sounds like that only because scientists reformulate their models in terms that people are familiar with. Actual physicists don't work in terms of the gamma correction factor; they work in tensor fields that don't look anything like conventional arithmetic.
But they can pare all that down to something expressed in terms people are familiar with. And that has the bonus purpose of helping them understand why the familiar terms were familiar: the "correction factor" is small under circumstances we encounter, and only becomes large under circumstances we rarely do.
If that intrigues somebody enough to learn the actual physics, they'll encounter a completely different and more-encompassing formulation which looks not at all like overfitting. One that turns out to be more elegant, in fact, cramming more information into less notation. But it's information nobody needs until they're doing fairly advanced physics, so we're not going to be teaching it in elementary school any time soon.
I don't see how what you wrote could possibly address concerns with adding parameters to the model until it explains current data so perfectly that it cannot generalize to future or other data.
This is not a "theory" problem, it has to do with matching the theory to observations.
If that’s what it sounds like then that is a failure of my description: the problem is nothing at all like overfitting, and casting it in that light would be pointless.
How many free parameters do "the foundations of physics" allow?
I was concerned a few years back when a "blip" at CERN resulted in theoretical physicists publishing 300+ different theories to explain it in a short period of time. All of these theories were presumably consistent with "the foundations of physics". And I guess that "blip" got rejected as not something worth explaining anyway.
Calling it ”overfitting” is actually you overfitting your model of how theory works on this one concept from machine learning.
The actual events of the 750 GeV peak are much closer to neural networks hallucinating, and the diversity of models created doesn’t strike me as evidence of overfitting...
Anyway, you clearly didn’t trust my summary, and I no longer trust you have an honest interest in learning more, so I’ll stop here.
Look at this: theorists come up with something like 300 different models, and you declare they all used the same theory to do it? Of course they didn't! Why would you assume they all did the same thing?!
Further, of the various frameworks used, many will have been created by imposing some further symmetry on the standard theory, in effect decreasing the number of free parameters!
I find it hard to put together her criticism of "chasing mathematical elegance, instead of trying to explain observations" with her criticism of the large and expensive colliders. The way I see it, the whole reason why the expensive experiments are needed is that the current theories do explain all the 'local' and (comparatively) low energy observations that we can do here on earth in ordinary conditions; we know that there are discrepancies between our theories and large scale / high power processes that we can observe in astrophysics, but if we actually want to probe and explore these discrepancies between theory and observations, then we need to make some discrepant events to look at... which we can't do without the very expensive experiments that she shuns; e.g. the Higgs boson is not going to show up on a low-power particle accelerator, no matter how smart physics you do.
First, I disagree that physics, and its foundations, have not changed. Incrementalism is common in mature areas of study, but the cumulative effect is still felt.
Second, I am reminded of Thomas Kuhn's The Structure of Scientific Revolutions [0]. That work described exactly this state, with historical examples of the cycles whereby progress exhibits peaks and valleys: long periods in which little monumental progress is made, followed by brief, frantic periods of discovery, often stemming from the fertile ground laid by those who worked in plodding toil.
And so I am more inclined to believe we are in such a trough at the moment, and not even a particularly deep one. Various avenues of thought & experiment show ample potential to thrust us forward into one of Kuhn's Scientific Revolutions.
Consider a Venn diagram. 3 circles : the observable, the understandable, the communicable. Intersecting at a chubby triangle. That's physics. And it's pretty darn small compared to the rest of the diagram.
Maybe the triangle is exhausted. All mapped out. The limits of the method have been met. Time to find a new method.
Do we really want something fundamentally important as the foundations of physics to progress or change quickly?
The history of science and the philosophy of science have shown that the foundations of sciences progress a little here and a little there until these "little progresses" gain enough momentum to create a paradigm shift. And we only recognize these "little progresses" in hindsight, after the paradigm shift.
Technological advances also tend to progress science. We tend to believe that advances in science lead to advances in technology but historically, it's the other way around.
More likely than not, there are many "little progresses" being made toward an eventual paradigm shift, but until it happens, we won't recognize how important those "little progresses" are.
I really really want a reason to not dismiss this as vapid demagoguery running on the "Woman scientist challenges predominantly male establishment on stagnant paradigms" ticket because this is absolutely what it reads like.
"But for all I can tell at this moment in history I am the only physicist who has at least come up with an idea for what to do."
This is one heavy claim (two, actually). Is there some place where she elaborates what that idea is, in terms more specific than "resolving inconsistencies" and "more theorists"?
Not sure why the original response got flagged, but: I think that her being a woman in this context gives this kind of output more amplification than it would get if she weren't, while at the same time diminishing her opportunities for honest collegial feedback. That's not her fault, nor an advantage that she's deliberately taking; it's not an advantage at all for a researcher, though maybe it is for an author/pundit.
The last time we had progress in the foundations of physics, we just got even more powerful world-destroying nuclear weapons. Maybe it's just too dangerous to advance physics outside of deeply classified government programs. In order to keep new physics from destroying the earth, funding is diverted to make-work projects for physicists working in cosmology and string theory that will never actually have practical significance.
Sometimes when I’m working on a coding or troubleshooting problem, I get stuck iterating on an issue when really I need to step back and try to think about solving things a different way. My impression is that this is what she’s trying to get the physics community to do. I think her overall tone is too negative and is turning a lot of people off, but overall I think she is a needed voice, just to make sure we’re on the right track.
Doesn't it seem likely that there are important natural phenomena whose complexity simply exceeds human ability to comprehend them, no matter how long we work to understand them and no matter what evidence we stumble upon? That such evidence will always remain mysterious until we first develop artificial intelligence (for example) capable of interpreting it?
Actually, string theory arose as a way of doing exactly what this person requests. It was noticed that quantum field theory and gravity do not go together, so physicists attempted to do something about it. So it did not really work out? Well, the thing with this kind of science is that it is unknowable beforehand what you are or are not going to find.
The claim of sceptics along these lines is that string theory has produced nothing in 50 years and some string theorists appear to be in denial of that.
In science, as in pretty much any other discipline, you should always weigh someone's opinion by their believability / credibility. The credibility of someone's opinion on something is a function of their knowledge of / experience with the subject and things related to it. To a first approximation, most sceptics have close to zero credibility making claims about string theory, as they have not produced scientific output remotely comparable to that of the proponents of string theory. This is simply because people like Witten tower far above most other working theoretical physicists.
This sounds suspiciously like argument from authority. You will be more convincing if you address the argument, not the alleged experience of those making it.
Please see my response to the previous objection in the thread (the sibling comment to yours) requiring specific examples of progress which would silence the critics - it remains unanswered.
Said skeptics are blissfully unaware of the contributions that working on string theory has made to other branches of physics and mathematics. To say it's produced nothing is to admit ignorance.
In the context of this article - physics progression - what specific theoretical discoveries or predictions has it produced that can be verified or disproven via experiment in the foreseeable future?
From an outsider's point of view (i.e. mine), it looks like physics has become too enamored with mathematics, "thinking" that reality can be totally described by mathematics. Which brings us to one of the author's points: physicists (and I guess most scientists) nowadays dismiss things like philosophy of science or epistemology, and just continue head-on with their "quest for knowledge", not realizing that for almost half a century now nothing big has been "found out".
There's nothing wrong with the way math is used in physics. Consistency is a good thing; whether calculations possess explanatory power is another question.
I didn't say the maths is wrong, I just said it has reached its limits. And "no big discovery" for the last 40 years kind of shows that some limits have been reached.
> Consistency is a good thing
Lots of ultimately ineffectual "epistemic" systems were internally consistent, but in the long run they proved "deficient" (as in: other, more efficient systems took their place). I confess I've never read Thomas Aquinas's work (to give just one example), but I'm pretty sure his "view of the world/reality" system is pretty consistent; I don't think there are any internal contradictions in his writings. The problem is that his internally consistent system wouldn't have allowed us to build combustion engines or modern electronics, so we have had to come up with other internally consistent "epistemic systems" that proved to be more efficient (because they allowed us to build and reason about combustion engines and modern electronics).
In the end, and in the great scheme of things, I don't think this theoretical-physics roadblock will be of any great importance to the general public; it looks like people are content with what they already can purchase based on past physics-related discoveries. Yeah, traveling through galaxy wormholes or knowing for certain whether the Universe is finite would be nice things to have, but people just don't care, and there's nothing wrong with that.
I may be in the wrong here, but afaik mathematics has not "re-invented" itself since WW2. To take it a step further, I fail to see how we're doing different mathematics compared to what Newton and Leibniz put on the table.
Even if we look at the "greatest" post-WW2 maths result, the proof of Fermat's Last Theorem, I don't see any new reality-related insights that it has brought us. And I'll go another step further and say that even if we were to someday prove the Riemann hypothesis, I don't see how it would fundamentally bring new insights regarding "reality"/the physical world.
If anything, I dare say that in a certain way maths has tainted the physical world for us: it has made us believe that in the same way mathematics is "homogeneous", the physical world is too. If maths has numbers and "units", and if (1002 - 1000) = (2002 - 2000), then it also means that in the physical world we have homogeneous "stuff".
This is why physics has started using mystical-like language like "elementary particles", which are seen as the "foundation of the physical world", with the implicit premise (if I'm wrong here, please correct me) that given a certain "elementary particle" (let's say a boson), the boson close to it and the boson situated at the other "side" of the Universe are pretty much the same thing, almost identical - the same way the mathematical difference I mentioned above is the same, or how two parallel lines are "the same".
Basically, almost all theoretical physicists have become Platonists by embracing mathematics no-questions-asked, when in fact they should have remained closer to Hume. And when reality hits them in the face pretty hard, they resort to even more mysticism by "inventing" concepts like dark energy and the like.
Later edit: I see that the "homogeneous reality" theory even has a name, the Cosmological principle [1], and as a close-enough Hume follower, Karl Popper was quick to dismiss it. Reading it, you have to wonder what those physicists had in mind when they wrote it down:
> Although the universe is inhomogeneous at smaller scales, it is statistically homogeneous on scales larger than 250 million light years.
Like, why is 250 million light years OK and 240 million not? To say nothing of the fact that the "infinitely small" (the mystical-like elementary particles I mentioned above) is ignored completely in this discussion; they're also probably seen as "statistically the same". As I said, this "statistical sameness" has made us believe that more than half of the Universe we know of (68%, to quote Wikipedia) is made out of the mother of all mystical thingies, "dark energy".
First off, mathematics did actually reinvent itself in the late 1940s, with the discovery of Category Theory, which, despite being sometimes called “abstract nonsense”, has found its way into theoretical physics. Further, there is nothing mystical at all in any of the concepts of modern physics - not more, anyway, than in the concept of, say, the atom. It is actually math that should be credited with removing the shroud of mysticism perceived by some - perhaps even many - of the uninitiated. (Physics is not an exception here - some people still look at a working computer as a miracle, for example.) And the homogeneity at scales at which there are just too many things to allow for much diversity should, too, be seen as one of the manifestations of the absence of any true mystery in the universe.
Indeed, theoretical physics has long ago stopped providing foundational explanations. It became strictly what essentially it had always been - a calculational tool. Whether calculations possess explanatory power is a question of psychology and sociology.
Sometimes the impossible takes us a little longer.
US school teacher, then geologist, J Harlen Bretz spent as much time as he could 'out in the field'. It was as a result of -extensive- observations that he arrived at his 'outrageous' Missoula Floods hypothesis. He spent 40 years defending his interpretation; he remained 'out in the field' most of that time.
His critics had spent -very- little time in the field. They knew he was wrong. In 1979, he was awarded Geology's top prize.
It stagnated when they started doing NHST, i.e. checking for a difference from "background", versus collecting data and comparing it to the predictions of various theories to distinguish between them. The same thing has destroyed every field of research that adopted this approach.
Imagine if Einstein had just predicted that the positions of the stars would appear to be different during the eclipse, rather than displaced by an exact amount. The last 40 years have seen physics become more like the former (bad) than the latter (good).
I do not agree that physics has not progressed. I do, however, believe there may be some dogmatic contamination in some processes that may have stalled some progression. Gravity, for example: big G or little g, and why?
Because explanatory stories aren't strictly necessary: While it certainly helps if we have them available because they allow us to reason intuitively, the hallmark of science is the predictive model.
In contrast, a bunch of explanatory stories lacking an underlying predictive model is what we call pseudo-science.
The way to map a theory to reality is via its predictions. The interpretation is how the theory fits into my mental model of reality.
In principle, reality could be strange enough that we are incapable of holding a good model of reality in our brains that evolved to avoid getting eaten by lions instead of doing quantum mechanics.
I certainly hope that's not the case, but neither can I rule it out.
In the spirit of pointing out inconsistencies...I am not a physicist, but something I've been confused about lately is why the action constant has a unit of seconds baked into it.
Feels like E = hf is an experiment with a hard coded 1 second measure time.
Why not express the relationship in terms of power?
P = uf, where u is the action constant without the seconds unit hard-coded. And E = utf has a variable time parameter.
Didn't downvote, but they probably came because you asked a question very tangential to the subject.
I'll try to answer it: I think you just want to move units around so h (Planck's constant) has units of Joules instead of Joule-seconds. While this works mathematically, it doesn't really make sense physically, where we care about the energy of a photon because that's the value that is conserved (along with momentum, also relating to h) during any interaction (e.g. the quantity gained by an electron if the photon is absorbed). It really doesn't make sense to want to quantify the "power" of a photon, which doesn't have any physical meaning, in favor of a simpler constant.
You are right about the constant 'h' having a "hard-coded" 1 second built in, but so do ALL derived physical constants. If we, say, changed it to two seconds, the number would be cut in half to reflect the scale change and preserve the energy of the photon (or any action), because that does not depend on scale.
As to "why is it Joule-seconds?": that's the discovered law of nature. If the energy was proportional to frequency squared, h would be in units of J*s².
Consider it this way. Suppose we all agreed to never use numerical values in the units section of an equation... then in this case we'd need a symbol for cycles, as in cycles per second, aka Hz.
If we leave Planck's constant as-is in E = h(cycles/second), then the units would end up as Joule-cycles... odd, right?
That's what brought me to writing the formula as
E=utf
I guess it should really be
Delta E = u(delta t)f, where u is in joules, t is in (seconds per cycle), and f is in (cycles per second).
h (Planck's constant) has units J*s, and f (frequency) has units 1/s. Therefore E=hf has units of Joules. E is the energy of a single photon at that frequency. It's not a measurement of power.
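To make the units bookkeeping concrete, here is a minimal sketch in plain Python (illustrative numbers for green light, not from the thread): the seconds in h cancel against the per-second in f, so E = hf is a per-photon energy with no measurement window baked in, and "power" only appears once you count photons per unit time.

```python
# Units check for E = h*f:
# h carries J*s, f carries 1/s, so E comes out in plain joules.

h = 6.62607015e-34   # Planck constant, J*s (exact by SI definition)
f = 5.4e14           # frequency of green light, Hz = 1/s

E = h * f            # energy of ONE photon, in J; no time window involved
print(E)             # ~3.6e-19 J

# A rate (power) only enters when counting photons per second:
photons_per_second = 1e18
P = E * photons_per_second   # watts
print(P)             # ~0.36 W
```

So there is no hidden "1 second of measurement" in E = hf; the J*s in h is exactly what cancels the 1/s in f to leave an energy.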
Indeed, that is a big impediment in many disciplines. Non-perturbative QCD (i.e. what describes the mass around you) cannot be calculated analytically with a controlled level of precision. The best we can do is lattice QCD, i.e. the brute-force approach. That's a big difference from QED, where we know how to find better and better approximations in a systematic way.
In other words, QCD, like many complex systems, shows emergent behavior that is not easy to predict from the theory's inputs (couplings, masses). A better mathematical framework to attack these kinds of problems would be a substantial jump forward.
AI is not some magical pixie dust or golden retriever chasing a stick. It may have its applications in pattern matching, particularly in chewing the exabyte datasets we increasingly have now, but it isn’t going to fetch the answers for us. Heck, it won’t even tell us the questions.
"And please spare me the complaints that I supposedly do not have anything better to suggest, because that is a false accusation. I have said many times that looking at the history of physics teaches us that resolving inconsistencies has been a reliable path to breakthroughs, so that’s what we should focus on. I may be on the wrong track with this, of course. But for all I can tell at this moment in history I am the only physicist who has at least come up with an idea for what to do."
I'm no physicist, so I can't have a (useful) opinion, but she does offer at least a broad suggestion of what to do.
But isn't that exactly what was done, e.g., in supersymmetry? The Standard Model could not explain dark matter etc., so physicists tried to come up with a new theory potentially explaining it. Yes, it's based on math - on beautiful math - but isn't that always the case in physics? How did Newton create his laws? By using math. How did Einstein create his theories? By using math. It's math, math, math. So create your math theories, explain nature, and then use Occam's razor to find the simplest. Maybe I'm just getting this all wrong, but I don't get her point.
Not really. Supersymmetry doesn't provide a theoretical explanation for any physical observations we can't explain with current models. Instead, it arises from a desire to... "beautify" the math of the Standard Model. (My first instinct is to write "simplify", but I'm not sure it's an actual simplification.)
One of the things that Sabine has pointed out with respect to supersymmetry is that, for the last two particle accelerators, physicists have been arguing that they'll be able to find the supersymmetric particles to expand the model. And when they've failed to find any evidence for such particles, they've twice said "okay, it's not fatal to our theory, but we'll find them in the next one for sure." Every potentially predictive phenomenon arising from supersymmetry has failed to be found, and the response has been to merely twist the knobs so that these phenomena will happen just out of reach of current technology.
The real risk here is that we are so wedded to particular ideas that we refuse to give up on them, even when they have given us absolutely nothing in terms of extra (validated) predictive value, no matter how much we try to squeeze it out of them.
Quantum mechanics is an example. We know how to use QM; it is an amazing theory. But physicists aren't probing QM anymore - well, most aren't. It does such a good job spitting out answers that we don't ask why. There is a lot to be explored there, but academia tells people to avoid it. They say: don't look at the man behind the curtain. QM is a set of equations that gives good answers, but we don't really know why.
Something Deeply Hidden is a good book on the subject (speaking as a non-physicist).
Of course I am just a layman who likes to read books about these subjects but I honestly don't know shit. Just what I read.
Physicists are probing the heck out of every theory. However, QM has proven to be _extremely_ reliable; even some 'Gedankenexperimente' (thought experiments) by Einstein trying to make it look wrong turned out to be true ('spukhafte Fernwirkung', "spooky action at a distance").
>> There is a lot to be explored there but academia tells people to avoid it.
I don't think so. I mean just look at the Large Hadron Collider (LHC) which probes fundamental forces which are... guess what... based on quantum field theory which itself is based on quantum mechanics.
By probing, I mean looking beyond predictions, at the underlying nature of it. The equations of QM give us great results, but it is very much an oracle-type situation: we ask a question and we get a good answer. But why? Theoretical physicists are told not to look behind the equations and figure out more. Why is it this way?
Let me be frank: I am getting way outside my true understanding and parroting what I have read. But is QM the way it is because it fits the many-worlds theory? Are we missing another piece to explain it? If you take the simplest version of QM that can solve the problems, you are stuck with many worlds.
These are important areas that colliders aren't going to answer, but physicists push new physicists to avoid them.
There are other theories, but they require dressing up the base QM math to eliminate many worlds.
I'm sure physicists have already put a lot of effort into trying to derive Schrödinger's equation, the basis of QM, from a simpler theory. It's not easy, and it would for sure deserve a Nobel Prize. There is no consortium hindering anyone from pursuing this.
I received hundreds of downvotes just for asking questions here (like "Why do we have the right-hand rule for current? What would have to change in the properties of Nature to make it a left-hand rule?").
It's sad that I need to use throwaway accounts to talk about physics.
Try not to be discouraged; it's okay to talk about physics. But try to use the right language for it - mathematics. In online forums, physical ideas or questions are often described in words, which is subject to so much imagination that it can be considered philosophy, but not physics. It's almost pointless to talk without mathematical support; words are words and have no special meaning. In the example above, a better question would be: Why is classical electrodynamics described by Maxwell's equations? Is there a more fundamental theory behind it? Reason: the 'right-hand rule' is not a rule, not a theorem. It can be derived.
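A toy sketch of that last point (plain Python; a made-up one-element geometry, not a real field calculation): the handedness convention enters twice - once when B is defined, once in F = qv x B - so swapping in a left-handed cross product leaves the physical force unchanged.

```python
# Sketch: the "right-hand rule" is a sign convention, not a property of Nature.
# Handedness enters twice (defining B, then using it in F = q v x B),
# so a global flip of the convention cancels out of the force.

def cross(a, b, sign=+1):
    # sign=+1: right-hand rule; sign=-1: left-hand rule
    return [sign * (a[1]*b[2] - a[2]*b[1]),
            sign * (a[2]*b[0] - a[0]*b[2]),
            sign * (a[0]*b[1] - a[1]*b[0])]

q  = 1.0
v  = [0.0, 0.0, 1.0]   # charge velocity (toy values)
dl = [0.0, 0.0, 1.0]   # current element direction
r  = [0.0, 1.0, 0.0]   # unit vector from wire element to charge

forces = []
for s in (+1, -1):
    B = cross(dl, r, sign=s)                   # field uses the convention once...
    F = [q * c for c in cross(v, B, sign=s)]   # ...and the force uses it again
    forces.append(F)

print(forces)  # both entries describe the same physical force
```

The flip cancels because B picks up one sign and the force law picks up another; only convention-independent quantities (like forces between currents) are observable.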
So what needs to be changed in the properties of Nature to switch EM with a right-hand rule into EM with a left-hand rule? Can you explain this using a formula?
Where can we expect to see EM with a LHR? Can we see it in our Laniakea? Or at the opposite side of the Shapley Attractor - the Dipole Repeller? Or at a perpendicular one? Do we have a void between RHR and LHR? Can we cross it?
What is the nature of EM? Is it because of the nonlinear trajectory of motion of our galaxy from the Dipole Repeller to the Shapley Attractor? Or is it because of the non-linear motion of our galaxy within Laniakea? Or something else?
Yeah - I am probably misunderstanding the article, but it seems to boil down to 'we haven't absolutely verified things in an experimental fashion; let's do that again instead of building theoretical mathematical constructs'. When 1) (from my perspective as a physics layperson) that's demonstrably false, and 2) I didn't even see a concrete example of the _cause_ of the perceived stagnation, much less a solution.
Stop working on string theory, the multiverse, or some of the other mainstream theories she mentioned. Start fresh on how to approach theoretical physics, instead of taking the above-mentioned theories almost as a given and trying to do science within that framework.
Except string theory and the multiverse aren’t theories. Heck, they’re not even hypotheses (since they offer no means to test them).
They’re simple speculation; what philosophers right back to the ancient Greeks were only too happy to rest on for 2000 years, until Persian protoscientists, during their own all-too-brief golden age, annoyingly said “but wait, don’t you think we should at least test our beliefs before we take them as granted?”; a practice that fully went mainstream a half-millennium later as the European Renaissance kicked off; and so modern-day Natural Philosophy was born.
If the first rule of science is, “you must not fool yourself (and you are the easiest person to fool)”, then rule zero must surely be “stuff’s hard, folks”. And some things are just invariant, no matter how much you might wish that they not be†.
Causality is not a problem for physics. For example, an electron can move faster than the speed of light in a medium without violating anything.
The problem with an FTL engine is that Cherenkov radiation alone would convert any FTL object into a ball of plasma.
The only way for a material object to survive FTL speed is to travel in its own space-time "bubble". The only natural object with the properties of such a bubble is a black hole. So a warp drive is possible to create, in theory, if we are able to create an artificial black-hole-like shield.
Moreover, the only way to reach FTL speed is to throw something back at FTL speed. It's not possible to create any material fuel with FTL exhaust velocity. The only way to throw something at FTL speed is to throw it in its own space-time bubble.
Faster than light in a medium, which slows light down - not faster than c, which I believe is the point Einstein et al. were making when fixing the speed limit of the universe.
Although a lighthearted joke about “sciencing hard” being invariant certainly flew off its rails just as fast. :p
It does seem a bit whiny... if there are truly other options, then yeah, sure. But it seems the fundamental issue is that where the current theories break down is exactly where you need high-cost accelerators to probe further. And if that's truly the case, I don't see how any amount of philosophy will solve that problem.
If we've chewed over the particle physics data we can currently produce so thoroughly, it probably makes sense to stop employing so many theoretical physicists and instead shift resources into areas that are making progress, whether that's pure math or biology or what-have-you.
If theoretical physics is going nowhere in particular, and we're diverting funding to "pure math" maybe the physicists should put on hats that say "pure mathematicians"?
Can you spot the difference? The experimental result is too big - like 3.5 sigmas too big. For most sciences, 3.5 sigmas is a lot. In particle physics, 3.5 sigmas is interesting, but it may be a fluke.
People are trying to solve it. The idea is that the theoretical model is missing a particle, or an interaction between the known particles. So it's a guessing game: you must guess how the particle behaves. Some properties, like charge, are fixed; some properties, like mass, are a parameter. So you must guess the main properties of the particle and tweak the parameters until you get a theoretical result that is closer to the experimental result.
The problem is that the newly proposed particle may change the theoretical expected result of other experiments. So you must fix the theoretical result of this experiment without breaking allllllllllllllll the other experiments.
And the properties are very constrained.
The easiest to explain is the charge, which must be an integer number - or in some cases n/3, but then you must explain very well why the fractional charges are not found in the wild. Other charges are possible, but people will think you are nuts unless there is a lot of supporting evidence.
Other properties are more difficult to explain, like the symmetry of the W+, W- and Z0 particles. If you propose a new particle that interacts with the W+ particle, there are a lot of restrictions because the particle you propose must interact with the W- particle, or you have to propose another particle that interacts with the W- particle in a very similar way. And at the same time, you must also fix a similar problem with the Z0 particle.
The main problem is that most of these extensions to the standard model are very technical and boring, and they are not covered by the popular press.
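To put rough numbers on the sigma language above (a sketch assuming a simple Gaussian background; real analyses are far more careful about systematics and look-elsewhere effects):

```python
# Why "3.5 sigmas is interesting, but it may be a fluke":
# one-sided Gaussian tail probability of seeing a z-sigma excess by chance.
import math

def tail_prob(z):
    # P(X > z) for a standard normal, via the complementary error function
    return 0.5 * math.erfc(z / math.sqrt(2))

print(tail_prob(3.5))  # ~2.3e-4: roughly 1 in 4000 by chance
print(tail_prob(5.0))  # ~2.9e-7: the conventional "discovery" threshold
```

With many analyses running in parallel, a 1-in-4000 fluke is not that rare, which is why particle physics holds out for 5 sigma before claiming discovery.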
Realizing that they're on a bad path could be a first step in the right direction. Nothing big having been "discovered" for 40 years probably means that the problems are structural.
Physics has made a lot of progress in social justice and representation and these kinds of comments putting down "her point of view" is the type of toxic masculinity that will send us back 100 years.
Would you please stop taking HN threads into ideological flamewar? This was a clear violation of the site guidelines, which say: "Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."
We've had to ask you a ton of times not to post like this. If you keep doing it we're going to have to ban you.
Just out of curiosity, how was op supposed to phrase that sentence in order to keep from being toxic? Was it just the pronoun that was offensive, or was it just having any critique at all?
Disagreement and critique is a positive thing if it's done with respect. Handling someone with kid-gloves is demeaning.
I don’t think physics operates on anything resembling a social-justice system - nothing like the parent comment is going to send us backwards any amount of time in the field, but an attitude that bars commentary might stagnate us for a long time.
>I don’t think physics operates on anything resembling a social-justice system
Physicists are chosen within the Academic-Governmental Complex, and as such physicists are chosen according to its preferences. Maybe this is a big problem? It's not just with respect to selection based on gender, race, or socioeconomics. Have you read Paul Graham's "The Lesson to Unlearn", about the adverse incentives and misaligned merits which academia selects for? http://www.paulgraham.com/lesson.html
Personally, I just hate it when people praise Academia (I see it often with Wall Street, too) as being a shining example of social justice, like it's some gift-wrapped paper around a rotten core.
>an attitude that bars commentary might stagnate us for a long time.
I agree. Now watch comments such as these incite downvotes and inflammatory responses because they question assumptions and make people feel uncomfortable.
You aren’t getting my position - results drive the field, whether those results come from a tenured professor and his team of PhD students, or some kid in his garage with no formal training.
Physicists are not “chosen”. They are built and trained.
These argumentative tactics are poor conduct, and seem to be clearly made in bad faith.
Someone who disagrees with you is just someone who has a different opinion, and that does not make it appropriate for you to attack them.
In other words, what you are doing is wrapping your argument in argumentative tactics. Instead of acknowledging that another user has reached a different conclusion than you, you assert that they "don't get your position." In this context, all you did was reiterate your disagreement, so the personal attack comes across as either a put-down or a statement of arrogance.
At the end of your statement about physics being a meritocracy, you again use argumentative tactics, suggesting that any disagreement is a conspiracy.
First of all, the poor results of physics as a field are not a conspiracy, and you do not need to be a conspiracy theorist to criticize the field of physics. That is just ridiculous. Did you even take a moment to look at the Paul Graham post I referenced? Something tells me you are more interested in broadcasting and arguing about your worldview than in engaging in thought-provoking discussion.
But finally, social justice in academia is blatantly obvious. For example, affirmative action awards up to a few standard deviations of promotion in rank in undergraduate admissions alone. Also, the author of the OP has written about social justice in the field herself. It is just both dishonorable and disingenuous for you to ignore the truth and be so argumentative.
It isn't a throwaway comment; it is a reference to writing and work she has been doing for quite a while. It does come across as fairly arrogant without the context of all her other writing and work. With that context, it comes across to me as more of an expression of frustration.
Yeah, but if you've followed Hossenfelder's writing, you know that it's been wobbling along the line between relevant critique and pure clickbait for over a year now.
This is likely the first time I've seen any claim that she has a good idea of what to do instead (she's had the idea before, but not claimed it to be a good one, as I recall); for the past year it's been mostly about how all other physicists are stupid... and it's not like this supposedly "superior" strategy has yielded any great advances in her own work.
Einstein isn't alone, though; there are others. Like how Heaviside ruined Maxwell's original 20 equations. And there aren't really that many explanations for how this came about. It could be a result of stupidity, of course - or it could be that someone, for some reason, maybe didn't want everyone to know.
Imagine if you could modulate electromagnetic scalar waves to arrive in phase at a specific point in space. Someone could put a quad of energy into your bedroom and no one could tell where it came from. It would make the Hiroshima bomb sound like a firecracker, and all you need is a lot of electricity and antennas. Big antennas, though.
Two reasons: either scientists and academics have stopped exploring (true to a degree), or they have been looking in the wrong places with the wrong mindsets and incomplete reasoning (mostly this). That said, science has evolved a lot, yes. But it could have evolved a thousandfold more if it weren't for that second reason.
>But all shortcomings of these theories – the lacking quantization of gravity, dark matter, the quantum measurement problem, and more – have been known for more than 80 years. And they are as unsolved today as they were then.
Would solving these shortcomings make life better for anyone?
Other than the now famous physicist that came up with the solution, wouldn't the others working on these problems become unemployed?
We have grave inconsistencies in cosmology, yet very bold conclusions are made, assuming that our theoretical understanding of gravity and general relativity is 100% correct at all scales.
The claim about accelerated expansion, only a few decades old, is an example of the enormous arrogance of modern science, given that such theories have been developed and debunked every few decades over the past 100-200 years.
The historical perspective seems to fail completely, and today's physicists seem supremely confident even though their colleagues of only a few generations ago were proven wrong many times. Why would anything be different now? Because we throw so much money at it? Because we have wonderful computers? Have we become so much more intelligent as a species?
Thirty spokes share the wheel's hub;
It is the center hole that makes it useful.
Shape clay into a vessel;
It is the space within that makes it useful.
Cut doors and windows for a room;
It is the holes which make it useful.
Therefore profit comes from what is there;
Usefulness from what is not there.
Tao Te Ching - Lao Tzu - chapter 11