This is a new proposal, fresh on the arXiv today, from a group of U.S. particle physicists. The introduction is very readable and lays out the mission clearly:
> We can now confidently claim that the “Standard Model” of particle physics (SM) is established. At the same time, we are more and more strongly persuaded that this SM is incomplete. [...] It is now common to describe the SM as an “effective” theory that should be derived from some more fundamental theory at higher energies. But we have almost no evidence on the properties of that theory.
> Our successes have become a liability in reaching this goal. Scientists from other fields now have the impression that particle physics is a finished subject. They question our motivations to go on to explore still higher energies. The scale of an energy frontier collider is also challenging to the young people in our field. They need to see qualitatively new capabilities realized during their active scientific careers. [...] That is where the urgency lies.
> [T]he entire C3 program could be sited in the United States. With the cancellation of the Superconducting Super Collider and the end of Tevatron operations the US has largely abandoned construction of domestic accelerators at the energy frontier. C3 offers the opportunity to realize an affordable energy frontier facility in the US. This may be crucial to realize a Higgs factory in the near term, and it will also position the US to lead the drive to the next, higher energy stage of exploration.
The main innovation is that they propose to use non-superconducting cavities, which allow much higher accelerating fields, cooled to increase their quality factor. The resulting shorter length dramatically decreases the cost, to an estimated $4 billion, which is 80% to 90% less than other proposals. Of course, $4 billion is no small amount of money, but for perspective that's about equal to the monthly budget of the National Institutes of Health, a third of the cost of the James Webb Space Telescope, or 2% of the total cost of the space shuttle.
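Those comparisons check out with round figures. A quick sanity check (all reference figures here are my own rough assumptions: NIH budget ~$45B/year, JWST ~$10B total, Space Shuttle program ~$200B total; they are not numbers from the paper):

```python
# Sanity check on the cost comparisons. All reference figures are rough
# assumptions, not from the paper: NIH ~ $45B/year, JWST ~ $10B total,
# Space Shuttle program ~ $200B total. Values in billions of dollars.
c3_estimate = 4.0

comparisons = {
    "NIH monthly budget": 45.0 / 12,
    "one third of JWST": 10.0 / 3,
    "2% of Shuttle program": 200.0 * 0.02,
}
for label, value in comparisons.items():
    print(f"{label}: ${value:.1f}B (C^3 estimate: ${c3_estimate:.1f}B)")
```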
I would expand on this to say that they propose to use an innovative cavity shape. Normal-conducting cavities can reach higher accelerating fields than superconducting cavities, but at much lower duty factors and with high-voltage breakdown events. This new design would allow lower power dissipation than typical normal-conducting structures, ultimately allowing higher RF duty factors.
There are still some downsides/tradeoffs compared to superconducting structures, including a much smaller beam aperture (5 mm diameter vs. ~100 mm for superconducting cavities), which degrades beam quality. Superconducting machines can also be run in continuous-wave mode (100% duty factor), and state-of-the-art niobium cavities have been driven at ~50 MV/m in CW.
I dropped out of my theoretical particle physics Ph.D. program when the SSC was cancelled. Experimental results for the stuff I worked on were delayed 20 years.
Agree. Had much the same thought. The SSC might have found the Higgs as predicted, but nothing new. Now, that's not to give the CERN people a hard time. They tried and are trying very hard to go beyond the SM, but mom (Mother Nature) isn't making it easy for them. In the US I'm hoping neutrino work will give us the edge there.
I was just thinking that laser-driven plasma wakefield accelerators have produced some promising results recently. Particle physicists still seem stuck in the big accelerator mentality when maybe there are better uses for that money, even in their own field.
Of course it has its own problems; the question is whether a novel approach is likely to yield more novel insights than the standard approach of the past 50 years. The answer seems kind of obvious.
Generally when papers like this come out, there are a couple of decades' history behind them.
(Disclaimer: I'm an experimental physicist, but don't have domain knowledge in the subtleties of modern accelerator-cavity design, which is both an art and a science. Most modern accelerators, including the LHC, are driven with superconducting niobium cavities.)
Since essentially every author is at SLAC, I suspect that there is a strong cold-copper research group there.
Can anyone with domain-knowledge summarize the generally-accepted strengths/weaknesses/risks of cold-copper accelerator designs?
The paper describes a design for a linear accelerator that would reach 250 GeV in the center-of-mass frame and that may be extended to 550 GeV (see section 2) and even multi-TeV (see section 5).
I was curious how this compares against the LHC, which Wikipedia [1] says reached a record 13 TeV total collision energy. However, it isn't clear whether Wikipedia is citing energy in the center-of-mass frame that could be directly compared.
Does anyone know how the two accelerators would compare in this respect? In particular, would the proposed C³ accelerator actually achieve higher total collision energy than the LHC, or is it instead hoped that future extensions of a C³ accelerator would exceed the LHC's capabilities?
It looks like the proposed accelerator smashes electrons and positrons together, which are nice and clean elementary particles (as far as we know), so much of the collision energy is directly usable to create, say, Higgs bosons with a mass of 125 GeV.
At the LHC they smash protons into each other, which at this scale are a bit like a soupy mess of quarks and gluons. So its 13 TeV c.o.m. energy actually gets spread out over multiple elementary particles, which might not even hit each other straight on. Because of this, the number of high-energy collisions between elementary particles is significantly lower.
This is just my lay-physicist perspective by the way; an accelerator specialist can probably provide more details.
The difference is that the LHC is a proton-proton collider, and the proton constituents (partons) that participate in the "hard" collision only have a fraction of the total energy, given by some probability distribution (parton distribution functions). For an e+e- machine you essentially use the full energy in the hard collision.
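To make that concrete, here is a toy numerical sketch. The falling distribution below is invented for illustration (real parton distribution functions are far more involved), but it shows how little of the nominal 13 TeV typically enters the parton-level "hard" collision, whose energy is sqrt(x1 * x2) times the total:

```python
# Toy sketch: sample two parton momentum fractions from a crude falling
# distribution (NOT a real PDF set) and look at the parton-level
# collision energy sqrt(x1 * x2) * 13 TeV.
import random

def toy_x():
    # Heavily weighted toward small momentum fractions, as real PDFs are.
    return random.random() ** 4

random.seed(0)
s_total = 13_000  # GeV, nominal LHC proton-proton energy
samples = sorted(
    (toy_x() * toy_x()) ** 0.5 * s_total for _ in range(100_000)
)
print(f"median hard-collision energy: {samples[len(samples) // 2]:.0f} GeV")
print(f"95th percentile:              {samples[int(0.95 * len(samples))]:.0f} GeV")
```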
This proposes a linear e+e- collider with beam energies of about 250 GeV.
Now if you shine laser light on these electron beams, the back-scattered photons will carry ~80% of the electrons' energy. So you'll have 200 GeV photons. And, since photons do not repel each other, you can focus them much better, meaning that you might achieve even higher luminosities, despite conversion losses.
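For anyone who wants to check that ~80% figure: for a head-on collision, the standard inverse-Compton kinematics gives a maximum back-scattered photon energy of x/(1+x) times the beam energy, with x = 4 E_e w_L / (m_e c^2)^2. A quick sketch (the 1.2 eV laser photon energy, roughly a 1 micron laser, is my assumption):

```python
# Back-of-envelope check of the ~80% figure using inverse-Compton
# kinematics for a head-on electron-laser collision.
E_e = 250e9    # electron beam energy, eV
w_L = 1.2      # laser photon energy, eV (assumed ~1 micron laser)
m_e = 0.511e6  # electron rest energy, eV

x = 4 * E_e * w_L / m_e**2   # dimensionless kinematic parameter
E_gamma = x / (1 + x) * E_e  # maximum back-scattered photon energy
print(f"x = {x:.2f}, max photon energy = {E_gamma / 1e9:.0f} GeV "
      f"({100 * E_gamma / E_e:.0f}% of beam energy)")
```

This gives x of about 4.6 and a maximum photon energy of about 205 GeV, i.e. roughly 82% of the beam energy, consistent with the figure above.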
> It is now common to describe the SM as an “effective” theory that should be derived from some more fundamental theory at higher energies.
I'm curious why the more fundamental theory should be at higher energies.
In some sense I know part of the answer is 'because we haven't found it at low energies yet', another being 'because it takes sufficiently high energy to produce additional particles (e.g., Higgs)'.
Interactions at lower energies are very well predicted by the Standard Model, so deviations are either so small they're hard to find (but see the muon g-2 experiments), or at energies higher than what we've probed.
This isn't like the discovery of quantum mechanics, where we knew there were things in the "everyday" regime we couldn't predict (e.g. blackbody spectra). It's more like the situation before general relativity, where Newtonian gravity was an otherwise extremely successful theory that broke down in an extremal regime. Except we don't even have an example like the precession of Mercury's orbit to point at and say, "this is where it breaks down".
"We haven't found it at low energies yet" understates how much they've looked and how much hasn't been found.
Well, it’s not that you need higher energies to invent a new theory (even though you do, on a personal level); the problem is that it takes the discovery of a new particle to see which of these theories may be correct (and which of them aren’t), and new particles are only expected to be found as products of interactions between particles at higher and higher energies.
We believe we should have new physics at higher energies because treating the SM as an effective field theory introduces additional terms in the Lagrangian which are suppressed (roughly, divided) by powers of some energy scale. These additional terms are the ones allowing lepton flavor violation (a muon decaying to an electron and a photon, for example), but we know from experiments that these events are incredibly rare. This tells us that the energy scale of new physics is very high.
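Schematically, the effective-theory expansion being described looks like this (a sketch; Λ is the new-physics scale and the c_i are dimensionless coefficients):

```latex
\mathcal{L}_{\mathrm{eff}}
  = \mathcal{L}_{\mathrm{SM}}
  + \sum_i \frac{c_i^{(5)}}{\Lambda}\,\mathcal{O}_i^{(5)}
  + \sum_i \frac{c_i^{(6)}}{\Lambda^2}\,\mathcal{O}_i^{(6)}
  + \cdots
```

Amplitudes from the dimension-six operators are suppressed by 1/Λ², so rates scale like 1/Λ⁴; not observing a muon decaying to an electron and a photon therefore pushes Λ for those operators very high.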
However, we could also have new physics at lower energy scales, as is the case with axions.
I wouldn't want to be accused of claiming that such a project is worth doing only if it produces other technological advances, but fortunately your question does not have to be interpreted that way.
Here is an exchange between physicist Robert Wilson and Senator John Pastore in 1969, about establishing Fermilab.
Pastore: Is there anything connected with the hopes of this accelerator that in any way involves the security of the country?
Wilson: No sir, I don't believe so.
P: Nothing at all?
W: Nothing at all.
P: It has no value in that respect?
W: It has only to do with the respect with which we regard one another, the dignity of man, our love of culture. It has to do with: Are we good painters, good sculptors, great poets? I mean all the things we really venerate in our country and are patriotic about. It has nothing to do directly with defending our country except to make it worth defending.
Spending four billion dollars to prove (to whom?) that we are really good at building super colliders is not something I personally would like to see my government do. If I want to prove my country is worth defending, I want it to be a place where no children go hungry and the citizens have the resources and feel empowered to continue our elimination of human suffering in this world. I personally can't see what a new accelerator would provide to humanity over using that money to provide food for the hungry.
What do you think happens with the money? It's not burned. Ideally, the project employs people who exchange their pay for goods and services, which in turn might make some children not go hungry.
Consumption is not bad (up to ecological damage). Wealth hoarding and concentration is bad.
You could also try to repair all the bridges from the east coast to the west coast, but I'm not sure you'd make it past the Appalachian Trail with a budget that small.
If we're talking about megaprojects, a mini-Manhattan Project of sorts for developing better theories of superconductivity would be extremely beneficial. Practical room temperature superconductors would totally revolutionize our society. We're close, but we're not quite there yet.
The usual question is what to do with those who don't know anything about that and don't really want to. Should particle physics simply be ended? Should they go and beg for money from billionaires?
We don't coddle other professions with declining ROI like that. Why should particle physics be an exception? Anyone else would be told to "learn to code".
That's a false perspective. People keep repeating things like "why does India send a probe to Mars when there are so many poor people in India?" In the long run, investments in new technologies always lead to improvements in ordinary people's lives. Even building the pyramids in Egypt was beneficial for the peasants living there, as it required developing a skilled workforce and building an efficient administration, "project management" skills that were then used in other fields (agriculture, warfare).
Scientists are people too! What's wrong with feeding them? You can also see any research facility or space program as a cheap jobs program that keeps employed the people building it (scientists are really underpaid compared to software engineers, lawyers, etc.), the people doing research there, and all the people who build anything that these people use.
It's also subsidizing the high-precision research and development that private companies do to be able to provide the tools used by these research facilities (though to a much lower extent than military programs), which in turn should lower the cost of producing the high-precision doodads you put on your wrist or in your pocket.
Understanding the deep structure of matter is necessary for two things - materials science and energy science - both of which could have massive economic payoffs for society.
In the modern world, much of our economic growth is driven by innovation - new scientific understandings result in new technologies that result in new products and services that introduce new capabilities into the world that never existed before, and which are in demand because they improve people's lives.
Scaling production and volume lowers prices, making them more accessible to everyone. That's the wealth-creation pipeline at work, a modern miracle that brought humanity out of the permanent dark ages and repeating Malthusian cycles.
For example, much of modern technology - silicon chips, mobile phone networks, etc, would not have been possible without a deeper understanding of quantum physics.
However, continuing this growth and wealth-generation process depends on continuing to deepen our understanding of the fundamental building blocks of matter and energy. Stop or slow the latter and we stop or slow the former. To date, particle accelerators are a key component of doing that. And even if someone discovers a better way, we'll still have to invest in it; either way, we have to invest in the process of unveiling the deep structure of matter.
There are miserably poor people in this world, that's right, but it's not because of the LHC or particle physicists. And assuming you're American (most likely) then your country has far, far, far, far bigger problems than the support it may opt to provide to a fundamental physics search.
It created White Rabbit [0], a system for sub-nanosecond timekeeping precision over Ethernet. It is now used in many financial exchanges to ensure fairness for market participants. [1]
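For a feel of what it does, here is a minimal sketch of the two-way time-transfer arithmetic at the core of PTP, which White Rabbit refines to sub-nanosecond precision with hardware timestamping and phase tracking (all timestamp values below are invented for illustration):

```python
# Minimal sketch of two-way time transfer (the PTP core that White Rabbit
# refines). Timestamps are in nanoseconds and invented for illustration.
t1 = 1_000  # master sends sync message   (master clock)
t2 = 1_530  # slave receives it           (slave clock)
t3 = 2_000  # slave sends delay request   (slave clock)
t4 = 2_470  # master receives it          (master clock)

# Assuming a symmetric link, simple arithmetic separates the one-way
# link delay from the slave clock's offset relative to the master:
delay = ((t2 - t1) + (t4 - t3)) / 2
offset = ((t2 - t1) - (t4 - t3)) / 2
print(f"link delay ~ {delay:.0f} ns, slave offset ~ {offset:.0f} ns")
```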
That sounds like it has very marginal value to me. What's the problem it's solving in financial exchanges that you can't do with pedestrian cables and techniques inside a datacenter? Let alone a wild and crazy solution like only processing orders once per second.
I'm especially looking at where it says "typical distances of 10 km between network elements"
Yes, it's just that they aren't directly linked with the LHC. Most of the time you're dealing with second-, third-, or higher-degree connections, where part of the LHC required support technology that in turn required technology that can often be used for something else. The most famous example is of course the WWW, although it's fairly remote: part of the encouragement for building it was to help CERN work better, and at the time work on the LHC's design had already started.
Other examples are work on grid/cloud technology as well as supercomputing, computer visualisation, research on industrial automation (something I personally used for work, directly related to the challenges of building and running the LHC), etc.
I have no direct insight, but from what I gather, the amount of data generated there is staggering (1 petabyte/second raw, 1 petabyte/day after filtering). Storing and processing this much data certainly required immense technological advances (which have been shared with the community at large). Again, my impression from far outside.
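Taking those two figures at face value, the implied rejection factor is striking:

```python
# Rough arithmetic on the quoted figures: 1 PB/s raw vs 1 PB/day kept.
raw_pb_per_s = 1.0
kept_pb_per_day = 1.0
seconds_per_day = 86_400

raw_pb_per_day = raw_pb_per_s * seconds_per_day
rejection = raw_pb_per_day / kept_pb_per_day
print(f"raw: {raw_pb_per_day:,.0f} PB/day; "
      f"filtering keeps ~1 part in {rejection:,.0f}")
```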
But fundamental research like this does not have short-term prosperity or technological breakthroughs as main priority anyway, so I'm not sure why you're so concerned about that.
Goodhart's Law ensures that if you require technological advances, you will get some. For instance, the SSC was going to bring us flat panel displays.
What?
Yes, I recall at the time that when news articles talked about the SSC, flat panel displays were mentioned in multiple articles. Now this was certainly a case of one article influencing the others, with the original information source buried in obscurity.
We're really bad at projecting the technological outcomes of any specific science research project, and asking scientists or their funding agencies to do so is just an invitation for them to make up some bullshit. I'd prefer that they are just honest with their expectations.
Is it not the case that a lot of machine learning research was done to make the data processing at LHC possible?
There are literally civilizations' worth of knowledge generated by projects like this. Not all of it is commercially useful, but if you bump into it and find they've solved your problem or written your tutorial better than you could have, it's very nice to have.
Are you implying that advancing particle physics itself is useless? I believe that today's biggest challenge in engineering is finding applications for the most cutting-edge physics, and I think a Skunk Works-style focus on that is the need of the hour.
If we have not found any applications, it's probably because we have not tried enough / invested enough.
The LHC does basic research, which is the opposite of what you are asking for (applied research). It seeks to increase our understanding of particle physics, not make our lives directly better.
Yes, building an accelerator would have economic benefits for whatever location is chosen as the site. This could be built in Japan (which, as the paper mentions, may build the ILC) or even the US.
It would probably read better if they focused on the science benefits and were less overtly "need jobs for particle physicists or no-one will go into our field".