> The headline here should be about the first measurement of a mass asymmetry between matter and antimatter.
The mass difference is not between particle and antiparticle, but between the mass eigenstates, which are not antiparticles of each other. It's also not the first Δm to be measured; in fact, it was the last remaining one.
So if the mass eigenstates aren't different then you don't get mixing between the flavor eigenstates, so this is expected? And measuring the frequency of the FCNCs is what gives them the difference between the mass eigenstates? (Bear with me, it's been 25 years since I decided to become a programmer instead...)
Δm is directly proportional to the frequency of the oscillation. (In natural units it is exactly the angular frequency: cos(Δm·t) appears in the decay rate.) So indeed Δm=0 means no oscillation. You measure it by essentially counting the number of particle and antiparticle decays as a function of decay time.
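If it helps to see it concretely, here is a minimal Python sketch of that counting exercise for an idealised two-state system with no CP violation and equal widths for the two eigenstates (the numbers are illustrative, not the measured ones):

    import numpy as np

    # Idealised mixing: in natural units (hbar = 1) Delta-m is literally the
    # angular frequency of the oscillation in the decay-time distribution.
    delta_m = 2.0 * np.pi   # illustrative value, not the measured one
    gamma = 1.0             # total decay width
    t = np.linspace(0.0, 5.0, 1000)

    # Probability of observing an "unmixed" vs a "mixed" decay at decay time t
    # (no CP violation, equal widths for the two mass eigenstates):
    p_unmixed = 0.5 * np.exp(-gamma * t) * (1 + np.cos(delta_m * t))
    p_mixed   = 0.5 * np.exp(-gamma * t) * (1 - np.cos(delta_m * t))

    # What you actually fit: the asymmetry between the two counts per time bin,
    # which oscillates as cos(Delta-m * t).
    asymmetry = (p_unmixed - p_mixed) / (p_unmixed + p_mixed)
    print(asymmetry[:5])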
What about charged systems like proton-antiproton? We expect there to be some similarly tiny Δm there, but obviously you can't have FCNC because it's no longer neutral?
This reminds me of a 2004 weblog post by one of the co-creators of ZFS, Jeff Bonwick, on why 128 bits was chosen:
> Some customers already have datasets on the order of a petabyte, or 2^50 bytes. Thus the 64-bit capacity limit of 2^64 bytes is only 14 doublings away. Moore's Law for storage predicts that capacity will continue to double every 9-12 months, which means we'll start to hit the 64-bit limit in about a decade. Storage systems tend to live for several decades, so it would be foolish to create a new one without anticipating the needs that will surely arise within its projected lifetime.
> If 64 bits isn't enough, the next logical step is 128 bits. That's enough to survive Moore's Law until I'm dead, and after that, it's not my problem. But it does raise the question: what are the theoretical limits to storage capacity?
> Although we'd all like Moore's Law to continue forever, quantum mechanics imposes some fundamental limits on the computation rate and information capacity of any physical device. In particular, it has been shown that 1 kilogram of matter confined to 1 liter of space can perform at most 10^51 operations per second on at most 10^31 bits of information [see Seth Lloyd, "Ultimate physical limits to computation." Nature 406, 1047-1054 (2000)]. A fully-populated 128-bit storage pool would contain 2^128 blocks = 2^137 bytes = 2^140 bits; therefore the minimum mass required to hold the bits would be (2^140 bits) / (10^31 bits/kg) = 136 billion kg.
> That's a lot of gear.
> To operate at the 10^31 bits/kg limit, however, the entire mass of the computer must be in the form of pure energy. By E=mc^2, the rest energy of 136 billion kg is 1.2x10^28 J. The mass of the oceans is about 1.4x10^21 kg. It takes about 4,000 J to raise the temperature of 1 kg of water by 1 degree Celsius, and thus about 400,000 J to heat 1 kg of water from freezing to boiling. The latent heat of vaporization adds another 2 million J/kg. Thus the energy required to boil the oceans is about 2.4x10^6 J/kg * 1.4x10^21 kg = 3.4x10^27 J. Thus, fully populating a 128-bit storage pool would, literally, require more energy than boiling the oceans.
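For anyone who wants to check Bonwick's arithmetic, here's a quick Python sketch using only the constants quoted above:

    # Reproduce the back-of-the-envelope numbers from the quote above.
    bits = 2 ** 140              # 2^128 blocks = 2^137 bytes = 2^140 bits
    bits_per_kg = 1e31           # Lloyd's limit on bits per kg
    mass_kg = bits / bits_per_kg
    print(f"minimum mass: {mass_kg:.3g} kg")              # ~1.4e11 kg (~136 billion kg)

    c = 3e8                      # speed of light, m/s
    print(f"rest energy: {mass_kg * c**2:.3g} J")         # ~1.2e28 J

    ocean_mass_kg = 1.4e21
    heat_per_kg = 4e5 + 2e6      # freezing-to-boiling plus latent heat, J/kg
    print(f"boil the oceans: {heat_per_kg * ocean_mass_kg:.3g} J")   # ~3.4e27 J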
> A fully-populated 128-bit storage pool would contain 2^128 blocks = 2^137 bytes = 2^140 bits; therefore the minimum mass required to hold the bits would be (2^140 bits) / (10^31 bits/kg) = 136 billion kg.
136 billion kg:
- ≈ 0.64 × mass of trash produced in the United States in one year ( ≈ 2.36×10^8 sh tn )
- ≈ 0.35 × estimated wet biomass of all humans alive ( ≈ 385 Mt )
- ≈ 1.3 × estimated dry biomass of all humans alive ( 105 Mt )
Another one: 136 billion kg is the mass of a block of water with a footprint of a square kilometer and a height of 136 meters. Put this way, it does not sound like that much anymore, in my opinion.
Hold up. The article shows a mass difference of 10^-38 grams for the matter/antimatter states. That's not a few orders of magnitude off your number, that's literally your exponent.
I'd love it if they could stop trying to assign these results to individual institutes (Oxford in this case). This is an LHCb publication and that's that.
Did the Oxford group actually not do the relevant analysis as the university suggested?
(I conclude from the lack of mention in my current and previous universities' news -- wot, no Tara Shears? -- that at least their LHCb groups weren't significantly involved!)
Consider it like any other resource for scientific exploration. If it was a radio telescope array and a group from Oxford made a discovery while analyzing the massive amounts of data those generate, the array facility wouldn't get top credit for the discovery.
That might be how it works in astronomy, but not in particle physics. The collaborations which build and operate the detectors are also the people who get to analyse the data. There is "open data" but it's released after a long embargo period and is much less complete. (In fact, you won't find any LHCb open data yet because CERN won't give us enough storage to host it.)
Officially, all qualified members sign all papers. The names of the people who worked on an analysis are not publicised. There is already a preference to only work on analysis, because it's cheap and easy (no need to send people to CERN or kit-out expensive labs), but that doesn't help much in the way of actually collecting data, so "service work" shouldn't be discouraged by attributing extra credit to people doing analysis.
Institutes sometimes write press releases such as this where they exaggerate their involvement. In this instance, only 2 out of 10 people in the analysis group are affiliated to Oxford. It's very disingenuous not to acknowledge that they're part of a larger collaboration.
>> the Big Bang should have produced matter and antimatter in equal amounts, and over time that all would have collided and annihilated, leaving the cosmos a very empty place. Obviously that didn’t happen
Why do we expect the Big Bang would have created only a little more matter than we observe? How do we know that almost all matter wasn't annihilated, and that what's left really isn't just a rounding error after ~50% annihilated ~50%?
1. Because there isn't even close to the amount of radiation around that we would see from such incredible amounts of annihilation.
2. Matter/antimatter annihilation is symmetric: it removes the same amount of matter and antimatter from the equation, so however much stuff we started with, we expect to still see the same amount of matter and antimatter, and the antimatter doesn't seem to be around.
It's worth emphasising that there is some matter/antimatter asymmetry in the standard model; they don't behave as exact "mirror opposites", particularly in the context of some kaon physics, but there isn't even close to enough asymmetry to explain the amount of matter and the lack of antimatter we observe.
There is no reason to believe that matter and antimatter was produced in equal quantities in the Big Bang.
2 particles, one of which is the anti-particle of the other, form a system where the sum of all quantities that are conserved can be 0, so a transformation generating or destroying both particles is possible.
Nevertheless there are also other systems of particles where the sum of all conserved quantities can be 0 so there should exist transformations that generate all of them simultaneously.
Besides the systems of 2 particles, where generation and annihilation is well known, there are systems of 4 particles with zero sum of all conserved quantities (e.g. a quark, an anti-quark, an electron and an anti-neutrino) and also systems of 8 particles e.g. the 3 kinds of u quarks + the 3 kinds of d-quarks + electron + anti-neutrino.
The systems of 4 particles can actually be generated and destroyed in weak interactions, much as the systems of 2 particles can be generated and destroyed in electromagnetic interactions.
For the systems of 8 particles, there is currently no theory about a mechanism of generation or annihilation, but my bet is that this is how matter was created during Big Bang (those 8 particles aggregate into 1 proton, 1 neutron, 1 electron and 1 neutrino).
For now we have a theory of Big Bang not from the starting point but from a little later when there already was normal matter as we know it.
There is no theory yet for the origin of the Big Bang, but there is absolutely no reason to believe that at the beginning there was an electromagnetic interaction simultaneously generating equal numbers of particles and anti-particles. For that to be true, the matter and anti-matter would have to have been preceded by electromagnetic waves, i.e. photons, of energy equivalent to the mass of the generated matter and anti-matter, which would then annihilate, restoring the previous photons. Such a theory goes nowhere, so it does not match reality. Whatever started the Big Bang, it was not an electromagnetic field generating pairs of particles and anti-particles.
There's a lot of stuff here that I'm not going to pretend to be expert enough to talk about, but some is definitely wrong.
I think the main point that is missing is that the standard model is very close to symmetric, so if you have some process that generates systems of 4 or 8 or however many particles from a neutral boson, then exactly the same process should occur producing all the anti-particles of the above. E.g. if you think you make
1 proton, 1 neutron, 1 electron and 1 neutrino
in such a way that all the numbers balance then you should also see a process that makes
1 anti-proton, 1 anti-neutron, 1 anti-electron and 1 anti-neutrino
It's also worth pointing out that pair production is not specifically an electromagnetic thing. For example, the Z0 boson pair-produces fermion+anti-fermion pairs in a very similar way to the photon, and W bosons decay to lepton+anti-neutrino pairs in a similar way (although W bosons are charged so it's a little different). The Higgs boson also decays into quark-antiquark, lepton-antilepton or even boson-antiboson pairs.
You are right that the sum of the conserved quantities can be zero also for the 8 quarks & leptons composing 1 anti-proton, 1 anti-neutron, 1 anti-electron and 1 neutrino.
I did not claim that this is how matter was generated in the Big Bang, especially because for now there exists no theory for such a process.
My point is that this is a strange coincidence as it would provide matter in the right proportions. Unlike for electromagnetic interactions, there might be some asymmetry making the process generating the particles more probable than the process generating the antiparticles. Because the 2 processes are decoupled, they might have happened at different rates, which is not possible with electromagnetic generation and annihilation.
Of course, the correct explanation for Big Bang might be completely different, only the fact that there was no simultaneous generation of matter and anti-matter is certain.
The neutral Z0 boson behaves like the photons, so I did not mention it.
The charged W bosons cannot generate particle/anti-particle pairs because that would violate the conservation laws. They generate simultaneously 4 elementary particles: quark, anti-quark, lepton and anti-lepton.
The quark and the anti-quark remain bound as one meson.
The reverse event is also possible, but highly improbable, because it is unlikely for a meson, a charged lepton and a neutrino to collide simultaneously, in order to annihilate into W bosons. What happens frequently is the equivalent event when 2 of the 3 collide and transform into a W boson together with the antiparticle of the third.
These type of events involving W bosons are what I was referring to as the generation/annihilation of systems of 4 elementary particles through weak interactions.
Unlike the stable photon, the W bosons have a negligible lifetime.
So in fact those processes listed by you that result in 2 particles are 4-particle processes.
Two particles collide: either a quark and an anti-quark, giving 2 other particles, a lepton + an anti-lepton (in your examples positron + electron neutrino, or muon + muon antineutrino), or a lepton collides with an anti-lepton, giving a quark and an anti-quark (in your example up quark + anti-down quark).
So there are always 4 particles: either quark, anti-quark, lepton and anti-lepton; or there may be a lepton and an anti-lepton at both input and output, which includes the case of scattering through weak interactions; or there may be a quark and an anti-quark at both input and output, which includes the case when a meson decays into another meson through a weak interaction.
All the interactions involving W bosons are 4-particle interactions, where the W boson is an intermediate particle that exists only during a negligible time.
The 4 particles may be all at the input, all at the output, 3 at the input and 1 at the output, 1 at the input and 3 at the output or the most frequent case, 2 at the input and 2 at the output.
Like all the interactions between elementary particles all these cases are equivalent and all the variants are interchangeable by moving a particle from the input to its anti-particle at the output or vice-versa.
In most cases you need to pay attention only to the 4 particles participating in the weak interaction and not to the intermediate W boson.
The existence of the W boson matters only when you need to compute various things because this intermediate particle transforms diagram nodes with 4 edges that cannot be computed into diagram nodes with 3 edges that can be computed with the methods of quantum electrodynamics.
Everything you've written here applies exactly the same to the Z0. I'm not sure why you insist on claiming some fundamental difference in the case of the W bosons while claiming "The neutral Z0 boson behaves like the photons"
None of them can explain anything besides what is already explained without them.
The Big Bang theory that is useful starts with matter at very high temperature and density, but otherwise not different from the matter that we know. That happens at some uncertain time interval after the beginning of Big Bang.
We may try to extrapolate towards time 0 and increasing temperatures and densities, but that soon reaches values of temperature and pressure high enough that we do not really know how matter behaves in those conditions, and it does not matter anyway because it does not influence what happens later.
For the later evolution, it is enough to start the modeling of Big Bang from the moment when the temperature became low enough, e.g. of some tens of MeV, i.e. when matter was too hot for nuclei and atoms to exist, but cold enough so that it consisted of a plasma composed of protons, neutrons, electrons, positrons, photons and neutrinos, with only negligible quantities of heavier particles.
For the time 0 there is no theory that can predict anything quantitative.
Suppose the amount of matter/antimatter produced at creation is chi-distributed. So there's some random variation, and each universe gets a different random value. If the split is too close to equal, there's too much background radiation and the universe doesn't support life.
Can radiation be absorbed by matter and turned into kinetic energy? Because matter in the universe has plenty of that when supermassive black holes and the things that fly around them run away from each other.
Sure, but the amount of kinetic energy (in sensible frames of reference) isn't really significant compared to the amount of energy locked up as mass energy of stuff.
To a close approximation, the kinetic energy of something is 1/2 * mv^2 and the mass energy is mc^2, so to have a meaningful contribution to the total energy the speed has to be close to the speed of light. We generally don't observe big things (stars, black holes, galaxies) moving anywhere near that fast, and their energy is basically entirely due to their mass rather than their speed.
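As a back-of-the-envelope sketch (the 600 km/s galaxy speed below is just an assumed illustrative peculiar velocity, not a figure from the article):

    # Ratio of non-relativistic kinetic energy to rest-mass energy:
    # (1/2 m v^2) / (m c^2) = v^2 / (2 c^2), independent of the mass.
    c = 3.0e8          # speed of light, m/s
    v = 6.0e5          # assumed galaxy speed, m/s (600 km/s, illustrative)
    print(f"KE / rest energy ~ {v**2 / (2 * c**2):.1e}")   # ~2e-6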
Really? Distant galaxies move away from each other at insane speeds. That seems like plenty of kinetic energy to me...
What if kinetic energy is the only thing responsible for galaxies moving away from each other? Or at least the bulk of it? Maybe the part that we interpret as a period of super-fast inflation right after the big bang? Maybe this rapid inflation is just a way of interpreting matter having a huge amount of kinetic energy relative to each other right from the beginning?
That would mean that we are (as in, our galaxy is) in a very, very special position, because we see all distant galaxies moving away from each other with velocities proportional to their distance from us.
This is exactly what we would expect if the universe is expanding isotropically but under the "galaxies actually moving away from each other" hypothesis this is only possible if Earth is exactly at the center of a sort of "explosion of galaxies".
A second issue with this theory is that we see objects with redshifts that would mean that if their apparent motion is actually real motion then they would be moving away from us faster than the speed of light. As far as we know this is completely impossible for actual motion but is exactly what we would expect from expansion.
Start with a bunch of objects at coordinates 0,0,0. Give them random velocities from 0 to c. Then just let them move according to Newton's laws.
Now focus on a single random object and narrow down your field of view so you won't see the edge. Look at the other objects. They will seem to be moving directly away from the point you focused on, with velocities proportional to their distance from it.
I made such a simulation and did the calculations to check that the velocities of the other points face directly away from the point I'm observing. And they really do.
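For what it's worth, here is a minimal numpy version of that simulation (the counts, the coasting time and the random seed are arbitrary, and it is purely ballistic Newtonian motion with gravity ignored):

    import numpy as np

    rng = np.random.default_rng(0)
    c, n, t = 1.0, 10000, 100.0

    # Everything starts at the origin with a random speed below c in a random direction.
    directions = rng.normal(size=(n, 3))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    velocities = rng.uniform(0.0, c, size=(n, 1)) * directions
    positions = velocities * t                    # ballistic motion from a common origin

    # Pick one object as "us" and look at everything else relative to it.
    me = rng.integers(n)
    rel_pos = positions - positions[me]
    rel_vel = velocities - velocities[me]

    # Every relative velocity points radially away, with speed proportional to
    # distance: rel_vel = rel_pos / t, i.e. a Hubble-like law with H = 1/t.
    dist = np.linalg.norm(rel_pos, axis=1)
    speed = np.linalg.norm(rel_vel, axis=1)
    radial = np.einsum('ij,ij->i', rel_vel, rel_pos) / np.where(dist > 0, dist, 1.0)
    print(np.allclose(radial[dist > 0], speed[dist > 0]))   # recession is purely radial
    print(np.allclose(speed, dist / t))                     # speed proportional to distance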
Even in a completely flat Newtonian universe there could have been a sort of Big Bang with an epicenter, and it could be as simple as "give matter random speeds", and we, living on one speck of matter, would have no way to figure out that there is a center or where it is.
When I asked about this on Physics Stack Exchange I got a shrug that, yeah, cosmology is basically that but with Einstein instead of Newton.
All that talk about spacetime inflating is just a result of matter 'dragging' the spacetime along as it moves.
The faster-than-light galaxies far away are not a problem, because the speed of their movement that we measure is the sum of their kinetic movement (which could be almost at light speed) and the expansion of the spacetime between us and them, as the spacetime is 'dragged' by them per GR. But you can equally well interpret the math and data as galaxies roughly at rest, with all the speed coming from spacetime expansion, and for reasons I don't understand people actually prefer to do that.
>Really? Distant galaxies move away from each other at insane speeds. That seems like plenty of kinetic energy to me...
US 708 is going fast and it's still only 0.4 percent the speed of light.
If you're talking about relative speeds between astronomical objects being very high due to cosmic expansion, I'm not well enough versed in the physics to know why or explain how, but the energy locked up in that movement doesn't seem as practical to talk about with regard to creating work.
We know how to harness kinetic and thermal energy; as far as I know, we're not yet able to surf the cosmic expansion, except inadvertently.
What if only part of the speed of the FTL remote galaxies is due to expansion and part (lower than c but arbitrarily close to c for distant galaxies) is due to kinetic energy?
Their sum might easily be faster than light and they still might have insane kinetic energy.
Things can't really "leave"; well, they can, but the best guess we have is that there is no particularly special place in the universe, everywhere is roughly the same. So even if all the really speedy stuff from here "left", we'd see speedy stuff arriving from other places.
I thought 'no special place' was meant to say physics was the same everywhere; not the environment.
If you have a sparse gas with a random distribution of velocities and you let it sit for a while, you will find that the outliers on the upper end have moved further away from the cluster center. They have boiled off.
Try simulating this:
Start with a bunch of objects at coordinates 0,0,0. Give them random velocities from 0 to c. Then just let them move according to Newton's laws of motion; ignore gravity.
Now focus on a single random object and narrow down your field of view so you won't see the edge. Look at the other objects. They will seem to be moving directly away from the point you focused on, with velocities proportional to their distance from it.
I made such a simulation and calculated that the relative velocities of the other points face directly away from the point I'm observing. And they really do.
There's definitely a center of this system, yet from the point of view of any object far enough from the edge to not be able to see it, it seems like there's no center and other objects just move away from us, and from each other with speed proportional to the distance.
But in that model you would not see a uniform distribution of objects within your field of view. It's observably anisotropic, while we observe the universe to be isotropic.
In that model if you can't see the edge because you are far enough from it the distribution you observe is exactly isotropic regardless of which object you choose as your point of view.
The same way that you can't figure out where the center is, you can't figure out any special direction or plane.
Try simulating this model yourself. Everything looks exactly as if you were at the center of the Big Bang, even when you are not.
Not really, if the initial speeds were random from 0 to c. Some stuff would leave but some would stay relatively close.
And you wouldn't be able to tell what your speed is when you look at how other stuff moves. Everything would look like it's moving away from you with speed proportional to the distance.
Thanks for the information. Does dark matter ever enter the discussion in terms of "there isn't even close to enough asymmetry to explain the amount of matter and the lack of antimatter we observe."
We don’t know a ton about dark matter. Basically (and I’m simplifying a bit) you need CP violation (and more than is explained by the standard model) to account for the asymmetry. If your dark matter model of choice can generate CP asymmetry, then it could enter the discussion.
I'm pretty sure dark matter just represents the part of the universe that respects the privacy rights of matter, compared to the intrusive light emitting matter the blasts its details out into the light spectrum for all the universe to see. Think of dark matter as an effective equivalent of the GDPR for matter.
I'm not really an expert on that so I can't comment definitively; I think the answer is a tentative "yes" but I don't think there's really enough evidence to favour any proposals there.
I'm not reading that as saying the matter/antimatter didn't annihilate each other after the big bang. Rather, that matter and antimatter were not produced in equal amounts: if they were produced in equal quantities, why do our observations show such a large amount of matter and not equal amounts of antimatter? Production and annihilation are balanced processes, so why the heavy sway to one side?
The obvious suggestion is that if there were equal amounts of antimatter near us, we would likely have been annihilated already. But that doesn't explain why so little can be found remotely either.
The potential for matter and antimatter to alternate is an interesting proposal that could explain our little island of stability in the universe.
Or it could be something else entirely. I'm a software engineer, not a physicist.
My baseless speculation of choice is that if antimatter is basically regular matter moving backwards in time, all the antimatter we expect to see from the big bang went flying out into negative time, into an anti-universe temporally disjoint from our own.
That would mean we should see events where an anti-particle flies along, absorbs (or emits, or does none of that) some energy and turns into a normal particle. Unless the moment of the big bang was a totally unique time when such things happened.
No. Pair creation is two photons disappearing and turning into a particle and an antiparticle. And annihilation is turning a particle and an antiparticle into two photons.
When a particle and a photon meet, the result is a particle and a photon just moving differently. You could probably think of this as the old particle and photon disappearing and a new particle and a new photon appearing.
Tangentially related, but veering off-topic very quickly:
We define "matter" to be what there is in greater abundance in the local neighborhood. Thus we define protons and neutrons (or up and down quarks) as matter, along with electrons.
But what if that's wrong? What if electrons are anti-particles? That would go some distance toward "matter and antimatter in equal amounts" (though not, I think, all the way there).
Is there any physical reason that, if we identify the up quark, say, as matter, then the electron also has to be matter rather than antimatter? (I think there is reason that if the up quark is matter, the down quark has to also be matter, but I could be wrong on that one, too.) Can anyone with better knowledge than mine shed some light here?
Arguments about matter-antimatter balance are actually per quantum field.
Electrons can't, in any sense, be "antimatter". That's because antimatter as a concept only applies to a particle's relationship to its antiparticle: it's not a feature of a single type of particle, only of two types of particle, which must be variant excitations of the same quantum field.
That's almost irrelevant, however. The big bang should have created roughly equal amounts of matter and antimatter for each quantum field; an excess of protons cannot be balanced by deficit of positrons, as I believe you were suggesting.
Protons and positrons do not annihilate, nor would protons and electrons for that matter. Only protons and anti-protons, or electrons and positrons, have that reaction.
(Though, as an aside: protons are composite particles. None of the constituents are electrons, though, so this still applies.)
This reminds me of a question that popped into my mind earlier this year. Forewarning: I’m not a professional physicist.
If the Big Bang didn’t create an equal quantity of matter and antimatter, and given that antimatter is still being produced by various cosmological sources and at a lower rate for proton-antiproton pairs than for electron-positron pairs, would that imply that the universe has a non-zero net electric charge which is still changing over time?
And if so, what would a universe with a significant non-zero electric charge look like?
And if not, because Noether, could the combination of e.g. antiprotons and positrons into anti-neutrons turn out to be stable and one possible candidate for dark matter?
As far as we know, during Big Bang, after the matter had cooled enough for heavier particles to decay but before cooling enough to allow the formation of nuclei and atoms, there were equal numbers of protons, neutrons and net electrons (i.e. electrons - positrons), so the electric charge of the universe was zero.
All the events that happened after that, e.g. the decay of a part of the neutrons into protons and electrons, the annihilation of positrons with electrons and various generations and annihilations of particle/anti-particle pairs, have not changed the total electric charge, so it has remained zero.
The electric forces are extremely strong and they ensure that matter is on average electrically neutral, so the only long-distance forces are magnetic and gravitational.
Any region with electric charge would generate strong electric fields with various obvious effects.
The observed excess of matter over antimatter is a problem for the claim that particle-antiparticle pairs should only ever have been created or destroyed in matching pairs.
To rephrase my two questions about that observation:
1) What if the observed baryon asymmetry is an illusion because anti-neutrons are stable, unlike their matter counterparts? Would that explain anything?
2) If baryon asymmetry is real and not an illusion, what else should we expect to see?
> The electric forces are extremely strong and they ensure that matter is on average electrically neutral, so the only long-distance forces are magnetic and gravitational.
Which is why my question is specifically with regard to the observed baryon asymmetry as that might (in the case of question 2) mean the charges can’t be balanced.
> Any region with electric charge would generate strong electric fields with various obvious effects.
If the charge was distributed isotropically rather than in a limited region?
Wouldn't this just produce energy in the form of photons, which are neutral in terms of being matter or antimatter, and which could then turn back into matter? I don't see how anything has to be lost in this situation, other than a large amount of entropy being produced.
> This subatomic particle is normally made up of a charm quark and an up antiquark, while its antimatter equivalent consists of a charm antiquark and an up quark.
Q: if these subatomic particles consist of a quark and an anti-quark, why don't they self-annihilate?
There are six quarks: down, up, strange, charm, bottom, top.
There are six anti-quarks: down, up, strange, charm, bottom, top.
Using the strong force, a quark can annihilate only with an antiquark of the same type, so the number of charm quarks minus the number of charm antiquarks is "almost" conserved. If you start with a system with one charm quark, like the particle here, you will "always" have a system with one charm quark, or 2 charm quarks and one charm antiquark, or ... but "never" a system with no charm quarks, like a few photons after annihilation.
It's more complicated, because the weak force can annihilate one quark with a different antiquark. (The rules are somewhat complicated, and some annihilations are more usual than others.) So after some time, you can be lucky and a weak interaction can destroy your charm quark and an up antiquark, so the number of charm quarks minus the number of charm antiquarks changes "slowly".
Here "slowly" means that to see an effect of the weak interaction you must wait like 10E-12 seconds instead of the 10E-24 seconds you must wait until you see an effect of the strong interaction. The numbers change for each decay and are difficult to calculate and may be much longer. So everything in scare quotes means it's just an aproximation.
For example, the neutron->proton+electron+antineutrino decay uses the weak force and you must wait something like 10 minutes, which is a lot for an unstable particle. The particle discussed in this article has a lifetime of almost a millisecond, which is very long for a particle but short for us.
(Electromagnetism can annihilate a quark only with an antiquark of the same type, so it's not important here, and it's much fainter.)
---
On the other hand, you are right. This article is about the decay of a D0 particle into a Ks0 and pions. After some time (probably a few milliseconds after the experiment finalized) the Ks0 decays into more pions. And the new pions and the other pions after some time decay into photons and muons and neutrinos, and after some time the muons decay into electrons and neutrinos.
The numbers of electrons and antielectrons (positrons) are equal, so they annihilate each other and you get more photons. Also, the number of neutrinos is equal to the number of antineutrinos, so they could theoretically annihilate each other, but in practice they just escape.
Sure it can, but it has to proceed via the weak interaction, and it's suppressed by the GIM mechanism. For example, D⁰→γγ (c̅u or cu̅ annihilation) is rare and not yet observed, but it's allowed.
I've often heard it said that a particular prediction of Quantum Electrodynamics [1] may be the most precise experimentally confirmed value in physics [2], with an accuracy in the region of one part in a trillion (10^12).
I haven't heard the same claim made for the LIGO experiment yet, but I understand it is capable of detecting distance changes smaller than one part in a billion trillion (10^21) [3].
If this LHCb result is confirmed, given that it involves the detection of a mass difference of one part in a hundred million quadrillion quadrillion (10^38), would it qualify as a new precision record in physics? I'd be grateful if anyone familiar with the field could comment.
This isn't a ratio of one part in 10^38; it's an absolute difference between the particles' masses of 1x10^-38 grams. The ratio implied depends on the mass of each particle. I presume this is far less than a gram, which means the ratio is also far less.
Measured in grams, like the 10^-38, there would still be lots of zeros first, so not 38 significant figures.
Besides, relative measurements are often easier than absolute ones, so even if you could determine that the masses of two objects of ~1g differed by 10^-38g, that wouldn't necessarily tell you the absolute masses to anywhere near that accuracy.
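For a sense of scale, here is a rough sketch of the implied relative precision, taking the D0 mass as about 1.86 GeV/c² and the standard conversion 1 GeV/c² ≈ 1.78×10^-24 g (the comparison itself is mine, not from the article):

    # Relative size of the quoted mass difference compared to the D0 mass itself.
    d0_mass_g = 1.86 * 1.78e-24     # ~3.3e-24 g
    delta_m_g = 1e-38               # mass difference quoted in the article, grams
    print(f"{delta_m_g / d0_mass_g:.1e}")   # ~3e-15, nowhere near one part in 10^38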
> if ever a matter and antimatter particle come into contact, they will annihilate each other in a burst of energy.
> To complicate things, some particles, such as photons, are actually their own antiparticles.
I believe "a burst of energy" means a bunch of photons? So, we still end up with the same amount of energy as before, but all particles are now photons? Like in the heat death of the universe?
An electron and a positron can annihilate, producing a pair of photons. Or two photons can annihilate, producing an electron and a positron. Or...
Well, in fact the symmetry works for any rotation, not just 180 degrees. An electron and a photon can "annihilate", producing an electron* and a photon going the other way. This is also known as "A photon bouncing off a mirror".
*: The careful reader will note there's no positron in this one. That's because positrons are what happen when an electron hits a photon so hard it bounces backwards in time -- that is, it's an electron going backwards in time. For one definition of 'time', at least.
To make the mirror version work, the photon needs to have lower energy than would be involved with matter annihilation. Otherwise your gamma-ray photon will tear through the mirror instead of bouncing back, probably making a nice hole. This doesn't mean the rotation doesn't work, it's just that what I was describing wasn't really a "mirror" as commonly considered. But also, because momentum (or, more usefully, rapidity) is defined in terms of the angle the particle makes with time, rotations other than 180 degrees don't conserve momentum. So it's not a gamma-ray photon anymore.**
**: Photons, obviously, have a fixed 'angle with relation to time'. You can take that as referring to the angle their waves make with time, instead.
About 43 megatons of TNT for one kilogram of cat annihilating with one kilogram of anti-cat. So about 200 megatons of TNT, four Tsar Bombas, for an average cat and an average anti-cat of 4.5 kilograms each.
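A quick sanity check of that arithmetic, taking the Tsar Bomba as roughly 50 megatons:

    # E = m c^2 for the combined mass of cat plus anti-cat, in megatons of TNT.
    c = 3.0e8                  # m/s
    megaton_tnt = 4.184e15     # joules per megaton of TNT

    def annihilation_megatons(total_mass_kg):
        return total_mass_kg * c**2 / megaton_tnt

    print(annihilation_megatons(2.0))   # 1 kg + 1 kg  -> ~43 megatons
    print(annihilation_megatons(9.0))   # 4.5 kg + 4.5 kg -> ~194 megatons, ~4 Tsar Bombas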
Ah, very true. But then (not having much knowledge on this area of physics) I also don't know if it would be the energy of the mass of one cat, or two cats.
I think that would depend. If half the particles in the cat turn to antimatter and the other half stay as matter and annihilate each other, it’d be the mass of one cat.
Yes, but the only ways we know to generate antimatter require a lot of energy. Theoretically it could be an incredibly energy dense "battery" for spaceships or whatever though.
The title is confusing, because the whole particle does not switch between matter and antimatter, only its internal components switch.
The hadrons, i.e. the particles made of components bound together by the so-called strong nuclear forces are partitioned into 3 groups: particles made of 3 quarks, particles made of 3 anti-quarks and particles made of 1 quark together with 1 anti-quark.
Particles made from more quarks than in these 3 groups are possible in principle, but they would disintegrate extremely quickly, i.e. in times many orders of magnitude less than 1 nanosecond.
The reason why only those 3 kinds of hadrons are normally encountered is that, exactly as the electric forces attempt to neutralize the electric charge and prevent the negative charges from separating from the positive charges, the strong nuclear forces also try to neutralize a similar quantity, with the difference that this quantity is 2-dimensional, unlike the electric charge, which is 1-dimensional.
The condition of neutralizing that 2-dimensional quantity can be satisfied by the 3 kinds of quark combinations listed above.
The lightest hadron made of 3 quarks is the proton and the lightest hadron made of 3 anti-quarks is the anti-proton.
The proton and the similar hadrons can be considered as matter and the anti-proton and similar hadrons can be considered as anti-matter.
None of these hadrons made of either 3 quarks or 3 anti-quarks can switch between matter and anti-matter and this remains true for their compounds, like nuclei or anti-nuclei.
On the other hand, the 3rd kind of hadrons, which are usually named mesons, and which are made from 1 quark and 1 anti-quark, are in fact neither matter nor anti-matter.
The article linked here refers to a particle of this kind, where the total number of quarks is +1 -1 = 0.
So in mesons it is possible for the pair of 1 quark + 1 anti-quark to transform into another pair of 1 quark + 1 anti-quark, where the quarks are of different types than before the transformation. This is possible because the total number of quarks is 0 both before and after, so any such transformation is allowed as long as the new types of quarks are such that the electrical charge remains the same.
The composition of a meson can oscillate between different quark + anti-quark pairs, but the heavier mesons decay extremely fast, so it is exceedingly difficult to observe such oscillations.
The linked article reports the experimental observation of such an oscillation, which has been known to be possible for decades, but had not been seen in experiments due to technical difficulties.
So in these transformations some quarks and anti-quarks appear and disappear, but the total quantity of matter and anti-matter is the same before and after, which is why the title is misleading.
Spin two toy tops in opposite directions. When they bump into each other they both stop.
All fun stops.
This experiment says that when they bump into each other, the ones spinning in a certain direction always have a little spin left over.
A tiny bit of fun still exists.
That little bit of fun is all physical matter in the universe.
That simply is not true. HN is filled with subject matter experts that can break down complex topics better than popsci journalists, and provide commentary on how this fits within the framework of current and future research.
HN is filled with people who believe themselves to be subject matter experts, and who refuse to engage with posted articles as they consider leaving the sanctum sanctorum of HN to expose themselves to the plebeian knowledge and interests of others to be a waste of time. In other words, with people who wear their ignorance as a badge of elitism.
Not unrelated, HN has a few actual subject matter experts who are often aghast at how wrong HN tends to be about any field or subject not directly related to programming, and how often their confidence is in inverse proportion to their knowledge.
You can safely assume that while there may be actual, credible particle physicists on HN, their voices are likely to be drowned out by armchair expositors of bullshit ranting about dark matter being a hoax perpetrated by cultural Marxists or somesuch. Do the work yourself. Read the article yourself. Enrich your own mind and satisfy your own intellectual curiosity, don't expect HN to do it for you, or to be better than the mainstream in terms of general expertise, or you'll be misled and disappointed more often than not.
You're projecting here. I only stepped in to defend someone asking for an explanation from the HN community.
I'm not putting aside skepticism and intellectual rigour when I read the comments here.
I also think you're wrong about the members of this community. There are a tremendously wide assortment of backgrounds represented here.
I come to Hacker News mostly for, not in spite of, the comments.
I think telling people to not ask questions is dangerous. The person I originally responded to was not doing that exactly, but it felt borderline enough to remind them that the community here is filled with experts and teachers.
It's also worth pointing out that we shouldn't silence curiosity. Attacking what you perceive as a lazy mind can only harm the desire to learn.
I'm not objecting to asking questions, I'm objecting to preferring the HN commentariat over at least engaging with the posted content. Asking questions after reading the article is fine - that's part of creating fruitful discussion.
But HN does have a problem in that no one believes reading the article is worth the effort, and everyone prefers to just read the comments. The end result of that is a negative feedback loop of insularity and incuriosity, more people only engaging with the subject (or what they believe the subject is based on their reading of the title) on a superficial level rather than real intellectual discussion.
HN and Reddit are filled with people sneering at how dumb everyone else on HN and Reddit is. It makes for tedious reading, and a cringeworthy "I'm superior because I see how dumb everyone else is" status grab. If something is wrong, correct it or downvote it or ignore it; don't conjure up some imaginary mind-reading fantasy about how great the author must think they are and then sneerily tear down the strawman.
What status do you think I'm trying to grab? The only thing that's going to happen here is that I get aggressively downvoted and the problems I'm complaining about will continue unabated, because mentioning them makes for "tedious reading."
The one where instead of saying "here's a good book", you side-eye your friends and say "OMG Becky can you believe it, they are reading Twilight! THEY think it's a GOOD BOOK! a-haw-HAW-haw-HAW!", where the purpose is not to enjoy whatever book you are reading, or to share good books with other people and why you think they are good, not to improve the quality of books in general or to observe and people-watch with curiosity, but to re-inforce that people who read Twilight are inferior and therefore your group is superior.
From a PG essay[1] "When you tread water, you lift yourself up by pushing water down. Likewise, in any social hierarchy, people [...] will try to emphasize [their status] by maltreating those they think rank below. But I think the main reason other kids persecute nerds is that it's part of the mechanism of popularity. Popularity is only partially about individual attractiveness. It's much more about alliances. To become more popular, you need to be constantly doing things that bring you close to other popular people, and nothing brings people closer than a common enemy."
The status grab is "I must be superior to those people, because I'm here, and they are there so I'm not part of them, and they are inferior".
> "The only thing that's going to happen here is that I get aggressively downvoted and the problems I'm complaining about will continue unabated, because mentioning them makes for "tedious reading."
That will continue regardless; a big part of your rant was that Dunning-Kruger dross about how people who write a comment on the internet without adding a pile of IMHO to it must be cock-sure idiots who don't know how dumb and wrong they are. If that's correct and they don't know how wrong they are, how are they going to identify as the target of your rant and change? If they are "soooo confident", why would they know to change their belief?
The only thing that's going to change it is people pointing out incorrectness in a way that helps (i.e. without closing off avenues to change with insults), or downvoting it to hide it so it doesn't spread. In the "be the change you want to see in the world" sense.
>The one where instead of saying "here's a good book", you side-eye your friends and say "OMG Becky can you believe it, they are reading Twilight! THEY think it's a GOOD BOOK! a-haw-HAW-haw-HAW!", where the purpose is not to enjoy whatever book you are reading, or to share good books with other people and why you think they are good, not to improve the quality of books in general or to observe and people-watch with curiosity, but to re-inforce that people who read Twilight are inferior and therefore your group is superior.
Except the entire point of my comment (fairly, rant) was to tell the people who refuse to read Twilight that they should actually try reading Twilight instead of asking their circle of friends who probably haven't read Twilight but insist that it's garbage because all modern literature is infantile commercial pap whether it's worth reading.
>If that's correct and they don't know how wrong they are, how are they going to identify as the target of your rant and change?
They won't. They'll do what you're doing, attack me personally and lecture me and, ironically, attempt the exact same "status grab" you're accusing me of. Except something tells me you'll wind up with far more invisible kudo points at the end of the day than I will, because defending Hacker News and refusing to admit its flaws is always a better social strategy than criticizing it. Who's pushing whom under the water here?
I'm attacking one specific comment (yours), linking to the HN guidelines to say why.
You're ranting about "armchair expositors of bullshit ranting about dark matter being a hoax perpetrated by cultural Marxists" [any examples of this happening?], claiming that posting a comment on the internet means the author "believes they are a subject matter expert", presumably reasoning that "posting on the internet" is something only people who believe they are experts do? Claiming that people "wear their ignorance as a badge of elitism", because "not clicking a link" is somehow a claim of eliteness? You're accusing people of "thinking themselves superior", because "not clicking a link" is the same as saying "people who read the link are plebian"? (This isn't encouraging people to read articles, this is "everyone else is so dumb, eyeroll").
In your terms, you're pushing everyone under the water. I'm pushing you and people who post similar "everyone except me is dumb" rants, under the water.
That article didn't clear up a whole lot for me personally. Maybe some others are like me. I also read the actual CERN article, which this article copied almost entirely.
But after reading twice, what seems to be going on is that a particle has a slightly different mass from its antiparticle. (That is the most unbelievable sentence I have written so far this year! It's just... wow. I didn't think that was even theoretically possible.)
> a particle has a slightly different mass from its antiparticle
This is not the case, and that would violate CPT symmetry.
There are two states with different masses (the mass eigenstates) but they are not antiparticles of each other. They are linear superpositions of the flavour eigenstates:
|D1> = p|D⁰> + q|D̅⁰>
|D2> = p|D⁰> − q|D̅⁰>
where p and q are complex coefficients.
The difference in mass between D1 and D2 is directly proportional to the oscillation frequency.
If p/q ≠ 1, you have CP violation in this mixing. The amount of CP violation in this measurement was found to be consistent with zero.
If you think of the Hamiltonian as a 2x2 matrix operating on the flavour eigenstate, there are off-diagonal terms arising from the amplitude of spontaneously changing from particle to antiparticle. The leading-order diagrams look like this: https://ppd.fnal.gov/experiments/e871/public/images/Kaon_mix...
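As a toy numerical illustration of that 2x2 picture (all the numbers below are made up for illustration, not the measured D⁰ values), you can diagonalise an effective Hamiltonian H = M − iΓ/2 with off-diagonal mixing terms; the eigenvectors give p and q and the eigenvalues give the masses and widths of the two eigenstates:

    import numpy as np

    # Toy effective Hamiltonian in the flavour basis {D0, D0bar}; numbers are invented.
    M = np.array([[1.0,  1e-6],
                  [1e-6, 1.0 ]])            # mass matrix with off-diagonal mixing
    Gamma = np.array([[0.1,  2e-7],
                      [2e-7, 0.1 ]])        # decay matrix with off-diagonal terms

    H = M - 0.5j * Gamma
    eigvals, eigvecs = np.linalg.eig(H)

    # Eigenvalues are m_i - i*Gamma_i/2; each eigenvector (column) is (p, +/-q)
    # up to normalisation.
    delta_m = abs(eigvals[0].real - eigvals[1].real)
    delta_gamma = 2 * abs(eigvals[0].imag - eigvals[1].imag)
    p, q = eigvecs[0, 0], eigvecs[1, 0]
    print(f"delta_m={delta_m:.2e}  delta_gamma={delta_gamma:.2e}  |q/p|={abs(q/p):.3f}")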
You get CP violation even if |p/q|=1 but Arg(p/q)≠0.
Edit: Official seminar: https://indico.cern.ch/event/1027299/