Hacker News | quantum_mcts's comments

0. There's money laundering.


Here’s an exercise. Go to Wikipedia for “Money Laundering” -> “Notable cases” and sum the amounts laundered by all the famous banks over the last 5 or 10 years.

Then try to say again that “crypto is for money laundering” with a straight face.


> Go to Wikipedia for “Money Laundering” -> “Notable cases” and sum the amounts laundered by all the famous banks over the last 5 or 10 years

This is sort of like concluding bull riding is safer than eating food because more people have died from choking than bull riding.


It is sort of like saying that crypto is less money-laundery than banks because more money gets laundered through banks than through crypto.


It's a terrible argument though. If I ran a bank that literally only did money laundering, and I applied your logic, it wouldn't matter because, hey, others have done it more in aggregate.

Even then it's like, if I killed someone, I could say, well Americans in general have killed X many people, so I'm certainly less kill-y than Americans.


I can at least say that people use banks for other reasons.


What's your point? That these people shouldn't have been arrested, or that money laundering on crypto should be ignored? Or something else?

Besides, crypto is clearly sometimes used for money laundering - hence the article.


Yes


Good. Now explain the Bullet Cluster.


No one can explain the Bullet Cluster.


Cold dark matter made of weakly interacting massive particles can.


No, LCDM can't explain the observed velocities.



> Good.

Is it, or isn't it, and why? You don't say.

The authors asked themselves how their quantum gravity theory differs from General Relativity, and whether General Relativity's successes in astrophysical settings would be fatal to their theory if the differences are strong; that question is the basis for this paper. The tl;dr is that their theory predicts different trajectories outside of large central masses, but that might not conflict with evidence from galactic-dynamics astronomy.

This is the second paper released in the past few days by the University College London Oppenheim group. It's a preliminary investigation of the longer length scale features of their classical stochastic theory. The central question is how its version of Schwarzschild-de Sitter (SdS) differs from standard General Relativity.

The first paper, and I think the more interesting one, is about the short length scale aspects of their asymptotically free theory, in which the gravitational interaction weakens as the distance between interacting sources decreases. The asymptotic freedom means the theory is amenable to renormalization, unlike perturbative quantum gravity and a number of other approaches. That paper is at <https://arxiv.org/abs/2402.17844>. Note that they do not know how to make the gravitational part quantum mechanical without introducing problems (i.e., it is haunted by "bad" ghosts in the sense of <https://en.wikipedia.org/wiki/Ghost_(physics)>); their classical and stochastic gravitational sector is ghost-free (a point also made at the end of Appendix A in the large-scale paper), and it is reasonable for them to believe that could be good enough that it's worth continuing to investigate what the theory predicts and how its parameters are set.

The second paper was motivated by the first: "The theory was not developed to explain dark matter, but rather, to reconcile quantum theory with gravity. However, it was [noted] that diffusion in the metric could result in stronger gravitational fields when one might otherwise expect none to be present, and that this raised the possibility that gravitational diffusion may explain galactic rotation curves".

That MOND-like effects might arise in their approach to the problem of small-scale quantum gravity is at least interesting. It was not the starting point.

Moreover, they did not start with the idea of modifying General Relativity to get rid of the need for (some or all) cold dark matter. As they say: "While this study demonstrates that galactic rotation curves can undergo modification due to stochastic fluctuations, a phenomenon attributed to dark matter, it is important to acknowledge the existence of separate, independent evidence supporting ΛCDM. In particular, in the CMB power spectrum, in gravitational lensing, in the necessity of dark matter for structure formation, and in a varied collection of other methods used to estimate the mass in galaxies."

> Now explain the Bullet Cluster

This paper does not seek to do so. "To make it tractable analytically, we have restricted ourselves to spherically symmetric and static spacetimes, with metrics of the form of Eqs. (17)." Eqn 17 describes an adapted Schwarzschild-de Sitter spacetime, and the paper leans on an argument that Birkhoff's theorem applies (in particular that their model spacetime is stable against certain perturbations, notably those concentric upon the source mass). There is further detail in Appendix B.

In this restricted model, the de Sitter expansion of the spacetime and the MOND-like anomalous Kepler orbits at some remove from the Schwarzschild central mass are, in their theory, driven by entropic forces generated by the fluctuations in the gravitational field of the central mass (and they do a good job in Appendix D explaining this).

In GR's Schwarzschild-de Sitter the free-fall trajectories of test particles around the central mass are totally determined by the mass; the gravitational field doesn't fluctuate. The (Boltzmann) gravitational entropy of the region outside the central mass is everywhere very high.

In GR-SdS we can consider adaptations where, keeping M=const., we turn the pointlike central mass into a spherically symmetric shell, or a concentric set of such shells, or even a ball of fluid, or a ball of dust, or a ball of stars and other galactic matter. None of these symmetry-preserving adaptations changes the free-fall trajectories of test particles outside the outer surface, or the gravitational entropy at any outside point.

In the authors' theory, the spacetime is stochastic. It fluctuates. Close to the central mass the fluctuations are unnoticeably small; the gravitational entropy is very low. Far from the central mass the gravitational entropy is very high, and gravitational fluctuations are noticeable. A sort of thermodynamics leads to a diffusive flow outwards from the central mass, from the low entropy near there to the high entropy at increasing radial distance. This diffusion is carefully constructed so that the outwards flow is only really appreciable at large distances. The effect is that large-radius orbits are statistically pulled inwards by something describable as stronger gravity at larger radii (see around Eqn (21)). This is an "entropic force", very roughly analogous to squashing a sponge ball in your hand then releasing the pressure and watching the sponge ball expand, where the material of the sponge represents the gravitational field.

Their stochastic fluctuations are still generated by the spherically-symmetric central mass. These fluctuations break the spherical symmetry of the outside metric. Consequently they have to do some work to make the outside metric look appropriately Schwarzschild-like in their "diffusion regime", and to keep that stable against the stochastic perturbations.

The authors contend that with reasonable choices of parameters, and restricted to static spherical symmetry of the central mass (and no additional dynamics), this effect comes close to duplicating MOND's low-acceleration regime.
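As a rough numerical sketch of what "duplicating MOND's low-acceleration regime" would mean for a rotation curve (this is standard MOND phenomenology, not the paper's entropic calculation; the mass and radii below are illustrative assumptions of mine):

    # Minimal sketch, not from the paper: in the deep-MOND limit (a_N << a0)
    # the effective acceleration is ~sqrt(a_N * a0), which turns the Keplerian
    # 1/sqrt(r) fall-off into a flat curve at v ~ (G*M*a0)^(1/4).
    import numpy as np

    G  = 6.674e-11          # m^3 kg^-1 s^-2
    M  = 1e11 * 1.989e30    # kg; ~1e11 solar masses, an illustrative point mass
    a0 = 1.2e-10            # m/s^2; the usual MOND acceleration scale

    kpc = 3.086e19                       # metres per kiloparsec
    r = np.logspace(0, 2, 5) * kpc       # 1 kpc .. 100 kpc
    a_newton = G * M / r**2              # Newtonian acceleration of a test star
    a_mond   = np.sqrt(a_newton * a0)    # deep-MOND form; only meaningful where a_newton << a0

    v_newton = np.sqrt(a_newton * r) / 1e3   # km/s, falls off as 1/sqrt(r)
    v_mond   = np.sqrt(a_mond * r) / 1e3     # km/s, flattens to (G*M*a0)^(1/4), ~200 km/s here

    for ri, vn, vm in zip(r / kpc, v_newton, v_mond):
        print(f"r = {ri:6.1f} kpc   v_Kepler = {vn:6.1f} km/s   v_deep-MOND = {vm:6.1f} km/s")

The paper only claims its entropic effect comes close to this regime for reasonable parameter choices; the numbers above are just to show what that target looks like.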

They don't go into anything like a backreaction upon the Schwarzschild metric by large fluctuations.

(They do have an idea about how to get the de Sitter trajectories though, but that doesn't fit very naturally into this comment, which is already long.)

> Bullet cluster

The authors know full well that the metric for a gravitationally bound cluster of galaxies isn't well-represented by their choice of SdS-like metric. A galaxy cluster is too lumpy for the Schwarzschild part.

A pair of gravitationally bound galaxy clusters that have passed through each other (trailing collided gas and dust, and tidally stripped stars and other matter) is even less like Schwarzschild, because SdS solutions of the Einstein Field Equations do not linearly superpose. So their metric is a poor description of any sort of "close call" interaction between galaxies or galaxy clusters, even if the individual components are "close enough" to Schwarzschild from the perspective of an observer at sufficiently large (as in cosmologically large) distances. They do not (and within this initial paper should not really be expected to) offer a more suitable metric. I'm sure they'd love to look into things like that though.

The fact that useful solutions of General Relativity do not linearly superpose is a problem when asking how astrophysics differs in most theories that preserve the equivalence principle (this one does; it's a metric theory of gravitation). As the replacements for the Einstein Field Equations lose symmetries (sphericity, staticity), they tend to become analytically intractable and non-numerical approximations become unreliable.

The authors -- imho in a strikingly principled way -- call attention to various difficulties in using this work to describe astrophysical systems, particularly from the middle of the fifth page of the PDF.

They are not obviously worse off than the Verlinde programme of emergent-entropic gravity, where the gravitational field is generated by entropic forces rather than vice-versa.


> First, take some extremely obvious platitude or truism. Then, try to restate your platitude using as many words as possible, as unintelligibly as possible, while never repeating yourself exactly. Use highly technical language drawn from many different academic disciplines, so that no one person will ever have adequate training to fully evaluate your work. Construct elaborate theories with many parts. Draw diagrams. Use italics liberally to indicate that you are using words in a highly specific and idiosyncratic sense. Never say anything too specific, and if you do, qualify it heavily so that you can always insist you meant the opposite. Then evangelize: speak as confidently as possible, as if you are sharing God’s own truth. Accept no criticisms: insist that any skeptic has either misinterpreted you or has actually already admitted that you are correct. Talk as much as possible and listen as little as possible.

Sounds like an LLM prompt...


Really weird post, and weird discussion around it. The "family" is not some abstract entity or the offender's property. The "family" is made of other humans. "Not punishing people who didn't do anything wrong" is pretty high on the list of my consequentialist morals.


The point is that there's nothing inherently immoral about family punishment from a consequentialist lens. Consequentialist morality is based solely on outcomes; therefore, if family punishment results in fewer crimes perpetrated on innocent people, there is not a coherent consequentialist argument for family punishment being immoral. It suggests that there's some cost/benefit ratio at which family punishment could be considered morally correct.


Morality frameworks defined as a single rigid rule will always encounter some "repugnant conclusion" given a sufficiently contrived situation, news at 11.

The "retributive morality" the author espouses doesn't actually dodge "justification of family-punishment" either. It relies on the same external axiom, that family should not be held culpable for their members' crimes (and are thus categorized as "innocents"). The same assumption should cause consequentialism to assign an extremely negative valence to family-punishment, because the consequence of family-punishment is that "everyone lives in a society where family members are held culpable for their members' crimes, and live/act in fear of such largely-uncontrollable punishment".


What's an example repugnant conclusion for the axiom "don't punish innocent people"?


Easy -- No one is ever punished for anything and crime gets to the point that society crumbles, because as humans we cannot enact a justice system that does not occasionally punish innocent people.

Slightly less easy, but closer to current reality: For "don't knowingly punish innocent people" -- enact a justice system that is terrible at fact finding (or has gross bureaucratic and procedural inefficiencies), so that you punish a lot of innocent people but never do so knowingly.

Easy again: For "don't punish innocent people, knowingly or otherwise" -- you have no criminal justice system that is workable within these constraints in the real world.


That doesn't constitute a framework for answering "who do you punish" or "how do you punish them", so it's not relevant to my assertion.

It is a popular secondary/tertiary/N-ary axiom, though, as demonstrated in the article.


I see a huge logical fallacy here. It doesn't stand.

> therefore, if family punishment results in fewer crimes perpetrated on innocent people, there is not a coherent consequentialist argument for family punishment being immoral.

Punishing a criminal's family _is_ perpetrating violence on innocent people; unless the family itself is criminal, in which case you can still investigate each family member individually.

Unless you deny any individual's own responsibility.

In which case, the issue is not only family, but society as a whole. That's a pretty twisted vicious circle.


> Punishing a criminal's family _is_ perpetrating violence on innocent people

What's the logical fallacy here?

The point is that consequentialism doesn't object to ideas like "perpetrating violence on innocent people" with any kind of principle. If the consequences of such violence are overall negative, then the act becomes immoral. Punishing the family members of a criminal has negative consequences for those people, but those could be outweighed by the positives for society as a whole by reducing other kinds of immoral actions (which themselves have greater negative consequence.) All of these individual actions have their own moral weight - there are no categories of moral/immoral actions in general.


> but those could be outweighed by the positives for society as a whole by reducing other kinds of immoral actions (which themselves have greater negative consequence.)

No. Because you cannot outweigh a wrong with another wrong. There's no balance; the wrongs only add up.

The negative consequences are not only for the innocents wronged in this scenario.

They also fall on all the rest of society, which, witnessing that, can only deduce and fear that _no one_ is safe from being wronged the same way because of the actions of a third party (be it family or someone else). And that it is no longer a matter of justice but of power (of who decides what is wrong or not, and who decides how many circles around the criminal should be punished).

Ruling by fear and violence has never brought anything good (except from the partial and twisted perspective of those in power). Not in educating kids, not in training animals, not in governing a society of people. Never.

Even if still imperfect, democratic-leaning societies have figured this out better than autocratic ones.

> there are no categories of moral/immoral actions in general.

That depends highly on how you define and consider morality as a virtue.


>No. Because you cannot outweigh a wrong with another wrong. There's no balance; the wrongs only add up.

This is not in line with consequentialist thinking, so there is no logical fallacy. You're failing to consider a line of reasoning in terms of a different moral philosophy to your own.

>They also fall on all the rest of society, which, witnessing that, can only deduce and fear that _no one_ is safe from being wronged the same way because of the actions of a third party (be it family or someone else).

As other commenters have pointed out, this is not a statement that holds in general in consequentialist terms. How great are the harms to the rest of society? How great are the harms of the crimes prevented in this way? What is the real net benefit or downside to the whole population? These are the questions consequentialism wants answered to judge the morality of such a policy.


> This is not in line with consequentialist thinking, so there is no logical fallacy.

Ok, but then what's the point of a consequentialist take, if that's so removed from past experiences?

> These are the questions consequentialism wants answered to judge the morality of such a policy.

Correct. The problem/flaw is deep in the roots of consequentialism itself: if you can only judge whether something is moral after seeing its outcomes, you can only be a spectator, not an actor. You can't act without a principle. If you want to take action, you've got to act on principles, from memory and/or reasoning (or you may act irrationally - but then you may only invoke amorality, which defeats the consequentialist definition as well).


>The point is that there's nothing inherently immoral about family punishment from a consequentialist lens. Consequentialist morality is based solely on outcomes; therefore, if family punishment results in fewer crimes perpetrated on innocent people, there is not a coherent consequentialist argument for family punishment being immoral. It suggests that there's some cost/benefit ratio at which family punishment could be considered morally correct.

If one assumes that the families of those who commit crimes are innocent as well, then punishing those families is also a crime against innocent people. By your own logic, punishing ten innocent members of a family for a crime committed by a relative against a single person makes things worse, not better -- more crimes are committed (and more innocent people hurt) in executing such punishment than by the initial crime.

In fact, in the US at least, we already inflict a version of this on the communities of those who commit crimes. If someone commits a crime and is convicted, they are generally discriminated against in finding jobs, getting housing, and exercising the political franchise, and they face a variety of other punitive "punishments" beyond those prescribed by law.

This encourages recidivism, reduces potential economic output, reduces economic/social/political opportunities and otherwise negatively affects the families and communities of those convicted, even after completing the official punishment for whatever crime may have been committed.

While there certainly are folks who cannot integrate into society without harming it (e.g., serial killers) and, as such, should be permanently removed from society in order to protect the other members of that society, most folks who commit crimes are not such people. However, when we discriminate against those who have already been punished for their crimes, we intentionally put them at odds with society, increasing the risk of recidivism. And more's the pity.


Similar stuff by this guy was already posted here: https://news.ycombinator.com/item?id=32367085 It is crackpot pseudoscience.


Sampling temperature.


We trade with bees though.


I've long noticed that this idea - that scientists just "jumped to the conclusion" that there must be some dark matter to resolve some observational inconsistencies - is very prevalent among non-specialists. It could not be further from the truth, of course: as soon as the rigorous statistical study of galaxy rotation curves was provided by Vera Rubin et al. in the 80s, the MOND class of alternatives to gravity started to appear.

I tried to understand why this narrative is so prevalent, and it looks like it is due to its very appealing nature - both to the listener and to the speaker: "all those egghead scientists can't see an obvious solution to a problem they have been fighting with for decades". I also noticed that when you dig deeper, you find that the source - the "speaker" part in the whole narrative - is some kind of science popularizer, who at the same time (surprise) is a proponent of his own flavor of modified gravity theory.

Truth is - everyone, including me, had this idea as soon as they first heard about DM. If you actually study all the evidence and discussions around the subject, you'll see that introducing an extra "dark" particle is the most obvious solution to the whole collection of observational inconsistencies. Modifying gravity in such a way that it is self-consistent and consistent with all the observations... I actually don't think anyone has completely finished that project so far.


Everyone had drilled into them in school the stories of scientists and philosophers clinging to outdated models and introducing extraneous elements to extend their life in the face of contradicting evidence: the piling up of epicycles to explain celestial motions by circular motions, or the luminiferous aether so that light could be a wave in a fluid medium.

So of course when you first hear "so astronomers found out the equations don't work out with the matter we can detect, so they figured that there must be uhhh... more matter, invisible matter! That exists but you can't detect it no matter what!", it sounds like those examples.

If you dig deeper, you realize that it's not just fudging of equations, and that the existence of mass that's very sparse and unaccounted for makes sense.


When Sabine Hossenfelder and Sean Carroll agree on something, I pay attention.

I do take some humility there. We have stunningly brilliant contrarians who would be the first to dismantle dark matter on the public stage, and instead they tell us, “no, it’s a real thing. We just haven’t seen it in a lab on earth much yet.”

Now if Eric Weinstein started saying dark matter is actually the consequence of a massively egocentric MOND, and everyone else is a WIMP, I might start believing that too ;)


What do Hossenfelder and Carroll agree on? It was my understanding that Hossenfelder thinks we need both DM and MOND, while Carroll is firmly in the conventional DM camp.


You caught me, in that she’s 100% open to new evidence. Kinda why she is so effective.

So, over the past decade, Hossenfelder made a series of videos and talks about “dark matter is real.” And then “is dark matter real?”

Her opinion evolves. But the conclusions are generally that the data points to a measurable thing along many lines of evidence.

I find the cosmic web the most compelling, personally. Like, what else could that be besides a mystery? You can’t draw the web in some other configuration, it’s a real thing somehow.

From my understanding, the hesitation she expresses is more linked to an underlying intuition: the universe is completely deterministic - we just can't see all of the rules being computed, and the initial conditions are intractable.

I also find that pretty compelling.


Can it be possible that the galaxy just appears to be rotating faster due to time dilation, so that from the POV of the galaxy itself it doesn't seem like there is anything out of the ordinary?


[It's not my field, so I have to guess.] They are using special and general relativity to make a lot of corrections, like redshift and gravitational lensing, so there is a 0% chance that the whole community forgot to add that correction.


Galactic rotational velocities are on the order of 100 km/s, which, though blazingly fast by human standards, is so slow compared to the speed of light that special relativistic effects are completely negligible.

Moreover, it's not just the absolute magnitude of the velocity that is the problem; it's also the shape of the curve. Newtonian mechanics (even with the extremely tiny relativistic corrections taken into account) predicts that the rotational speed of a galaxy should decrease the further you go out from the center, whereas we in fact find that this speed remains more or less constant.
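For concreteness, here's a back-of-the-envelope check (my own numbers, not the parent's) of just how negligible the special-relativistic correction is at galactic rotation speeds:

    # Minimal sketch: special-relativistic corrections enter at order (v/c)^2.
    c = 299_792.458            # speed of light, km/s
    v = 100.0                  # typical galactic rotation speed, km/s
    beta_sq = (v / c) ** 2
    gamma = 1.0 / (1.0 - beta_sq) ** 0.5
    print(f"(v/c)^2   = {beta_sq:.2e}")    # ~1.1e-07
    print(f"gamma - 1 = {gamma - 1:.2e}")  # ~5.6e-08, i.e. a few parts in 10^8

So whatever flattens the curves, it is many orders of magnitude too large to be a kinematic time-dilation effect.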


Except it's not consistent with all the observations, which is why it's not the accepted consensus.

