So much of what you said is hilariously wrong. Probably the most acute example is your claim "Standard Model is mute on gravity". Are you kidding me? Page one of any book of the Standard Model discusses the 4 fundamental forces. Guess what one of them is: GRAVITY.
You conducted yourself so nastily in this thread that we've banned this account. This sort of flamewar is exactly what we're hoping to avoid on this site.
Really? So what's the gravity term in the Standard Model Lagrangian? [1]
> Page one of any book of the Standard Model discusses the 4 fundamental forces. Guess what one of them is: GRAVITY.
Well, let's just test this assertion of yours.
Right at hand I have Halzen & Martin [2], and "gravity" appears in the index (on p. 389) pointing to pp. 27 and 348. Section 1.8 (pp. 27-28) explains why gravity is not addressed in the book, and at p. 348 there is a brief discussion, following the Weinberg-Salam unification scale at eq. 15.58, about whether, given how large that scale is, the gravitational interaction can still be neglected. The treatment there is unsurprisingly fully classical.
Maybe you don't like this particular textbook.
How about Cottingham & Greenwood [3]? This is an excellent book aimed at grad students, and has the advantage of having its introductory chapter online:
"The Standard Model excludes from consideration the gravitational field."
Well, at least that's on page one.
Who next? How about Griffiths [4]? In the middle of page 50 we find:
"This is all adding up to an embarrassingly large number of supposedly 'elementary' particles: 12 leptons, 36 quarks, 12 mediators (I won't count the graviton, since gravity is not included in the Standard Model)."
Above are three standard textbooks introducing the Standard Model, and they all support my assertion and not yours.
"The Standard Model includes the electromagnetic, strong and weak forces and all their carrier particles, and explains well how these forces act on all of the matter particles. However, the most familiar force in our everyday lives, gravity, is not part of the Standard Model, as fitting gravity comfortably into this framework has proved to be a difficult challenge"
I would be very keen on any evidence that supports your claim that the Standard Model is not mute on gravity.
I'd also be keen on what else you believe I was "hilariously wrong" about. I'd be happy to expand upon, back, or source most of the statements as I do here, if you particularize your complaints and are reasonable and polite about it.
And finally, what really are you trying to accomplish here?
[1] here's Cottingham & Greenwood's[3] write-down of the SM Lagrangian:
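Schematically, in the commonly quoted compressed form (a sketch, not necessarily C&G's exact notation):

    L_{SM} = -1/4 F_{\mu\nu} F^{\mu\nu}
             + i \bar{\psi} \gamma^\mu D_\mu \psi
             + \bar{\psi}_i y_{ij} \psi_j \phi + h.c.
             + |D_\mu \phi|^2 - V(\phi)

Gauge-field kinetic terms, fermion terms, Yukawa couplings, and the Higgs sector. Note that nothing in it couples to a graviton or treats the spacetime metric as a dynamical field.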
You can google the phrase "Standard Model" or look it up in any physics book you want, and there will always be a table labeled "Standard Model" that shows all the particles, which relate to the 4 fundamental forces (carriers), charges, spins, masses, etc. I'm sure you're familiar with this chart. The graviton is on that chart, specifically describing gravity based on each particle's mass value (hardly "silence" there). I mean literally every particle type has a mass value (even when zero), right? Is that silence? The general layman's meaning of "Standard Model" means everything in physics that is referenced on that chart (including gravity), and that's how I meant it.
However, what YOU apparently mean by "silent on gravity" is referencing the fact that the Standard Model is incomplete for gravity, because General Relativity is not yet accounted for. I should've realized you might be referring to the lack of any unified field theory, but I didn't. If you had said "doesn't fully explain" rather than "is silent on", it would've sounded perfectly fine to me.
No, what I mean is that there is no gravitational term in any formalism of the Standard Model, and never has been. Its Lagrangian does not include a kinetic or interaction term whereby the spectrum of the Standard Model can include a graviton. Gravity is simply absent from the mathematics of the Standard Model, which is a specific quantum field theory.
You can certainly describe gravitation using a massless spin-2 graviton, and perturbative quantum gravity -- another specific quantum field theory -- does exactly that, and is a perfectly fine effective field theory that entirely matches General Relativity absent superposed sources and outside of strong gravity (which is defined by the EFT's renormalization group flow).
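Schematically, one common way to write that EFT's action (a sketch; M_P is the Planck mass and the c_i are the higher-curvature couplings that the renormalization group flow acts on):

    S_{eff} = \int d^4x \sqrt{-g} [ (M_P^2 / 2) R
              + c_1 R^2 + c_2 R_{\mu\nu} R^{\mu\nu} + ... ] + S_{matter}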
You can add this graviton to a particle zoo. But then it's not the particle zoo of the Standard Model, any more than writing down an "action of everything"[1] gives the action of the Standard Model.
> The general layman's meaning of "Standard Model" means everything in physics that is referenced on that chart (including gravity), and that's how I meant it.
is only correct if you underline the "how I [mean] it" part and make that central to evaluating the sentence's truth. It's hard to disprove solipsistic physics, though, and rarely interesting.
Additionally,
> which relate to the 4 fundamental forces (carriers) ... I'm sure you're familiar with this chart.
Sure, but there are several such charts, and they're not all the same. Here's one from Fermilab and SLAC, institutions that are pretty familiar with the Standard Model: http://www.symmetrymagazine.org/standard-model/ (There's no graviton in it).
Fundamental forces come from fundamental interactions, but there is no force police force enforcing a law that there are four of them or that the carriers are gauge bosons in a particle zoo. The Higgs interaction can be treated as a force[2] and its mediator is a scalar boson rather than a gauge boson, so does the Standard Model have four fundamental forces? One can take differing views of the electroweak interaction above 80-90 GeV: does the Standard Model have only two fundamental forces?
One can certainly treat Einstein gravity as at least a d'Alembert force arising from the affine connection; but alternatively one could take the position that d'Alembert forces generically depend on a choice of frame of reference and on that basis they cannot be fundamental. Is gravity a force? You get a different answer from Newton (yes) than you do from either Einstein (yes, but it's fictitious) or Misner, Thorne & Wheeler (mu, there is only spacetime geometry).
So an argument that as there are four fundamental forces there must be four force carriers is on shaky ground to start with. Even if we were to accept that, it does not follow that all the forces in question relate to the local symmetries of the Standard Model.
Most people consider the 4 fundamental forces to be 'part of' the Standard Model, even if the field equations are not complete. But like I said, I did understand what you meant by 'silent on gravity' after your first clarification.
There is no evidence that dark matter even exists other than the fact that our universe is not expanding at the rate we think it should, based on our current theory of gravity and how much mass we can account for based on the radiation currently reaching us (light). I think it's far more likely that our theory is incomplete rather than some whole new class of invisible matter/energy being conjured into existence just to counter-balance our 1) wrong equations and/or 2) wrong observations. So many non-scientists think dark matter is proven. It isn't. It's nothing but pure conjecture.
> There is no evidence that dark matter even exists other than...
Most of our theories about the universe are based on evidence from secondary, tertiary, etc. effects. It's often the best we can do.
I don't think you'd find anyone who would object to the idea that "our theory is incomplete" ("all models are wrong", etc.), so your main objection seems to be that we've given a name to a family of theories that attempt to explain the phenomena we've grouped under "dark matter".
> I think it's far more likely that our theory is incomplete rather than some whole new class of invisible matter/energy being conjured into existence just to counter-balance our 1) wrong equations and/or 2) wrong observations
It's rather human to make an estimate of likelihood based on how long ago something was conjectured to exist :) Moreover "wrong equations" and/or "wrong observations" are part of the very theories attempting to explain our current state of knowledge.
> So many non-scientists think dark matter is proven. It isn't. It's nothing but pure conjecture.
Actually, usually the issue in these discussions is with scientifically literate folks who read a Scientific American article on dark matter in 1998 and have engaged with that as a strawman ever since. Somewhat tongue-in-cheek, but it is odd to me how many people are convinced they're bringing light to the darkness with these kinds of comments.
> There is no evidence that dark matter even exists other than the fact that our universe is not expanding at the rate we think it should ...
You're thinking of dark energy, the net effect of which is to change the overall expansion profile of the universe as a whole. Dark Matter was first proposed when galactic rotation profiles failed to meet theoretical expectations -- matter far from the center of each galaxy had a higher velocity than it would if visible matter had been the only factor.
> I think it's far more likely that our theory is incomplete rather than some whole new class of invisible matter/energy being conjured into existence
Occam's razor (the simplest explanation tends to be the right one) suggests that a new unobserved particle is more likely than abandonment of F = GMm/r^2. This doesn't mean the equation must be correct; it's a question of reaching for the low-hanging fruit.
> So many non-scientists think dark matter is proven. It isn't. It's nothing but pure conjecture.
First, nothing is ever proven true in science, only false. Second, dark matter is more than pure conjecture, since there is observational evidence. A pure conjecture would be an idea about reality having no observational support at all (unicorns, Bigfoot). Dark matter is a hypothesis crafted to explain observations, but so far there's no persuasive theory to explain it, and no observations of candidate particles either.
I should have been clear that I'm not making a distinction between energy and matter. I'm referring to the postulated "dark" forms of both. Both of them are pure speculation, nothing but the equivalent of taking whatever observable "error" there is in our formulas and labeling it "dark". The only reason there's two kinds of "dark" is because both the space aspect (matter) of our math is wrong AND the time aspect (energy) is wrong. GR and SR are correct but incomplete, for representing spacetime. Just like Newtonian rules are correct but incomplete.
And as for your last sentence, trust me I understand the scientific method, and how proof, evidence, and fact interrelate with knowledge. Nice philosophical observations, but having nothing to do with this discussion.
> The only reason there's two kinds of "dark" is because both the space aspect (matter) of our math is wrong AND the time aspect (energy) is wrong.
You're pairing space and matter, then time and energy, and comparing them as though these entities are naturally paired in current theory. They aren't. Space and time are elements of spacetime, matter and energy are interchangeable by way of a rather well-known equation, but these things don't arrange themselves as you're trying to do.
> And as for your last sentence, trust me I understand the scientific method, and how proof, evidence, and fact interrelate with knowledge.
So you didn't say, "So many non-scientists think dark matter is proven. It isn't. It's nothing but pure conjecture."
Nothing is ever proven in science (falsified, yes, proven, no). And dark matter and dark energy are both more than "pure conjecture," a domain reserved to notions lacking observational evidence.
I also use the words "space" and "time" independently as if they were separate things, when I know full well every last detail about SR/GR (being an engineer myself). So yes, only in the context of discussing "dark stuff" I will lazily interchange matter and energy. Only on HackerNews do I ever encounter the type of ass-holes who will parse sentences intentionally wrong as to yield the incorrect conclusion. You are one such person.
> Only on HackerNews do I ever encounter the type of ass-holes who will parse sentences intentionally wrong as to yield the incorrect conclusion. You are one such person.
Troll alert. Space and time are elements of spacetime, they are an integrated whole, a fact first pointed out by Einstein's math teacher Minkowski. When Einstein first read what Minkowski had written, he said, “Since the mathematicians have invaded the theory of relativity, I do not understand it myself anymore.”
Only later, after learning tensor calculus and beginning work on GR, did Einstein understand what Minkowski was going on about. But in those days people were interested only in getting it right, not posturing as right even when they're wrong.
It's not name-calling. This person really is a troll, a term with an unambiguous definition: http://www.urbandictionary.com/define.php?term=troll . He tries so hard to earn the label that it seems unjust to withhold it.
In the days of Usenet, before there was an Internet, this sort of language was regarded as neutral and informative -- it's not abusive when it's accurate. But the Politically Correct movement in social media is seeing a revival, such that even accurate use of these terms is regarded as counterproductive.
In the Usenet era, some individuals would strive to earn the label, and applying it would save people a lot of time trying to engage in constructive conversations with people who were manifestly unable to rise to the occasion.
Bottom line -- it's possible to take PC to a pointless extreme. And I'm hardly the first to make this point.
Fair enough, but this raises the question of what constitutes incivility. If present trends continue, telling someone that they're wrong will be regarded as uncivil behavior. To avoid censure it will only be possible to assert that they've posted "alternative facts."
I happen to agree that incivility represents a real problem in social media, and we've seen many sites abandon their discussion groups because of uncivil posts and people. But I think the argument can be made that definitions have changed as well as behavior.
> That goes back to Usenet as well.
Not really. Having posted there for many years, I can tell you from direct experience that the perceived threshold of incivility has changed completely. One need only review posts from that era to see the point that definitions and standards have changed.
But this is now, and an argument about what was once acceptable doesn't seem particularly persuasive even to me, especially now that we have an embodiment of incivility running the country.
Just to keep score: You called me a troll, because I called someone an a-hole, for calling me a liar. That's what just happened. What a totally PC Snowflake Soap-Opera. Of course that's based on the assumption that the 'Sock Puppet' burner accounts aren't BOTH YOURS to begin with. hahahahaha. omfg that would be hilarious. In case it doesn't even matter, I'm using my REAL identity.
Nice dissertation on 2+2=4. I've known relativity for 30 years. The point I was making (as I'm sure you genuinely DO actually know, despite pretending once again to need to correct me) is that even physics professors in the middle of physics lectures will say "space" or "time", depending on context. Learn the fact that English and all languages have syntactical nuances. Oh, and thanks for the warning that you are a Troll, but I don't mind. I'm biting the hook. Nothing thrills me more than debating physics. Thus the 30 years.
The same thing could have been said about the discovery of Neptune [1]: "there's no evidence a planet is there, it's pure speculation. The changes in Uranus' orbit must be the result of mathematical error and not another planet."
If man had discovered Gravitational Lensing BEFORE Einstein had formulated General Relativity we would have given some name to the effect, and perhaps even considered it a characteristic relationship between stars and light. The relationship happens to be an INDIRECT one. Star mass bends space, and the light merely "appears" to bend, when it traverses that space. It doesn't actually bend.
What I'm saying about Dark energy/matter, is that there is also an INDIRECT relationship there. It's not just a special invisible mass and invisible energy. It's a fundamental misunderstanding about what spacetime is. I think the "dark" quantities are every bit as much an illusion as the "light bending" illusion created by stars.
The light is traveling in a straight line ALWAYS, but merely appears (to us) to bend, because the space it's traveling thru (in a straight line) is itself bent. The space itself is bent. The light goes straight.
That's the distinction I was attempting to point out when I said that if mankind had visually noticed star-induced 'lensing' (before Einstein explained what to expect) we would have ASSUMED the light itself was bending, and that space was 'flat' (unbent). Thinking space is flat and light is bent (the opposite of what is true) would have been the same kind of blunder we are making today believing that Dark matter/energy is actually real.
> The light is traveling in a straight line ALWAYS, but merely appears (to us) to bend, because the space it's traveling thru (in a straight line) is itself bent. The space itself is bent. The light goes straight.
Space is curved, and the light passing through it is also curved. That was the point of my link to the Einstein Ring page -- to show that light is in fact curved along with space.
The first important confirmation of GR was an experiment conducted in 1919 that showed curved light paths of starlight passing near the sun, observed during an eclipse.
If you happened to be located near a black hole, at 1.5 times the radius of the event horizon, by looking along a tangential path, you would see the back of your own head, regardless of which direction you looked. The reason? Light is curved along with space.
It's not accurate to say, as you are doing, that light always follows straight paths. It is accurate to say that light follows the curvature of the space through which it passes.
> ... would have been the same kind of blunder we are making today believing that Dark matter/energy is actually real.
Try to avoid moving ahead of the evidence. The present evidence is that dark matter and energy are real, again following Occam's razor. But I can't say these things are real as a matter of concluded fact, and you can't say they aren't. No one knows, and science requires us to wait for observation and theory to sort it out. Science doesn't progress by proclamation, but by way of theories that resist sincere efforts at falsification.
I do understand why you think what you do. Most physics articles (and even the Wikipedia page on Gravitational Lensing) are explaining it wrong. They are saying that the light bent. That's not what's really happening. The light doesn't bend relative to the space it's flowing thru. It flows in a perfectly straight line. Gravity has no direct effect whatsoever on light, because light is massless. Gravity can only affect the SHAPE of the spacetime density field. The fact that the spacetime is bent makes it look to us as if the light changed direction. It didn't. Light passing thru an ordinary optical glass lens DOES bend, but light passing by a massive object in space absolutely does not bend. It merely "looks" like it did.
> Most physics articles (and even the Wikipedia page on Gravitational Lensing) are explaining it wrong.
Ah, the encyclopedias are wrong. That may be true, but only if you meet your burden of evidence. You cannot meet your burden of evidence, and you show no sign of even trying.
Oh, I can prove it with evidence. The only thing that can change the direction of motion of a particle is a force. Photons have zero mass and zero charge, therefore no force (including gravity) can act on light (strong and weak interactions are not applicable here). Therefore light will always travel in a straight line. When light appears to 'bend' it is not because the light itself changed direction in its reference frame, but because the space the light is traveling thru is warped. Any physics professor will understand precisely what I'm saying, and all agree. Someone who has only read a few articles online will not. I've understood this since 1986. I'm very old and wise, you little child.
> The only thing that can change the direction of motion of a particle is a force.
Quite false. You're overlooking the fact that photons are the carrier particle of the electromagnetic field, which takes the form of waves in space -- waves that change direction without the application of forces. An optical lens changes the direction of photons without exerting a force. So does curved spacetime. These are examples of hundreds of things about physics that contradict your outlook.
Again, at 1.5 times the event-horizon radius of a black hole, looking tangentially, you would see the back of your own head. So even in this local frame of reference, light has taken a curved path along with curved spacetime. In other frames of reference, light is obviously not traveling in straight lines. In fact, it can be argued that light never travels in straight lines -- that would be true only in a universe without any mass at all.
You could argue that water always travels along straight lines inside a pipe and never changes direction, and in the case of the pipe itself changing direction, you could argue that the water is always traveling in a straight line from its own perspective inside the curved pipe, but having said that, people would see your ideas for what they are.
> Any physics professor will understand precisely what I'm saying, and all agree.
You appear to have forgotten I have already disproven this with my John Wheeler quote: "Mass tells space-time how to curve, and space-time tells mass how to move." As with masses, so with photons. If masses could travel at c, they would take the exact path photons do -- curved ones.
We've banned that account -- for obvious reasons, given how it behaved in this thread.
Unfortunately, several of your comments were also uncivil. Please err on the side of civility when posting here.
Also, please don't engage in flamewars on HN. A good-faith discussion about how someone is wrong is fine, but at the point when good faith dries up, such discussions become tedious tit-for-tats, which amounts to mutual trolling. We definitely don't want those kinds of threads here.
What about the evidence from the bullet cluster? The gravitational center of mass is offset from the visible center of mass. So something must be gravitating that is not visible - i.e. dark matter.
Matter isn't the only thing that produces gravity. Energy also produces gravity. Given that the bullet cluster is also incredibly hot, it seems entirely plausible that its unusual gravitational morphology is somehow related to its energy content.
You'd expect it to be incredibly hot where the gas is, no? And the interesting thing about the bullet cluster is that the excess of matter is found where there is no gas and stars. Plus, if the gas were super hot, it would emit lots of light, including x-rays.
My understanding was that it is in fact incredibly hot where the gas is, and it is emitting lots of x-rays. And you're jumping ahead of yourself to say that it is matter, we just know that there is gravitational lensing.
My main issue is that the bullet is anomalous in multiple ways, and dark matter doesn't explain the heat issue. If it did, then it wouldn't be useful as an explanation for the rotation curve problem in other galaxies. I feel that a more parsimonious explanation would cover both the lensing and the heat.
You're right. Collisional matter that dissipates radiatively completely explains the heat issue: many of the individual components of the dust and gas in the galaxy clusters came close enough to each other to interact electromagnetically, trading off their momentum-energy for momentum-energy in photons and other products of scatterings. Ordinary intergalactic gas and dust from the two clusters got squashed together and the squashing made it hot enough to glow.
In the standard cosmology (\Lambda-CDM), CDM is cold dark matter that is collisionless, does not dissipate radiatively, and does not feel electromagnetism even in very close quarters. So it does not heat up, and its momentum is only influenced by the gravitation sourced by the components of the colliding galaxy clusters (which includes the dark matter itself), with the result that the "clouds" of dark matter in each cluster sail right through one another and past all the ordinary matter, with effectively no deflections of their trajectories.
The result is that gravity (measured by lensing of background objects) points to a hot cloud of dust left in the middle of the collision, the stars and other denser visible matter ahead of that debris, and two mostly-transparent regions leading the way ahead of the visible matter.
> the bullet is anomalous in multiple ways
It's a bit more accurate to say that there are questions still unanswerable by observation of the bullet cluster alone, and that searches for other cluster collisions are likely to provide further partial answers.
In particular, further observations will favour or disfavour different numerical simulations of large scale structure formation under the standard cosmology. However the variables most directly tested align roughly with whether cold dark matter is almost wholly particles similar to heavy sterile neutrinos or almost wholly particles like axions, rather than whether particle cold dark matter is there at all. Even TeVeS proponents aren't especially optimistic on that latter point (e.g. Angus and Diaferio, two prolific TeVeS-as-relativistic-MOND researchers, in their 2012 paper https://arxiv.org/pdf/1206.6231.pdf starting at the bottom of page 23).
Ah, ok, I was going to say, if the excess energy in heat was sufficient to explain the gravitational lensing, you'd have several problems. You're just pointing out that there are other significant anomalous things going on we don't understand. Until they are also explained, the solution is not a complete one.
You've confused the logic for inferring dark energy or some other force at work, with dark matter which is involved in lensing experiments, galaxy rotation curves, etc.
No one knows why space is expanding. The dark energy is just the 'label' they give to whatever is causing it to expand. Both dark matter and dark energy are postulated to be whatever shape of puzzle piece fits into the puzzle slot where the error in our formulas exists. What I'm saying is that it's more likely that the formula itself is wrong than it is that something magically exists to conveniently fit that slot.
> The dark energy is just the 'label' they give to whatever is causing it to expand
This depends on how you look at it.
In the standard cosmology we define a preferred frame wherein an observer will see the matter content of the universe as homogeneous and isotropic. ("Matter" here is in the most general sense of "not the gravitational field", so it includes atoms and their components, photons, and various types of dark matter, e.g. neutrinos, which are "hot" dark matter since they move relativistically and do not experience electromagnetism.)

This is physically reasonable since along every unobscured line of sight we see a lot of galaxies of various shapes, "tilts", sizes, surface brightnesses, and spectral lines. Observations also lead us to conclude that there is a relationship between the redshifting of the spectral lines of common types of galaxies (and of common radiative occurrences within them, like type Ia supernovae) and the change in the other observables (angular size on the sky, luminosity, etc.) that correlate with greater distance.

This in turn led to the discovery of the Hubble "constant", and provoked ever deeper-field telescopic studies to pin down its value.
So if we assume that along every line of sight, including obscured ones, we have much the same view of many many galaxies at a variety of distances, we can make a variation on the Friedmann equation that parameterizes several things that would lead to the observables of galaxies when we model their known (and unknown) components as a set of perfect fluids.
We can take the Hubble "constant" and put it into a Robertson-Walker vacuum spacetime. RW spacetimes can be grokked by dimensional reduction. Consider a cylinder that we slice (foliate) along its axis into a set of infinitesimally thin circles stacked on top of each other. We describe the radius of each circle with a function r(h), where h is the height of the circle from the base of the cylinder. Where r(h) is constant, we have a cylinder, but if r(h) increases with h, then we have something like a cone balancing on its apex; r(h) can describe a wide variety of shapes.

For a 3+1 RW spacetime that is similar to our universe, we foliate on the timelike axis and define a function a(t), where t_0 == now, with the spacelike coordinates set on a chosen observer (us here on Earth, for example). "t" counts upwards as we go into the past from t_0, and a(t) goes to zero as t increases. "t" is the lookback time and a(t) is the scale factor.

The chosen observer is the special observer mentioned above, who sees the matter of the universe as isotropic and homogeneous at the largest scales. The most useful coordinates on such a spacetime are comoving, that is, each gravitationally bound galaxy cluster stays at the same coordinates at every time t.
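For concreteness, the spatially flat RW line element in those comoving coordinates takes the standard form (with t here the usual coordinate time rather than lookback time):

    ds^2 = -c^2 dt^2 + a(t)^2 ( dx^2 + dy^2 + dz^2 )

All of the dynamics is carried by the single function a(t).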
If we mix together the Friedmann equations, the Lemaître idea of spacetime having a zero radius at some large lookback time, and the RW spacetime that can model that, we get the FLRW model of the standard cosmology. We take the RW case where there is no extrinsic curvature, that is, when we foliate on the timelike axis each spacelike hypersurface is spatially flat; that is similar to saying that when we slice up our dimensionally reduced solid along its height, we get a set of circles of the same radius (i.e. a cylinder rather than a cone). We absorb the expansion parameter into a(t) as another of the fluids.
We then consider two types of fluid: those that dilute away as t -> t_0 -> future and those that do not dilute away. The former is "matter", including dark matter; the latter is "dark energy".
When we consider them as components of an action, diluting-away fluids are attractive and non-diluting fluids are repulsive. When we consider them in terms of the stress-energy tensor T in General Relativity, the diluting-away fluids have non-negative pressure and the non-diluting fluids have negative pressure.
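In the usual perfect-fluid bookkeeping (standard results, stated here for concreteness), a fluid with equation of state p_i = w_i \rho_i has energy density scaling as

    \rho_i \propto a^{-3 (1 + w_i)}

so dust (w = 0) dilutes as a^{-3}, radiation (w = 1/3) as a^{-4}, and a cosmological-constant-like fluid (w = -1) does not dilute at all.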
It's important to return to the point that this model has a preferred frame, and that translating the non-diluting fluid into "the same for all observers in all frames of reference" physics leads one to assume that dark energy is just a feature of the Lorentz-invariant vacuum. So dark energy arises in the cosmological model but corresponds to the ground state of the empty-of-matter spacetime in frames of reference other than the preferred one picked out by the cosmological model.
That is, the statement that "dark energy drives the expansion (via negative pressure or repulsion)" is frame-dependent, and thus observer-dependent, and with a change of frames of reference (and even a change of coordinates on the preferred frame), the statement becomes untrue. What is true in all frames is that there is an intrinsic property of space in an expanding spacetime that has a constant energy-density no matter how large a volume of space is considered.
The exact equation of state of dark energy is an area of active research and also tests to make sure that the assumptions that inevitably lead to it (isotropy at huge scales, homogeneity, spatial flatness, redshift-distance relations and other things implying expansion) are not blown up by evidence from ever finer observations.
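To make the parameterization concrete, here is a minimal numerical sketch (TypeScript, to match the rest of the thread; the density parameters are illustrative round numbers) of how the fluid mix fixes the expansion history in the spatially flat case:

    // Flat FLRW expansion history from a mix of perfect fluids:
    // rho_i(a) = Omega_i * a^(-3 * (1 + w_i)), and H(a)/H0 = sqrt(sum_i rho_i(a)).
    interface Fluid { omega: number; w: number }

    const fluids: Fluid[] = [
      { omega: 0.3, w: 0 },  // matter (incl. CDM): dilutes away as a^-3
      { omega: 0.7, w: -1 }, // dark energy: does not dilute at all
    ];

    // Dimensionless Hubble rate E(a) = H(a)/H0.
    function E(a: number): number {
      const sum = fluids.reduce(
        (acc, f) => acc + f.omega * Math.pow(a, -3 * (1 + f.w)), 0);
      return Math.sqrt(sum);
    }

    // Integrate da/dt = a * E(a) from an early epoch to a = 1 (today),
    // with time measured in units of 1/H0.
    let a = 0.01, t = 0;
    const dt = 1e-4;
    while (a < 1) { a += a * E(a) * dt; t += dt; }
    console.log(`a: 0.01 -> 1 in t ~ ${t.toFixed(2)} / H0`);

At late times the w = -1 term dominates and a(t) grows roughly exponentially: that is the repulsive, non-diluting fluid doing its work.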
So it's not so much a 'label' as a phenomenon whose microscopic details have yet to be discovered.
There are lots of those in physics, and we've had a good century of probing the microscopic details of phenomena discovered at the end of the 19th century and since, so this shouldn't really be causing anyone sleepless nights.
There's a paper by Harvey in which Schrödinger (in 1918!) describes "matter in the large essentially as a compressible fluid of constant density at rest under a constant spatially isotropic tension which ... must be equal to 1/3 of the rest density of energy".
The notation is old-fashioned, but check out Schrödinger's paragraph at the bottom of Harvey's page 5!
Your statement that there's "an expanding space-time that has a constant energy-density" gets to the nub of it for sure, but in reality no one knows if any of the fundamental field strengths are indeed constant. To me, the need for "dark" stuff to be postulated more likely indicates that gravity strength, or light speed, are NOT constant over large space-time ranges, than that there is a whole class of 'dark' particles and waves that currently don't interfere with any other matter in a provable way.
So if someone asked me what's my 'evidence' that gravity or light-speed are not universally constant, my answer is simply: "The evidence for dark-energy/matter is that exact same evidence". There should be some wave-function (possibly resulting from what we call a big bang), in terms of G and C, that once integrated out over the current life of the universe, will yield precisely the positions and velocities of the galaxies that we currently observe. To me it seems far less likely that there's an entirely new set of particles we cannot see, despite the Standard Model of particles being proven correct out to 40 decimal places.
You already have the neutrino, which interacts only via the weak force and gravity. Neutrinos show up in bubble chambers much the same way dark matter does in our telescopes: they don't. We only see where they interact with more observable matter. Is it really that far-fetched to imagine a particle that interacts through gravity only?
I'm not sure what you're trying to argue here, or even why. I'll guess that you're interested in physical cosmology and would like a set of brief and not-too-technical reactions to your ideas (which I take to be implied questions) from someone who knows the \Lambda-CDM cosmology reasonably well.
Dark energy is a property of empty space that does not dilute away as more space appears in an expanding universe. The only way to abolish dark energy is to eliminate the expansion of the universe. It only has a "fundamental field strength" in particular chosen frames of reference in which one can represent it as a field with a constant energy-density. The comoving frame of the standard cosmology is one such frame of reference. However, in most other frames this energy density vanishes, and when that happens a general relativist will decide that the energy density was an artifact of the choice of frame of reference or system of coordinates and ditch the word "fundamental".
Now, there are lots of ways we can complicate the action S_{\Lambda-CDM} in a Lagrangian formulation of the standard cosmology by introducing further repulsive terms into it beyond constant * \Lambda (as in the Einstein-Hilbert action or an expansion of it) or L_{repulsivematter} in a parameterization, and indeed there have been numerous attempts to do so. However, short of eliminating the expansion of space at all times and in all reference frames there is no way to get rid of a repulsive, non-diluting, non-concentrating (when time-reversed or if the critical density of the universe turns out to lead to a contraction of space in the future) term.
But in the standard cosmology, we have \Lambda, which is just the cosmological constant, i.e., dark energy is a property of the ground state, which is the vacuum. When there's more vacuum, there's more dark energy. Mathematically, we start with an action that leads to the standard write-down of the EFEs. Physically, this matches observational tests at large scales, which is convenient because the standard write-down of the EFEs is almost the only way to match observational tests within our solar system (cf. the parameterized post-Newtonian formalism).
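That standard write-down, for reference (in units with c = 1):

    S = (1 / 16 \pi G) \int d^4x \sqrt{-g} (R - 2\Lambda) + S_{matter}

    G_{\mu\nu} + \Lambda g_{\mu\nu} = 8 \pi G T_{\mu\nu}

More vacuum means more total \Lambda-energy, but the \Lambda term itself never dilutes.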
> gravity or light speed are not universally constant
What exactly do you mean by "gravity ... not universally constant"? In particular, what do you mean by gravity?
In General Relativity speeds are something that are extremely hard to talk about except in the local neighbourhood around a single point. In fact, many general relativists would argue that talking about "universal" speeds violates the spirit of general relativity. However, if our universe continues to be modellable as a smooth manifold with a Lorentz metric, we get a sort of quasi-universality of "c" in that at every point one can construct an infinitesimal region of spacetime in which "c" is a parameter in the action of matter and takes on the same _locally measured_ value in each such region for an observer in that region. There is ample evidence that favours this up to energy scales accessible to us on Earth and visible with observational platforms on and near our planet.
There are certainly metric theories other than GR that allow for different sources to couple to different metrics, but most of these have to undergo a phase change to an effective single metric with universal coupling in the early universe or we would see clear evidence for them (in particular, the distribution of heat in the early dense phase of the universe still has to produce the Standard Model at lab energies and also stars and galaxies and labs). Some productive and well-regarded physical cosmologists have proposed these types of theories, even recently (e.g. Afshordi & Magueijo), and they explicitly reject a universal value of "c" (in particular, c approaches infinity in their model's extremely early universe).
But in the standard cosmology we have General Relativity, and we don't vary the value of G or of c when grinding through the Einstein Field Equation; we take those values as given to us by nature, and have no evidence for them varying in the observable universe (and a pretty substantial amount of evidence against such variations, in particular including petabytes of spectroscopic data from objects in the sky).
> "There should be some wave-function (possibly resulting from what we call a big bang), in terms of G and C, that once integrated out...
I'm sorry, I don't understand this. Could you explain further?
> To me it seems far less likely that there's an entirely new set of particles we cannot see, despite the Standard Model of particles being proven correct out to 40 decimal places.
Dark energy is not particles. As I said above, it's a feature of the vacuum, and the most fundamental feature of the vacuum is that it is empty of particles. A tl;dr here is that we don't know what creates more space rather than less space, but we know that when more space is created there's more dark energy in the comoving frame but not more particles.
Are you thinking of dark matter here, rather than dark energy? If so, we already have hot dark matter in the form of neutrinos. Neutrinos are "hot" because they move relativistically, and thus are prone to carry momentum far away from galaxies very quickly. Cold dark matter is "cold" because it moves non-relativistically and so the momentum CDM carries lingers in place in and around galaxy clusters. While it would be convenient if CDM were like heavy neutrinos, there is no reason in the standard cosmology that CDM has to be particles at all, or even just one species; the only requirement is that it be almost entirely collisionless and non-radiative so that we don't see it and so that it doesn't release its trapped momentum (e.g. by converting some of it into hot dark matter or light or standard model particles).
Finally, the Standard Model is mute on gravity, and yet you -- made up of Standard Model particles -- are feeling it right now. So I'm not sure how your argument about the correctness of the Standard Model fits with your argument.
These 'from the ground up' totally all-new-code approaches to DBs are just a scary proposition. Think of the thousands of man-years of effort that went into building MySQL, testing its codebase, and perfecting its robustness (fail-proof-ness). What does MongoDB bring that couldn't have been 'built on top' of the MySQL codebase, using MySQL's transactional layer as its underpinnings? Sure, MongoDB gets all its performance gains from delaying writes to the DB (eventual consistency), caching in memory, and dispensing with ACID; however, there is nothing about MongoDB that couldn't have been written to use the DB layer of MySQL at the point where it actually does its writes to disk. In this way, MongoDB would have revolutionized the world rather than mostly "fragmenting" the storage space.
I guess there are those who will say that even using batch-commits, MongoDB could never have achieved the performance it currently does (by bypassing any ACID) if it was built on top of MySQL. But regardless, why not focus efforts on improving MySQL batch processing performance rather than throwing it all out, starting from scratch, and writing directly to disk? I know MongoDB became a success, but I think that is 'in spite of' their decision to start from scratch and not 'because of' it. Also think of the power that would be available if there were some relational table capability (true real MySQL ACID) right inside MongoDB whenever it was needed, if they were the 'same animal', rather than having to use two totally and completely separate DBs if you need NoSQL and also ACID in an app, which 99% of apps DO NEED, at some point, once an app grows beyond the round-one funding startup-toy phase and MongoDB falls flat in its RDB capabilities.
> Also think of the power that would be available if there were some relational table capability (true real MySQL ACID) right inside MongoDB whenever it was needed, if they were the 'same animal', rather than having to use two totally and completely separate DBs
Just use Postgres (or several other options) then? It has all that built right in if it's what you need. But...
> if you need NoSQL and also ACID in an app, which 99% of apps DO NEED, at some point,
...I doubt 99% of apps really need this.
> once an app grows beyond the round-one funding startup-toy phase and MongoDB falls flat in its RDB capabilities.
Or don't try to force MongoDB to behave like a RDB and you won't hit those problems? I just moved a system from Postgres to MongoDB, and it's running faster on way cheaper hardware now. Not because either database is inherently better than the other, but the use case lined up with mongodb perfectly and the old model was leveraging Postgres really poorly. Eventual consistency is fine, and I can denormalize certain data for faster reads because I know it won't be modified later.
I don't know Postgres, and I just consider MySQL to be the leading best-in-class open source DB engine available. I realize that's debatable. But personally my mind is made up on that.
I think relational-type queries are so common that any significant app will need them. For my own side project (meta64.com) I'm using JCR-SQL2, on Apache Jackrabbit Oak, so I get to do lookups fairly easily, but it sure would be nice to have a full-blown ACID RDBMS engine sitting right there to use also, in the same engine.
> What does MongoDB bring that couldn't have been 'built on top' of the MySQL codebase, using MySQL's transactional layer as its underpinnings?
There are two challenges here. One is that MongoDB has a different data model than MySQL: hierarchical, schema-less documents instead of uniformly-typed flat tables. It's possible to map one to the other--look at Postgres' support for JSON datatypes. That involves extensions to both the storage format and query language, but it's certainly doable.
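To make the data-model mismatch concrete, here's a minimal sketch (TypeScript, with purely illustrative names) of the same order expressed both ways:

    // Document model: one nested, schema-less value per record.
    const order = {
      id: 42,
      customer: { name: "Ada", email: "ada@example.com" },
      items: [{ sku: "X1", qty: 2 }, { sku: "Y9", qty: 1 }],
    };

    // Relational model: one fixed row shape per table, stitched back
    // together at query time with foreign keys and joins.
    interface CustomerRow { id: number; name: string; email: string }
    interface OrderRow    { id: number; customer_id: number }
    interface ItemRow     { order_id: number; sku: string; qty: number }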
The bigger problem is that MySQL's local transactional isolation is nice, but not particularly helpful in a distributed context. Distributed transactions--even single-document ones--are still tough to implement correctly and efficiently. As an example, consider Percona XtraDB, or MySQL+Galera Cluster. Both are building on a serializable single-node system--but in a distributed context, they wind up failing to provide snapshot isolation, let alone serializability.
The point is that MongoDB writes/reads directly to the file system. It could have been built (and still could be retrofitted) to ride on top of an RDBMS for all its actual I/O, creating a hybrid. I'm aware of all the orthogonal points you raised, which have nothing to do with ruling out an RDBMS as the I/O layer.
Probably because there has been a lot of development in distributed databases in the last few years. Raft is fairly new, for one.
Taking really old code and trying to make it do new things is very very hard. After you've invested all that time and effort making MySQL better, it's still a database owned by Oracle.
Think how Java would have turned out if Sun made it to be cross compiled to C rather than run in a VM.
I haven't seen the codebase of MySQL, but it's open source and not "owned" by Oracle. The only reason anyone should want to start over is if that codebase itself was crappy code. I bet it isn't. Implementing a robust RDBMS takes decades. MySQL is very mature. That doesn't act against it, it acts in favor of it. The better analogy is Linux. I'd be more likely to say "Linux is bad because of age" long before I'd ever say "MySQL is bad only due to age."
There are plenty of examples in the Language world also: Go, Rust, etc. All those new languages SHOULD have built on top of Java, for the same reason Mongo should have built on top of MySQL. BTW, Java DID leverage C++. The Java compiler IS written in C. Mongo didn't use any part of MySQL, but Java uses EVERY part of C.
Yeah, you are taking that sentence too literally. What I mean is we don't need a new language created every week. Java is already perfectly fine, and IF somebody finds something Java can't do, they shouldn't start from scratch and create a new language, but instead build and improve on what already exists: Java. It's like you can either stand on the shoulders of giants or you can start out at dirt level. I don't like starting back in the dirt. Not to mention the fact that every time somebody reinvents an already existing wheel, they just fragment the industry, because half the script-kiddies out there (not knowing any better) will simply jump on the bandwagon of whatever language most recently went viral. The Sisyphus approach to technological advancement.
TypeScript won the language game already on the browser. It's the only language on the browser that matters any more. For those of you who disagree, trust me, it won't be long before you change your mind. Rather than backing that up with facts, I'll mention what's relevant here: its bundling compiler. It has the ability to compile all your code into a single-file bundle OR compile separately as ES5/ES6 modules. Since I don't bundle 3rd-party stuff, the TypeScript compiler is the only bundler I need. See meta64 on GitHub if you're interested in my hijinks.
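For example, a minimal tsconfig.json along these lines (illustrative; outFile bundling requires the "amd" or "system" module format, or no modules at all):

    {
      "compilerOptions": {
        "target": "es5",
        "module": "amd",     // required for outFile bundling (or "system")
        "outFile": "app.js"  // concatenate all compiled sources into one file
      }
    }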
To me it's shocking that there's nothing for editing XML that equates to an editable tree GUI interface, where nodes and properties can be rendered onto something resembling a document yet remain browsable (by expanding and collapsing nodes) with the same paradigm as a file-system browser GUI. I'm working on this myself actually, in meta64.com (see it on GitHub, because the site is not always live, and is experimental). I am using JCR as the back-end data storage, but I'm seriously considering adding a feature to support direct XML editing.

XML and also REST are highly structural, and yet everyone seems to just use syntax-highlighting text editors to edit them rather than something more akin to a tree-based browser that would render something more friendly looking (with expand/collapse capability). Think of it like this: you have seen RSS XML before, right? You have also seen web sites that RENDER the RSS feed into a document-looking thing. That's what I'm getting at: going from editing this stuff as a text file to something much more advanced, like a tree browser. Maybe there are some things I'm not aware of, like perhaps an Atom Editor plugin or whatever, but I don't think there's anything in wide use or I'd know about it, having been a web developer for 25 years now.
The dirty little secret about React is that it tries to 'reinvent' the DOM tree, and Web Components will be out soon, making it all obsolete. Why not just move to Web Components NOW, and not have to rewrite your view logic in 3 or 4 years? Polymer wins easily. React got a head start, sure, but the W3C standards and native browser support for Web Components make React redundant and unnecessary in the very near future.
I don't know anything about React Native. Nor did I claim I did. I am simply stating that React itself is a dead-end technology. I would think no one would want to use React Native unless they buy into the prospect of React itself having a long life ahead. But it doesn't. It's a dead end. Web Components are the future, and React will be unnecessary.
The absolute LAST thing the world needed in 2016 was another programming language. Google did with Dart precisely what Microsoft tried to do when they (MSFT) created a proprietary JavaScript back in 1997(ish). Big tech companies just have a lot of hubris and want to "own" everything they touch. Any large software company, given enough time and success, will eventually create its own programming language. It's just in their nature. It's the natural progression of market domination.
They COULD contribute their efforts to improving EXISTING languages (like MSFT finally did, to their credit, with TypeScript, building on non-proprietary JS, which is great), but there is a seemingly irresistible allure to the thought of owning the design and molding a language to perfectly fit your own company's desires. You never have to deal with standards when you're just making it up as you go along. PLUS the massive benefit of the anti-competitive foot-hold (monopolistic foot-hold) that having its own language gives to a company is just too tempting. They cannot resist.
However, most developers are sick of the language-thrash. 99% of it is not innovative in the slightest; it's just somebody reinventing and re-solving problems that have already been invented and solved countless times before. I really really hope Dart dies (and GO can also go). In my opinion there are three languages that need to still exist: Java, JavaScript/TypeScript, and C++. Those fill EVERY need, and if the innovators would ADD TO THOSE rather than starting from scratch and pushing the reset button, the software world would be a MUCH better place, and so would overall software quality.
You are aware that Dart was not released in 2016, yes? The first public release was in 2011.
Feel free to stick only to your company's Recommended Best Practices Industry Standard Languages and Libraries. But, there are lots of languages out there, and many of us have good reasons for using them. Deal with it.
Well if you jump from language to language, you'll never build a strong resume. Keep that in mind. I'm nearly 50yrs old now and have stayed on course with either C++ or Java, and never needed or wanted any of the "Flavor of the Week" crap the script kiddies love piddling in. Personally I think not being able to stick to something is a form of A.D.D. but that's just me. Deal with that. And no I didn't give a shit about nor claim I knew when Dart was invented.
> In my opinion there are three languages that need to still exist: Java, JavaScript/TypeScript, and C++
How many languages do you know well and which are they? How many years have you been programming? Do you know Haskell and Lisp? Have you done any work in econometrics or bioinformatics?
My main point was this: Before creating a NEW language, a person or organization should have a firm reason that no existing language will fit their needs. I think creating new languages has become a bit of a hobby for a lot of developers, and whenever a new one gains any traction, it damages the entire industry by fragmenting it. Sorry, not interested in the rest of your interrogation.
It speaks volumes about human psychology that sometimes all it takes is a catchy name to build a company, from nothing, with not one single innovation.
Just look at one of the most hilarious examples: "meundies.com" It has a cute little name that people like, so it's gaining traction. Is there anything innovative or good about the product or business model? Nope. The catchy name is all.
That is so true. It frustrated me so much as a student, and even as an adult taking interview tests as a software developer. If you happen to know more about the subject than the person who wrote the test question, frequently (or at least from time to time) you will see questions with multiple correct answers, and end up having to guess.
It does not solve every problem JavaScript ever had. By definition of being a superset, it's still JavaScript. It just induces a prescription to mitigate symptoms of type unsafeness.
Null AND Undefined still coexist, though type safety helps. There's an idiomatic way to specify classes, as opposed to, for instance, the pattern of a closure function returning an object. There's still both the == and === operators.
I use the '=>' operator to get a 'this' reference that does what you'd expect (like in Java): it refers to the enclosing object rather than the function's own 'this', as is typical in JS. A trivial example (sketched with illustrative names):
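    // '=>' binds 'this' lexically to the enclosing instance,
    // much like 'this' in a Java method.
    class Counter {
        count = 0;

        start() {
            setInterval(() => {
                this.count++; // 'this' is the Counter instance
            }, 1000);

            // A plain function expression would get its own 'this'
            // (undefined in strict mode), so 'this.count' would fail:
            // setInterval(function () { this.count++; }, 1000);
        }
    }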
If you want to trade contrarian jabs, I can play that game: TypeScript does far more than 'mitigate symptoms of type unsafeness'. It is a fully type-safe language that gives proper compile-time errors if you use the wrong type. It doesn't "mitigate" unsafeness. It ELIMINATES unsafeness.
When comparing the types of function parameters, assignment succeeds if either the source parameter is assignable to the target parameter, or vice versa. This is UNSOUND because a caller might end up being given a function that takes a more specialized type, but invokes the function with a less specialized type.
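Concretely, here's the sort of thing that rule lets through (a sketch with illustrative names, under the classic bivariant parameter checking described above):

    class Animal { name = "generic"; }
    class Dog extends Animal { bark() { console.log("woof"); } }

    // Unsound direction: a handler for the more specialized type is
    // accepted where a handler for the less specialized type is expected.
    const dogHandler = (d: Dog) => d.bark();
    const animalHandler: (a: Animal) => void = dogHandler; // accepted

    animalHandler(new Animal()); // compiles, but throws: d.bark is not a function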
A Google query for the definition of unsound: "not safe or robust"
You're portraying TypeScript as if it's perfect. TypeScript is permissive and structurally, rather than nominally, typed. This is not TypeScript's fault. There will always be tradeoffs. Among other things in the FAQ, there is a section on how to make two structurally identical types incompatible: "A possible FIX if you really want two types to be incompatible is to add a 'brand' member"...
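The branding trick, sketched (illustrative type names):

    // Two structurally identical types made deliberately incompatible
    // by a phantom discriminant member.
    interface Usd { _brand: "usd"; amount: number; }
    interface Eur { _brand: "eur"; amount: number; }

    declare function payInUsd(p: Usd): void;
    declare const price: Eur;
    // payInUsd(price); // error: '"eur"' is not assignable to '"usd"'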
The 'assignability rules' of TS catch type errors in a way that 'solves' all the problems JS ever had. You can critique the nature of that solution if you want (after all, there are an infinite number of solutions in the solution-set), but I think it's a perfect solution.
Give any good developer TS to code in, and he'll know how to use it to check every type in every variable, class, or method parameter, and the checking will be 100% perfectly reliable. That's what I mean by "problem solved".
I like TypeScript but it doesn't come close to solving every problem. You still have floating-point only numerics, bizarre array semantics, crappy Unicode support, nigh-unusable threading, perf optimization is hard, lots more.
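For instance, the numerics point in three lines (standard JavaScript/TypeScript behavior):

    console.log(0.1 + 0.2 === 0.3);       // false: every number is an IEEE-754 double
    console.log(0.1 + 0.2);               // 0.30000000000000004
    console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991 -- no true 64-bit integers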
If you took a poll of developers about what the problems with JavaScript really are, the things you mentioned wouldn't even make it onto the list.
Imagine how much of a disaster the web would be today if JavaScript allowed threading all these years. All the script-kiddies who have never written one line of back-end code in their lives suddenly having to grapple with concurrency? Oh my god what a disaster.