I'm part of the LIGO team responding to the AMA. If anyone on HN sees a good question that hasn't been answered, link it here and I'll make sure we get to it!
Again, asking here since it's easier, but sorry if that wasn't the intent.
To get a better understanding of what LIGO is observing: does every massive object create a gravitational wave as it moves? Do gravitational waves arise similarly to how a moving charged particle creates magnetic flux? Do we know from LIGO whether gravitational waves propagate at exactly the speed of light?
Not on the team. Not even close. My understanding from other comments from the team is yes and yes. The reason we needed a collision of black holes was to get the signal up to where we can detect it. And they definitely said that this proves that gravitational waves propagate at the speed of light.
No, only accelerating objects create gravitational waves. (Massless particles have energy, so by mass-energy equivalence they also have gravitational fields and produce gravitational waves when accelerating.) The analogue of a gravitational wave is not a magnetic field, which is created by a moving charge, but rather the electromagnetic radiation produced by an accelerating charge, e.g. synchrotron radiation or radio waves from an antenna.
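To put a rough number on that analogy, here's a minimal Python sketch of the leading-order quadrupole luminosity of a circular binary, the gravitational counterpart of the Larmor formula for an accelerating charge. The masses and separation are illustrative GW150914-ish values I've assumed, not figures from this thread:

```python
# Leading-order (quadrupole) GW luminosity of a circular binary:
# P = (32/5) * G^4 * (m1*m2)^2 * (m1+m2) / (c^5 * r^5)
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
M_sun = 1.989e30    # solar mass, kg

def gw_power(m1, m2, r):
    """GW power (watts) radiated by masses m1, m2 (kg) at separation r (m)."""
    return (32.0 / 5.0) * G**4 * (m1 * m2)**2 * (m1 + m2) / (c**5 * r**5)

# Two ~30 solar-mass black holes a few hundred km apart (assumed numbers):
print(f"{gw_power(30 * M_sun, 30 * M_sun, 350e3):.1e} W")  # ~1e49 W
```

That ~1e49 W is why only compact-object mergers are loud enough: the steep r^-5 dependence means almost nothing else gets close.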
I know this is speculative, but do you think it is possible and/or plausible that measurement precision could be improved to the point where we could generate gravitational waves to transmit data, much as we do with EM waves?
Another LIGO scientist here. It takes an overwhelming amount of energy to generate gravitational waves, and detecting them from terrestrial sources is about 20 orders of magnitude more difficult than the measurement we just made. Spacetime is extremely stiff; bending it enough to be detectable requires a huge amount of mass-energy.
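For a sense of scale, here's a rough sketch of the strain a lab-scale rotating source would produce, observed from the near edge of its wave zone. Every number here is an assumption chosen for illustration, not anything from LIGO's analysis:

```python
# Order-of-magnitude strain from a spinning lab-scale mass:
# h ~ (2G/c^4) * m * L^2 * w^2 / d  (very rough quadrupole estimate)
import math

G, c = 6.674e-11, 2.998e8

def lab_strain(m, L, f_rot, d):
    """Rough strain from mass m (kg), size L (m), rotation rate f_rot (Hz),
    observed at distance d (m)."""
    w = 2.0 * math.pi * f_rot
    return 2.0 * G / c**4 * m * L**2 * w**2 / d

f_gw = 200.0            # GW frequency = 2 x rotation frequency (assumed)
d = c / f_gw            # one wavelength (~1500 km): edge of the wave zone
h = lab_strain(m=1000.0, L=1.0, f_rot=100.0, d=d)
print(f"h ~ {h:.0e}")   # ~4e-42, vs ~1e-21 for GW150914
```

A one-tonne rotor gets you roughly 20 orders of magnitude below the astrophysical signal we just detected, which matches the comparison above.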
I haven't looked into LIGO at all, so this is very likely not just off-the-cuff but out to lunch. But I'll ask: What if people start looking at the (very) precise timing of particle beam circulation e.g. at CERN?
I would guess there isn't sufficient resolution, and maybe it's beyond what's possible. But I'm thinking of additional avenues for measurement, something that might already exist or require less build-out.
And now, there would be something against which to compare anomalies, were they to be measurable.
I am not sure if this was already asked (I read the Reddit thread a day or two ago), but were there additional events after the first one, or is it the only one so far?
I'm just a bystander, but: the paper mentions another, weaker signal in the time period they wrote up (18 days IIRC). That one had an estimated false-signal rate of once in a couple years (the main one was more like once a century).
> Myself and one of my supervisors had a conversation about this a few weeks ago. We did some calculations which suggest that if you were in a space-ship close to the merging black holes you would feel a force which was pretty comparable to the force you feel by standing next to a loudspeaker at a music concert. You'd feel a vibration travelling through your body, but we were pretty confident it wouldn't hurt you!
Maybe the gravitational waves wouldn't hurt, but the gammas etc would be painful. I'm reminded of Greg Egan's Diaspora.
Edit: That tutorial also explains that black holes are very simple objects, so the chirps and ringdowns are likewise very simple. So black hole masses can be calculated very precisely. Also, amazingly:
> Just as with optical radiation and radio waves, the luminosity of gravitational radiation falls off in inverse proportion to the square of the distance from the source. This makes binary black hole inspirals standard sirens: if we know what the masses of the two black holes are then we can infer the distance to the source by measuring its apparent luminosity. We can precisely measure the masses because the rate at which the frequency and amplitude of an inspiral increase depends only on the masses.
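Here's a minimal sketch of that "standard siren" logic, assuming the leading-order inspiral relations (the real analysis uses full waveform models). The input numbers are rough GW150914-like values I've assumed:

```python
import math

G = 6.674e-11       # m^3 kg^-1 s^-2
c = 2.998e8         # m/s
M_sun = 1.989e30    # kg
Mpc = 3.086e22      # m

def chirp_mass(f, fdot):
    """Chirp mass (kg) from GW frequency f (Hz) and its rate of change
    fdot (Hz/s), via the leading-order inspiral ('chirp') relation."""
    return (c**3 / G) * ((5.0 / 96.0) * math.pi**(-8.0 / 3.0)
                         * f**(-11.0 / 3.0) * fdot)**(3.0 / 5.0)

def distance(f, fdot, h):
    """Luminosity distance (m) from the observed strain amplitude h of an
    optimally oriented inspiral at frequency f."""
    Mc = chirp_mass(f, fdot)
    return (4.0 / h) * (G * Mc / c**2)**(5.0 / 3.0) * (math.pi * f / c)**(2.0 / 3.0)

# Rough GW150914-like inputs (assumed): 75 Hz, sweeping up at ~1250 Hz/s,
# peak strain ~1e-21.
f, fdot, h = 75.0, 1250.0, 1.0e-21
print(f"chirp mass ~ {chirp_mass(f, fdot) / M_sun:.0f} M_sun, "
      f"distance ~ {distance(f, fdot, h) / Mpc:.0f} Mpc")
```

The frequency sweep alone fixes the chirp mass, and the chirp mass plus the apparent loudness fixes the distance; that's the whole siren.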
I doubt that the merger converted all of its energy into GWs. I suppose there might be some effects occurring at such high energies that we don't have precise simulations of what would be emitted.
In any case, that emitted energy certainly wouldn't be good for any living tissue. I don't think you could expect to witness a black hole merger from the comfort of your spaceship a few thousand miles away.
As a physicist and programmer I just love that Python was used both for controlling the actual experiment and for analyzing the data! Six years ago when I started my PhD I also moved all the code for my experiment from proprietary software (Matlab/Mathematica/LabVIEW) to Python (+matplotlib, pyvisa, PyQt/PySide). Many of my colleagues and supervisors were rather skeptical back then, as Python was still considered quite new and the tooling was not as good as it is today.

It seems, though, that Python is finally becoming the de facto standard for experiment control and data analysis, especially for more complex use cases that require a good software architecture (which is hard to implement e.g. in LabVIEW). This is really great news for me as it makes sharing code and data much easier: if they had done their analysis in Matlab it would be hard to reproduce for people who don't have access to that software, but with the IPython notebook everything you need to reproduce the results is open source. Great stuff, let's keep pushing Python forward :)
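For anyone curious what that kind of instrument control looks like, a minimal pyvisa sketch (the VISA address and instrument are placeholders I made up, not anything from LIGO's or my setup):

```python
import pyvisa

# Open a (hypothetical) instrument over GPIB and ask it to identify itself.
rm = pyvisa.ResourceManager()
scope = rm.open_resource("GPIB0::12::INSTR")  # placeholder address
print(scope.query("*IDN?"))  # "*IDN?" is the standard SCPI identify command
```

Once every instrument speaks through an interface like this, the whole experiment becomes ordinary, testable Python code, which is exactly what's hard to achieve in LabVIEW.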
In addition to the two LIGO sites, there is the Virgo instrument outside of Pisa, Italy, which will come online later this year. The KAGRA detector is currently being assembled underneath a mountain in Japan. And, mentioned in another reply, LIGO has the equipment for a third detector. This is currently in storage in the hopes that the Indian government will build a facility. By 2023 there should be five widely-spaced detectors worldwide.
The resolution of time-of-flight between the two LIGO sites for the signal we just detected was about half a millisecond. This resolution is somewhat dependent on the signal strength and the location of the source relative to the detectors. With three sites we can localize most sources to tens of square degrees on the sky. This is still very large; the moon is a quarter of a square degree.
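The localization comes from triangulation: the arrival-time difference pins the source to a ring on the sky. A toy sketch with assumed numbers (the roughly 3000 km Hanford-Livingston baseline and the 0.5 ms timing resolution mentioned above):

```python
import math

c = 2.998e8        # m/s
baseline = 3.0e6   # Hanford-Livingston separation, ~3000 km (approximate)

def ring_angle_deg(dt):
    """Angle between the baseline and the source direction, from the
    arrival-time difference dt (s): cos(theta) = c*dt / baseline."""
    return math.degrees(math.acos(c * dt / baseline))

# A 7 ms delay measured to +/-0.5 ms (assumed numbers) puts the source
# on a ring a few degrees wide:
dt, sigma = 7.0e-3, 0.5e-3
print(ring_angle_deg(dt - sigma), ring_angle_deg(dt), ring_angle_deg(dt + sigma))
```

With only two detectors you get the whole ring; a third site intersects a second ring with it, which is how the localization shrinks to tens of square degrees.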
The odds of observing only one event in 16 days of data are pretty good, and the likelihood of seeing an event on the first day is the same as the likelihood of seeing it on the last day. The analysis of the remaining data from the first observing run (ended Jan 12th) will probably take a couple of months.
> How many signals we see depends both on how sensitive our detectors are and on how often events that cause strong enough waves happen; because we observed one of these events in 18 days of observation, we can tell that there are between 2 and 400 events like this per year per cubic gigaparsec in the space around us.
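As a sketch of how a single detection becomes a rate range: with N=1 event in a sensitive volume-time VT, a standard Poisson confidence interval on the expected count already spans about two orders of magnitude, and dividing by VT gives the rate. The VT value below is a made-up placeholder; LIGO's actual calculation uses careful sensitivity estimates and different statistical choices:

```python
from scipy.stats import chi2

N = 1                                     # observed events
lo = chi2.ppf(0.05, 2 * N) / 2.0          # ~0.05: 90% Poisson lower bound
hi = chi2.ppf(0.95, 2 * (N + 1)) / 2.0    # ~4.7:  90% Poisson upper bound

VT = 0.01  # hypothetical sensitive volume-time, Gpc^3 yr (placeholder)
print(f"rate between {lo / VT:.0f} and {hi / VT:.0f} per Gpc^3 per yr")
```

Most of the width of the quoted 2-400 range is simply the statistics of having seen exactly one event.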
Thanks for doing this. I am concerned about the base rate fallacy[1] and would like to know whether/how the probability that this signal was a false positive has been estimated.
I read much discussion regarding the probability of a false alarm given the baseline model.[2] However, without an estimate of the probability of detecting a true signal during the same time frame, we cannot calculate the probability that this is a false positive. Such an estimate would obviously require prior estimates of the rate at which these black hole mergers occur, the fraction that can be detected, etc.
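For what it's worth, here's a minimal sketch of the Bayesian version of that concern: treating the background triggers and the real detectable events as Poisson streams, the probability that a single candidate came from the background is roughly the ratio of the rates. The numbers are illustrative; the 1-in-203,000-years figure is the false-alarm bound quoted in the detection paper, and the astrophysical rate is a deliberately pessimistic assumption:

```python
def p_false_positive(r_background, r_astro):
    """Probability a single candidate came from the background stream rather
    than the astrophysical one, given the two Poisson rates."""
    return r_background / (r_background + r_astro)

# Background louder than GW150914: < 1 per 203,000 yr (detection paper's
# bound). Pessimistic astrophysical rate: 1 detectable merger per year.
print(f"{p_false_positive(1.0 / 203_000, 1.0):.1e}")  # ~5e-6
```

The base rate only becomes a real worry if the true astrophysical rate is within a few orders of magnitude of the background rate, which is exactly the question about the priors.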
It sounds like you've already read about how we estimate the false-alarm rate for signals, so I'll just add that the estimated rate of BBH mergers was highly uncertain before this observation; the error bars spanned three orders of magnitude. See, for example, Fig 5 of http://arxiv.org/abs/1111.7314, which compares the previous LIGO-Virgo upper limits on similar events to the expected rate from population synthesis models and observations of high-mass X-ray binaries, known BNS systems, etc.
The rate inferred from GW150914 is on the high end of the rate estimates from astronomers, but it's completely consistent with prior observations. Certainly, if we had seen ten events in the first 16 days of data, it would not have made sense! But one event is well within expectations.
A lot of this information seems to be available, for example figure 4 here[1]. According to that, the horizon distance for "a binary black hole system with the same observed spin and mass parameters as GW150914 for optimal sky location and source orientation and detected with an SNR of 8" was 1.5-2 Gpc. Now we only need an estimate of the rate at which events with these properties occur, not using the data from GW150914. I presume this is somewhat (a few orders of magnitude) less than the rate at which the mergers occur in general.
Thanks, I'll have to look more closely at that paper since I only see discussion of the upper bounds. The reason for my concern is that I noted elsewhere that the prior lower bounds on the merger rates got pretty low: ~0.1 Gpc^-3 yr^-1. For only 16 days of observing the expected number of events drops to 4.38e-3 Gpc^-3.
Then if the horizon distance is somewhat less than 1 Gpc we need to scale this further. Say it was 0.5 Gpc: we scale by 0.5^3 to get ~5.475e-4 expected events. For a 0.2 Gpc horizon we get ~3.5e-5! These values are getting dangerously close to the estimated background rate, at least using this crude calculation at the lower ends of the prior estimates.
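Here's the same arithmetic in a few lines, with the 4*pi/3 spherical-volume factor made explicit (the crude scaling above effectively drops it); the numbers are the ones from the discussion, and the model ignores sensitivity falling off inside the horizon:

```python
import math

def expected_events(rate, horizon_gpc, days):
    """Expected detections = rate (Gpc^-3 yr^-1) x horizon volume x time."""
    volume = (4.0 / 3.0) * math.pi * horizon_gpc**3   # Gpc^3
    return rate * volume * days / 365.25

# Prior lower-bound rate 0.1 Gpc^-3 yr^-1, 16 days, various horizons:
for horizon in (1.0, 0.5, 0.2):
    print(f"{horizon} Gpc: {expected_events(0.1, horizon, 16):.1e} events")
```

Including the volume factor buys back a factor of ~4, but the conclusion stands: at the lowest prior rates and smallest horizons, the expected signal count gets uncomfortably small.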
Someone responded dismissively; they said, "Oh, so are you saying that NASA isn't perfect and didn't build a perfect detector?" and then they actually just said "duh." So lame.
I couldn't see it in the post, but I'm interested in how much this specific event, and future events, allow us to tighten the bounds on our models.
It's mentioned a few times that the size of the black holes/other model parameters were "within expectations" and so were a good fit.
What are some of the specific quantities that these measurements will allow us to refine? Previous astronomical measurements have allowed us to put better and better estimates on things like the size of the sun, or the gravitational constant - do we have any idea what estimates this technology will allow us to improve?
In Figs 6 and 7 of the second paper you can see the constraints from GW150914 on what are known as the "post-Newtonian" expansion terms, the series of corrections to Newtonian gravity predicted by general relativity. Previously, terms beyond first order were only loosely bounded, mostly from observations of the Double Pulsar system J0737-3039.
Just from this one event, we can also constrain the mass of the graviton to an order of magnitude less than the previous best measurement.
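For reference, here's how a graviton mass bound converts into the Compton wavelength usually quoted alongside it; the 1.2e-22 eV/c^2 figure is, to the best of my knowledge, the bound reported in the GW150914 testing-GR paper:

```python
h_planck = 6.626e-34   # Planck constant, J s
c = 2.998e8            # speed of light, m/s
eV = 1.602e-19         # J

m_g = 1.2e-22 * eV / c**2        # graviton mass bound, kg
lambda_g = h_planck / (m_g * c)  # Compton wavelength, m
print(f"lambda_g > {lambda_g / 1e3:.1e} km")  # ~1e13 km
```

A massive graviton would make low frequencies travel slower than high ones, so the undistorted shape of the chirp is itself the measurement.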
The announcement is the climax of a century of speculation, 50 years of trial and error, and 25 years perfecting a set of instruments so sensitive they could identify a distortion in spacetime a thousandth the diameter of one atomic nucleus across a 4 km strip of laser beam and mirror.
The phenomenon detected was the collision of two black holes. Using the world’s most sophisticated detector, the scientists listened for 20 thousandths of a second as the two giant black holes, one 35 times the mass of the sun, the other slightly smaller, circled around each other.
At the beginning of the signal, their calculations told them how stars perish: the two objects had begun by circling each other 30 times a second. By the end of the 20 millisecond snatch of data, the two had accelerated to 250 times a second before the final collision and a dark, violent merger.
The observation signals the opening of a new window on to the universe.
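A quick back-of-envelope check of that "thousandth the diameter of one atomic nucleus" figure, assuming the published peak strain of ~1e-21:

```python
strain = 1e-21             # peak strain of GW150914, order of magnitude
arm = 4.0e3                # LIGO arm length, m
proton = 1.7e-15           # proton diameter, m (rough)

dL = strain * arm          # absolute arm-length change
print(f"{dL:.0e} m, ~1/{proton / dL:.0f} of a proton diameter")
```

That works out to about 4e-18 m, a few hundredths of a percent of a proton's diameter, so the headline figure is the right order of magnitude.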
A century of speculation, and no wonder people are starving. Come on, are you guys actually reading this? Are you just closing your eyes, opening your mouths, and letting them stick in whatever they want?
Firstly, I think there is BS going on, but if I do accept this BS, then we are way far behind in our spiritual growth. Don't forget what Sting said: we are spirits in the material world. Better get balanced soon or the world will do it for you. Or is that what you need?