A conceptual issue that some of the commenters may have missed is that part of the detection is done by matched filtering (https://en.wikipedia.org/wiki/Matched_filter), in which it is necessary to have a good idea of the signal you're looking for. This detection has built upon analytical and numerical advances in relativity. While people may not know about the prevalence of e.g. binary black hole collisions, they have a pretty good idea of the signal that would result if such a collision were to occur. Similarly with other potential sources like binary neutron star collisions.
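For anyone unfamiliar with the idea, a matched filter is just a cross-correlation of the data against the template you expect, normalized by the template's energy. Here's a minimal toy sketch (made-up chirp-like template and synthetic Gaussian noise, nothing like LIGO's actual pipeline) showing how a waveform buried in noise is recovered by correlating against a known template:

```python
# Toy matched filter: find a known template buried in Gaussian noise.
# The template and injection offset here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
# A chirp-like template (frequency sweeps upward, amplitude enveloped).
template = np.sin(2 * np.pi * 30 * t**2) * np.exp(-4 * (t - 0.5) ** 2)

data = rng.normal(0.0, 1.0, 2000)
data[800:1300] += template  # bury the template in noise at offset 800

# Matched filtering = sliding cross-correlation with the template,
# normalized by the template's energy so the output is an SNR-like series.
snr_series = np.correlate(data, template, mode="valid") / np.sqrt(np.sum(template**2))
offset = int(np.argmax(np.abs(snr_series)))
print(offset)  # recovers the injection point near sample 800
```

The point is that the filter's output peaks sharply only where the data actually matches the template, which is why having a good waveform model from numerical relativity matters so much.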
Yeah, too many LHC reports have primed people to expect counting experiments where the scientists struggle to get to 5 sigma. The waveforms we're talking about here have a signal-to-noise ratio over 20.
I'm assuming the same rules apply as in straight RF detection: a signal becomes usable at about 6 dB above the noise and improves rapidly with every additional 6 dB. Something 20 dB above the noise is rock-solid reliable.
A signal 20 dB above the noise? You could put your eye out with it.
dB can be confusing: for voltage ratios it's 20*log10(Vs/Vref), versus 10*log10 for power ratios. And for absolute levels, engineers quote power relative to 1 mW (dBm).
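To make the two conventions concrete, here's a quick sketch (pure arithmetic, the helper names are mine): a 100x power ratio and a 10x voltage ratio are both 20 dB, and 1 W works out to 30 dBm.

```python
# The dB conventions mentioned above, as plain math.
import math

def db_power(p_ratio):
    """dB for a power ratio: 10*log10(P / P_ref)."""
    return 10 * math.log10(p_ratio)

def db_voltage(v_ratio):
    """dB for a voltage ratio: 20*log10(V / V_ref).
    The factor of 20 comes from power scaling as voltage squared."""
    return 20 * math.log10(v_ratio)

def dbm(power_watts):
    """Absolute power in dBm: dB relative to 1 mW."""
    return 10 * math.log10(power_watts / 1e-3)

print(db_power(100))   # 20.0 -> 100x the power is 20 dB
print(db_voltage(10))  # 20.0 -> 10x the voltage is the same 20 dB
print(dbm(1.0))        # 30.0 -> 1 W = 30 dBm
```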
Myself, I get a bit miffed because people have been conditioned to think in terms of pulling facts out of crappy data sets with poorly thought-out statistics. But in a lot of engineering and physics fields the data is often really good, often good enough that you can work off a single measurement.