Bayesian probability still requires data as an input or your result is meaningless. If you start with a made up value and update it with more made up values, you still have a made up value. You still haven't offered any explanation for how you were able to come up with a 1% probability that P=NP.
Your example still isn't the same as mine. Mine didn't involve goal post moving. For your example to be the same, you would have to start off with a dragon whose presence is not detectable in any way. What's the difference between an empty garage and one with an undetectable dragon in it? Absolutely nothing. I'm not saying you should believe in the dragon, I'm saying that determining the presence of the dragon is not done scientifically.
It is a made up number. So what? It's not made up out of thin air though. I should expect to be wrong about 1% of the time when I make such a prediction (and if I'm not, it's because I was over- or under-confident, and should adjust my confidence accordingly).
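As a toy sketch of that calibration claim (the simulation, the trial count, and the 1% miss rate are all mine, purely illustrative): if "99% confident" predictions are well calibrated, they should fail about 1% of the time over many trials.

```python
import random

# Simulate many predictions made at 99% confidence. A well-calibrated
# predictor should miss close to 1% of them.
random.seed(0)
trials = 100_000
misses = sum(random.random() < 0.01 for _ in range(trials))
print(misses / trials)  # close to 0.01
```

If the observed miss rate came out well above or below 1%, that would be the signal to adjust confidence, exactly as described above.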
So where did the prediction itself come from? My brain, obviously. And that's not a bad thing. Humans are generally good at estimating probabilities. It is essentially what the brain evolved to do. This isn't unique to humans though, there are computer algorithms which can do similar tasks, and mathematical formalizations for calculating certainty.
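One such mathematical formalization is Bayes' theorem, which updates a prior probability in light of evidence. Here is a minimal sketch; every number in it is made up purely for illustration.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Start from a 1% prior, then observe evidence that is ten times
# likelier if the hypothesis is true than if it is false.
posterior = bayes_update(0.01, 0.50, 0.05)
print(round(posterior, 4))  # ~0.0917
```

The point is only that the update rule itself is rigorous even when the starting prior is an estimate; garbage priors get washed out as real evidence accumulates.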
You can and should be able to assign a probability estimate to anything. When you open your garage door you should expect not to see a dragon. You should be very certain that you won't, actually.
You can't not have any degree of certainty about something. You can expect something to happen. You can expect something not to happen. You can be slightly certain that something will happen. You can be moderately certain. You can think something has just as good a chance of happening as of not happening.
But you can't have no idea whatsoever how likely it is to happen. You have to have some expectation of how likely an event is; you can't have no expectation at all.
There is no such thing as a "separate realm" where ideas can't have any certainty values of how likely they are to be true.
As for moving the goal post, religion has moved the goal post plenty. From perfectly testable predictions, to less testable claims, to claims that can't be tested at all.
If you are upset that I, personally, moved the goal post, then just pretend it was my ancestors who claimed there was a dragon in my garage: my grandparents decided it was invisible, my parents decided it was impermeable to flour, and now I believe it is a completely untestable dragon, have decided that dragons are not a matter testable by science, and have been completely consistent in this belief.
>It is a made up number. So what? It's not made up out of thin air though.
Wait what?
Yes you can assign a probability to everything, it doesn't necessarily mean anything, but you can do it. I hope that is not how you prove all of your math problems.
I'm sorry I must not have been clear in my previous post. I did not mean that you personally were moving the goalposts. I meant that your hypothetical person who believed in garage dragons was moving the goalposts. I was trying to argue that your analogy was a straw man argument.
You are completely missing the point of the Bayesian interpretation of probability and I'm not really sure how to explain it any better.
Let's say you have to bet money on whether or not P=NP will be proven next year. You get to choose the odds you are willing to take, and you want to choose them so that you will win the most money on average. The "on average" is the important part.
So if you say you are 99% sure that it won't, that merely means you would take a bet where you pay $1 if it doesn't happen, and get $99 if it does. If you made a hundred such bets and lost only 1% of the time, you would walk away with just as much money as you started with.
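As a sketch of that break-even arithmetic (the dollar amounts are the fair odds implied by 99% confidence):

```python
# At 99% confidence, the fair bet is: pay $1 when the expected outcome
# occurs, receive $99 on the upset. Over 100 bets at exactly that hit
# rate, the wins and losses cancel.
losses = 99 * 1   # 99 bets where the likely thing happens: pay $1 each
wins = 1 * 99     # 1 upset: receive $99
net = wins - losses
print(net)  # 0

# Equivalently, the expected value of a single such bet is zero.
ev_per_bet = 0.01 * 99 - 0.99 * 1
print(ev_per_bet)  # 0.0 (up to floating-point rounding)
```

Odds any better than 99-to-1 would make the bet profitable on average, which is why the odds you're willing to accept reveal your actual degree of confidence.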
The point of the thought experiment is that you don't get the luxury of saying "I don't know"; you have to actually make a decision about how certain you are. And you can't take 10 years to calculate your certainty mathematically either, you have to make a decision. And it's all probabilistic. You decide what bet to take based on how likely you think it is to happen. This is how we make most of our decisions. We couldn't go about our daily lives if we didn't do this.
I don't think it's an association fallacy. I am not saying "All claims religions made in the past turned out to be false when tested, therefore all religious claims are guaranteed to be false." I was just trying to point out the history of religions removing more and more of the actual testable claims. Because the testable claims are all that's left, since everything else has long since been proven to be false.
It sounds like we are talking past each other slightly. It sounds like you are trying to produce the probability that a proof is found that P=NP in a given year. However, I'm asking for the probability that ultimately P=NP. What would you average that over? What would your input data be?
I understand that people work up all sorts of heuristics for their daily lives, but that doesn't constitute scientific proof. They're still just heuristics.
> Because the testable claims are all that's left, since everything else has long since been proven to be false
How can testable claims be all that's left? If a claim is testable, then it can be proven or disproven. Did you mean untestable?