For context, this was written at a time when there was still a lot of confusion about how to represent uncertainty. There were at least three different ideas besides using probabilities, and in fact probabilities were considered not good enough.
Ah, it references "fuzzy logic". That was one system that really needed to go away. I actually took a course in multivalued logic at UCB in the 80s. Fuzzy logic was "kinda like probability but with no logical basis, just a cute name".
That said, I think "quantitative measures of likelihood that we can't easily reduce to probabilities" deserve study.
I mean, it seems like, in a rough, "fuzzy" sort of way, neural network activations are such measures of likelihood, and they certainly are used.
Fuzzy "logic" is best used when you take out all the references to logic and just think about fuzzy sets -- then it's just fuzzy set theory, which is useful when you need to model partial set membership.
I enjoyed reading Probabilistic Logic Networks (https://www.amazon.com/Probabilistic-Logic-Networks-Comprehe...) as it introduces a possible framework for sanely dealing with different kinds of truth values about different things, which aren't necessarily a single number.
I don't mean to be negative here, but you can axiomatize anything if by "axiomatize" one just means "describe what you do".
It took quite a while, but Kolmogorov and company actually formalized what is really meant by a "random sequence". With that, you have a strong description of what real-world behavior is expected, asymptotically, when a probability is assigned to a behavior. As far as I can tell, all that's being axiomatized here is how the arbitrary quantities people make up for the fuzzy logic of things get manipulated. I.e., there's no description of the relation between fuzzy logic and "reality" (because if there were, it would map to either regular logic or probability).
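To make the asymptotic point concrete, here's a toy simulation of my own (the value of p is arbitrary): assigning probability p to an outcome is a testable claim that its long-run frequency approaches p, which is exactly the kind of tie to reality a fuzzy membership degree doesn't come with:

    # Assigning probability p to an event predicts its empirical
    # frequency converges to p (law of large numbers).
    import random

    random.seed(0)
    p = 0.3                            # the assigned probability
    for n in (100, 10_000, 1_000_000):
        heads = sum(random.random() < p for _ in range(n))
        print(f"n={n:>9}: frequency = {heads / n:.4f}")
    # The printed frequencies approach 0.3 as n grows.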
Judea Pearl has added considerably to this line of thinking (Bayesian Networks), by annotating them with causality. This allows us to explicitly model the traditional notion of "hidden variables", or "confounders", and make sound inferences in many (most?) practical cases even when one can't directly observe -- but merely suspect the existence of -- some hidden cause.
I thoroughly enjoyed his "The Book of Why", a lay introduction to this subject.
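For the curious, here's a minimal sketch of the backdoor adjustment Pearl formalized, in the simplest case where the confounder Z is actually observed (his do-calculus extends to some cases where it isn't, e.g. via the front-door criterion); all the numbers below are invented:

    # Backdoor adjustment with one observed confounder Z:
    #   P(Y=1 | do(X=x)) = sum_z P(Y=1 | X=x, Z=z) * P(Z=z)
    P_Z = {0: 0.6, 1: 0.4}                  # distribution of the confounder
    P_Y_given_XZ = {                        # P(Y=1 | X=x, Z=z), invented
        (1, 0): 0.8, (1, 1): 0.3,
        (0, 0): 0.6, (0, 1): 0.1,
    }

    def p_y_do_x(x: int) -> float:
        """Interventional probability P(Y=1 | do(X=x)) via adjustment."""
        return sum(P_Y_given_XZ[(x, z)] * P_Z[z] for z in P_Z)

    effect = p_y_do_x(1) - p_y_do_x(0)      # average causal effect
    print(f"P(Y=1|do(X=1)) = {p_y_do_x(1):.2f}, ACE = {effect:.2f}")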
It is worth saying that there are still situations where any use of quantitative probabilities becomes something of an abuse. The most extreme example is Pascal's Wager[1]; if you can assign a "small but meaningful" probability to any X you happen to mention in the discussion, you can assign a probability to the existence of the Great Old Ones or the Flying Spaghetti Monster or whatever implausible entity is going to create a hypothetical action of negative utility sufficient to counter its unlikeliness.
And, of course, acknowledging some stuff outside of the domain of probability means you need a fuzzy border between two realms, which also can't be determined by probability.
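To see the abuse concretely, here's a toy expected-utility calculation (the numbers are invented): any "small but meaningful" prior can be swamped by a sufficiently large hypothesized stake:

    # Pascal's-Wager-style arithmetic: a tiny probability times a huge
    # hypothesized utility dominates any everyday consideration.
    p_entity = 1e-9              # "small but meaningful" prior
    payoff_if_real = -1e12       # hypothesized utility if the entity is real
    expected_loss = p_entity * payoff_if_real
    print(expected_loss)         # -1000.0: outweighs ordinary stakes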
This is not the only problem - there is also the small problem of worshiping the wrong god if the real god is jealous and vengeful. Will this god be more angry if you worship the wrong god or no god?
That's not much of a problem for Pascal, because he made his wager in an environment where the cultural bias regarding god was that there was either the Christian god or no god.
I would be pretty certain that Pascal knew there were other gods worshiped in the world. He might have thought all the other gods were false gods, but he would have been aware of the concept of different gods.
Still doesn't change the problem of worshiping the wrong god. Is it better to worship no god than worship the wrong god?
I'd claim assigning a probability to the existence of God is problematic in itself. Could you assign a probability to the inconsistency of mathematics?
Under the Bayesian school of thought, probability represents our degree of belief and is ultimately subjective. There is no a priori reason I can't assign a probability to the existence of God, since the probability reflects my belief about God's existence. Evidence lets us update our beliefs, and therefore our probabilities.
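For concreteness, a minimal sketch of such an update via Bayes' rule, with an invented prior and invented likelihoods:

    # Bayesian updating: a subjective prior revised by evidence.
    prior = 0.01                       # subjective P(hypothesis)
    p_evidence_given_h = 0.90          # P(evidence | hypothesis)
    p_evidence_given_not_h = 0.05      # P(evidence | not hypothesis)

    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    posterior = p_evidence_given_h * prior / p_evidence
    print(f"posterior = {posterior:.3f}")   # ~0.154: belief revised upward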
While I have no issue at all with assigning a probability to the inconsistency of mathematics, the value I'd assign varies with the branch of mathematics. For Zermelo–Fraenkel set theory, for example, I'd assign a probability very, very close to 0, but not equal to zero (because to be equal to zero, I'd need a proof).
The inconsistency of math was a trick question. It doesn't matter what probability you assign to it since if math is inconsistent, all numbers are equal!
It does, insofar as you can't express anything meaningful in an inconsistent system.
A formal system being inconsistent means being able to prove some statement A and also its negation ~A.
If both A and ~A are provable, then we can prove that any other statement in the system is also true (the principle of explosion).
A quick proof:
1. A
2. A v B
3. ~A & (A v B)
4. B
We start with A.
The disjunction (logical or) of a true statement (here A) and any other statement (say B) is a true statement, thus A v B.
Then we introduce another true statement, ~A, via logical-and to get ~A & (A v B), which simplifies (disjunctive syllogism) to just B.
So we have proved B, but B was arbitrary.
It could be anything, including the statement that x = y for x and y two ostensibly unequal numbers.
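The same derivation can be checked mechanically; here's a sketch in Lean 4 following the disjunctive-syllogism route above (the names are illustrative):

    -- From A and ¬A we can prove any B: the principle of explosion.
    example (A B : Prop) (ha : A) (hna : ¬A) : B := by
      have hab : A ∨ B := Or.inl ha   -- disjunction introduction: A ⊢ A ∨ B
      cases hab with
      | inl h => exact absurd h hna   -- the A branch contradicts ¬A
      | inr h => exact h              -- the B branch gives B directly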
They also imply that futures with a large number of humans, for someone who cares about that as part of their utility, are incredibly unlikely. This is implausible.
>The most extreme example is Pascal's Wager[1]; if you can assign a "small but meaningful" probability to any X you happen to mention in the discussion, you can assign a probability to the existence of the Great Old Ones or the Flying Spaghetti Monster or whatever implausible entity is going to create a hypothetical action of negative utility sufficient to counter its unlikeliness.
Yes, but Pascal used the big prior of his cultural upbringing and 2.5+ millennia of Jewish and Christian religion to consider just one God plausible, not every random possible entity.
Judea Pearl did much to overturn this view by inventing Bayesian networks. More here: https://ftp.cs.ucla.edu/pub/stat_ser/r476.pdf