• “Mistaken Requirement for Infinite Certainty: One major issue with Brad’s argument is the implicit assumption that rational belief requires absolute certainty, or a 100% degree of belief. This is a fallacious requirement, as rationality in belief merely requires that one’s confidence level is proportionate to the evidence available, not that it reaches an impossible standard of absolute certainty.” This quote identifies the central fallacy in Brad’s argument: treating anything short of absolute certainty as insufficient for justified belief.
  • “Pragmatic Contradiction: Brad’s argument, if taken to its logical conclusion, would lead to epistemic paralysis. If one must continually lower their confidence in any belief due to the recursive nature of justification, no belief would ever be justified enough to act upon.” This highlights a pragmatic contradiction in Brad’s argument: taken seriously, it would deadlock all decision-making.
  • “On Rationality as a Mapping to Evidence: The notion that rationality is essentially a degree of belief that maps to the degree of the relevant evidence is fundamentally sound, with some caveats. Rationality involves aligning one’s confidence in a proposition with the weight of the evidence supporting it. This approach acknowledges human cognitive limits and the provisional nature of most knowledge.” This statement captures the core claim that rational confidence should track the weight of the evidence, while acknowledging cognitive limits and the provisional nature of human knowledge.
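As a concrete, simplified illustration of what “mapping a degree of belief to the degree of the evidence” can mean formally, here is a minimal Bayesian-updating sketch in Python. The prior, the likelihoods, and the resulting 75% figure (chosen to echo the number Brad uses for P) are illustrative assumptions, not anything stated in the thread:

```python
# A minimal sketch of proportioning belief to evidence via Bayes' rule.
# All numbers below are illustrative assumptions.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the updated confidence in a proposition P after observing
    evidence E, given a prior for P and the likelihood of E under P
    and under not-P."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Start agnostic (50%), then observe evidence that is three times as
# likely if P is true as if P is false; confidence rises to 75%,
# mirroring the figure Brad assigns to P.
print(posterior(0.5, 0.75, 0.25))  # 0.75
```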

Based on the exchange on epistemology below, do as instructed at the end of this prompt:

Phil: The other day I encountered an individual (whom we’ll call Brad) who suggested that, based on my notion that rational belief is a degree of belief that maps to the corresponding degree of perceived evidence, I would be required not only to assess the reliability of my mental faculties in evaluating a given proposition, but also to assess the reliability of the faculties doing that assessment, and then to repeat this assessment recursively ad infinitum until I had no justification for any significant degree of belief in the initial proposition.

Here is Brad’s explanation.

So suppose I agree with you that, given my past experience and familiarity with my own fallibility, I make sure to always proportion my degree of belief to the evidence. Some proposition, call it P, presents itself. I evaluate the evidence for P and decide there is a good amount of evidence in its favor; say, for the sake of argument, that I think it is 75% likely to be true on the evidence. I determine that I have good reason to believe that P. But that determination is, itself, a reasoning process that I have a belief about, namely the belief that my reasoning process has properly arrived at a proper assessment of the probability of P given the evidence. We will call this belief about the likelihood of P “Q”. My experience with my ability to assess probabilities given evidence tells me that I should put 95% confidence in Q. But if that is true, then I need to lower my confidence in P, since Q tells me that I could be wrong about P being 75% likely. My confidence in Q is itself a belief I am not certain of; call that belief “R”: it says that it is 95% likely that Q is right. But that means that I should be 95% sure that I am 95% sure that I am 75% sure of P. And each time I iterate this and reflect on my certitude, I must lower my confidence in P until it approaches the point where I cease to be confident that P is true at all.
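For concreteness, here is a minimal sketch of the arithmetic Brad appears to have in mind. The multiplicative reading, where each meta-level of reflection simply discounts the level below by a further factor of 0.95, is an assumption; Brad never says exactly how the levels are supposed to combine:

```python
# A sketch of the regress Brad describes, on a purely multiplicative
# reading: start at 75% confidence in P, then discount by 95% for each
# further meta-level of reflection (Q, R, ...). How the levels combine
# is an assumption; Brad does not spell it out.

def discounted_confidence(base=0.75, meta=0.95, levels=20):
    """Confidence in P after `levels` rounds of meta-reflection,
    if each round multiplies in another factor of `meta`."""
    confidence = base
    for n in range(1, levels + 1):
        confidence *= meta
        print(f"after meta-level {n}: {confidence:.4f}")
    return confidence

discounted_confidence()
# On this reading, confidence decays geometrically (0.75 * 0.95**n),
# approaching zero in the limit, which is the collapse Brad predicts.
```

Whether stacking confidences multiplicatively like this is the right model of meta-level uncertainty is, of course, precisely what the criticisms above call into question.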

  1. Steel-man Brad’s argument.
  2. Lay out any fallacies and blunders in Brad’s argument.
  3. Weigh in on the notion that rationality is essentially a degree of belief that maps to the degree of the relevant evidence.

Comment on the coherence and utility of the proposed taxonomy of rationality types below.

Core Rationality: A basic level of rationality that only reflects the degree of epistemic self-honesty. It is encapsulated in the statement, “rational belief is a degree of belief that maps to the degree of the perceived relevant evidence.” To the degree one is not mapping their degree of confidence to the degree of the balance of confirming/disconfirming evidence for a proposition, to that degree they are exhibiting core irrationality.

Deep Rationality: A type of rationality that extends core rationality through additional skills of rational thought that include but are not limited to the following:

  • A working knowledge of probability theory.
  • Deep familiarity with the deductive and inductive tools of science.
  • An awareness of logical fallacies & cognitive biases.
  • A firm understanding of material reality and its intrinsic probabilities.

Let me push back and elaborate on that proposed taxonomy of types of rationality. Core rationality is binary. Either you have honestly mapped your degree of belief to the degree of the evidence as you’ve perceived it, or you have not. And to the degree you have not, to that degree you are irrational. Deep rationality, in contrast, is largely degreed since the skills foundational to deep rationality arrive incrementally. So while core rationality is fully available to everyone, deep rationality is acquired in degrees through a focused intention to improve the relevant skills. Does this make sense?


Create a 10-item quiz on the entire thread above. Manually number each quiz item. Add the list of answers after the list of questions. Do not indicate the correct answers in the items, but only include the correct answers in the final answer section.


Provide 15 discussion questions relevant to the content above.




Phil Stilwell

Phil picked up a BA in Philosophy a couple of decades ago. He occasionally teaches philosophy and critical thinking courses in university and industry. He is joined here by ChatGPT 4, GEMINI, CLAUDE, and occasionally Copilot, his far more intelligent AI friends. The five of them discuss and debate a wide variety of philosophical topics we think you’ll enjoy.

Phil curates the content and guides the discussion, primarily through questions. At times there are disagreements, and you may find the banter interesting.






