
Presentation

Doubt counts among the most celebrated epistemic virtues. It is certainly essential to scientific inquiry. More generally, critical thinking seems to be the right stance towards most of the complex and uncertain issues that arise in everyone’s life. The wise person always keeps her mind open, as even the most implausible hypothesis might eventually prove true.

In many real-life situations, however, doubt may appear unreasonable. Practical necessity often compels us to stop doubting and take action: despite pervasive uncertainty and conflicting information about issues as complex and important as climate change, healthy eating, or medical and financial risks, one has no choice but to make decisions. Complete certainty is not attainable, so decisions must rest on a trade-off between the degree of uncertainty and the price of a possible mistake. In other words, there seems to be a context-dependent confirmation threshold below which it is reasonable to doubt a hypothesis and seek more information: as prescribed by the legal standard of proof for criminal trials in Common Law systems, jurors should reach a very high level of confidence (‘Beyond a Reasonable Doubt’) in the hypothesis of a defendant’s guilt before convicting him; doctors or policy-makers, however, may have to act upon less strongly confirmed hypotheses, depending on the relative costs and benefits of action and inaction.
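One minimal way of making such a threshold precise, sketched here purely for illustration, is the standard expected-utility analysis: writing p for the decision-maker’s degree of belief that the hypothesis (say, the defendant’s guilt) is true, and U(c|g), U(c|i), U(a|g), U(a|i) for the utilities of convicting or acquitting a guilty or innocent defendant, conviction maximises expected utility exactly when

\[
p\,U(c \mid g) + (1-p)\,U(c \mid i) \;\ge\; p\,U(a \mid g) + (1-p)\,U(a \mid i),
\qquad\text{i.e.}\qquad
p \;\ge\; \frac{U(a \mid i) - U(c \mid i)}{\bigl[U(a \mid i) - U(c \mid i)\bigr] + \bigl[U(c \mid g) - U(a \mid g)\bigr]}.
\]

On this sketch, the threshold rises with the cost of convicting the innocent relative to that of acquitting the guilty, which captures why criminal trials demand a much higher degree of confidence than, say, precautionary public-health decisions. As noted below, however, this basic decision-theoretic description raises problems of its own.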

Moreover, even when no practical decision is immediately at stake, following the sceptic’s recommendation to keep one’s mind open to any hypothesis other than the most widely held one need not always be reasonable. What does (desirable) ‘open-mindedness’ actually mean? Open-mindedness is often misused as a slogan in support of highly unreasonable, and sometimes harmful, conjectures. Under the guise of critical thinking, many conspiracy theories, supposedly sceptical in spirit, result in a stubborn and irrational rejection of science and expertise. The juror played by Henry Fonda in Twelve Angry Men is probably right to consider any conceivable scenario, however implausible it may seem at first. But doubting, in the name of critical thinking, that the Earth is round or that any plane hit the Twin Towers on 9/11 seems clearly misguided and unreasonable.

There is a ‘commonsense’ intuition to this effect, but drawing the boundary between healthy critical thinking and undue scepticism, be it social paranoia or over-cautiousness, is not a trivial task. In what kinds of situation is doubt unreasonable? According to what criteria? Who is entitled to judge, and by virtue of what credentials? Even as a legal concept, ‘reasonable doubt’ lacks a clear definition: neither legal theory nor judicial practice has ever managed to provide a consensual and implementable construal of this standard of proof. Nor is reasonable doubt any more straightforward to define from a general reasoning and decision-making perspective: its basic decision-theoretic description raises several problems, calling for closer epistemological and psychological attention.


Specific issues to be addressed:


  • The meaning of Reasonable Doubt as a legal standard of proof.


E.g. How does ‘reasonable doubt’ relate to other legal concepts such as the ‘burden of proof’ and the ‘presumption of innocence’? What do different jurisdictions say about it? How should it be understood, and how is it understood in practice? What conception of mental states and belief change makes the most sense of ‘reasonable doubt’ as a standard of proof? etc.

 

  •  Reasonable Doubt as a norm of reasoning and decision-making in general.


E.g. How should the so-called ‘precautionary principle’ be understood and implemented? How should experts communicate risk and uncertainty? How should cognitive resources be allocated? How can uncertainty and ignorance be quantified in real-world, complex decisions? What would a Bayesian decision-theoretic account of ‘reasonable doubt’ look like? etc.


  •  Unreasonable doubts: conspiracy theories and science denial.


E.g. Are there normative epistemological criteria for telling reasonable doubts apart from unreasonable ones? How can critical thinking be promoted without encouraging science denial and undermining the credibility of scientists? What are the cognitive underpinnings of the spread of misinformation and conspiracy theories? What would a truly democratic organisation of information and the media look like? etc.


More generally, any proposal on how experts (climate scientists, forensic scientists, health professionals, etc.), decision-makers, and journalists who have to pass information on to the public deal with doubt and uncertainty in their day-to-day work will be considered with interest.
