There is a way of thinking known as Bayesian inference that allows a hypothesis to be updated as new data surfaces. Steven Pinker, in his book Enlightenment Now, gives a good example of how this works when he describes the thinking of people known as “superforecasters.” When one of these people is asked the likelihood of an event, they begin with an objective base rate drawn from statistics. Then, to sharpen the prediction, they gather further evidence about the specific situation they were asked about and adjust their estimate accordingly.
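To make the base-rate-plus-evidence idea concrete, here is a minimal sketch of a single Bayesian update in Python. The numbers (a 10% base rate, evidence that shows up 80% of the time when the event is coming and 30% of the time when it is not) are hypothetical and chosen purely for illustration.

```python
def bayesian_update(prior, likelihood_if_true, likelihood_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1 - prior)
    return numerator / denominator

# Hypothetical numbers for illustration only.
prior = 0.10                  # objective base rate: 10% of similar events occur
p_evidence_if_true = 0.80     # evidence appears 80% of the time when the event is coming
p_evidence_if_false = 0.30    # but also 30% of the time when it is not

posterior = bayesian_update(prior, p_evidence_if_true, p_evidence_if_false)
print(f"Updated probability: {posterior:.2f}")  # ~0.23, up from the 0.10 base rate
```

The point of the exercise is the structure, not the numbers: the base rate anchors the forecast, and each new piece of evidence only nudges it.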
This way of thinking reminds me of the way engineers do back-of-the-napkin calculations. They begin with a broad picture of how the technical factors might affect a project’s success and narrow the scope until they have an estimate for their specific case. The trick to becoming a “superforecaster” and using Bayesian inference successfully is to set emotion aside and look at every prediction the same way an engineer looks at a rough project estimate. That is easier said than done when forecasting questions can be as charged as “What is the likelihood of an Islamic terror attack on US soil in the next year?” A question like that carries a lot of baggage, and an accurate assessment still depends on keeping emotion out of the calculation.