Warning

Long read. 1603 words

Whoever thought more information would improve the world was more wrong than anyone has been wrong in the history of being wrong.

It’s not that people aren’t reading. Twitter and other social networks make it easier than ever for people to read high-quality information in bite-sized pieces. The average internet user in 2022 probably reads more words each day than even the most well-read person alive 100 years ago.

The real problem with all this information is that people aren’t concurrently learning how to think about the words and ideas they’re reading. Grifters take greedy advantage of this imbalance and propagandists are having a field day. As Marshall McLuhan once said, the literate man is the natural sucker for propaganda.

Just as with learning a language, it becomes nearly impossible to teach a person to think differently past a certain age. Adults tend to get stuck in their ways, like a walking trail cut through snow: each pass digs it deeper, making it harder to jump over the side into fresh snow and trace a new path.

The intelligence community is well aware of the difficulties of training people into and out of habits. Yet these agencies regularly retrain 55-year-old officers to speak an entirely new language fluently in about half a year. They can also take any smart person off the streets, teach them structured analytical techniques and have them producing robust intelligence estimates in perhaps a dozen weeks.

In other words, thinking correctly, usefully and efficiently can be taught to anyone at any time. The trick is in applying those techniques in real time, or at least within a set time window when making a good decision actually matters. Thinking is both a skill and an art, a lot like learning how to shoot a basketball, swing a hammer or flip a pizza.

The first lesson new analysts learn is to reject binaries. When an intelligence customer asks a question, it is unwise to say “Yes” or “No”, “It will happen”, or “It won’t happen”. By offering binary answers like these, the analyst is claiming to know more than a person ever could: how the future will pan out. The customer needs information that will help them adapt to, not predict, the future. Binaries are inflexible by definition.

A better way to analyse the firehose of information is to place a percentage value on the answer. If the question is about forecasting, a value of 0% at one end of the spectrum indicates the analyst is sure the event will not happen; at the other end, 100% indicates the analyst is sure it will.

To find a percentage value between those boundaries, the analyst breaks the raw information down into its variables, places a value on each factor and then averages them. Done correctly, this produces an estimate of either the veracity of a story or the likelihood of an event.

For example, consider this story: China’s leader Xi Jinping is suffering from a brain aneurysm and wants to be treated with traditional medicine, reports claim. You could ask whether the story is true or not. But that would be thinking in binaries. The better question is: what percentage value would you place on the veracity of this story?

To figure out the answer, you would need to know that the precise medical condition of any head of state is a closely guarded secret. Knowing the medical vulnerabilities (or strengths) of a leader can be invaluable information for foreign intelligence agencies since the opportunities for political exploitation and meddling are endless.

For this reason, intelligence agencies will wrap all information about a leader’s exact medical condition in so much misinformation, disinformation and outright falsity that it is nearly impossible for an observing agency to be sure of what’s going on. Perhaps only the leader and their doctor know for sure. The Chinese government takes the secrecy of its president’s health very seriously.

The next question concerns the validity of the source. In this case, the person making the medical claim is a Canada-based blogger whose video was quickly censored by Chinese social networks. In intelligence, who you are is irrelevant. All that matters is whether you have access. So does this blogger have the necessary access to Xi Jinping to be making such a claim?

This is where the analysis becomes projection, because it is unclear who the blogger’s source is. Canada has many Chinese immigrants, but those people departed China precisely because they were nobodies back home, and nobodies generally don’t have good access to anything. Sure, China often leverages its diaspora to collect intelligence on their adopted country (this happens in New Zealand, too). But an emigrant can safely be assumed not to have the necessary access to Xi’s inner party back in China.

So, what percentage values can we apply to these factors? For the issue of systems (the secrecy surrounding a leader’s medical status) I would apply a value of 7%, given the realities. For the issue of source (does the blogger have access?) I would give a value of 11%, again given the realities.

This leaves us with an average provisional value for the story’s veracity of 9% or “very low”. This figure leaves some room for coincidence. After all, the blogger may be right, but for the wrong reasons.
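
For readers who want the arithmetic spelled out, here is a minimal sketch of the averaging described above. The factor names are mine, and the 7% and 11% figures are the illustrative values from the example:

```python
# A minimal sketch of the scoring described above. The factor names are mine;
# the values (7% and 11%) are the illustrative ones from the example.

factors = {
    "secrecy_around_leader_health": 0.07,  # how likely such information leaks at all
    "blogger_access_to_xi": 0.11,          # how likely the source has real access
}

# Provisional veracity estimate: the simple average of the factor values.
veracity = sum(factors.values()) / len(factors)

print(f"Provisional veracity: {veracity:.0%}")  # Provisional veracity: 9%
```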

Also, the values are provisional because this was a rough assessment. By adding more variables, the analyst can generate a greater number of percentage values and create a more robust final estimate. The information would also need to be cross-referenced with other data and stories to locate any curious patterns.

Perhaps your intuition could reach the same estimate, but intuition isn’t a metric. And metrics are crucial when thinking about the likelihood of something happening.

One of the best examples of clear thinking about likelihood comes from the Soviet Union. Although Nazi Germany and the Soviet Union signed the Molotov–Ribbentrop Pact in 1939, neither side fully expected the non-aggression treaty to hold. So Joseph Stalin tasked Soviet intelligence (then the NKVD, the forerunner of the KGB) with assessing the likelihood of Nazi Germany surprising Soviet forces with an attack.

Where to start on a task like this? The agency faced countless factors and variables at every level. It couldn’t analyse them all at once, so it looked for a handful of key factors which, if they changed, would indicate with a robust percentage value that Nazi Germany was strong enough to win and therefore would attack.

Two factors stood out. Given Russia’s harsh winters, if the Wehrmacht were to launch an invasion, it would need vast stockpiles of antifreeze for its vehicles so the engines would keep working through the cold months. It would also need plenty of wool for uniforms so German soldiers could stay warm while fighting.

By blocking out all other data points and monitoring just these (and a few other) factors, Soviet intelligence had found a way to deliver a confident briefing to Stalin every day about the likelihood of a German invasion: if those stockpiles were sufficient, Hitler was ready to invade; if they were too low, it didn’t matter whether Hitler attacked, because the Soviets would win.
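
The tripwire logic is simple enough to sketch in code. The indicator names and threshold figures below are invented for illustration (the actual numbers the Soviets watched are not given here); what matters is the structure: watch a handful of decisive variables and report only on whether they have crossed their thresholds.

```python
# An illustrative tripwire monitor. The indicator names and thresholds are
# invented for this sketch; only the structure matters: watch a handful of
# decisive variables and report whether all of them have crossed a threshold.

from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    threshold: float  # level that would signal readiness to attack
    observed: float   # latest observed level

    def tripped(self) -> bool:
        return self.observed >= self.threshold

def invasion_likely(indicators: list[Indicator]) -> bool:
    # The tripwire fires only if every key stockpile is sufficient.
    return all(ind.tripped() for ind in indicators)

daily_report = [
    Indicator("antifreeze_stockpile_tonnes", threshold=50_000, observed=12_000),
    Indicator("wool_stockpile_tonnes", threshold=80_000, observed=30_000),
]

print("Invasion likely" if invasion_likely(daily_report) else "Not yet ready to invade")
```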

As history shows, the Germans did attack in June 1941, and we know how that ended for the Nazis. Many historians have wondered why Stalin did not purge or dismiss the intelligence officers responsible for the estimate. The answer is simple: they were wrong about the invasion date, but for the right reasons. They knew the real question was the potential for Soviet victory, not picking the time of a surprise attack.

Tripwires, red flags, bottlenecks – call this analytical technique what you like. It shows the importance of monitoring the right variables when trying to forecast the future. Given this process of thinking, did the New Zealand government isolate the right variables when analysing the threat of Covid-19? Did it know what variables to look for in 2020? What are the implications for its modelling if the government didn’t know these things?

Learning to use structured analytical techniques has a lot in common with learning a new language. As with most things, a person’s worst enemy is their own ego. That’s why the most efficient way to learn a new language is to copy a movie actor or actress who speaks it: by mimicking that person, your ego does not fight back as your identity changes. The same goes for learning how to think.

Since your worldview is bound to change after applying better analytical techniques, the brain will feel unmoored and under attack as the old ways of knowing are rejected. Mathematics itself is a language, so the same mimicry trick applies: pretend you are a different person looking at the information. How would a sailor read this data? How would a mother? You might be surprised by how quickly you start to think clearly.

Put it this way: the world isn’t going to magically get less complicated. The last two years have shown that futurist Alvin Toffler’s prediction was correct: “The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn and relearn.”

Nathan Smith is a former business journalist and columnist at the NBR. He also worked as the chief editor at the New Zealand Initiative policy think tank. He is now a freelance writer and copy editor.