In recent posts I’ve looked at how untested software can skew results, and at dishonest scientists who manufacture data. But supply-side issues aren’t the only problem with Big Data. We also have to look at how humans deal with information we’ve never had before.
Take climate change. We know the planet is getting warmer. Almost all competent scientists agree that increased atmospheric CO2, a byproduct of fossil fuel consumption, is the cause. Other theories – sunspots, cloud cover, solar variations, and a worldwide conspiracy of grant-hungry scientists – have been examined and found wanting.
The impact promises to be catastrophic: hundreds of millions of people displaced, dwarfing the current refugee tragedy in Syria, trillions of dollars in property losses, tropical diseases spreading into formerly temperate climes, and more.
Yet we have the spectacle of a US Senator throwing a snowball on the Senate floor as if one cold day disproved a warming trend. And a candidate for President of the United States, the world’s leading economy, called global warming BS because it got cold in New York in the winter.
Inherent in Big Data are the tools we use to make sense of it: statistics. Before the 2012 election, some pundits claimed that Mitt Romney was actually ahead by “unskewing” poll results, reweighting each sample to match the party-ID mix they believed the electorate should have.
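A toy sketch of how that kind of reweighting works, using entirely hypothetical numbers rather than any real 2012 poll: the same survey responses produce a different topline once you swap in the party-ID mix you wish the sample had.

```python
# Illustration (hypothetical numbers) of "unskewing" a poll:
# reweighting the sample to an assumed party-ID mix changes the
# topline result without collecting any new data.

def topline(support_by_group, weights):
    """Weighted average of candidate support across party-ID groups."""
    total = sum(weights.values())
    return sum(support_by_group[g] * w for g, w in weights.items()) / total

# Hypothetical poll: candidate A's support within each party-ID group
support = {"Dem": 0.10, "Rep": 0.92, "Ind": 0.48}

# Composition of the sample the pollster actually interviewed
sample_mix = {"Dem": 0.38, "Rep": 0.32, "Ind": 0.30}

# The "unskewer's" preferred composition, borrowed from a past election
assumed_mix = {"Dem": 0.32, "Rep": 0.36, "Ind": 0.32}

print(f"As polled:  {topline(support, sample_mix):.1%}")   # 47.6%
print(f"'Unskewed': {topline(support, assumed_mix):.1%}")  # 51.7%
```

With these made-up figures the candidate jumps about four points purely from the analyst's assumption about turnout; the underlying interviews never changed. That is the trap: the statistics are arithmetic, but the conclusion is only as good as the assumptions fed into them.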