Flawed computer models may have exaggerated the effects of an Icelandic volcano eruption that has grounded tens of thousands of flights, stranded hundreds of thousands of passengers and cost businesses hundreds of millions of euros. The computer models that guided decisions to impose a no-fly zone across most of Europe in recent days are based on incomplete science and limited data, according to European officials. As a result, they may have over-stated the risks to the public, needlessly grounding flights and damaging businesses. "It is a black box in certain areas," Matthias Ruete, the EU's director-general for mobility and transport, said on Monday, noting that many of the assumptions in the computer models were not backed by scientific evidence. European authorities were not sure about scientific questions, such as what concentration of ash was hazardous for jet engines, or at what rate ash fell from the sky, Mr Ruete said. "It's one of the elements where, as far as I know, we're not quite clear about it," he admitted. He also noted that early results of the 40-odd test flights conducted over the weekend by European airlines, such as KLM and Air France, suggested that the risk was less than the computer models had indicated. – Financial Times
Dominant Social Theme: Computers let us down!
Free-Market Analysis: We find the emerging explanations for the mess in Europe to be both compelling and predictable. Perhaps officials were wise to ban all flights for a period of time. But we assumed in our innocence that the officials had done significant testing. We assumed they'd sent planes into the air to test what damage there could be from the ash. We assumed they'd calibrated the danger in real time. But apparently they hadn't.
In this article we will examine the EU's general reliance on computer modeling and the questions that arise from such abiding faith in statistical analysis. While there is certainly a difference between modeling the spread of volcanic ash and other kinds of statistical modeling, the overriding reliance on such analysis is, in our view, a troubling trend and one that may explain why Western governments' predictions are so often wrong in a variety of areas.
Computers are ubiquitous in Western society and play an ever-larger role in government decisions. The discipline in aggregate (as it applies to sociopolitical modeling) is called econometrics, and it partakes of the same optimism about computer modeling and statistical analysis as the modeling of future weather patterns – or even volcanic ash. The question arises (or remains): Does the increase in the use of computers and computer power improve the quality of government decision-making? Here's more from the Financial Times article:
While the US system leaves air carriers with the responsibility to determine whether or not it is safe to fly "the American model is not a model of less safety", he said. "You just need to look at the statistics to see that." Under European rules, member states have the power to decide whether or not their airspace should be open. But decisions during the past week have been guided by computer models from the Volcanic Ash Centre in London and Eurocontrol, an organisation that co-ordinates air travel.
European safety procedures on volcanic ash were put in place after two incidents involving British Airways and KLM jets in the 1980s, in which aircraft engines lost power after flying through ash above Indonesia and Alaska. In the wake of those events, the International Civil Aviation Organisation, a UN body that sets flight standards, asked air traffic controllers to develop contingency plans. Under these plans, the presence of ash prompted airspace to be restricted.
Mr Ruete said it would require more study, and backing from all 27 EU member states, to use a US-style system, which gives carriers greater latitude – and potential liability – to make such judgments. Mr Ruete's opinion was seconded by airlines, which have argued that the risks have been over-stated. "Our flights have shown that we can fly safely in these environments," said Aage Duenhaupt, a Lufthansa spokesperson. "The mathematics and the reality in the air have no correlation," he added, referring to computer models used by the Volcanic Ash Advisory Centre.
This is par for the course when it comes to the EU and its "member states" – the bureaucracy of which, in aggregate, is unmatched in the world, except perhaps by China. In fact, we think the problems of this past week are a good example of the misplaced faith that the EU has generally in computer models. Here's something from a recent UK Telegraph report on the issue:
It emerged that two vastly differing maps of the ash cloud were being circulated by official bodies.
One, prepared by the government-controlled Met Office and used by the CAA and Nats, showed the ash cloud covering much of Europe. A second, produced for the European air traffic control co-ordinator Eurocontrol limited the areas of danger to two high-density clouds over the Atlantic. This second map and extensive test flights which have shown no negative effects on aircraft helped persuade many other European countries to reopen their air space. European aviation ministers agreed to draw up plans to allow aircraft to fly in areas where the low concentration of ash from the Eyjafjoll volcano was not considered to pose a safety threat.
No matter the results of the maps, the reliance on computer modeling rather than first-hand testing is disheartening. Within the EU, there is a plethora of alphabet agencies that utilize computer modeling and forward-looking mathematical analysis to provide guidance to EU policy makers. Long-lived agencies such as the European Commission's Joint Research Centre mention econometric analysis as part of their methodologies. The Commission explains its mission thusly:
The Commission's job is to represent the common European interest to all the EU countries. To allow it to play its role as 'guardian of the treaties' and defender of the general interest, the Commission also has the right of initiative in the lawmaking process. This means that it proposes legislative acts for the European Parliament and the Council of Ministers to adopt.
The Commission's Internet material directly mentions econometrics as a core competency that helps the Commission provide "internationally accepted research material," as follows: "JRC work … includes econometric studies, data gathering and also projects related to clean technologies and the safety of chemicals or the harmonisation of data."
Econometrics is basically an Anglo-American invention, one developed in the late 19th century out of a schism between Austrian (free-market) economics and British economics. British economics, like all modern economics, recognized the fundamental insight of marginal utility – that only the market itself could provide dynamic or variable pricing for goods and services. But having recognized marginal utility as the dividing line between classical and neo-classical economics, British economists began to elaborate a mathematical language that would express economic concepts in the shorthand of descriptive equations.
A main early proponent of econometrics was the British economist Alfred Marshall who, Wikipedia tells us, "is credited with an attempt to put economics on a more mathematical footing. He was the first Professor of Economics at the University of Cambridge and his work, Principles of Economics coincided with the transition of the subject from 'political economy' to his favoured term, 'economics'." The socialist economist John Maynard Keynes is perhaps the most prominent econometrically minded economist of the 20th century.
The schism between Anglo-American econometrics and the Austrian (free-market) concept of human action is fundamental to an understanding of how modern Western governments have evolved and how they utilize economics as a tool to further policy-making objectives. While recognizing marginal utility – the idea that only markets can determine supply-and-demand pricing adjustments – Western policy makers have adopted economic computer modeling as a planning methodology.
Using econometrics to project future supply and demand remains controversial, however. Ludwig von Mises, in his great free-market book Human Action, explained how efforts to predict human behavior using mathematical models were likely doomed to failure. The main problem with any predictive analysis, Mises explained, was that human beings themselves were creatures of action – human action. Econometric modelers projected trends to their logical conclusion, but Mises and other Austrians held that such trend-projection was vitiated by human perceptions of, and reactions to, the very same problems the models were analyzing.
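The disagreement can be made concrete with a toy calculation. Below is a minimal sketch in Python, with every number invented for illustration (nothing here comes from any actual econometric model), of the Austrian objection: a trend fitted to past data extrapolates smoothly, but if people anticipate the projected outcome and adapt, the realized path bends away from the projection.

```python
import numpy as np

# Toy illustration (all numbers invented): naive trend extrapolation
# versus a path in which actors react to the forecast itself.

years = np.arange(10)
demand = 100 + 5 * years            # observed history: a clean linear trend

# Fit a straight line to the history and carry it ten years forward.
slope, intercept = np.polyfit(years, demand, 1)
future_years = np.arange(10, 20)
projection = intercept + slope * future_years

# Now suppose people anticipate the projected scarcity and adapt
# (substitution, new supply), damping the trend a little more each year.
damping = 0.85                      # assumed value, purely for illustration
realized = demand[-1]
for _ in future_years:
    realized += slope * damping
    damping *= 0.85                 # adaptive responses compound over time

print(f"trend carried to its 'logical conclusion': {projection[-1]:.1f}")
print(f"path blunted by human action: {realized:.1f}")
```

The particular damping factor is arbitrary; the point is that the fitted model contains no term for the reaction its own forecast provokes, which is precisely the Misesian complaint.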
This is actually an argument that can be traced back to Thomas Malthus, whose famous treatise, An Essay on the Principle of Population, was published in six editions between 1798 and 1826. Malthus predicted famine and a wide-scale die-off in Britain by projecting population growth against food production, and his pessimistic conclusions helped brand economics as "the dismal science."
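Malthus's projection can be stated compactly. On the standard reading of his essay (the formulation below is ours, not a quotation), population, left unchecked, grows geometrically while food production grows only arithmetically:

$$P_t = P_0(1+r)^t, \qquad F_t = F_0 + kt$$

For any growth rate $r > 0$, the ratio $F_t / P_t$ tends to zero as $t$ grows, so famine appears mathematically inevitable, however small $r$ may be.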
Fortunately, Malthus' conclusions never came true. And this is the crux of the argument between those who espouse econometrics and those who favor Austrian free-market analysis. History, at least in Malthus' case, gives free-market analysis a victory. Malthus' predictions didn't come true because people, faced with a growing population and, potentially, a lack of food, took Misesian human action well in advance of any food or grain shortage. They grew more food.
This sort of thing happens over and over again within the world of statistical analysis. It is the reason government planning is almost inevitably wrong. While certain large-scale trends may be projected forward with some (small) level of confidence, more detailed trend analysis almost never comes to fruition. The failure of such modeling can be seen in areas as disparate as climate (the infamous hockey stick graph predicting global warming comes to mind) and the most recent contretemps involving volcanic ash dispersal over Europe.
From the standpoint of policy-makers, of course, modeling (and now computer-assisted modeling, economic or not) is an indispensable tool when it comes to generating policy. Modeling, whether econometric or not, is actually a kind of dominant social theme. One can argue, in fact, that the power elite has been behind the growth and acceptance of econometric disciplines because they provide an underlying justification for government action.
It is possible to model (and project) almost any problematic trend these days using econometric tools and computer power. The ability to project trends forward with seeming mathematical certainty is of great aid in supporting the power elite's manifold fear-based campaigns. From the spread of AIDS to the inevitability of global warming and other looming human catastrophes, econometrics provides a fundamental justification for government action to "prevent" the projected catastrophe.
Econometrics explains the confidence that Western governments, especially, have in predicting what is to come and the certainty of the legal remedies being offered. Misesian human action explains why the remedies founder and fail. The recent, controversial modeling of Icelandic volcanic ash – while not specifically an econometric model – is a case in point.