The consequences of the Japanese earthquake - especially the ongoing crisis at the Fukushima nuclear power plant - resonate grimly for observers of the American financial crash that precipitated the Great Recession. Both events provide stark lessons about risks, and about how badly markets and societies can manage them.
Of course, in one sense, there is no comparison between the tragedy of the earthquake - which has left more than 25,000 people dead or missing - and the financial crisis, to which no such acute physical suffering can be attributed. But when it comes to the nuclear meltdown at Fukushima, there is a common theme in the two events.
Experts in both the nuclear and finance industries assured us that new technology had all but eliminated the risk of catastrophe. Events proved them wrong: not only did the risks exist, but their consequences were so enormous that they easily erased all the supposed benefits of the systems that industry leaders promoted.
Before the Great Recession, America's economic gurus - from the head of the Federal Reserve to the titans of finance - boasted that we had learned to master risk. "Innovative" financial instruments such as derivatives and credit-default swaps enabled the distribution of risk throughout the economy. We now know that they deluded not only the rest of society, but even themselves.
These wizards of finance, it turned out, didn't understand the intricacies of risk, let alone the dangers posed by "fat-tail distributions" - a statistical term for rare events with huge consequences, sometimes called "black swans". Events that were supposed to happen once in a century - or even once in the lifetime of the universe - seemed to happen every ten years. Worse, not only was the frequency of these events vastly underestimated; so was the astronomical damage they would cause - something like the meltdowns that keep dogging the nuclear industry.
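The gap between predicted and observed frequencies can be made concrete with a small calculation. The sketch below is purely illustrative and not from the article: it assumes a loss threshold that a thin-tailed (normal) model labels a once-in-a-century event, then asks how often the same loss occurs under a fat-tailed Student-t distribution with 3 degrees of freedom. Both the 1% threshold and the choice of 3 degrees of freedom are assumptions made for the example.

```python
# Illustrative sketch: how a "once-in-a-century" event under a thin-tailed model
# becomes far more frequent under a fat-tailed one. Threshold and degrees of
# freedom are assumed values, chosen only to make the comparison visible.
from scipy.stats import norm, t

# Loss size that a normal (thin-tailed) model calls a 1-in-100 event (~2.33 sigma)
threshold = norm.ppf(0.99)

p_normal = norm.sf(threshold)       # tail probability under the normal model: 0.01
p_fat = t.sf(threshold, df=3)       # tail probability under a Student-t with 3 d.o.f.

print(f"Threshold: {threshold:.2f} standard deviations")
print(f"Thin-tailed model: once every {1 / p_normal:.0f} periods")
print(f"Fat-tailed model:  once every {1 / p_fat:.0f} periods")
```

Under these assumptions the fat-tailed model puts the "century" event at roughly once every twenty periods - the same order-of-magnitude compression the paragraph above describes.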
Research in economics and psychology helps us understand why we do such a bad job in managing these risks. We have little empirical basis for judging rare events, so it is difficult to arrive at good estimates. In such circumstances, more than wishful thinking can come into play: we might have few incentives to think hard at all. On the contrary, when others bear the costs of mistakes, the incentives favour self-delusion. A system that socialises losses and privatises gains is doomed to mismanage risk.
...
Click here to read the rest.