The consequences of the Japanese earthquake - especially the ongoing crisis at the Fukushima nuclear power plant - resonate grimly for observers of the American financial crash that precipitated the Great Recession. Both events provide stark lessons about risks, and about how badly markets and societies can manage them.
Of course, in one sense, there is no comparison between the tragedy of the earthquake - which has left more than 25,000 people dead or missing - and the financial crisis, to which no such acute physical suffering can be attributed. But when it comes to the nuclear meltdown at Fukushima, there is a common theme in the two events.
Experts in both the nuclear and finance industries assured us that new technology had all but eliminated the risk of catastrophe. Events proved them wrong: not only did the risks exist, but their consequences were so enormous that they easily erased all the supposed benefits of the systems that industry leaders promoted.
Before the Great Recession, America's economic gurus - from the head of the Federal Reserve to the titans of finance - boasted that we had learned to master risk. "Innovative" financial instruments such as derivatives and credit-default swaps enabled the distribution of risk throughout the economy. We now know that they deluded not only the rest of society, but even themselves.
These wizards of finance, it turned out, didn't understand the intricacies of risk, let alone the dangers posed by "fat-tail distributions" - a statistical term for rare events with huge consequences, sometimes called "black swans". Events that were supposed to happen once in a century - or even once in the lifetime of the universe - seemed to happen every ten years. Worse, not only was the frequency of these events vastly underestimated; so was the astronomical damage they would cause - something like the meltdowns that keep dogging the nuclear industry.
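To see how badly a thin-tailed model can mislead, consider a rough numerical sketch (not from the article): compare the chance of a five-standard-deviation daily loss under a normal distribution with the chance under a fat-tailed Student-t distribution, a common stand-in for real market returns. The specific distributions and parameters here are illustrative assumptions, not anything the author cites.

```python
# Illustrative sketch: why "once in a century" events keep showing up.
# Assumption: daily returns standardized to mean 0, standard deviation 1.
from scipy import stats

sigma_level = 5  # a "5-sigma" move, routinely dismissed as near-impossible

# Thin-tailed model: standard normal distribution
p_normal = stats.norm.sf(sigma_level)

# Fat-tailed model: Student-t with 3 degrees of freedom, rescaled so its
# standard deviation is also 1 (the std of t_df is sqrt(df / (df - 2)))
df = 3
scale = 1 / (df / (df - 2)) ** 0.5
p_fat = stats.t.sf(sigma_level, df, scale=scale)

print(f"P(move > 5 sigma), normal model : {p_normal:.1e}")  # ~2.9e-07
print(f"P(move > 5 sigma), fat-tail model: {p_fat:.1e}")    # ~1.6e-03

# At roughly 250 trading days a year, the normal model predicts such an
# event about once every 14,000 years; the fat-tailed model predicts one
# every few years - roughly the gap between the risk models and reality.
```

Under the thin-tailed assumption the event is effectively impossible; under the fat-tailed one it is an ordinary hazard of doing business, which is the kind of mispricing the paragraph above describes.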
Research in economics and psychology helps us understand why we do such a bad job in managing these risks. We have little empirical basis for judging rare events, so it is difficult to arrive at good estimates. In such circumstances, more than wishful thinking can come into play: we might have few incentives to think hard at all. On the contrary, when others bear the costs of mistakes, the incentives favour self-delusion. A system that socialises losses and privatises gains is doomed to mismanage risk.
...
Click here to read the rest.