[personal profile] nancylebov
Nassim Taleb (author of The Black Swan: The Impact of the Highly Improbable) on what we can't predict.
This tells us that there is "no typical" failure and "no typical" success. You may be able to predict the occurrence of a war, but you will not be able to gauge its effect! Conditional on a war killing more than 5 million people, it should kill around 10 (or more). Conditional on it killing more than 500 million, it would kill a billion (or more, we don't know). You may correctly predict a skilled person getting "rich", but he can make a million, ten million, a billion, ten billion—there is no typical number. We have data, for instance, for predictions of drug sales, conditional on getting things right. Sales estimates are totally uncorrelated to actual sales—some drugs that were correctly predicted to be successful had their sales underestimated by up to 22 times!

The short version is that rare events dominate outcomes, but they are so uncommon that we have too little data to make remotely sensible predictions about their size.
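Taleb's "no typical number" claim is a property of power-law (fat-tailed) distributions. For a Pareto distribution with tail index alpha, the conditional expectation E[X | X > k] equals k * alpha / (alpha - 1) for any threshold k, so the expected size of an event, given that it's already huge, scales with the threshold itself. The sketch below (my illustration, not Taleb's; alpha = 2 is an assumed value chosen because it reproduces the "more than 5 million → around 10 million" doubling in the quote) checks this with a Monte Carlo estimate:

```python
import random

def pareto_sample(alpha, x_min=1.0):
    """Draw one sample from a Pareto(alpha) distribution via inverse-CDF sampling."""
    u = random.random()
    return x_min / (u ** (1.0 / alpha))

def conditional_mean(alpha, threshold, n=1_000_000):
    """Monte Carlo estimate of E[X | X > threshold] for Pareto(alpha)."""
    exceed = [x for x in (pareto_sample(alpha) for _ in range(n)) if x > threshold]
    return sum(exceed) / len(exceed)

if __name__ == "__main__":
    random.seed(0)
    alpha = 2.0  # assumed tail index; with alpha = 2, E[X | X > k] = 2k
    for k in (5, 50):
        est = conditional_mean(alpha, k)
        theory = alpha / (alpha - 1) * k
        print(f"E[X | X > {k}] ~ {est:.1f}  (theory: {theory:.0f})")
```

Note that the estimates converge slowly: for alpha <= 2 the distribution has infinite variance, which is exactly why, in Taleb's terms, no sample of past wars or drug launches pins down the size of the next extreme one.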

And for fans of A Fire Upon the Deep....

Indeed some systems tend to optimize—therefore become more fragile. Electricity grids for example optimize to the point of not coping with unexpected surges—Albert-Lazlo Barabasi warned us of the possibility of a NYC blackout like the one we had in August 2003. Quite prophetic, the fellow. Yet energy supply kept getting more and more efficient since. Commodity prices can double on a short burst in demand (oil, copper, wheat) —we no longer have any slack. Almost everyone who talks about "flat earth" does not realize that it is overoptimized to the point of maximal vulnerability.

Link thanks to Geek Press.

Date: 2008-09-20 06:34 am (UTC)
From: [identity profile] captain-button.livejournal.com
Quibble: ITYM A Deepness In The Sky, that is where he has the descriptions of the over-optimized economies coming apart, IIRC.

Date: 2008-09-20 07:01 am (UTC)
From: [identity profile] nancylebov.livejournal.com
I'm pretty sure it's A Fire Upon the Deep which has a couple of examples of optimization being highly risky-- the book is about the relationship between the ships (which need planetary societies to refit themselves) and the planetary societies (which are unstable and need outside resources from ships to restart themselves). A Deepness in the Sky has the Emergency, a very nasty dictatorship.

Do we both need to reread the books? Oh, horror!

Date: 2008-09-20 10:56 am (UTC)
From: [identity profile] stoutfellow.livejournal.com
For a detailed nonfictional account of the same problem, try Joseph Tainter's The Collapse of Complex Societies.

Date: 2008-09-20 11:43 am (UTC)
From: [identity profile] anton-p-nym.livejournal.com
I think both books discuss it, though "Deepness" goes into more, er, depth on the issue.

-- Steve should probably give them a re-read himself, come to think of it.

And I actually read the whole book like a sucker

Date: 2008-09-20 06:36 pm (UTC)
From: [identity profile] inertiacrept.livejournal.com
Taleb presents an interesting question: does a completely odious and tangential writing style, one which patronizes your readers and dismisses your critics with the worst sort of "if you don't get it, it must be because you're retarded" playground name-calling, destroy the actual intelligent point hiding underneath all of that rot, or just make it harder to get to?
From: [identity profile] nancylebov.livejournal.com
My impression is that being somewhat aggressive or insulting improves the odds of non-fiction selling. I hope I'm wrong.
