From Kevin Fong’s BBC Horizon programme:
- Multi-tasking comes at a cost.
- Use checklists to reduce human error.
- In a team, know who is responsible at all times.
- In an emergency, follow rule 2.
Nassim Nicholas Taleb’s Reddit AMA has been, by far, the most intellectually stimulating “ask me anything” that I’ve ever read. I don’t agree with everything he says (see the last question below, for example). Nevertheless, the whole thread was thoroughly enjoyable. Here are excerpts of the bits that made me pause and think:
R: What kind of risks do you think we overlook most in day to day life?
T: The answer to your question is in the following: 7000 Americans die every day, many, many of preventable causes. What we talk about is usually the sensational. Do the math: they die from lack of stressors (activity), corn syrup, cigarettes, etc. So the real risks/killers are discernible; they map to the risks for your life.
R: Is your freedom the only source to go on fighting with such fervor?
T: I have always been fighting… But my freedom gives me more moral obligations, makes me feel more guilt for not shouting fraud when I see it.
R: What kind of system would you set up in order to promote anti-fragility?
T: Rule: any company that would cause a national emergency requiring a bailout should it fail should be classified BAILABLE-OUT and employees should not be allowed to earn more than civil servants. That would force companies to 1) be small, 2) not leech off the taxpayer.
R: What is the most important skill or trait a human being can have in the modern world?
T: A sense of honor. It puts you above everything else.
R: What is one thing that a recent college graduate can do to be antifragile?
T: Get passing grades and follow voraciously your curiosity on the side instead of competing in school. In the end what matters is your curiosity, nothing else. And read nothing that doesn’t interest you but interests someone else.
R: You’ve talked a lot about financial issues and health issues. You have touched on the environment, but not said much about energy use.
T: The problem is the nonlinearity of harm. We have too many people on the planet, with too much concentration of pollutants. And these people are converging to the same habits. We are not supposed to be eating the same thing. Any concentration harms.
R: What can the average joe do to make sure “skin in the game” is enforced on those in power?
T: Decentralization is where we start. Vote for that and for people promoting it.
R: What have you been reading recently?
R: You can check out his Amazon reviews if you haven’t seen that yet. Link
R: How many books in your library have you not read?
T: Actually, only 40% partially read.
R: Can you begin to be antifragile while being poor or you should first make some money and plan ahead?
T: The poor are more antifragile than the rich: less to lose, both economically and psychologically.
R: As an engineer and technologist, I’m exposed to a lot of neophilia. Do you have any suggestions for heuristics besides reading the classics as an inoculation against neophilia?
T: Yes, use the Lindy effect as a testing rule… that is, look for solutions from simpler technologies.
The longer a technology has been around, the longer it’s likely to stay around.
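As a rough way to apply that heuristic, here is a minimal Python sketch; the proportionality factor of 1 and the example ages are purely illustrative assumptions, not anything Taleb specifies.

```python
# A rough sketch of the Lindy rule of thumb: for non-perishable things
# (books, ideas, technologies) the expected remaining lifespan is taken
# to be proportional to how long the thing has already survived.
# The factor of 1.0 and the example ages are illustrative assumptions.

def lindy_expected_remaining_years(current_age_years, factor=1.0):
    """Expected additional years of relevance under a naive Lindy rule."""
    return factor * current_age_years

for name, age in [("email (SMTP, ~1982)", 40), ("this year's hot new app", 1)]:
    print(f"{name}: expect roughly {lindy_expected_remaining_years(age):.0f} more years")
```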
R: According to your principles, how would you deal with the obesity epidemic hitting the U.S.?
T: The general problem is that we are not made to control our environment, and we are designed for a degree of variability: in energy, temperature, food composition, sleep duration, exercise (by Jensen’s inequality). Depriving anyone of variations is silly. So we need to force periods of starvation/fasts, sleep deprivation, protein deprivation, etc. Religions force shabbats, fasts, etc. but we are no longer under the sway of religions… The solution is rules…
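To make the Jensen’s-inequality point concrete, here is a small numerical sketch; the quadratic response curve is a made-up convex stand-in, not a physiological model.

```python
# A quick numerical illustration of the Jensen's-inequality point above:
# if the response to a dose (food, exercise, stress) is convex, then a
# variable regime with the same average dose yields a higher average
# response than a constant dose. The quadratic curve is a toy example.

import random

def response(dose):
    return dose ** 2  # toy convex response

random.seed(0)
constant_dose = 1.0
variable_doses = [random.uniform(0.0, 2.0) for _ in range(100_000)]  # mean is also 1.0

avg_response_variable = sum(response(d) for d in variable_doses) / len(variable_doses)
response_constant = response(constant_dose)

print(f"E[f(dose)] with variability: {avg_response_variable:.3f}")   # about 1.33
print(f"f(E[dose]) with a constant dose: {response_constant:.3f}")   # exactly 1.00
```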

Written with Alex Flint
Beyond all the needs it fulfils, technological innovation is underpinned by a common driving force: making information flow more efficiently. Ever since the first modern humans walked the earth, we’ve assumed that it was their survival instinct that drove innovation. It certainly did, but we forget that without the ability to pass information efficiently from one generation to the next, our ancestors would have had to reinvent the most basic things every time they were needed.
From the beginning of human civilisation until today, our aim has been to increase what can be termed brain-to-brain bandwidth. The idea encompasses not just the flow of information from one person to another but also how effectively it is transmitted, that is, how well it is understood and used by the person receiving it.
We’ve come to associate the last 50 years with the information revolution. But that is only because the industrial revolution that preceded it made life easy enough for us to focus primarily on information and its transmission. Is the information revolution slowing down, though? Certainly not.
The personal computer was expected to make its way into every home well before the 1990s. But its limitations in speed and memory did not let that happen. For many years its main users were technology geeks, nerds and hackers.
While no one doubted the achievement of the Apple I from a purely technical standpoint, giants of the field like IBM did not believe in the dream of the PC enthusiasts. In 1976 it was hard to imagine how exactly an abstruse gadget in a wooden casing, with “Apple I” scrawled across the top, would have a large impact on ordinary life. But should it have been so difficult? The fundamental role of information in our lives seemed to have been underplayed.
By the time the personal computer, as we know it*, was first built, it had already been over a decade since Gordon Moore’s prediction that the number of components on an integrated circuit would double every two years^.
A general purpose information processing device was going to be in demand and would become cheap enough for many to afford. But it still took a genius and a rebel like Steve Jobs to force the incumbents to accept that the PC age had begun.
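To get a feel for what that prediction implies, here is a back-of-the-envelope sketch of the doubling rule; the 1971 baseline of roughly 2,300 transistors (the Intel 4004) is an illustrative starting point rather than anything from Moore’s original paper.

```python
# Back-of-the-envelope version of the doubling-every-two-years prediction
# quoted above. The 1971 baseline of about 2,300 transistors (Intel 4004)
# is used purely as an illustrative starting point.

def projected_transistors(year, base_count=2300, base_year=1971, doubling_period=2):
    """Project a transistor count assuming one doubling every `doubling_period` years."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

for year in (1976, 1990, 2020):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors per chip")
```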
The next innovation after the PC that had a comparable impact on humanity’s brain-to-brain bandwidth was the internet. What the PC made possible was a better way to access and manipulate information. The advent of the internet took things a step further by enabling us to connect such information with relative ease.
However, as with the PC before it, mass adoption took time. Invented as a means of sharing data between physicists, Sir Tim Berners-Lee’s World Wide Web took off in the mid-1990s. Since then the internet has disrupted not just information transfer mechanisms but many other markets. From the postal system to the education system, anything that has information transfer at its heart has been changed by the internet.
While many might dispute that social media is the next big innovation, there is little doubt that adding a personal touch to information flow has made a huge difference. Defined as a website that lets you make a profile page, connect with friends and view your friends’ connections, the first social networking website was SixDegrees.com, launched in 1997.
Since then, of course, social networking sites like MySpace, Orkut, Facebook, Twitter and, most recently, Google+ have drawn hundreds of millions of users. Even though Facebook is not quite worth $100 billion just yet, the sheer number of its users has helped it create a parallel world of its own on the internet. Just a little less than half the world’s internet users have Facebook accounts. It’s not just Facebook and Twitter though. Social news sites like Reddit, Digg and StumbleUpon draw large crowds too.
But innovation in this sector is reaching a plateau. All social networking websites have essentially the same features: profiles, a news feed, data-sharing (photos, links, documents, etc.) and many ways of bringing users together in groups or through direct communication. We’ve reached a point where people are spending less time on social networks than before.
The next innovations in increasing our brain-to-brain bandwidth are touted to come from wearable computing, be it smartwatches or products like Google Glass. But these seem like incremental developments rather than paradigm shifts.
What we really need is a virtual way to replicate the water-cooler effect. The effect is named after the way colleagues in an office bump into each other at the water cooler, which leads to serendipitous exchanges of ideas. It is thought that the internet has led to a decline in these chance encounters, and thus slowed down the pace of innovation.
It was this that formed the core of a recent note from Marissa Mayer, Yahoo’s CEO, asking Yahoo employees to stop working from home. Many decried Mayer’s note, calling her out of touch with reality. But she has a point: there is a lot of value in face-to-face communication, and no innovation has yet come close to replicating it.
A solution to this problem would truly impact the world. Economists have estimated that the easiest way to double world GDP is to get rid of international borders, mainly by letting labour move freely. That is, of course, a politically implausible proposition. But if technology could make a person’s virtual presence nearly as good as their real presence, this dividend would not remain unrealised.
And perhaps Yahoo workers could start working from home again.
* Many will dispute exactly which was the first personal computer. Perhaps it was the GENIAC, built in 1955. The Apple II, built in 1977, was one of the first mass-produced PCs. But the first PC with the graphical user interface we have become so accustomed to was the Apple Lisa, built in 1983.
^ The often-quoted period of 18 months was a modification by David House of Intel, who said that the growth in computing power would come not just from more transistors but also from faster ones.
First published on medium.com.