Do Self-Help Books Work?

How Modern Non-Fiction Books Waste Your Time (and Why You Should Read Them Anyway)

When I first discovered non-fiction books, I thought they were the best thing since sliced bread. Whatever problem you could possibly have, there’s a book out there to help you solve it. I had a lot of challenges at the time, and so I started devouring lots of books.

I read books about money, productivity, and choosing a career. Then, I read books about marketing, creativity, and entrepreneurship. I read and read and read, and, eventually, I realized I had forgotten to implement any of the advice! The only habit I had built was reading, and as wonderful as it was, it left me with nothing but information overload.

After that phase, I flipped to the other, equally extreme end of the spectrum: I read almost no books, got all my insights from summaries, and only tried to learn what I needed to improve a given situation at any time.

So, do self-help books work? As always, the truth lies somewhere in the middle.


How To Not Be Gullible

In 1997, 14-year-old Nathan Zohner used the science fair to alert his fellow citizens to a deadly chemical.

In his report Dihydrogen Monoxide: The Unrecognized Killer, Nathan outlined all the alarming characteristics of the colorless, odorless, tasteless compound — DHMO for short — which kills thousands of Americans each year:

  • DHMO can cause severe burns in both its gaseous and solid forms.
  • It is a major component of acid rain and is often found in tumors removed from cancer patients.
  • DHMO accelerates corrosion of both natural elements and many metals.
  • Ingesting too much DHMO leads to excessive sweating and urination.
  • For everyone with a dependency on DHMO, withdrawal leads to death.

After giving his presentation, Nathan asked 50 fellow students what should be done. 43 — a staggering 86% — voted to ban DHMO from school grounds.

There was only one problem: Dihydrogen monoxide is water.


Every day, people use facts to deceive you because you let them.

Life is hard. We all get fooled six ways from Sunday. People lie to us, we miscommunicate, and it’s impossible to always correctly read other people’s feelings. But facts? If we let facts deceive us, that’s on us.

When it’s hard to be right, there is nothing wrong with being wrong. But when it only takes a few minutes or even seconds to verify, learn, and educate yourself, choosing to stay ignorant is really just that: A decision — and likely one for which you’ll get the bill sooner rather than later.

If you know a little Latin or Greek, or simply paid attention in chemistry class, the term “dihydrogen monoxide” is easy to deconstruct. “Di” means “two,” hydrogen is an element (H on the periodic table), “mono” means “one,” and “oxide” means oxidized — an oxygen atom (O on the periodic table) has been added. Two hydrogens, once oxidized. Two Hs, one O. H2O. Water.

When Nathan ran his experiment “How Gullible Are We?” in 1997, people didn’t have smartphones. They did, however, go to chemistry class. Nathan’s classmates had parents working in the sector, and they all had chemistry books. They even could have asked their teacher: “What’s dihydrogen monoxide?” But none of them did.

In his final report, Nathan wrote he was shocked that so many of his friends were so easily fooled. “I don’t feel comfortable with the current level of understanding,” he said. James Glassman, who wrote about the incident in the Washington Post, even coined the term “Zohnerism” to describe the use of a true fact to mislead people toward a false conclusion.

Today, we have smartphones. We have a library larger than Alexandria’s in our pocket and finding any page from any book takes mere seconds. Yet, we still get “zohnered” on a daily basis. We allow ourselves to be.

“Too much sugar is bad for you. Don’t eat any sugar.” Yes, too much sugar is bad, but the conclusion isn’t to stop eating it altogether. Carbohydrates are the body’s main source of energy, and they’re all broken down into various forms of sugar. It’s a vital component of a functioning metabolism. Plus, each body has its own nuances, so cutting out sugar without more research could actually be bad for you. But if I’m selling a no-sugar diet, who cares, right?

You care. You should. And that’s why it’s your job to verify such claims. It’s easy to spin something correct in a way that sends you in whatever direction the manipulator wants to send you. The only solution is to work hard not to let yourself be manipulated:

  • Say “I don’t know” when you don’t know. I know it’s hard, but it’s the most liberating phrase in the world. Whenever you’re out of your comfort zone, practice saying it: “Actually, I don’t know, let me look it up.”
  • Admit that you don’t know to yourself. You’ll miss some chances to say “I don’t know.” That’s okay, you can still educate yourself in private later. Your awareness of your ignorance is as important as fighting it.
  • Google everything. When you’re not 100% sure what a word means, google it. When you want to know where a word comes from, google it. When you know you used to know but are hazy on the details, google it. Seriously. Googling takes ten seconds. Google everything.
  • Learn about your biases. Hundreds of cognitive biases affect our thinking and decisions every waking second. Learning about them and occasionally brushing up on that knowledge will go a long way.
  • When someone argues for one side of a conflict, research both. Whether it’s a story in the news, a political issue, or even the issue of where to get lunch, don’t let yourself get boxed into one corner. Yes, McDonald’s is cheap. Yes, you like their fries. But what about Burger King? What do you like and not like about both of them?
  • When someone talks in absolutes, add a question mark to every sentence. James Altucher often does this with his own thoughts, but it’s equally helpful in questioning the authority of others. Don’t think in absolutes. Think in questions.

The dihydrogen monoxide hoax has been used many times to confront people with their own ignorance. A 1994 version created by Craig Jackson petitions people to “act now” before ending on a truthful yet tongue-in-cheek note: “What you don’t know can hurt you and others throughout the world.”

Richard Feynman received the Nobel Prize in Physics, but he started his journey as a curious boy, just like Nathan Zohner. Like Einstein, he believed inquisitiveness could solve any problem, and so he always spoke in simple terms — to get people interested in science.

He also said the following, which still rings true today: “The first principle is that you must not fool yourself — and you are the easiest person to fool.”

Creativity & Breathing Cover

To Stay Creative, Remember to Breathe

“I sometimes disappear for weeks or even months at a time. When I do this, I’m not abandoning my work or being lazy. I’m just trying to breathe.”

So writes Matthew Inman, creator of the web comic The Oatmeal, in a post titled Creativity is like breathing. To explain the analogy, Inman writes: “When you make stuff, you’re exhaling. But you can’t exhale forever. Eventually, you have to breathe in. Or you’ll be dead.”

That’s why Inman spends lots of time reading books, being outdoors, and jumping from project to project, he says. They’re all forms of breathing, and they don’t just make him better at his job, they’re also reasons why he loves his job. It’s the beauty of being a creative: Everything you do is fuel for your work.

When your job is to make things, your whole life is your canvas. You can have a brilliant idea over a bowl of cereal or write about what happened on vacation. Even the bad stuff, like going through a breakup, can be worked into your creative output. In fact, you’ll both have to and want to.

Whatever happens in your life impacts your emotions, your thoughts, and, as a result, what the outcome looks like when you put those thoughts and emotions on paper — or any other medium. Why do you think I just used “a bowl of cereal” as an example? It’s because, for the past two days, I’ve been staring at a comic called The Oatmeal. That’s how the human mind works.

There’s nothing you can do about your mind running under the influence of many biases, but you likely won’t mind once you realize that, on top of this passive dynamic, creating comes with an active benefit: You consciously get to work through the events in your life. Writing about a positive experience makes it better. Sharing your business failure on a podcast mellows the pain.

Soon, you’ll process your whole life in real-time through the lens of creativity — and it’s one of the most powerful forms of self-healing there is. You’ll constantly learn, evolve, and challenge yourself to accept your past by creating something others can use in the future. As wonderful as it is to find this kind of outlet, there’s a downside: Your work can become addictive.

When everything is input, it’s natural to consistently want to form output. You’ll feel like you should shape and release all your experiences and ideas, which, of course, is impossible. What’s more, not all input is created equal. Some stories will have more value to your audience than others. This is another, less appealing part of the artist’s job: You have to curate your work and select what’s most worth sharing. This is where it helps “to breathe.”

As Zat Rana put it in The Philosophical Argument for Working Less, part of respecting your work is accepting that it’s “just one part of life, not the whole thing:”

Even if you love your work more than you love anything else, you are likely to find it more complete and fulfilling if you step away from it, time to time.

Eventually, you have to breathe in — or you’ll be dead. If you’ve ever hit creator’s block after a long stretch of releasing a lot of work, you may have realized: It’s not that you can’t publish daily, it’s that your posts start to feel stale. You’re panting. Short, choppy breaths, out, out, out. You need time to breathe in — literally, and then figuratively. Beyond our own desire to insta-journal about our lives, there’s also a component of societal pressure, Zat says:

There seems to be a certain guilt in our current culture associated with just taking time to do nothing, to relax, to leisure, to waste time, and to simply have no plans. But the truth is that, without these things, you are not going to get the most out of your work anyway.

When you feel tired, sleep. When you lack good analogies, watch a movie. Don’t feel bad about taking a vacation from time to time. Leisure creates its own form of productivity. If you allow your experiences to ripen, more of them will mix. Your subconscious will add its own kind of seasoning, and, soon, it’ll send a powerful insight back to the surface.

Once that great idea strikes like lightning, you won’t be able to not act on it. A breath of truly fresh air is so empowering, you’ll have to direct it somewhere. Well-rested and fired up, you’ll rush back to your chair, ready to put out the next comic. Who knows what brilliant metaphor you’ll write about. Maybe something like, “Creativity is like breathing.”

What Is the Future of Learning?

“A wise man can learn more from a foolish question than a fool can learn from a wise answer.” 

Bruce Lee

In the past four years, I have asked a lot of foolish questions:

Can I be a professional translator without any credentials?

If I want to be a published writer, should I still ghostwrite for money?

Do summaries of existing book summaries make any sense?

The seemingly obvious answer to them all is “no,” yet I did all those things anyway. And while some led nowhere, others now pay my bills. Often, the only way to get satisfying answers is to try, especially with foolish questions. The beauty of daring to ask them, rather than accepting the answers society gives you, is that you’ll have many more unexpected insights along the way.

Insights like this one: Today, the answers are always less valuable than the questions.

The Half-Life of Knowledge

In 2013, we created as much data as in all of the previous history. That trend now continues, with total information roughly doubling each year. Michael Simmons has crunched the numbers behind our knowledge economy:

You probably need to devote at least five hours a week to learning just to keep up with your current field—ideally more if you want to get ahead.

Bachelor’s degrees in most European countries consist of 180 credits (EU schools tend to use a quarter credit system as opposed to the semester hour system typical in the U.S.), and each of those credits is worth about 30 hours of studying time. That’s 5,400 hours. Sadly, what you learn from those hours starts decaying as soon as you’ve put in the time. Scientists call this “the half-life of knowledge,” a metric that’s decreasing fast.


Since new information is now generated more and more rapidly, it takes less time for said information to lose its value. Back in the 1960s, an engineering degree was outdated within 10 years. Today, most fields have a half-life much shorter than that, especially in newer industries. A modern degree might last you just five years before it’s completely irrelevant. Even with a conservative half-life estimate of 10 years (losing about 5 percent each year), you’d have to put in 270 hours per annum just to maintain those initial 5,400 — or about five hours per week.
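For the curious, here is that arithmetic spelled out in a few lines of Python. It’s only a back-of-the-envelope sketch; the 180 credits, roughly 30 hours per credit, and 5-percent annual decay are the simplified assumptions from the paragraphs above, not a precise model of how knowledge actually fades.

# Back-of-the-envelope version of the "half-life of knowledge" arithmetic.
# Assumptions (from the text): a 180-credit degree at ~30 hours per credit,
# decaying at roughly 5 percent per year (a conservative ~10-year half-life).
CREDITS = 180
HOURS_PER_CREDIT = 30
ANNUAL_DECAY = 0.05  # simplified decay rate

degree_hours = CREDITS * HOURS_PER_CREDIT          # 5,400 hours of study
hours_lost_per_year = degree_hours * ANNUAL_DECAY  # 270 hours fade away each year
hours_per_week = hours_lost_per_year / 52          # ~5.2 hours per week

print(f"Total study time: {degree_hours:,} hours")
print(f"Relearning needed per year: {hours_lost_per_year:.0f} hours")
print(f"Per week, just to stand still: {hours_per_week:.1f} hours")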

As a side effect of this global, long-lasting trend, both the time we spend in formal education and the number of people choosing this path have increased dramatically for decades. Years of schooling have more than doubled in the past 100 years, and in many countries, it’s common to study for some 20-plus years before even entering the workforce. In the U.S. alone, enrollment in higher education has already peaked at over 90 percent of the population around college age.

The larger our ocean of information, the less valuable each fact in it becomes. Therefore, the knowledge bundles for college degrees must get bigger and, thus, take longer to absorb. But the ocean also grows faster, which means despite getting bigger, the bundles don’t last as long. It takes a lot of time to even stay up to date, let alone get ahead of the increasing competition.

Instead of flailing harder not to drown, maybe we should get out of the water.

A Scary Future to Imagine

While it’s important to dedicate time to learning, spending ever-increasing hours soaking up facts can’t be the final answer to this dilemma. Extrapolate the global scramble for knowledge, and we’d end up with 50-year-old “young professionals,” who’d retire two years into their careers because they can’t keep up. It’s a scary future to imagine but, luckily, also one that’s unlikely.

I saw two videos this week. One showed an unlucky forklift driver bumping into a shelf, causing an entire warehouse to collapse. In the other, an armada of autonomous robots sorted packages with ease. It’s not a knowledge-based example, but it goes to show that robots can do some things better than people can.

There is no expert consensus on whether A.I., robotics, and automation will create more jobs than they’ll destroy. But we’ll try to hand over everything that’s either tedious or outright impossible. One day, this may well include highly specialized, knowledge-based jobs that currently require degrees.


A lawyer in 2050 could still be called a lawyer, but they might not do anything a 2018 lawyer does. The thought alone raises yet another foolish question:

When knowledge itself has diminishing returns, what do we need to know?

The Case for Selective Intelligence

With the quantity of information setting new all-time highs each year, the future is, above all, unknown. Whatever skills will allow us to navigate this uncertainty are bound to be valuable. Yuval Noah Harari’s new book asserts this:

In such a world, the last thing a teacher needs to give her pupils is more information. They already have far too much of it. Instead, people need the ability to make sense of information, to tell the difference between what is important and what is unimportant, and above all, to combine many bits of information into a broad picture of the world.

The ability Harari is talking about is the skill of learning itself. The 2018 lawyer needs knowledge. The 2050 lawyer needs intelligence. Determining what to know at any time will matter more than the hard facts you’ll end up knowing. When entire industries rise and fall within a few decades, learning will no longer be a means but must become its own end. We need to adapt forever.

Knowledge is cumulative. Intelligence is selective. It’s a matter of efficiency versus effectiveness. Both can be trained, but we must train the right one. Right now, it’s not yet obvious which one to choose. The world still runs on specialists, and most of today’s knowledge-accumulators can expect to have good careers.

But with each passing day, intelligence slowly displaces knowledge.

The Problem With Too Many Interests

Emilie Wapnick has one of the most popular TED talks to date—likely because she offers some much-needed comfort for people suffering from a common career problem: having too many interests. Wapnick says it’s not a problem at all. It’s a strength. She coined the term “multipotentialite” to show that it’s not the people affected but public perception that must change:

Idea synthesis, rapid learning, and adaptability: three skills that multipotentialites are very adept at and three skills they might lose if pressured to narrow their focus. As a society, we have a vested interest in encouraging multipotentialites to be themselves. We have a lot of complex, multidimensional problems in the world right now, and we need creative, out-of-the-box thinkers to tackle them.

While there’s more to it, it’s hard to deny the point. After all, some of these thinkers work on some of our biggest problems. And we love them for it.

Jeff Bezos built a retail empire and became the richest man in the world, but he also helped save an important media institution and works on the infrastructure we need to explore space. Elon Musk first changed how we pay and then how we think of electric cars, and now how we’ll approach getting to Mars. Bill Gates really knows software, but now he’s working to eradicate malaria and polio. The list goes on.

The term “polymath” carries too strong a connotation of “genius,” but whether you call them Renaissance people, scanners, or expert-generalists, the ability they share stays the same: They know how to learn, and they relentlessly apply this skill to a broad variety of topics. In analyzing them, Zat Rana finds this:

Learning itself is a skill, and when you exercise that skill across domains, you get specialized as a learner in a way that someone who goes deep doesn’t. You learn how to learn by continuously challenging yourself to grasp concepts of a broad variety. This ironically then allows you to specialize in something else faster if you so choose. This is an incredibly valuable advantage.

Beyond learning faster, you’ll also innovate more, stay flexible, stand out from specialists, and focus on extracting principles over remembering facts.

To me, that sounds exactly like the person an unpredictable world needs.

A Curious Boy

In 1925, one year before he entered school, Isaac Asimov taught himself to read. His father, uneducated and thus unable to guide his son, gave him a library card. Without any direction, the curious boy read everything:

All this incredibly miscellaneous reading, the result of lack of guidance, left its indelible mark. My interest was aroused in twenty different directions and all those interests remained. I have written books on mythology, on the Bible, on Shakespeare, on history, on science, and so on.

“And so on” led to some 500 books and about 90,000 letters Asimov wrote or edited. Years later, when his father looked through one of them, he asked:

“How did you learn all this, Isaac?”

“From you, Pappa,” I said.

“From me? I don’t know any of this.”

“You didn’t have to, Pappa,” I said. “You valued learning and you taught me to value it. Once I learned to value it, the rest came without trouble.”

When we hear stories about modern expert-generalists, we assume their intelligence is the result of spending a lot of time studying multiple fields. While that’s certainly part of it, a mere shotgun approach to collecting widely diversified knowledge is not what gives great learners special abilities.

What allowed Asimov to benefit from his reading, much more so than what he read or how much, was that he always read with an open mind. Most of the time, we neglect this. It’s a fundamental misunderstanding of how we learn.

In order to build true intelligence, we first have to let go of what we know.

The Value of Integrative Complexity

Had Asimov learned to read in school, he likely would’ve done it the way most of us do: memorizing or critiquing things. It’s an extremely narrow dichotomy, but sadly, one that sticks. Rana offers thoughts about the true value of reading:

Anytime you read something with the mindset that you are there to extract what is right and what is wrong, you are by default limiting how much you can get out of a particular piece of writing. You’re boxing an experience that has many dimensions into just two.

Instead of cramming what they learn into their existing perspectives, people like Asimov know that the whole point is to find new ones. You’re not looking for confirmation; you’re looking for the right mental update at the right time.

With an attitude like that, you can read the same book forever and still get smarter each time. That’s what learning really is: a state of mind. More than the skill, it’s receptiveness that counts. If your mind is always open, you’re always learning. And if it’s closed, nothing has a real chance of sinking in.

Scientists call this “integrative complexity”: the willingness to accept multiple perspectives, hold them all in your head at once, and then integrate them into a bigger, more coherent picture. It’s a picture that keeps evolving and is never complete but is always ready to integrate new points and lose old ones.

That’s true intelligence, and that’s the prolific learner’s true advantage.

A Matter of Being

Your brain is like a muscle. At any moment, it’s growing or it’s deteriorating. You can never just keep it in the same state. So when you’re not exercising your mind, it’ll atrophy and not only stop but quickly reverse your progress.

This has always been the case, but the consequences today are more severe than ever. In an exponential knowledge economy, we can’t afford stale minds. Deliberately spending time on learning new things is one way to fight irrelevance, but it’s not what’ll protect us in the uncharted waters of the future.


Beyond being carriers of knowledge, we need to become fluid creatures of intelligence. Studying across multiple disciplines can start this process. It has many advantages—creativity, adaptability, speed—but it’s still not enough.

If we focus only on the activity of learning, we miss the most important part: Unless we’re willing to change our perspective, we won’t grasp a thing. It’s not a matter of doing but of being. The reason the wise man can learn from even the most foolish question is that he never assigns that label in the first place.

And so it matters not whether we learn from our own questions or the insights of others, nor how much of it we do, but that we always keep an open mind. The longer we can hold opposing ideas in our heads without rejecting them, the more granular the picture that ultimately forms. This is true intelligence. It’s always been valuable, but now it’s the inevitable future of learning.

Bruce Lee undoubtedly possessed this quality. By the time he died, he was a world-renowned martial artist, the creator of an entire philosophy, and a multimillion-dollar Hollywood superstar. All at only 32 years old. Long after his passing, one of his favorite stories still captures both the essence of his spirit and how he became the cultural icon we know and love today:

A learned man once went to visit a Zen teacher to inquire about Zen. As the Zen teacher talked, the learned man frequently interrupted to express his own opinion about this or that. Finally, the Zen teacher stopped talking and began to serve tea to the learned man. He poured the cup full, then kept pouring until the cup overflowed.

“Stop,” said the learned man. “The cup is full, no more can be poured in.”

“Like this cup, you are full of your own opinions,” replied the Zen teacher. “If you do not first empty your cup, how can you taste my cup of tea?”


Use This Storytelling Framework to Craft Amazing Narratives

There is a class of entertainment that is underrated, in spite of its commercial success: stories about telling stories. Hit shows like How I Met Your Mother, Suits, or Gilmore Girls and blockbusters like Ocean’s Eleven, the Bourne movies, and Fight Club all thrive on their characters’ abilities to launch into enchanting monologues at a second’s notice.

Whoever asks Barney Stinson about his playbook, platinum rule, or Valentine’s Day can expect a full-fledged fake history lesson. Despite what the gang might say, they love it. Because who tells stories like that?

Sometimes, life throws us the same opportunity to tell a story however we want to tell it. It might be an essay for a job application, a speech to your old class, or a new acquaintance asking about a childhood experience. But we’re not a character in a movie, so we never have those stories locked and loaded and often butcher them as a result.

How can we change that?

The Universal Principles of Storytelling

Steven Pressfield laid out a framework in Nobody Wants to Read Your Sh*t. He calls it the universal principles of storytelling:

1) Every story must have a concept. It must put a unique and original spin, twist or framing device upon the material.
2) Every story must be about something. It must have a theme.
3) Every story must have a beginning, a middle, and an end. Act One, Act Two, Act Three.
4) Every story must have a hero.
5) Every story must have a villain.
6) Every story must start with an Inciting Incident, embedded within which is the story’s climax.
7) Every story must escalate through Act Two in terms of energy, stakes, complication and significance/meaning as it progresses.
8) Every story must build to a climax centered around a clash between the hero and the villain that pays off everything that came before and that pays it off on-theme.

Since reading the book, I have run nearly all my articles through this framework. This has led to some of my biggest hits so far. I’ve gathered the cornerstone elements into a template you can copy:

Theme:
Concept:
Hero:
Villain:
Act 1 - Hook:
Inciting Incident:
Act 2 - Build:
Escalation:
All is Lost:
Breakthrough:
Act 3 - Payoff:
Climax:

But how do you use it?


How to Not Forget the Books

There’s a How I Met Your Mother episode in which Ted starts his own architecture firm, Mosbius Designs. One afternoon, Robin walks in on Ted lost in thought, and he responds to her prompt with the following:

“What if I don’t think of the books?”

“Excuse me?”

“There’s this famous architecture story about an architect who designed this library. It was perfect. But every year, the whole thing would sink a couple inches into the ground. Eventually, the building was condemned.

He forgot to account for the weight of the books.

This company, it’s just me. What if I don’t think of the books?”

Like the library in Ted’s example, any story that doesn’t rest on the foundational pillars of Steve’s framework is bound to crumble. And even though accounting for the principles of storytelling doesn’t guarantee it’ll be well received, a story built this way always ‘works.’

Case in point, here’s what the screenwriters might’ve put into the template for Ted’s five-sentence story:

Ted's Library Story
Theme: The flawed nature of human short-term thinking.
Concept: A project is never just about building what you set out to build.
Hero: The architect.
Villain: His narrow, short-term perspective.
Act 1 - Hook: An architect designs a beautiful library but forgets to account for the structural load of the building once it's in use.
Inciting Incident: The plans pass all stages without the mistake being noticed.
Act 2 - Build: A year after the grand opening, problems begin to show up in the basement, which keep getting worse every year.
Escalation: Year after year, repairmen and investigators return to figure out the problem.
All is Lost: Eventually, a report shows the building is sinking into the ground.
Breakthrough: The architect realizes the sinking is caused by the weight of the books.
Act 3 - Payoff: The building is condemned and the architect is right back to where he started.
Climax: An official tells the architect the building will be shut down. This leads to the architect sitting over his original plan at night, all by himself, having a drink and facing the pain of his short-term thinking.

The story might be compressed into just a few lines, but since this kind of thought went into it, it intuitively still makes perfect sense. It feels right. And while there are no hard rules here, this is what I think about for each element:

  • Theme: The underlying topic of it all. The bigger the theme, the more powerful the story. Love, time, identity — every human has to deal with these.
  • Concept: Look at the topic from a new angle, one that few people would ever consider on their own.
  • Hero: Who rides the rollercoaster of hook, build, and payoff? This needn’t be a person.
  • Villain: Who puts the hero on that rollercoaster and tries to throw him or her off during the ride? This can also be a mistake or the state of the hero’s mind.
  • Act 1 – Hook: The overarching sequence of events that pulls the reader or listener into the story.
  • Inciting Incident: The event that officially kicks off the story. It usually involves the hero and the villain, and the climax will bring them right back to it.
  • Act 2 – Build: The overarching sequence of events that escalates the hero’s trauma, known to them or not, until they’re forced to do something.
  • Escalation: The villain’s main act of the show.
  • All is Lost: The hero’s lowest point.
  • Breakthrough: The moment of insight that forces the hero on the only possible path: to fight the villain. This could be a brilliant idea or a sobering realization. It doesn’t indicate the hero will win.
  • Act 3 – Payoff: The overarching sequence of events that resolves all the conflicts built up to this point by forcing the hero and villain to face one another.
  • Climax: The hero and the villain clash. Whatever the outcome, it must close all the boxes that have been opened up to this point.

Whether you sit down with this template before you even begin a story, think of it as you’re telling it, or use it to review one you’ve already shared, it will allow you to condense the story into one coherent web of reason and emotion that connects right with your audience’s soul.

For example, when I wrote Why Losers Will One Day Rule The World, I watched and read a ton about The Gambler. Then, I filled in the template before I started writing.

Why Losers Will One Day Rule The World
Theme: Learning to accept our insignificance so that we can start.
Concept: If you don’t know what you want, starting with something arbitrary will ironically help you get there.
Hero: The reader who says “screw it, I’m already a loser, I might as well go for broke.”
Villain: The voice in your head that wants us to settle for mediocrity.
Act 1 - Hook: If you’re not a genius, should you really just give up?
Inciting Incident: Gregor Mendel found out that genetics favor certain traits over others. As a result, life is naturally unfair.
Act 2 - Build: Some people win the genetic lottery twice, while others lose twice. That's depressing, but there is a stabilizing element that somehow makes life fair again for all.
Escalation: Examples of genetic lottery winners and losers.
All is Lost: In the face of mediocrity and a sea of mediocre options, some people choose nihilism. That’s a bad solution.
Breakthrough: Both the genius and the generalist have to gamble to make it. No one really wins the lottery.
Act 3 - Payoff: If you have to gamble anyway, choose an arbitrary goal, so you can at least start going somewhere.
Climax: We all have to gamble, so we’re all losers in a way. Only when we accept our loser status can we be free.

For others, like You Don’t Need An Identity To Have A Life, I started writing with a blank slate. All I had was the theme. Then, I used the template to fill in gaps as I went, move around sections, and drop in ideas. I didn’t have a concept until the very end, and I didn’t use some ideas at all.

You Don’t Need An Identity To Have A Life
Theme: Identity is dangerous. You’re stronger without it.
Concept:
Hero: Jason Bourne.
Villain: The voice in your head that says, “I am this way and I always will be.”
Act 1 - Hook: Howard Hughes wasted his entire life playing a genius inventor’s son when that role was never really his to play. And we all do that. Playing roles that we were never cast for.
Inciting Incident: Jason Bourne finds out his name, but he has no idea who the person behind that name is.
Act 2 - Build: Every day, we’re building more towards assembling a self and hardening our identity, only to ultimately find out we might not like what we’ve created.
Escalation: Bourne finds out he’s a killer.
All is Lost: Quote from Denial of Death. Wasting your life in service of building a conceptual self that may not last, nor be perceived in any way as what you set out to make it.
Breakthrough: We're like actors on a stage (Counterclockwise Study). Our identity is like the weather (Jim Carrey).
Act 3 - Payoff: Bourne’s fluid identity is his strength. Justin Timberlake’s too (muted). More examples? Frank Abagnale! How far he got! Ending: Bourne says “not really.”
Climax: Bourne abandons his former identity the second he finds out what it was, choosing his fluid self over any sort of crystallized version in an instant, in spite of having worked so hard to find out who this former self was.

I’m far from an expert in using this template and I’ve barely scratched the surface of everything there is to know about telling stories. But at least now I don’t forget the books.


Everything Is a Story

We might not be film characters, but if you think about it, our opportunities to tell stories are not rare. They’re omnipresent. We tell stories all the time. In fact, we do little else. A phone call is a story. A sales pitch is a story. Dinner with friends is a story. And so is this post.

When Harvey Specter, Rory Gilmore, and Tyler Durden raise their voices, we listen. Not because they know how to talk, but because they know how to lead. That’s what storytelling really is. Human communication 101. We’ll never run as smoothly as characters on a script, but if we fail at the basics, if we forget to account for the books, we miss out on a whole lot more than the corner office. We miss out on making change.

And isn’t that all we’re here to do?


How To Eliminate the #1 Cognitive Bias on the Internet

In 1906, famed English statistician Sir Francis Galton visited the annual ‘West of England Fat Stock and Poultry Exhibition.’ The 84-year-old scientist was obsessed with breeding in his spare time. Strolling along the stalls amidst curious onlookers, he found the perfect mix of leisure and work: a weight-guessing contest.

An ox was brought on stage before ‘dressing’, which is butcher speak for slaughtering the animal and removing its organs. For six pence each, contestants could then submit guesses of the dressed weight, which resulted in a variety of prizes, a cool $6,000 in modern-day dollars for the host, and a data set of 787 guesses for Galton to play with.

As Galton suspected, not even the few livestock experts among the group guessed the correct weight. The best estimate came in at 1,207 lbs, nine pounds off the 1,198 lbs mark. When he calculated the mean of all guesses, however, Galton was shocked: 1,197 lbs.

Imagined as a single individual, the crowd’s judgement was almost perfect.
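The statistical effect behind this result is easy to reproduce. Below is a small Python simulation of the same idea, with made-up numbers rather than Galton’s actual guesses: every individual is noisy, but as long as their errors aren’t all skewed in the same direction, those errors largely cancel out in the average.

import random

# Toy simulation of the wisdom-of-crowds effect Galton observed.
# The spread of the guesses is invented for illustration; only the true
# weight and the number of guessers are taken from the story above.
random.seed(42)
TRUE_WEIGHT = 1198   # lbs, the ox's dressed weight
N_GUESSERS = 787     # number of guesses Galton collected

# Each guess is the truth plus individual noise of a few percent.
guesses = [random.gauss(TRUE_WEIGHT, 75) for _ in range(N_GUESSERS)]

crowd_mean = sum(guesses) / len(guesses)
avg_individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)

print(f"Average individual error: {avg_individual_error:.0f} lbs")
print(f"Crowd mean: {crowd_mean:.0f} lbs (true weight: {TRUE_WEIGHT} lbs)")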


10 Cognitive Biases and How To Fight Them

Irrationality rules the world. Quite literally, these days.

Global leaders behaving like little boys, threatening each other with their oversized toys. Fake news spreading like wildfire. Needless technology receiving millions in funding.

It’s a great time to be alive, but sometimes I wish Plato were still around to remind us of one of his big ideas: Think more.

Frustrated by the tendency of his fellow Greeks to act mostly on impulse, he always prompted them to examine their own lives. The goal was to think for yourself and be less trapped by doxa — the Greek word for common sense or popular opinion.

This is why we love Elon Musk so much. We see someone who can look at the world objectively, build their reasoning from the ground up, and then make decisions grounded in reality — and we think they’re a genius.

Actually, he’s just doing what we were supposed to all along: think for ourselves. It’s just that we do so little of it. As Tim Urban notes on Wait But Why:

“We spent this whole time trying to figure out the mysterious workings of the mind of a madman genius only to realize that Musk’s secret sauce is that he’s the only one being normal. And in isolation, Musk would be a pretty boring subject — it’s the backdrop of us that makes him interesting.”

So how do we get back to rational? How can we think more and more clearly?

It is here that Musk and Plato agree, though one learned from physics, the other from philosophy: we must start with a clean slate. Plato’s old friend and mentor put it in a nutshell.

“The only true wisdom is in knowing you know nothing.”  —  Socrates

It’s a process of getting back to square one so you can start fresh, this time from your own perspective. The way we begin this process is by ridding ourselves of our modern-day version of doxa: cognitive biases.

They fall into different categories and are shortcuts our brain uses to deal with too much information, figure out what to remember, fill in gaps in meaning and act fast when we need to. At the same time, these cognitive design flaws silently ruin our lives, one decision at a time.

There are many of them and some are worse than others. Here are the ten we must try to fight the hardest — and one way to do the fighting.

Belief: The Backfire Effect

You’ve probably heard of confirmation bias, which is our tendency to seek information that confirms our opinions, rather than form those opinions from the best information available. While troublesome, I’m much more worried about its bigger brother: the backfire effect.

Also referred to as belief perseverance or the continued influence effect, it says we react to disconfirming evidence by strengthening our previous, wrong beliefs.

For example, if you’ve agreed with me in the intro that Elon Musk is awesome, you’ll likely have felt a tad of cognitive dissonance at Tim’s statement that in isolation, Musk would be a boring subject.

This is why corrections in the news world don’t work. They never get as many views and only reinforce the previous idea. The facts are gone, the feeling remains.

As you go through the following biases and catch yourself thinking: “that’s definitely not me,” you know what’s going on.

Probability

Great poker players are less affected by mental biases because they’re probability machines. Not only can they estimate the likelihood of events more accurately, but the mere habit of constantly estimating comes with a lot of benefits.

Out of all the biases around probability, the following two continue to drive a huge wedge between us and our personal success.

Ambiguity Effect

The ambiguity effect is our impulse to avoid options for which we don’t have enough information to make a good probability guess. It stops us from chasing our big goals, because we’re not considering what’s realistically possible.

We’d rather spend $100 on lottery tickets than on stocks or cryptocurrencies, because the information required to gauge the probability of making a profit is easier to obtain.

If we did our homework, we’d often see our probabilities are better than we think and we control them more than we know.

Survivorship Bias

When we don’t know our chances, we default to following those we can see. Tim has a successful blog. Tim writes this way. I want a successful blog, so I’ll write like Tim.

This logical fallacy is called survivorship bias — the tendency to focus on the elements and people that remain at the end, thus neglecting probability.

There may have been hundreds, thousands or millions of people who started blogs and wrote like Tim, but didn’t make it. Therefore, using Tim as a proxy is in no way playing it safe. It’s just playing copycat.

Risk

Risk is often lumped together with probability. However, while the chance of a bad event occurring is important to consider, risk has another component, which is just as easy to misjudge: its magnitude.

But don’t worry, we suck at estimating both.

Zero-Risk Bias

This bias indicates we prefer to eliminate whatever little risk is left completely, rather than opting for an overall greater reduction with some remaining. It’s the reason we get a heart attack when the phone rings and the caller ID says it’s our boss’s boss. Our brain blows the magnitude of the worst-case scenario way out of proportion.

“All anxiety is is experiencing failure in advance.”  —  Seth Godin

The zero-risk bias explains why insurance companies can charge a premium for full coverage and why we’d rather give up cereal completely than eat more vegetables — the latter might reduce our risk for diabetes more, but the former feels safer.

Neglect of Probability

In our aspirations we might fail at probability estimation, but when it comes to risk, we often abandon the effort altogether. Neglect of probability leads us to respond only to the magnitude of an event, not its likelihood.

Since we’re so bad at estimating that magnitude, however, we end up ignoring small risks, like falling down the stairs, altogether, while treating big ones as certainties — if any plane were to crash, it must be ours.

The combination of these two biases explains most of our misplaced fear.

“We’re more afraid of public speaking than texting on the highway, more afraid of approaching an attractive stranger in a bar than marrying the wrong person, more afraid of not being able to afford the same lifestyle as our friends than spending 50 years in a meaningless career — all because embarrassment, rejection, and not fitting in really sucked for hunters and gatherers.”  — Tim Urban, Wait But Why

When we look at the people we consider bold risk-takers, the great entrepreneurs, investors and artists of our time, most of them just turn out to have an accurate understanding of risk and probability.

It’s what allows Warren Buffett to buy when everyone’s panicking and sell when others fall for the hype.

“We simply attempt to be fearful when others are greedy and to be greedy only when others are fearful.”  —  Warren Buffett

Social: The Bandwagon Effect

In the Iraq War, a U.S. army major managed to prevent riots by keeping food vendors away from large squares and social gatherings. This way, there was no fuel for people’s undirected anger, and they went home rather than turning into a mob.

The forces at play here are herd behavior and groupthink: a large group takes action without explicitly agreeing on a direction, and everyone joins in so as not to conflict with the group. The bandwagon effect is a specific, everyday-life version of it. It’s why we believe and do things solely because many others do.

A classic example is when you have to choose between two restaurants and go with the one that’s more crowded, because hey, it must be good, right? But if everyone before you went by the same logic, inevitably the first guests chose at random between two empty restaurants. Similarly, you’re more inclined to like a tweet that already has 1,000 likes. On the internet, it’s extra hard to think for yourself.

“Whenever you find yourself on the side of the majority, it is time to pause and reflect.”  —  Mark Twain

Memory: The Spotlight Effect

The spotlight effect is a social bias that manifests itself in our memory. It’s the belief we hold that everyone is watching our every move, all the time.

The reason is simple: we are the center of our universe. We live in our own heads, 24/7. Therefore, it’s natural we overestimate our role in everyone else’s life too. But you’re not the only one who can’t imagine the world without you — everyone else is just as focused on themselves, which means they don’t really have the time to, well, watch you.

This imagined spotlight that puts us center stage is turned on in high school, when all we care about is who did what with whom at what time. Inevitably, it spills over into adulthood and leaves us too cautious to publish that honest blog post, say what we think or try something unusual.


We’ve learned about seven cognitive biases so far. Imagine not just one, but all of them are influencing your thinking right now — because that’s exactly what’s happening. That’s the environment we’re supposed to make decisions in.

So what do most of us default to? Right. More of the same.

Decision-Making: Irrational Escalation

Success often hinges on doing things differently — you know, in our own way. It doesn’t guarantee we’ll land a hit, but it improves the odds. Sadly, that’s exactly what our mental biases hold us back from.

They lead to what behavioral scientists call escalation of commitment. We continue down the same path, even if it’s an irrational one. To stay safe, we do more of the same. What’s always been done.

This irrational escalation happens in several ways and it destroys our growth.

Loss Aversion

When Nobel Prize winner Daniel Kahneman handed people mugs and told them they were worth $5, he found that, despite knowing the value, nobody was willing to sell the mug at that price. This is known as the endowment effect. We value goods more simply because we own them.

This leads to loss aversion. As soon as we have something, we have something to lose — and losing hurts up to twice as much as winning makes us happy.

So we spend most of our days preserving what we have instead of going for what else we want.
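A quick worked example makes the asymmetry visible. The two-to-one loss weight below is just the rough factor mentioned above; the exact ratio varies from study to study and person to person.

# Toy illustration of loss aversion, assuming losses weigh ~2x as much as gains.
LOSS_WEIGHT = 2.0

def felt_value(outcome):
    """Subjective value of an outcome: losses loom larger than equal gains."""
    return outcome if outcome >= 0 else LOSS_WEIGHT * outcome

# A fair coin flip: win $100 or lose $100.
objective_value = 0.5 * 100 + 0.5 * (-100)                         # $0 on paper
subjective_value = 0.5 * felt_value(100) + 0.5 * felt_value(-100)  # feels like -$50

print(objective_value, subjective_value)  # 0.0 -50.0: a "fair" bet feels like a loss

On paper, the bet is neutral; felt through that lens, it reads as a guaranteed loss, which is exactly why defending what we already own feels so much safer than reaching for more.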

Sunk Cost Fallacy

Funnily enough, while we’re trying hard to avoid losses, when we’re losing, the sunk cost fallacy makes sure we lose big. When a path of action becomes irrational, we continue on it solely to be consistent with our previous actions.

How many times have you left the theatre when the movie was bad? Do you go to events you paid for, even if you don’t feel like going the day of? When you’ve invested time or money into something that doesn’t work out, it’s hard to face that failure, man up and move on.

But wasting more time and money, just to avoid that realization, is much costlier in the long run. If we thought of our commitments as buying ourselves options, not obligations, we would remain free to make the best decision — no matter the sunk cost.

Parkinson’s Law of Triviality

You might be familiar with Parkinson’s Law: “work expands so as to fill the time available for its completion.” Coined by the same Parkinson is the Law of Triviality, sometimes also referred to as bike-shedding: In an effort to avoid the cognitive discomfort that stems from dealing with all the above and solving complex problems, we spend disproportionate amounts of time on trivial issues.

When you start a blog, designing the logo, choosing the colors, optimizing your menu and link structure all seem really important. It’s easy to get lost in those details for weeks when really, all you had to do was write.

There’s Something on Your Windshield

It’d be nice if we had to deal with just one cognitive bias at a time. We’d open our cognitive bias playbook, flip to page 19 and take the specific steps needed to handle the culprit. But that’s not how it works.

There are dozens of cognitive glitches, working against us every second of every day. Me, while I’m writing this. You, while you’re reading this. Almost 200 of them are listed on Wikipedia. And those are just the ones we’ve identified so far.

While they’re so omnipresent they’re just a part of life, you can think of them like raindrops on your windshield. A few speckles here and there won’t completely cloud your vision, but if they fill every inch, you might as well drive in the dark.

Since there are way too many to fight each one explicitly, we need one tool to deal with at least a decent bunch of them. A bias against biases, if you will. To the best of our knowledge, that bias is awareness.

It’s not the perfect wiper, but at least you’ll see whether you’re driving on the right road.

The Solution: Your Stress Response

Most of our mental biases date back to a time when quick decisions determined our survival. The tool we can use to fight them is just as old.

Even today, our initial reaction to most stressors is to treat them like potential death threats. You know, just to be safe. The reaction that plays out is called the fight-or-flight response. Our body releases a cocktail of adrenaline and cortisol, which increases our heart rate, dilates our pupils, and triggers tunnel vision. But hidden in this physical power stance lies our golden arrow.

In his book, What Every Body Is Saying, ex-FBI agent and body language expert Joe Navarro observes a third component of our stress response: the freeze reaction. Neither fight nor flight is a viable option in school or at the office, so we default to freezing in place first, like our ancestors did when a T-Rex walked by.

“One purpose of the freeze response is to avoid detection by dangerous predators or in dangerous situations. A second purpose is to give the threatened individual the opportunity to assess the situation and determine the best course of action to take.”

This second purpose is our holy grail. Our chance to ask: “What’s really going on? Is my brain tricking me here?”

Thanks to our ancestors, the basics of the freeze response still remain intact, but it takes a more conscious effort to make it our go-to reaction. Due to all our biases, an internal conflict arises with each external threat. Our goal must be to use the break we catch with the freeze response to shift our attention to what’s going on inside.

In her book, The Willpower Instinct, Stanford psychologist Kelly McGonigal has dubbed this better version of our stress reaction the pause-and-plan response.

“The pause-and-plan response puts your body into a calmer state, but not too sedate. The goal is not to paralyze you in the face of internal conflict, but to give you freedom. By keeping you from immediately following your impulses, the pause-and-plan response gives you the time for more flexible, thoughtful action.”

The moment you acknowledge a mental bias, it loses its power. Thankfully, McGonigal also shares what that moment looks like.

“The pause-and-plan response drives you in the opposite direction of the fight-or-flight response. Instead of speeding up, your heart slows down, and your blood pressure stays normal. Instead of hyperventilating like a madman, you take a deep breath. Instead of tensing muscles to prime them for action, your body relaxes a little.”

Breathing. You’re doing it right now. But taking a deep breath? Please, do it right now. It’s amazing how shallow our most important survival mechanism becomes without us even noticing. Breaking that pattern is our escape from the grasp of doxa. Our fresh start with a clean slate.

Every decision is better after a single, deep breath.

More breathing, more thinking. Deeper breathing, deeper thinking.

“The heaviest penalty for declining to rule is to be ruled by someone inferior to yourself.”  —  Plato

Plato was referring to politics with this quote, but it extends to all of life, really. The first place we must rule, then, is our own mind. The goal isn’t to think perfectly. It’s to not let others do the thinking for you.

Among the long list of mental biases, there is even one describing our tendency to think of ourselves as less biased than we actually are. It’s called the bias blind spot.

If nothing else, I hope it’s now a smaller raindrop on your windshield.


Sources

[1] Philosophy — Plato

[2] Cognitive Bias Cheat Sheet

[3] Seth Godin’s Exclusive Linchpin Keynote

[4] The Cook and the Chef: Musk’s Secret Sauce

[5] 14 Warnings From Trust Me, I’m Lying

[6] Why You’ll Soon Be Playing Mega Trillions

[7] The Power of Habit Summary

[8] The Spotlight Effect: Why No One Cares About That Thing You Did

[9] What Every Body Is Saying Summary

[10] The Willpower Instinct Summary

The 3 Best Study Hacks for College

While getting my Bachelor’s degree, I tried every mode of studying you can imagine. Go to all the classes, go to some classes, go to no classes. Self-study, group study, teaching, being taught, you name it, I’ve tried it.

All I ever got was Bs.

(Our grade scale goes from 1–4, 1.0 being the best)

So when I decided to go back to school, I thought: why stress myself? I’ve been hacking college since the day I got here.


1. Hacking classes.

In Germany, most classes aren’t mandatory. Since all we have is one final exam for most subjects, you can stay home all year, study for yourself and then ace the class.

Here in Munich, most classes are even recorded to watch at your own leisure, yet most of my fellow students still go for one reason: they’re lazy and they feel bad if they don’t.

Last semester, many of them went to all the lectures, did not pay attention, watched the replays, did not pay attention again, and then tried to study the slides.

What I did was go to every class once, check whether the professor did anything more than read off the slides (most didn’t), and then summarize the slides myself instead.

For every single slide, I wrote down what it meant in one sentence. This way, I’d end up with 6–12 dense pages of notes for each class. All I had to do then was study them.

Yeah, yeah, my handwriting sucks, we’ve been over this.

The goal of summarizing is to reduce the amount of information your brain has to hold.

You’ll do a lot better knowing 80% of the material in detail than having a rough idea of 100% of it but not really knowing what you’re talking about.

When I was all done with my summary, I would try to create a tree structure of the material on one or two pages, so I could have the entire class on one piece of paper.

(doesn’t have to be fancy, as long as it works for you)

Bonus tip:

Minimize the number of classes you take by going for those with the highest credits on average.

In my program, 6 credits per class is solid. 3 aren’t worth your time, 5 fall one credit short when adding up to modules (you need 12, 18, 24, etc.), and 8 are usually a ridiculous amount of extra work.

2. Hacking exams.

Everyone I know struggles with studying for several exams in parallel. So whenever you have three in a week, shit hits the fan. You spend way too much time studying for the first and are left with only the time between exams 1 and 2 to study for the second, and so on.

So the first thing I did was pick classes whose exam dates were spread far apart.

Only two of my exams fell in one week, and those classes were mandatory. The earlier in the semester an exam falls, the better. Classes started in October; my first exam was in December. This not only meant it was far away from all the others, but also that there was less material to study.

My first exam.

The second thing I did was to improve my exam schedule as I went along. That December exam I only found out about in November, so I adjusted.

Same thing with a required law class. It was scheduled right between the two mandatory exams, but then the professor opened another slot for it three weeks earlier.

Was it a hassle to study the material in one week rather than three? Sure, but this way, I probably spent more time focused on law than I would have if I’d had to study in parallel.

(that is one big ass law book)

The best thing you can achieve when structuring your exams is peace of mind as you move towards them.

Every minute you spend in a hasty state of worry is a minute of studying lost, so optimize your schedule as best as you can.

3. Hacking assignments.

In one statistics class, we were eligible to get an additional 20% of the exam’s points as a bonus for completing a report. Had I known this would turn into a 50-page paper about energy drink consumption, I probably wouldn’t have done it, but oh well.

(You can download the paper here, if you’re interested)

We started from scratch and went all the way from designing our own questionnaire to surveying a sample of people to analyzing the data with SPSS.

However, nowhere does it say you have to do assignments like this the hardest way possible.

  • Instead of designing our survey in Word, we used Google Forms to make collecting data easier.
  • Instead of annoying 10 of our fellow students to complete the thing, I sent it to my email list and we collected 100 answers in 24 hours. You could also use a service like Pollfish and just pay for people to fill out your survey.
  • Instead of formatting the 2,000 data points in Excel to let us import them to SPSS, I hired someone to do it for $20 on Freelancer.com.

You might think outsourcing work as a student is ridiculous, but consider this:

Would you pay $10 or $20 for 3–4 hours of focused study time?

That’s not including the stress of fretting about the tasks and delays you encounter. Sometimes, your time really is worth more than the return on a menial task. Even if you’re a student.


Of course, there is one big disclaimer to all the above: none of these hacks work if you don’t.

Ultimately, I put in just as much time studying as I did during my Bachelor’s, if not more. But thanks to these hacks, it was a lot more fun, because I could focus on the parts that mattered.

And I did it all while writing articles like this one, every single day. If I can find the time, why not you?