
How Modern Non-Fiction Books Waste Your Time (and Why You Should Read Them Anyway)

When I first discovered non-fiction books, I thought they were the best thing since sliced bread. Whatever problem you could possibly have, there’s a book out there to help you solve it. I had a lot of challenges at the time, and so I started devouring lots of books.

I read books about money, productivity, and choosing a career. Then, I read books about marketing, creativity, and entrepreneurship. I read and read and read, and, eventually, I realized I had forgotten to implement any of the advice! The only habit I had built was reading, and as wonderful as that was, it left me with nothing but information overload.

After that phase, I flipped to the other, equally extreme end of the spectrum: I read almost no books, got all my insights from summaries, and only tried to learn what I needed to improve a given situation at any time.

So, do self-help books work? As always, the truth lies somewhere in the middle.


How To Not Be Gullible

In 1997, 14-year-old Nathan Zohner used the science fair to alert his fellow citizens to a deadly, dangerous chemical.

In his report Dihydrogen Monoxide: The Unrecognized Killer, Nathan outlined all the alarming characteristics of the colorless, odorless, tasteless compound — DHMO for short — which kills thousands of Americans each year:

  • DHMO can cause severe burns in both its gaseous and solid forms.
  • It is a major component of acid rain and often found in removed tumors of cancer patients.
  • DHMO accelerates corrosion of both natural elements and many metals.
  • Ingesting too much DHMO leads to excessive sweating and urination.
  • For everyone with a dependency on DHMO, withdrawal leads to death.

After giving his presentation, Nathan asked 50 fellow students what should be done. 43 — a staggering 86% — voted to ban DHMO from school grounds.

There was only one problem: Dihydrogen monoxide is water.


Every day, people use facts to deceive you because you let them.

Life is hard. We all get fooled six ways to Sunday. People lie to us, we miscommunicate, and it’s impossible to always read other people’s feelings correctly. But facts? If we let facts deceive us, that’s on us.

When it’s hard to be right, there is nothing wrong with being wrong. But when it only takes a few minutes or even seconds to verify, learn, and educate yourself, choosing to stay ignorant is really just that: A decision — and likely one for which you’ll get the bill sooner rather than later.

If you know a little Latin or Greek, or simply paid attention in chemistry class, the term “dihydrogen monoxide” is easy to deconstruct. “Di” means “two,” hydrogen is an element (H on the periodic table), “mono” means “one,” and “oxide” means oxidized: an oxygen atom (O on the periodic table) has been added. Two hydrogens, once oxidized. Two Hs, one O. H2O. Water.

When Nathan ran his experiment “How Gullible Are We?” in 1997, people didn’t have smartphones. They did, however, go to chemistry class. Some of Nathan’s classmates had parents working in science, and they all had chemistry books. They could even have asked their teacher: “What’s dihydrogen monoxide?” But none of them did.

In his final report, Nathan wrote he was shocked that so many of his friends were so easily fooled. “I don’t feel comfortable with the current level of understanding,” he said. James Glassman, who wrote about the incident in the Washington Post, even coined the term “Zohnerism” to describe the use of a true fact to mislead people.

Today, we have smartphones. We have a library larger than Alexandria’s in our pocket and finding any page from any book takes mere seconds. Yet, we still get “zohnered” on a daily basis. We allow ourselves to be.

“Too much sugar is bad for you. Don’t eat any sugar.” Yes, too much sugar is bad, but the corollary isn’t to stop eating it altogether. Carbohydrates are the body’s main source of energy, and they’re all broken down into various forms of sugar. It’s a vital component of a functioning metabolism. Plus, each body has its own nuances, so cutting out sugar without more research could actually be bad for you. But if I’m selling a no-sugar diet, who cares, right?

You care. You should. And that’s why it’s your job to verify such claims. It’s easy to spin something correct in a way that sends you in whatever direction the manipulator wants. The only solution is to work hard at not letting yourself be manipulated:

  • Say “I don’t know” when you don’t know. I know it’s hard, but it’s the most liberating phrase in the world. Whenever you’re out of your comfort zone, practice. “Actually, I don’t know, let me look it up.”
  • Admit that you don’t know to yourself. You’ll miss some chances to say “I don’t know.” That’s okay, you can still educate yourself in private later. Your awareness of your ignorance is as important as fighting it.
  • Google everything. When you’re not 100% sure what a word means, google it. When you want to know where a word comes from, google it. When you know you used to know but are hazy on the details, google it. Seriously. Googling takes ten seconds. Google everything.
  • Learn about your biases. Hundreds of cognitive biases affect our thinking and decisions every waking second. Learning about them and occasionally brushing up on that knowledge will go a long way.
  • When someone argues for one side of a conflict, research both. Whether it’s a story in the news, a political issue, or even the issue of where to get lunch, don’t let yourself get clobbered into one corner. Yes, McDonald’s is cheap. Yes, you like their fries. But what about Burger King? What do you like and not like about both of them?
  • When someone talks in absolutes, add a question mark to every sentence. James Altucher often does this with his own thoughts, but it’s equally helpful in questioning the authority of others. Don’t think in absolutes. Think in questions.

The dihydrogen monoxide hoax has been used many times to confront people with their own ignorance. A 1994 version created by Craig Jackson urges people to “act now” before ending on a truthful yet tongue-in-cheek note: “What you don’t know can hurt you and others throughout the world.”

Richard Feynman received the Nobel Prize in Physics, but he started his journey as a curious boy, just like Nathan Zohner. Like Einstein, he believed inquisitiveness could solve any problem, and so he always spoke in simple terms — to get people interested in science.

He also said the following, which still rings true today: “The first principle is that you must not fool yourself — and you are the easiest person to fool.”


Jordan Peterson’s 12 Rules for Life

If you make happiness the meaning of life, every time you’re not happy, you’ll feel like a failure.

If, however, you do what’s meaningful in every situation, even failure will have a purpose. Failing will still be painful, but your perspective will never feel “empty,” and you’ll always have reason to look forward to the future.

This is one of Jordan Peterson’s 12 Rules for Life, and, like all of them, it’s common sense that somehow still stabs you right in the heart. We’re great at ignoring common sense until someone hits us over the head with it. 

This is what Peterson does in his book, which many criticize for being too verbose. “He could’ve said that in a few paragraphs!” Well, he did. The book is based on a viral Quora answer Peterson wrote. But a post on a website does not hold the same power as a book full of stories. 

It’s true: Most self-help books are too long. But through their packaging, they can do a better job of spreading and delivering a message than any blog post ever can. 

Like his book, Peterson is a controversial figure. I’m not here to discuss his politics, his logic, or his views on our culture. I’m here to learn. I only have “a few paragraphs,” but this is how I interpret his 12 lessons.


Learn to Memorize 10 Items in 4 Minutes

If you could recall double the information, your life would be a lot easier.

You wouldn’t spend so much time googling, you’d only look at your shopping list half as much, and you’d have twice as many chances to deliver a great idea in a meeting.

With over 30,000 citations, The Magical Number Seven, Plus or Minus Two is one of the most referenced papers in history. Published in 1956 by Harvard psychologist George Miller, it asserts humans can store seven objects in their short-term memory on average, with variations ranging from five to nine.

Since then, researchers have found that memory span is not constant, that capacity depends on the nature of chunks (are they numbers, letters, words?), and that sound also plays a role in how much we can retain.

What hasn’t changed is that most people would likely struggle if you asked them to quickly remember a list of ten objects. You can test yourself online, but surprisingly, we had a solution to this before we even knew the problem.

In 1918, memory artist David Roth published his book Roth Memory Course. He called it “a simple and scientific method of improving the memory and increasing mental power.” The book is packed with examples and exercises, but one of the most powerful is the very first: combination pictures.


4 Ways To Not Write an Introduction

The number one thing stopping people from reading your article is its title. If they don’t open the wrapper, all the effort you put into the gift is lost.

The number two thing is your introduction. In the cruel obstacle course of “hurdles to jump over so people will read my writing,” the introduction is one of the most neglected, most easily dismissed elements.

In turn, a lot of articles are dismissed by readers, leaving authors scratching their heads, wondering what they did wrong. “My title scored 78 in the analyzer! I picked a relevant image! Why aren’t people reading?!”

They’re not reading because you wasted their time. You just waited until the intro to do it, and that made them even angrier than a bad title would have. Now they’ve clicked for nothing. They unwrapped the gift, and it sucked.

It’s easy to let clutter sneak into your introductions. It happens to all writers, and we don’t always catch it before it’s too late.

Most of the time, bad introductions are the result of laziness rather than lack of skill or imagination. The mistake would have been easy to spot, if only we’d made the time to look for it. We chose to ignore it. We were in a hurry. So we tossed our bird out the window, hoping it could fly with a broken wing.

Sometimes, a miracle happens. But if we don’t want to spend our lives scraping dead birds off the street, we better learn to respect our readers’ time.

Below are four instructive examples of how not to start an introduction.

These before-and-afters show how we try to fight clutter at Better Marketing. All authors agreed to be listed as examples in advance. We’re grateful they write for us, and we hope this study will serve them and others well.

Let’s do away with bad introductions.


To Stay Creative, Remember to Breathe

“I sometimes disappear for weeks or even months at a time. When I do this, I’m not abandoning my work or being lazy. I’m just trying to breathe.”

So writes Matthew Inman, creator of the web comic The Oatmeal, in a post titled Creativity is like breathing. To explain the analogy, Inman writes: “When you make stuff, you’re exhaling. But you can’t exhale forever. Eventually, you have to breathe in. Or you’ll be dead.”


Read Books in Parallel, But Only When You Need Advice

Right now, I’m reading not one, not two, but five books. That sounds like the beginning of a how-to-read-200-books-a-year article, but I’m here to tell you the opposite — sort of.

Forgive me for quoting myself, but, three years ago, I wrote:

Being book-smart just for the sake of being book-smart is a vanity metric for your ego. Don’t learn solely for the sake of learning. Be a practitioner. Use the information you consume. Ironically, learning things right when you need them will also help you remember them better.

I still find this to be true. When I’m about to go on a date, I might read a chapter on small talk and first impressions. Then, I try to use some of the questions and tips and later reflect on what I’ve learned. Connecting the information from the book to a real-world experience creates a much stronger, long-lasting memory than simply repeating a list of tips to myself, hoping they sink in.

In his tutorial for reading 200 books a year, Charles Chu shared some uncomfortable truths:

Here’s the simple truth behind reading a lot of books: It’s not that hard. We have all the time we need. The scary part — the part we all ignore — is that we are too addicted, too weak, and too distracted to do what we all know is important.

Chu is right when he says we could easily find the 417 hours we need to read that many books if we gave up some of the 3400 hours we spend on social media and TV. Where he and I disagree is that I don’t think reading 200 books a year is important. I think what’s important is that you live a healthy, happy, and meaningful life.
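Chu’s 417-hour figure is easy to reconstruct. Here’s a back-of-the-envelope sketch, assuming roughly 50,000 words per book and a reading speed of 400 words per minute (ballpark assumptions for illustration, not measured values):

```python
# Rough reconstruction of the "417 hours for 200 books" figure.
# Assumed inputs: ~50,000 words per book, ~400 words per minute.
BOOKS_PER_YEAR = 200
WORDS_PER_BOOK = 50_000    # assumption: length of a typical non-fiction book
WORDS_PER_MINUTE = 400     # assumption: a brisk adult reading speed

total_words = BOOKS_PER_YEAR * WORDS_PER_BOOK       # 10,000,000 words
total_hours = total_words / WORDS_PER_MINUTE / 60   # minutes -> hours

print(round(total_hours))  # 417
```

Under those assumptions, the math checks out: about 417 hours a year, or just over an hour a day.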

Of course, books can be a great aid on this quest. But — and I think this truth is equally uncomfortable — no one needs the advice from 200 books a year to live a balanced life. If you want to read 200 novels and passively soak up whatever feels relevant at the time, be my guest. But don’t kid yourself: You don’t have time to implement 60,000 pages of advice.

The solution to whatever you’re struggling with right now likely fits on one — and that’s enough to chew on for the time being.


7 Daily Opportunities To Be Mindful as a Non-Meditator

Dan Harris is as American as it gets. He’s outgoing, confident, and calls a spade a spade. He’s been an anchor for ABC News for the past 20 years, informing his fellow citizens on what matters. He has reported from war zones, interviewed drug lords, and co-hosts Good Morning America on weekends.

In short, Dan Harris represents a life most Americans aspire to live: be a strong voice, follow your ambition, but keep your feet on the ground and your heart in the right place. But, like all of us, Dan Harris is human.

For the first few years at ABC, he was a workaholic. Feeling he didn’t deserve his dream job in his 20s, he overcompensated. He fell into depression and turned to recreational drugs. All of this culminated in one incident: In 2004, Dan had a panic attack, live and on the air, in front of five million people.

How does the saying go? It’s all fun and games until someone loses an eye.


What Is the Future of Learning?

“A wise man can learn more from a foolish question than a fool can learn from a wise answer.” 

Bruce Lee

In the past four years, I have asked a lot of foolish questions:

Can I be a professional translator without any credentials?

If I want to be a published writer, should I still ghostwrite for money?

Do summaries of existing book summaries make any sense?

The seemingly obvious answer to them all is “no,” yet I did all those things anyway. And while some led nowhere, others now pay my bills. Often, the only way to get satisfying answers is to try, especially with foolish questions. The beauty of daring to ask them, rather than accepting the answers society gives you, is that you’ll have many more unexpected insights along the way.

Insights like this one: today, the answers are always less valuable than the questions.

The Half-Life of Knowledge

In 2013, we created as much data as in all of previous history combined. That trend now continues, with total information roughly doubling each year. Michael Simmons has crunched the numbers behind our knowledge economy:

You probably need to devote at least five hours a week to learning just to keep up with your current field—ideally more if you want to get ahead.

Bachelor’s degrees in most European countries consist of 180 credits (EU schools tend to use a quarter credit system as opposed to the semester hour system typical in the U.S.), and each of those credits is worth about 30 hours of studying time. That’s 5,400 hours. Sadly, what you learn in those hours starts decaying as soon as you’ve put in the time. Scientists call this “the half-life of knowledge,” and that half-life is shrinking fast.


Since new information is now generated more and more rapidly, it takes less time for said information to lose its value. Back in the 1960s, an engineering degree was outdated within 10 years. Today, most fields have a half-life much less than that, especially new industries. A modern degree might last you just five years before it’s completely irrelevant. Even with a conservative half-life estimate of 10 years (losing about 5 percent each year), you’d have to put in 270 hours per annum just to maintain those initial 5,400—or about five hours per week.
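The upkeep math above is simple enough to verify. A minimal sketch of the linear approximation the text uses (the 10-year half-life and the resulting ~5 percent yearly loss are the text’s estimates, not measured values):

```python
# Linear approximation of knowledge decay, using the text's own numbers.
DEGREE_HOURS = 180 * 30    # 180 credits x ~30 study hours each = 5,400 hours
YEARLY_LOSS_RATE = 0.05    # ~5% per year, i.e. half the degree gone in ~10 years

hours_lost_per_year = DEGREE_HOURS * YEARLY_LOSS_RATE  # upkeep needed per year
hours_per_week = hours_lost_per_year / 52              # spread across the year

print(round(hours_lost_per_year))   # 270
print(round(hours_per_week, 1))     # 5.2
```

That’s where the “five hours a week just to stand still” figure comes from; a true exponential decay at a 10-year half-life would put the yearly loss slightly higher, closer to 7 percent.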

As a side effect of this global, long-lasting trend, both the time we spend attaining formal education and the number of people choosing this path have increased dramatically for decades. Years of schooling have more than doubled in the past 100 years, and in many countries, it’s common to study for some 20-plus years before even entering the workforce. In the U.S. alone, college enrollment rates have peaked at over 90 percent of the college-age population.

The larger our ocean of information, the less valuable each fact in it becomes. Therefore, the knowledge bundles for college degrees must get bigger and, thus, take longer to absorb. But the ocean also grows faster, which means despite getting bigger, the bundles don’t last as long. It takes a lot of time to even stay up to date, let alone get ahead of the increasing competition.

Instead of flailing harder to keep from drowning, maybe we should get out of the water.

A Scary Future to Imagine

While it’s important to dedicate time to learning, spending ever-increasing hours soaking up facts can’t be the final answer to this dilemma. Extrapolate the global scramble for knowledge, and we’d end up with 50-year-old “young professionals,” who’d retire two years into their careers because they can’t keep up. It’s a scary future to imagine but, luckily, also one that’s unlikely.

I saw two videos this week. One showed an unlucky forklift driver bumping into a shelf, causing an entire warehouse to collapse. In the other, an armada of autonomous robots sorted packages with ease. It’s not a knowledge-based example, but it goes to show that robots can do some things better than people can.

There is no expert consensus on whether A.I., robotics, and automation will create more jobs than they’ll destroy. But we’ll try to hand over everything that’s either tedious or outright impossible. One day, this may well include highly specialized, knowledge-based jobs that currently require degrees.


A lawyer in 2050 could still be called a lawyer, but they might not do anything a 2018 lawyer does. The thought alone raises yet another foolish question:

When knowledge itself has diminishing returns, what do we need to know?

The Case for Selective Intelligence

With the quantity of information setting new all-time highs each year, the future is, above all, unknown. Whatever skills will allow us to navigate this uncertainty are bound to be valuable. Yuval Noah Harari’s new book asserts this:

In such a world, the last thing a teacher needs to give her pupils is more information. They already have far too much of it. Instead, people need the ability to make sense of information, to tell the difference between what is important and what is unimportant, and above all, to combine many bits of information into a broad picture of the world.

The ability Harari is talking about is the skill of learning itself. The 2018 lawyer needs knowledge. The 2050 lawyer needs intelligence. Determining what to know at any time will matter more than the hard facts you’ll end up knowing. When entire industries rise and fall within a few decades, learning will no longer be a means but must become its own end. We need to adapt forever.

Knowledge is cumulative. Intelligence is selective. It’s a matter of efficiency versus effectiveness. Both can be trained, but we must train the right one. Right now, it’s not yet obvious which one to choose. The world still runs on specialists, and most of today’s knowledge-accumulators can expect to have good careers.

But with each passing day, intelligence slowly displaces knowledge.

The Problem With Too Many Interests

Emilie Wapnick has one of the most popular TED talks to date—likely because she offers some much-needed comfort for people suffering from a common career problem: having too many interests. Wapnick says it’s not a problem at all. It’s a strength. She coined the term “multipotentialite” to show that it’s not the people affected but public perception that must change:

Idea synthesis, rapid learning, and adaptability: three skills that multipotentialites are very adept at and three skills they might lose if pressured to narrow their focus. As a society, we have a vested interest in encouraging multipotentialites to be themselves. We have a lot of complex, multidimensional problems in the world right now, and we need creative, out-of-the-box thinkers to tackle them.

While there’s more to it, it’s hard to deny the point. After all, some of these thinkers work on some of our biggest problems. And we love them for it.

Jeff Bezos built a retail empire and became the richest man in the world, but he also helped save an important media institution and works on the infrastructure we need to explore space. Elon Musk first changed how we pay and then how we think of electric cars, and now how we’ll approach getting to Mars. Bill Gates really knows software, but now he’s working to eradicate malaria and polio. The list goes on.

The term “polymath” feels too closely tied to “genius,” but whether you call them Renaissance people, scanners, or expert-generalists, the ability they share stays the same: They know how to learn, and they relentlessly apply this skill to a broad variety of topics. In analyzing them, Zat Rana finds this:

Learning itself is a skill, and when you exercise that skill across domains, you get specialized as a learner in a way that someone who goes deep doesn’t. You learn how to learn by continuously challenging yourself to grasp concepts of a broad variety. This ironically then allows you to specialize in something else faster if you so choose. This is an incredibly valuable advantage.

Beyond learning faster, you’ll also innovate more, stay flexible, stand out from specialists, and focus on extracting principles over remembering facts.

To me, that sounds exactly like the person an unpredictable world needs.

A Curious Boy

In 1925, one year before he entered school, Isaac Asimov taught himself to read. His father, uneducated and thus unable to guide his son’s learning, gave him a library card. Without any direction, the curious boy read everything:

All this incredibly miscellaneous reading, the result of lack of guidance, left its indelible mark. My interest was aroused in twenty different directions and all those interests remained. I have written books on mythology, on the Bible, on Shakespeare, on history, on science, and so on.

“And so on” led to some 500 books and about 90,000 letters Asimov wrote or edited. Years later, when his father looked through one of them, he asked:

“How did you learn all this, Isaac?”

“From you, Pappa,” I said.

“From me? I don’t know any of this.”

“You didn’t have to, Pappa,” I said. “You valued learning and you taught me to value it. Once I learned to value it, the rest came without trouble.”

When we hear stories about modern expert-generalists, we assume their intelligence is the result of spending a lot of time studying multiple fields. While that’s certainly part of it, a mere shotgun approach to collecting widely diversified knowledge is not what gives great learners special abilities.

What allowed Asimov to benefit from his reading, much more so than what he read or how much, was that he always read with an open mind. Most of the time, we neglect this. It’s a fundamental misunderstanding of how we learn.

In order to build true intelligence, we first have to let go of what we know.

The Value of Integrative Complexity

Had Asimov learned to read in school, he likely would’ve done it the way most of us do: memorizing or critiquing things. It’s an extremely narrow dichotomy, but sadly, one that sticks. Rana offers thoughts about the true value of reading:

Anytime you read something with the mindset that you are there to extract what is right and what is wrong, you are by default limiting how much you can get out of a particular piece of writing. You’re boxing an experience that has many dimensions into just two.

Instead of cramming what they learn into their existing perspectives, people like Asimov know that the whole point is to find new ones. You’re not looking for confirmation; you’re looking for the right mental update at the right time.

With an attitude like that, you can read the same book forever and still get smarter each time. That’s what learning really is: a state of mind. More than the skill, it’s receptiveness that counts. If your mind is always open, you’re always learning. And if it’s closed, nothing has a real chance of sinking in.

Scientists call this “integrative complexity”: the willingness to accept multiple perspectives, hold them all in your head at once, and then integrate them into a bigger, more coherent picture. It’s a picture that keeps evolving and is never complete but is always ready to integrate new points and lose old ones.

That’s true intelligence, and that’s the prolific learner’s true advantage.

A Matter of Being

Your brain is like a muscle. At any moment, it’s either growing or deteriorating. You can never just keep it in the same state. So when you’re not exercising your mind, it will atrophy, not only stalling your progress but quickly reversing it.

This has always been the case, but the consequences today are more severe than ever. In an exponential knowledge economy, we can’t afford stale minds. Deliberately spending time on learning new things is one way to fight irrelevance, but it’s not what’ll protect us in the uncharted waters of the future.


Beyond being carriers of knowledge, we need to become fluid creatures of intelligence. Studying across multiple disciplines can start this process. It has many advantages—creativity, adaptability, speed—but it’s still not enough.

If we focus only on the activity of learning, we miss the most important part: Unless we’re willing to change our perspective, we won’t grasp a thing. It’s not a matter of doing but of being. The reason the wise man can learn from even the most foolish question is that he never assigns that label in the first place.

And so it matters not whether we learn from our own questions or the insights of others, nor how much of it we do, but that we always keep an open mind. The longer we can hold opposing ideas in our heads without rejecting them, the more granular the picture that ultimately forms. This is true intelligence. It’s always been valuable, but now it’s the inevitable future of learning.

Bruce Lee undoubtedly possessed this quality. By the time he died at just 32 years old, he was a world-renowned martial artist, the creator of an entire philosophy, and a multimillion-dollar Hollywood superstar. Long after his passing, one of his favorite stories still captures both the essence of his spirit and how he became the cultural icon we know and love today:

A learned man once went to visit a Zen teacher to inquire about Zen. As the Zen teacher talked, the learned man frequently interrupted to express his own opinion about this or that. Finally, the Zen teacher stopped talking and began to serve tea to the learned man. He poured the cup full, then kept pouring until the cup overflowed.

“Stop,” said the learned man. “The cup is full, no more can be poured in.”

“Like this cup, you are full of your own opinions,” replied the Zen teacher. “If you do not first empty your cup, how can you taste my cup of tea?”


3 Rules for Writing in the 21st Century

I love when movies get sequels after 10, 20, 30 years — like Blade Runner, Wall Street, or Tron. Because by the time they come out, as the soundtrack of the latter suggests, “the game has changed.”

Not just whatever game is played inside the movie, but the way movies are made, how society looks at them, and what’s needed for a story to break through and touch people’s hearts. How will the creators deal with all this?

If they handle it gracefully, they add to their legacy. If not, they risk staining a beautiful tombstone. But while movie-making sure has transformed a lot in the last three decades, I think there’s one game that’s changed even more in the past ten years: writing.

When it comes to words, nothing’s the way it used to be. Newspapers are printed on screens, not paper. Writers don’t write books. Reading is simpler than ever, yet it has never been harder to do. So how can we, the creators, keep up?

It’s a hard question and I don’t have even half the answers, but after sitting with it, I noticed I pay close attention to three things in particular when it comes to modern writing.

As a result, here are three rules for writing in the 21st century.
