Late Bloomers: The Power of Patience in a World Obsessed with Early Achievement
There were about thirty employees. I’d never managed anyone. Now I had to. And I did okay – much, much better than expected. Do you know what I learned? I learned the most important part of leadership is showing up. Could have fooled me. That wasn’t in the books I’d read. I sure didn’t learn it from my father. Turns out eagerness is infectious. I moved the CEO’s office to an open-glass conference room where everyone could see it and me. I made it a point to be the first there every day and the last to leave. I took employees to lunch every day and to dinner every night – at cheap diners but I gave them my time and interest. I wandered around endlessly talking to them, focusing on every single one and what they thought.
The effect amazed me. And them. That I cared made them care. Suddenly I felt what it was like to lead.
At the opposite end of the success spectrum is the early bloomer, the fast starter. Five-foot-one-inch Riley Weston was spectacular; at age nineteen, she landed a $300,000 contract with Touchstone, a division of Walt Disney, to write scripts for the television show Felicity – the coming-of-age story of a UCLA freshman. Weston’s fast start in major league television landed her on Entertainment Weekly’s list of Hollywood’s most creative people.
There was only one problem. Riley Weston was not nineteen. She was thirty-two, and her real identity, until this profitable ruse, was Kimberly Kramer from Poughkeepsie, New York. “People wouldn’t accept me if they knew I was thirty-two,” she said in her defense. She was probably right.
If we or our kids don’t knock our SATs out of the park, gain admittance to a top-ten university, reinvent an industry, or land our first job at a cool company that’s changing the world, we’ve somehow failed and are destined to be also-rans for the rest of our lives.
This message, I believe, creates a trickle-down societal madness for early achievement. This has led to very costly mistakes on the part of educators and parents in how we evaluate children, inflict pressure on them, and place senseless emotional psychological burdens on families.
Consider how, in high-pressure cities, some elite preschools play on the fears of affluent parents of three- and four-year-olds. The Atlanta International School offers a “full immersion second language program” – for three-year-olds. Just pony up $20,000 for a year’s tuition. But that’s a bargain compared to the fees at Columbia Grammar School in New York, which will set you back $37,000 a year. Your three- and four-year-olds will get a “rigorous academic curriculum” dished out in three libraries, six music rooms, and seven art studios. Writes Parenting magazine, “Columbia Grammar School’s program is all about preparing kids for their futures – attending prestigious colleges.”
Ah, the truth spills out. For what else would motivate parents to spend $40,000 to give their three-year-old a head start? According to these luxe preschools, there is one goal that justifies the cost: Get your toddler into a prestigious college fifteen years later. The message could hardly be more direct – or more threatening. If your kid doesn’t ultimately get into a “prestigious college,” his or her life will be needlessly hard.
The pressure doesn’t stop with gaining admission to a proper preschool. “I’m contacted by a lot of parents who are completely freaking out that their 14-year-old is not spending the summer productively,” Irena Smith, a former Stanford University admissions officer, told the Atlantic. Smith now runs a college admissions consultancy in Palo Alto, California, where clients typically spend $10,000 or more.
A recent sports story makes the point. In the 2018 Super Bowl, neither the Philadelphia Eagles nor the New England Patriots had many five-star recruits in their starting lineups. Translation: Only six of the forty-four starters were top-rated prospects in high school.
Now look at the quarterbacks. New England’s Tom Brady didn’t merit even a humble two or one ranking in high school. His ranking was NR – “no ranking.” The victorious Eagles quarterback, Nick Foles, winner of the 2018 Super Bowl’s most valuable player award, had a three ranking in high school. But for most of the season, Foles was actually the Eagles’ backup. He got to play only after starting quarterback Carson Wentz hurt his knee toward the end of the season. Wentz, like Brady, had an NR ranking in high school. No surprise: As a high school junior, Wentz wasn’t primarily a quarterback. His school’s football program listed him as a wide receiver.
With his lowly NR rank from high school, no major college football program had recruited Wentz. He went to North Dakota State, a small-college powerhouse. But while he was there, he grew to six-five and 230 pounds. Wentz literally blossomed in college, which is late by football standards. Now let’s ask ourselves. How many of us are potential Carson Wentzes in our own way? How many of us were tagged with “no ranking” in high school, or dismissed early in our careers, or are dismissed even now? What gifts and passions might we possess that haven’t yet been discovered but that could give us wings to fly?
For most people, paying to get an edge on standardized test scores is not only worth it – it’s necessary. As long as high-stakes tests remain an important aspect of competitive college admissions, there’ll be no shortage of people looking for an advantage.
We see the same pressure cooker for early measurable achievement outside academics. Consider sports. According to a recent Washington Post story, 70 percent of kids quit sports by age thirteen. Why? The kids have a ready explanation: “It’s not fun anymore.”
A kid in the United States is now fourteen times more likely to be on medication for ADD compared to a kid in the U.K. A kid in the United States is forty times more likely to be diagnosed and treated for bipolar disorder compared to a kid in Germany. A kid in the United States is ninety-three times more likely to be on medications like Risperdal and Zyprexa used to control behavior compared to a kid in Italy. So in this country – and really in no other – we now use medication as a first resort for any kid who’s not getting straight A’s or not sitting still in class. This is a uniquely American phenomenon, and it’s quite new.
Let’s stop and ask: Is the sacrificial expenditure of money, wrecked family dinners, and kids exhausted from organized activities producing better, more productive, or happier people? Is it helping people bloom? For the majority of kids, it’s doing the exact opposite. This pressure for early achievement has an unwitting dark side: It demoralizes young people. By forcing adolescents to practice like professionals, to strive for perfection, and to make life choices in their teens (or earlier), we’re actually harming them. We’re stunting their development, closing their pathways to discovery, and making them more fragile. Just when we should be encouraging kids to dream big, take risks, and learn from life’s inevitable failures, we’re teaching them to live in terror of making the slightest mistake. Forging kids into wunderkinds is making them brittle.
This topic is of particular importance to Stanford psychology professor Carol Dweck, author of the bestselling 2006 book Mindset: The New Psychology of Success. On a late summer day, I sat down with Dweck to discuss the changes she’s seen in her years of teaching college freshmen. “I think society is in a crisis,” she told me. “Kids seem more exhausted and brittle today. I’m getting much more fear of failure, fear of evaluation, than I’ve gotten before. I see it in a lot of kids; a desire to play it safe. They don’t want to get into a place of being judged, of having to produce.” And these are the kids who were admitted to Stanford – these are the early “winners” in life. The optimism of youth, it seems, has been warped into a crippling fear of failure.
These two Harvard revolutionaries, James Conant and Henry Chauncey, promoted the SAT as a weapon against the lazy aristocracy.
Between the 1950s and the 1990s, the SAT replaced membership in old-money aristocracy as America’s official gatekeeper of elite university admissions.
Jeffrey Arnett, a psychology professor at Clark University, is urging society to recognize what he calls “emerging adulthood” as a distinct life stage. Arnett believes that social and economic changes have caused the need for a new, distinct stage between the ages of eighteen and thirty. Among the cultural changes that have led to Arnett’s concept of emerging adulthood are the need for more education, the availability of fewer entry-level jobs, and less of a cultural rush to marry while young.
Arnett, who describes himself as a late bloomer, says that emerging adulthood is an important period for self-discovery. Emerging adults often explore identity, experience instability, and focus on themselves during this time. Exploration is part of adolescence too, but in the twenties it takes on new importance. The stakes are higher as people approach the age when their possibilities narrow and they need to make long-term commitments.
Arnett proposes, controversially, that prolonging adolescence actually has an advantage. If this sounds like coddling, rest assured; that’s not what he means. Rather, he argues for a super-adolescent period involving continued stimulation and increasing challenges. Maintaining brain plasticity by staying engaged in new, cognitively stimulating, and yes, highly demanding activities can actually be a boon, as opposed to falling into repetitive and more predictable jobs and internships that close the window of plasticity. In other words, delaying adulthood may actually be desirable. It can foster independent thinking and the acquisition of new skills. More than that, it can boost motivation and drive.
There is a compelling neurological rationale for taking a year or two off before, during, or after college. People who prolong adolescent brain plasticity for even a short time enjoy intellectual advantages over their more fixed counterparts in the work world. Studies have found that highly accomplished people enjoy a longer period during which new synapses continue to proliferate. The evidence is clear: Exposure to novelty and challenge while the brain’s frontal cortex is still plastic leads to greater long-term career success.
Andrew J. Martin, researching 338 students, discovered that young adults who take gap years tend to be less motivated than their peers before the gap year, much like Kyle DeNuccio. But after their gap year, most of them find new motivation. In Martin’s words: “They had higher performance outcomes, career choice formation, improved employability, and a variety of life skills… The gap year can be seen as an educational process in which skills and critical reflection contribute to an individual’s development.”
A late-forties peak period of innovation is supported by the average age of U.S. patent applicants, which is forty-seven.
Amazingly, there are twice as many entrepreneurs over fifty as there are under twenty-five.
The idea that one’s forties are a peak age of entrepreneurship is supported by the work of the twentieth-century developmental psychologist Erik Erikson. Erikson believed that ages forty to sixty-four constitute a unique period where creativity and experience combine with a universal longing to make our lives matter. Starting a company is how many people pursue what Erikson called “generativity,” building something that has the potential to make a positive contribution beyond our mortal lives.
We need ways to start a career later, to have more flexibility in mid-career, and to taper off gently at our own pace near the end of our careers.
For the sake of argument, let’s say that people in industry X peak in their forties or fifties. By peak, I mean the acme of a person’s technical skills, team building and managerial skills, and productivity and communications skills, along with their willingness to work long hours, hop on airplanes for a week of sales meetings, and so on.
The traditional up-and-out career path would dictate: You’re gone after fifty-five. We, your employers, can’t afford to keep you, a fifty-five-plus employee, on the payroll. A kinder career arc would acknowledge that nearly all employees peak at some point, but even “past peak” senior employees can make valuable contributions. So why not fashion a career path in which, at some point, the pay raises stop and salaries may even decline, and the titles stop accruing, evolving from “group vice president” to “senior consultant”?
Such a career arc includes no forced retirement age. Why should a sixty-five-year-old or a seventy-two-year-old not work if they want to and their employer finds their contribution to be valuable, at the right level of pay? (Note to CEOs: If your human resources and legal departments can’t figure this out, replace them with ones that can.)
Another good reason for a career arc replacing up-and-out is age diversity. A valuable older employee on the arc’s downslope has no need to defend his or her turf. They are now free to offer counterintuitive advice or words of caution: “This is a brilliant idea, but let’s be sure to identify your base assumptions about sales, and talk them through, so we don’t make any costly missteps.” That’s a different conversation from one that begins with an older employee defending his or her turf. The worst thing a company can do is kill off the creative energy of its young and talented people. The second-worst thing is to allow young people to blindly walk into avoidable traps that a wise senior employee can help them foresee.
So let’s pivot to the bountiful world of late bloomer strengths, which don’t get the attention they deserve.
Curiosity is the first late bloomer strength.
People should consider it a moral obligation to be curious about things. Not being curious is not only intellectually lazy, but it shows a willful contempt for the facts.
The London-based science journal the Cube writes that “curiosity is a cognitive process which leads to the behavior perceived as motivation. From the human perspective the relationship between curiosity and motivation creates a feedback; the more curious one becomes about something, the more motivated one will be, and the more motivated one is the more one learns and the more curious one will become.” Curiosity is a dopamine hit, says the Cube.
Compassion is the second late bloomer strength, the ability to put ourselves in others’ shoes and in doing so understand their challenges and how best to help them.
Shimul Melwani, of the University of North Carolina, has found that compassionate managers and executives are perceived as better leaders. Compassionate leaders appear to be stronger, have greater levels of engagement, and have more people willing to follow them.
The third strength that late bloomers tend to have in spades is resilience. As defined in Psychology Today, “resilience is that ineffable quality that allows some people to be knocked down by life and come back stronger than ever.”
Resilience is not a passive quality but “an ongoing process of responding to adversity with concerted action.”
Reframing adversity in the life stories we tell ourselves is another key strategy that people tend to learn over time. A Harvard study showed that students who acknowledged the adversity they faced and reframed their challenges as growth opportunities performed better and kept their physical stress levels lower than students who were trained to ignore their adversity.
Their stories illustrate another late bloomer strength. The best descriptor I can think of is equanimity. Equanimity means a “mental calmness, composure, and evenness of temper, especially in a difficult situation.”
UCLA and Stanford psychologists Cassie Mogilner, Sepandar Kamvar, and Jennifer Aaker report that excitement and elation are emotions that move the happiness needle for younger people, while peacefulness, calm, and relief drive it for those who are older.
Research has long established that calm leaders are more effective.
His great strength – one that is available to late bloomers in particular – was insight.
Our insights are the result of us drawing on our full mental library of experience, patterns, and context, yielding an idea of extraordinary value.
As we age, we collect and store information. That, and not a “fuzzy memory,” is part of the reason it takes us longer to recall certain facts. We simply have more things to remember. Older people have vastly more information in their brains than young people do, so retrieving it naturally takes longer.
That’s right. Just quit.
I ended the previous chapter by asking, “How can the curious and creative, the searchers and explorers, jump off the dominant culture’s conveyor belt and begin shaping our own fates?” We do it by quitting. Quit the path we’re on. Quit the lousy job. Quit the class we hate. Quit the friends and associates who hurt us more than help. Quit the life we regret.
Was Daniel J. Brown a quitter in the normal sense? Did he lack courage? Did he lack ambition? Should he have stuck it out in high school, miserable and ostracized and perhaps doomed to a mental breakdown? I would argue that quitting was the best possible option for young Dan Brown. By saying no to the expectations of others, including his parents, he set his life on a much healthier track.
Despite our cultural enthusiasm for determination, there are situations in which perseverance is actually maladaptive. Research points to three awkward truths about our determination not to quit: (1) tenacity, or willpower, is a limited resource; (2) quitting can be healthy; and (3) quitting, not doggedness, often produces better results.
The first problem with our cultural obsession with determination is that applying single-minded resolve to something that you don’t really believe in actually makes you less effective. Tenacity misapplied erodes our ability to summon willpower or persistence when we really need it.
My point in bringing up Bowerman’s story is this: Saluting the benefits of tenacity only makes sense to a point, because each of us only has a certain amount of resolve, both mental and physical. If social norms encourage us to apply our determination in excess, or to the wrong endeavor, we’ll only run out of it. Perseverance applied in meeting others’ expectations – whether family, community, or society – will deplete our reserves of willpower. We’ll finish our days exhausted, yet unable to sleep. And then when we really need determination and resolve, we may not have enough left to pursue a new pathway or a genuine passion.
The notion that we can somehow strengthen our willpower “muscle” is misleading at best, and detrimental at worst. Conventional wisdom suggests that if we do certain exercises or practice certain habits, we can strengthen this muscle. But science and research tell us that this isn’t true. We can’t simply apply determination like a jelly spread to everything we do in our lives – we’ll burn out. When we force ourselves to do things we’re not naturally inclined to do, or that don’t fit our passion or purpose in life, we pay for it with reduced motivation and drive.
The second problem with the cult of tenacity is that quitting is actually a healthy thing to do at times. Many of the things we desire – thanks in large part to culture – are unattainable. Research shows that when we quit pursuing unattainable goals, we’re happier, less stressed, and even get sick less often. That’s right, quitting is actually physically good for you.
A number of studies tracking subjects ranging from adolescence to young adulthood to older adulthood have shown that goal disengagement – quitting – strongly and positively affects physical health. Three studies found that people who were able to quit pursuing unreachable goals had healthier hormonal patterns and greater sleep efficiency. Not quitting was associated with higher levels of depression, stress, and emotional upset.
These biases can best be defined through two economic concepts: the sunk-cost fallacy and opportunity cost. The first, sunk cost, deals with the past. Sunk cost is the money, time, or effort we’ve already put into a project or direction in life. The more and longer we invest in something, the more difficult it is to let it go. The sunk-cost fallacy is when we tell ourselves that we can’t quit because of all the time or money we’ve already spent.
The second economic concept is opportunity cost. Unlike sunk cost, this concept deals with the future. It means that for every hour or dollar we spend on one task or direction, we’re giving up the opportunity to spend that hour or dollar on a different, better task or direction. In other words, instead of focusing our efforts on something that isn’t working or fails to make us happy, we could be directing our energies toward something that might make us happier, better suit our lifestyle, or help us make more money – if only we weren’t so worried about the sunk cost.
In the context of the sunk-cost fallacy, abandoning a failed pathway seems to waste the resources we’ve already expended. For late bloomers, this would be the feeling that all those years pursuing a Ph.D., a law firm partnership, or that career in fashion would be a waste of time, money, sweat, and tears if we gave up now.
Yet refusing to abandon our investment in a college major, a job, or a path that’s not right for us can be costly. For every moment we double down on something that’s not working, we’re forgoing other potentially valuable opportunities. As behavioral economics and psychology show us, the real waste is not in sacrificing our past by quitting a failing endeavor. It is in sacrificing our future by not pursuing something better.
But the single most important lesson in quitting – and I hope an idea that will stick with you – is that it is a strength, not a failing. We need to overcome our natural tendency toward the sunk-cost fallacy. We need to see quitting for what it really is, a virtue – the ability to “fail quickly” and to pivot nimbly.
How do most people deal with self-doubt? Not always so well. Many of us self-handicap, or sabotage our chances of success. We imaginatively create an obstacle right before any true test of our ability. That way if we fail, we have a perfectly acceptable justification and can protect our internal beliefs about our talent and ability: I drank too much the night before the big test, so of course I didn’t do as well as I could have. Procrastination is one of the most prevalent forms of self-handicapping for late bloomers: I didn’t get around to writing my resume until the last minute – that’s why I didn’t get the job. Or, All my time was taken up with busywork – thanks to my boss – so I couldn’t get to that big presentation. That’s why it bombed. If only I’d had another day or two.
Additionally, many self-handicappers rely on what psychologists call the “tomorrow fantasy.” This fantasy is that we’ll give our full effort tomorrow, down the road, when it suits us. When the time is right, we’ll give our genuine best – which, of course, will produce success. I don’t care that much about this project, so whatever. But when it’s something I’m passionate about, I’ll work hard at it. Then people will see what I can really do. This illusion allows us to avoid putting our ability to a true test.
One day I asked him about the role of confidence in a successful career. He snorted. “Confidence,” he said. “In my whole career I’ve been passing men with greater bravado and confidence. Confidence gets you off to a fast start. Confidence gets you that first job and maybe the next two promotions. But confidence stops you from learning. Confidence becomes a caricature after a while. I can’t tell you how many confident blowhards I’ve seen in my coaching career who never get better after the age of forty.”
Fortunately, we can improve our self-efficacy through something that we all already do: talk.
Overall, the study showed that motivational self-talk dramatically increases both self-efficacy and performance. It also confirmed Bandura’s premise that increases in self-efficacy are related to improvements in performance.
Even how we refer to ourselves can make a difference. Ethan Kross, director of the Self-Control and Emotion Laboratory at the University of Michigan, has found that people who speak to themselves as another person – using their own name or the pronoun you – perform better in stressful situations than people who use the first-person I. In a study, Kross triggered stress in participants by telling them that with just five minutes to prepare they had to give a speech to a panel of judges. Half the participants were told to try to temper their anxiety using the first-person pronoun: Why am I so scared? The other half were told to address themselves by name or the pronoun you: Why is Kathy so scared? or Why are you so scared? After they each spoke, the participants were asked to estimate how much shame they experienced. People who used their names or you not only reported significantly less shame than those who used I, their performances were also consistently judged to be more confident and persuasive.
According to Kross, when people think of themselves as another person, “it allows them to give themselves objective, helpful feedback.” This is because they self-distance – they focus on themselves from the distanced perspective of a third person. “One of the key reasons why we’re able to advise others on a problem is because we’re not sucked into those problems,” explained Kross. “We can think more clearly because we have distance from the experience.” By using external pronouns for ourselves, we view ourselves as a separate person, enabling us to give ourselves more objective advice.
To break what he called “exacerbation cycles” of people with low self-efficacy, Bandura suggests we avoid the negative reinforcement of a skill deficiency or promotion of the idea that a particular task is easy.
This just reaffirms a point we all already know: words matter.
Instead of telling a late bloomer, “This isn’t brain surgery,” try saying, “This is a challenge, but you can figure it out.” Or instead of telling yourself, “I feel terribly overwhelmed right now,” try, “Alex, you have the capability to do this, and here’s how.” These simple linguistic tweaks can help late bloomers – as well as everyone else – make significant strides toward greater self-efficacy.
Alison Wood Brooks of Harvard Business School recently studied the influence of framing on our emotions, looking at anxiety in karaoke singing, public speaking, and math performance. When confronted with performance anxiety, most people try to suppress their emotions. Brooks investigated an alternative strategy: framing anxiety as excitement. Compared to those who attempted to calm down, individuals who instead framed their anxious energy as excitement actually felt more genuine enthusiasm and performed significantly better.
Brooks found that we can frame anxiety as excitement by using simple strategies like self-talk (saying “You are excited!” out loud) or simple messages (“Get excited!”). These framing messages allow us to channel our anxious energy into an opportunity mindset rather than a threat mindset. Brooks’s findings demonstrate that we all have remarkable control over our perceptions and our resulting feelings. The way we frame – and verbalize – our feelings helps us construct the way we actually feel.
The second reframing step is to link a challenge to a larger goal: This big presentation is not only exciting, it will give me visibility and lead to more opportunities. The larger goal should be clear and compelling in your mind. It should capture the excitement of doing something new that can substantially improve your life.
Leaders who are able to reframe challenges as learning opportunities, who can reframe change initiatives as chances to help others, are consistently more successful.
Why would people, even friends, try to keep you “in your place”? Animals and humans are wired to be status conscious. Groups of crabs will quite literally pull down any member who tries to escape a trap or a bucket, relegating the whole group to certain death. Psychologists and sociologists call the phenomenon the “crab pot syndrome.” Among humans, members of a group will attempt to negate the importance of any member who achieves success beyond the success of others.
Kimberly’s positive experience with moving from her “professional bubble” to a more diverse (and less “cool”) environment is supported by extensive research. Journeys like hers – journeys to new lives – involve embracing roles and jobs that are more congruent with one’s true self. Studies have shown that “higher levels of satisfaction and mental and physical well-being will occur when there is a good fit between the person and the environment.” This includes the work environment.
Take a moment to reflect on your own story.
That time you were rejected by an employer or laid off – is it further proof that your career is going nowhere? That you’re a failure or a washout, one of those late bloomers who never launched? Or is the layoff one of the best things that ever happened, freeing you to find work that better suits your individual talents?
In Quiet, Susan Cain wrote, “The secret to life is to put yourself in the right lighting. For some it’s a Broadway spotlight; for others, a lamplit desk. Use your natural powers – of persistence, concentration, insight, and sensitivity – to do work you love and work that matters. Solve problems, make art, think deeply. Figure out what you are meant to contribute to the world and make sure that you contribute it.”
For me, this happened during a leadership conference I was attending in 2016. It was a simple assignment, really. The organizers had us write down fifty accomplishments in our life that we were most proud of. Some might be resume accomplishments, some might be goofy things that we wouldn’t dare put on a resume…
Now, forty-one years later, that insane and noble run was the achievement I chose to put atop my personal list of fifty accomplishments. The drill in the seminar was to try to recognize what drove me on that run. What I learned about this drill was profound, and I regretted that I hadn’t absorbed the lesson decades earlier.
What I learned was this: I accomplish the most not when I set out to prove something, but when I set out to discover something.
On that run, I learned that whenever I went around one corner and up the next hill, I was doing it to satisfy my curiosity. The thought process was: I wonder what would happen if I just kept going for a little longer? If I just ran another mile – patiently, persistently – to take a look? If I begged for money? What I did not do was plan this run as a competition, a test of my grit, or with any plan at all. The very thought of doing that would have wrecked it. I never would have started. Rather, I just did one little hill at a time, patiently and persistently, and ended up climbing a mountain – twice.
That worked for me.
I learned that the situations where I excel are those where my curiosity takes over. When it does, a sense of exploration also takes over. I get in the zone, and I go for it. I feel pulled, not pushed – pulled by a beautiful power I can’t explain. Persistence and patience come to me; I don’t need to summon them. That’s when I really succeed. It was a great insight into who I am and what drives me. My own blooming occurs when I explore, when I take a step forward, with no particular goal other than to see what’s next on the road.