As Americans, we fetishize the idea of success, and we’ve told ourselves for generations that the most admirable route to said success was “self-made.” For years, we told ourselves success was achieved alone, and that life was a contest between winners and losers, and losers only had themselves to blame. This created a cult-like devotion to a myth of our own making and bred a competitive culture that lived by the phrase, “Be a Boss.” But the thing is, there’s no such thing as being self-made. No one exists in a vacuum and no single person has ever succeeded, or failed, alone — period. Luckily, “self-made’s” hold on America is slipping.
“Self-made” is indigenous to America. Rags-to-riches tales have European roots, sure, but America was always all about “making it.” Free from monarchies and strict social classes, it was sold as the place where anyone from anywhere could come and be anything, especially if they were white, male, and Christian.
Yet the term “self-made” didn’t arise until the 1820s and 1830s, and it did so slowly: “self-made” first appeared in an 1824 article championing Georgia senator William H. Crawford — “[He] is what we call a self-made man; has risen from obscurity by his own exertions…”; it appeared again in an 1826 speech arguing that higher education bred a “self-made man”; and the legendary statesman Henry Clay spoke of Kentucky’s “enterprising self-made men” in the halls of Congress in 1832.
Self-made’s halting emergence here is no coincidence: The United States was finally coming into its own. Sure, we’d declared independence in 1776, but the war left the new nation a complete mess: regional divisions threatened fragile unity; the economy was in the post-colonial crapper; and Americans still used Europe as a cultural litmus test. We had no literary or arts scene of our own, so we gauged ourselves against the establishment. In other words, post-colonial Americans didn’t know who they were, what they were doing, or why they were doing it.
But things turned around after the turn of the century, thanks to a number of factors:
First, a booming manufacturing sector bolstered the flagging economy and spread wealth from the Founding Father-esque elite to humbler, middle-class folks — though mostly still white, male, and Christian. 😒
Second, Westward expansion fueled optimism about America’s future and a new pride in its authentic experience: out went powdered wigs, in came coonskin caps. Woot!
Third, domestic artists and authors created a distinctly American culture, as seen in Thomas Cole’s very of-that-moment 1826 painting Daniel Boone Sitting at the Door of His Cabin.
All this and more fed the most powerful — and most ironic — fuel for the “self-made” myth: the formation of a cohesive national identity, or national brand. Americans of this era no longer saw themselves as former English subjects. They were pioneers of a new era. The States weren’t a post-colonial hodge-podge; they were a distinct entity forged from a savage New World. In Americans’ eyes, the U.S.A. was the very definition of “self-made.”
With the rhetorical wheels now greased, “self-made” spread like wildfire; teachers, preachers, and politicians all bowed to “self-made men.” Bookstores and push carts were filled with biographies of supposedly self-made men, from politicians like George Washington to business moguls like textile magnate Samuel Slater; sermons sang the archetype’s praises, and grade-school textbooks lavished impressionable children with rags-to-riches tales of the self-made in America. (And, yes, the subjects were almost always men. The earliest use of “self-made woman” I found is dated 1863, from actress Mollie Williams’ obituary: “She is to all intents and purposes a ‘self-made woman,’” and usage afterward was comparatively rare. Again, 😒.)
From that point on, generations of American children were told they could, and would, be self-made. This wasn’t just a hope. It was an expectation. Americans were expected to “make good” by being “self-made.” “Self-made” thereafter became a national mantra — almost an incantation — that conjured a culture in which we’re told success happens on one’s ownsome. Generations of Americans were told their friends and family were actually their competition; they should therefore only look out for #1.
According to the self-made myth, there were no limits to what individuals could do in America, the land of opportunity. There were no socioeconomic or racial hurdles here. All you needed was grit and determination — and if you failed, it was your own damn fault. American philosopher Francis Bowen summed up this erroneous notion: “Neither theoretically nor practically, in this country, is there any obstacle to any individuals becoming rich… There are no obstacles but natural and inevitable ones [i.e. intelligence, etc.]; society interposes none.”
Bowen wrote that in 1856, as slavery still raged — a fact that gives you a good idea of just how self-deluded self-made mythology was, and how deluded it remained. “There is not a poor person in the United States who was not made poor by his own shortcomings, or by the shortcomings of someone else. It is all wrong to be poor, anyhow,” said Reverend Russell Conwell in his influential 1913 Acres of Diamonds speech. Ronald Reagan remarked 70 years later, “I want to see above all that this remains a country where someone can always get rich,” as if it were as easy as apple pie. And Bill Clinton often bragged that his presidency created more “self-made millionaires” than any other.
No matter the medium, audience, or era, the “self-made” proselytizers reiterated the same claim: anyone and everyone could be “self-made” in America. Thus, American life became a mad dash from the cubicle to the corner office, and once you got there, you were supposed to say you did it all by yourself.
But the self-made myth appears to be on its way out.
The world has become a more cooperative place over the past two decades or so, thanks to a number of factors: Gen X’s late-’90s ascent into the world of business replaced mid-century, post-WW2 norms; increased globalization exposed people to new ways of thinking and working; and shifting gender dynamics are playing a role, too. After years of being held down, women are now taking the reins in economic and legislative affairs.
As New York Times columnist Jill Filipovic recently noted, “[Female politicians] have created more space for authenticity over self-aggrandizement… Today’s rising female politicians tell a very different story than ‘I worked hard, and so I got here by myself.’ One by one, they credit those who inspired their success, supported their ascent and cleared the trail so they could walk further still.”
Of course we can’t neglect the internet’s role in self-made’s slow decline. The web was created for collaboration, and that open-access culture has impacted our world in ways large and small. From crowd-sourcing sites to Uber, from open access databases to Slack, the internet, in its purest form, is all about symbiotic collaboration and shared achievement.
The internet’s communal mentality has changed the very architecture of work, too. As social media took off, so too did open floor plans and cooperative work spaces: The first such spaces appeared in the Bay Area in 2005, and a 2008 New York Times report on the new-fangled spaces said they were built around internet-inspired principles of “collaboration, openness, community, sustainability and accessibility.” And without a doubt the internet has grown collective movements from Black Lives Matter to Me Too.
Another dramatic and illustrative example of our increasingly collaborative culture comes from the dramatic and illustrative world of fashion. Fashion houses have historically guarded their ideas for fear of tainting a well-tailored brand. While Salvador Dalí and Elsa Schiaparelli’s 1937-1938 Lobster and Tears Dresses were an early and isolated exception, most designers kept their ideas and inspirations hidden from sight lest a competitor steal their look.
Now we see designers producing closetfuls of co-created wares. Again, this began around the rise of social media, with Karl Lagerfeld and H&M’s best-selling 2004 collection. That high-low success opened the floodgates to interbrand creations; last year alone, Vogue counted 1,450 cross-brand collaborations the world over. In this day and age, being “self-made” is so gauche.
One could argue this collaborative trend is simply about an increasingly insatiable need for newness engendered by social media, but that overlooks the creative satisfaction many of these designers achieve through collaboration. As Rei Kawakubo said of collaborative fashion culture, “We’re sharing the space, but no one is losing their identity. If anything, what each of us does is somehow accentuated. The result can only be positive.”
The rise of collaborative culture is heartening for those who know teamwork and shared ideas yield the best results. Taken together, these developments, shifts, and movements suggest we’re replacing a self-centric ideology with a more interdependent one. But there are still those who believe the self-made myth. A 2013 Rasmussen Reports survey showed 86% of Americans believe individuals make their own success. Yes, that survey is now six years old, and Rasmussen is a right-leaning outlet, but it still speaks to an influential American mindset that’s ingrained in our culture and history.
That said, cooperative culture will only grow if we all reject “self-made” malarkey. We can’t talk about “going it alone” or being “self-made.” To create a more collaborative culture, we must stand together, acknowledging those who helped us succeed and helping others succeed, too. To create a more participatory, inclusive society, we must work in unison to create, produce, live and learn, and we must make sure those who promote “self-made” myths stand alone.