Essay · May 2023 Issue

The Age of the Crisis of Work

What is the sound of quiet quitting?
Illustrations by Grace J. Kim

Something has gone wrong with work. On this, everyone seems to agree. Less clear is the precise nature of the problem, let alone who or what is to blame. For some time we’ve been told that we’re in the midst of a Great Resignation. Workers are quitting their jobs en masse, repudiating not just their bosses but ambition itself—even the very idea of work. Last year, as resignation rates appeared to plateau, the cause célèbre shifted to “quiet quitting.” This theory holds that what truly distinguishes the present crisis is a more metaphorical sort of resignation: a withdrawal of effort, the sort of thing that is called “work-to-rule” when undertaken by a union. This supposed rebellion against exertion has served as fodder for familiar right-wing complaints about entitlement. The most optimistic commentators on the left, however, have assimilated these hypothetical phenomena into a vision of revitalized working-class self-activity. The AFL-CIO president Liz Shuler has boasted of “labor’s great resurgence.” Even the New York Times, in its own milquetoast fashion, has acknowledged the prospect that, “after decades of declining union membership, organized labor may be on the verge of a resurgence in the U.S.”

There is evidence for all and none of these accounts. Despite declining in 2022, the overall rates of resignation remain higher than at any point in the twenty-first century before the pandemic. But such numbers conceal striking differences across sectors. Statistically speaking, the Great Resignation has been confined almost entirely to the low-wage service industries. Rates of quitting among the so-called laptop class budged only slightly after the initial recession following COVID-19 and now sit at or below pre-pandemic levels. But aggregate employment in the sectors at the heart of the Great Resignation has been growing at a solid clip for years. There’s been a particular acceleration in the gig economy—self-employment increased about 20 percent in the first two years of the pandemic. The quitting phenomenon is real, in other words, but it manifests more frequently in canvassing for better deals than in a radical refusal to work altogether. As its icon we should not imagine a Charles Foster Kane type who realizes, at the summit of his professional climb, that all is vanity, but rather a taxi driver switching fleets a few times in search of better pay and conditions before deciding to drive for Uber.

The evidence for quiet quitting—a concept popularized by a viral TikTok video—is considerably more qualitative, to put it kindly. The best numbers on the subject come from the biannual Gallup study of employee engagement. The data suggests that, since 2019, engagement has indeed declined while active disengagement has risen—but very modestly, and not to historically unusual levels. Unlike resignation, however, the disengagement trend seems to be just as pronounced, perhaps even uniquely pronounced, among white-collar workers. Gallup reports that “managers, among others, experienced the greatest drop” in engagement.

The much-ballyhooed labor resurgence appears to mirror quiet quitting on both of these counts. In recent years, approval ratings for unions have increased greatly, and unions have been winning elections at a remarkable rate, but the number of union members is still depressed relative to the early Aughts, and overall union density has kept up its decades-long slide. Meanwhile, the composition of the labor movement continues to shift toward white-collar professionals, as it has for the past twenty years. It is defectors from this stratum who seem to be leading union efforts among low-wage service employees at companies such as Starbucks and REI. In the media, these college graduates—perhaps most notably Jaz Brisack, the Rhodes scholar who helped organize a union at a Starbucks in Buffalo—have become the new face of the American labor movement.

While the right-wing chestnut that “no one wants to work anymore” is an exaggeration, so too is the progressive retort that “workers are finally standing up for themselves.” Something slipperier is at hand: an inchoate sense of disillusionment. Tendrils of dissatisfaction are solidifying. Talk of a crisis of work suggests that many people today understand work itself, I think accurately, as a governing institution in its own right, analogous in some ways to the state. Like the State of the Union address, meditations on the state of work have become an annual ritual for consultancies, IT companies, and other purveyors of “work solutions.” In a sense, work functions as a nation within a nation—an imagined community, in Benedict Anderson’s famous definition. Its moral health is of obscure but paramount significance.

So what kind of crisis do we have on our hands in Work Nation? In 1973, Jürgen Habermas enumerated several threats to which states could succumb: economic crisis, rationality crisis, motivation crisis, and—his own coinage—legitimation crisis. Ours is not an economic crisis in Habermas’s sense: unemployment is low and wages are growing, at least for the time being. Rationality crisis and motivation crisis don’t fit the bill, either. On the whole, people are still working without vocal protest. But something isn’t right. People are still showing up—and yet work has become somehow alien. It acts on us, not through us. It is a nuisance. Work is a “false idol,” we read in the Times: the sacred is now profane. This is a legitimation crisis. Habermas wrote that “scattered secondary conflicts . . . become more palpable” when they “directly provoke questions of legitimation” but don’t quite rise to the level of “objective systemic crises.” This is the real state of work today: skirmishes, but no real battles; a constellation of apparently benign tumors.

The malaise is easy to mistake for a motivation deficit, but it’s more a matter of shifting priorities. For better or worse, people are still happy to work when it seems to be in their best interest. It’s the extra demands of the institution that have begun to grate—the pressure to go “above and beyond” now rejected by the proverbial quiet quitters. Hence the divergent manifestations of the crisis among different groups of workers. For low-wage service workers, currently faced with relatively abundant job openings but a desolate labor-organization landscape, it makes sense to jockey around the job market in search of better pay and conditions. For overproduced professionals in sectors like academia, yoked by their credentials to a narrower (if usually more desirable) job pool, that’s less of an option. Instead they—we—turn at best to unionization, and at worst to quiet despair.

A legitimation crisis occurs, Habermas argued, when the state “lags behind programmatic demands that it has placed on itself.” Legitimacy evaporates when promises are broken. Here I think we can find the roots of work’s legitimation crisis as well. No less than the state, work makes promises to its subjects. Our culture has scripts about what makes work worthwhile, not just necessary; not a burden to be endured but an important component of a flourishing life. And increasingly these scripts do not play out as written.

In the United States, one of the oldest work-legitimizing scripts speaks in praise of industriousness: hard work is its own reward; it makes a person self-sufficient, a producer rather than a parasite. This narrative thrived as the nation industrialized in the nineteenth century. At the time, there was much work that needed doing. Industrial output grew roughly at the same rate as the industrial workforce: the nation could develop at precisely the rate at which new workers enlisted. To work was to contribute to this great project, to be a participant in the construction of modernity and the satisfaction of the needs of an empire. That’s not to say that everyone worked happily. But opprobrium commonly targeted the idle rich, feasting vampirically on the labor of others, rather than the concept of work itself, which remained a point of pride. “It is we who plowed the prairies; built the cities where they trade; dug the mines and built the workshops, endless miles of railroad laid,” boasted a popular union anthem.

Then it was done: prairies plowed, cities erected, mines dug. Near the century’s close people began to whisper of “industrial maturity.” Their work had constructed a machine that would now continue to hum ever more efficiently—no additional labor required. With the foundations set, captains of industry could focus their efforts on new techniques for squeezing productivity out of the resources—both human and material—that they already had on hand. By the Twenties, new technological and managerial efficiencies allowed industrial production to expand without net additions to the manufacturing workforce—even without the net investment of capital.

Many contemporaries saw mass unemployment on the horizon—and for good reason. But it was not to be, at least at first. Instead, early-twentieth-century American capitalism conjured up an astonishing array of industries organized around the production of consumer goods and the provision of services. Three of the most important sectors of the new economic order were automobile manufacturing; mass entertainment (radio, movies, and magazines, and all the advertising they required and facilitated); and the professions (including the accountants, lawyers, “managers” of various stripes, production technicians, and a bewildering assortment of all-purpose paper pushers). For the time being, these newfangled industries absorbed the surplus workforce expelled from heavy industry—or even rejuvenated those sectors, as in the case of steel production.

It was enough to avert an economic crisis—or more precisely, to help delay one until 1929. But in the meantime it raised uncomfortable questions. The American economy was now oriented around the provision of goods and services that would have been, from the perspective of previous generations, totally unnecessary. Was it possible to be proud of this work? The question applied to low-wage manual and clerical workers, who were also asked to tolerate increasingly invasive methods of labor regimentation and management. They began responding—prefiguring today’s Great Resignation—by quitting at sky-high rates. But the question also applied to the professionals who staffed corporate offices and did intellectual and artistic work. For them the matter of their work’s worthiness was also complicated by their having labored under conditions that earlier denizens of their class would have found demeaning: namely, as permanent employees rather than independent proprietors.

One particular group of professionals responded to this legitimation crisis by elaborating a new framework for justifying the value of work. A demographically diverse bunch, they had trained as social scientists and spiritual advisors; they worked as business school professors, popular self-help writers, and pioneers of management consulting. What united them was their commitment to what I call the “entrepreneurial work ethic.” The concept of the entrepreneur—then an obscure academic term—entered the vernacular in the early twentieth century and eventually came to encapsulate everything that made work valuable to advocates of the new gospel.

Champions of the entrepreneurial ethic argued that work was meaningful insofar as it embodied the personality of the individual who created it. This explanation could have gained traction only in an era of industrial maturity, at the close of the Gilded Age. Precisely because this new sort of work—performed in car factories and cinemas, classrooms and corporate offices—was not strictly necessary from the vantage point of preindustrial American society, it had the potential to serve as an arena for self-actualization. People could now use their work to leave their individual marks on the world. No longer required to do what society demanded, workers could instead do what they loved.

It can be hard to grasp that this idea originated in the early twentieth century, because it’s still treated as new today. The computer scientist and New Yorker work guru Cal Newport has asserted that the advice to “follow your passion” was rarely heard before the Seventies. But as early as 1904, Elizabeth Jones Towne, one of the bestselling self-help writers of the era, urged readers to “get into line with a work you do love—something in which you can express yourself.” (Newport’s own success formula, to be “so good they can’t ignore you,” also originates from this period: “You are in business for yourself, selling your skill to your employer,” the Success magazine founder Orison Swett Marden advised in 1913.)

The entrepreneurial ethic feels contemporary in a way that few aspects of Progressive Era economics do because it was reinvented in cycles over the course of the twentieth century—in tandem with the work that Americans performed. When business leaders, policymakers, and popular intellectuals announced, during subsequent moments of social crisis, that one era of work was ending and another beginning, they were reiterating an old idea.

As the nation clawed itself out of the Depression, many entrepreneurialists began to emphasize the theme of regional development. For them, the entrepreneurial project was about modernizing the backward South and West. Later, amid the onset of Rust Belt deindustrialization and the maturation of a generation of college-educated idealists in the Sixties, a concept of “social entrepreneurship” emerged to extol the industrialization of the social services sector—the increasing shift of both low-wage and professional workers in the Midwest and Northeast into health care, education, and non-profit work. For many baby boomers, entrepreneurship was a matter of building new careers out of the struggle against the social problems produced by the midcentury political economy and its collapse. That collapse, of course, accelerated significantly in the last quarter of the century, and a nascent entrepreneurial imagination centered on the start-up emerged in its wake. Entrepreneurship, in the age of Apple, became a matter of commodifying new ideas: turning them into companies and then, to stay one step ahead of global competition, into a ceaseless stream of new products.

The paradigm of entrepreneurship zigged and zagged, but the fundamental promise remained the same: work leads to personal growth and self-realization. This idea was at the core of the developmentalist version of entrepreneurship. For the Harvard psychologist David McClelland, an important postwar figure in the Ford Foundation milieu, development occurred when a populace became entrepreneurial, which meant that they had acquired what he called “achievement motivation”—the desire to distinguish themselves, to accomplish something meaningful. It was also ubiquitous in the social-entrepreneurship world of the boomers’ youth: Abraham Maslow, the psychologist who popularized the concept of self-actualization, went on to become a management consultant in the Sixties, singing the praises of entrepreneurship. And it has, if anything, grown even more entrenched over the past several decades. “Nobody ever changed the world on 40 hours a week,” Elon Musk tweeted in 2018. “But if you love what you do, it (mostly) doesn’t feel like work.” Here is the entrepreneurial ethic in a nutshell: by creating work out of what we value the most, we can accomplish something that really matters. We can change the world, even achieve personhood.

Of course, not everyone drank the entrepreneurial Kool-Aid (to name just one of the innumerable consumer products that came to market in the Twenties). The conviction that most of the work performed in modern society was “useless toil,” in William Morris’s phrase, recurred throughout the twentieth century. Enthusiasm for the philosophy of Morris, Leo Tolstoy, and other critics of industrialism swept the United States in the early years of mass production. The New Left of the Sixties derided the idea that the expansion of professional employment had ushered in an era of greater autonomy and creativity in the workplace. As the Students for a Democratic Society’s 1962 Port Huron Statement put it:

Many social and physical scientists, neglecting the liberating heritage of higher learning, develop “human relations” or “morale-producing” techniques for the corporate economy, while others exercise their intellectual skills to accelerate the arms race. . . . The serious poet burns for a place, any place, to work; the once-serious and never-serious poets work at the advertising agencies.

More than three decades later, the anti-work ethos of Jeffrey “The Dude” Lebowski—self-proclaimed co-author of the original Port Huron Statement, “not the compromised second draft”—struck a nerve among a new generation of slackers.

But entrepreneurialism proved resilient, in large part because of its uncanny ability to redeploy the critical impulse for its own ends. Complaints about meaningless work often damn the present state of things in the name of an ideal ultimately derived from the entrepreneurial ethic itself. They concede that it is proper to seek meaning or personal fulfillment or self-development at work. That opens the door for a new batch of entrepreneurs promising to furnish employees with work that truly matters. The countercultural refuseniks of today become the non-profit bosses and tech gurus of tomorrow. Will it be any different this time?

The scale of today’s legitimation crisis is unusual. The conviction that work is not living up to the promises of the entrepreneurial ethic is no longer confined to a discrete subculture. It has seeped into the American mainstream. “Work won’t love you back,” an old slogan, which the labor journalist Sarah Jaffe used as the title of her 2021 book, has become something of a mantra among disaffected millennials and Gen Z-ers. The late David Graeber taught us to speak of “bullshit jobs,” a concept he defined as “a form of paid employment that is so completely pointless, unnecessary, or pernicious that even the employee cannot justify its existence.” Our classification of the bullshit job has grown only more capacious in the ensuing years; crucially, the consent of the employee no longer seems necessary to consign a job to the bullshit category.

What broke the spell? Undoubtedly, the story has to start with the 2008 financial crisis and its aftermath. The paradox of the entrepreneurial work ethic is that the vast majority of working Americans are not, and in the twentieth century never were, genuine entrepreneurs. Its logic always relied on the ability of employees to participate in the entrepreneurial project vicariously. To secure this sense of participation, its advocates typically encouraged managers to embrace a paternalistic approach to employee relations. If workers were “taken into confidence in production plans,” claimed one entry in a 1922 industrial psychology handbook, if “instead of standing baffled before meaningless production they are made conscious participators in the creative process,” then they might find “permanent success.” The most perceptive entrepreneurialists, like Maslow, understood that a certain degree of material security was necessary for workers to conceive of themselves as conscious, creative participators. Self-actualization, Maslow famously argued, became a goal for people only once their more basic needs were met. Hungry workers won’t internalize their employers’ accomplishments as their own. But the Great Recession inflicted deprivation and precarity on American workers to a degree not experienced in nearly a century—in part because of welfare cuts and contingent labor arrangements whose justifications included the imperative to promote entrepreneurship.

Nonetheless, I think the widespread feeling that there’s been another turn of the screw in recent years is correct—even as the economy has, for the most part, been doing well by conventional macroeconomic metrics. Two factors seem especially significant here. The first is the creeping sense of social rot, or even collapse, that has grown increasingly palpable since the 2016 election, and which was supercharged by the pandemic. The feeling is shared throughout the political spectrum, with Trump supporters fearing deep state overreach and woke communism where liberals fear fascism. Entrepreneurialism is a progressive philosophy (if not always a “progressive” politics, in today’s sense). It celebrates creation and change. But today it is hard not to feel that if we have been, in fact, changing the world, we have been changing it for the worse.

The pandemic was also disillusioning in a different sense. It forced a wide swath of the workforce to confront, in a particularly visceral fashion, the fact that their work was “non-essential.” Whatever technical definitions governments and employers gave to this category, its sting was hard to escape. “‘Nonessential’ is a word that invites creeping nihilism,” Noreen Malone wrote last year in the New York Times Magazine. “This thing we filled at least eight to 10 hours of the day with, five days a week, for years and decades, missed family dinners for . . . was it just busy work?” It wasn’t just the label. The non-essential category was supposed to distinguish jobs that could be done from home, rather than not at all, but everyone knows that for many professional and managerial workers this was a distinction without a difference in the early months of the pandemic. It was a moment of cutting one another an unprecedented degree of slack. And the world didn’t fall apart.

These various pressures have compelled people to take a hard look at the bullshit that was there all along. Marginalist economics, with its equation of value and consumer demand, has trained us to be uncomfortable making pronouncements on the intrinsic merits of different spheres of economic activity. If there’s a market for it, who are we to judge? And yet the simplest explanation for why so many people think their work is pointless is that a lot of it is. A clear-eyed survey of the marquee sectors of today’s economy should be enough to make even the most hardened marginalist suspect a racket. Once the mascot of American entrepreneurship, the entire tech industry is now in disgrace. The outright frauds (Theranos, Juicero, etc.) occasionally seem preferable to the many companies that are actually disrupting things. Elon Musk’s exploding self-driving cars battle with the “metaverse” for the status of punchline du jour. Hopes that it might break down social barriers and topple repressive regimes having evaporated, the online content factory serves primarily as a vehicle for people to post screenshots of TV shows that increasingly appear to be written for exactly that purpose.

It’s not just tech. No one really believes, if they ever did, that financiers are up to any good, even when it’s made-up currencies they’re manipulating. The role of consultancies like McKinsey in price gouging and mass layoffs was once the bane of Pete Buttigieg’s presidential campaign, but now such sins pale in comparison to more recent revelations of collaboration with brutal dictators and opioid peddlers. The non-profit industrial complex has conspicuously failed to halt climate change or dismantle systemic racism. Everyone thinks that education is brimming with bullshit: both the right-wingers who see schools as indoctrination factories and left-leaning humanists like myself, who despair at the reorientation of the academy around STEM. Even health care, an anchor of the postindustrial economy, is replete with an especially pernicious sort of pointless work, as the most basic consideration of the incentive structure of fee-for-service medicine would lead one to expect. As the authors of a 2019 JAMA article write: “Many tests are overused, overtreatment is common, and unnecessary care can lead to patient harm.”

And then there are the “services.” Boosters in the boom years of today’s personal-service industry pitched these jobs as an easy pathway to entrepreneurship. Service workers would discover their own aptitudes and learn how to detect and create market demand. The more talented and ambitious workers in the “Service Class,” Richard Florida argued in his 2002 book The Rise of the Creative Class, were in fact “close to the mainstream of the Creative Economy and prime candidates for reclassification.” Two decades later, we know that class-vaulting through gigified service work is rare. With the promise of upward mobility spectacularly broken, all that’s left is the work itself, which is often egregiously demeaning and exploitative. This past December, the webcomic artist Rob DenBleyker told his large Twitter following about an experiment he had conducted: he ordered a McDonald’s burger on DoorDash with none of its elements selected, not even the patty or bun. Sure enough, a delivery person brought him an empty McDonald’s bag. A guy using the money he’s earned doing bullshit on the internet to force a low-wage worker to do bullshit for him: this stunt represents our entire economy. No one is self-actualizing here.

But we’ve been here before. The memory of past protests against meaningless work ought to make us cautious about announcing the death throes of the entrepreneurial ethic. At the very least, if today’s legitimation crisis does prove to be terminal for the existing ideological fabric of work, it will not be because the bullshit quotient in our economy is unprecedented. From the perspective of overall social utility, after all, we should esteem the efforts of today’s scammers to rip off venture capitalists with fake start-ups more highly than, say, the labors of the scientists of the Sixties who succeeded in devising new ways to kill Vietnamese peasants.

The question is whether working Americans will, once again, respond to their disillusionment by rallying behind some new dream of meaningful work. A coalition of strange bedfellows is currently struggling against that prospect and envisioning the end of work altogether. Rather than making up new work for people to do, they believe, we should welcome the fact that there isn’t much left that genuinely needs to be done. We should figure out a way to enjoy the capacity for leisure that past development has placed at our disposal. This post-work politics is hardly new on the left, with antecedents in Paul Lafargue’s classic 1883 pamphlet The Right to Be Lazy and John Maynard Keynes’s 1930 essay “Economic Possibilities for our Grandchildren,” which proposed a fifteen-hour work week. But more recently it’s picked up some steam among the techno-libertarian set. Even Musk has proposed a universal basic income. “With automation there will come abundance,” he said—and mass unemployment, since “there will be fewer and fewer jobs that a robot cannot do better.” Such technological capacity would make it impossible to come up with new work for people to do, leaving societies with no choice but to finally decouple income from work.

The post-work approach offers a potential liberation from the cycle of job creation and destruction. But its worst-case scenario is troublingly dystopian: a world divided between a small group of Muskian oligarchs and a vast surplus population living idle lives, at least until the oligarchs’ generosity runs out. I think this specter should lead the post-work left to devote more attention to the task of developing a positive vision of what human flourishing might look like in the absence of work. Many leftists are understandably reluctant to insist upon the rightness of any particular way of living. But it’s hard to explain what makes for a bad life without a competing conception of a good one—and drawing such a contrast seems indispensable if one is to distinguish the socialist vision of life after work from the libertarian one.

The haziness of most contemporary articulations of the post-work vision is a boon to those private- and public-sector entrepreneurs currently staking out their claim to the economic vanguard, pushing a new collection of projects that promise to once more restore purpose to work. Across the political spectrum, one now finds a variety of schemes for reorienting the economy around meaningful work. This aspiration is one valence of Make America Great Again, and of the Trumpist movement’s rhetorical gestures toward the power of protectionism and immigration restriction to transform the United States into a country that makes things once more. But it also animates the “supply-side progressivism” championed by Ezra Klein and Derek Thompson, among others. Klein and Thompson point out that there is, in fact, plenty of useful stuff, such as green energy and housing, that we could be making. They understand this misallocation of productive resources as a market failure that government intervention, in the form of a new industrial policy, could rectify. Pushing this vision beyond its residual attachment to capitalism yields the “eco-modernist” socialism of writers like Leigh Phillips, who argue that democratic central planning would yield a form of economic growth and technological progress guided by human need rather than profit.

Even if the state could succeed in restructuring the economy around some inarguably valuable shared project, however, it seems inevitable that any such initiative would eventually exhaust itself. The economist Mariana Mazzucato, closely aligned with the supply-side progressivism tendency, calls for us to embrace the “entrepreneurial state.” But entrepreneurialism, whether in the public or private sector, has always proved to be a short-term solution to the problem of industrial maturity. Perhaps we have no choice but to repeat the cycle forever, whatever the human cost. Still, one does not need to fully embrace the automation hype to worry that technological advancement will only shorten the duration for which any enterprise centered on the material fulfillment of real social needs can succeed in maintaining high employment levels.

If the present crisis does end up leading us out of entrepreneurialism altogether, it will be in large part because this cyclical dynamic is more perceptible today than ever before. What is truly unprecedented in this moment is the historical distance that now separates us from the last era in which American industrialization had yet to be achieved. Industrial maturity is not only all that we have ever known—this was true of the New Left as well—but, for many of us, all that our grandparents ever knew. For much of the twentieth century, the appeal of the entrepreneurial work ethic came from the contrast between the imagined work of the future and the drudgery of the present. Most of the workforce today is no longer able to experience this contrast so vividly (even though millions of workers in the United States still have jobs that expose them to an inhumane degree of physical risk).

There are now few Americans who have heard firsthand of the crushing toil of preindustrial farm work, the suffocating toxicity of the nineteenth-century coal mine, or the carnage that accompanied railroad construction. It was possible for workers to take pride in discharging such miserable tasks with heroic resolve—and to express anger at those who got rich without doing their fair share—while still dreaming of escape. For people who have only known work as a relentless struggle, however ostensibly ennobling, the idea of satisfying, self-expressive, creative work might seem well worth the risk of placing one’s faith in the entrepreneurs peddling this prospect. But for generations now, the majority of Americans have experienced work as mostly just fine—disappointing but ultimately survivable. Why shouldn’t we try to make the best of a mediocre situation, rather than chase perfection?

Americans’ increasing familiarity with work that is neither gratifying nor unendurable underpins what might turn out to be the most likely short-term resolution of the current legitimation crisis: the rejection of the entrepreneurial work ethic in favor of a more cold-blooded understanding of work as a simple exchange of drudgery for money. “Is your job toxic?” asks the Harvard Business Review. “Or is it just a job?” Similar perspectives can be found throughout the business press. “A job is just a job,” argues a blog post sponsored by the workflow company Zapier. It is “an exchange of your labor for your employer’s cold, hard cash.” In the Financial Times, Lucy Kellaway argues that “the corporate obsession with happiness” is itself at the root of worker unhappiness. If managers, influenced by the doctrines of the entrepreneurial work ethic, didn’t encourage their subordinates to seek fulfillment at work, their employees would, ironically, be less dissatisfied.

In the long run, employers might find that they’ve gotten more than they bargained for in coming around, at long last, to the ideal of a fair day’s wage for a fair day’s work. Historically speaking, employers and employees have not seen eye to eye about how much remuneration is sufficient to compensate for workers’ submission to a life of meaningless toil. Bosses open negotiations at their own risk.

Erik Baker is a lecturer at Harvard University and an associate editor at The Drift.


