Ask any person not directly employed in the arts or finance, and she will tell you that work sucks. It’s weird, because our parents and grandparents talked as if work were the best thing in the world, or at least a primary source of meaning in their lives. Then somewhere between Vietnam and Office Space, the indignity of the office became a standard motif in movies and television. That conceit seems almost quaint in a 21st-century employment landscape that features full-time “contract” work, wage theft, and stagnant pay. In a book called The Fissured Workplace, Boston University professor David Weil argues that work has become “debased” by management structures that separate employers from employees. In Private Equity at Work, Eileen Appelbaum and Rosemary Batt see that debasement as the inevitable outcome of three decades of leveraged buyouts.
If you are unfortunate enough to know me in real life, I have probably already tried to make you read Nick Hanauer’s Politico piece on how rising inequality is not in the best interest of the very rich. If you haven’t, you should read it now. I’ll wait here and look at ~~fourth-quarter economic projections~~ cat videos. Hanauer essentially makes the same argument that Henry Ford made in his defense of so-called “welfare capitalism”: the people who make Ford cars are the same people who buy Ford cars, so it’s good for business to pay workers a higher wage. The case for welfare capitalism is a case for a strong middle class, and it’s particularly relevant in a consumer economy. I’m more interested in Hanauer’s other argument, though: if inequality continues to increase, the inevitable consequence will be either revolution or a police state.
Robert Putnam, author of the unfalsifiable big-think text Bowling Alone, told Maclean’s last week that “America is moving toward a caste society.” His next book is called Our Kids: The American Dream in Crisis, which sounds pretty exciting if you, like me, are obsessed with the question of whether life in America is easier or harder than it was 30 years ago. In this case, “easier” means “more fair.” I think we can agree that in the ideal America, the decisions an individual makes would be more important to the course of her life than the circumstances of her birth. Getting born to two married, upper-class parents is difficult to pull off, and we should probably offer a second chance to the kids who blow this crucial first choice.
Shamus Khan’s opinion piece about “The New Elitists” initially made me angry. Sure, I enjoyed the sweet anecdote about William Vanderbilt getting snubbed by the New York Academy of Music, and I’m always up for a screed against the rich. The rich are given resources out of all proportion to their talent and usefulness, usually by their parents. It’s a peculiar way to run a country that defined itself against inherited aristocracy, although it makes more sense if you think of the United States as the country that defined itself against Marxism. My complaint is that Khan focuses his column on cultural elitism—the “omnivorousness” that passes for sophistication but is often simply the hallmark of privilege. I consider myself a cultural omnivore. I like Sean Paul’s art and Jean-Paul Sartre. Must I therefore be an elite?
My friend Tarik sent me the chart above pursuant to an unrelated thought experiment. It comes from the Economist, which compiled figures from little-e economist Angus Maddison and the UN to plot economic output and percentage of total human-years lived against centuries. A human-year is a particularly useful unit of history if you prefer the broad trend hypothesis to the Great Man Theory. As the Economist puts it, “if people do make history, as this democratic view suggests, then two people make twice as much history as one.” Fact: two people, each living 70 years, experience more human time—that is, history—than one person living 70 years. Given that the life expectancy of your typical eighth-century serf was like 28, the lion’s share of human experience has taken place in the last century.
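The human-year arithmetic is easy enough to check yourself. Here is a minimal sketch in Python—the cohort sizes are made-up illustrations of the principle, not the Economist’s or Maddison’s actual figures:

```python
# The Economist's "democratic" accounting weights each person's lived
# time equally: the total human-years in a cohort is roughly the
# number of people times the years each of them lived.
# (The function name and the cohort figures below are illustrative,
# not real demographic data.)

def human_years(people: int, years_each: float) -> float:
    """Total lived time in a cohort: every person-year counts once."""
    return people * years_each

# The "fact" from the text: two 70-year lives contain twice the human
# time of one 70-year life.
assert human_years(2, 70) == 2 * human_years(1, 70)

# Why recent centuries dominate: a large population with long lives
# swamps a small population with short lives (hypothetical cohorts).
medieval = human_years(300, 28)    # 8,400 human-years
modern = human_years(1000, 70)     # 70,000 human-years
share_modern = modern / (modern + medieval)
print(f"Modern cohort's share of human time: {share_modern:.0%}")
```

Run with real population and life-expectancy series and the same weighting is what pushes most of recorded human experience into the last century or so.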