The Roaring Twenties

by Joshua Zeitz

The 1920s heralded a dramatic break between America’s past and future. Before World War I the country remained culturally and psychologically rooted in the nineteenth century, but in the 1920s America seemed to break its wistful attachments to the recent past and usher in a more modern era. The most vivid impressions of that era are flappers and dance halls, movie palaces and radio empires, Prohibition and speakeasies. Scientists shattered the boundaries of space and time, aviators made men fly, and women went to work. The country was confident—and rich. But the 1920s were an age of extreme contradiction.

The unmatched prosperity and cultural advancement were accompanied by intense social unrest and reaction. The same decade that bore witness to urbanism and modernism also gave rise to the second Ku Klux Klan, Prohibition, nativism, and religious fundamentalism. America stood at a crossroads between innovation and tradition. Many Americans were looking boldly ahead, but just as many were gazing backward, to cherished memories of a fabled national innocence.

Age of Convergence

Many of the trends that converged to make the twenties distinct had been building for years, and, in some cases, decades.

We think of the twenties as an era of liberation for women. Indeed, the decade gave rise to the flapper, described by Webster’s Collegiate Dictionary as “a young girl, esp. one somewhat daring in conduct, speech and dress,” and immortalized in the short stories of F. Scott Fitzgerald and by silent film stars like Clara Bow, Colleen Moore, and Louise Brooks. But women had been breaking down the separate spheres of Victorian culture for quite some time. A powerful women’s political movement demanded and won the right to vote in 1920. Spurred on by the growth of an urban, industrial economy that required a larger female labor force, and by the emergence of public amusements that defied the old nineteenth-century courting system, many young women now had the wherewithal and drive to lead independent lives. By the dawn of the decade, anywhere between one-quarter and one-third of urban women workers lived alone in private apartments or boardinghouses, free from the watchful eyes of their parents. And as early as 1896, the newspaper columnist George Ade had used the term “date” to describe a new convention by which boys and girls paired off to frolic at dance halls, amusement parks, and other public spaces, free from adult supervision.

Closely associated with the rise of the flapper was a frank national discussion about sex. But this trend, too, had been building over time. As early as 1913, the Atlantic Monthly announced that the clock had tolled “Sex o’clock in America,” signaling a “Repeal of Reticence” about issues that had once been considered taboo. To be sure, these trends accelerated after World War I: surveys suggest that 14 percent of women born before 1900 engaged in pre-marital sex by the age of 25, while as many as 39 percent of women who came of age in the 1910s and 1920s lost their virginity before marriage. But the fundamental structural changes at play—namely, urbanization and industrialization—long predated the twenties. Between 1800 and 1920 the number of children borne by the average American woman fell from seven to three. Americans were not necessarily having less sex. Rather, in an urbanizing society, where more children were a cost rather than an asset, they stepped up their use of birth control, and in so doing redefined sex as something to engage in for pleasure rather than procreation.

We think of the twenties as an era of prosperity, and in many respects, Americans had never lived so well. But this trend, too, claimed earlier roots. As factories and shops mechanized, the work week of the urban blue-collar worker fell from 55.9 hours in 1900 to 44.2 in 1929, while his or her real wages rose by 25%. By the dawn of the twenties, Americans had more time and money to spend on new kinds of public amusements like dance halls, movie theaters, fun parks, and baseball stadiums. They also had more opportunities to buy competitively priced durable items, thanks to new methods of production and distribution. The prosperity of the post-war period greatly accelerated this trend. By 1929, American families spent over 20% of their household earnings on such items as phonographs, factory-made furniture, radios, electric appliances, automobiles, and “entertainment.” What people couldn’t afford, they borrowed. By the mid-’20s Americans bought over three-quarters of all furniture, phonographs and washing machines on credit.

The proliferation of advertising—alongside the maturation of the publishing, music, and film industries—exposed citizens to a new gospel of fun that was intimately associated with the purchase of goods and services. “Sell them their dreams,” a prominent ad-man intoned. “Sell them what they longed for and hoped for and almost despaired of having. Sell them hats by splashing sunlight across them. Sell them dreams—dreams of country clubs and proms and visions of what might happen if only. After all, people don’t buy things to have them. . . . They buy hope—hope of what your merchandise might do for them.”[1]

Age of Wonders

If many of the social trends that we associate with the twenties had long been building, the decade was nevertheless unique in many ways.

It was a decade of firsts. For the first time ever, more Americans (51%) lived in cities than in villages or on farms.

It was a decade of economic expansion. Between 1919 and 1929, horsepower per wage earner in manufacturing skyrocketed by 50%, signaling a robust wave of mechanization that increased productivity by 72% in manufacturing, 33% in railroads, and 41% in mining.

And it was a decade of technological wonder.

In 1912, only 16% of American households had electricity; by the mid-’20s, almost two-thirds did. Overnight, the electric vacuum cleaner, the electric refrigerator and freezer, and the automatic washing machine became staples in middle-class homes.

At the dawn of the twentieth century, automobiles were still unreliable and scarce, but in the years just prior to World War I, pioneers like Ransom Olds, Henry Leland, and Henry Ford revolutionized design and production methods to make the car affordable and trustworthy. When the sociologists Robert and Helen Lynd interviewed high school students in Muncie, Indiana, in the mid-’20s, they found that the most common sources of disagreement between teenagers and their parents were 1) “the number of times you go out on school nights during the week”; 2) “the hour you get in at night”; 3) “grades at school”; 4) “your spending money”; and 5) “use of the automobile.”[2]

Another pre-war technology that came of age in the twenties was film. By the mid-1920s movie theaters were selling 50 million tickets each week, a sum equal to roughly half the US population! And the generation that came of age in the twenties learned things at the movie palace that they couldn’t learn in school. “The only benefit I ever got from the movies was in learning to love and the knowledge of sex,” a young woman confided to an interviewer in the mid-’20s. “If we didn’t see such examples in the movies,” explained another, “where would we get the idea of being ‘hot’? We wouldn’t.”[3] These young informants might have been thinking of the 1923 blockbuster Flaming Youth, which one reviewer described as “intriguingly risqué, but not necessarily offensively so. The flapperism of today, with its jazz . . . and its utter disregard of the conventions, is daringly handled in this film. And it contains a bathing scene in silhouette that must have made the censors blink.”[4]

Like film, radio was invented in the late nineteenth century but experienced its formative era of commercial expansion in the twenties. On November 2, 1920, radio station KDKA in Pittsburgh, Pennsylvania, broadcast the presidential election returns. It was the first-ever live radio transmission for a popular audience, and although few Americans that evening had the necessary technology to hear the results, by 1922 more than three million households had acquired radio sets. Seven years later more than twelve million households owned radios, fueling an industry that saw $852 million in annual sales.

Americans living in the 1920s could listen to Roxy and His Gang, the Clicquot Club Eskimos, and the Ipana Troubadours. They could hear Grantland Rice announce the World Series—live—or listen to Floyd Gibbons relate the day’s news. Radio proved a highly democratic medium, and by mid-decade local stations helped bring “race music,” “hillbilly” sounds, and ethnic recordings into living rooms across the country. In the late 1920s, enterprising American businessmen built powerful “X-stations” just across the border in northern Mexico to evade federal radio frequency regulations. From this vantage point, they were able to beam the music of “Fiddlin’ John Carson,” the Carter Family, and Jimmie Rodgers to every destination from California to New York City.

Return to Normalcy

Since the dawn of the twentieth century, American politics had been dominated by Theodore Roosevelt and Woodrow Wilson, two presidents whose outsized personalities and dueling visions of the progressive spirit defined the tenor and tone of public life. After 1920, Americans seemed to aspire to “normalcy.” In Warren G. Harding, they got exactly what they bargained (and voted) for.

Harding’s best qualities were his extreme affability and striking good looks. Both got him in trouble regularly. Even as a young boy, the future president seemed all too inclined to please everyone and offend no one. As a successful newspaper publisher, local politician and, later, US senator from Ohio, Harding joined the Rotary Club, the Elks, the Odd Fellows, the Hoo Hoos, the Red Men, and the Moose. He relished poker games and excelled at public speaking. He played the B-flat trumpet in the town marching band. Indeed, Warren Harding was the very embodiment of Sinclair Lewis’s Babbitt—and proud to be so.

The ever-genial Harding stacked his Cabinet with cronies from Ohio. He let his attorney general sell pardons and pledges of government non-interference to the highest bidders. He looked the other way while his secretary of the interior accepted almost $400,000 in kickbacks in exchange for a long-term lease on oil-rich federal lands at Teapot Dome, Wyoming. All the while, he adhered to a limited and conservative vision of government, pressing for lower taxes and less regulation and issuing an implicit repudiation of Wilsonian reformism.

Despite—or perhaps even because of—his limitations, Warren Harding was widely admired by the American electorate. When he died halfway through his term, the public offered up a great outpouring of sorrow and sympathy. It was only in the following months that Warren Harding’s countrymen learned of their late president’s extramarital affairs and scandal-ridden administration. But by then, it hardly seemed to matter. Everything was back to normal.

Silent Cal

Harding’s successor, Calvin Coolidge, may have been the most reticent man ever to occupy the White House. Austere, laconic, and conservative to a fault, “Silent Cal” perfectly embodied the laissez-faire ethic that governed American politics throughout the “Jazz Age.” He slept eleven hours each day, vetoed far more bills than he proposed, and claimed that his only hobby was “holding public office.” He had little to say. When a constituent bet that she could “get more than two words” out of him, the President replied simply: “You lose.” Upon hearing that Coolidge had passed away in 1933, the famous wit Dorothy Parker asked: “How could they tell?”

Coolidge slashed the federal budget by almost half, eliminated the gift tax, sliced the estate tax by 50% and lowered the maximum federal surtax from 60% to 20%. The president disavowed anything beyond minimal regulation of business and commerce. He denied a federal role in labor relations and repeatedly affirmed his absolute faith in market forces. What was “of real importance to wage-earners,” he claimed, “was not how they might quarrel with their employers but how the business of the country might so be organized as to insure steady employment at a fair rate of pay.”[5]

In 1927 Coolidge announced unexpectedly and without fanfare that he did “not choose to run for president” again. His wife was as surprised as anyone. “Isn’t that just like the man!” she exclaimed. “I had no idea.”

The Engineer

When Herbert Hoover took the oath of office as the nation’s thirty-first president in 1929, The New York Times sounded an enthusiastic note of approval, applauding the new chief executive for his “versatile ability,” “sterling character,” and “Progressive leanings.”

Orphaned at the tender age of nine, Hoover was raised by austere Quaker relatives in Iowa and Oregon. He worked his way through Stanford University, where he earned a degree in engineering and graduated with the university’s first class. Over the next twenty years he ascended steadily up the corporate ladder, carving out a brilliant career as a mine operator, engineer, and businessman. During World War I he headed up the American relief effort in Belgium, where he was widely credited with feeding and clothing several hundred thousand European refugees, and then served as US food administrator, masterminding voluntary production and consumption standards that kept the American Expeditionary Force well nourished and domestic prices steady. After the war, Hoover served as secretary of commerce under Warren G. Harding and Calvin Coolidge. In that office he greatly expanded the government’s collection and dissemination of industrial data, organized dozens of voluntary corporate councils, and brought the executive branch into close cooperation with business and labor.

It was Herbert Hoover’s great misfortune that the Depression began only months into his term in office. Smart, well educated, well traveled, and enormously capable, Hoover considered himself an activist and a Quaker humanitarian. As an engineer, he embodied the guiding spirit of progressivism, with its faith in rational and informed public policy. The poverty and despair of his countrymen profoundly affected Hoover.

But like most public men of his era, Herbert Hoover believed that sound volunteerism was the best remedy for economic distress. Rather than adopt strong federal regulatory and fiscal measures, he called for more studies and for an organized—but voluntary—response on the part of the private sector.

By 1930, this pattern of inaction had made Herbert Hoover one of the most despised men in America. A popular vaudeville skit had the straight man announce that the Depression was over. “Has Herbert Hoover died?” his sidekick would ask. In public appearances, the president seemed thoroughly defeated. “A rose would wilt in his hand,” one observer famously remarked.

Culture Wars

The great revolution in morals, aesthetics, and everyday life that was sweeping through America didn’t meet with uniform approval. Though the twenties are remembered primarily as a decade of bold innovation and experimentation, they also witnessed a fierce counter-revolutionary tendency.

In 1925 a group of local boosters in Dayton, Tennessee, persuaded a young high school science teacher, John Scopes, to violate the state’s anti-evolution law. They merely wanted to draw attention to their economically depressed crossroads town. Instead, what followed was a sensational trial that pitted the famous “lawyer for the damned” Clarence Darrow, a committed civil libertarian and almost fanatical atheist, against William Jennings Bryan, the famously eloquent Nebraskan who had thrice failed to attain the presidency but who remained a hero to rural fundamentalists in the South and Midwest. The trial’s climax came when Darrow called his adversary to the stand as a biblical expert and Bryan reluctantly admitted that some scriptural language might be more allegorical than literal.

The trial seemed like the culmination of a long-simmering clash between liberal and fundamentalist Christians. Although it was technically a win for the prosecution, liberals declared it a great victory for their cause. Bryan, they said, had unintentionally exposed fundamentalism as a simpleton’s creed, while Darrow had established the supremacy of science over fundamentalist Christianity. In fact, the conservatives were far from beaten. They immediately began to regroup, chartering missions, publishing houses, and radio stations. Fifty years later, they would reemerge as a powerful force in American public life.

More successful in the immediate term was the Ku Klux Klan, a Reconstruction-era paramilitary group that had faded from American life until 1915, when Colonel William Simmons re-founded the organization at a small ceremony on Stone Mountain in Georgia. By 1925 the organization claimed at least five million members and controlled politics in Indiana, Texas, Oklahoma, and Colorado; it was enormously powerful in several other states, notably California and Georgia. The Klan’s greatest legislative achievement came in 1924, when it joined a broad coalition of conservative groups that won passage of a draconian anti-immigration statute, the Johnson-Reed Act. The golden door would remain closed for another forty years.

The new Klan represented diverse ideas to its polyglot membership. It was avowedly white supremacist, but for good measure it also included Jews, Catholics, Asians, and “new women” among its list of enemies. Its followers could be found in cities as well as in the countryside, but as a general rule, the organization was fundamentalist and conservative in both profile and disposition. As one sympathetic observer explained, “The Ku Klux movement seems to be another expression of the general unrest and dissatisfaction with both local and national conditions—the high cost of living, social injustice and inequality, poor administration of justice, political corruption, hyphenism, disunity, unassimilated and conflicting thought and standards—which are distressing all thoughtful men.”[6]

In 1924, the organization enjoyed sufficient strength to force a deadlock at the Democratic National Convention, where supporters of New York’s governor, Al Smith—a Catholic—faced off against Klansmen aligned with former Treasury Secretary William McAdoo. Smith’s supporters shouted “Ku Klux McAdoo!” while McAdoo’s followers taunted their opponents with cries of “Booze! Booze! Booze!” and ballot after ballot ended in stalemate. On the 103rd ballot, exasperated and desperate, the convention agreed on a compromise candidate, a lackluster Wall Street lawyer named John W. Davis, who was resoundingly defeated by the incumbent, Calvin Coolidge. It was the high-water mark for the Klan.

Arguably, Prohibition was the most successful achievement of anti-modern forces in the 1920s. Writing just after the states ratified the Eighteenth Amendment, which authorized a ban on the production and sale of alcoholic beverages, the great urban wit H. L. Mencken attributed such “crazy enactments” to “the yokel’s congenital and incurable hatred of the city man—his simian rage against everyone who, as he sees it, is having a better time than he is.” In his shrill, visceral response to Prohibition, Mencken may have overstated the intensity of America’s rural-urban divide; over the next decade there would be no shortage of bathtub gin and woodshed stills in the countryside. Yet he was right on one count: the Eighteenth Amendment and its accompanying federal statute, the Volstead Act, both of which took effect in 1920, were the culminating events in a long effort by conservative forces to check the growing power of America’s immigrants and urban dwellers—one and the same, in some respects, since first- and second-generation Americans comprised the overwhelming (75%+) share of the population in metropolises like New York, Chicago, and Boston. Though Americans widely flouted the new law (and, accordingly, the twenties are remembered as a particularly liquid era), per capita alcohol consumption actually plummeted during Prohibition, lending the decade yet another paradoxical trait.

End of an Era

The twenties were always something of a gilded age. Even amid the great prosperity and excess of the decade, America’s economy was fundamentally weak. Over 40% of Americans got by on less than $1,500 each year, the figure economists cited as the minimum family subsistence level. The income of the top 0.1% of families equaled the income of the bottom 42%. Most country folk did not experience the prosperity of the Roaring Twenties at all: farm prices hit rock bottom in the aftermath of World War I, widening the gulf between America’s (relatively) prosperous cities and its impoverished farms.

Such glaring inequality had consequences. Boom times relied on mass consumption, and eventually, working people reached their limit. The very wealthy could only buy so many cars, washing machines, radio sets, and movie tickets. When consumer demand bottomed out, America’s economy simply stopped functioning.

When the stock market collapsed in 1929, and when the twin influences of under-consumption and over-speculation began wreaking structural havoc on the American economy, the nation’s revolution in values and aesthetics remained incomplete. The twenties were arguably the nation’s first modern decade, but many of its social and cultural revolutions would play themselves out in future years.

[1] William Leach, Land of Desire: Merchants, Power, and the Rise of a New American Culture (New York, 1993), 298.

[2] Robert S. Lynd and Helen Merrell Lynd, Middletown: A Study in Contemporary American Culture (New York, 1929), 257, 524.

[3] Garth S. Jowett, Ian C. Jarvie, and Kathryn H. Fuller, eds., Children and the Movies: Media Influence and the Payne Fund Controversy (New York: Cambridge University Press, 1996), 276.

[4] Joshua Zeitz, Flapper: A Madcap Story of Sex, Style, Celebrity, and the Women Who Made America Modern (New York: Crown, 2006), 211.

[5] William E. Leuchtenburg, The Perils of Prosperity, 1914-1932 (New York, 1958; rev. 1993), 97.

[6] Lynn Dumenil, The Modern Temper: American Culture and Society in the 1920s (New York, 1995), 235.

Joshua Zeitz has taught American history at Harvard University and Cambridge University. He is the author of Flapper: A Madcap Story of Sex, Style, Celebrity, and the Women Who Made America Modern (2006) and White Ethnic New York: Jews, Catholics, and the Shaping of Post-War Politics (2007). He is currently writing a joint biography of John Hay and John Nicolay.
