Whether you parcel them out for lunch the next day or squirrel them away with the best intentions until they’ve gone bad, leftovers are a mostly unremarkable reality of modern life. But leftovers have a story to tell, and their curious history tells us about changes in technology and in attitudes toward both affluence and dinner.
Until the icebox (a proto-refrigerator) became standard in many homes at the turn of the 20th century, “leftovers” didn’t exist. Because there was no way to keep food in the form it took at the table, preserving what remained of a meal was as much a part of the culinary process as preparing it. Cookbooks often followed directions for a meal with instructions for pickling, curing, or salting the remains to prolong the life of all ingredients.
These weren’t leftovers as we think of them today, but the basis of another meal or food item entirely. The ability to reliably keep things cool changed all that: people could hang onto last night’s dinner without worrying about immediate spoilage. And so this early 20th-century innovation gave birth to the notion of the “leftover,” the remains of a meal that could be kept and consumed in a recognizably similar form later.
The most interesting thing about leftovers, however, is not their invention but the shifting attitudes toward them. The luxury of an icebox didn’t mean abundance was taken for granted. During World War I, eating one’s leftovers was positioned as so patriotic that some people celebrated killing house pets rather than recklessly wasting human food on them (in those days, pets ate scraps from human meals). From the wartime years through the intense poverty of the Depression, resourcefulness with this new category of “leftover” became an even stronger proof of one’s virtuous frugality. A 1917 U.S. Food Administration poster reminded citizens to “serve just enough/use what is left,” while a Good Housekeeping headline from 1930 admonished, “Leftovers Shouldn’t Be Left Over.”
By the 1960s, when the majority of American homes had electricity and refrigeration technology had improved, leftovers could potentially last much longer. Yet as food prices fell, leftovers lost their cachet; throwing them away became a mark of middle-class status, historian Helen Veit notes in her book Modern Food, Moral Food: Self-Control, Science, and the Rise of Modern American Eating in the Early Twentieth Century. Fast food restaurants and frozen meals were newly affordable, and often more convenient than cooking at home. Consuming these innovations conveyed a modern, casual affluence in a way that packing up last night’s painstakingly prepared pot roast most definitely did not.
Simultaneously, as many middle-class women entered the workforce, feminists questioned the domestic ideal in general and kitchen labor in particular, highlighting the considerable uncompensated housework that kept women from professional pursuits. Understandably, that worldview cast getting creative with last night’s dinner as drudgery. Paradoxically, the convenience of serving leftovers (especially for a working woman increasingly out of the kitchen) also drew the disapproval of conservatives, who saw it as cutting corners on a homemaker’s primary responsibility.
Eating leftovers, or worse, serving them to a guest, thus made one an object of disdain or ridicule rather than a paragon of civic virtue, as in earlier eras. Etiquette columns throughout the 1960s and early 1970s regularly fielded questions about whether it was even acceptable to ask for a “doggy bag” at restaurants, the letter writers’ uncertainty revealing a broader ambivalence about how to behave around leftovers.
Are leftovers poised for a return to glory? Not only did American portion sizes grow by 50 percent between 1977 and 1996, but Veit points out that the recent popularity of foods like curries and stews that taste better after a few days bodes well for the resurgence of the leftover, albeit for personal sensory pleasure rather than civic purpose.