
The Consumer: Days of Our Lives


ADWEEK 25TH ANNIVERSARY

Does life change a lot in 25 years? It's probably more precise to say that we change. If you were born the same year Adweek was, your life has changed a great deal since then. But it may not be all that different now from the life a 25-year-old was leading in 1978. In looking at society in the aggregate, we need to keep our bias toward the new from blinding us to the continuities of existence. Guided by some data, here's a look at ways in which life has and has not changed during the past quarter-century.

You've come a long way, mother

There's a reason why history must be written after the fact. When you look at trends within the space of a few years' time, you may think you see their long-term trajectory. But look back along the arc of events from a 25-year distance and you're apt to see how misleading those few years were. The role of women in America is a case in point. As it happens, Adweek was born in the era when feminism was enthusiastically interpreted to mean that women could "have it all." Once the patriarchy had been dismantled (people then believed), women would be able to have absorbing careers and lovely families without either getting in the other's way. In hindsight, we can wonder why anyone thought this was a plausible outcome for people of either sex. At the time, though, skepticism was taboo. To voice such doubts in the have-it-all heyday was either chauvinistic (if you were a man) or treasonous (if you were a woman).

No longer. Fast-forward to 2003 and you find many women ridiculing the worship of career that was a cornerstone of have-it-all dogma. That's especially true for mothers of young kids. A Redbook study this year confirmed that the working-mother role is losing its appeal. Among women who have young kids and work full-time, 57 percent said, "I would quit my job this instant if we didn't need the money." Sixty-five percent of stay-at-home mothers expressed satisfaction with the role they've chosen. Just as telling, 35 percent of working mothers said they envy their stay-at-home counterparts, while 15 percent of stay-at-homes said they envy the workers. These attitudes show up in the choices women are making. Among mothers of infants, the percentage who work outside the home fell between 1998 and 2000—"the first significant decline" since 1976, when the Census Bureau started keeping track of the matter.

It's not just that women are feeling the lure of motherhood. Many (including some who are childless) have thought to pose the question: What's so great about work? An article last month in U.S. News & World Report told of the disdain 20something working women feel for workaholic 50something women who try to mentor them. The young women often feel these would-be mentors "are deaf to the latest business buzzword, work-life balance." In discussing an American Bar Association study aimed at keeping women on track for partnerships, the article cited "the scorn young female lawyers voiced for senior women, who had often passed up marriage or motherhood on their long plod toward partnership." A recent article in The New York Times Magazine discussed "The Opt-Out Revolution"—i.e., the tendency of highly credentialed women to step away from high-pressure careers in favor of child-rearing. On the surface, it looks as if they're sacrificing career for the sake of their kids. But the article notes a dirty little secret: They're glad to escape from work, at least for a while. As one woman says, "Having a baby provides a graceful and convenient exit."

For women who do stick with work—and most have no choice—one change in the past quarter-century has been in their earnings relative to men's. Among full-time workers, women made 63 percent as much money as men in 1979, according to the Bureau of Labor Statistics. By 2002, the figure had risen to 78 percent. And the latter figure doesn't express the full extent of women's progress, as the total is skewed by a greater gap among older women (see the chart at left). Will the disparity ever vanish? Given the way kids intrude on careers, one might doubt it. But a countervailing force has arisen: a gender gap in education that favors women. In the 25-34 age cohort, Census figures show women more likely than men to have finished college (33 percent vs. 29 percent). A BusinessWeek article this spring cited estimates that "there will be 156 women per 100 men earning degrees by 2020," if current trends continue.

For now, though, women still feel they live in a man's world. In a survey this year for the Center for the Advancement of Women, women were asked which sex has "a better life in the U.S. today." Sixteen percent said women do, while 55 percent said men do and 19 percent said the two sexes live equally well. (The rest didn't know or declined to answer.) That's virtually unchanged from a 1989 Gallup survey: 18 percent women, 54 percent men, 20 percent both equally.



Please, sir, may I have much more?

What have Americans been eating during the past 25 years? Better to ask: What haven't they been eating? Contradictory trends have coexisted, with people eating more health food, more junk food, more organic food, more genetically engineered food, and so on—sometimes, one suspects, all in one sitting.

The ingredient most conspicuously in decline in American cookery is time. Even more than 25 years ago, speed and convenience are the determinants of what we eat. One telltale sign: The proportion of at-home dinners in which the main dish is a frozen item (eaten unfrozen, one hopes) reached 10.4 percent in 2000, says The NPD Group—one-fifth higher than was the case five years earlier. Meanwhile, an NPD Foodworld report documents a steady rise in consumption of frozen fare (see the chart above). In an indication of how today's consumers are streamlining meal preparation, the proportion of at-home dinners that included a side dish fell from 66 percent in 1991 to 56 percent in 2002. The research firm emphasizes that while people don't want to spend hours at the stove, most want to dine at home. This dovetails with the fact that the percentage of dinners eaten at a restaurant didn't change between 1985 and 2000, holding at 7 percent.

In what specific ways have Americans altered their diets over the past couple decades? Not by shunning red meat, despite conventional wisdom to the contrary. According to the Department of Agriculture, per capita consumption of red meat has declined, but not by a great deal (from 126.4 pounds in 1980 to 113.5 pounds in 2000). People more than took up the slack by increasing their consumption of poultry (from 40.8 pounds in 1980 to 66.5 pounds in 2000). Similarly, the decline in intake of beverage milk (from 27.6 gallons to 22.6 gallons) was fully offset by the rise in ingestion of cheese (from 17.5 pounds to 29.8 pounds). Dr. Atkins notwithstanding, Americans boosted their consumption of flour and cereal products from 144.7 pounds in 1980 to 199.9 pounds in 2000. Although too few consumers were eating the numerous daily servings of fruits and vegetables that experts recommend, they did boost their per capita intake of foods in these categories from 608 pounds in 1980 to 707.7 pounds in 2000. (Broccoli, cauliflower and asparagus posted some of the biggest gains; apples, peaches and celery suffered declines.)

Nutritionists who see high-fructose corn syrup as the smoking gun in the obesity crisis can point to the rise in consumption of it, from 19 pounds per capita in 1980 to 63.8 pounds in 2000. Oddly enough, our caloric binge has come in tandem with a tendency to skip meals. A 2000 report by the National Restaurant Association found the typical American age 8 and up skipping 2.4 meals per week: 34 percent skipped at least one breakfast, 20 percent skipped at least one lunch and 7 percent skipped at least one dinner.

Despite recurring predictions that Americans would finally develop a taste for wine, per capita consumption of that beverage was a shade lower in 2000 than it had been in 1980 (2 gallons vs. 2.1 gallons), after modest ups and downs in the intervening years. (The wine-cooler craze—remember that?—pushed the figure up to a modern high of 2.5 gallons in 1985.) Beer consumption declined between 1980 and 2000 (from 24.3 gallons to 21.7 gallons per capita), while the nation's intake of distilled spirits dropped like a man tumbling off a bar stool (from 2 gallons to 1.3 gallons per capita). If the typical American was drinking himself into an early grave, he was doing it with carbonated soft drinks (up from 35.1 gallons per capita in 1980 to 49.3 gallons in 2000—but nearly flat since 1990).



The graduates

You wouldn't guess it from the dumb-and-dumbered-down fare that now constitutes popular culture, but Americans are better-educated than they were a quarter-century ago. (At least, they're better-credentialed.) A recent Census report tells the tale. In 1980, 66.5 percent of adults age 25 and up had at least a high school diploma; 16.2 percent had a bachelor's degree or more. In 2000, the corresponding numbers were 80.4 percent and 24.4 percent. Another 14 percent had completed one or more years of college without getting a degree; 6 percent had earned an associate degree. Another recent report, from the American Council on Education, shows striking gains during the past two decades in the college-participation rates of minority women. Among college-age black women who've graduated from high school, the college participation rate rose from 28.4 percent in 1978-80 to 42 percent in 1998-2000. Among Hispanic women, the rate rose from 27 percent to 37 percent. Black men had a smaller gain (from 30 percent to 37 percent), while the rate was flat for Hispanic men (31.5 percent in 1978-80 vs. 31 percent in 1998-2000). Even these latter numbers represent real gains, though, when one takes account of the fact that high-school graduation rates have increased during the past 20 years—from 68 percent to 76 percent among blacks and from 55 percent to 59 percent among Hispanics.



Welcome to my humble chateau

If people were as pleased with their spouses as with their homes, the divorce rate would be a lot lower. In a survey last year by the Fannie Mae Foundation, 41 percent of adults said their current residence is "a great place to live" and 43 percent said it's a "good place." One reason for homeowner satisfaction is that house prices have risen steadily during the past two decades, apart from a dip in the early 1990s. In 1980, says the Census Bureau, the median sale price of privately owned single-family houses was $64,600. By 2001, the figure had climbed to $175,200. Does this mean the average American has been priced out of the housing market? Apparently not, as the rate of home ownership rose during the same period, from 64.8 percent in 1978 to 68 percent this year. New houses have become notoriously lavish in our age of the McMansion. Consider this Census info-tidbit if you're ever tempted to turn up your nose at a toilet-paper account: Of privately owned single-family homes completed during 2001, 56 percent had 2.5 bathrooms or more, vs. 25 percent of new homes in 1980. Likewise, 37 percent of 2001's new homes had four or more bedrooms, vs. 20 percent of 1980's. The proportion of new homes with two or more stories rose from 31 percent to 53 percent.

Home improvement has emerged as the national pastime. Home is no longer just the place where you live; it's the place where you express your taste by renovating and re-renovating. In polling fielded by Harris Interactive for the Lowe's Home Improvement Trendex Survey, homeowners last year said they planned to shell out an average of $400 per month to upgrade their houses; one in 10 said they expected to spend $10,000 or more for such projects during the ensuing 12 months.



The family way and the unfamily way

Did the nuclear family fade away during the past 25 years? So one might suppose from the way demographic data have been treated in the media. The truth is, family structure has been a blend of continuity and change. Much has been made of the fact that fewer than one-fourth of households now consist of married parents and their kids. That figure reflects an important trend, but the death of the family isn't it. Though the percentage of kids living with a single parent has doubled since 1978, about seven in 10 still live with a pair of married parents. The household statistics have been transformed not just by kids in untraditional families but by the rise in households consisting of one person—often a young one who hasn't yet married (but will) or a person who used to be married. In 1978, Census data put the median age of first marriage at 24.2 for men and 21.8 for women. By 2002, the figures had climbed to 26.9 for men and 25.3 for women, which means a lot more single 20somethings rattling around in one-person households. There's been a parallel shift in the age at which women first give birth—from 22.7 in 1980 to 24.9 in 2000, says the Centers for Disease Control and Prevention. At the other end of adulthood, parents now have longer lives as empty-nesters. In short, adults spend a larger share of their lives in kid-free households than was once the case.

There's also been a rise in the number of people who never become parents. As of last year, says the Census Bureau, 44 percent of all women of childbearing age (classified as 15 to 44) were childless. Among those 40-44, 18 percent had never had a kid, vs. 10 percent in 1976. As for men, their notorious aversion to marriage is showing up in the data. A report this year from The National Marriage Project at Rutgers said 18 percent of men age 35-44 have never married, vs. 7 percent in 1970. Among men who do marry and have kids, anecdotal evidence suggests they're more engaged at home than their fathers were—but only up to a point. In a Kaiser Family Foundation survey this year, 49 percent of working mothers said they have to miss work when a child is home sick, vs. 30 percent of working fathers. About 80 percent of mothers said they "assume the major role in selecting their children's doctor, taking children to doctor's appointments, and follow-up care."



Keep your chins up

It's not as if Americans have ever been a svelte people during the past half-century. Still, the recent rise in weight has been phenomenal. According to the Centers for Disease Control and Prevention, the incidence of obesity jumped 74 percent between 1991 and 2001. Going back further, the National Center for Health Statistics noted a drop in the number of people 20-plus who have a "healthy weight," from 49.6 percent in 1976-80 to 33.6 percent in 1999-2000. Harris polling found the rate of obesity among those age 25-plus rose sharply between 1983 and 2002 (see the chart below).

Elsewhere on the bad-habits front, the decline in smoking has slowed from its pace of a decade or more ago. (The CDC pegs the smoking rate among adults at about 23 percent as of 2001.) Still, Gallup last year found "roughly one former smoker for every current smoker." A CDC report this year noted a rise in the proportion of people who are "some-day smokers" rather than everyday smokers. More than one-fifth of smokers are of the some-day variety. You might think people are too busy eating and smoking to have time for illicit drugs, but apparently they're not. A Gallup poll last month found 24 percent of adults answering "yes" when asked, "Has drug abuse ever been a cause of troubles in your family?" That's up from 19 percent in 1995. Despite all these bad habits, life expectancy has increased by a couple years (to 77.2 years as of 2001, says the CDC) since Adweek was launched—not that we're taking credit for it, mind you.



The media future as moving target

If you'd had the foresight to invest in a dotcom venture in the year when Adweek began, you would hold the distinction of having gone broke decades before the rest of the Internet crowd. How much has the media scene changed since then? In 1978, fewer than one-fifth of American households had cable TV, and the VCR was a novelty. Now, more than four-fifths of households get cable programming (via wire or satellite) and VCRs are as common as toasters. The headlines (and hand-wringing) are about newer technologies. Thus do we see a report like one issued this fall by the Yankee Group under the heading, "The Death of the 30-Second Commercial." (The cause of death this time is forecast to be the coming proliferation of personal video recorders, aka PVRs.)

The main lesson of the media's history during the past 25 years is that its future keeps veering off in unexpected directions before the predictions made a few years earlier can catch up. Thus, obsessive talk of 200-channel TV has given way to musings about how the Internet impinges on other media. With about six in 10 adults wired (according to the Pew Internet and American Life Project), the period of rapid growth in Internet access is behind us. But Internet usage will evolve in unpredictable ways as broadband becomes the norm, wireless use proliferates, etc. For some people, the Internet has given rise to media multitasking as they watch TV and go online at the same time. In a BIGresearch poll this year, 24 percent of men and 29 percent of women said they often do this.

On the whole, though, media forecasters have tended to underestimate people's passivity. Part of television's appeal is that one can sit like a lump while watching it. That fact limits interactive use of the medium. It's not that people think TV programming is good. (In a poll last year by Arbitron and Edison Media Research, 52 percent of adults said it's getting worse, while 33 percent said it's getting better.) It's just easy.



Welcome to the melting pot

Here's one conspicuous difference between present-day America and that of 1978: Many current residents (or, if they're younger than 25, their parents-to-be) lived in another country then. A Census bulletin this March said the foreign-born population had surpassed 32 million in 2002—11.5 percent of the total population. The immigrant population is now split almost evenly between people from Latin America (52 percent) and those born elsewhere (including 26 percent from Asia and 14 percent from Europe). Nearly half of all immigrants arrived since 1990. A Census report last month said 47 million residents age 5 and up speak a language other than English at home—up 15 million since 1990. As you might guess, a majority of this 15 million consisted of Spanish-speakers. But immigration has not become a wholly Hispanic affair. Of the 20 non-English tongues spoken most widely in the U.S., the biggest proportional gain in the 1990s was not for Spanish but for Russian—with French Creole the runner-up.

One might expect all this change to roil the country. On the whole, though, the U.S. has remained pretty unroiled. In a Gallup poll last fall, the number of people saying immigration ought to be reduced (49 percent) was roughly equaled by the sum of those saying it should be raised (12 percent) or kept as is (36 percent). By the way, nativist gringos aren't the only ones with doubts about immigration. In a poll last fall by the Pew Hispanic Center and Kaiser Family Foundation, 47 percent of Hispanic voters born outside the U.S. said there are "too many" immigrants here. Are immigrants assimilating? Many are doing so in the most basic way: by marrying outside their ethnic communities. In an article this year for The Milken Institute Review, demographer William Frey said nearly three of 10 marriages involving Latinos or Asians are mixed marriages.



Untrue facts

Some of the things we all know about the past 25 years happen not to be true, once you look at the evidence. Here's a sampling of them.

The myth: Americans are less trusting than they used to be. The reality: Not so, for the simple reason that they've never been especially trusting. Polling over the years by the National Opinion Research Center at the University of Chicago has asked adults whether they think "most people can be trusted" or "you can't be too careful in dealing with people." In 1978, the tally was 39 percent "most can be trusted" vs. 57 percent "can't be too careful." In 2002, it was a nearly identical 39 percent "most can be trusted" vs. 56 percent "can't be too careful."

The myth: Crime rates have been rising. The reality: If you read the papers, you know they've fallen steeply during the past 10 years. But it took a long time for the good news to sink in even temporarily, as a succession of Gallup polls indicates (see the chart at left). Once they've got an idea in their heads, people can be stubborn about letting go of it. We've seen the same phenomenon in public opinion about inflation, with many people complaining of rising prices at a time when the real rate of inflation has been negligible.

The myth: People have less leisure time than they did a couple decades ago. The reality: After declining from 26 hours per week in 1973 to 19 hours in 1980, adults' average amount of free time has stayed remarkably steady for the past 20 years. In Harris surveys since 1989, the number has never exceeded 20 hours or fallen below 19 hours. If people feel unleisured, it's partly because they now have so many options competing for their finite free time.