
The Dollar Store Backlash Has Begun

The U.S. has added 10,000 of these budget retail outlets since 2001. But some towns and cities are trying to push back.

It has become an increasingly common story: A dollar store opens up in an economically depressed area with scarce healthy and affordable food options, sometimes with the help of local tax incentives. It advertises hard-to-beat low prices, but it offers little in the way of fresh produce and nutritious items—further trapping residents in a cycle of poverty and ill health.

A recent research brief by the Institute for Local Self-Reliance (ILSR), a nonprofit supporting local economies, sheds light on the massive growth of this budget enterprise. Since 2001, outlets of Dollar General and Dollar Tree (which bought Family Dollar in 2015) have grown in number from 20,000 to 30,000. Though these “small-box” retailers carry only a limited stock of prepared foods, they’re now feeding more people than grocery chains like Whole Foods, which has more than 400 outlets in the country. In fact, the number of dollar-store outlets nationwide exceeds that of Walmart and McDonald’s put together—and they’re still growing at a breakneck pace. That, ILSR says, is bad news.

“While dollar stores sometimes fill a need in cash-strapped communities, growing evidence suggests these stores are not merely a byproduct of economic distress,” the authors of the brief write. “They’re a cause of it.”

Dollar stores have succeeded in part by capitalizing on a series of powerful economic and social forces—white flight, the recent recession, the so-called “retail apocalypse”—all of which have opened up gaping holes in food access. But while dollar stores might not be causing these inequalities per se, they appear to be perpetuating them. And the savings they claim to offer shoppers may, in some ways, leave the communities they move into a little poorer.

Using code made public by Jerry Shannon, a geographer at the University of Georgia, CityLab made a map showing the spread of dollar stores since the recession.

 
(Map: The spread of dollar stores since the recession. David Montgomery/Jerry Shannon)
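
Shannon’s code itself isn’t reproduced here, but a rough version of such a map can be sketched in a few lines of Python. The snippet below is a minimal illustration under stated assumptions, not CityLab’s actual pipeline: it presumes a hypothetical file, dollar_stores.csv, listing each outlet’s latitude, longitude, and year opened, and simply plots every store opened since 2008 as a point.

    # A minimal sketch of how such a map might be drawn, assuming a
    # hypothetical CSV of store locations. This is an illustration, not
    # the code Jerry Shannon published or the pipeline CityLab used.
    import pandas as pd
    import matplotlib.pyplot as plt

    # Assumed columns: latitude, longitude, year_opened
    stores = pd.read_csv("dollar_stores.csv")

    # Keep only outlets opened during or after the 2008 recession.
    recent = stores[stores["year_opened"] >= 2008]

    fig, ax = plt.subplots(figsize=(10, 6))
    ax.scatter(recent["longitude"], recent["latitude"],
               s=4, alpha=0.4, color="darkgreen")
    ax.set_title("Dollar stores opened since 2008")
    ax.set_xlabel("Longitude")
    ax.set_ylabel("Latitude")
    plt.show()

Plotting raw coordinates like this gives only a crude outline of the country; a production map would layer the points over proper base geography.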

As Lawrence Brown, a community health expert at Baltimore’s Morgan State University, tweeted in response to the ILSR report, dollar stores function as “subprime groceries.” And recently some local governments have started pushing back on these retailers, rejecting development at the neighborhood level or devising ordinances that seek to limit their spread in certain areas.

Such moves can be divisive—detractors point to the dire need such stores are meeting in retail-starved areas. But the rise of dollar stores represents a deeper problem, one rooted in the history of housing segregation. Addressing that issue requires questioning the host of complicated assumptions that have led to the present conditions—and the myriad ways residents in so-called food deserts have responded to them.

The “Food Desert” Paradox

Ashanté Reese, an assistant professor at Spelman College, lives on Atlanta’s Westside, within two miles of a pair of dollar stores. Her zip code was particularly hard hit in the recession, suffering a 50 percent foreclosure rate. Those demographics are now changing, but for a long time the residents included elderly folks and people on fixed incomes—the exact kind of shoppers dollar-store executives have said they are targeting.

There’s also a traditional supermarket, a Kroger, which is where Reese shops. But the one near her house isn’t as nice as the one 15 minutes away, she says. That store, in a whiter, more affluent neighborhood, regularly advertises grains, nuts, seafood, olives, and wine.

“There are these tropes that are perpetuating in the shopping experience,” said Reese, who is also the author of a forthcoming book called Black Food Geographies: Race, Self-Reliance, and Food Access in Washington, D.C.

While her neighborhood may have some alternatives, the presence of dollar stores in neighborhoods that don’t creates a Catch-22. On one hand, these chains are serving communities that others have neglected or abandoned—a phenomenon researchers have termed “supermarket redlining.” And when a segregated neighborhood loses a supermarket, residents in the immediate area can suffer consequences for their physical and mental health, and for the community’s sense of self-worth. Having an affordable option for buying food in the vicinity—even if it’s not ideal—may be seen by residents as better than nothing. “As someone on a fixed income, I see [dollar stores] as saving the poor,” one Twitter user said, responding to the ILSR brief. “I can stock up on staples there a whole lot cheaper than at regular grocery stores.”

On the other hand, the absence of traditional grocers, and the presence of dollar stores, is deeply entwined with the history of spatial and structural inequality in America. “Supermarkets follow the patterns of racial and residential segregation—we can map this in any of the cities that have a solid black population,” said Reese.

In her research, she traces the decline of the supermarket in communities of color—specifically black communities—to the late 1960s, when unrest broke out in several major cities following Martin Luther King Jr.’s assassination. As white flight to the suburbs accelerated, urban supermarkets closed, citing security and financial reasons.

“Whether intentional or not, they were following white people out of the city,” she said. In Washington, D.C., where Reese did her field research, she counted 91 supermarkets in 1968; by 1995, just 33 remained. “We don’t see a reverse of that until now,” said Reese. Today, economically booming D.C. has many supermarkets, but they’re not evenly distributed across the city. In Wards 7 and 8, for example, only three grocery stores serve about 150,000 residents. (Recently, the ride-hailing company Lyft and the local nonprofit Martha’s Table partnered to provide supermarket rides to residents of these neighborhoods.)

Enter the dollar stores.

Like Walmart before them . . .
