Editor’s note: This is the second article in a series on the home designs that define four European cities: London, Berlin, Amsterdam, and Paris.

The renewed appreciation for Berlin’s Mietskasernen, the city’s 19th-century tenement blocks, was partly a reaction against the Modernism that came to dominate both the Eastern and Western sectors of postwar Berlin. Compared to the tidy, boxy flats of postwar developments, people felt “that these buildings were non-conformist and thus provided more opportunities for individual freedom and expression.”
The idea that these tenements, conceived as dormitories for industrial workers, promoted individualism came mainly from their neglect. Most citizens who could opt for better-equipped newer apartments did so, and the city’s division caused many others to leave altogether. Older neighborhoods where tenements had survived wartime bombing started to hollow out, and many Mietskasernen became available for squatting, attracting an alternative population of dropouts in the 1970s and ’80s. In West Berlin, they also attracted a mostly Turkish immigrant community that otherwise might have struggled to find affordable housing.
This was not only a Western phenomenon, however. “East Berlin’s tenements in particular were totally neglected by the state’s centralized construction industry, one that was essentially incapable of renovation, even though it tried to change,” says Ladd. “So you had these terribly deteriorated buildings, with barely livable, officially abandoned apartments, and a dissident scene of people who wanted to disappear from sight—something that was sometimes accepted by the state and sometimes not, though the Stasi always knew. There was thus a more extreme dropout dissident scene that you get only in the East, which contributes to the enduring mythology of the Mietskaserne.”
As these buildings were repopulated, their new occupants discovered something that had gone unnoticed. The apartments’ spaces were in fact generously sized and quite flexible. “Older buildings were built with less specific purposes in mind for the rooms,” says Ladd. “They are also a lot bigger than in Modernist blocks because the efficiency of Modernism meant that they could be smaller because they were so carefully designed—so the adaptability wasn’t there.”
People also started to fall back in love with the facades’ ornamentation, which could include anything from neoclassical pilasters under the roof to Art Nouveau masks over the doorway. So many buildings had been lost to wartime bombing that the remaining courtyards got a little more light through the gaps in the urban fabric. And at street level, cheaply rented retail units were taken up for a myriad of community uses, from small shops to art spaces and informal bars, bringing vibrant activity to the buildings and the streets they faced. By the end of the 1980s, architectural opinion had swung back in their favor, and Berlin was building neo-Mietskasernen that blended into older streets with ease.
The Mietskasernen still shape local ideas of what a desirable home is: it’s just as likely to mean high ceilings, polished wooden floors, and generously proportioned rooms as a house with a backyard and a private entrance. Post-reunification, these buildings have become increasingly expensive; some boroughs are even buying them to prevent new landlords from raising rents and displacing tenants.
Within the buildings, the hierarchy of spaces has changed. Increased noise from cars means that on major streets, the street-facing apartments are not always the most desirable, even if they are larger. Fancier Mietskasernen have had elevators installed, making upper floors more desirable. As a result there has been a boom in new penthouse apartments on top of them—modern, open-plan units with sweeping views, capping what were once working-class buildings.
Therein lies an ironic reversal. One hundred years ago, living on the top floor of a Berlin tenement might have been something to hide, a sign of being so poor that you had to accept hauling your groceries and winter coal up six flights of stairs. Nowadays, if you concealed from casual inquirers that you live on a tenement’s top floor, it would more likely be to avoid exposing yourself as a gentrifier.
In the next piece in this series, we’ll look at the canal houses of Amsterdam.
Before each home game during his time with the American Football League’s Buffalo Bills, Jack Kemp would have seen the neighborhoods surrounding the city’s football stadium on the drive in and understood that something had gone wrong.
Buffalo’s War Memorial Stadium, a WPA project described by Sports Illustrated
The results were textbook: Residents with means moved out, property values plummeted, and a scattered, segregated, impoverished community with diminishing services was left behind. By 1973, the owner of Kemp’s team found a new, publicly subsidized suburban home for the Bills.
An average quarterback with an outsize reputation thanks to the team’s back-to-back AFL championships in 1964 and 1965, Kemp was by all accounts a good teammate and leader. He was a founder and president of the AFL Players Association and helped relocate the league’s all-star game out of segregated New Orleans after African American players boycotted over being denied access to local clubs and taxis. Later in life, he would recall his years of sharing team huddles with mostly black teammates as his introduction to the world outside his middle-class roots in Los Angeles. In 1970, Kemp retired from football and began his second career in politics. As a Republican lawmaker, he set about trying to address the urban poverty that plagued the kinds of places his teammates came from—places similar to or even worse off than the community Bills fans parked their cars in front of on Sundays in the ’60s.
Kemp died in 2009. In St. Louis, tenant leader Bertha Gilkey headed the resident group that ran the Cochran Gardens complex; a 1988 story on the complex noted that its management also employed 250 people, mostly Cochran Gardens tenants. Gilkey traveled the world to share her story and her business model, and the turnaround at Cochran earned a site visit from President Bush (at Kemp’s urging) in 1991.
Similarly, Northeast D.C.’s Kenilworth-Parkside homes, built in 1958, had also fallen on hard times by the 1970s. In 1981, resident Kimi Gray went on a quest to take the complex out of government hands in order to fix it, convincing D.C. Mayor Marion Barry to turn it over to the Kenilworth-Parkside Resident Management Corporation (which she chaired). She kicked drug dealers out, hired residents on welfare to work in her office, and increased rent collection by more than 75 percent. The improvements turned Gray into a GOP celebrity; Kemp called her a hero. The Bush administration made Kenilworth-Parkside its tenant-ownership testing ground, in which residents could buy their own homes by purchasing limited-equity cooperative shares, and the management corporation would continue to manage until there were no more apartments to sell. “There were times when Jack Kemp came into Kenilworth almost as often as I came home,” one man told the Washington City Paper in a 1995 story on the redevelopment.
But these two successes were short-lived. At Cochran Gardens, the previously lauded tenant management group had the complex taken away by the city a decade later over tax mismanagement. Vacancies shot up before the development was demolished and replaced with townhouse-style housing in 2008. Meanwhile, at Kenilworth-Parkside, renovations for 341 units ended up costing more than it would have cost to build replacement apartments from scratch. Over one hundred other units were still “boarded up and unlivable,” according to City Paper, by the end of the first Clinton administration. The District’s vastly improved economic fortunes in the 21st century have led to growing concerns about gentrification as an affordability crisis spreads to all of its quadrants. In 2016, the D.C. Housing Authority teamed up with two developers to demolish the remaining 290 Kenilworth-Parkside apartments to make way for a mixed-income complex, with a plan to subsidize the affordable units through the profits of its market-rate homes.
But at least one early success has endured. Columbia Point, a 1950s housing project in Boston’s Dorchester neighborhood, had become plagued by crime, drug use, and a vacancy rate of nearly 75 percent by the 1980s. In a bind, the city turned management over to a private company in 1984, which then created a rebranded, redeveloped Harbor Point. It was the nation’s first effort to transform a public housing project by recruiting middle- and high-income residents (one-third of the units would remain subsidized for low-income residents).
The transformation effectively became the model for Kemp’s HOPE vision. Despite its promising results, the new ownership was saved from bankruptcy only by a last-second investment from Chevron, an energy company looking to get in on the low-income housing tax credit established in 1986. Harbor Point lives on as a healthy, celebrated community thanks to dedicated management, but also because of the site’s easy access to public transit, a state university, and the nearby Harborwalk.
Still, as HUD secretary, Kemp remained an enigma in the Bush cabinet. In a 1993 New York Times piece called “How Jack Kemp Lost the War on Poverty,” Jason DeParle described how Kemp repeatedly clashed with White House budget director Richard Darman: “By refusing to invest more in poverty prevention, [Kemp] says, the ‘budgetmeisters’ courted more expenses later—in welfare, crime, and misery.”
American downtowns during this period, from Baltimore to San Diego, saw new skyscrapers, stadiums, and waterfront marketplaces redefine what urban life could be for a tourist or suburbanite. But for anyone who depended on low-income housing, not much improved. Kemp’s approach, DeParle wrote, carried a recklessness “that ignored their limitations, exhibiting a fervor that reminded one ally of an adolescent who had just read The Fountainhead.” Despite Kemp’s hopes for HOPE, only 135 of the country’s 1.3 million public housing units then in existence were sold before he left the agency.
Kemp’s tenant ownership ideas, however, lived on through the HOPE VI program, passed by Congress one month before the 1992 election and embraced as the anchor of community redevelopment through the Clinton administration into the 21st century. By 2010, $6.2 billion in Revitalization Grants had been awarded around the country through HOPE VI, while the Faircloth Amendment has since capped the total number of public housing units in the U.S. at 1999 levels. The ripple effects of this embrace of a Kempian path to affordable housing will likely be felt in American cities for years to come, as private developers have so far delivered mixed results in their efforts.
Public housing inventory dropped in cities across the country, often through the demolition of ubiquitous and often distressed brick towers and garden apartments. In Atlanta, where the first HOPE VI pilot grant was awarded, the city had eliminated all of its publicly owned family housing by 2011. Since then, median rents for a one-bedroom have skyrocketed while the city’s eviction rate ranks among the nation’s highest. In Chicago, public housing’s legacy is so potent that a National Public Housing Museum is expected to open in 2021 inside the only remaining building of the Jane Addams Homes. The city’s most notorious public housing project, Cabrini-Green, saw its mid- and high-rises demolished between 1995 and 2011. In the Near North Side, where the project stood, significant displacement has occurred while the local housing authority works on plans to redevelop the area in a way that meets demand for middle- and high-income homes while also fulfilling its obligations to return hundreds of publicly owned units to the site.
The legacy of these initiatives resonated with UK musician PJ Harvey on a visit to Washington during the Obama administration. Harvey took the notes she scribbled from the backseat of Washington Post reporter Paul Schwartzman’s car during a tour of Southeast D.C. and turned them into the first track of her 2016 album Hope Six Demolition Project. The song, “Community of Hope,” captures the never-ending effort to improve a neighborhood that had been essentially cut off from the wealth and power of the nation’s capital, settling for the removal of notorious public housing projects and the promise of a new Walmart.
As for enterprise zones, their spirit lives on through today’s Opportunity Zones, despite years of research proving that such programs have limited benefits. Formed under the Tax Cuts and Jobs Act of 2017, the program invites extremely wealthy people to avoid capital gains taxes by investing in (mostly) impoverished census tracts. So far, it has been great for investors, especially those who are friends or relatives of the celebrity developer turned U.S. president. But it has done little to address the poverty and underemployment faced by people already living in those tracts, even as new luxury developments sprout up around them in tracts close enough to more prosperous areas.
As ProPublica recently reported, it’ll be difficult to ever track the benefits for everyone else. Some Opportunity Zones aren’t even in poor areas, like in downtown Detroit, where mega-landlord Dan Gilbert’s preferred census tracts allow him to save money on higher-end hotel, office, and apartment projects that would likely have been built without the extra incentives.
The Buffalo neighborhoods surrounding the since-demolished stadium where Kemp made a name for himself on Sundays are now Opportunity Zones, too. After decades of population decline and continued disinvestment, they face an uncertain future alongside a burgeoning, government-subsidized medical research campus that has emerged nearby since the turn of the century. These neighborhoods remain racially and economically separate from the private-sector growth approaching their front doors, and fears over displacement—even among homeowners—have bubbled up in recent years over parking spaces, online map labels, and proposed market-rate housing, making it clear that the people benefiting from the new wave of opportunities aren’t the ones who ever needed the boost.
This type of community tension between black and white, poor and rich, isn’t what Kemp would have wanted to see. But it’s in the urban renewal playbook he helped create.
His mother remembers the window he loved so much. It was at the back of the house she was renting in northeast Baltimore, the house where her son, Deshawn Fisher, was born in 1993. This particular window was in Deshawn’s bedroom, and as he grew from an infant to a toddler, he enjoyed watching the world outside through the glass.
The frame of that same window was covered in flaking lead-based paint. By the time Deshawn was two years old, the level of lead in his bloodstream registered 11 micrograms per deciliter—six micrograms higher than the five-microgram poisoning threshold instituted by the Centers for Disease Control and Prevention in 2012. As Fisher grew up, he developed behavioral problems and ADHD. His IQ was seven points lower than it should have been, the result of a baby’s life spent around lead dust and paint chips.
According to the CDC, there is no safe amount of lead in children’s blood. Even 1 microgram of this harmful and sometimes deadly neurotoxin per deciliter of blood is enough to lower IQ by several points. Lead wreaks havoc on the mind and body: irritability, mood disorders, appetite loss, and developmental delays are all symptoms of exposure. “A sugar-size packet of lead dust throughout a two-bedroom home is enough to create a lead-poisoned child,” says Helen Meier, an epidemiology professor at the University of Wisconsin-Milwaukee’s school of public health.
For decades, lead served as a critical component in the pipes, paint, and petrol of America’s rural and urban topography. Over the second half of the 20th century, as evidence piled up regarding the negative health effects generated by chronic, low-level exposure, the law finally caught up. Starting in the 1970s, federal provisions gradually took effect to restrict or ban lead-based paint, leaded gasoline, and the use of lead pipes in plumbing.
But the specter of lead still looms in cities across the country, and its effects continue to be felt. The latest exhibit is Newark, New Jersey, where improperly treated drinking water is corroding the city’s lead service lines, allowing the toxic element to flow through residents’ faucets.
Today, about half a million American kids between the ages of 1 and 5 have a blood-lead level that exceeds 5 micrograms per deciliter. And it’s a problem that disproportionately affects children of color, according to the CDC. The impact of this contamination in American cities could be enormous. One provocative hypothesis, for example, draws a direct connection between lead exposure as a child and crime later in life. “Higher levels of lead exposure are correlated with lower test scores and higher rates of criminal activity,” says Kevin Schnepel, an economics professor at Canada’s Simon Fraser University and co-author of the 2017 research paper “Life after Lead.”
Roughly $15 billion is spent in the U.S. annually to handle new cases of lead poisoning. Fully eradicating the toxin in our towns and cities means replacing 7 million lead service lines, remediating lead paint in 38 million housing units, and cleaning up countless tons of soil contaminated by the lead spewed into the air by automobiles. One estimated price tag: About half a trillion dollars.
It’s a daunting figure. But what if the costs of failing to finally reckon with the effects of this contaminant are far worse?
“Verily, we live in an age of lead,” trumpeted Baltimore’s Afro-American newspaper in September 1906. While the article noted that iron was a “precious metal” imperative to the new industrial age, its main point was that too few people realized “how useful, if not absolutely necessary, to modern civilization” lead had become.
An abundant metal that doesn’t rust and is easy to shape, lead has long been the stuff that cities were made of—a history that Parisians were reminded of when the lead-tiled roof of the Notre Dame Cathedral went up in flames in April, spreading clouds of contamination across the city.
In America, lead was the ideal material to fashion the snaking pipes that made up vast plumbing systems underneath cities. Lead-based paint—nothing more than metallic lead corroded by an acid into a fine white powder and then mixed with linseed oil—was known for being brighter, shinier, and more durable than other paints. The very rich of the late 19th and early 20th centuries used it inside and outside of their homes; in the 1930s, lead paint, because of its toughness, was mandated for use in public housing. Lead-acid batteries cranked the automobiles Americans drove; leaded gasoline—infused with tetraethyl lead, developed by General Motors—burst onto the scene in 1923. The additive boosted power and stopped engine “knocking.” (It also poisoned the GM engineer who developed the technique, and killed several Standard Oil workers in the refinery that made leaded gasoline.)
“It’s everywhere; it’s like the skin of the urban landscape,” says Leif Fredrickson, an environmental historian writing a book about Baltimore’s history of lead poisoning. “One of the classic quotes from a doctor in the 1920s is that children will grow up in a world of lead.”
This despite a steadily increasing body of scientific knowledge that identified lead as extremely dangerous. The first observations of neurologic effects from lead exposure were recorded in Brisbane, Australia, in the 1890s. A Sherwin-Williams newsletter from 1899 published research noting “white lead is a deadly cumulative poison.” Its risks are exacerbated by another quality: It’s sweet, making flakes of lead paint an especially enticing treat to babies and toddlers fond of plumbing the contours of the world through their taste buds. Several countries in Europe banned lead paint indoors five years after evidence emerged in 1904 linking it to childhood poisoning; in 1922, members of the League of Nations banned lead paint outright. One notable exception: the United States.
“Everybody knew lead was toxic,” writes Mona Hanna-Attisha, the doctor who helped expose the lead crisis in Flint, in her book What the Eyes Don’t See, published in 2018. “[B]ut what it did to the human body was insidious and invisible, while its benefit to industry was tangible and quantifiable in dollars.”
What makes lead poisoning so pernicious is that, in small doses, there are no immediate signs of what it does to a child’s health. But once lead gets into the body—whether via inhalation, ingestion, or just from contact, as tetraethyl lead can be absorbed directly through the skin—it interrupts hemoglobin function in cells, crowds out calcium in bone, and erodes gray matter in the brain regions required to perform executive functions, like being able to pay attention and control or manage our impulses and emotions.
Beginning in 1971 with the Lead-Based Paint Poisoning Prevention Act, federal law slowly phased lead out of meaningful parts of urban infrastructure. A federal ban on lead paint in residential properties came in 1978, followed in 1986 by a ban on lead service lines in new plumbing. That same year, leaded gasoline was finally taken off the U.S. market; it was banned outright a decade later.
The fallout from so many decades of using lead to construct America persists. Belched into the air by cars and industry over decades, it has settled in the soil of school yards and neighborhood playgrounds. So widespread was its use in 20th-century America that just about every U.S. city bears the burden of lead, and investigations by news outlets into the damage wrought by lead in urban centers are easy to find: Los Angeles, St. Louis, Chicago, Milwaukee, New Orleans, and, perhaps most notoriously, Flint. About 38 million U.S. homes built before 1977 hide lead paint, the most common source of lead exposure for American kids. The U.S. Department of Housing and Urban Development estimates that 62,000 public housing units require lead abatement. And millions of underground lead pipes remain, still delivering drinking water in cities large and small.
While the burden of lead is shared by many American cities, its health impacts aren’t. Trace the instances of lead exposure in countless locations across the U.S., and you’ll find the same pattern: It’s predominantly an issue in lower-income, minority communities.
Baltimore may now be a cautionary example of the perils of lead contamination, but the city was also a pioneer in trying to curb its effects: In 1950, it was the first to ban lead-based paint in home construction. By then, however, the damage had largely been done. Hypersegregated by race and income, lower-income neighborhoods on the east and west sides of the city today bear the brunt of the effects of lead exposure.
Ruth Ann Norton, president and CEO of the Baltimore nonprofit Green and Healthy Homes Initiative, is one of the current advocates for eliminating childhood lead poisoning in the city. Since she started her job in 1993, close to 40,000 children in Baltimore have been poisoned with blood-lead levels higher than 10 micrograms per deciliter. “If you care about health, you must care about lead poisoning,” she says. “But if you care about opportunity, which is tied deeply into racial equity, we must eradicate lead.”
Doing so, as Norton says, is not a technical challenge: It’s just a matter of time, money, and political willpower.
“There’s still an ongoing low-level lead poisoning crisis taking place,” says Lawrence Brown, a lead researcher in his own right and a professor in the school of community health and policy at Morgan State University in Baltimore.
The effects of childhood lead exposure are not distributed equally in cities like Baltimore. The interactive map below uses 2018 lead test data from the Maryland Department of the Environment and U.S. Census Bureau figures to map the city’s toxic burden. In it, you can see how lower-income neighborhoods are still seeing high levels of child lead exposure, and explore the data on your own.
Epidemiologists discovered an association between lead poisoning and lower IQs and behavioral problems in the 1970s. And in recent years, some of the loudest calls for widespread remediation of lead hazards sound from those studying the link between lead exposure and crime. The lead-crime hypothesis comes courtesy of Rick Nevin, a former HUD consultant; it’s well explained by an article Kevin Drum wrote for Mother Jones in 2013. In short: Environmental lead exposure—primarily from leaded gas, secondarily from lead paint—can explain the dramatic spike in urban crime in the U.S. from 1960 to 1990. After lead was phased out by federal provision through the ’70s and ’80s, crime levels dropped in the ’90s, as children were no longer registering blood-lead levels of 10 micrograms per deciliter and higher.
Proponents of this linkage say that eradicating all environmental lead exposure, despite its cost, would naturally lead to one big benefit: about a 10 percent drop in crime, equaling about $150 billion in corresponding societal benefits, year after year.
It’s compelling, and various studies and articles by other researchers have shown a similar relationship between reducing lead exposure and a later reduction in crime rates. Nonetheless, not all who have researched the link are wholly convinced of its merit.
“There’s a lot of plausibility for thinking that lead exposure leads to brain changes, that leads to behavioral issues, that then leads to trouble, or doing things that can be categorized as crime,” says lead historian Fredrickson. “But it’s fraught.” (The relevant back-and-forth conversation about this between Fredrickson and others can be found on Twitter.)
In “Life after Lead,” Schnepel and co-author Stephen Billings study a cohort of kids in Charlotte, North Carolina. From their comparison between a control group and a lead-exposed group, they are able to show a decrease in the rate of criminal activity among the kids exposed to lead who are randomly assigned some sort of intervention. The question they still have to answer is what part of the intervention decreased crime: Was it removing lead paint in a child’s home? Or was it, for instance, simply having a guidance counselor more involved with a child acting out in class?
“Our paper certainly contributes to the idea that lead is a really important factor in criminal behavior in communities, but it’s not proof of that by any means,” Schnepel says.
What isn’t clear, in other words, is whether the correlation between lead and crime is perhaps better explained by uncontrolled factors, as University of Queensland researcher Wayne Hall points out. (Couldn’t it just be that a child grows up in a neighborhood where both crime and lead exposure are already common?)
Everyone agrees that lead should be cleaned up as a matter of public health. But Brown isn’t interested in splitting hairs over the degree of association between lead exposure and crime. It’s all interconnected, he insists.
“The literature on the effects of lead poisoning is very clear,” says Brown, who is leaving Morgan State to be the new director of County Health Rankings & Roadmaps at the University of Wisconsin Population Health Institute. “But then you can also map how what happens early on in education leads to this domino effect that’s going to impact the future trajectory of those children later in life.”
The dimming of the age of lead has been, in many ways, a great public health success story. In the late 1970s, the average American kid had a blood-lead level of about 15 micrograms per deciliter; today, that average is about 1 microgram, according to the latest CDC data.
In particular, more states have prioritized screening children for lead poisoning early on. The sorts of problems that Deshawn Fisher experienced—hyperactivity, an inability to pay attention, behavioral disorders—usually don’t present themselves until someone reaches elementary school, which is why testing children before age 5 is encouraged.
In 2016, Maryland made it a requirement to test every child under 2 for lead; as the Baltimore Sun reported in 2018, the state managed to screen just under half of all children in 2017. Stricter requirements for Maryland landlords to cover or remove lead-based paint in their properties have also led to a decrease in the number of lead poisoning cases in Baltimore. This fall, Baltimore received a $9.7 million federal grant to address lead paint in homes.
That’s cause for celebration, since it means fewer kids being poisoned to the degree they once were. In the 1950s, it wasn’t uncommon to see children—especially black children in Baltimore, according to Fredrickson—testing positive for levels of lead above 40 micrograms per deciliter, requiring stays in hospital intensive care units.
Yet thinking the fight against lead exposure is over simply because of lower microgram readings would be premature, especially given our growing understanding of what lead does to the body. In the wake of the lead crisis in Flint in 2014, researchers found that blood-lead levels in children increased by only 0.11 micrograms per deciliter. But as of the start of this school year, one in five students in Flint public schools is eligible for special education, compared with one in eight in 2012–2013, the year before the lead crisis.
In Baltimore, the shadow of lead passed over generations of kids who sometimes weren’t even made aware of the danger.
Kondwani Fidel is a poet from East Baltimore whose new book, Hummingbirds in The Trenches, explores growing up in the city’s underinvested communities. He vividly recalls officials inspecting lead “hot spots” in his house and the houses of his friends. What the 26-year-old can’t recall is those officials ever breaking down the true effects of lead exposure—because it never happened. Only later in life did he truly understand lead’s impact. Fidel remembers sitting on the stoop one day and having to read the comments underneath a Facebook post to one of his childhood friends—a friend who grew up in a home with lead—because his friend couldn’t read.
“Lead-paint poisoning, that’s traumatizing even thinking about it,” Fidel says. “You’re not able to operate in life. You don’t have a fair chance at life. So you’re already defeated.”
Norton likes to point out that just three granules of sugar’s worth of lead ingested by a child is enough to have permanent, irreversible cognitive impacts. As of this year, Illinois and Ohio have joined Michigan as the only three states where developmental therapies are available for children with even very low levels of lead in their blood.
“When I was first involved in these cases, there was no good science that showed lead levels under 20 [micrograms per deciliter] causes these problems,” says malpractice attorney Michael A. Pulver. “In the 25 years I’ve been doing them, it’s the same story over and over again: school failure, attention problems, hyperactivity.”
Hundreds of lawsuits piled up in the wake of poisonings in cities like Baltimore—not only against negligent landlords, but also against the paint industry. This legal fallout is another manifestation of lead’s lingering impact on urban America. In the spring, a federal jury awarded three men in Milwaukee $2 million apiece after they sued several paint companies that they claimed were responsible for the lead poisoning they received as toddlers. Last July, three former lead paint manufacturers agreed to a $305 million settlement in California to help remediate contamination in older homes in Oakland, San Diego, Los Angeles, and several other counties and cities.
That same month, Deshawn Fisher was awarded $2.3 million after a successful lawsuit brought against the owner of the rental property where he spent his first years. Pulver represented him. Fisher told the court he had to repeat grades in school, and while he managed to earn a high school diploma, he’s been unable to complete college courses. In court, the owner of the rental home testified that he’d never inspected the property for lead paint.
And it all started with a window, a simple childhood pleasure of Fisher’s that now frames his adult life.
Suburbs are increasingly not just where Americans live, but where they work. According to the U.S. Bureau of Labor Statistics’ Quarterly Census of Employment and Wages, 32 percent of U.S. employment is in the suburbs of large metropolitan areas—that is, in the medium- and lower-density counties within metropolitan areas that contain at least 1 million people. That is on par with the 32 percent of the population that lives in the suburbs of these metros. (A slight majority of Americans lives in suburbs overall, but this analysis looks specifically at suburbs of large metros.)
The latest BLS data show that job growth, like population growth, is faster in these suburbs than in urban counties, smaller metros, and non-metropolitan areas.
However, the jobs of the suburbs look different from city jobs. The urban revival that has drawn new residents into city neighborhoods—although overstated—has also affected where people work.
To see this, let’s start by looking closely at the kinds of jobs found in the suburbs. Naturally, with a large share of the population, the suburbs have a large share of jobs that tend to be found wherever people live. These “everywhere jobs” include retail sales roles, elementary and middle school teachers, hairdressers, and many other occupations that serve customers face to face.
Rural areas and smaller metros have the lion’s share of jobs requiring lots of land or needing proximity to natural resources, like logging workers, farmers, and mining machine operators. Urban counties have their own mix, too, with a majority of actors, producers, and directors; disproportionately high shares of economists and financial analysts; and many workers serving urban residents and visitors, such as taxi drivers and hotel employees.
But what about jobs that are distinctly prevalent in suburbs? These are a bit harder to guess. Engineering jobs are most disproportionately clustered in suburban locations. For instance, 47 percent of petroleum, mining, and geological engineers are in suburbia, compared with 31 percent of employment overall. Sales, mechanical, and chemical engineers are also unusually prevalent in the suburbs. So are several operational occupations in finance and insurance, like claims adjusters, loan interviewers, and bill collectors.
The most suburban jobs
What do these particularly suburban jobs have in common? To hire specialized engineers, it’s a plus to be in a big labor market—but their jobs often require machinery or equipment that needs too much space to justify a high-rent central city location. The financial and insurance occupations benefit from being in a large market with lots of customers, but probably can’t outbid better-paid analysts or lawyers for downtown space. In short, these suburban jobs get access to the large local economy without paying downtown office prices.
Even though we can point to jobs that cluster in the suburbs, the suburbs have a mix of jobs that look more like the economy overall than the mix in either urban counties or smaller metros and rural areas does. In other words, the most suburban jobs are less exclusively suburban than the most urban jobs (actors, economists) are urban, or than predominantly small town and rural jobs are typical of job markets there.
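The idea of a job being “disproportionately suburban” boils down to a location-quotient-style comparison: an occupation’s suburban share of employment divided by the suburban share of all employment. Here is a minimal sketch in Python using only the figures cited above; the function name is mine, not BLS terminology.

```python
# Location-quotient-style measure of suburban concentration:
# an occupation's suburban share divided by the suburban share
# of all employment (31 percent, per the figures cited above).
# A quotient above 1.0 means the job is disproportionately suburban.

OVERALL_SUBURBAN_SHARE = 0.31  # share of all U.S. jobs in large-metro suburbs

def suburban_quotient(occupation_suburban_share: float) -> float:
    """Ratio of an occupation's suburban share to the overall suburban share."""
    return occupation_suburban_share / OVERALL_SUBURBAN_SHARE

# Petroleum, mining, and geological engineers: 47 percent suburban (cited above).
print(round(suburban_quotient(0.47), 2))  # → 1.52
```

A quotient of roughly 1.5 means these engineering jobs are about half again as concentrated in suburbs as employment overall, which is the sense in which the article calls them the “most suburban” occupations.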
There are three reasons why faster job growth in the suburbs isn’t the whole story.
First, urban jobs pay more—and average wages are rising fastest in urban counties. Better-paying jobs sort into urban locations, and increasingly so. Average wages per worker in urban counties are 26 percent above those in the higher-density suburbs of the same large metros and 46 percent above lower-density suburbs.
These employment patterns are similar to residential patterns, which show that population growth may be faster in suburbs, but urban areas are gaining in higher-income households. For jobs, as well as for people, the urban revival is more about a compositional shift than faster growth.
Second, the future of work may favor urban areas. Based on the latest BLS occupational projections through 2028 and current job-location patterns, urban counties have the mix of jobs projected to grow fastest. Suburbs may have lots of jobs in some fast-growing services, but they also have a disproportionate share of operational occupations in finance and insurance that are projected to shrink.
Of course, having jobs in faster-growing occupations doesn’t guarantee that urban counties will have the fastest job growth. Constraints and costs also affect where jobs go, and lower-cost suburbs might lure some jobs in occupations that have traditionally located in central urban areas.
Third: the urban revival in job growth is more of a catching up than a pulling ahead. Taking the long view, suburban job growth has outpaced urban job growth for decades. But the gap has narrowed since the recession of the late 2000s, and urban jobs rebounded in tandem with the suburbs’ jobs. That’s in contrast to much of the period prior to the recession, when job growth in cities often lagged behind even non-metropolitan areas.
Since employment projections favor cities, the job-growth gap between suburban and urban counties might remain narrow—although the widening gap between large metros and the rest of the country might persist.
More important, the long view also reveals that urban counties, suburbs, smaller metros, and rural areas tend to move together. They all do better in booms and suffer in recessions. In most years, the differences in job growth between urban counties, suburbs, and other areas are small, relative to how the ups and downs in the economy affect job growth in all types of places. Suburbs do better when the overall economy does; the same goes for cities and rural areas. Despite the debates over the urban revival and suburban growth, we’re all in it together.
Keep up with the most pressing, interesting, and important city stories of the day. Sign up for the CityLab Daily newsletter here.
What We’re Following
Stump the mayors: Tomorrow, three of the five former and current mayors who are running for president will have to answer to a council of their city-leading peers. In Iowa, the U.S. Conference of Mayors will hold a Local America Presidential Forum with former mayors Cory Booker and Julián Castro, current Mayor Pete Buttigieg, and two non-mayoral candidates: billionaire Tom Steyer and Senator Amy Klobuchar.
The conference of mayors will present the White House hopefuls with a city hall-centered policy agenda for 2020 chock-full of specific priorities from housing to infrastructure. But the forum also underscores a more general message: Mayors say they’re more in touch with the priorities their citizens are actually talking about. CityLab’s Sarah Holder has the details on the mayors’ slate of proposals: Mayors to Presidential Hopefuls: Listen to Cities
In a circular city, “reduce-reuse-recycle” will replace “take-make-dispose”. Urban mobility will be carbon-neutral, relying on low- to zero-emission vehicles within a broader energy network powered by renewables. Cities and businesses will also generate savings from using recycled building materials and turning waste into fuel to power buses.
In other words, circular cities will blend ancient approaches with modern technologies. But how will they do it, and where will the money come from?
Today’s musing comes from CityLab Executive Editor David Dudley, who reflects on a recently departed prolific novelist:
The American writer Stephen Dixon, who passed away on November 6 at age 83, hammered out 18 novels and about 600 pieces of short fiction, the most recent of which came out last month. He was a two-time National Book Award finalist, but despite his prodigious output and loads of literary prizes, he needed a day job to pay the bills; his knotty, challenging, experimental fiction never sold well.
That job was teaching writing at Johns Hopkins University, where I met him as an undergrad in the late 1980s.
Dixon was an imposing figure, a laconic former reporter with a Lower East Side accent and no-guff demeanor. He wrote his fiction on a manual typewriter, which was getting weird even back in 1987, and his work vibrated with all manner of urban anxieties. The 1988 novel Garbage chronicled a bar owner’s doomed battle against corrupt municipal trash collectors. In 1995’s Interstate, a drive-by highway shooting launches a looping, post-modern nightmare narrative that repeats and restarts. Random violence, menacing strangers, and the workaday annoyances of city life filled Dixon’s stories, which felt perfectly attuned to the dysfunctional atmosphere of that era.
I didn’t work closely with Dixon at Hopkins, but I loved the badassery of his writing and was awed by the relentlessness of his freelance hustle: He gave writing students a copy of his guide to pitching magazines, something he insisted, against all evidence, that we should be doing. This typewritten document, which I still have, listed dozens and dozens of publications, from Playboy and Esquire to scads of teeny now-defunct magazines, and gave names of editors, rates, and unvarnished insider tips on what to try and sell them. Dixon seemed to approach the whole Art of Fiction thing with a refreshing absence of pretense; writing was more like steamfitting or hanging drywall, a craft performed by hand, every day, until you got halfway good at it and could get paid. For me, that turned out to be an approach that worked.
Many years later, when I had a teeny now-defunct magazine of my own, I had an opportunity to publish a Stephen Dixon short story (“Mr. Greene,” which also appears in the 2010 Fantagraphics collection What Is All This?). It’s a surreal, scary, and very Dixon-esque fantasia of random violence erupting in suburbia. Go pick that book up, Navigator readers, or, really, any one of his works: Other contemporary writers got more famous, but I’m not sure anyone did a better job of capturing the uneasy energies of modern American life.
What we’re writing:
¤ One day as a mascot in Times Square. (Mel Magazine) ¤ It’s ski season! Here’s your essential guide to mountain architecture. (Curbed) ¤ Let there be night skies. (Huffington Post) ¤ For one matchmaking company, your loneliness is worth $725. (Washington Post) ¤ One hiker’s journey across California—on foot. (Longreads) ¤ Your two-hour delay is good business for airport restaurants and online companies. (Slate) ¤ The Mona Lisa is holding the Louvre hostage. (New York Times)
Views from the ground:
@spartsuno captures a towering building in Melbourne. @ahmiich visits the “warp square” of Superkilen park in Copenhagen. @mikekowal people-watches at Pershing Square in New York City. @dontgiveafiddlestick finds children playing at the ancient Banganga Tank in Mumbai.
Showcase your photos with the hashtag #citylabontheground and we’ll feature them on CityLab’s Instagram page or pull them together for the next edition of Navigator.
“I’m still flying at four thousand feet when I see it, that scarcely perceptible glow, as though the moon had rushed ahead of schedule. Paris is rising over the edge of the earth.”
At the end of his grueling 33-hour solo flight over the Atlantic, Charles Lindbergh was searching for the airport, north of the French capital, on which to land the Spirit of St. Louis. The pilot would recall the unconventional but dazzling navigation aid he used: “Far below, a little offset from the center, is a column of lights pointing upward, changing angles as I fly—the Eiffel Tower. I circle once above it and turn north-eastward.”
In those days, the Eiffel Tower was less a solitary beacon and more a constellation. It was illuminated by 250,000 light bulbs, spelling out the word Citroën. From 1925 to 1934, this symbol of Paris—and indeed of modernity itself—was a colossal advertisement for a company, helmed by a former arms manufacturer, that was headed for bankruptcy.
Advertisements tell us about much more than the products and services they promote. They tell us about desire, how it changes, and how it, and thus we, are manipulated. Like many revelatory urban features, advertising signage is ubiquitous to the point of becoming almost invisible. Yet we read cities as much as we inhabit and traverse them.
In cinematic aerial footage of cities, we are often presented with the blank facades of skyscrapers. But the closer to street level we get, the closer to the part of the city we navigate, we find that cities are a riot of lettering and symbols. The city itself is a form of visual language. Advertising is everywhere. It is a pictorial cacophony that we’ve grown used to.
We are not as immune as we might think to its powers. It reflects who we are, or want to be, while threatening to overwhelm us. And yet, often despite itself, it can connect us to the past, to the local, and to senses of meaning.
How to stand out in a visual cacophony
The first aim of signage is to stand out. Grabbing the attention of passersby was easier in the past, when there were fewer signs. Competition, however, has long been fierce. Ingenuity has always been required to gain an edge.
In the ruins of Pompeii and Herculaneum, archaeologists found revealing signs on buildings—a dairy was marked by an engraving of a goat, a stonemason by tools, a wine merchant by two figures hauling an amphora, presumably full of wine.
In Japan, shop signs known as kanban were often carved out of wood or bamboo. They served a similar purpose as their Roman equivalents: Combs, vegetables, swords, and wigs, among other objects, informed citizens of the wares on sale. But they were sometimes presented with such decorative skill and attention—for instance, a gold-lacquered carp leaping into a waterfall, representing a pharmacist—that they became works of art.
Due to their intrusive quality, signs can arouse irritation as much as curiosity. In Daniel Defoe’s novel A Journal of the Plague Year, he laments one of the side-effects of the outbreak of bubonic plague in London in 1665. “[A] wicked generation of pretenders to magic” made their fortunes in superstition, and
this trade grew so open and so generally practiced that it became common to have signs and inscriptions set up at doors: “here lives a fortune-teller,” “here lives an astrologer,” “here you may have your nativity calculated,” and the like …
There were more risks than the exploitation of credulity. The craze in London for hanging signs resulted in accidents, such as one in 1718 in Bride Street, when four people were killed by a falling sign that pulled part of the facade off the building. This resulted in periods when such signage was banned in England.
By the 19th century, advertising had colonized cities. In earlier times, the cacophony of salespeople had been aural, as encapsulated by Hogarth. The neon-lit American roadside might stand as the zenith, or nadir, of this approach: flashing motel signs grabbed the attention of drivers and relied on their impulsivity. Signs became huge to stand out at cruising speed—often surpassing, and outliving, the buildings they represented (“the big sign and the little building is the rule of Route 66”).
As long, straight highways crossed the desert toward the city, advertisers gained space and time to create narratives, through the use of staggered signs and doggerel that would gradually unfold as cars passed. Burma Shave was a famous example, with ditties like, “Our fortune / Is your / Shaven face / It’s our best / Advertising space / Burma-Shave,” and “Don’t take / a curve / at 60 per / We hate to lose / a customer / Burma-Shave.”
With the Highway Beautification Act of 1965, the U.S. government attempted to legislate against what was seen as visual pollution. While this affected small businesses, who’d made roadside and barn signage an unlikely form of outsider art, it ultimately did little to halt the advance of corporate advertising. Ever more elaborate ways of gaining and holding the public’s interest continued to unfold. In Margaret Bourke-White’s famous 1937 photograph of Louisville flood victims, African Americans queue at a flood-relief agency underneath a billboard—featuring a jubilant middle-class white family straight out of Hollywood schlock—that announces, “There’s no way like the American way.” The image was prescient enough to be used as the basis for the cover of Curtis Mayfield’s album There’s No Place Like America Today (1975).
Perhaps it is relevant still. At some point, advertisers move from supplying services and products to manufacturing insatiable desires and statuses that they cannot or will not help people attain. The images on the billboards may be beautiful, tantalizingly so, but there is a difference, and even an antagonism, between beauty and truth.
During the four years he lived at Rome’s Villa Medici as a recipient of the prestigious Grand Prix de Rome, Tony Garnier spent hardly any time on the study of isolated ancient monuments, as was required. Instead, the young architect from Lyon, France, focused his energy from 1899 to 1903 on what would later become his theoretical chef d’oeuvre: a utopian plan for an industrial city.
“If our structure remains simple, without ornament, without molding, bare everywhere, we can then dispose of the decorative arts in all their forms,” he wrote in Une Cité Industrielle (An Industrial City), published as a book in 1917. The book is a detailed collection of avant-garde designs for a socialist city of 35,000 people. This hypothetical city is heavily industrialized and zoned, divided according to four functions: housing, work, leisure, and health. Garnier advocated for the use of concrete in building, as well as the importance of greenery, natural light, and collective social amenities.
An Industrial City was a bridge between the utopian socialism of Charles Fourier and the Garden City idea of Ebenezer Howard, on one side, and Modernist city planning on the other.
In 1919, Garnier received a letter from a young admirer named Charles-Édouard Jeanneret, who had just encountered An Industrial City. “It is a milestone clearly delimiting a past period and opening up all possible hopes… In ten years, [your book] will be the foundation of all production and be the first rallying sign,” he wrote.
Today, Garnier is not nearly as well known outside of France as Jeanneret (or Le Corbusier). But “one could say that Garnier is to Lyon what Antoni Gaudí is to Barcelona,” said Catherine Chambon, director of the Tony Garnier Urban Museum, an open-air museum devoted to the architect in Lyon, France’s second-largest city. There’s not a neighborhood in the city where his presence isn’t felt.
This year and into 2020, the city is celebrating the 150th anniversary of Garnier’s birth. The Tony Garnier Urban Museum has put up an exhibit; the municipal archives have mounted one, too, focusing on the fruitful professional relationship between Garnier and former Mayor Edouard Herriot. The city’s Renaud Foundation will display Garnier’s paintings, drawings, plans, and photographs.
Garnier, a son of canuts, or silk workers, was born in the working-class Croix-Rousse neighborhood of Lyon on August 13, 1869. Growing up in modest conditions, where people worked and lived in the same space, led Garnier to consider the social aspect of housing from an early age.
His youth also coincided with a crisis in the textile industry. Small workshops shuttered to make way for big, mechanized factories. With these economic changes came pulmonary illnesses, to which he lost his mother and two aunts. Sanitation and hygiene came to assume great importance in municipal projects during Garnier’s tenure as city architect.
Schooling was not compulsory at the time, but Garnier’s father insisted on educating him. He revealed himself to be a talented student and made it to the École des Beaux-Arts in Paris. After spending four years on scholarship in Rome and one year traveling around the Mediterranean, Garnier returned to his home city. The mayor, Victor Augagneur, gave Garnier his first assignment in 1905: the construction of a municipal dairy. Augagneur then warmly recommended Garnier to his successor, Edouard Herriot.
It is impossible to talk about Garnier’s work without mentioning his decades-long collaboration with Herriot. “Here is a visionary architect dedicated to social progress. And here’s a radical socialist mayor, who has great ambitions for his city in terms of health and housing. They didn’t see eye to eye on all subjects, obviously, but their ideas about Lyon’s future converged,” said Chambon.
Garnier completed about 80 projects over his career, most of them in Lyon. Herriot commissioned what are now seen as hallmarks of the city’s architecture: the popular Halle Tony Garnier, which was originally built as a cattle market and slaughterhouse; the Grange Blanche Hospital, now known as the Edouard Herriot Hospital; and a stadium, the Stade de Gerland.
One afternoon in Lyon this past July, Elodie Morel, who works for GrandLyon Habitat, a social-housing management company, pointed me to a five-story building. “Come up,” she said. We visited a sunny two-bedroom apartment with a balcony, overlooking an open space planted with trees. We were at Cité Tony Garnier—a housing estate of 1,500 apartments with 3,000 residents in the Etats-Unis neighborhood.
In the early 20th century, this part of Lyon was neglected, so “the municipality decided to use it for a public housing project for workers in factories nearby,” said Morel. Garnier, an established architect by then, was hired for the job and finished the estate in 1933. It was a model of social housing with the latest comforts. Every apartment had running water, a gas connection, and a toilet, luxuries that were hard to come by in working-class neighborhoods at the time. For the sake of convenience, each building was standardized with only one type of apartment—one, two, three, or four bedrooms—and the buildings were organized in islands served by a network of orthogonal streets and courtyards.
This new district was as close as Garnier came to his ideal city. “However, he could not include all the public amenities he envisaged, such as a swimming pool and a library,” Chambon noted. “The habitation was also more dense [than he initially planned], owing to economic constraints between the two wars.”
Toward the end of the 20th century, Garnier’s legacy was forgotten even in the housing complex that bears his name. The specter of demolition also loomed, because the buildings were run-down. Long-time residents got together and decided to try to save the estate.
Elsewhere in Lyon, a group of young artists and architects had just established CitéCréation, an initiative to create large-scale urban murals, inspired by Diego Rivera’s work in Mexico. Together, the residents of Cité Tony Garnier, the muralists, and OPAC du Grand Lyon, a social housing company, launched a major rehabilitation project in 1985. Today, there are 25 murals on building walls in the area, drawing thousands of tourists a year. Some of those murals showcase Garnier’s visionary designs.
During a recent walking tour in the neighborhood organized by CitéCréation, a group stood in front of a huge mural. A car slowed down and a man told them: “I live here. I know about these murals.” Other local residents share his pride in this chronicle of their history and homage to Garnier, who once wrote: “There is enough ideal in the worship of beauty and benevolence to render life splendid.”
Long ago, Jane Jacobs showed us how dense, diverse urban neighborhoods filled with short blocks and old buildings were catalysts of innovation and creativity. But when economists and urbanists measure innovation, they typically look at big geographic areas like metros. What Jacobs described, though, was the micro-geographic texture of much smaller neighborhoods like her own Greenwich Village.
A new study, forthcoming in the Review of Economics and Statistics, takes a close look at the effect of small urban neighborhoods—and in particular of key characteristics of their physical layout—on innovation. Its author, Maria P. Roche, a doctoral student at the Georgia Institute of Technology’s Scheller College of Business, compares the rate of innovation (based on patents granted between 2011 and 2013) to two neighborhood characteristics that capture older, more compact neighborhoods built before the mass onset of the automobile: street density (based on the total miles of streets shared by cars and pedestrians) and the percentage of housing stock built before 1940.
The study includes a range of control variables to capture the role of amenities like bars and restaurants; the role of a particular type of human capital or talent, measured as the concentration of college grads and inventors; the presence of key knowledge institutions like universities and colleges; and physical characteristics such as land area and bodies of water. To get at this micro-geography of innovation, the study looks at Census block groups and tracks the connection between neighborhood form and innovation in more than 120,000 block groups across the country.
The study finds that neighborhood form—in particular the density and layout of its streets—has a considerable effect on innovation. It finds that a ten percent increase in street density or connectivity is associated with a 0.05 to 1 percent increase in innovation. This is in line with previous studies which find that a ten percent increase in employment density results in a two percent increase in per capita patenting over a ten-year period and that a ten percent increase in highway connectivity leads to a nearly two percent increase in patenting across metro areas over a five-year period.
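These comparisons are easiest to read as elasticities: the percent change in patenting implied by a percent change in the driver. A minimal sketch, using only the ranges cited above (the elasticity framing and the helper function are my paraphrase, not the study’s notation):

```python
# Translate the cited findings into a common elasticity form:
# elasticity = (% change in patenting) / (% change in the driver).
def pct_outcome_change(pct_driver_change: float, elasticity: float) -> float:
    """Percent change in patenting implied by a percent change in the driver."""
    return pct_driver_change * elasticity

# Street density (this study): a 10% increase is associated with a
# 0.05 to 1 percent increase in innovation, i.e. elasticities of 0.005 to 0.1.
low, high = pct_outcome_change(10, 0.005), pct_outcome_change(10, 0.1)
print(low, high)  # → 0.05 1.0

# Employment density (prior work cited above): 10% -> ~2%, elasticity ~0.2.
print(pct_outcome_change(10, 0.2))  # → 2.0
```

Seen this way, the street-density effect is smaller per unit than the employment-density effect from earlier work, but of the same order of magnitude at the upper end of the range.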
Neighborhoods with higher street density not only have more patented innovations, but more citations of the patents they generate. This suggests that neighborhoods with denser streets help facilitate greater knowledge exchange and higher levels of interaction over the ideas they generate, as Roche told me via email. The report reads: “Studies comparing citation data with surveys of inventors have detected a strong correlation between patent citations and knowledge flows.”
The study also finds population, employment, and amenities like bars and restaurants to be positively associated with neighborhood level innovation. Roche sees these as factors that work together with the layout of streets and neighborhood form to spur interaction between people—the exchange of knowledge and ideas that ultimately generate new innovations.
For too long, we’ve seen innovation as something that takes place in corporate R&D (research and development) centers, university laboratories, and suburban office parks. But as Jane Jacobs long ago said, new innovations are more likely to come from the density and diversity of urban neighborhoods.
These Jacobs-identified factors have tended to elude economists and urbanists, who have lacked the detailed neighborhood-level data and analysis needed to track and identify them. Until now, most studies of the geography of innovation have tracked innovations or startup companies broadly across cities and metro areas. Roche’s study uses detailed data to help us better understand how factors of urban form interact with density to shape geographic micro-clusters of innovation at the neighborhood level. Innovation turns not only on the presence of universities or the concentration of talent and human capital, but also on physical characteristics like the street layout and form of the neighborhood.
As Roche puts it, her findings provide real empirical “support for the idea that the actual physical capacity to connect people and ideas may, in fact, be one reason why cities, and some neighborhoods are more conducive for innovation than others.” The forms and structure of our cities and neighborhoods are not add-ons or afterthoughts. They are key features of the innovative fabric that powers our economy as a whole.