Keep up with the most pressing, interesting, and important city stories of the day. Sign up for the CityLab Daily newsletter here.
What We’re Following
Rolling in the green: We told you little vehicles were going to be big business, and now Uber is buying in even more. Bloomberg reports the company is joining Google Ventures in a $335 million funding round for the scooter- and bike-sharing company Lime. The move comes just a week after Lyft bought Motivate, which operates docked bikeshare systems in some of the largest cities in the United States. Uber already lets users find Jump’s dockless e-bikes in the app, and plans to do the same for Lime’s rental scooters, accelerating the venture-capital-boosted race to the curb.
As ride-hailing companies embrace multimodal mobility, “dockless” vehicles are not without their detractors. Last week, Milwaukee announced a lawsuit against the scooter company Bird, which is run by a one-time Uber executive (Milwaukee Biz Times). Still, there are promising signs that tech companies now see value in a variety of transportation options, and we might yet see people get out of their cars and reconnect with city streets (New York Times).
One Nation, Undercount
With a highly contested citizenship question planned for the next Census, vulnerable communities are bracing to be undercounted in 2020. But the history of manipulating the Census goes as far back as the Articles of Confederation.
The Census as a tool for representative democracy seems simple enough: count the people and apportion power to represent them. But making the count an inherently political machine has made it possible to exploit, especially as the ever-shifting concept of race began to evolve, and as people moved from rural to urban America. On CityLab, visual storyteller Ariel Aberg-Riger draws the throughline on this all-too-familiar story with A Visual History of the U.S. Census.
What We’re Reading
No, “drunk walking” is not causing the rise in pedestrian deaths (Streetsblog)
Former New Orleans Mayor Mitch Landrieu may or may not run for President (Politico)
Starbucks will say goodbye to plastic straws in 2020 (NPR)
Our cities are getting hotter—and it’s killing people (Curbed)
Please LeBron, don’t try to ride your bike to work in Los Angeles (Los Angeles Times)
Tell your friends about the CityLab Daily! Forward this newsletter to someone who loves cities and encourage them to subscribe. Send your own comments, feedback, and tips to firstname.lastname@example.org.
The trains that roll through the Channel Tunnel between London and Paris have proved a roaring success since they launched in 1994, and a new proposal could make them accessible to even more people.
Getlink, the company that manages the tunnel, is exploring lower-cost routes through the tunnel that could bring ticket prices 25 to 30 percent below current rates. The idea is to follow an existing French model for cheaper intercity travel that cuts costs by using older tracks and suburban terminuses. So far, research into a cheaper service has focused on whether and how it might be feasible, rather than exactly when—but it has found that such a service is eminently viable, along the lines of low-cost operators already running in France. Before that happens, however, the company will have to iron out a few kinks in the international train system—and find its place in an already busy and fairly affordable market.
To North Americans, travel between London and Paris at existing prices might already seem like a steal. Last-minute fares can be pricey, but book far enough in advance and you can expect to pay less than $100 for a round-trip ticket, whether by plane or train. Booked early, the train can even be cheaper than flying: looking at tickets for this coming October, the lowest round-trip prices for both train and plane are around £58 ($77), making the plane the more expensive option once you factor in the cost of traveling to the airport. It’s only in the month prior to departure that train prices move markedly ahead of planes.
So how could a new train service find a niche in this market? By treading a middle path: being slightly cheaper than the current Eurostar train service, and slightly more convenient than going out to a far-flung airport. That’s how France’s current low-cost train services, Izy and Ouigo, work. They cut costs by running from cheaper suburban railway stations and on older tracks, where fees per mile are lower. The trick to their survival is that these stations aren’t that far out of the way, while their speed still leaves even the fastest Amtrak services in the dust.
Ouigo’s service between Paris and Marseille, for example, may require passengers to board the train way out in suburban Marne La Vallée, but with a journey time of three hours and nineteen minutes, it only takes four minutes longer than the fastest service available through TGV, France’s intercity high-speed rail service. Izy’s service from Brussels to Paris, meanwhile, takes an hour longer than the superfast one-hour-and-twenty-two minute service provided by Thalys, but it still serves the exact same city-center stations.
This seems to be the model for the new Getlink service, which would take an estimated three hours to travel between Paris and London. That’s admittedly longer than the fastest train service currently available (two hours and seventeen minutes)—but bearably so. The thornier question is which stations to use. Opt for a terminus that’s too far out of town and lengthy onward transit could make the service far less attractive.
Just as significant is the issue of passport controls. As with Eurostar’s new London-to-Amsterdam service, British border requirements mean that each station within the Schengen common border area that is served by an international service needs its own border checkpoints on site. British customs refuse to check passports on the train itself, meaning the Paris and London stations would need to create fully secured platforms with booths for border guards in order to accommodate these cheaper rail services.
London already has a suitable candidate for such a facility, suggests the French newspaper Journal du Dimanche: Stratford International Station. It was set up next to the 2012 Olympic Park to cater to international arrivals, but was never actually used for that purpose. Stratford is almost ready to go as an international station; it’s also well connected to London’s public transit network and not that far out of town.
In Paris, French media speculation suggests that the most likely candidate would be the station at Roissy Charles De Gaulle Airport, which is already used by some Ouigo services. There, it’s possible some border staff could be shared by the airport. The airport’s location is, as you’d expect, decidedly suburban, but it is on the RER suburban rail network, with trains to the heart of the city taking a little over half an hour.
All of these options make the cheap train plan feasible, and the study gives the green light to move to the next stage: building border facilities and negotiating fees. It may still take some time to begin those preparations, however. In the long run, a successful version of the plan might seriously challenge the wisdom of operating flights between the British and French capitals. It’s not just that flying is a more polluting form of transit. When it’s not faster, cheaper, or more convenient, one has to wonder exactly who would continue to fly.
This week, national governments are convening in New York for the final round of negotiations toward a Global Compact for Migration (GCM). The gathering has an ambitious goal: to develop a comprehensive approach to the subject at a time when record numbers of people are on the move and attendant politics are extraordinarily fraught. Yet an essential voice will be missing from the conversation: the voice of local governments.
Cities like New York are responsible for providing access to essential public services, including health and education, for all of their residents. They have enormous experience implementing policies that address the needs of both newcomers and longtime residents. New York City’s government offers programs like IDNYC, a municipal ID card, and ActionNYC, which provides high-quality immigration legal services. These are powerful examples of good practice.
New York City is not alone in its efforts. Recognizing the essential role civic participation plays in full integration, the Brazilian city of São Paulo established a Municipal Council of Immigrants that provides a mechanism for immigrant residents to participate in the formulation, implementation, and monitoring of the city’s policies. In Barcelona, Spain, the city’s Service Center for Immigrants, Emigrants and Refugees offers information and free advice on immigration, asylum, emigration, and voluntary return to all residents, regardless of status.
Yet despite the enormous contribution cities make to migration governance, they are not able to participate in the formal negotiations unless invited to do so by their national government. For American cities, that prospect is a nonstarter—the Trump administration has withdrawn from the process entirely.
But these institutional barriers are not stopping cities from raising their voices. Late last year, a dozen U.S. cities and more than 130 international ones sent a letter to the co-facilitators of the Global Compact process, committing to contributing to the process. Since then, they have taken it upon themselves to provide feedback on the evolving draft. Last month, New York City, which has a unique role as host to the United Nations, submitted detailed comments on behalf of an informal coalition of more than 40 global cities, and the Commissioner of the Mayor’s Office of Immigrant Affairs addressed the co-facilitators in an informal session.
The cities’ submission to the GCM outlines several things cities want to see in the final agreement:
First, cities want to be recognized as partners in developing policies, not just implementing them. Urban leaders have specialized knowledge on issues of social cohesion. It would be a missed opportunity for national governments, humanitarian actors, and UN agencies not to draw from their expertise.
Second, cities want a clearly defined role in implementation and follow-up of the GCM. Cities will have very real operational responsibilities as attention shifts from designing the GCM to implementing it. They will need outlets to express concerns and provide updates on developments unfolding within their jurisdictions.
Third, cities want references to local authorities separated from references to other local actors. Municipalities (governments) are distinct from local actors such as NGOs; they have very different legal responsibilities and mandates. The most recent draft of the GCM clarifies this point by denoting that the “whole-of-government” approach should include all levels of government. It should reinforce the distinction in the sections on implementation and follow-up.
National governments may resist efforts to codify a clear role for municipalities in international migration governance. That is partly out of concern for protecting their sovereign right to set policy on issues surrounding visas and borders. Yet none of the changes cities are requesting constitute a challenge to that authority. The reluctance of national governments may also reflect discomfort with the rise of city diplomacy across a range of global issues, of which migration is only one. This is unfortunate, since there is little reason to think that the space for international engagement is finite and influence is zero-sum.
Cities want more of a voice in migration governance for a very practical reason: They are doing the work of providing safety and dignity to both newcomers and longtime residents, and they need support to do it. Ultimately, the success of these efforts is in the interest of member states, which benefit when their constituent communities are cohesive and well-functioning. New York City’s diversity is a strength: Immigrants own 52 percent of New York City’s businesses. Last year, they contributed an estimated $195 billion to the national GDP.
The compact will be finalized at the end of this week, and formally adopted in December. During these next six months, New York City will seek opportunities to meet with member states to discuss how it can contribute to implementing the policies under consideration and to achieving the compact’s goals. We encourage other cities to engage their national governments and share relevant practical lessons from their experiences.
If GCM implementation successfully engages city stakeholders, it could serve as a model for globally negotiated agreements on a range of issues, including sustainable development and climate change. Achieving progress in these domains depends significantly on the commitment of local governments to reaching agreed benchmarks.
Today, the international community would not launch a major, multi-year global governance process on migration, or nearly any other topic, without engaging a broad range of civil society actors. Three decades ago, that might not have been the case. Likewise, few discussions on such challenges would take place without thought to private sector engagement. Just one decade ago, that wasn’t so. Our hope is that a decade from now, local authorities will similarly be viewed as essential partners in global problem solving, in particular when it comes to migration. Cities are at the center of some of our era’s greatest challenges and will be indispensable to solving them. As former UN Secretary-General Kofi Annan said, “The future of humanity lies in cities.”
Until she moved to Fresno, California, in 2003, Janet DietzKamei had never experienced asthma. But after just a few years in a city notorious for its filthy air, she developed the condition. Now she tracks the air herself, with a small pollution monitor at her home.
DietzKamei’s monitor, made by PurpleAir, is part of a network across California’s San Joaquin Valley, run in part by the Central California Environmental Justice Network. By putting monitors in backyards and around schools, the group is hoping to see what the area’s biomass plants and the dozens of trucks that rumble through are pumping into the lungs of disadvantaged residents.
Measuring air quality has been the purview of state environmental regulators, who rely on monitors approved by the U.S. Environmental Protection Agency that cost tens of thousands of dollars. That data is used to send out bad-air alerts (the green, orange, and red warning days) and for regulatory purposes.
But these readings show only a narrow slice of the air, based on a handful of monitors that may not be placed where the worst pollution is. Advances in technology have produced smaller sensors as cheap as $250, meaning that environmental activists, community groups, and curious citizens can map out air pollution around their schools, parks, or backyards.
This could eventually reshape air-pollution regulation, with previously unmeasured areas gathering data on air they say could violate federal health standards. In western Colorado, the environmental group Citizens for Clean Air has put up two dozen low-cost monitors around Grand Junction to supplement the two state-run monitors in the Grand Valley. In a region grappling with wildfire smoke, increased truck traffic, and natural-gas pollution, activists say a stronger web of monitors is necessary to prove to the state that more attention needs to be paid to them.
“The state does what they can with what they have to work with,” said Karen Sjoberg, the group’s leader. “They’ve got monitors in the best locations they can and they’ll do studies on that, but we need low-cost versions where we live.”
Even in large cities, which tend to get more attention because of their higher populations, low-cost sensors are being used to glean localized air-quality data. In addition to Fresno, take Salt Lake City, where pollution is a fact of life: The city sits in a basin, and wintertime inversions trap a thick coat of visible smog over the city for days at a time. Shea Wickelson, a high-school chemistry teacher at the Salt Lake Center for Science Education, said students begin thinking about pollution when recess is canceled on bad smog days.
“If you’re having that experience from elementary school, you’re very aware of air quality,” Wickelson said. “Students are coming up with questions like, ‘How is the air quality inside versus outside?,’ or ‘How does premium fuel compare to regular fuel?,’ or ‘How is the air around a school bus?’”
Answering those questions hasn’t been easy, but a partnership with the University of Utah has helped. The AirU program has students building their own particulate-matter sensors, starting with toy blocks, a cheap Arduino computer board, and a photo resistor that scatters light to detect particles of pollution. Students can use the tissue-box-sized monitors for science-fair projects, but they’ve also created a data-rich map of pollution around the city.
“Our lower-income areas have not always been very well represented, because people have other concerns than thinking of how to monitor air quality,” said Kerry Kelly, a chemical engineering professor at the University of Utah who oversees the program. “We’re getting real-time maps of the city’s microclimates. As this valley develops, this can help you manage where you’re putting things like schools.” Similarly, in Denver, Google has worked with Aclima to put the company’s low-cost sensors on street view cars to map pollution around the city.
The new generation of monitors is made possible by advances in laser technology. Monitors can capture air through a fan, then use a laser to count the number and size of particles in the air. Adrian Dybwad, the founder of PurpleAir, said he first started tinkering with air sensors to see what his family was breathing from a nearby gravel pit in Salt Lake City. An infrared sensor from the internet was too dependent on temperature, but he tested a modified laser sensor he got online against official regulatory monitors and found a 95 percent correlation.
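The 95 percent correlation Dybwad describes can be reproduced with a few lines of arithmetic. The sketch below is illustrative only: `pearson_r` is our own helper, and the paired readings are hypothetical, not PurpleAir data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical hourly PM2.5 readings (micrograms per cubic meter)
# from a low-cost sensor and a co-located regulatory monitor:
low_cost  = [12.1, 15.4, 22.8, 35.0, 28.3, 18.9, 14.2, 11.5]
reference = [10.8, 14.1, 20.5, 31.2, 26.0, 17.4, 13.0, 10.9]

r = pearson_r(low_cost, reference)
print(f"correlation: {r:.3f}")
```

A high correlation means the cheap sensor tracks the reference monitor’s ups and downs even if its absolute readings are biased; a simple linear calibration can then correct the offset.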
After initially giving the monitors away, Dybwad’s company has now sold hundreds around the country, resulting in a real-time nationwide map on the company’s website.
“We call it high-resolution air sensing,” Dybwad said. “Having the ability to know what’s in your air, it gives people peace of mind.”
The technology works well for particulates, the pollution that can come from dust, smoke, and diesel exhaust and can lodge in the lungs and bloodstream. Ozone pollution, or ground-level smog, requires more complex readings of temperature, humidity, and gas composition, which remains a barrier to building monitors that are both cheap and accurate for every pollutant.
As would be expected, accuracy is a challenge—the monitors require calibration, can be affected by temperatures, and may be susceptible to, say, a backyard barbecue or a bug that flies into the sensor. They’re not precise enough for regulatory purposes, and some states have warned citizens against calling in with outrageously high readings that are most likely a glitch.
That said, some state agencies have embraced the low-cost brands. Colorado recently deployed some PurpleAir monitors to communities threatened by wildfires in the southwest of the state, a way to see where smoke was traveling so they could warn residents.
The EPA has been running trials for wearable sensors and an air monitor that could be installed in a park bench, to put it closer to roads and parks. As hardware continues to get smaller and battery life advances, some are even looking toward a future where monitors are stitched into clothing or clipped onto a jacket for a minute-by-minute reading.
Kelly, the University of Utah professor, said the possibilities for wearable sensors could be endless.
“Think of a crossing guard, or someone in a woodworking shop—we can understand their exposure and maybe find ways to minimize it,” she said. “If you’re an asthmatic, this can change your behavior. There’s so much information we can find.”
The final day of Mobilize Dar es Salaam, June 28th, 2018, began with the plenary, “Advancing Inclusive City Design from Fringe to Mainstream.” On the premise that an equitable city takes into account the needs of everyone— including women, children, elderly people, and people with disabilities—in transport planning, the session explored ideas and dilemmas of designing inclusive transit systems.
The webinar, on Wednesday, July 18th at 2 p.m. CT, will cover the powerful upending of the electricity market structure now on the doorstep, the implications for centrally planned power systems, and the policies that align market rules with the democratizing power of solar and energy storage.
Pinkeye struck the National Building Museum in 2015, just two weeks into the run of “The Beach.” At least one visitor said that she caught the bug at the D.C. museum’s summer spectacular. Hard to prove, but easy to believe, given that the attraction was a magnet for germs: a ginormous, roiling ball pit, filled with 750,000 plastic balls.
The risk was worth it, plenty of visitors decided. “The Beach”—designed by New York’s three-man design collective, Snarkitecture—drew more than 183,000 visitors over its two-month run. That puts the exhibit in a category all by itself in the museum’s history. “The Beach” cemented the Building Museum’s annual architectural folly series as a rite of summer.
Since then, the museum has hosted “Icebergs” by James Corner Field Operations and “Hive” by Studio Gang, smaller shows that weren’t so laser-targeted at the K–5 demographic. This year, the beach is back—and there’s an entire beach house to go with it.
A whole retrospective of Snarkitecture’s work since 2010 opened at the Building Museum on July 4. “Fun House” is a whole lot bigger than “The Beach.” The massive house structure features 40 different objects and installations, from a Rube Goldberg machine–like palace that runs on marbles to an awesome mountain of pillows for building forts. For kids, this show must be nirvana in built form, a crystallization of dreams they maybe never knew they had. For adults, though, “Fun House” might feel more like a summer bummer: a repeat, a retread, and even a missed opportunity.
Home is the superstructure for “Fun House.” The show subdivides into exhibition areas that correspond to the parts of the home, each one highlighting a different example of Snarkitecture’s work. So the “foyer” of the home is a recreation of “Dig,” the firm’s 2011 project for the Storefront for Art and Architecture in New York. For that show, Snarkitecture filled the space with EPS architectural foam, turning a void into a solid, then excavated a way through it using simple tools. Just as with “The Beach,” Snarkitecture’s “Dig” has been copied and pasted into “Fun House” (in a slightly smaller format). “Fun House” is a carnival tour through the many clever—and occasionally brilliant—ideas that Snarkitecture has brought to life over the past decade.
The conceit will be lost on children, of course, who aren’t going to ask questions about “Marble Run,” a sculpture that Snarkitecture debuted at the Delano South Beach Hotel as part of Design Miami in 2015. A tower of snaking chutes, it’s a simple (but large) play toy rendered in Snarkitecture’s signature all-white aesthetic. Kids deposit black glass marbles and watch them go. Then it’s off to the next thing, whether that’s war over couch cushions in the living room (“Pillow Fort,” 2012) or wandering through a maze of hanging fabric strips in the bedroom (“Light Cavern,” 2015).
Maria Cristina Didero, an independent curator based in Milan, is the architect behind this meta-architectural showcase. “Fun House” is as much her show as it is Snarkitecture’s; some of the most graceful touches in the show come in the places where she has organized the company’s products. “Cast Light” (2011) and “Broken Ornament” (2012)—both sculptures made of cast gypsum cement that appear in the “Fun House” kitchen—illuminate how the designers use trompe-l’œil, obstructions, and fractures to rethink everyday things.
Snarkitecture has the air of a Silicon Valley startup, a trio that likes to move fast and break things. For “Eames Chair” (2014), a project for Design Within Reach, the firm wrapped the iconic Eames lounge chair and ottoman in shrink-wrap plastic and set a torch to it. “Tilt Coaster” (2014) and “Slip Chair” (2017) are manufactured just so to be unusable. Its work is design for an era that prioritizes experiences over things: all-white-everything objects that represent a kind of negation of the object. Snarkitecture is the Marie Kondo of the architecture world.
The problem with “Fun House,” then, is that Building Museum viewers already know all about Snarkitecture. Visitors got a thorough-going look at the company’s monochromatic vision with “The Beach” just three years ago. While the survey is as fun as the work—no small feat—what’s it doing here?
Washington, D.C., seems especially prey to a category of art that might be called spectacle. “Wonder,” a series of over-the-top installations at the Renwick Gallery—ostensibly the Smithsonian Institution’s craft museum—drew some 732,000 visitors to the museum over eight months after its 2015 reopening. The Hirshhorn Museum and Sculpture Garden clocked 1 million visitors in 2017 thanks to Infinity Mirrors, its honey-trap assembly of Yayoi Kusama’s delightful (and socially shareable) mirror environments.
Snarkitecture’s second appearance at the museum in almost as many years raises the question: Is the Building Museum taking architecture seriously? There are programs in other cities that feature follies, too, namely the Serpentine Gallery’s annual pavilion series in London. This year’s edition is a somber structure designed by Mexico City’s Frida Escobedo. It wouldn’t work nearly as well as a place to dump the kids for a couple of hours.
Running concurrently with “Fun House” is “Evicted,” an exhibit based on Matthew Desmond’s award-winning and haunting look at the justice and housing crisis facing America’s most vulnerable families. It’s hard to imagine a higher contrast in museum shows, between entertainment and education. And of course, museums need to do both. But the rush to capture crowds has generated serious inflation in D.C.—and elsewhere—that’s measurable in ever-escalating spectacles.
At the very least, there are lots and lots of designers who deserve the platform, and the Building Museum isn’t doing viewers a service by going back to the well. It isn’t even doing wonders for Snarkitecture, whose provocations look like a safe bet this second time around. Real risk might look different. There’s always next summer.
Cities are growing at a faster rate than any other habitat on the planet, as Planet Earth II noted in its “Cities” episode. Sure, describing cities as a “habitat” is a little jarring, but the urban landscape is so much more than people, buildings, and roads. Everything from preserving wild forests to creating planned public parks from scratch reveals how much we strive to remain part of nature.
The road trip is a classic staple of American life—there’s even an obsessively detailed map of cross-country travels in American literature. But despite the cliche, roadside attractions and novelties continue to fascinate, from awesome diner food to jaw-dropping rest stops. Last summer, our CityLab on the Road series detailed the towns and characters that sprung up along the Lincoln Highway, the first cross-country road in the United States, dedicated in 1913.
In 1956, the U.S. began building a much more ambitious—and sometimes infamous—countrywide road network: the Interstate Highway System. The map above from Geotab, a GPS-based fleet-tracking management company, shows how the network evolved over time to become a nationwide system of more than 49,000 miles, making road journeys accessible from nearly any city. President Eisenhower’s massive public works project marked a dramatic shift in United States road building, from constructing public trails that connected cities to massive highways that gutted them. While that wasn’t the plan at the start, some scholars have argued that the Interstate Highway System should have been two separate systems: roads between cities and roads within cities.
330 million: Visitors to National Park Service parks in 2017
1.3: Square miles in New York City taken up by Central Park
17: Square miles in New York City taken up by on-street parking
$5.82: Estimated public benefit delivered for each dollar spent planting trees
50 million: New trees Britain plans to plant to create a coast-to-coast forest
10 billion: Tons of concrete produced around the world each year
Commuters check Google Maps for traffic updates the same way they check the weather app for rain predictions. And for good reason: By pooling information from millions of drivers already on the road, Google can paint an impressively accurate real-time portrait of congestion. Meanwhile, historical numbers can roughly predict when your morning commute may be particularly bad.
But “the information we extract from traffic data has been exhausted,” said Zhen (Sean) Qian, who directs the Mobility Data Analytics Center at Carnegie Mellon University. He thinks that to more accurately predict how gridlock varies from day to day, there’s a whole other set of data that cities haven’t mined yet: electricity use.
“Essentially we all use the urban system—the electricity, water, the sewage system and gas—and when people use them and how heavily they do is correlated to the way they use the transportation system,” he said. How we use electricity at night, it turns out, can reveal when we leave for work the next day. “So we might be able to get new information that helps explain travel time one or two hours in advance by having a better understanding of human activity.”
In a recent study in the journal Transportation Research Part C, Qian and his student Pinchao Zhang used 2014 data to demonstrate how electricity usage patterns can predict when peak congestion begins on various segments of a major highway in Austin, Texas—the 14th most congested city in the U.S. They crunched 79 days’ worth of electricity usage data for 322 households (stripped of all private information, including location), feeding it into a machine learning algorithm that then categorized the households into 10 groups according to the time and amount of electricity use between midnight and 6 a.m. By extrapolating the most critical traffic-related information about each group for each day, the model then predicted what the commute may look like that morning.
When compared with 2014 traffic data, they found that 8 out of the 10 patterns had an impact on highway traffic. Households that show a spike of electricity use from midnight to 2 a.m., for example, may be night owls who sleep in, leave late, and likely won’t contribute to the early morning congestion. In contrast, households that report low electricity use from midnight to 5 a.m., followed by a rise after 5:30 a.m., could be early risers who will be on the road during rush hour. If the researchers’ model detects more households falling into the former group, it might predict that peak congestion will start closer to, say, 7:45 a.m. rather than the usual 7:30.
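The two patterns described above can be sketched with simple rules on overnight usage. To be clear, this is an illustration of the intuition, not the study’s actual model, which learned its 10 groups from 322 real households; the function names, thresholds, and readings here are all hypothetical.

```python
def classify_household(hourly_kwh):
    """Label a household from six hourly electricity readings, midnight-6 a.m.

    The thresholds and categories loosely mirror the patterns the study
    describes; they are illustrative, not the study's algorithm.
    """
    night    = sum(hourly_kwh[0:2]) / 2   # midnight-2 a.m. average
    pre_dawn = sum(hourly_kwh[2:5]) / 3   # 2-5 a.m. average
    dawn     = hourly_kwh[5]              # 5-6 a.m.
    if night > 1.5 * pre_dawn:
        return "night_owl"       # up late, likely leaves late, misses early rush
    if dawn > 1.5 * pre_dawn:
        return "early_riser"     # up before 6, likely on the road at peak
    return "other"

def predict_peak_shift(households):
    """Shift predicted peak-congestion onset later (in minutes)
    when night owls outnumber early risers."""
    labels = [classify_household(h) for h in households]
    owls  = labels.count("night_owl")
    early = labels.count("early_riser")
    return 15 if owls > early else 0   # e.g. 7:45 a.m. instead of the usual 7:30

sample = [
    [0.9, 0.8, 0.2, 0.2, 0.2, 0.3],   # spike midnight-2 a.m. -> night owl
    [0.2, 0.2, 0.2, 0.2, 0.2, 0.6],   # rise after 5 a.m.     -> early riser
    [0.8, 0.9, 0.3, 0.2, 0.2, 0.3],   # another night owl
]
print(predict_peak_shift(sample))
```

With more night owls than early risers in the sample, the toy model pushes the predicted onset of peak congestion later, which is the qualitative behavior the study reports.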
“In addition to using a plain machine learning model to make the connection between two systems, we also try to use our intuition to explain [the human activity] behind the patterns of electricity use,” said Qian.
He also acknowledged that the sample size is small. Making a more accurate prediction model will not only call for more numbers, but also more variety of data. Ideally, the predictions would factor in historic and real-time traffic data, along with weather updates, public transit information, and insights into how other utilities are used. (A spike in pre-dawn toilet flushing? Prepare for a jammed-up morning rush.) Knowing more detail about the households themselves would only improve the model—but Qian said it’s not necessary.
“For the manager of roadway systems, what they’re interested to know are the ‘what if’ scenarios,” he said. “If there is an accident, or if there is extreme weather conditions, or if the city adds one more lane or change the parking prices [downtown]—things you can’t learn from the past.” Managers also want to better understand drivers’ behavior and proactively manipulate it to reduce congestion. There’s no doubt that Google Maps is already helpful to drivers, he added, “but it was never designed from the city manager’s perspective.”
Right now, this study is only a “proof of concept,” according to Qian, to show that it’s worth the effort for cities to explore this kind of data—and make it publicly available. In fact, he said, the hardest part is accessing the information, which is usually spread across public agencies without any sort of coordination or guarded by private utility companies.
“One paper won’t address all the issues, but it’s a starting point,” he said. “If more people realize the importance of analyzing data, it’s a good win-win for the public and private industries.”
Readers of CityLab have likely seen numerous arguments condemning the competition among North American cities to land the second Amazon headquarters. Urban policy experts are nearly unanimous in their opprobrium for the HQ2 selection process and their skepticism that the “winning” city will reap the rewards of economic growth and job creation that Amazon has promised. (And even if it does, the price tag for accommodating HQ2 may negate or severely diminish any fiscal benefit.)
For cities eager to lay the groundwork for long-term economic growth, what are better options? Can any single development gambit generate such a windfall of new jobs?
Yes, but it will not happen through a development strategy based on luring outside corporations with taxpayer incentives. Rather, this kind of big bang is far more likely to happen when a homegrown local startup gets acquired or goes public, transforming early employees and investors into millionaires.
These beneficiaries can become invaluable mentors and investors for a succeeding generation of local startups; many may use their earnings to become entrepreneurs themselves. A major business acquisition or IPO—not the HQ2 charade—is the kind of jackpot that economic developers should pursue.
I have seen how the effects of such a big bang rippled through the Washington, D.C., region, where I live. During the 1990s, America Online (AOL), then based in Tysons Corner in Northern Virginia, dominated the dial-up consumer internet market. AOL was valued at $125 billion when it merged with Time Warner in 2000, a transaction that made many executives and employees very wealthy.
That merger was not ultimately successful, but the imprimatur of AOL remains visible across D.C.’s business landscape two decades later. Along with several AOL alumni, company CEO Steve Case went on to found Revolution, a Washington, D.C.-based venture firm with hundreds of millions in capital that has made numerous investments in local startups like Optoro. Case also founded Revolution Health, which employed hundreds of people in the early 2000s, including Tim O’Shaughnessy, the future co-founder of e-coupon company LivingSocial, which at its peak was valued at over $1 billion. O’Shaughnessy now makes venture investments as president of D.C.-based Graham Holdings, while LivingSocial alumni have founded local startups such as Framebridge and Galley. All of this activity traces back to one company, AOL, and its merger with Time Warner.
Or jump to a city on the other side of the coast: Seattle. In his book The New Geography of Jobs, economist Enrico Moretti notes that in the 1970s, Seattle was a struggling industrial city whose fortunes were closely tied to those of its major employer, the aerospace firm Boeing, which had downsized dramatically during that decade. But in 1979, the region became the beneficiary of what may have been the most consequential business relocation of all time: Microsoft’s move from Albuquerque to Bellevue at a time when the company employed just a couple dozen people. (Note that Microsoft received no public incentives to induce its move, and I’ve found no evidence that economic development officials took particular note of it.) Microsoft thrived in the early days of personal computing, going public in 1986. Since then, its employees have founded a slew of companies that are today at the center of Seattle’s thriving tech scene, from Zillow to Inrix to Vulcan Capital.
These sorts of economic big bangs can happen anywhere, not just on the coasts. Just last December, Birmingham-based e-delivery startup Shipt was acquired by Target for $550 million. Shipt was founded by a Birmingham entrepreneur who managed a local team of over 200 and raised capital from Alabama investors—all of whom got a windfall. Birmingham is poised to reap the benefits of the Shipt acquisition for many years to come, including a bevy of new mentors and investors supporting a new generation of startups.
So what is the lesson for your town’s economic development boosters? City leaders are wise to stay close to startups that are starting to scale, meeting with them regularly, congratulating them on raising a new round of capital, and solving whatever issues arise regarding regulation, workforce, or infrastructure. Once a company has hundreds of employees, it’s likely that other states will dangle big checks in exchange for relocation. A company whose leaders and employees feel tied to its community is less likely to be tempted to jump ship.
And if an incentive battle does break out, the city where the startup is based will enjoy home court advantage, with employees already settled into their homes and commutes and their children enrolled in local schools. Meeting and maintaining relationships with fast-growing startups may seem obvious, but in my experience economic development leaders are often unaware of which local companies have raised the most venture capital—an obvious indicator of future growth.
Last fall, 238 cities invested countless staff hours putting together proposals for Amazon, hoping to prevail in an economic development competition so intense and expensive it has earned comparisons to the Olympics. Most cities have been quiet about what it cost to draft their HQ2 proposals, but not all. Virginia Beach, for example, spent $100,000, not including staff hours that could have gone toward other projects.
But what if cities like Virginia Beach had instead invested those resources into outreach to learn about their five fastest-growing local startups and solve their most pressing challenges? Regulations could have been streamlined, new community college courses created, and zoning requirements tweaked—at a fraction of the cost of a corporate relocation incentive package. And the cities’ time and money would have been much more likely to spark an economic big bang than the frivolous pursuit of HQ2.