US Population Trends Don’t Support Cost-Effective Train “Mass” Transit

COST Commentary: For some time, “Smart Growth” supporters have promoted the idea that people are returning to the central cities and higher densities as a living choice. This has been one of the major themes in supporting the expenditure of huge amounts of tax dollars on rail mass transit.

The five articles below, by three different authors, present a wealth of information and suggest that there is absolutely no evidence to support the mythical trend of people moving back to the central core cities. The fact that the world's urban population has recently exceeded its rural population for the first time has fueled the myth of this "back to the city" movement. As pointed out below, there are countertrends affecting urban growth and a blurring of the traditional distinctions between rural and urban living. While rural population has shifted to urban areas, core municipality (city) populations have shifted to the outer urban areas (suburbs). Meanwhile, rural living has become more like urban living in many ways.

The bottom line is that there is no general trend of increasing densification in the core areas which rail mass transit primarily serves. The vast majority of people's living choices and actions do not support the often-stated rationale for building ever more expensive, heavily tax-subsidized rail transit systems. In fact, the real trends indicate, more than ever, that the most cost-effective transit systems will be modern bus systems, which have: 1) the flexibility necessary to meet changing demands in a timely manner, and 2) the much lower costs necessary to serve a much greater portion of the population needing transit.
______________________________________________________________________________________

The Ambiguous Triumph of the “Urban Age”
by Robert Bruegmann in Newgeography.com
09/13/2011

In its State of World Population report in 2007, the United Nations Population Fund made this ringing declaration: “In 2008, the world reaches an invisible but momentous milestone: For the first time in history, more than half its human population, 3.3 billion people, will be living in urban areas.”

The agency’s voice was one of many trumpeting an epoch-making event. For the last several years, newspaper and magazine articles, television shows and scholarly papers have explored the premise that because most of the world now lives in urban rather than in rural areas things are going to be, or at least should be, different. Often the conclusion is that cities may finally get the attention they deserve from policy makers and governments. This optimism dovetails nicely with a sizeable literature of urban advocacy chronicling the rejuvenation of central cities and extolling the supposed virtues of high-density city living, even predicting the withering away of the suburbs.

This supposed triumph of the urban is fraught with ironies, however. The first is that, rather than a simple rush of people from the hinterlands into the centers of high-density cities, there has been, within almost every urban area in the world, a significant move of the population outward, from dense city centers into peripheral suburban areas and beyond them into very low-density exurban regions.

We can use Paris as a typical example. The city of Paris reached its peak population of nearly 3 million in the 1920s. It has lost nearly a third of its population since then. What remains in the city is a smaller and wealthier population. At the same time the suburbs, accommodating both families of modest income forced out of the city as well as a burgeoning middle class, have grown enormously, from two million to over eight million. And this does not count a great deal of essentially urban population that lives in a vast ring of exurban or “peri-urban” settlement. Certainly the majority of “urban” dwellers in the Paris region do not live in the elegant apartment blocks along the great boulevards familiar to the tourist. They live in houses or small apartment buildings in the suburbs and use the automobile for their daily transportation needs.

In fact, Paris is a good example of an even more fundamental irony. At the very moment when urban population has been reported to surpass the rural, this distinction has lost most of its significance, at least in many parts of the affluent world. Two hundred years ago, before automobiles, telephones, the internet and express package services, cities were much more compact and rural life was indeed very different from urban life. Most inhabitants of rural areas were tied to agriculture or industries devoted to the extraction of natural resources. Their lives were fundamentally different from those of urban dwellers.

Today the situation has changed radically. Most people living in areas classified as rural don’t farm or have any direct connection with agriculture. They hold jobs similar to those in urban areas. And although they might not have opera houses, upscale boutiques or specialized hospitals nearby, the activities that take place in these venues are available to them in ways that they never were before.

I can confirm the way the distinction between urban and rural has broken down by looking out the window of the house in Omro, WI where I am staying this weekend. Omro, population about 3000, is located 8 miles west of Oshkosh and is legally a city under Wisconsin law. It is also an “urban” place according to the Census Bureau, which, like its counterparts in other countries, defines urban largely by density standards. In the case of the US, this means, in simplest terms, a density of at least 1000 people per square mile, or about 1.6 people per acre.
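The conversion from the Census Bureau's square-mile threshold to a per-acre figure is simple arithmetic (one square mile contains 640 acres); a quick sketch:

```python
# Convert the Census Bureau's urban density threshold from people per
# square mile to people per acre (1 square mile = 640 acres).
ACRES_PER_SQUARE_MILE = 640

def people_per_acre(people_per_square_mile):
    return people_per_square_mile / ACRES_PER_SQUARE_MILE

print(people_per_acre(1000))  # 1.5625, i.e. about 1.6 people per acre
```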

At one time this 1000-people-per-square-mile figure did provide a logical demarcation line. Above those densities were places that could afford urban services like public water and sewers, sidewalks, streetlights, municipal fire departments and libraries. Below that level were places that either didn’t have these services or had to depend on faraway county governments. Unless you were closely associated with agricultural production or other rural economic activities or you were wealthy enough to provide your own services, it was quite inconvenient to live in rural areas.

Today, the automobile, rural electrification, the internet and the rise of alternate and privatized services have transformed what it means to live in rural areas. “Country living” today has few of the drawbacks that made it inconvenient for middle class residents as recently as fifty years ago, and the migration of so many urbanites into the country has blurred the distinction between urban and rural.

The view out my window bears this out. When I look in one direction, what I see are city streets and houses on land that is technically urban. Of course, Omro, with a single main street, two traffic lights and only a handful of stores, is not at all the kind of place that most people associate with the words “city” or “urban.” Like the majority of small urban places in this country, its densities are lower than those found in the suburbs of larger cities. When I look in the other direction, I see mostly fields beyond the city limit. But, unlike the case in the past, there is no sharp divide. There has been a significant increase in the number of houses out in the area that is technically “rural.” Some of these used to be farmhouses, but there are few farmers anywhere for miles around. Most farming is now done under contract or as a large industrial-scale operation.

Most of the houses in the “rural” area around Omro have been built in the last decade or two and never housed anyone with any direct connection to farming. They are suburban in appearance and mostly inhabited by people who work at home, are retired or commute some distance to jobs spread across a vast swath of urban territory that stretches from Fond du Lac south of Lake Winnebago to Green Bay where the Fox River meets Lake Michigan.

The result is that today, as you drive outward from the center of Fond du Lac, Oshkosh, Appleton or Green Bay, the number of houses per square mile diminishes, but there is no clear break between city and country. It is a crazy quilt of agricultural, residential and other uses. Commuting patterns, if charted on a map, would form a giant matrix of lines running in all directions. Whether one is in the center of Oshkosh or 50 miles away, however, one can still live an essentially urban existence.

This same diffused urban condition holds true for very large swaths of the United States wherever there is enough underground water to allow wells. It is particularly conspicuous in the older and more densely settled eastern part of the country. A state like New Jersey exhibits a pattern of dense older cities, radiating suburbs, vast exurban territories and farmland and open space, overlapping in ways that confound traditional notions about what is urban and rural. In places like New Jersey, the census distinction has lost almost all of its meaning.

I don’t mean to suggest that the news that the majority of the world’s population is now urban has no significance. In fact this move from the countryside to urban areas has been one of the defining events of world history over the last several centuries. Although this process was mostly finished in Western Europe and the United States decades ago, it still continues in most of Latin America, Africa and Asia and accounts for a great deal of the dramatic upward surge in income throughout the world.

Nor am I suggesting the demise of the great cities of Europe or America. Far from it. Many rich families in particular will probably continue to choose high-density neighborhoods like those on the Upper East Side of New York or the 16th arrondissement in Paris, although often with a rural retreat as well. As the world gets wealthier, more people may choose to live in this way.

However, current trends give no reason to believe that places like Manhattan or central Paris are going to increase in population and density as part of a “back-to-the-city” movement. As cities gentrify, they undoubtedly become more attractive, but increased demand leads to higher prices, keeping out many families who might choose to live in them. Furthermore, the gentrifiers tend to have smaller families than those they replace, and they also tend to demand more room, larger and better-equipped housing units, and more parks and open spaces. Because of this, the gentrifiers, citing the need to preserve existing neighborhoods, frequently put up all kinds of barriers to new development and increased population and density, particularly by less affluent citizens. For all these reasons, existing city centers in the affluent world are unlikely to accommodate a significantly larger percentage of the population.

Even in the developing countries, as urbanist Shlomo Angel has shown, most cities are spreading outward at ever lower overall densities, just as cities in the affluent West have been doing for many years. For many people, in both the affluent and the developing world, low-density suburban living, and increasingly even lower-density exurban living, remains alluring. In fact, we might even be seeing the initial stages of a major reversal of the kind of urbanization that characterized industrializing cities in the West in the 19th and early 20th centuries. The sharp increase in houses outside Omro may presage at least a partial return to a pre-industrial condition seen, for example, in nineteenth-century America, when people were more evenly spread across the landscape.

This continuing urban sprawl is, of course, deplored by many of those who celebrate the supposed triumph of the “urban age.” Yet as I have argued in my book Sprawl: A Compact History, this phenomenon is by no means as bad as most anti-sprawl crusaders imagine it to be. Continuing to spread the population could conceivably result in a more equitable, more sustainable pattern of living, particularly as renewable energy and other resources are harvested close to home with less need of the giant systems necessary to maintain our dense industrial-age cities. In any case, despite all of the planning regulations put in place in cities throughout the affluent world to control growth at the edge, the periphery continues, inexorably, to expand almost everywhere.

Nowhere does the evidence suggest that we are witnessing the final triumph of the traditional high-density city. In fact, the much-ballyhooed urban majority might be in great part a statistical artifact, a way of counting the population that over-emphasizes the move from country to city and fails to account for the powerful counter-movement from the city back toward the countryside. Indeed the emerging reality of overlapping patterns of high density centers, lower-density peripheries and vast areas of very low density urban settlement, all of them interspersed with agricultural lands and protected open spaces, threatens to upend altogether the traditional notion of what it means to be urban.
__________________
Robert Bruegmann is professor emeritus of Art History, Architecture and Urban Planning at the University of Illinois at Chicago.
__________________________________________________________________________________
__________________________________________________________________________________
Cities and the Census
by Joel Kotkin and Wendell Cox in City Journal
6 April 2011

The new data show urban America neither on the way out nor roaring back.

For many mayors across the country, including New York City’s Michael Bloomberg, the recently announced results of the 2010 census were a downer. In a host of cities, the population turned out to be substantially lower than the U.S. Census Bureau had estimated for 2010—in New York’s case, by some 250,000 people. Bloomberg immediately called the decade’s meager 2.1 percent growth, less than one-quarter the national average, an “undercount.” Senator Charles Schumer blamed extraterrestrials, accusing the Census Bureau of “living on another planet.” The truth, though, is that the census is very much of this world. It just isn’t the world that mayors, the media, and most urban planners want to see.

Start with the fact that America continues to suburbanize. The country’s metropolitan areas have two major components: core cities (New York City, for example) and suburbs (such as Westchester County, Long Island, northern New Jersey, and even Pike County in Pennsylvania). During the 2000s, the census shows, just 8.6 percent of the population growth in metropolitan areas with more than a million people took place in the core cities; the rest took place in the suburbs. That 8.6 percent represents a decline from the 1990s, when the figure was 15.4 percent. The New York metropolitan area was no outlier: though it did better than the national average, with 29 percent of its growth taking place within New York City, that’s still a lot lower than the 46 percent that the region saw in the 1990s.

This may be shocking to some. For years, academics, the media, and big-city developers have been suggesting that suburbs were dying and that people were flocking back to the cities that they had fled in the 1970s. The Obama administration has taken this as gospel. “We’ve reached the limits of suburban development,” Housing and Urban Development secretary Shaun Donovan opined in 2010. “People are beginning to vote with their feet and come back to the central cities.” Yet of the 51 metropolitan areas that have more than 1 million residents, only three—Boston, Providence, and Oklahoma City—saw their core cities grow faster than their suburbs. (And both Boston and Providence grew slowly; their suburbs just grew more slowly. Oklahoma City, meanwhile, built suburban residences on the plentiful undeveloped land within city limits.)

All this suburbanization means that the best unit for comparison may be, not the core city, but the metropolitan area; and the census shows clearly which metropolitan areas are growing and which are not. The top ten population gainers—growing by 20 percent, twice the national average or more—are the metropolitan areas surrounding Las Vegas, Raleigh, Austin, Charlotte, Riverside–San Bernardino, Orlando, Phoenix, Houston, San Antonio, and Atlanta. These areas are largely suburban. None developed the large, dense core cities that dominated America before the post–World War II suburban boom began. By contrast, many of the metropolitan areas that grew at rates half the national average or less—San Francisco, Los Angeles, Philadelphia, Boston, New York—have core areas that are the old, dense variety. Planners and pundits may like density, but people, for the most part, continue to prefer more space.

If you do look at cities themselves, rather than at larger metropolitan areas, you’ll see that the census reveals three different categories. The most robust cities, with population growth over 15 percent for the decade—Raleigh, Austin, Charlotte, Las Vegas, Jacksonville, and Orlando—were located within the kind of metropolitan area that urbanists tend to dislike: highly suburbanized, dominated by single-family homes, and with few people using public transit. That’s partly because these cities developed along largely suburban lines by annexing undeveloped land and low-density areas. This has been the case in virtually all the fastest-growing cities. Raleigh has expanded its boundaries to become 12 times larger than it was in 1950; Charlotte and Orlando are nine times larger, and Jacksonville an astounding 25 times larger.

At the opposite end of the spectrum are core cities, mostly in the Midwest and Northeast and often land-constrained, that have continued to shrink. These include longtime disaster zones like Detroit and Cleveland as well as newer ones like Birmingham in the South. They include Pittsburgh, a city much praised for its livability but one that is aging rapidly and whose city government, based disproportionately on revenue from universities and nonprofits, is among the nation’s most fiscally strapped. They even include Chicago, which lost some 200,000 people during the 2000s, its population falling to the lowest level since the 1910 census. The reasons aren’t hard to identify: despite all the hype about Chicago’s recovery and the legacy of Mayor Richard M. Daley, the Windy City is among the most fiscally weak urban areas in the country, its schools are in terrible shape, and its economy is struggling.

Finally, there are cities that have grown, but not quickly. New York City’s population, for example, inched to a record high in the 2000s, but that growth was less than the national average. The population of Los Angeles grew a mere 97,000—the smallest increase since the 1890s. Many of the slow-growing cities (New York, San Francisco, and Boston, for example) suffer from high housing costs, which inhibit population growth. But they also host high-end industries—finance, technology, and business services—and enough well-paid workers in these industries to afford pricey housing and sustain a small rate of growth. The cities also attract already wealthy people from elsewhere.

The census provides information on a smaller level, too, telling us not just which cities have grown, but where the growth has taken place within cities. Often, it has been in and around the historic downtowns. This is a trend in many cities that otherwise differ starkly (New York, St. Louis, Chicago, Los Angeles), and it reflects a subtle shift in the role of the downtown. Rather than reasserting themselves as dominant job centers, downtowns are becoming residential and cultural—a change that H. G. Wells predicted when he wrote that by 2000, the center of London would be “essentially a bazaar, a great gallery of shops and places of concourse and rendezvous.” What may have been an office, industrial, or retail zone morphs into a gentrified locale attractive to the migratory global rich, to affluent young people, and to childless households.

This downtown recovery (which many cities subsidized heavily) was partly why so many urbanists and developers identified a broader back-to-the-city movement; but in reality, the phenomenon was usually limited to a relatively small population and a relatively small area. Since 1950, for example, St. Louis has lost a greater share of its population than any American city ever boasting 500,000 or more residents. The area from downtown to Central West End experienced strong growth during the 2000s, however, adding more people than Portland’s Pearl District, a favorite of urban planners. Yet this gain of 7,000 people was far from enough to offset the loss of 36,000 in the rest of St. Louis.

It’s also worth noting that in economic terms, downtowns are losing their hold. For example, though the residential population of Chicago’s Loop tripled to 20,000 in the past decade, that famed business district lost almost 65,000 jobs; its share of the metropolitan area’s employment also fell. Los Angeles’s downtown, whose population has likewise grown, lost roughly 200,000 jobs from 1995 to 2005. Manhattan is losing employment share to the other four boroughs, as it has been for decades; but as a recent report from the Center for an Urban Future reveals, the process accelerated over the last ten years. From 2000 to 2009, Manhattan lost a net 41,833 jobs, while other boroughs saw net increases. This employment dispersion is even more evident in the suburbs. Of commuters who live in the inner-ring suburbs (such as Yonkers and East Orange), 60 percent work in their home counties and only 14 percent in Manhattan. Of commuters from such outer-ring suburbs as Haverstraw and Morristown, 73 percent work in their home counties and 6 percent in Manhattan.

What, in the end, does the census tell us about America’s cities today? Certainly not that they’re dying, as they threatened to do in the 1950s, but equally certainly that they aren’t roaring back. Cities remain a successful niche product for a relatively small percentage of the population. Most people, though, even in the New York metropolitan area, continue to move toward the periphery rather than the core. That said, New York’s continuing growth over the past decade suggests that its recovery will likely prove durable. As for Senator Schumer’s “another planet” allegations, the census is simply confirming the fact that terrestrial Americans continue to disperse, both within and among metropolitan areas. So far, there’s little that planners, policy makers, and urban boosters can do about that.
________________

Joel Kotkin is a Distinguished Presidential Fellow in Urban Futures at Chapman University in California, an adjunct fellow with the London-based Legatum Institute, and the author of The Next Hundred Million: America in 2050. Wendell Cox is principal of Demographia, a public policy firm in the St. Louis metropolitan area, and a former member of the Los Angeles County Transportation Commission.
_______________________________________________________________________________
_______________________________________________________________________________

FINAL CENSUS RESULTS: CORE CITIES DO WORSE IN 2000s THAN 1990s

by Wendell Cox 03/24/2011 in Newgeography.com

Based upon complete census counts for 2010, historical core municipalities of the nation’s major metropolitan areas (over 1,000,000 population) captured a smaller share of growth in the 2000s than in the 1990s.

The results for the 50 metropolitan areas (New Orleans is excluded due to Hurricane Katrina, and Tucson unexpectedly failed to reach 1,000,000 population) indicate that historical core municipalities accounted for 9 percent of metropolitan area growth between 2000 and 2010, compared to 15 percent in the 1990-2000 period. Overall, suburban areas captured 91 percent of metropolitan area population growth between 2000 and 2010, compared to 85 percent between 1990 and 2000.
Total population growth in the historical core municipalities was 1.4 million, nearly all of it in municipalities with a largely suburban form (such as Phoenix, San Antonio and Charlotte). This compares to an increase of 2.9 million during the 1990s.

Suburban areas (areas in metropolitan areas outside the historical core municipalities) grew 15.0 million, down from 16.1 million.

Overall, the major metropolitan areas added 14 percent to their populations in the 2000s, down from 19 percent growth in the 1990s. The historical core municipalities grew 4 percent, compared to the 1990s rate of 7 percent. Suburban areas grew 18 percent, compared to the 1990s rate of 26 percent (all data unweighted).
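The share figures quoted above follow from simple arithmetic on the census growth counts. A minimal sketch, using the aggregate 2000-2010 figures given in this article:

```python
# Reproduce the growth-share arithmetic from the figures quoted above
# (population growth in millions, 2000-2010).
core_growth = 1.4       # historical core municipalities
suburban_growth = 15.0  # suburban areas

total_growth = core_growth + suburban_growth
core_share = core_growth / total_growth

# Rounds to the 9 percent core share reported in the article.
print(f"{core_share:.0%} of metropolitan growth in core municipalities")
```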


_________________________________________________________________________________
_________________________________________________________________________________
PERSPECTIVES ON URBAN CORES AND SUBURBS

by Wendell Cox 03/13/2011 in Newgeography.com

Our virtually instant analysis of 2010 census trends in metropolitan areas has generated wide interest. The principal purpose is to chronicle the change in metropolitan area population and the extent to which that change occurred in the urban core as opposed to suburban areas.

From a policy perspective, this is especially timely because of the recurring report that suburbanites have been moving to the urban core over the last decade. We have dealt with this issue extensively, noting the lack of data for any such interpretation. As of this writing, with data for more than half of the major metropolitan areas (over 1,000,000 population) in, there remains virtually no evidence that people are “moving back to the city” (actually, most suburban growth came from outside metropolitan areas, not from the “cities”).

The Policy Context: Urban Cores and Suburbs

This discussion is not new, and generally pits anti-automobile interests – including much of the urban planning community – who favor the urban development patterns of prewar America against those who would prefer allowing people to make their own choices about where they live or work.

Over the past 60 or more years, the data indicate that consumers have nearly exclusively chosen less dense and more suburban areas. This is not to suggest, however, that many of us, including this author, automatically favor suburbs over urban cores. Indeed, I have enjoyed years of alternating between living in suburban America and the urban core of the (inner) ville de Paris (arrondissements I, II, V, VII and XI). But a taste for urban living does not make high-density cities inherently superior to suburban living. People, after all, have different preferences.

Urban areas include both urban cores and suburbs. The delineation of urban cores and suburbs is subjective. There was, for example, a time – say around 1820 – when development to the north of New York’s Houston Street would have been considered suburban. More than two-thirds of the present ville de Paris was suburban before the city limits were expanded in the 1860s. Now, no one would consider, for example, Washington Square or Herald Square to be suburban, and the suburbs of Paris now extend to more than 80 times the land area of the 1860s ville de Paris.

One overlooked way to approach the current debate would be to look not at municipal boundaries but at forms of development. Around 1950 we began the breakneck expansion of automobile-oriented suburbanization, which had proceeded more modestly for two or more decades before.

The Urban Core:

This analysis defines the urban core consistent with the criteria of the US Bureau of the Census in 1950. Metropolitan areas are organized around urban areas (urbanized areas). We use the “central cities” of the core urban areas in 1950 as the urban core in the analysis. Those portions outside the 1950 urban core are thus considered suburban. Where an urban area did not exist in 1950 (such as in Las Vegas and Tucson), the urban core is the central city of the urban area when it was first established.

No existing specification of the urban core is ideal, though the present one is appropriate for the policy purpose stated above. Clearly, the urban core would be far better defined at the census tract or even census block level based upon the characteristics of an urban core. This would include factors such as high residential population density, high transit usage, walkability and a high percentage of multiple unit residential buildings.
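The tract-level criteria named here could, in principle, be applied programmatically. The sketch below is illustrative only; the threshold values are hypothetical assumptions, not Census Bureau standards:

```python
# Illustrative tract-level urban-core test based on the characteristics
# named in the text: high residential density, high transit usage, and a
# high share of multiple-unit residential buildings.
# All threshold values are hypothetical, chosen for illustration only.
def looks_like_urban_core(density_per_sq_mile,
                          transit_commute_share,
                          multi_unit_housing_share):
    """Return True if a census tract fits the urban-core profile."""
    return (density_per_sq_mile >= 10_000          # hypothetical threshold
            and transit_commute_share >= 0.20      # hypothetical threshold
            and multi_unit_housing_share >= 0.50)  # hypothetical threshold

# A dense, transit-oriented tract qualifies; a low-density one does not.
print(looks_like_urban_core(25_000, 0.45, 0.80))  # True
print(looks_like_urban_core(3_000, 0.02, 0.10))   # False
```

Walkability, also mentioned in the text, is omitted here because it lacks a single standard census measure.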

Such an ideal definition of the urban core cannot be measured with municipal boundaries. Yet municipal boundaries have routinely been used by researchers to delineate the urban core, not least because the data is readily available. However, there are three notable difficulties with the use of municipal boundaries to define the urban core.

First, some areas with urban core characteristics are outside the core municipalities. As The Infrastructurist notes, municipalities like Jersey City or Hoboken have the characteristics of urban cores. However, since they are not part of the core municipality (the city of New York), they are classified as suburbs in our analysis. It is well to remember that both Hoboken and Jersey City represented suburban development during their period of greatest growth, before 1930.

Second, other areas with postwar suburban characteristics are inside the core municipalities. For example, Richmond County (Staten Island), a part of the city of New York, is principally suburban. Much of it was developed well after 1950 and consists largely of single-family homes. The median construction date of owner-occupied housing in Staten Island is 1970, which compares to 1965 in adjacent Middlesex County, New Jersey. It is newer than in Morris County, New Jersey (1965), much of which is outside the urban area (all median house construction years from the 2000 census). Major portions of core municipalities such as Los Angeles, Houston, Dallas, Portland, Seattle, Denver and others are also postwar suburban.

Third, in a number of core municipalities there is little, if any, urban core, at least from a residential perspective. For example, one would be hard-pressed to identify an urban core in municipalities such as Phoenix or San Jose (despite the fact that the San Jose urban area is denser than the New York urban area). In metropolitan areas such as these, it might be preferable to define virtually all growth as suburban, though our analysis still defines these municipalities as the urban core.

Based upon the early results from the census it seems that if the more ideal census tract-based urban core definition were used, the urban cores would be shown to be capturing an even smaller share of growth, while suburban areas would be capturing more. But this analysis will have to wait until all the numbers are in.

Historical Core Municipality

The term “historical core municipality” is used to denote the urban cores using municipal boundaries. The term “city” is avoided because of its multiple definitions. Cities can be municipalities (such as the city of New York), urban areas (such as the New York urban area), metropolitan areas (such as the New York metropolitan area) or multi-county regions or prefectures of countries like China (such as Wuhan or Shenyang).

This lack of clarity can be routinely seen in media reports that indiscriminately (and without comprehension) make comparisons between cities, using differing definitions. This can extend even to more technical literature (see pages 12-14 of Urban Transportation Policy Requires Factual Foundations).

Principal Cities: Starting in 2003, the Census Bureau substituted the term “principal city” for the previous “central city” term. The use of principal city designations and the largest municipality as the principal name of a metropolitan area are appropriate for the purposes intended by the Census Bureau.

In its State of Metropolitan America, the Brookings Institution uses up to the three largest principal cities (which it calls “primary cities”) and considers other parts of metropolitan areas as suburbs.

Neither approach, however, is appropriate for analyzing postwar suburbanization. Any municipality in a metropolitan area with more than 250,000 population is considered a principal city, regardless of its urban form. Any municipality with more than 50,000 population that also has more jobs than resident workers is likewise a principal city, regardless of its actual on-the-ground character.

This leads to a situation in which, for example, Los Angeles has 26 principal cities. Any postwar urban form definition would classify nearly all as suburban (and much of the historical core municipality of Los Angeles, notably the San Fernando Valley, itself is suburban). For example, the suburban city of Cerritos is a principal city, yet was largely filled by dairy farms well into the 1950s and was called Dairy Valley.

Other principal cities hardly existed in 1950. Virginia Beach has become the largest municipality in its metropolitan area, having displaced Norfolk. Yet in 1950 Virginia Beach had a population of only 5,400, well below the 50,000 threshold then required of central cities (smaller than Ponchatoula, Louisiana, doubtless an unfamiliar municipality to most readers). Arlington, Texas, the third largest municipality in the Dallas-Fort Worth-Arlington metropolitan area, had a population of 7,700 in 1950, again well below the central city threshold. Arlington is not an urban core; it is a suburban jurisdiction.

Virginia Beach is a good example of a suburban area that has become the largest municipality in a metropolitan area. Its greater size, however, does not make Virginia Beach the urban core. Otherwise, Contra Costa County in California could, by consolidating with its constituent municipalities (God forbid), replace San Francisco as the metropolitan area’s urban core.

Perhaps the ultimate example of the problem of principal cities being confused with urban cores is Hemet, California, a principal city of the Riverside-San Bernardino metropolitan area that is, in fact, an exurb and is not even in the primary urban area.

Toward the Future

An eventual, more precise analysis of urban core and suburban trends will be welcome. Yet, as our analysis of trends in New Jersey indicated, even the growth in more urban-core-oriented municipalities was minuscule compared to the state’s suburban growth. Further, much of the nation’s urban core growth came from areas that, although formally located within “city limits,” were actually on the suburban fringe. This was true, for example, in Kansas City, Oklahoma City, and even Portland. This suggests that the small share of growth reported in urban cores would be even smaller if it were based on census tract data, and that suburbanization, as a way of life, may indeed be even more prevalent than this year’s numbers suggest.
__________________________________
Wendell Cox is a Visiting Professor at the Conservatoire National des Arts et Metiers, Paris, and the author of War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.

_________________________________________________________________________________
_________________________________________________________________________________

The Still Elusive “Return to the City”
by Wendell Cox in Newgeography.com
Posted: 22 Feb 2011 02:38 AM PST

Metropolitan area results are beginning to trickle in from the 2010 census. They reveal that, at least for the major metropolitan areas so far, there is little evidence to support the often repeated claim by think tanks and the media that people are moving from suburbs to the historical core municipalities. This was effectively brought to light in a detailed analysis of Chicago metropolitan area results by New Geography’s Aaron Renn. This article analyzes data available for the eight metropolitan areas with more than 1 million population for which data had been released by February 20.

Summary: The results are as follows; a detailed analysis of the individual metropolitan areas comes afterward (Table 1).

In each of the eight metropolitan areas, the preponderance of growth between 2000 and 2010 was in the suburbs, as has been the case for decades. This has occurred even though two events – the energy price spike in mid-decade and the mortgage meltdown – were widely held to have changed this trajectory. On average, 4 percent of the growth was in the historical core municipalities, and 96 percent of the growth was in the suburbs (Figure 1).

In each of the eight metropolitan areas, the suburbs grew at a rate substantially greater than that of the core municipality. The core municipalities had an average growth from 2000 to 2010 of 3.2 percent. Suburban growth was 21.7 percent, nearly 7 times as great. Overall, the number of people added to the suburbs was 14 times that added to the core municipalities.
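The shares and multiples quoted above reduce to simple arithmetic on the decennial counts: each component's population change divided by the total metropolitan change. A minimal sketch, using hypothetical round numbers consistent with the reported 4 percent / 96 percent split (not actual census figures):

```python
# Share-of-growth arithmetic used throughout the article: each component's
# population change divided by the total metropolitan change.
def growth_shares(core_added, suburbs_added):
    total = core_added + suburbs_added
    return core_added / total, suburbs_added / total

# Hypothetical round numbers consistent with the reported 4%/96% split:
core_share, suburb_share = growth_shares(40_000, 960_000)
print(f"core {core_share:.0%}, suburbs {suburb_share:.0%}")
# core 4%, suburbs 96%
```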

Analysis of Individual Metropolitan Areas: The major metropolitan areas for which data is available are described below in order of their population size (Figure 2 and Table 1).

Chicago: The core municipality of Chicago lost 200,000 residents between 2000 and 2010. Suburban growth was 546,000, adding up to total metropolitan area growth of 346,000 people. The suburbs thus accounted for 158 percent of the metropolitan area growth. The core municipality decline was stunning in the face of the much-ballyhooed urban renaissance in that great city. Evidently the renaissance was too limited to produce an expanding population.
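A suburban share above 100 percent may look like an error, but it follows directly from the arithmetic whenever the core loses population, since the net metropolitan change is smaller than the suburban gain. A quick check using the Chicago figures reported above:

```python
# Chicago, 2000-2010, as reported above: the core lost population, so the
# suburbs' share of net metropolitan growth exceeds 100 percent.
core_change = -200_000    # city of Chicago
suburb_change = 546_000   # suburbs
metro_change = core_change + suburb_change   # 346,000 net growth

suburb_share = suburb_change / metro_change
print(f"suburban share of growth: {suburb_share:.0%}")  # 158%
```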

The decline in the core municipality population represents a major departure from the 2009 Bureau of the Census estimates, which would have implied a 2010 population at least 170,000 higher (assuming the 2008-to-2009 growth rate had continued).
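The “expected level” comparisons here and in the sections below appear to extrapolate the Census Bureau’s annual estimates forward one year at the most recent growth rate. A hedged sketch of that method, with hypothetical placeholder inputs rather than the actual estimates:

```python
# Sketch of projecting an "expected" 2010 population by carrying the most
# recent annual growth rate (2008 to 2009) forward one more year.
# The inputs below are hypothetical placeholders, not actual estimates.
def project_forward(pop_2008, pop_2009):
    growth_ratio = pop_2009 / pop_2008   # one-year growth ratio
    return pop_2009 * growth_ratio       # projected 2010 population

projected_2010 = project_forward(pop_2008=1_000_000, pop_2009=1_010_000)
print(round(projected_2010))  # 1020100
```

The gap between such a projection and the actual 2010 count is the “above/below expected” figure quoted for each area.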

Instead, all of the growth was in the outer suburbs, beyond the inner suburbs of Cook County.

Dallas-Fort Worth: The historical core municipality of Dallas had a modest population increase of 9,000, or less than 1 percent, between 2000 and 2010. In contrast, the suburbs experienced an increase of 1.2 million, or 30 percent. Thus, approximately 1 percent of the metropolitan area growth was in the core municipality, while 99 percent was in the suburbs, most of it in the outer suburbs. The inner suburbs added 14 percent to their 2000 population, while the outer suburbs added 36 percent.

The population figure for the core municipality of Dallas – consistently among the stronger core areas – was surprisingly low, at 9 percent (117,000) below the expected level. The suburban population was 1 percent (71,000) below expectations.

Houston: The historical core municipality of Houston had comparatively strong population growth, adding 146,000 people, or 8 percent, to its 2000 population. However, this figure was 8 percent (174,000) below the expected level. By contrast, the suburban growth rate was 39 percent, more than five times that of the central jurisdiction. The suburban population growth was 1,085,000, more than six times that of the core jurisdiction. The suburban population was 4 percent (144,000) higher than expected.

The core jurisdiction of Houston accounted for 12 percent of the metropolitan area growth, while the suburbs accounted for 88 percent. Suburban growth was roughly evenly distributed between the inner suburbs of Harris County and the outer suburbs: the inner suburbs added 38 percent to their population, while the outer suburbs added 41 percent.

Washington: Reversing a decades-long trend, the historical core jurisdiction of Washington (DC) had a small population gain between 2000 and 2010. But the Washington, DC gain of 30,000 pales by comparison to the suburban gain, which was more than 20 times greater, at 700,000. The core jurisdiction accounted for 4 percent of the population gain, while the suburbs accounted for 96 percent.

More than 60 percent of the growth in the metropolitan area was outside the inner suburban jurisdictions that border Washington, DC (Arlington County and Alexandria in Virginia, together with Montgomery County and Prince George’s County in Maryland), while the inner suburbs accounted for 36 percent of the growth. The population increase in the inner suburbs was 9 percent, compared to 37 percent in the outer suburbs.

Jefferson County in West Virginia was not included in the analysis because data is not yet available.

Baltimore: The historical core municipality of Baltimore, the site of another ballyhooed urban comeback, lost 30,000 people, or 5 percent of its 2000 population. Baltimore’s 2010 population was 4 percent or 16,000 below the expected level. The suburbs experienced a 10 percent or 188,000 person increase. The region’s population increase was roughly equal in numbers between the inner suburbs and the outer suburbs, although the exurban percentage increase was nearly twice as large.

San Antonio: The historical core municipality of San Antonio experienced the largest population increase among the eight metropolitan areas, at 183,000, a roughly 16 percent population jump. The city of San Antonio accounted for 43 percent of the growth, while suburbs in Bexar County and further out accounted for a larger 57 percent. The suburban population increase, moreover, was 248,000, or 44 percent. This is something of a turnaround from past trends, which favored the city of San Antonio because of its vast sprawl and predominant share of the metropolitan population.

The city of San Antonio population was 5 percent (65,000 people) short of the expected 2010 level. The suburban population was 15 percent (104,000) more than the expected level.

Indianapolis: The historical core area of Indianapolis and Marion County (including enclaves within Indianapolis) grew 5 percent and accounted for 19 percent of the metropolitan area growth. In contrast, the surrounding suburbs grew 28 percent, representing 81 percent of the metropolitan area growth. Overall, the core municipality added 44,000 people, while the suburbs added more than four times as many, at 188,000.

Austin: The historical core municipality of Austin experienced the greatest growth of any core jurisdiction in the eight metropolitan areas, at 20 percent. Even so, growth in the suburban areas was nearly three times as high, at 56 percent. The city of Austin accounted for 29 percent of the metropolitan area population growth, while the suburbs accounted for 71 percent. Overall, the central municipality grew by 134,000, while the suburbs grew 2.5 times as much, at 333,000.

Generally it is fair to say that, so far, suburban areas are growing far faster than urban cores. In addition, most of the fastest growing core municipalities are those areas that are themselves largely suburban, particularly in relatively young cities like San Antonio, Houston and Austin.

Among the eight metropolitan areas analyzed, the older core jurisdictions (with median house construction dates preceding 1960) tended either to lose population or to grow modestly. This is illustrated by the city of Chicago, with a median house construction date of 1945, Baltimore (1946), and Washington (1949) (Table 2). Generally, the central jurisdictions with greater suburbanization (median house construction dates of 1960 or later) grew more quickly. For example, highly suburban central jurisdictions like Austin, with a median house construction date of 1983, and San Antonio, with a median house construction date of 1970, grew fastest. So much for the long-forecast, and apparently still elusive, “return to the city.”

Wendell Cox is a Visiting Professor at the Conservatoire National des Arts et Metiers, Paris, and the author of War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.



©2007 Coalition On Sustainable Transportation