Election results from all three of Portland, Oregon's largest suburban counties indicate a reaction against what has been called "Portland Creep": the continued expansion of the light rail system without voter approval and the imposition of restrictive densification measures by Metro, the regional land-use agency.
Portlanders in the three largest Oregon counties (Multnomah, Washington and Clackamas) have previously voted against financing light rail extensions; however, the transit agency has found ways to continue the expansion and now operates five lines, with a sixth under construction. While urban rail aficionados tout the success of the Portland system, transit use by commuters has fallen significantly in relative terms from its level before the opening of the first light rail line. At the same time, working at home, which does not need billions in taxpayer subsidies, has caught up to and passed transit (Figure).
The electoral events of the past 60 days could severely limit future expansion.
Clackamas County: Chicanery and its Price
In a September 2012 election, voters in Clackamas County approved a measure, by a 60%-40% majority, requiring voter approval for any commitment of county funding to rail. Perhaps fearing a negative result in that election, the pro-rail Clackamas County commission had hastily approved $20 million to support the under-construction Portland-to-Milwaukie (Clackamas County) light rail line.
Things were to become substantially more difficult for light rail in the November election. In Clackamas County, the two incumbent commissioners on the ballot, both of whom voted for the $20 million bond issue, lost their seats. Voters rewarded their chicanery by replacing them with anti-rail commissioners, leaving the Clackamas County commission with a 3 to 2 anti-rail majority. The Oregonian characterized the election as "a referendum on light rail."
John Ludlow, who defeated Clackamas County commission chair Charlotte Lehan by a 52% to 48% margin, told The Oregonian:
"I think the biggest boost my campaign got was when those commissioners agreed to pay that $20 million to TriMet [for Portland-Milwaukie light rail] four days before the September election. I think that put Tootie and me over the top."
"Tootie" is Tootie Smith, a former state legislator who unseated commissioner Jamie Damon in the same election by a similar margin.
Washington County, Oregon: Taxpayers Take Control
Meanwhile, light rail has run into substantial difficulty in suburban Washington County. In September, voters in King City approved a measure requiring all light rail funding to be approved by the voters. In the more recent November election, voters in Tigard, the sixth-largest city in the metropolitan area (population 50,000), voted 81%-19% to subject all light rail expenditures to a vote of the people.
Clark County, Washington: Voters Say No
Portland's transit agency also had its eye on expanding light rail service across the state line and the Columbia River to Vancouver, in Clark County, Washington. The plan was to build a new "Interstate Bridge" (Interstate 5) across the river, which would include light rail. The voters of Clark County were asked in a referendum to approve funding for the light rail system and turned it down soundly, by a 56%-44% margin, according to The Columbian.
But there was more. For some time, citizen activist and business leader David Madore has been working to stop both tolls on the new bridge and light rail service. Madore was elected to the board of commissioners of Clark County at the same time that the light rail referendum was being defeated. Madore, like the two other Clark County commissioners, also holds a seat on the transit agency board.
TriMet's Death Spiral?
Further, TriMet's dire financial situation could be another barrier to future expansion. As John Charles of the Cascade Policy Institute has shown, TriMet's fringe-benefit bill is astronomically high, at $1.63 for each $1.00 in wages. This is more than five times the average for public employers, according to US Department of Commerce Bureau of Economic Analysis data. Charles refers to TriMet as being in a "death spiral" and says that:
"The agency is steadily devolving from a transit district to a retirement and health-care center, with unsustainable fringe benefit costs that now far exceed the mere cost of wages."
Over the last four years, emissions in the United States declined more than in any other country in the world. Coal plants and coal mines are being shuttered. That's not from increased use of solar panels and wind turbines, as laudable as those technologies are. Rather, it's due, in large measure, to the technological revolution allowing for the cheap extraction of natural gas from shale. By contrast, Europe, with its cap-and-trade program and price on carbon, is returning to coal-burning.
Could President Obama, during his second term in office, turn this homegrown success story into a paradigm-shifting climate strategy? In a speech we gave to the Colorado Oil and Gas Association yesterday, we argued that, after a season of ugly ideological polarization, politicians, environmentalists, and the gas industry have a chance to hit the reset button on energy politics.
This will require the natural gas industry to clean up its act, accepting better regulations, cracking down on bad actors, and preventing the leakage of methane, a potent greenhouse gas. It will require environmentalists to consider whether there might be a different path to significant emissions reductions from the one they have pursued over the last 20 years. And it will require Left and Right to put a halt to the tribalism that has characterized the national debate over climate and energy.
— Michael and Ted
Uniting a Fractured Republic
Innovation, Pragmatism, and the Natural Gas Revolution
by Ted Nordhaus and Michael Shellenberger
In 1981, George Mitchell, an independent Texas natural gas entrepreneur, realized that his shallow gas wells in the Barnett were running dry. He had millions of dollars in sunk investment in equipment and was looking for a way to generate more return on it. Mitchell was then a relatively small player in an industry that by its own reckoning was in decline. Conventional gas reserves were limited and were getting increasingly played out.
As he considered how he might save his operation, Mitchell turned his attention to shale. Drillers had been drilling shale since the early 19th Century, but mostly they drilled right through it to get to limestone and other formations. Dan Jarvey, a consultant to Mitchell at the time, told us, "When you look at a [gas drilling] log from the 1930s or 1950s or 1970s it is noted as a 'gas kick' or 'shale gas kick.' Most categorized it as 'It's just a shale gas kick' – as in, 'to be expected, but to be ignored.'"
As Mitchell embarked on his 20-year quest to crack the shale gas code, most of his colleagues in the gas industry thought he was crazy. But Mitchell persisted and his efforts would ultimately culminate in today's natural gas revolution.
In doing so, Mitchell upended longstanding assumptions about the future of energy. Just a few years ago, the conventional wisdom was that no source of electricity could be cheaper than coal. Today, in the U.S., natural gas is cheaper. As a result, coal's share of electricity generation went from over 50 percent in 2005 to 36 percent in 2012. While global coal use continues to rise, the U.S. is at present leaving much of it in the ground. Meanwhile, estimates of recoverable natural gas in the United States have grown sharply, from 200 trillion cubic feet in 2005 to 350 trillion cubic feet today.
The implications for those of us concerned about climate change are also significant. Leaving coal in the ground has been the longstanding goal of those concerned about global warming. Natural gas emits roughly 45 percent less carbon dioxide than coal. In large part due to the glut of natural gas, U.S. carbon dioxide emissions will have declined more than those of any other country in the world between 2008 and 2012: an astonishing 500 million metric tons out of 6 billion, according to the Energy Information Administration.
While we don’t imagine that any of this is news to most of you in this audience, there is another part of the story that might be. That is the story of the ways in which both the gas industry and the federal government helped Mitchell along the way. In these intensely polarized times, when it seems that almost everyone imagines that either government or corporations are the enemy, and it seems impossible to imagine that the two might actually work together to further the public interest, there are important lessons here too.
As Mitchell considered trying his hand at shale, he cast about to see what was known at the time about how to get gas out of shale. A geophysicist who worked with Mitchell recalled telling him that, "It looks similar to the Devonian [shale back east], and the government's done all this work on the Devonian."
The work Mitchell's geophysicist was referring to was the Eastern Gas Shales Project, which was started in 1976 by President Ford. The Shales Project was just one of several aggressive government-led efforts to accelerate technology innovation to increase oil and gas production. Already in 1974 the Bureau of Mines was funding the study of underground fracture formations, enhanced recovery of oil through fluid injection, and the recovery of oil from tar sands. One year later, the government funded the first massive hydrofracking at test sites in California, Wyoming and West Virginia, as well as "directionally deviated well-drilling techniques" for both oil and gas drilling.
The mandate from Congress was for government scientists and engineers to hire private contractors rather than do the work in-house. This was consistent with the tradition of the Bureau of Mines, which would set up trailers around the country to support oil, coal and gas entrepreneurs. This strategy contrasted with the government's nuclear energy R&D work, which had been hierarchical since its birth in the military's Manhattan Project. This decentralization proved wise, as it ensured that the information would rapidly reach entrepreneurs in the field and not gather dust inside a federal bureaucracy.
From early on, Mitchell and his team relied heavily on information coming out of the Eastern Gas Shales project. "We were all reading the DOE papers trying to figure out what the DOE had found in the Eastern Gas Shales," Mitchell geologist Dan Steward told us, "and it wasn't until 1986 that we concluded that we don't have open fractures, and that we were making production out of tight shales."
Through the 1980s, Mitchell didn't want to ask the government – or the Gas Research Institute, which was funded by a fee on gas pipeline shipments to coordinate government research with experiments being conducted by entrepreneurs in the field – for help because he worried that he wouldn't be able to take full advantage of the investment he was making in innovation.
But by the early 1990s Mitchell had concluded that he needed the government's help, and turned to DOE and the publicly-funded Gas Research Institute for technical assistance. The Gas Research Institute, which had worked with other industry partners to demonstrate the first horizontal fracks, subsidized Mitchell’s first horizontal well. Sandia National Labs provided high-tech underground mapping and supercomputers and a team to help Mitchell interpret the results. Mitchell’s twenty-year quest was also made possible by a $10 billion, 20-year tax credit provided by Congress to subsidize unconventional gas, which was too expensive and risky for most private firms to experiment with otherwise.
By 2000, the combination of technologies needed to cheaply frack shale was firmly in place. The final piece of the puzzle was the sale of Mitchell Energy to Devon Energy, which scaled up the use of horizontal wells. Over the next ten years the use of this combination of technologies would spread across the country, resulting in today's natural gas glut.
Though the collaboration between Mitchell and the government was one of the most fruitful public-private partnerships in American history, it was mostly unknown until we started interviewing the key players involved around this time last year.
After our findings were verified by other researchers and reporters, including the New York Times and the Associated Press, some in the oil and gas industry, like T. Boone Pickens, tried to downplay the government's role.
But the pioneers of this technology have been forthright. "I'm conservative as hell," Mitchell's former Vice President Dan Steward told us, but DOE "did a hell of a lot of work and I can't give them enough credit… You cannot diminish DOE's involvement." Fred Julander said, “The Department of Energy was there with research funding when no one else was interested and today we are all reaping the benefits."
Today marks the end of one of the most divisive chapters in American political history. There is more partisan polarization in Congress than at any time since Reconstruction. There are vanishingly few swing voters. And the ideological divide between liberals and conservatives at times appears unbridgeable.
One of the most insidious aspects of today’s political polarization is the way gross exaggerations turn into ossified caricatures. Left and Right view the other as ignorant, insane, or immoral.
From the Right we have heard that President Obama is taking the country toward socialism, and that Big Government is destroying the American dream. From the Left we have heard that Governor Romney would have exported all our jobs to China and turned Congress over to Big Business. This downward spiral leads to the conclusion that America is fundamentally broken. The two great institutions of American life, business and government, are viewed by one side or the other as corrupt and nefarious.
Few issues have become more polarizing than energy. Both sides have taken ever more extreme positions. Prominent conservatives have exaggerated both the size of Obama's clean energy investments and the number of bankruptcies. They have described global warming and other environmental problems as either not happening or not worth worrying about. Some environmentalists have taken the opposite tack, exaggerating the negative impacts of gas drilling, downplaying the benefits, and accusing anyone who disagrees with them of being on the take.
As we say in California — everyone needs to chill out. There is too much at stake for America, our environment, and our economy, for such hyper-partisanship to continue.
In our rush to point fingers and interpret everything in catastrophic terms, we have lost sight of the fact that we are the richest nation on earth, and one with improving environmental quality, precisely because the private sector and the government have worked so well together. The failures of Big Business and Big Government should be put in their appropriate historical context.
When the Colorado Oil and Gas Association asked us to give this speech at its conference the day after the election, we agreed on two conditions: that we pay our own way and that COGA invite local environmental and elected leaders to attend. We are glad to see them in the audience, because we need a common dialogue.
As two individuals who came out of the environmental movement, where we spent most of our careers, we are best known for our writings calling for reform and renovation of green politics. In particular, we have advocated that environmentalists drop their apocalyptic rhetoric, which is self-defeating and obscures the very real environmental problems we face.
And we have argued that environmentalists have been overly focused on regulations, when our focus should also be on revolutionary technological innovation, which is needed to make clean energy and other environmental technologies much cheaper, so that all seven billion, going on 10 billion, humans can live modern, prosperous lives on an ecologically vibrant planet.
But our work has also focused on reminding private investors and corporate executives of the critical role played by the government in creating our national wealth. While economists have long recognized that innovation is responsible for most of our economic growth, few realize that many of our world-changing innovations would have been unlikely to occur without government support. A short list of recognizable technological innovations includes interchangeable parts, computers, the Internet, jet engines, nuclear power and every other major energy technology.
Consider the information revolution. The government funded the R&D and bought 80 percent of the first microchips. The Internet started out as a federally funded program to connect government computer networks. Every major technology in the iPhone can be traced back to government funding. The driverless car that Google has developed relies on technologies that came out of government innovation programs.
While many high-tech executives our age or younger are unaware of the government roots of the IT revolution, the old-timers of Silicon Valley are not, and they frequently express their gratitude for it.
While interviewing the participants of the shale gas revolution, we were struck by how much respect and deference each side gave to the other. In many cases the government scientists and engineers acted as consultants to private firms like Mitchell's — "We never forgot who the customer was," said Alex Crawley, who ran the DOE's fossil innovation program for many years.
As environmentalists, we were taught to be suspicious of such cozy relationships between industry and government workers, and to believe that government could not simultaneously promote industry and regulate it. But when it comes to technology innovation, those cozy relationships, and the revolving door between government agencies, whether DoD or DOE, and private companies like Mitchell Energy, are absolutely essential to allowing knowledge to spill over rapidly and flow throughout the sector.
And yet, there is also an important role for regulation, not only to protect the public from accidents and environmental degradation, but also to improve technologies and promote better practices throughout the industry. Wise regulation in the long run promotes, rather than hinders, the spread of new technologies and new industries, and this has never been more true than in the case of fracking. While US gas production has taken off, many European nations banned fracking for fear of the local environmental impacts and have started to return to burning coal.
Last August, George Mitchell and New York Mayor Michael Bloomberg announced they would fund a large effort by the states to establish better fracking practices. They called for stronger control of methane leaks and other air pollution, the disclosure of chemicals used in fracking, optimizing rules for well construction, minimizing water use and properly disposing of waste water, and reducing the impact of gas on communities, roads, and the environment.
You would be hard pressed to find very many Americans who would call those reforms unreasonable. They are the kinds of things that die-hard anti-fracking activists and much of the natural gas industry could agree to. And indeed, states like Colorado, and environmental groups like the Environmental Defense Fund, deserve credit for bringing regulators and the gas industry together to improve practices. By squarely addressing the methane leakage problem, and reducing the local environmental impacts, the government and the industry can make natural gas an even more obviously better alternative to coal.
And the good news is that reducing methane leakage is something the industry already knows how to do. Little innovation is required to make sure that old pipelines are not leaking, and that new cement jobs are done properly. Similarly, responsible disposal of fracking fluids is not rocket science, it is something that the oil and gas industry does routinely in other contexts. Promising efforts are also underway to develop more environmentally sound fracking fluids and to further minimize water usage.
There are costs, of course, associated with all of these efforts. But if the history of fracking proves anything, it is that costs will come down quickly. Indeed, if history is any guide, we will see great improvements to fracking technologies and techniques over the next 30 years that will be mutually beneficial to the industry, the public, and the environment, for the history of the shale gas revolution has been a history of incremental improvements to the technology. The water intensity of fracking, for instance, was originally not an environmental problem for drillers but an economic one. Only once Mitchell and others developed methods that required vastly less water to crack the shale did fracking become economically viable.
For all of these reasons, we should both regulate fracking fairly and effectively, and also continue to support innovation to improve unconventional gas technologies. Doing so will help assure a future for gas beyond the precincts in which it is already well established. We also need to support innovation in new gas technologies well beyond fracking practices to include carbon capture and storage, which is more viable economically and technologically for gas than for coal, because gas plants are more efficient, and the emissions stream much purer. In a world in which there may remain significant obstacles to moving entirely away from fossil fuels, gas CCS looks much more viable than coal CCS. As such, we need government and the gas industry to work together to demonstrate carbon capture technologies at sites around the country, similar to how we conducted the Eastern Gas Shales Project.
And the gas industry should support innovation beyond natural gas to include support for innovation in renewables, nuclear and other environmentally important technologies. Championing energy innovation more broadly would do more for the industry than the millions it is currently spending on slick 30-second TV ads and will remind Americans that supporting gas as well as renewables is not a zero sum proposition. Getting our energy from a diversity of sources is in the national interest and gas will thrive for a long time regardless of the energy mix. Moreover, until we have cheap utility scale storage, renewables need cheap gas for backup.
For all of this to happen, the gas industry and environmentalists alike must change their posture toward regulation. While it is the goal of a small number of us to rid the world of particular practices, whether shale-fracking or atom-splitting, most of the rest of us want to improve them.
Over the last 10 years, our message to the environmental movement has been that it must change its attitude toward technological innovation. Technologies are not essentially good or bad but rather in a process of continuous improvement. But there is another side to that story that industry must remember. Regulations that are often bitterly opposed sometimes end up being a boon for industry, paving the way for the broad acceptance of new technologies and pushing firms to improve those technologies in ways that make them more economical as well as more environmental.
In closing we’d like to invoke the title essay of our last e-book, “Love Your Monsters,” which was written by one of our Senior Fellows, a well-known French anthropologist named Bruno Latour. In the essay, Latour monkey-wrenches the Frankenstein fable. The sin of Dr. Frankenstein, according to Latour, was not creating the monster, but rather abandoning him when he turned out to be flawed. We must learn to love our technologies as we do our children, he concluded, constantly helping and improving them. In so doing, we too become all the wiser.
As we consider the implications of the gas revolution for the future of both our energy economy and our environment, we should commit ourselves to the larger effort of improving our technological creations. In so doing, the gas industry and the environmental movement might together update the concept of sustainability for the 21st Century. We should seek not to put limits on the aspirations of 1.5 billion people who still lack access to electricity, nor on the billions more yearning for enough to power washing machines and refrigerators. Nor should we want to sustain today's energy technologies to be used in perpetuity. Rather, we should embrace technological innovation as the key to creating cleaner and better substitutes to today's energy and non-energy resources alike so that we might sustain human civilization far into the future.
According to the Hawaii Reporter, Honolulu's rail transit project has lost a major legal test in federal court, as Ninth Circuit Judge Wallace Tashima ruled in HonoluluTraffic.com v. Federal Transit Administration et al. that the city of Honolulu had violated federal environmental law on three counts.
The plaintiffs are a coalition of environmental, civic, political and taxpayer interests, including former Governor and mayoral candidate Benjamin Cayetano, University of Hawaii law professor Randall Roth, retired Judge Walter Heen, retired businessman and transportation expert Cliff Slater, Dr. Michael Uechi, Hawaii's Thousand Friends, the Outdoor Circle and the Small Business Hawaii Entrepreneurial Education Foundation.
The plaintiffs and defendants differ strongly on the impact of the ruling, and the plaintiffs are to return to court in December seeking a permanent injunction against the project.
University of Hawaii engineering professor Panos Prevedouros told the Hawaii Reporter that the decision would require environmental planning revisions that could take up to two years.
This setback is in addition to a previous unanimous Hawaii Supreme Court ruling that had already required construction to be suspended and which could delay the project for at least a year, according to the Hawaii Reporter. In Kaleikini v. Yoshioka, the Supreme Court ruled that the city of Honolulu failed to comply with the state's historic preservation and burial protection laws when it did not complete an archaeological inventory survey for the 20-mile route before starting construction.
On October 19, an Amtrak passenger train hit 111 mph in a test run on a 15-mile stretch of track between Dwight and Pontiac, Illinois. It was the first tangible return on a three-year, $1.5 billion program of improvements funded under the Administration's high-speed rail initiative. The program hopes to shave about an hour off the 5½-hour rail trip between Chicago and St. Louis. Transportation Secretary Ray LaHood and Illinois Gov. Pat Quinn, who were aboard, called it a "historic" event. They were perhaps unaware, as respected Chicago Sun-Times columnist Mark Brown pointed out, that "ten years ago, also on the eve of an election, the same Illinois Department of Transportation offered another demonstration along nearly the same stretch of track, also reaching 110 mph."
Setting this pre-election rhetoric aside, only two true high-speed rail projects remain of President Obama's vaunted HSR initiative, which promised to connect 80 percent of Americans with high-speed rail: the California SF-to-LA bullet train and the "Amtrak Vision for the Northeast Corridor." The future of these two projects is discussed below. A condensed version of this commentary appeared in the Wall Street Journal on September 24, 2012.
High-speed trains are hardly new; they have been crisscrossing France and Japan for over 40 years. But building a nationwide high-speed rail network in America is quite a novel idea. It originated with President Obama, who, on April 16, 2009, announced a plan "to give 80 percent of Americans access to high-speed rail within the next 25 years." The program was seeded with an $8 billion grant from the American Recovery and Reinvestment Act of 2009 (ARRA), later supplemented with an additional $2.1 billion in general funds.
But this lofty and extravagant vision soon yielded to practical realities. One such reality is America's demography. Unlike Western Europe and Japan, the United States lacks an urban pattern that favors high-speed rail connections. Such a pattern requires large traffic-generating city pairs that are neither close enough to each other to favor travel by car nor far enough apart to favor travel by air. In Europe and Japan those distances happen to fall in the range of 200-400 miles (think Paris-Lyon, 290 miles, or Tokyo-Nagoya, 220 miles). The only corridor in the United States that fits this description is the Northeast Corridor. No wonder the Boston-to-Washington rail line has lately become a focus of high-speed rail planning.
Another reality is that true high-speed rail service requires a dedicated alignment reserved exclusively for passenger trains. Such is the case with the French TGV, the German ICE and the Japanese Shinkansen trains, as indeed with any train that runs at top speeds of 150 miles per hour or higher. Having high-speed trains share a common track with lumbering freight trains, as the Obama Administration has proposed, is to invite serious operational conflicts and safety problems. But dedicated rights-of-way for high-speed trains require relatively straight and level alignments with minimal curvature. To assemble such rights-of-way in densely populated corridors where land holdings are highly fragmented would be extremely costly and disruptive, if not totally impossible.
Yet another reality is the uncertain prospect for further federal support. Such support is deemed essential for the future of the Administration’s HSR program (but not for the future of privately funded ventures such as the proposed Lone Star HSR line between Dallas and Houston). Congress, by denying White House requests for high-speed rail funds three years in a row, has sent a clear bipartisan signal that states should not count on continued congressional appropriations for high-speed rail. The lawmakers reaffirmed this intention by eliminating Title V of the Senate transportation bill (the National Rail System Preservation, Expansion and Development Act of 2012) from the final version of the surface transportation reauthorization (MAP-21). In the meantime, the $10.1 billion earmarked for high-speed rail has been fully committed.
In sum, high-speed rail advocates, promoters and dreamers need a triple reality check.
Improving Existing Rail Service
But this is not to say that nothing should be done to improve and expand existing passenger rail services, especially commuter rail lines serving major metropolitan areas. Even though such improvements will not result in significant travel time savings, they could lead to more efficient, frequent and reliable transportation service benefitting millions of daily commuters. In 2010, commuter rail systems across the country provided service to nearly 460 million riders.
Improving commuter rail services is, indeed, the approach embraced by the California High Speed Rail Authority. Despite its avowed goal of linking LA and San Francisco with high-speed trains, almost half of the project's initial $10 billion first stage will be devoted to upgrading conventional transit and commuter rail services in Los Angeles and the Bay Area, the "bookends" of the high-speed rail line, e.g. through electrification of the SF-to-San Jose Caltrain line and "connectivity" improvements to LA's Metrolink.
The dollars spent on commuter rail improvements will have "an immediate and dramatic effect," according to the Authority's chairman, Dan Richard. Will Kempton, chief executive of the Orange County Transportation Authority (OCTA) and chairman of the Independent Peer Review Group advising the High Speed Rail Authority, concurs. It will be a good investment, he said, whether or not the overall $68 billion high-speed rail project ever gets completed.
Similarly, in the Northeast Corridor where Amtrak has proposed a 30-year $151 billion capital investment program to bring true high-speed rail service between Boston and Washington DC, the initial efforts will be focused on "meaningful incremental improvements" in track, catenary and signals in the New York-to-Philadelphia corridor (the "NEC Upgrade Program"). This stretch of the line was chosen for the initial upgrade because it carries a heavy volume of local commuter traffic in addition to serving long distance trains. As in the case of California’s "bookend" improvements, the upgrades of the 90-mile NY-Philadelphia rail line will not only benefit large numbers of travelers – they also will be far more cost-effective in dollars-per-passenger terms than any eventual improvements raising line speeds over the entire Boston-to-Washington corridor.
Thus, fiscal, economic and political constraints have caused both the California Bullet Train and the Amtrak vision for the Northeast Corridor — the only two projects that have survived on the Obama Administration's vaunted high-speed rail agenda — to morph largely into a program of modest near-term improvements in existing commuter rail services. Lack of funds may prevent either project from achieving its avowed goal of providing true high-speed rail service — in the case of California, reducing travel time between LA and San Francisco to two hours and forty minutes (see Note below). To achieve it, the California project will require $68 billion; the NEC program will need $151 billion.
Is this goal even worth pursuing? Some people think so — in fact, they passionately believe in it. They contend that in order to make our cities less auto-dependent we need to invest in high-speed trains. Minor upgrades in existing rail services, they argue, will not make a significant dent in auto use. But many planners beg to differ. They believe that the best chance of persuading current auto users to leave their cars at home is to improve the daily suburban rail commute. Business travelers will continue flying because they look for the fastest way to get to their destination. Families on vacation trips will not abandon their cars in favor of trains because cars offer the least costly and most convenient way to travel to holiday destinations. The only sector of the traveling public that can be influenced to shift to trains in large numbers is suburban commuters.
What of the argument that a great nation like ours — a nation that built the Erie Canal, the transcontinental railroad, the Panama Canal and the Interstate Highway System — should continue the tradition of visionary, grandiose public works?
Regrettably, both ventures have come at a most inopportune time. The nation is recovering from a serious recession and is trying to rein in the deficit and reduce the $16 trillion national debt. At a more distant moment, when the economy is growing again and the deficit has come under control, the nation might be able to resume its tradition of pursuing "bold endeavors" — ambitious programs of federally financed public works that benefit the whole nation. When that time comes, perhaps toward the end of this decade, it might be appropriate to revive the idea of high-speed rail — at least in the context of the densely populated Northeast Corridor, where road and air traffic congestion may eventually threaten its continued growth and productivity. For now, prudence, good sense and the nation's fiscal well-being require that we lower our sights and focus on improving commuter rail connections.
Note on the Status of the California HSR Project
There is a high likelihood that the LA-SF bullet train project will never get completed. Lawsuits are pending to stop construction of the first stage of the project — the Central Valley segment from Madera to Bakersfield. A motion for a preliminary injunction has been filed by Madera County, the Madera and Merced County farm bureaus and other opponents. The motion seeks to prevent the rail Authority from moving forward on the initial Madera-to-Fresno section until a trial on the lawsuit is completed. The hearing on the preliminary injunction is set for November 16.
Even if the preliminary injunction is denied, construction on the rail section will not begin until the fall of 2013, according to a legal declaration filed by the Authority in the Sacramento Superior Court. What's more, the Madera-to-Fresno section will not be electrified before 2022, according to the rail Authority — and then only if more funds become available. Additional legal challenges are expected over the Fresno-to-Bakersfield section of the line. The City of Bakersfield has already announced plans to file a lawsuit contending that the Authority's environmental impact report doesn't meet CEQA standards. The cumulative effect of these delays has led to speculation that the Authority may not be able to complete work on the Central Valley segment by September 2017, when the federal $3 billion grant expires. And if the federal money stops flowing, who will step in to fill the gap?
Is the "infrastructure crisis" a myth or a reality? Many within the transportation community firmly believe that the crisis is real. They point out that many of our roads, bridges and transit systems are approaching the end of their useful life and are badly in need of repair, reconstruction and modernization. They are convinced that without an ambitious program of investment ---beyond the billions that already are being spent---the transportation infrastructure will continue to deteriorate, rendering great harm to the nation's economy. They find it difficult to understand why politicians and the public do not necessarily share the same sense of urgency. They tend to blame themselves for doing a poor job of "educating" the public about the catastrophic consequences of inaction.
Even though the new two-year transportation bill has barely gone into effect (on October 1), activists already are strategizing how better, i.e. more convincingly, to present the case for higher transportation spending in the next transportation bill. As an AASHTO spokesman reminded us recently, "it is never too early to consider your strategy for making the case that the United States should continue to invest in its transportation infrastructure." "We can't afford to relax," echoed Pete Ruane, president of the American Road and Transportation Builders Association (ARTBA). "We're in a very serious struggle over the future of federal investment in transportation." Similar sentiments have been voiced in various transportation-related meetings over the past several months.
But proponents of greater spending ignore the political realities. With mounting deficits and the shadow of a $16 trillion debt hovering over all fiscal decisions, Congress is not about to vastly increase spending on transportation. Concern about deteriorating infrastructure has failed to resonate with the electorate during the election campaign. Nor did the presidential candidates care to mention transportation in their recent debate on domestic priorities, despite pleas by stakeholder groups to include infrastructure on the political agenda.
Infrastructure crisis believers decry this supposed "indifference" or "short-sightedness" on the part of the politicians and the public. But their anger is misplaced. People recognize and acknowledge the need to modernize and expand the nation's infrastructure. They simply are not convinced by the "sky is falling" rhetoric employed by the alarmists---dire warnings of collapsing bridges and crumbling roads if government does not greatly increase spending on infrastructure.
As the Washington Post editorialized not too long ago, people see no signs of "crumbling infrastructure." They trust their own eyes more than they trust the unverified claims of the experts — and what they see is highways and transit networks that are well maintained and functioning smoothly and reliably most of the time. They suspect that warnings of catastrophic consequences if spending on infrastructure is not boosted are overblown, self-serving, and more often than not inspired by liberal advocacy groups, lobbyists and industry spokesmen who have a financial stake in pushing for more federal spending. As one senior congressional aide confided to us, "I don't see our constituents lobbying to raise the gas tax in order to spend more money on transportation."
Moreover, the public is not sure that all of the billions of dollars that the federal government already devotes to transportation ($114 billion in FY 2012) are spent wisely, nor that more money will make the transportation system perform any better (e.g. reduce congestion). They believe that the desire to greatly increase investment in infrastructure must be tempered by the overriding imperative to get the nation's fiscal house in order.
The fiscal and political climate in the next few years will make the job of convincing the skeptical electorate to support higher transportation spending even harder. Funding constraints will continue to make it difficult if not downright impossible for Congress to commit hundreds of billions of federal dollars in a single legislative package, regardless of which party controls the purse strings. Unwilling to raise fuel taxes, Congress is likely to embrace short-term bills as a convenient way out of the dilemma. Short-term authorizations such as MAP-21 will require only modest transfers from the general fund ---especially if states are willing to step in with increased contributions of their own. On the other hand, a six-year bill would require an injection of nearly $90 billion in general revenue.
To be sure, some in the stakeholder community will contend that longer-term (i.e. five- or six-year) authorizations are necessary to allow for orderly planning and implementation of capital projects. They will argue that short-term bills will not provide the kind of funding certainty that major public works require. But to the extent that large capital investments still figure on State DOTs’ and transit authorities’ agendas, private capital, tolling, and credit instruments such as TIFIA and state infrastructure banks, will provide adequate alternatives to the funding stability that long-term congressional authorizations offered in years past.
The bottom line: regardless of the outcome of the November elections, do not expect a boost in federal transportation spending. Indeed, minor reductions in discretionary programs (TIGER, New Starts) are possible if automatic year-end spending cuts under sequestration are not avoided.
The British Broadcasting Corporation (BBC) has just published a list of 10 "monster commutes" around the world. Some are to be expected, and are usually found on any list of extreme traffic congestion, such as Jakarta, Bangkok, Manila, Mumbai, Seoul, Nairobi and Dhaka.
Lexington? However, reading further it becomes clear that the BBC story deserves its own exhibit in the "Ripley's Don't Believe It" Room at the British Museum. The BBC lists Lexington, Kentucky as one of 10 cities with "monster traffic jams." At first I thought the BBC might have listed the wrong "L" place, having intended to cite Lagos or Lima instead. Not so, however, since the BBC quotes a Lexington commuter who claims to have spent an hour commuting to work one morning.
That, surely, is not the experience of the average Lexington resident. According to the United States Census Bureau, the average work trip travel time, one way, in the Lexington metropolitan area is 21 minutes. This compares to the US national average of approximately 25 minutes. Researchers David Hartgen and M. Gregory Fields estimated the excess travel time during the peak hour in Lexington at five percent in 2003 (traffic congestion has not become serious enough to warrant the attention of the long-standing Texas Transportation Institute's congestion reporting system). A quick review of data supplied by INRIX suggests that about 150 out of more than 180 rated US, European and Canadian metropolitan areas have worse traffic congestion than Lexington.
Austin? Perhaps a stronger case can be made for the inclusion of Austin, Texas on the list. But even so, Austin barely makes the most congested quarter of the INRIX international list. Austin's worse-than-average traffic congestion is the result of its late development of an express roadway system: this metropolitan area of nearly 2,000,000 population was the last in the nation to connect two freeways together.
The BBC's Austin commuter is quoted as indicating that he commutes by car, for which "I castigate myself daily." He continues: "I see two things that make me feel both guilty and shocked. A vacant city bus inching along my route and an empty tram cutting across traffic at 5pm." He misses the point. If the city bus is vacant and the tram is empty, it is because they do not meet the needs of a sufficient number of customers (needs which, by the way, can only be defined by consumers, not planners).
The proof is the crowded buses and trains that converge on six large downtown areas in the United States, where 40 percent to 75 percent of commuters use transit. This is not because the people who work south of 59th Street in Manhattan, in Chicago's Loop, or in the downtown areas of Philadelphia, Washington, Boston or San Francisco have more effectively managed their guilt than the Austin commuter. It is rather because transit meets their needs. Commuters are rational. They take the mode of transport that best suits their needs. Transit's market shares around the country (many of them minuscule) speak volumes about how well transit meets the needs of potential customers.
Finally, BBC's Austin commuter claims that it takes 45 minutes to drive three kilometers (2 miles) to work (walking would be as fast for most people). It is hard to imagine a more unrepresentative commute in Austin. According to the United States Census Bureau, the average one way commute in Austin in 2011 was 26 minutes. Somehow 85 percent of Austin commuters get to work in less time than the Austin commuter, and they travel a lot farther.
Last year, in congressional testimony before the House Transportation and Infrastructure Committee hearing on high speed rail, we cited the Chicago-to-St. Louis "high-speed rail" project as an example of the Administration's wasteful use of its economic stimulus money. We pointed out that the $1.4 billion program of track upgrades will allow a top speed of 110 mph but will raise the average speed of Amtrak trains between Chicago and St. Louis by only 10 miles per hour, from 53 to 63 mph. The four-and-a-half hour trip time will be cut by a mere 48 minutes, to three hours and forty minutes. In France, TGV trains between Paris and Lyon cover approximately the same distance (290 miles) in a little under two hours, at an average speed of 150 mph. Yet federal officials did not hesitate to proclaim the Chicago-St. Louis project "historic" and hail it as "one giant step closer to achieving high-speed rail passenger service."
Now, a Chicago Tribune story, linked here and excerpted below, confirms just how "ridiculously expensive" and "uneconomical" this project is turning out to be. As the editorial points out, the project stands to "drain funding from mundane projects that could make a much bigger difference" — something the California High Speed Rail Authority has belatedly recognized in diverting almost half of the initial $10 billion stage of its bullet train project to upgrading "mundane" commuter rail services in Los Angeles and the Bay Area.
In recent years, under the banner of economic stimulus, the federal government has spent a ton of money getting the tracks ready for those speedy locomotives. In the Chicago-St. Louis corridor, for instance, Uncle Sam has poured at least $1.4 billion into crossing improvements and other upgrades. Between Chicago and Detroit, more than $400 million has been spent.
How would you feel, taxpayer, if we told you that some of the work might need to be torn up and redone?
Angry? You bet.
A debate over just how fast high-speed trains should operate could turn very costly very soon.
The issue comes down to 15 miles per hour.
Today the US Bureau of the Census released a fascinating report on metropolitan area population growth by radius from the corresponding city halls. The report provides summary tables indicating the metropolitan areas that had the greatest and least growth, for example, near the downtown areas. I was surprised to find that Salt Lake City had done so well, having seen its population rise from 336,000 to 355,000 within a two mile radius of city hall (Table 3-7). That struck me as odd. A two mile radius encompasses an area of only 12.6 square miles, for a density of about 28,000 per square mile. Only the city of San Francisco has densities that high over so large an area in the West. Moreover, all of the municipality of Salt Lake City is within two miles of city hall, and the 2010 census counted only 186,000 people in the entire city of nearly 110 square miles.
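The sanity check above is easy to reproduce. A minimal Python sketch, using only the population and radius figures quoted above (the 28,000 figure is rounded):

```python
import math

# Population reported within a 2-mile radius of Salt Lake City's
# city hall in the Census report (Table 3-7)
reported_pop = 355_000

# Area of a circle with a 2-mile radius: pi * r^2, about 12.6 square miles
area_sq_mi = math.pi * 2 ** 2

# Implied density: roughly 28,000 people per square mile
implied_density = reported_pop / area_sq_mi

# The entire ~110 square mile municipality counted only 186,000 people
# in the 2010 census, so the reported within-2-miles figure cannot be right
citywide_pop_2010 = 186_000

print(f"{area_sq_mi:.1f} sq mi, {implied_density:,.0f} per sq mi")
print(reported_pop > citywide_pop_2010)
```

The implied density exceeds anything outside the city of San Francisco in the West, and the within-radius count exceeds the entire city's census count, which is what flagged the table as suspect.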
In reviewing the backup file (worksheets "Pop2000," "Pop2010," "Density2000" and "Density2010"), I discovered that Salt Lake City's data was actually that of San Francisco, and that metropolitan Salt Lake City was credited with 3.2 million more people than it has. Another surprise was that the San Francisco metropolitan area was reported with 260,000 people, less than one-third the population reported for the core city of San Francisco in 2010. Santa Fe had a reported population of 3.4 million people, about 1.4 million more than live in the entire state of which it is the capital. Further, in at least 35 cases, the populations for metropolitan areas did not correspond to those reported in the 2010 census.
Obviously this is the kind of automated (computer) error that can happen to anyone or any agency. Nonetheless, an immediate correction would be appropriate.
With considerable effort, we were able to get through to the public information office at the Bureau of the Census to notify them of the error.
Until a corrected report is issued, any analysis of the report will need to be very cautious indeed. We look forward to the revision.
With California State Redevelopment Agency money gone, the city of Los Angeles ought to welcome new large-scale private development, and the economic stimulus and job creation it brings, with open arms. City Hall, faced with an anemic municipal budget, could also use the increased tax revenue. One such project that would help abate the city’s budget woes and create new jobs for the city is the University of Southern California’s proposed $1.1 billion “The Village at USC” project.
Surprisingly (or perhaps not), the city’s Planning and Land Use Management Committee delayed approval of the project for the second time last week, citing a need for more time to digest data regarding the project’s gentrifying effects on the surrounding community. The city is not fooling anyone – the delay amounts to nothing short of extortion – an attempt to ensure that committee members receive their proper concessions.
The site for "The Village at USC" is located directly north of the campus on University-owned land. Currently a dilapidated retail center, the new project calls for 350,000 square feet of retail and will add up to 5,200 much-needed student beds. The project would also create 12,000 new jobs for the city (8,000 permanent and 4,000 construction-related).
Comprehending the short-sightedness of delaying the project requires an understanding of USC's role in its surrounding neighborhood (full disclosure: this writer is a graduate of USC). The university was founded in 1880, when LA was nothing more than a far outpost of western American expansion. The campus sits just 2 miles south of downtown, and the city grew outward around it. Once an upscale neighborhood, the area immediately adjacent to USC lost its luster with the development of the city's Westside, including Hollywood and Beverly Hills. Post-WWII suburban expansion and the construction of the 110 and 10 freeways further eroded the area.
Today the area surrounding USC's campus is racially and economically polarized. Part of LA's notorious South Central (now referred to, more politically correctly, as "South LA"), the area was hard hit by the riots of 1992. Yet while crime is still an issue, the area has markedly improved since the riots. Much of the improvement is thanks to a shift in the University's relationship to its surrounding neighborhood post-1992. Rather than continuing to see itself as an island fortress in a sea of urban chaos, USC reached out to the local community, sponsoring programs for community members and supporting local businesses. The University's extensive community outreach efforts led it to be named TIME magazine's "University of the Year" in 2000.
As Los Angeles developed, USC had several opportunities to relocate its campus to other parts of the city and even Orange County, but its commitment to staying in the city’s center stood the test of time. The University is the largest private employer in Los Angeles and serves as a wellspring of knowledge and talent for the city. Given these contributions to LA, it is unfortunate and even appalling that the city’s Planning and Land Use Management Committee would question the University’s intentions and delay its plans to develop on land it owns with its own money (and without any handouts from the city or state).
Adam Nathaniel Mayer is an American architectural design professional. In addition to his job designing buildings he writes the China Urban Development Blog. Follow him on Twitter: @AdamNMayer.
The Census Bureau's American Community Survey released its annual one-year snapshot of demographic data in the United States. As usual, this included journey-to-work (commuting) data, which is summarized in the table below.
(Table: American Community Survey Commuting Data, 2011, 2010 & 2000 — estimates of total commuters by mode, including Drive Alone, Carpool, Transit, Motorcycle, Taxi & Other, and Work at Home. Sources: 2000 and 2010 Census and 2011 American Community Survey.)
Trends Since 2010
As estimated employment improved from 137.9 million in 2010 to 138.3 million in 2011, there was an increase of 800,000 in the number of commuters driving alone, which, as usual, represented the vast majority of commuting (105.6 million daily one-way trips), at 76.40 percent. This was not enough, however, to avoid a small (0.17 percentage point) decline in market share.
Car pooling experienced a rare increase of 120,000 commuters, which nonetheless translated into a 0.1 percentage point loss in market share, to 9.68 percent. Transit increased by 190,000 commuters and had a 0.09 percentage point increase in market share, to 5.03 percent. This brought transit's market share above its 2008 share of 5.01 percent and near its 1990 market share of 5.11 percent.
Working at home increased by 70,000, with a modest 0.1 percentage point increase from 2010.
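The market shares cited above follow directly from the mode counts. A quick sketch of the arithmetic, using the figures quoted in the text (in millions) and treating the 138.3 million employment estimate as the commuter base — an assumption, since the published shares use the exact commuter total, so the results differ slightly from the published figures:

```python
# ACS 2011 figures quoted above, in millions of commuters;
# the 138.3 million employment estimate stands in for the base
total = 138.3
drove_alone = 105.6

def share_pct(count: float, base: float = total) -> float:
    """Market share of a mode, as a percent of all commuters."""
    return 100 * count / base

drive_alone_share = share_pct(drove_alone)   # close to the cited 76.40 percent

# Counts implied by the published carpool and transit shares
carpool = 9.68 / 100 * total                 # roughly 13.4 million
transit = 5.03 / 100 * total                 # roughly 7.0 million

print(f"drive alone: {drive_alone_share:.1f}%")
print(f"carpool: {carpool:.1f}M, transit: {transit:.1f}M")
```

Working backward from the shares this way is a useful cross-check on the published counts, since small rounding differences in the base show up immediately.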
Trends Since 2000
Even with falling household incomes and rising gasoline prices, single-occupant commuting continued to rise between 2000 and 2011. Solo drivers increased by nearly 8 million, more than total transit commuting in 2011. Car pooling continued its long-term decline, falling 2.2 million. Transit did well (as would be expected with unfavorable economic conditions and unprecedented gasoline price increases), as we noted last year, having added 1.1 million commuters. This was spread thinly around the country, though with a 70 percent concentration in New York and Washington, DC. Over the period, working at home experienced an increase of 1.8 million, the largest increase outside solo driving.
For the most part the commuting data was ignored by the media — and for good reason. The one-year changes were predictably modest. However, the exception was USA Today, with a top-of-the-webpage "Fewer Americans Driving Solo" headline. In fact, as noted above, both the short-term and long-term trends reflected an increase in solo driving. Moreover, reading the story, it would be easy to get the impression that a sea change had occurred in how people get to work. To its credit, however, USA Today appropriately labeled the likely reasons behind the molehills it made into mountains: the economy and gasoline prices.