Tuesday, November 30, 2010

Declining Energy Quality Could Be Root Cause of Current Recession, Expert Suggests

Many economists have pointed to a bursting real estate bubble as the initial trigger for the current recession, which in turn caused global investments in U.S. real estate to turn sour and drag down the global economy. King suggests the real estate bubble burst because individuals were forced to pay a higher and higher percentage of their income for energy -- including electricity, gasoline and heating oil -- leaving less money for their home mortgages.

In economic terms, the quality of the nation's energy supply is referred to as Energy Return on Energy Investment (EROI). For example, if an oil company uses a 10th of a barrel of oil to drill, pump, transport and refine one barrel of oil, the EROI for the refined fuel is 10.
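The arithmetic can be sketched in a few lines (a minimal illustration of the definition above, not King's code):

```python
def eroi(energy_delivered, energy_invested):
    """Energy Return on Energy Investment: units of energy delivered
    to society per unit of energy spent finding, extracting and
    refining it."""
    return energy_delivered / energy_invested

# The article's example: one barrel of refined fuel costs a tenth
# of a barrel to drill, pump, transport and refine.
print(eroi(1.0, 0.1))  # 10.0
```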

"Many economists don't think of energy as being a limiting factor to economic growth," says King, a research associate in the university's Center for International Energy and Environmental Policy."They think continual improvements in technology and efficiency have completely decoupled the two factors. My research is part of a growing body of evidence that says that's just not true. Energy still plays a big role."

In a paper published this November in the journal Environmental Research Letters, King introduced a new way to measure energy quality, the Energy Intensity Ratio (EIR), that is easier to calculate, highly correlated to EROI and in some ways more powerful than EROI. EIR measures how much profit is obtained by energy consumers relative to energy producers. The higher the EIR, the more economic value consumers (including businesses, governments and people) get from their energy.

When King plots EIR for various fuels every year since World War II, the graphs indicate two large declines, one before the recessions of the mid-1970s and early 1980s and the other during the 2000s, leading up to the current economic recession. There have been other recessions in the U.S. since World War II, but the longest and deepest were preceded by sustained declines in EIR for all fossil fuels.

EIR is proportional to EROI, meaning they rise and fall together, but the basic data behind the EIR calculations come out annually as opposed to every five years for EROI. EIR also gives insight into different parts of the supply chain such as at the refinery or at the gas pump, which are harder to study with EROI.

King's analysis suggests if EIR falls below a certain threshold, the economy stops growing. For example, in 1972, EIR for gasoline was 5.9 and in 2008 it was 5.5. During times of robust economic growth, such as the 1990s, EIR for gasoline was well over eight. Compare that to some estimates of EROI and EIR for corn ethanol of around one, and it's clear why corn ethanol has been widely criticized as a low quality energy source.
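King's threshold idea can be sketched as a small check. The 1972 and 2008 EIR values come from the article; the 1990s value and the cutoff of 6 are illustrative placeholders, since the article only says growth-era EIR was "well over eight" and does not publish an exact threshold:

```python
# EIR values for gasoline; 1972 and 2008 are from the article,
# 1995 is a stand-in for the "well over eight" 1990s figure.
eir_gasoline = {1972: 5.9, 1995: 8.2, 2008: 5.5}
THRESHOLD = 6.0  # hypothetical cutoff below which growth stalls

for year in sorted(eir_gasoline):
    eir = eir_gasoline[year]
    verdict = "growth at risk" if eir < THRESHOLD else "room to grow"
    print(f"{year}: EIR {eir} -> {verdict}")
```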

To get the U.S. economy growing again, King says Americans will have to produce and use energy more efficiently. That's essentially what the U.S. did after the last energy crisis by raising fuel efficiency standards for cars, increasing use of natural gas for electric power generation and developing new technologies such as Enhanced Oil Recovery to coax more oil out of the ground.

"If we aren't fundamentally changing the way we produce or consume energy now, don't expect the economy to grow as much as the past two decades," he says.

Disclaimer: Views expressed in this article do not necessarily reflect those of ScienceDaily or its staff.


Source

Thursday, November 25, 2010

North Sea Oil Recovery Using Carbon Dioxide Is Possible, but Time Is Running Out, Expert Says

A new calculation by Durham University of the net worth of the UK oil field shows that using carbon dioxide (CO2) to enhance the recovery from our existing North Sea oil fields could yield an extra three billion barrels of oil over the next 20 years. Three billion barrels of oil could power, heat and transport the UK for two years with every other form of energy switched off.

Importantly, at a time of rising CO2 emissions, the enhanced oil recovery process is just about carbon neutral, with as much carbon being put back in the ground as will be taken out.

The technique could yield an enormous amount of oil revenue at a time of public service cuts and developing the infrastructure would put the UK in the driving seat for developing enhanced recovery off-shore oil production around the world. It would also allow the UK to develop its carbon storage techniques in line with the UK government's commitments on emissions reductions.

The study, funded by DONG Energy (UK) Ltd. and Ikon Science Ltd., was presented Oct. 14, 2010, at a conference on Carbon Capture and Storage (CCS), at the Institution of Mechanical Engineers, London. The new figures are conservative estimates and extend a previous calculation that predicted a 2.7 billion barrel yield from selected fields in the North Sea.

The UK Government's Energy Statement, published in April 2010, outlines the continued role that fossil fuels will have to play in the UK energy mix. CO2-enhanced oil recovery in the UK would secure supplies for the next 20 years.

Jon Gluyas, a Professor in CCS & Geo-Energy, Department of Earth Sciences, Durham University, who has calculated the new figures, said: "Time is running out to make best use of our precious remaining oil reserves because we're losing vital infrastructure as the oil fields decline and are abandoned. Once the infrastructure is removed, we will never go back and the opportunity will be wasted.

"We need to act now to develop the capture and transportation infrastructure to take the CO2to where it is needed. This would be a world-leading industry using new technology to deliver carbon dioxide to the North Sea oil fields. We must begin to do this as soon as possible before it becomes too expensive to do so.

"My figures are at the low end of expectations but they show that developing this technology could lead to a huge rejuvenation of the North Sea. The industrial CO2output from Aberdeen to Hull is all you need to deliver this enhanced oil recovery."

Carbon dioxide is emitted into the atmosphere when fossil fuels are burnt and the UK Government plans to collect it from power stations in the UK. Capturing and storing carbon dioxide is seen as a way to prevent global warming and ocean acidification. Old oil and gas fields, such as those in the North Sea, are considered to be likely stores.

Enhanced oil recovery using carbon dioxide (CO2-EOR) adds further value to the potential merits of CCS.

Oil is usually recovered by flushing the field with pressurized water. Since the 1970s, oil fields in West Texas, USA, have been successfully exploited using carbon dioxide. CO2 is pumped as a fluid into oil fields at elevated pressure and helps sweep the oil to the production wells by contacting parts of the reservoirs not accessed by water injection; the result is much greater oil production.

Experience from the USA shows that an extra four to twelve per cent of the oil in place can be extracted using CO2-EOR. Professor Gluyas calculated the total oil in place in the UK fields and the potential UK gain in barrels and revenue from existing reserves using the American model.
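The scaling can be sketched as back-of-envelope arithmetic. The 4-12% recovery range comes from the article; the oil-in-place figure below is an illustrative placeholder, not Professor Gluyas's actual input:

```python
# Extra recoverable barrels if CO2-EOR frees 4-12% of oil in place,
# following the American-model scaling described above.
def eor_gain(oil_in_place_bbl, low=0.04, high=0.12):
    """Return the (low, high) range of extra barrels from CO2-EOR."""
    return oil_in_place_bbl * low, oil_in_place_bbl * high

lo, hi = eor_gain(40e9)  # assume 40 billion barrels in place (made-up figure)
print(f"{lo/1e9:.1f} to {hi/1e9:.1f} billion extra barrels")
```

With that assumed oil-in-place figure the range brackets the study's 3 billion barrel estimate.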

David Hanstock, a founding director of Progressive Energy and director of COOTS Ltd, which is developing an offshore CO2 transport and storage infrastructure in the North Sea, said: "The UK has significant storage capacity potential for captured carbon dioxide in North Sea oil and gas fields.

"There is a unique opportunity to develop a new offshore industry using our considerable experience in offshore engineering. This would give us a technical lead on injecting and monitoring CO2that we could then export to the wider world to establish the UK as a world leader in carbon capture and storage technology."

Professor Gluyas added: "Enhanced recovery of oil in the North Sea oil fields can secure our energy supplies for the next fifty years. The extra 3 billion barrels of oil that could be produced by enhanced CO2 recovery would make us self-sufficient and would add around £60bn in revenue to the Treasury.

"Priming the system now would mean we have 10-15 years to develop CO2recycling and sufficient time to help us bridge to a future serviced by renewable energy."


Source

Wednesday, November 24, 2010

Oil Will Run Dry 90 Years Before Substitutes Roll Out, Study Predicts

The forecast was published online on Nov. 8 in the journal Environmental Science & Technology. It is based on the theory that long-term investors are good predictors of whether and when new energy technologies will become commonplace.

"Our results suggest it will take a long time before renewable replacement fuels can be self-sustaining, at least from a market perspective," said study author Debbie Niemeier, a UC Davis professor of civil and environmental engineering.

Niemeier and co-author Nataliya Malyshkina, a UC Davis postdoctoral researcher, set out to create a new tool that would help policymakers set realistic targets for environmental sustainability and evaluate the progress made toward those goals.

Two key elements of the new theory are market capitalizations (based on stock share prices) and dividends of publicly owned oil companies and alternative-energy companies. Other analysts have previously used similar equations to predict events in finance, politics and sports.

"Sophisticated investors tend to put considerable effort into collecting, processing and understanding information relevant to the future cash flows paid by securities," said Malyshkina."As a result, market forecasts of future events, representing consensus predictions of a large number of investors, tend to be relatively accurate."

Niemeier said the new study's findings are a warning that current renewable-fuel targets are not ambitious enough to prevent harm to society, economic development and natural ecosystems.

"We need stronger policy impetus to push the development of these alternative replacement technologies along," she said.


Source

Tuesday, November 23, 2010

Why It's Hard to Crash the Electric Grid

Last March, the U.S. Congress heard testimony about a scientific study in the journal Safety Science. A military analyst worried that the paper presented a model of how an attack on a small, unimportant part of the U.S. power grid might, like dominoes, bring the whole grid down. Members of Congress were, of course, concerned. Then, a similar paper came out in the journal Nature the next month that presented a model of how a cascade of failing interconnected networks led to a blackout that covered Italy in 2003.

These two papers are part of a growing reliance on a particular kind of mathematical model -- a so-called topological model -- for understanding complex systems, including the power grid.

And this has University of Vermont power-system expert Paul Hines concerned.

"Some modelers have gotten so fascinated with these abstract networks that they've ignored the physics of how things actually work -- like electricity infrastructure," Hines says,"and this can lead you grossly astray."

For example, the Safety Science paper came to the "highly counter-intuitive conclusion," Hines says, that the smallest, lowest-flow parts of the electrical system -- say a minor substation in a neighborhood -- were likely to be the most effective spots for a targeted attack to bring down the U.S. grid.

"That's a bunch of hooey," says Seth Blumsack, Hines's colleague at Penn State.

Hines and Blumsack's recent study, published in the journal Chaos on Sept. 28, found just the opposite. Drawing on real-world data from the Eastern U.S. power grid and accounting for the two most important laws of physics governing the flow of electricity, they show that "the most vulnerable locations are the ones that have most flow through them," Hines says. Think highly connected transformers and major power-generating stations. Score one point for common sense.

"If the government takes these topological models seriously," Hines says,"and changes their investment strategy to put walls around the substations that have the least amount of flow -- it would be a massive waste of resources."

At the speed of light

Many topological models are, basically, graphs of connected links and nodes that represent the flows and paths within a system. When a node changes or fails, its nearest connected neighbor will often change or fail next. This abstraction has provided profound insights into many complex systems, like river networks, supply chains, and highway traffic. But electricity is strange, and the U.S. electric grid even stranger.

In August 2003, a blackout started in Ohio and then spread to New York City; Cleveland went down, and soon Toronto was affected. The blackout jumped over long distances.

"The way topological cascades typically occur -- is they're more like real dominoes," says Hines, an assistant professor in UVM's College of Engineering and Mathematical Sciences."When you push a domino the only thing that can fall is the one next to it. Whereas in a power grid you might push one domino and the next one to fall might be a hundred miles away."

That's because,"when a transmission line fails -- instantly, at nearly the speed of light, everything changes. Everything that is connected will change just a little bit," Hines says,"But in ways that are hard to predict." This strangeness is compounded by the fact that the U.S. electric grid is more an intractable patchwork of history than a rational design.

Which is why he and Blumsack decided to "run a horse race," he says, between topological models and a physics-based one -- applied to the actual arrangement of the North American Eastern Interconnect, the largest portion of the U.S. electric grid.

Using real-world data from a 2005 North American Electric Reliability Corporation test case, they compared how vulnerable parts of the grid appeared in the differing models. The topological measures -- so-called "characteristic path lengths" and "connectivity loss" between nodes -- came up with dramatically different and less accurate results than a model that calculated blackout size driven by the two rules that most influence actual electric transmissions -- Ohm's and Kirchhoff's laws.

In other words, the physics horse won. Or, as their paper concludes, "evaluating vulnerability in power networks using purely topological metrics can be misleading," and "results from physics-based models are more realistic and generally more useful for infrastructure risk assessment." Score one for gritty reality.

The value of unpredictability

An important implication of Hines's work, funded by the National Science Foundation, is that the electric grid is probably more secure than many people realize -- because it is so unpredictable. This, of course, makes it hard to improve its reliability (in another line of research, Hines has explored why the rate of blackouts in the United States hasn't improved in decades), but the upside of this fact is that it would be hard for a terrorist to bring large parts of the grid down by attacking just one small part.

"Our system is quite robust to small things failing -- which is very good," he says,"Even hurricanes have trouble taking out power systems. Hurricanes do cause power system failures, but they don't often take out the whole system."

Blumsack agrees."Our paper confirms that it would be possible for somebody who wanted to do something disruptive to the power grid to do so," he says."A lot of the infrastructure is out in the open," which does create vulnerability to planned attack."But if you wanted to black out half of the U.S., it will be much more difficult than some of these earlier models imply," he says.

"If you were a bad guy, there is no obvious thing to do to take out the power system," Hines says."What we learned from doing the simulations is that if you take out the biggest substation, with the most flow, you get the biggest failure on average. But there were also a number of cases where, even if you took out the biggest one, you don't get much of a blackout."

"It takes an incredible amount of information," he says,"to really figure out how to make the grid fail."


Source

Monday, November 22, 2010

Selfishness Can Sometimes Help the Common Good, Yeast Study Finds

The researchers, from Imperial College London, the Universities of Bath and Oxford, University College London and the Max Planck Institute for Evolutionary Biology, studied populations of yeast and found that a mixture of 'co-operators' and 'cheats' grew faster than a more utopian one of only 'co-operators'.

In the study, the 'co-operator' yeast produce a protein called invertase that breaks down sugar (sucrose) to give food (glucose) that is available to the rest of the population. The 'cheats' eat the broken down sugar but don't make invertase themselves, and so save their energy.

The study, which appears in the journal PLoS Biology, published by the Public Library of Science, used both laboratory experiments and a mathematical model to understand why and how a little "selfishness" can benefit the whole population.

Professor Laurence Hurst, Royal Society-Wolfson Research Merit Award Holder at the University of Bath, explained: "We found that yeast used sugar more efficiently when it was scarce, and so having 'cheats' in the population stopped the yeast from wasting their food.

"Secondly we found that because yeast cannot tell how much sucrose is available to be broken down, they waste energy making invertase even after there is no sugar left. This puts a brake on population growth. But if most of the population are 'co-operators' and the remainder are 'cheats', not all of the population is wasting their energy and limiting growth.

"For these effects to matter, we found that 'co-operators' needed to be next to other 'co-operators' so they get more of the glucose they produce. If any of these three conditions were changed, the 'cheats' no longer benefitted the population."

Dr Ivana Gudelj, NERC Advanced Fellow and Lecturer in Applied Mathematics at Imperial College London, added: "Our work illustrates that the commonly used language of 'co-operators' and 'cheats' could in fact obscure the reality.

"When the addition of more invertase producers reduces the fitness of all, it is hard to see invertase production as co-operation, even if it behaves in a more classical co-operative manner, benefitting all, when rare."

The researchers suggest similar situations may exist in other species where 'cheats' help rather than hinder the population.

This study was funded by the Royal Society, the Natural Environment Research Council (NERC) and Conacyt.


Source