A Fifth of Global Power Generation Assets at Risk of Being Stranded If the World Is to Meet Its Climate Goals

A fifth of current global power plant capacity is at risk of becoming stranded if the world is to meet the climate goals set out in the Paris Agreement, new research from the Oxford Martin School has found.

Power plants are amongst the world’s largest emitters of CO2, the main cause of global warming. Future emissions from plants already in operation will overshoot the target for stabilising temperatures at 1.5-2°C above pre-industrial levels by 60 gigatons of CO2, the researchers say. On top of this, a further $7.2 trillion is likely to be invested in expanding the industry with new power plants and grids over the coming decade, which they say will commit the world to a further 270 gigatons of emissions.

The researchers warn that many fossil fuel power plants are likely to have to be retired early, under-utilised or undergo costly retrofitting with carbon capture and storage (CCS) in order to meet the globally agreed cap on temperature rises.

“Existing power plant stock, if operated until the end of its useful life, would emit around 300 gigatons of CO2, which exceeds the 240 gigatons we can afford if we are to meet our climate goals,” said lead author Alexander Pfeiffer of the Oxford Martin Programme on Integrating Renewable Energy.
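
The arithmetic behind the quoted figures can be checked directly. The sketch below uses only the headline numbers from the study; the 330-gigaton total is a derived illustration, not a figure the researchers state.

```python
# Headline figures from the study (gigatons of CO2)
committed_existing = 300   # emissions if current plants run to end of life
carbon_budget = 240        # remaining budget consistent with 1.5-2°C
committed_planned = 270    # emissions locked in by planned plants and grids

overshoot_existing = committed_existing - carbon_budget
overshoot_total = committed_existing + committed_planned - carbon_budget

print(overshoot_existing)  # 60 Gt: the overshoot from existing stock alone
print(overshoot_total)     # 330 Gt once planned investment is also included
```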

“Any investment made today in CO2-emitting infrastructure is going to have a considerable effect on humanity’s ability to achieve the ambitions of the Paris climate agreement.”

“Companies and investors need urgently to reassess their investments in fossil-fuel power plants, and government policies need to be strengthened to avoid further carbon lock-in.”

Professor Cameron Hepburn, Director of the Economics of Sustainability Programme at the University of Oxford and co-author of the study, said, “The analysis shows we are already in a serious dilemma. We must choose between scrapping functioning equipment, capturing the carbon pollution, deploying expensive negative emissions technologies, or abandoning agreed climate goals. Adding new coal makes navigating between the devil and the deep blue sea even harder.”

Dr Ben Caldecott, Director of the Oxford Sustainable Finance Programme at the University of Oxford and co-author of the study, said, “To tackle anthropogenic climate change we need to halt the construction of fossil fuel power generation and immediately begin the dismantling of coal-fired power stations. New coal, let alone existing coal, is entirely incompatible with the Paris Climate Change Agreement.”

Dr Pfeiffer and his colleagues say that even if CCS and negative emissions technologies can be deployed at a large scale, significant asset stranding is still likely to take place.

He added: “Emissions are going to have to decrease rapidly if climate targets are to be met. But the current substantial plans for new fossil-fuel powered generators suggest the risk of asset stranding isn’t being sufficiently considered. This is likely to prove extremely costly in the long run, to both the industry and its investors.”

‘Committed emissions from existing and planned power plants and asset stranding required to meet the Paris Agreement’, Alexander Pfeiffer, Cameron Hepburn, Adrien Vogt-Schilb, and Ben Caldecott, Environmental Research Letters, IOP Science, published 4 May 2018.

Source: Oxford Martin School, University of Oxford

‘Limiting Global Warming to 1.5C Will Help Countries Avoid Climate Impacts on Economic Growth’

A new study conducted by authors at the Oxford Martin School shows that there are significant economic benefits to realising the aims of the Paris Agreement and limiting global temperature rise to 1.5°C.

Dr Felix Pretis from The Institute for New Economic Thinking at the Oxford Martin School, together with Moritz Schwarz, Kevin Tang, Karsten Haustein, and Myles Allen at the University of Oxford, uses a new set of climate projections and empirical estimates of how climate affects economic growth to understand the effect of 1.5°C and 2.0°C warming scenarios on global economies.

Their article, published today in the Philosophical Transactions of the Royal Society, shows that while there is considerable uncertainty and a wide range of possible outcomes, economic growth under 1.5°C warming appears to be indistinguishable from what we see under current climate conditions. However, with an increase to 2°C warming, the new evidence suggests that a large set of countries will experience a significant decrease in economic growth rates.

While the Northern Hemisphere is projected to warm more rapidly than other regions, countries around the Equator are expected to face significant negative growth impacts if global temperatures exceed 1.5°C warming compared to pre-industrial times. “Stabilising the global temperature increase at 1.5°C may prevent significant negative economic impacts,” says Felix Pretis, co-director of Oxford’s Climate Econometrics project. “At 2°C global warming we find a decrease in economic growth of up to 2% per year for countries in the Tropics”.

The study finds that with 2°C warming, the global average GDP per capita is likely to be 5% lower by the end of the century than if temperatures are stabilised at 1.5°C. Co-author, Moritz Schwarz, emphasises that “projected economic losses are greatest in low income countries, suggesting increasing levels of inequality with further climate change”.

The challenges of reducing emissions to achieve the targets set out by the Paris Agreement are numerous; however, the study shows the economic gains may well be worth the pursuit.

Source: Oxford Martin School, University of Oxford

New Report Raises Alarm at Vanishing Wealth of Nature

New research from a team of Oxford economists launched at the World Forum on Natural Capital in Edinburgh today shows that governments’ Ministries of Finance and Treasuries are often blind to how dependent economies are on nature. As a result, businesses and politicians are failing to register the systemic risk building up as the natural world fails.

Professor Cameron Hepburn, who led the research at the Institute for New Economic Thinking at the Oxford Martin School, says that flawed economic and political institutions are to blame. “Much of the value that economies create is built upon a natural foundation – the air, water, food, energy and raw materials that the planet provides. Without this natural capital, no other value is possible.”

The authors highlight that extreme weather, mass extinctions, falling agricultural yields, and toxic air and water are already damaging the global economy, with pollution alone costing $4.6 trillion every year.

“We are poisoning the well from which we drink,” says Oliver Greenfield, convenor of the Green Economy Coalition, which commissioned the research. “The dire state of nature, and the implications for our future, barely registers in economic decision-making. To put this another way, we are building up a big systemic risk to our economies and societies, and just like the financial crisis, most economists currently don’t see it”.

The research finds three central failings are to blame. First, natural capital is not being accurately measured or valued in the context of ecological tipping points and thresholds. Second, aggregate economic models are ill-equipped for seeing the dependencies between ‘capitals’. Most cost-benefit analyses and economic methodologies used in everyday decisions assume that natural capital can be easily substituted by man-made capital, when in fact it cannot. Third, there is a lack of appropriate political and economic institutions to manage natural capital effectively; even national wealth accounts provide an incomplete picture of the value of natural capital.

However, the research finds encouraging signs that our economy can be rapidly rewired to protect the planet. Governments and businesses can start measuring their stocks of natural capital in comprehensive natural wealth accounts, and ensure that those assets are protected and improved. In addition, better data can be gathered on the value of the natural wealth that underpins economic activity, so that value can be accounted for by treasuries and financial centres. And critical natural assets must be given special status so that they cannot be squandered.

“The opportunity to properly value nature is not just a task for economists but for all of us,” Oliver Greenfield added. “The societies and economies that understand their dependency on nature are healthier and more connected, with a brighter future.”

Source: Oxford Martin School, University of Oxford

Could nanotechnology offer a cure for inherited blindness?

An Oxford Martin School visiting fellowship has enabled two academics to kickstart an experimental approach to tackling a form of blindness that affects one in four thousand people worldwide.

Dr Sonia Trigueros, Co-Director of the Oxford Martin School Programme in NanoMedicine, is collaborating with the University of Barcelona’s Professor Gemma Marfany, a human molecular geneticist, to study whether nanotechnology can be used to treat the hereditary retinal disorder retinitis pigmentosa.

Professor Marfany’s month-long visiting fellowship at the Oxford Martin School meant the pair could investigate the possibility of using Dr Trigueros’ cancer treatment method – using carbon nanotubes to deliver drugs – for gene therapy.

Dr Trigueros wraps DNA around nanostructures to protect them. But she realised the method could have other applications, and that the DNA “couldn’t only be structural but that it could also be used for genetic information”.

Their aim is to re-introduce correct genetic information into the retinal cells of retinitis pigmentosa patients, in whom the disease has caused genetic mutations. Because in most cases the gene that causes the disease is recessive, it can be superseded by the new, non-mutated genetic information. “We want to prove that it enters the cell and recovers the function of the cell,” says Professor Marfany.

Retinal cells are neurons, and die if they start to malfunction due to the genetic mutation. “They are very specialised cells,” says Dr Trigueros. “If we manage to do this we are opening up a new way for gene therapy using nanotechnology.”

Current methods of introducing new genetic material include using a virus to carry it, which is introduced using a sub-retinal micro-injection. “We want to see whether we can find an easier and less aggressive method,” says Dr Trigueros.

Professor Marfany says the visiting fellowship has allowed them to grow their knowledge together, “step by step”. “Our questions are relevant for our project but they can be used for many others,” she says. “Our aim isn’t just to treat this disease but to find answers to questions that will be relevant for many other areas of research.”

Being able to work together for a month enabled them to map out the five-year timescale of the project, and to get to know each other’s research better. Dr Trigueros explained: “We realised the importance of seeing someone on a daily basis to build a very risky but potentially high impact project. We needed a month and the funding for the visiting fellowship, very kindly given by Lillian Martin, has enabled us to do this.”

Researchers collaborate with Google DeepMind on artificial intelligence safety

Academics from the Future of Humanity Institute (FHI), part of the Oxford Martin School, are teaming up with Google DeepMind to make artificial intelligence safer.

Stuart Armstrong, the Alexander Tamas Fellow in Artificial Intelligence and Machine Learning at FHI, and Laurent Orseau, of Google DeepMind, will present their research on reinforcement learning agent interruptibility at the UAI 2016 conference in New York City later this month.

Orseau and Armstrong’s research explores a method to ensure that reinforcement learning agents can be safely and repeatedly interrupted by human or automatic overseers. This ensures that the agents do not “learn” about these interruptions, and do not take steps to avoid or manipulate them.

Interruptibility has several advantages as an approach over previous methods of control. As Dr. Armstrong explains, “Interruptibility has applications for many current agents, especially when we need the agent to not learn from specific experiences during training. Many of the naïve ideas for accomplishing this—such as deleting certain histories from the training set—change the behaviour of the agent in unfortunate ways.”

In the paper, the researchers provide a formal definition of safe interruptibility, show that some types of agents already have this property, and show that others can be easily modified to gain it. They also demonstrate that even an ideal agent that tends to the optimal behaviour in any computable environment can be made safely interruptible.
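
The flavour of this result can be illustrated with a toy sketch. It is an illustration of the off-policy intuition rather than the paper’s formal framework, and the corridor environment and all parameters below are invented for the example: a Q-learning agent whose overseer frequently forces it to move left still learns that moving right is optimal, because its value updates do not depend on how its actions were chosen.

```python
import random

# A 1-D corridor with positions 0..4; reaching 4 pays 1.0, reaching 0
# pays 0.2 (both terminal). Optimal behaviour from any interior state
# is to head right, even though the overseer often forces "left".
N = 5
ACTIONS = (-1, +1)    # 0 = left, 1 = right
GAMMA, ALPHA, EPS = 0.9, 0.5, 0.2
INTERRUPT_P = 0.3     # probability the overseer interrupts, forcing "left"

rng = random.Random(0)
Q = [[0.0, 0.0] for _ in range(N)]

for _ in range(5000):
    s = 2
    while s not in (0, N - 1):
        if rng.random() < INTERRUPT_P:
            a = 0                                   # interruption: forced left
        elif rng.random() < EPS:
            a = rng.randrange(2)                    # exploration
        else:
            a = max((0, 1), key=lambda x: Q[s][x])  # greedy choice
        s2 = s + ACTIONS[a]
        r = 1.0 if s2 == N - 1 else (0.2 if s2 == 0 else 0.0)
        target = r if s2 in (0, N - 1) else r + GAMMA * max(Q[s2])
        # Off-policy update: uses the observed transition only, so it is
        # unbiased no matter who picked the action.
        Q[s][a] += ALPHA * (target - Q[s][a])
        s = s2

# Despite frequent forced-left interruptions, the greedy policy heads right
policy = [max((0, 1), key=lambda x: Q[s][x]) for s in range(N)]
```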

Dr Armstrong continued: “Machine learning is one of the most powerful tools for building AI that has ever existed. But applying it to questions of AI motivations is problematic: just as we humans would not willingly change to an alien system of values, any agent has a natural tendency to avoid changing its current values, even if we want to change or tune them.

“Interruptibility, and the related general idea of corrigibility, allow such changes to happen without the agent trying to resist them or force them. The newness of the field of AI safety means that there is relatively little awareness of these problems in the wider machine learning community. As with other areas of AI research, DeepMind remains at the cutting edge of this important subfield.”

New study links high blood pressure to vascular dementia

High blood pressure could significantly raise the risk of developing the second most common form of dementia, according to a new study from The George Institute for Global Health.

The medical records of more than four million people were analysed, with researchers finding that heightened blood pressure was associated with a 62 per cent higher risk of vascular dementia for those aged 30-50.

Lead author Professor Kazem Rahimi, of The George Institute for Global Health in Oxford, which receives support from the Oxford Martin School, said: “Vascular dementia rates are increasing all over the world and will pose a significant economic and social burden in both developed and developing countries. So these results are particularly important.

“We already know that high blood pressure can raise the risk of stroke and heart attack. Our research has shown that high blood pressure is also associated with a significantly higher risk of vascular dementia.”

Key findings:

  • The team at The George Institute analysed the medical records of 4.28 million people in the UK.
  • They found that over a seven-year period, 11,114 people went on to develop vascular dementia.
  • The study found patients aged 30-50 who had high blood pressure had a 62 per cent higher risk of vascular dementia, and a 26 per cent higher risk at age 51-70.
  • The study also found that high blood pressure was still a risk factor even after adjusting for the presence of stroke, the leading cause of vascular dementia.

Professor Rahimi, deputy director of The George Institute UK, said: “Our results suggest that lowering blood pressure, either by exercise, diet or blood pressure lowering drugs, could reduce the risk of vascular dementia.”

Vascular dementia affects around 9.3 million people globally and is caused by reduced blood supply to the brain due to diseased blood vessels.

High blood pressure causes problems by damaging and narrowing the blood vessels in the brain. Over time, this raises the risk of a blood vessel becoming blocked or bursting. It is a known risk factor for stroke and cardiovascular disease, but until now studies have conflicted over the risk for vascular dementia, with several even indicating that low blood pressure was associated with an increased risk of dementia.

Researchers forecast future technological costs, and find good news for solar power

Researchers at the Institute for New Economic Thinking at the Oxford Martin School have used Moore’s law to develop a forecasting model for the unit cost over time of any given technology. In their paper, How predictable is technological progress?, published in Research Policy, they describe how they then ‘went back in time’ to predict the future costs of different technologies from the point of their initial development – and found their forecasts to be reliable.

Professor Doyne Farmer and his colleague Dr François Lafond used historical data from 53 different technologies to test their theory. Using an autocorrelated geometric random walk method of modelling, they “put themselves in the past” and predicted the progress of the technologies, instead of using the usual regression method.

“We put ourselves in the past, pretended we didn’t know the future, and used a simple method to forecast the costs of the technologies,” said Professor Farmer.

“Actually going through the exercise of making out-of-sample forecasts rather than simply doing in-sample regressions has the essential advantage that it fully mimics the process of making forecasts, and allows us to say precisely how well forecasts would have performed.”

They used their model to forecast the path that the price of solar energy technology will follow over the coming decades, and found it likely that solar PV modules will continue to drop in cost at the roughly 10 per cent per year rate that they have in the past.

They also modelled a comparison between solar and a hypothetical competing technology, to estimate the probability that one technology will be cheaper than another at a given time horizon.
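
The forecasting idea can be sketched in a few lines. This is a simplified illustration, not the authors’ exact model (which also handles autocorrelated noise), and the 10 per cent drift, noise level and horizons are assumptions for the example: unit cost follows a random walk in log space with a downward drift, and Monte Carlo paths give a distribution of future costs, including the probability that one technology undercuts another.

```python
import math
import random

def simulate_costs(c0, annual_decline, sigma, years, n_paths=20000, seed=1):
    """Monte Carlo paths of a geometric random walk for unit cost.

    log-cost takes `years` steps, each adding drift log(1 - annual_decline)
    plus Gaussian noise; returns the simulated final-year costs.
    """
    rng = random.Random(seed)
    drift = math.log(1.0 - annual_decline)
    finals = []
    for _ in range(n_paths):
        log_c = math.log(c0)
        for _ in range(years):
            log_c += drift + rng.gauss(0.0, sigma)
        finals.append(math.exp(log_c))
    return finals

# Solar PV module cost indexed to 1.0 today, with the ~10%/yr historical
# decline rate; the median simulated cost in 10 years is near 0.9**10.
solar = simulate_costs(1.0, 0.10, sigma=0.1, years=10)
median_solar = sorted(solar)[len(solar) // 2]

# Probability solar undercuts a hypothetical rival declining at 5%/yr
rival = simulate_costs(1.0, 0.05, sigma=0.1, years=10, seed=2)
p_cheaper = sum(s < r for s, r in zip(solar, rival)) / len(solar)
```

Because the walk is in log space, costs stay positive and percentage declines compound, which is what makes the long historical trend extrapolate so smoothly.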

Extrapolating the growth trend of solar energy so far, they forecast it could supply 20 per cent of the world’s energy by 2027.

“In contrast, the ‘hi-Ren’ (high renewable) scenario of the International Energy Agency, which is presumably based on expert analysis, assumes that PV will generate 16 per cent of total electricity in 2050,” said Professor Farmer. “Thus even in their optimistic forecast they assume PV will take 25 years longer than the historical trend suggests, to hit what is a lower target.”

Professor Farmer and Dr Lafond say that by gathering more historical data, and adding in other variables such as production, R&D and patent activity “there should be considerable room for improving forecasting power”.

Improved forecasting would be a valuable tool for governments, they say, especially as it would provide an objective comparison to expert forecasts, which are vulnerable to bias.

“Sceptics have claimed that solar PV cannot be ramped up quickly enough to play a significant role in combatting global warming.

“In a context where limited resources for technology investment constrain policy makers to focus on a few technologies that have a real chance to eventually achieve and even exceed grid parity, the ability to have improved forecasts and know how accurate they are should prove particularly useful,” Professor Farmer concluded.

Transparency and future-proofing needed on GM insects, say academics

Transparency, future-proofing and improved monitoring are needed if the UK is to pursue the development and use of genetically-modified insects, Oxford Martin School academics have advised.

Genetically modifying insects such as mosquitoes has the potential to curb the spread of deadly diseases such as malaria and dengue, by rendering insects unable to transmit them, and to reduce insect populations to minimise their threat to animals and crops.

Potential modifications include not only gene editing, where genes are inserted into an insect’s DNA in order to alter its function or reduce its fitness, but also gene driving, which causes a gene to spread through a population at a greater rate than would be the case with natural inheritance.

Dr Javier Lezaun, Deputy Director of the Institute for Science, Innovation and Society (InSIS), founded by the Oxford Martin School, and his colleague Christiaan de Koning, were asked by the House of Lords Science and Technology Committee to submit evidence for its inquiry into genetically-modified insects. Professor Michael Bonsall, Principal Investigator on the Oxford Martin Programme on the Future of Food, served as the Specialist Advisor to the committee, which published its report on 17 December.

Launching the report, the committee’s Chairman, John Roundell Palmer, the Earl of Selborne, called for the government to initiate field trials “to put not only the science but, crucially, the regulations to the test”.

He said the UK had “a moral duty” to test the potential of genetically-modified insects, “for the long-term benefit of those countries where diseases like dengue and malaria are indiscriminate killers”.

The committee concluded that:

  • GM insects have considerable potential to control insect-borne disease and agricultural pests, but they are no silver bullet
  • the UK, as a world leader in this area of research, could reap potentially significant economic benefits
  • EU regulation of GMOs is ‘failing lamentably’, and risks squandering these benefits
  • a lack of international guidance on regulation and governance of GM insect technologies could affect the countries that may benefit from these technologies the most.

It gave the following key recommendations:

  • The Government must act to ensure that the current regulatory system is able to work properly, and must commit to working with the EU to address how the system could be improved.
  • The science, EU regulatory environment and policies on GM insects need to be tested. Government departments should work together in order to instigate a GM insect field trial.
  • Alongside the field trials, the Government should initiate a programme of public engagement.
  • The Government, through Innovate UK in partnership with the Research Councils, must support the commercialisation of UK-based GM insect research.
  • The EU needs to rework its regulation to reflect benefits, not just the risks. Given the evolution of new gene-editing techniques, in the long-term trait-based rather than process-based regulation should be explored.

Read the House of Lords Science and Technology Committee report on GM insects
Read the evidence submitted by the Institute for Science, Innovation and Society (InSIS)