Weather Anomalies Accelerate the Melting of Sea Ice

In the winter of 2015/16, something happened that had never before been seen on this scale: at the end of December, temperatures in parts of the Arctic rose above zero degrees Celsius for several days. Temperatures of up to eight degrees Celsius were registered north of Svalbard. Such high temperatures had not been recorded in the winter half of the year since systematic measurements began at the end of the 1970s. As a result of this unusual warmth, the sea ice began to melt.

“We heard about this from the media,” says Heini Wernli, Professor of Atmospheric Dynamics at ETH Zurich. The news aroused his scientific curiosity, and a team led by his then doctoral student Hanin Binder investigated the issue. In November 2017, they published their analysis of this exceptional event in the journal Geophysical Research Letters.

Warm air highway into the Arctic. (Graphic: Sandro Bösch / ETH Zurich)

In it, the researchers show how these unusual temperatures arose: three different air currents met over the North Sea between Scotland and southern Norway, carrying warm air northwards at high speed as though on a “highway” (see illustration).

One air current originated in the Sahara and brought warm, near-surface air with it. Initially, the temperature of this air was about 20 degrees Celsius. Although it cooled on its way to the Arctic, it was still above zero when it arrived. “It’s extremely rare for warm, near-surface subtropical air to be transported as far as the Arctic,” says Binder.

The second air current originated in the Arctic itself, a fact that astonished the scientists. This air was initially very cold. However, the air mass – which also lay close to the ground – moved south along a curved path and, while over the Atlantic, was warmed significantly by the heat flux from the ocean before joining the subtropical air current.

The third warm air current started out as a cold air mass in the upper troposphere, at altitudes above five kilometres. These air masses were carried from west to east and descended within a stationary high-pressure area over Scandinavia. The resulting compression warmed the originally cold air before it entered the “highway to the Arctic”.

Poleward warm air transport

This highway of air currents was made possible by a particular constellation of pressure systems over northern Europe. During the period in question, intense low-pressure systems developed over Iceland while an extremely stable high-pressure area formed over Scandinavia. This created a kind of funnel above the North Sea, between Scotland and southern Norway, which channelled the various air currents and steered them northwards to the Arctic.

This highway lasted approximately a week. The pressure systems then decayed and the Arctic returned to its typical frozen winter state. However, the warm spell sufficed to reduce the thickness of the sea ice in parts of the Arctic by 30 centimetres – during a period in which the ice usually grows thicker and more widespread.
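
A rough energy balance conveys how much heat such an event must deliver. The following back-of-envelope sketch uses textbook values for ice density and latent heat of fusion (illustrative assumptions, not figures from the study) to convert a 30-centimetre loss over one week into an equivalent heat flux:

```python
# Back-of-envelope: heat needed to thin sea ice by 30 cm in one week.
# Constants are textbook values for pure ice; sea ice with brine pockets
# melts slightly more easily, so this is only an order-of-magnitude guide.
RHO_ICE = 917.0    # kg/m^3, density of ice
L_FUSION = 334e3   # J/kg, latent heat of fusion

dh = 0.30                      # m, observed thinning
seconds = 7 * 24 * 3600        # one week

energy_per_m2 = dh * RHO_ICE * L_FUSION   # J per square metre of ice
mean_flux = energy_per_m2 / seconds       # sustained heating rate

print(f"{energy_per_m2 / 1e6:.0f} MJ/m^2, i.e. a mean flux of {mean_flux:.0f} W/m^2")
# -> roughly 90 MJ/m^2, equivalent to ~150 W/m^2 of extra heating for a week
```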

“These weather conditions and their effect on the sea ice were really exceptional,” says Binder. The researchers were not able to identify a direct link to global warming. “We only carried out an analysis of a single event; we didn’t investigate the long-term climate aspects,” emphasises Binder.

High-pressure systems cause sea ice to melt

Arctic sea ice on 26 August 2012: never since satellite observations began had the extent of the ice been as small as on that date. (Image: NASA Goddard Space Flight Center)

However, the melting of Arctic sea ice during summer is a different story. The long-term trend is clear: the minimum extent and thickness of the sea ice in late summer have been shrinking continually since the end of the 1970s. The sea ice melted particularly severely in 2007 and 2012 – a fact that climate researchers have thus far been unable to fully explain. Together with Lukas Papritz from the University of Bergen, Wernli investigated the causes of these outliers. Their study has just been published in the journal Nature Geoscience.

According to their research, the severe melting in the aforementioned years was caused by stable high-pressure systems that formed repeatedly throughout the summer months. Under these cloud-free weather conditions, the high level of direct sunlight – the sun shines 24 hours a day at this time of year – particularly intensified the melting of the sea ice.

Areas of low pressure “inject” air masses into the Arctic

These high-pressure systems developed through an influx of air from temperate latitudes. Low-pressure systems over the North Atlantic and North Pacific, for example, “inject” air masses into the Arctic at a height of about eight kilometres. These injections raise the tropopause, the boundary between the troposphere and the stratosphere, over the affected region. As a result, surface air pressure below rises and a high-pressure system becomes established. Although each such system dissipated again after around ten days, an unusually large amount of sea ice melted in the interim, and the remaining ice thinned.

The climate scientists’ investigation demonstrated that in the summers of 2007 and 2012, during which these high-pressure situations occurred particularly frequently, they led to cloud-free conditions every third day. The high level of solar radiation intensified and accelerated the melting of the sea ice. “The level of solar radiation is the main factor in the melting of the ice in summer. Unlike with the winter anomaly, the ‘injected’ air arriving from the south at about eight kilometres altitude is not warm – at minus 60 degrees it’s ice-cold,” says Wernli. “The air temperature therefore has very little effect on the ice.” Furthermore, the northward transport of warm, humid air masses at the edge of the high-pressure systems reduces the emission of heat radiation from the surface, which further intensifies melting.

Their analysis has allowed the researchers to understand, for the first time, the meteorological processes behind significant variations in summertime ice melt. “Our results underline the fundamental role that weather systems in temperate latitudes play in episodes of particularly intense ice melt in the Arctic,” says the ETH professor.

Source: ETH Zurich

North Sea Water and Recycled Metal Combined to Help Reduce Global Warming

Scientists at the University of York have used sea water collected at Whitby in North Yorkshire and scrap metal to develop a technology that could help capture more than 850 million tonnes of unwanted carbon dioxide from the atmosphere.

Carbon dioxide is a major greenhouse gas and a key driver of global warming. The carbon overload in the atmosphere is mainly the result of burning fossil fuels, such as coal and oil, as well as deforestation.

Global efforts are being made to reduce carbon dioxide levels as well as to find novel ways of trapping excess gas from the atmosphere. The team at York has now found a way to safely trap the gas as dawsonite, a solid mineral and natural component of the Earth’s crust.

Professor Michael North, from the University’s Department of Chemistry, said: “We wanted to look for methods of trapping the gas using environmentally friendly tools to produce a result that could be highly scalable to capture millions of tonnes of unwanted carbon dioxide.

“We started with the realisation that using graphite, the material used in pencils, to line aluminium reactors results in the mineralisation of carbon dioxide. We wanted to trap the gas at much higher levels, using low-energy processes, so we decided to look at waste materials, such as scrap metals, to see if this could be done without using chemical agents as a catalyst.”

Kitchen foil and food wrappings

Researchers filled the aluminium reactor with sea water taken from Whitby Bay and waste aluminium such as that found in kitchen foil or food wrappings. Carbon dioxide is then dissolved into the sea water inside the reactor. Electricity generated by solar panels is passed through the mixture, causing the aluminium to turn the dissolved carbon dioxide into the mineral dawsonite.
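
A feel for the quantities involved can be read off dawsonite’s chemical formula, NaAlCO3(OH)2, which contains one aluminium atom for every carbon atom. The following back-of-envelope sketch assumes that 1:1 ratio and complete conversion; the article itself gives no efficiency figures:

```python
# Rough mass balance for mineralising CO2 as dawsonite, NaAlCO3(OH)2.
# Assumes the 1:1 Al:C molar ratio implied by the formula and 100%
# conversion; real process yields are not given in the article.
M_CO2 = 44.01         # g/mol
M_AL = 26.98          # g/mol
M_DAWSONITE = 144.0   # g/mol

co2 = 1.0  # tonnes of CO2 to mineralise
al_needed = co2 * M_AL / M_CO2            # tonnes of scrap aluminium
mineral_out = co2 * M_DAWSONITE / M_CO2   # tonnes of solid dawsonite

print(f"{al_needed:.2f} t Al in, {mineral_out:.2f} t dawsonite out per t CO2")
# -> roughly 0.6 t of aluminium per tonne of CO2, locked away as ~3.3 t of mineral
```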

Professor North said: “Tens of millions of tonnes of waste aluminium are not recycled each year, so why not put this to better use to improve our environment?  The aluminium in this process can also be replaced by iron, another product that goes to waste in the millions of tonnes. Using two of the most abundant metals in the Earth’s crust means this process is highly sustainable.”

The research showed that 850 million tonnes of carbon dioxide could be mineralised each year using a combination of sea water, solar-powered electricity and scrap metal, eliminating the need for high-energy gas pressurisation and toxic chemicals to achieve the same effect.

Hydrogen by-product

Unlike other electrochemical systems for treating carbon dioxide, this process does not require hydrogen to drive the reaction in the first instance, which would normally make the process more expensive.

Instead, hydrogen is produced by the electrical circuit as a side-product of the process. Hydrogen is a non-polluting gas that is valuable for future low-cost, ‘zero-emission’ fuel production.

Researchers are now working to maximise the energy efficiency of the process and allow the hydrogen by-product to be collected and utilised, before seeking to build toward full-scale production.

This work is published in the journal ChemSusChem.

Source: University of York

Under the Sea: Ensuring the Safety of Offshore Carbon Storage

Faced with increasing levels of carbon dioxide in our atmosphere and oceans, scientists are developing on- and offshore carbon capture and storage (CCS) systems as a potential solution to the problem. Further research is still needed, however, to ensure the safety of new technologies that capture and permanently store industrial emissions in the seabed.

This process is the subject of a multi-disciplinary EU-funded project that looks at new approaches, methodologies and tools to ensure the safe operation of offshore CCS sites – often depleted oil and gas reservoirs that are no longer economically viable. The STEMM-CCS project aims to develop approaches that will identify appropriate marine storage sites and monitor them effectively, increasing public confidence in CCS as a viable option for reducing CO2 in the atmosphere and oceans.

The research team, led by scientists in the UK, has published a paper in the ‘Journal of Geophysical Research: Oceans’ outlining cost-effective ways of locating the source of gas that leaks from storage sites and seeps through the ocean floor. Such leaks may be harmful to humans and the marine environment, but monitoring operations can be costly.

In the paper, Guttorm Alendal from the University of Bergen combines Bayes’ theorem with ‘footprint predictions’ to suggest three strategies for planning the search paths of sensor-carrying autonomous underwater vehicles.

Tracing leaks quickly, efficiently and autonomously

Tracing the source of a leak depends to a large extent on local oceanic and atmospheric conditions. Environmental shifts such as changes in fauna or elevated concentrations of dissolved gases can be used as indicators of marine gas releases, but variability in ocean dynamics – local topography and varying current directions caused by tidal variations, for example – creates challenges for autonomous vehicles.

Such vehicles are capable of taking measurements on the spot to home in on the source of a gas leak: each measurement updates the vehicle’s probability field and informs its decision about which part of the designated search area to examine next. A probability map built on Bayes’ theorem can steer the vehicle along the most cost-effective path to the origin of the leak.
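
As a concrete illustration of this idea, the sketch below runs a toy Bayesian search on a grid: each (simulated) sensor reading reweights the probability map, and the vehicle picks its next sampling cell from that map. The detection model, its parameters and the greedy sampling strategy are invented for illustration and are not taken from Alendal’s paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20                                    # search area: N x N grid cells
posterior = np.full((N, N), 1.0 / N**2)   # uniform prior over source location
source = (12, 7)                          # true leak cell, unknown to the vehicle

def p_detect(dist, scale=3.0):
    """Chance of registering the plume at a given distance from the source.
    A simple distance-decay stand-in for a real footprint prediction."""
    return np.exp(-dist / scale)

xs, ys = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
cell = (0, 0)                             # vehicle's first sampling position
for _ in range(60):
    # Simulate one sensor reading at the current cell
    d_true = np.hypot(cell[0] - source[0], cell[1] - source[1])
    detected = rng.random() < p_detect(d_true)
    # Bayes update: P(source | data) is proportional to P(data | source) * P(source)
    d_all = np.hypot(xs - cell[0], ys - cell[1])
    likelihood = p_detect(d_all) if detected else 1.0 - p_detect(d_all)
    posterior *= likelihood
    posterior /= posterior.sum()
    # One possible strategy: sample the next cell from the current map
    nxt = np.unravel_index(rng.choice(N * N, p=posterior.ravel()), (N, N))
    cell = (int(nxt[0]), int(nxt[1]))

print("most probable source cell:", np.unravel_index(posterior.argmax(), (N, N)))
```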

STEMM-CCS (Strategies for Environmental Monitoring of Marine Carbon Capture and Storage) will carry out a number of research cruises at a potential CCS site in the North Sea, where researchers will release CO2 beneath the seabed and track its path to the seafloor and into the water column. Using a combination of existing technology and their own new sensors and techniques to examine baseline conditions, sub-seafloor structures and fluid pathways, the team aims to generate a substantial amount of knowledge to support recommendations for future best practice.

Source: CORDIS

Fisheries Physicist Boils Theory down to Five Words

It is not often that a doctoral dissertation can be boiled down to five words, but Ken H. Andersen’s case is an exception. The five words are: ‘Big fish eat small fish’—which has an intuitive ring to it. However, the five words actually reflect a break with the way in which fish stocks are traditionally investigated.

“The earliest attempts to design an optimal fishing management system date back 70 years, to when UK researchers showed how fishing affected plaice and cod of different ages in the North Sea. Their research showed that it was most advantageous to catch the oldest fish and let the younger ones live. Similar models have subsequently been devised for other species of fish. Generally speaking, this is the same approach employed to this day,” says Ken Haste Andersen, Professor at DTU Aqua:

“A given species is typically described using a range of parameters—e.g. size, growth rate, maturity, and its risk of being eaten. But if all these parameters have to be in place, it becomes extremely difficult to assess the consequences of various types of fishing,” he says.

Hidden reality
Add to this real-life conditions at sea—e.g. that fish also eat each other—and you end up with a highly complex picture of an entire ecosystem. So complex, in fact, that it can be too hard to focus on the area you are truly interested in knowing more about.

For the past ten years, Ken Haste Andersen—who has a background as a theoretical physicist—has therefore worked on creating models that can describe what is actually going on below the sea surface. It is all about obtaining data about fish stock numbers, permissible fishing quotas, and ultimately how much money society can earn from the fishing industry.

“What ultimately defines what you eat and who you are eaten by is your own size.”

Professor Ken Haste Andersen, DTU Aqua

“Fishing for cod, for example, will affect sand eel numbers, which will increase as there are fewer cod to eat them. The whole ecosystem is inextricably linked—an action in one area affects the entire ecosystem. Of course, researchers are aware of this and have tried to improve the old models by grouping populations, but these efforts are little more than stop-gap solutions,” says Ken Haste Andersen.

In his doctoral dissertation, he therefore proposes new models which, among other things, mean talking about the size of the fish rather than about different species such as cod and eel.

“What ultimately defines what you eat and who you are eaten by is your own size,” says Ken Haste Andersen.

Information overkill
‘Big fish eat small fish’ reflects a break with the current models’ focus on species and their many characteristics.

“It means throwing away some information. Conversely, I can say something about a less complex part of the ecosystem—e.g. size—with greater certainty. It’s enough for me to know that I’m dealing with a fish that weighs two pounds, for example. Based on this, I can say a lot about the other parameters, and then make an impact assessment of different types of fishing activity and their effect on the other fish stocks.”
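
In code, this size-based shortcut might look like the toy calculation below. The predator-to-prey mass ratio of about 100 is a typical order of magnitude in size-spectrum models, assumed here purely for illustration:

```python
# Toy size-based inference: given only a fish's body mass, estimate the
# size of the prey it prefers. BETA is an assumed predator:prey mass
# ratio (an order of magnitude typical of size-spectrum models).
BETA = 100.0

def preferred_prey_mass(predator_mass_g: float) -> float:
    """Centre of the prey-size preference for a predator of given mass."""
    return predator_mass_g / BETA

fish_g = 1000.0  # a fish of about a kilogram ("two pounds", roughly)
print(f"A {fish_g:.0f} g fish prefers prey of around {preferred_prey_mass(fish_g):.0f} g")
# -> about 10 g: 'big fish eat small fish', made quantitative
```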

According to Ken Haste Andersen, the approach will pave the way for improved fishing management in places where—unlike Denmark—there are insufficient fish stock data.

“We need methods for managing stocks we don’t know much about. But if we have to examine each individual species and the age of the fish, it will take a very long time and be very expensive. All things considered, it is much easier to say something about the size of the individual species. The nature of the species itself is less important. For no matter where we look, we know that big fish eat small fish.”

Source: Technical University of Denmark

Oil and Gas Wells as a Strong Source of Greenhouse Gases

Boreholes in the North Sea could constitute a significantly more important source of methane, a strong greenhouse gas, than previously thought. This is shown by a new study by the GEOMAR Helmholtz Centre for Ocean Research Kiel, recently published in the international peer-reviewed journal Environmental Science & Technology. As the authors state, large amounts of methane are released from the sediments surrounding boreholes, probably over long periods of time.

Methane gas leakage near a well. Photo: ROV KIEL6000, GEOMAR.

The pictures went around the world. In April 2010, huge amounts of methane gas escaped from a well below the Deepwater Horizon platform in the Gulf of Mexico. This “blow-out” caused an explosion, in which eleven people died. For several weeks, oil spilled from the damaged well into the ocean. Fortunately, such catastrophic “blow-outs” are rather rare. Continuous discharges of smaller amounts of gas from active or old and abandoned wells occur more frequently.

Bacterial mats degrading methane on the seabed. Photo: ROV KIEL6000, GEOMAR.

Scientists from the GEOMAR Helmholtz Centre for Ocean Research Kiel and the University of Basel have now published new data in the international journal Environmental Science & Technology indicating that gas migration along the outside of wells could be a much bigger problem than previously assumed. This type of leakage is currently considered by neither operators nor regulators, but could be just as important as fugitive emissions through damaged wells, which are usually recognized and quickly repaired. “We estimate that gas leakage around boreholes could constitute one of the main sources of methane in the North Sea,” says Dr. Lisa Vielstädte from GEOMAR, the first author of the study.

During expeditions to oil and gas fields in the central North Sea in 2012 and 2013, the scientists discovered a number of methane seeps around abandoned wells. Interestingly, the gas originates from shallow gas pockets buried less than 1,000 meters below the seabed, which are simply drilled through on the way to the deeper, economically interesting hydrocarbon reservoirs. “These gas pockets usually do not pose a risk to the drilling operation itself. But apparently disturbing the sediment around the well enables the gas to rise to the seafloor,” explains Dr. Matthias Haeckel from GEOMAR, who initiated the study.

Seismic data from the subsurface of the North Sea further show that about one third of the boreholes perforated shallow gas pockets and may thus leak methane. “Considering the more than 11,000 wells that have been drilled in the North Sea, this results in a fairly large number of potential methane sources,” states Dr. Vielstädte, who is currently based at Stanford University in California, USA.

According to the team’s calculations, shallow gas migration along wells may release around 3,000 to 17,000 tonnes of methane from the North Sea seafloor per year. “This would represent a significant contribution to the North Sea methane budget,” emphasizes Dr. Haeckel.
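
Simple arithmetic puts these totals into per-well terms. The split below is a back-of-envelope illustration based on the reported figures; the study itself gives only the aggregate range:

```python
# Back-of-envelope: methane release per leaking well implied by the
# reported totals. Illustrative only; the study reports the aggregate.
total_wells = 11_000
leaking_fraction = 1 / 3           # share of wells piercing shallow gas pockets
low, high = 3_000.0, 17_000.0      # t CH4 per year, total North Sea estimate

leaking_wells = total_wells * leaking_fraction
print(f"~{low / leaking_wells:.1f} to {high / leaking_wells:.1f} "
      f"t CH4 per leaking well per year")
# -> roughly 0.8 to 4.6 tonnes of methane per leaking well per year
```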

In the ocean, methane is usually degraded by microbes, a process that locally acidifies the seawater. In the North Sea, however, about half of the wells are located in water shallow enough that the methane leaking from the seabed can reach the atmosphere, where it acts as a potent greenhouse gas – far more effective than carbon dioxide.

“Natural gas, i.e. methane, is often praised as the fossil fuel best suited for the transition from coal burning towards renewable energies. However, if drilling for gas leads to such high atmospheric methane emissions, we have to rethink the greenhouse gas budget of natural gas,” summarizes Dr. Haeckel.

In order to better quantify the human impact on the methane budget of the North Sea, Kiel’s research vessel POSEIDON will investigate further gas seeps in the vicinity of oil and gas wells in October.

Source: GEOMAR Helmholtz Centre for Ocean Research Kiel

Unbalanced Wind Farm Planning Exacerbates Fluctuations

The expansion of renewable energy has been widely criticised for increasing weather-dependent fluctuations in European electricity generation. A new study shows that this is due less to the variability of the weather than to a failure to consider large-scale weather conditions across the whole continent: many European countries are unilaterally pursuing national strategies to expand their wind energy capacities without looking beyond their own backyard.

It would be better, however, for individual countries to work together and to promote the expansion of wind capacity in other European regions that are currently making very little use of wind power. Balancing capacity across the continent would effectively minimise the extreme fluctuations caused by the varied weather conditions that currently affect wind speeds. This is the conclusion reached by a group of weather and energy researchers from ETH Zürich and Imperial College London in a new study, which has just been published in the journal Nature Climate Change.

Combining weather data and production capacities

The researchers conducted their study by combining Europe-wide data on large-scale weather conditions from the past 30 years with wind and solar electricity production data. This made use of the Renewables.ninja platform, developed at ETH Zürich, for simulating the output of Europe’s wind and solar farms based on historical weather data. This open simulation tool is available for anyone to use worldwide, as part of the effort to improve the transparency and openness of science.

The researchers used this data to model how wind power is related to seven prevailing “weather regimes” in Europe and how it will change with the further expansion of wind energy capacity. These weather regimes explain why European wind electricity generation suffers from fluctuations lasting several days.

Some regimes are characterised by cyclones rolling in from the Atlantic that bring high winds to western Europe, accompanied by concurrently calm conditions in the east. In other regimes, calmer weather arrives from the Atlantic while wind speeds consistently increase in southern Europe and northern Scandinavia.

“There is hardly a weather situation in which there is no wind across the entire continent and thus all of Europe would lack wind power potential,” explains Christian Grams, lead author of the study from the Institute for Atmospheric and Climate Science at ETH Zurich.

However, today’s wind farms are distributed irregularly across Europe, mostly in countries bordering the North Sea. This results in uneven wind electricity generation, because most capacity is installed in neighbouring countries with similar weather conditions. This means that if a stable high-pressure system causes a lull for a few days or even weeks over the North Sea, as happened in the winter of 2016/17, Europe-wide wind electricity generation drops dramatically.

Cooperation would compensate for fluctuations

The problem for Europe’s power system will be exacerbated by countries following their own national strategies for expanding wind power, which will further concentrate capacity in the North Sea region. This will lead to even more extreme fluctuations: the difference between high production in favourable wind conditions and low production during a lull could be as much as 100 gigawatts – roughly the same capacity as 100 nuclear power plants – and would have to be made available or held back within the course of only a few days.

If European countries were to cooperate and set up future wind farms based on an understanding of continent-scale weather regimes, fluctuations in future wind energy could be stabilised at the current level of around 20 gigawatts. The Balkans, Greece, the western Mediterranean, and northern Scandinavia are all potential sites.

These locations would all have enough wind if, for example, high pressure led to a lull in the North Sea. Likewise, if a stable high-pressure area slowed wind production in the Mediterranean, the wind farms around the North Sea would produce enough electricity. “This is why wind capacity in countries such as Greece or Bulgaria could act as a valuable counterbalance to Europe’s current wind farms. However, this would require a paradigm shift in the planning strategies of countries with wind power potential,” emphasises co-author Iain Staffell from Imperial College London.
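
The statistical intuition behind this counterbalancing can be demonstrated with a toy simulation: two regions whose output responds in opposite ways to the same weather regime produce a much steadier combined output than the same capacity concentrated in one region. All numbers below are invented for illustration and do not come from the study:

```python
import numpy as np

# Toy model: anti-correlated regions damp total wind fluctuations.
rng = np.random.default_rng(1)
days = 1000
regime = rng.standard_normal(days)       # continent-scale weather signal

# Two regional fleets with the same mean output (GW) but opposite
# responses to the weather regime, plus some local noise.
north_sea = 10 + 8 * regime + rng.standard_normal(days)
balkans   = 10 - 8 * regime + rng.standard_normal(days)

concentrated = 2 * north_sea             # all capacity in one weather regime
balanced = north_sea + balkans           # capacity split across regimes

print(f"std of concentrated fleet: {concentrated.std():5.1f} GW")
print(f"std of balanced fleet:     {balanced.std():5.1f} GW")
# Same mean output, but the balanced fleet fluctuates far less.
```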

Electricity storage not feasible

The authors say that it would be difficult to store electricity for several days to balance these multi-day fluctuations – with batteries or pumped-storage lakes in the Alps, for example – since the necessary amount of storage capacity will not be available in the foreseeable future. Current storage technologies are more suited to compensating for shorter fluctuations of a few hours or days.

Moreover, a wider geographical distribution of wind farms also requires the expansion of the transmission grid. However, such a pan-European renewable energy system could still provide Switzerland with the opportunity to use its hydropower capacities more economically in order to compensate for short-term fluctuations.

Political will and network expansion needed

Using solar energy to compensate for gaps over several days would only work on a regional level at best. The researchers say that in order to compensate for fluctuations across Europe, solar energy capacity would have to be increased tenfold.

“The sun often shines when it’s calm,” explains co-author Stefan Pfenninger, from the Institute for Environmental Decisions at ETH Zürich, “but in winter, there is often not enough sunshine in central and northern Europe to produce sufficient electricity using solar panels.” It would therefore make little sense to compensate for fluctuations in wind energy with a massive expansion of solar capacity.

The researchers now hope that energy producers and network operators, as well as governments and politicians, will take note of these new findings and better coordinate Europe-wide planning and grid expansion.

Source: ETH Zürich

Seaweed as Sustainable Food for People and Animals

Seaweed is on the rise as a sustainable source of protein for people and animals alike. In the four-year Social Innovation Programme ‘Seaweed for Food and Feed’, Wageningen University & Research and the North Sea Farm foundation are partnering with industry to develop a comprehensive and sustainable seaweed sector. The work involves multifunctional seaweed farms in the North Sea linked to a land-based chain for logistics, processing and sales to the food industry. Dutch State Secretary of Economic Affairs Martijn van Dam has pledged to invest five million euros in Seaweed for Food and Feed.

Seaweed is a nutritious and versatile crop that is increasingly important as a healthy and sustainable protein source for people and animals. Seaweed production does not require agricultural land or fresh water, and all its biomass can be used. The purpose of Seaweed for Food and Feed is to create a sustainable source of healthy food products, additives and feed by means of cultivation in the Dutch waters.

Large-scale seaweed production

Large-scale seaweed production for human and animal food in Western Europe is yet to become profitable. The innovations from Seaweed for Food and Feed should result in a higher yield for seaweed and a reduction of the production costs via measures such as the utilisation of all seaweed components, a circular approach, local production and a chain approach. The new knowledge will be used for the production of new thickening agents, special (low-calorie) sugars for food and feed, and ingredients for aromatics, colourants and flavourings.

Maximise seaweed value

Seaweed for Food and Feed applies various lines of knowledge and innovations to maximise the value of seaweed:

• The selection and breeding of high-quality varieties
• Optimal cultivation systems and optimal locations (to ensure the composition of the seaweed is as favourable as possible)
• Multiple processing of seaweed to ensure all components are optimally used at the highest possible economic value
• Making seaweed products more appealing to consumers by developing attractive products and accessible information
• High-quality applications, such as the utilisation of nutritional value, reducing the use of antibiotics and applications for the pharmaceutical and cosmetics industries.

Wageningen University & Research has been carrying out research into issues such as the applications of seaweed, the economic feasibility of the sector and consumer acceptance for many years. The Seaweed for Food and Feed programme is aimed at developing a comprehensive and sustainable seaweed sector in the Netherlands.

Finding Our Sea Legs on the Southern Ocean

After two field campaigns in Switzerland, I now have the chance to take part in a large-scale research expedition – a unique opportunity for a young weather scientist. The expedition aims to circumnavigate Antarctica – an ambitious project (see box). For me, it’s also the realisation of a dream. The anticipation I felt, as well as the team’s meticulous preparations, reflected the scale of the occasion.

A closer look at water cycles

The team consists of ten scientists from ETH, EPFL and the University of Bergen (Norway). Iris Thurnherr and I will be the only ones on board, though. We are both doctoral students in ETH’s Atmospheric Dynamics group and are in charge of carrying out the measurements on board the ship. Our colleagues will be supporting us from home with weather forecasts, advice and tips wherever possible.

The aim of our research is to investigate the interaction between sea and atmosphere in the Southern Ocean around the Antarctic. To achieve this, we will be using two laser spectrometers, a small rain radar, weather balloons and vials to collect rainwater. The laser spectrometers, which we will be using to measure atmospheric water vapour, form the heart of the project.

Starting out is always tough

As I boarded the ship in Bremerhaven, I was full of expectation and enthusiasm. Unfortunately, it wasn’t long before something put a damper on our high spirits: Iris and I had hardly set foot on deck when we discovered that the container in which we had installed our measuring equipment wouldn’t have power at least until we reached Cape Town. The reason for this, unbelievably, was an insufficient power supply on board.

We had spent months meticulously planning for everything that could possibly go wrong with the measuring equipment, organising replacement parts and planning for every eventuality – only for everything to come unstuck because of something so simple. Something we actually couldn’t prepare for.

Troubleshooting in Bremerhaven

I felt at a loss, and somewhat disheartened at the prospect of not being able to take measurements during the preparatory journey to Cape Town, missing out on the exciting cross-section of climatic zones from Europe, through the tropics and onto South Africa. But we didn’t let it get us down for too long. Instead, we headed out in search of an alternative location for our measuring instruments. There was no time to lose. The ship was due to cast off in less than two days, and storm Uwe, which was forecast to meet us in the North Sea, would blast everything from the deck that wasn’t bolted down.

Eventually, we found what we were looking for: there was a space on the top deck of the ship in a small radar control room. We immediately removed the frame with our measuring instruments from the container and transported it by crane to its new location. Seeing all our measuring equipment swaying through the air made us feel a little queasy, but thankfully everything went well, and we were able to fix the equipment down in its new location. In the last few minutes, just as the ship was setting sail, we got the devices running and began our measurements.

From the stormy North Sea to the tropics

The first few days on board were marked by strong winds and high waves. In the North Sea, we were caught by the first true storm of the coming winter and, after a brief respite, a low-pressure area pursued us along the west coast of Spain and Portugal. The measurements proved to be similarly challenging, since access to the deck was often prohibited on account of the high waves.

On top of that, we were also still waiting for permission to launch our weather balloons. Curiously, these hadn’t made it onto the list of authorised experiments. The balloons, which are equipped with measuring devices for temperature, humidity and atmospheric pressure, are an important part of our project. One of our first challenges was to convey this to the almost exclusively Russian-speaking crew. On the tenth day after setting out, the last piece of the puzzle finally fell into place: we received permission to launch, and the weather was perfect for a first test. It was a clear, beautiful and almost windless day, and the sea was as glassy and still as Lake Zurich.

Improvisational skills required

Good measuring data is an essential part of successful research. Getting hold of it, though, is far from straightforward, and requires a great deal of effort, patience, and a high tolerance for frustration. It’s important not to pitch your expectations too high to avoid disappointment. My experience from previous field campaigns has been confirmed on the Akademik Treshnikov: many things can and will go wrong, regardless of how well you prepare yourself. The important thing is to keep a cool head and find a workable alternative using the means available to you, as well as a healthy dose of improvisation.

New Simulations of Wind Power Generation

There has been a massive boom in wind power capacity both in Europe and worldwide. In 2015, global installed capacity was around 350 gigawatts (GW), of which 135 GW was installed in Europe, distributed across some 87,000 wind turbines. Wind power now provides a bigger share of electricity (13 percent) than nuclear power stations. In countries such as Spain, Denmark and Germany, the amount of wind power already installed is in theory enough to cover nationwide demand for electricity under ideal conditions, i.e. maximum wind power output and low consumer demand.

Inconsistent output

However, the amount of installed capacity says very little about how much electricity a country’s wind fleet actually feeds into the national grid. Unlike nuclear power, wind is inherently variable and harder to predict, which makes it difficult to integrate wind farms into existing power grids.

Both energy researchers and providers therefore need to simulate electricity production across very short time intervals to accurately predict how high the load could be at any given point in time.

Recently, researchers have started performing such simulations with the help of “reanalysis” models: global meteorological models fed with measurements from weather stations, satellites and other sources, which are processed into a coherent worldwide simulation of atmospheric conditions.

Critical review of weather models

However, there is one major drawback with data from reanalyses: meteorological models simplify the real world and do not provide an adequately detailed simulation of factors that are important for wind power, such as the topography around a wind farm. So if data from reanalysis models are used to simulate wind power production without further correction, the models are liable to produce a systematically distorted picture. Despite this, a number of studies on wind power generation based on uncorrected data have been published.

This inspired energy researcher Stefan Pfenninger from ETH Zurich and his colleague Iain Staffell from Imperial College London to create a large database of recorded electricity output from wind farms across Europe, along with country-wide production data reported by transmission network operators, and to use that database to derive correction factors for each European country. They then used their Virtual Wind Farm Model (VWF) to simulate wind power production in Europe over the course of 20 years.
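
A minimal version of such a bias correction might look like the sketch below: compute the capacity factor (mean output divided by installed capacity) of the uncorrected simulation, compare it with the observed value, and rescale the simulated time series. The numbers and the simple linear scaling are illustrative assumptions, not the published VWF methodology:

```python
import numpy as np

def capacity_factor(output_mw: np.ndarray, installed_mw: float) -> float:
    """Mean output as a fraction of installed capacity."""
    return float(output_mw.mean() / installed_mw)

installed = 1000.0   # MW, a hypothetical national wind fleet
# One year of hourly simulated output (placeholder for reanalysis-driven data)
simulated = np.random.default_rng(2).uniform(0.0, 900.0, 8760)

cf_sim = capacity_factor(simulated, installed)
cf_obs = 0.242       # e.g. the reported European average of 24.2%

# Scale the simulation so its long-run capacity factor matches observations,
# capping at installed capacity.
corrected = np.minimum(simulated * (cf_obs / cf_sim), installed)

print(f"capacity factor: simulated {cf_sim:.3f} -> corrected "
      f"{capacity_factor(corrected, installed):.3f}")
```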

Fresh simulation of output

By adopting a rigorous approach, the two researchers have managed to create a more realistic picture of wind energy output in Europe. Their corrected simulations show that the uncorrected simulations used in other studies have overestimated wind power output in north-western Europe by up to 50 percent, while underestimating it by as much as 30 percent in southern Europe.

The researchers also recalculated the capacity factors for Europe: the current European average is 24.2 percent, compared with 32.4 percent in the UK and 19.5 percent in Germany. The European average only varies by a few percent from one year to the next. “This fluctuation is much less than the deviation observed in individual countries”, says Pfenninger. “The bigger the wind fleet and the wider the geographical footprint, the smaller the fluctuations on the supply side”. It is therefore important for national grids to be interconnected more efficiently so as to be able to offset power outages in one region with surplus output in another country.

The simulation also shows that capacity factors are improving, partly thanks to technological advances and better offshore locations. Britain’s wind parks are now 25 percent more productive than they were 10 years ago.

North Sea countries in expansion mode

Given the current state of planning, Pfenninger and Staffell predict that the average capacity factor for Europe could rise by a third, to more than 31 percent. “Countries adjoining the North Sea should experience particularly strong growth in the near future”, says Pfenninger. The UK could achieve a capacity factor of almost 40 percent, and Germany close to 30 percent.

But in order for planners, network operators, utility companies and other scientists to be able to continue using the simulations developed by the energy researchers, Pfenninger and Staffell have devised an interactive web application, www.renewables.ninja, where the European data sets are also available for download. The web platform also gives access to data from a companion study, published at the same time, that simulates Europe’s photovoltaic power output. Pfenninger and Staffell have been beta-testing Renewables.ninja for six months and now have users from 54 institutions across 22 countries, including the International Energy Agency and IRENA.