Wednesday, 11 December 2013

The Surge 2013

by Chris Skinner (@cloudskinner)

This blog post is going to talk about the storm surge that swept along the east coast of the UK on the 5th December 2013, last week. Rather ironically, I was going to post about problems in predicting disasters and how we mitigate their effects, but this seems more topical and worthy of a post, and I hope to give you a bit of an insight into how a GEES researcher responds to live events relevant to their field.

The surge seemed to catch everyone by surprise. I checked the forecast on the Monday as my in-laws were travelling to Hull from London to visit us on the 5th, and the Met Office app suggested Thursday was going to be quite nice, but a bit windy in the far North-East. The forecast did evolve over the week, but not enough to suggest the conditions that resulted in them passing at least five overturned lorries on their journey (two lorries and a van on the Ouse Bridge alone).

On Thursday afternoon, there were warnings of a storm surge – a temporary increase in sea level caused by low pressure and high winds – that would potentially flood coastal towns on the east coast. Our local news focussed on Grimsby and Cleethorpes as the most likely to be hard hit; Hull was just at medium risk. Prof Coulthard (my boss) and I watched the tide rise on the Immingham tidal gauge and compared it to the data we held from the same site during the 1953 storm surge.

The 1953 storm is THE storm when talking about storm surges in the UK. It was big, it caused extensive damage, and over 300 people lost their lives. This storm was being billed as ‘the worst since 1953’, yet to our astonishment we watched the tidal gauge climb, looking increasingly likely to exceed the level recorded back then.

As we were leaving the office, around five thirty, the first warnings started coming in that Victoria Dock in Hull was at risk. I followed the story unfolding on Twitter as photos popped up showing the first signs of water overtopping the defences. The Marina also flooded, and water spilled out into Kingston Retail Park and the Ice Arena, home of the Hull Stingrays and Hull’s most versatile venue.


Flooding between the Marina and Kingston Retail Park in Hull. Photo by @estuary_ecology


The City of Hull held its breath as high tide approached. Only the tidal barrier stood between the surging sea and thousands of properties in the flood plain of the River Hull behind it. The tide crept ever upwards, lapping at the sides of the mighty barrier, but it could not overcome it. It was close, though – only 40cm of the barrier’s capacity remained, on a structure built to defend the city after the 1953 surge. It had done its job, just. The tide height of 5.8m is a record high for Hull.


The Saviour of Hull! - The Tidal Barrier holds back the tide. Photo by @Tom_Coulthard (This is just one of many great photos).

The sea water eventually receded at Hull. High tide came later in the inner estuary and badly flooded South Ferriby and Goole. The flooding continued further south, in Skegness and Boston. Another great tidal barrier, the Thames Barrier, was also needed to save large areas of East London.

Now that the waters have passed, the data is beginning to be collected and analysed. What seemed to take everyone by surprise was the scale of it. Data from the Immingham gauge stopped when the level reached 8.5m*, but from the curve it looks like it would have continued to around 9.5m – 2m above the predicted astronomical tide (from the pull of the Sun and Moon), and over a metre greater than the highest reading from the 1953 storm surge (at 8.4m).

*I don't know why the gauge stopped; most of them did before high tide that night. My guess is that they either reached the top of their scale, or exceeded a threshold above which readings are assumed to be too high to be accurate – the Immingham gauge stopped at around the maximum of the 1953 tide level.
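
To put those numbers together: the surge component is simply the difference between the observed (or, here, extrapolated) water level and the predicted astronomical tide. A tiny illustrative calculation, using only the approximate figures quoted above rather than any official gauge data:

```python
# Illustrative arithmetic only, using the rough figures quoted in this post
# (not official gauge data). The 'surge residual' is the observed water level
# minus the predicted astronomical tide.

estimated_peak_level_m = 9.5   # extrapolated from the Immingham gauge curve
surge_residual_m = 2.0         # the peak sat roughly this far above the astronomical tide
level_1953_m = 8.4             # highest reading recorded during the 1953 surge

predicted_astronomical_tide_m = estimated_peak_level_m - surge_residual_m
excess_over_1953_m = estimated_peak_level_m - level_1953_m

print(f"Implied astronomical tide: ~{predicted_astronomical_tide_m:.1f} m")
print(f"Surge component: ~{surge_residual_m:.1f} m; excess over 1953 peak: ~{excess_over_1953_m:.1f} m")
```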

This is very significant. I don’t think anyone anticipated it. As I said previously, 1953 was THE storm. For the last few months I have been working on a computer model to simulate the flows in the Humber, with one of the aims being to predict the estuary’s response to 1953-like events, especially in the face of rising sea levels. Much of the Humber’s defences were built after the 1953 surge, so, unsurprisingly, the model showed they coped well. Our hypothesis was that rising sea levels on top of that might cause them some issues, so we wanted to try and model that.

Naturally, at the first chance on Friday we ran our model with the tidal heights recorded the evening before. Our model suggests that if we had been able to predict the scale of the surge we could have anticipated the flooding, even just based on this preliminary data (although a large pinch of salt is needed when interpreting the simulation below).
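
Our actual model isn't reproduced here, but the basic idea of driving an inundation estimate with an observed tidal height can be sketched very crudely. The snippet below is a simple 'bathtub with connectivity' check on a made-up elevation grid – nothing like the full flow model, and every number in it is invented purely for illustration:

```python
import numpy as np
from collections import deque

# A very crude 'bathtub with connectivity' sketch: flag grid cells that sit
# below a given water level AND are connected to the estuary. This is NOT the
# research model described above; the terrain and numbers are invented.

def bathtub_inundation(dem, water_level, seed_cells):
    """Return a boolean grid of cells flooded at this water level."""
    rows, cols = dem.shape
    flooded = np.zeros(dem.shape, dtype=bool)
    queue = deque(seed_cells)               # cells on the estuary boundary
    while queue:
        r, c = queue.popleft()
        if not (0 <= r < rows and 0 <= c < cols):
            continue
        if flooded[r, c] or dem[r, c] > water_level:
            continue
        flooded[r, c] = True
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return flooded

# Tiny made-up terrain (metres above datum) with a 'defence' ridge down the middle.
dem = np.array([[0.0, 0.5, 6.0, 2.0, 2.5],
                [0.2, 0.8, 6.1, 2.2, 2.8],
                [0.1, 0.6, 5.9, 1.8, 2.4]])

# At a 5.8 m tide the ridge holds; raise the level above it and the far side floods too.
for tide in (5.8, 6.2):
    wet = bathtub_inundation(dem, tide, seed_cells=[(0, 0), (1, 0), (2, 0)])
    print(f"tide {tide} m -> {wet.sum()} flooded cells")
```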



As bad as the flooding was, it has to be said that our infrastructure did a fantastic job. The scale of this surge was unprecedented, quite a bit bigger than 1953, yet there was not the devastation, nor, thankfully, the loss of life that followed that storm. If it were not for structures like the Hull Tidal Barrier, it would have been much, much worse.

And that leaves us with a warning. The Intergovernmental Panel on Climate Change (IPCC) uses different models to try and predict sea level rise over the next 100 years, and its 'Best Case Scenario' – where greenhouse gas emissions are cut immediately – would still likely mean a sea level rise of around 40cm. That is the capacity left over on the Hull Tidal Barrier. When we consider that an increase of 60-80cm is probably a better estimate, the ability of our infrastructure to manage this size of event in the future needs to be considered. It may be that this storm surge is an event that won't be repeated in our lifetimes, but it now stands as THE storm we’ll be using to measure future resilience, and it pushed us right to the edge.
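
As a back-of-the-envelope check of that warning, using only the approximate numbers quoted above:

```python
# Back-of-the-envelope only: all figures are the approximate values quoted above.
freeboard_left_m = 0.40          # spare capacity on the Hull Tidal Barrier this time
rise_best_case_m = 0.40          # IPCC best case over the next 100 years
rise_likely_m = (0.60, 0.80)     # a more likely range of sea level rise

print(f"Best case: {freeboard_left_m - rise_best_case_m:+.2f} m spare for a repeat event")
low, high = rise_likely_m
print(f"More likely: {freeboard_left_m - high:+.2f} to {freeboard_left_m - low:+.2f} m spare")
```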

Wednesday, 4 December 2013

What’s in a photograph?

By Lucy Clarke (@DrLucyClarke)

Everyone is familiar with photography. With the rise of digital cameras and the increasingly high-resolution cameras available on mobile phones and tablets, people are photographing everything from their holidays, pets, family and friends to even themselves. I love photography and enjoy capturing images of my travels, but I also use photographs in a very different way as part of my research: rather than just appreciating their aesthetic value, I use photographs to recreate and measure features on the Earth’s surface and in the lab.

In remote areas where it is difficult to access a location, or when looking into the past, photographs can often be the only option available to explore an area. Historic photographs are therefore extremely valuable, as they provide a record of what something looked like at various points in time and so can be used to look at temporal change. This is especially useful if an extreme event occurs in a location that has never been measured before, as you can then look at the impact the event had. An example of this is shown below; the two photographs show the Poerua alluvial fan in New Zealand before and after a big event. In 1999 a large rock avalanche occurred in the headwaters of this system, forming a dam in the gorge below. Water ponded up behind this for two days before the dam finally burst, creating a flood wave that engulfed the area downstream, caused the river to avulse (move to a new location) and deposited large areas of gravel on top of the agricultural land. Using photographs from before and after this event makes it possible to identify the area of land that was affected and the new position of the river channel, and so to assess the damage.


Aerial photographs from 1984 (before) and 2005 (after) the 1999 flood event on the Poerua alluvial fan in New Zealand (Images courtesy of NZ Aerial Mapping Ltd and GeoSmart)

Although it is useful to look at these photographs and see the changes between when they were taken and what is there now, it doesn’t help us to actually measure anything or quantify the change. So in my research I use a technique called photogrammetry, which allows me to process photographs and extract quantitative data from them. In its simplest form, photography converts the 3D real world into a 2D image, and photogrammetry converts this 2D image back into a 3D representation – using information on the type of camera and lens used to take the image and the relationship between the camera and the ground at the time that the image was captured. This requires two overlapping images of the same place which are viewed at the same time as a single 3D image, in what is known as a stereo-image. Traditionally, this was done using a stereoscope (which uses mirrors and viewing lenses to fuse the two images together when you look at them – like a magic eye picture), but in modern digital photogrammetry it is done using specialist software on a computer with a 3D screen and glasses – like when you watch a 3D film at the cinema. In the digital workflow the images are adjusted according to the camera parameters and georeferenced using the coordinates of known positions on the ground to create a true-scale representation. A digital elevation model (a 3D map of the surface) can then be extracted and used to measure features; this gives the same results as if you were standing on the ground measuring them.
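
The geometry behind this can be sketched with a very simplified example. For an idealised, rectified pair of overlapping images, the distance from the camera to a point follows from how far the point appears to shift between the two photos (its parallax, or disparity). All of the numbers below are made up purely for illustration; real photogrammetry also has to handle lens distortion, camera tilt and georeferencing:

```python
# A minimal sketch of the stereo geometry behind photogrammetry (not the
# specialist software workflow described above). For an idealised, rectified
# image pair: depth = focal_length * baseline / disparity.
# All numbers are invented for illustration.

focal_length_px = 4000.0    # focal length expressed in pixels (assumed)
baseline_m = 600.0          # distance between the two camera positions (assumed)

def depth_from_disparity(disparity_px):
    """Depth (metres) to a point that shifts by this many pixels between images."""
    return focal_length_px * baseline_m / disparity_px

# A point that shifts further between the two photos is closer to the camera,
# which for vertical aerial photos means higher ground.
for d in (1200.0, 1250.0):
    print(f"disparity {d:.0f} px -> depth {depth_from_disparity(d):.0f} m")
```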



Ways of viewing images in stereo: (a) the traditional method using a stereoscope and (b) my digital photogrammetric computer set-up with 3D screen and glasses

Photogrammetry is most commonly used with aerial photography, but it can be applied to any overlapping imagery if you have the correct information. For example, below are a photograph and the associated digital elevation model I created from my alluvial fan experiments, outlined in my previous blog post: What drives change on alluvial fans?
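
One of the simplest things you can do with the resulting elevation models is to difference two of them – say, from before and after an experiment run – to put numbers on erosion and deposition. A hedged sketch with tiny invented arrays (real DEMs would be loaded from the photogrammetry output):

```python
import numpy as np

# Sketch of a 'DEM of Difference': subtract a 'before' elevation model from an
# 'after' one to quantify change. The arrays and cell size below are invented.

dem_before = np.array([[1.00, 1.10, 1.20],
                       [1.05, 1.15, 1.25],
                       [1.10, 1.20, 1.30]])
dem_after  = np.array([[1.00, 1.30, 1.18],
                       [1.02, 1.40, 1.20],
                       [1.08, 1.35, 1.28]])

dod = dem_after - dem_before          # positive = deposition, negative = erosion
cell_area_m2 = 0.01                   # assumed grid cell size (0.1 m x 0.1 m)

deposition_m3 = dod[dod > 0].sum() * cell_area_m2
erosion_m3 = -dod[dod < 0].sum() * cell_area_m2
print(f"Deposition: {deposition_m3:.4f} m^3, erosion: {erosion_m3:.4f} m^3")
```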


Photogrammetry software is expensive to purchase, and processing the images can be complex and involves training, so traditionally photogrammetry has only been used by specialists. But recently there has been a development in something called Structure from Motion – this involves taking multiple photographs of an object from different angles and then uploading these into software that uses photogrammetric principles to automatically create a 3D model. This software is available on the web – e.g. Bundler (free to download), Photosynth (free to download) and Agisoft PhotoScan (the demo version is free, which allows you to create models but not save them) – so you can upload your own photos and have a go at creating your own 3D model from them!
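
For the curious, the first steps that these packages automate – finding matching features between overlapping photos, working out the relative camera positions, and triangulating a sparse cloud of 3D points – can be sketched in a few lines. The example below uses the OpenCV library rather than any of the packages named above; the file names and camera intrinsics are placeholders, so treat it as an illustration of the principle, not a full reconstruction pipeline:

```python
import cv2
import numpy as np

# Two-view sketch of the early steps of Structure from Motion using OpenCV.
# 'photo1.jpg', 'photo2.jpg' and the intrinsic matrix K are placeholders.
img1 = cv2.imread("photo1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("photo2.jpg", cv2.IMREAD_GRAYSCALE)

# Detect and match features between the two overlapping photos.
orb = cv2.ORB_create(nfeatures=5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Assumed camera intrinsics (focal length and principal point in pixels).
K = np.array([[3000.0, 0.0, 2000.0],
              [0.0, 3000.0, 1500.0],
              [0.0, 0.0, 1.0]])

# Estimate the relative camera geometry, then triangulate a sparse point cloud
# (only known up to an arbitrary overall scale).
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)

P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
keep = inliers.ravel().astype(bool)
pts4d = cv2.triangulatePoints(P1, P2, pts1[keep].T, pts2[keep].T)
pts3d = (pts4d[:3] / pts4d[3]).T
print(f"Reconstructed {len(pts3d)} sparse 3D points")
```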


An example of a 3D model created by Structure from Motion (Source: Goesele et al., 2007)


I first used photogrammetry many years ago for my Master’s thesis and since then I have incorporated it into all of my subsequent research, whether analysing a field site or an experimental landform. In my first blog post I mentioned that I am a fluvial geomorphologist (my research is all about rivers), but since writing that post I have changed jobs and I am now using my photogrammetry skills in a whole new environment – Antarctica – working for the British Antarctic Survey in Cambridge. I have just started a new project using an archive of approximately 30,000 aerial photographs of the Antarctic Peninsula, dating back to the 1940s, to investigate how glaciers in this region have changed over the last 70-80 years – a topic about which little is currently known. This means that I am lucky enough to have access to the most amazing photography of one of the most remote and stunning places on the planet, and I will be working with this for the next couple of years, which I will keep you updated on in future blog posts.

(a) Part of the archive of aerial photos held at the British Antarctic Survey; examples of historic aerial photos of the Antarctic Peninsula from (b) the 1950s and (c) the 1980s




Reference:
Goesele M., Snavely N., Curless B., Hoppe H. and Seitz S. 2007. Multi-View Stereo for Community Photo Collections. Proceedings of ICCV, Rio de Janeiro, Brazil, 14-20 October 2007.