This presentation is part of the 2017 3D Digital Documentation Summit.

Utilization of Bathymetric Surveys for the Location and Monitoring of Archaeological Shipwrecks

Speaker 1:           I’m going to be talking about my master’s dissertation research, which I finished up last fall, and also a bit of work I’ve been doing since then, because we got a new data set related to that work and I’ve been working with it a bit. My research recently has been using bathymetric data sets to update and improve the wreck record of the Goodwin Sands.

The Goodwin Sands is a pair of sandbanks off the Deal coast in the UK. It’s very historically relevant, so much so that it’s been referenced in a number of works of literature, as quoted on the board. It is very well known for wrecking lots of ships, partly due to its location, because it sits centrally along a lot of different maritime trade routes, and partly because, since it’s a giant mass of unconsolidated sand, it has a tendency to shift under the tidal influences of the water around it. The geomorphology of the area is really fascinating, and if you’re interested in that, you should definitely come talk to me later.

But what I’m trying to focus on here is the shipwrecks specifically. One point that I will point out here is Calamity Corner, which is very interestingly named and which we’ll talk about in a little bit; it’s where I’ve focused a lot of my research because it has a lot of shipwrecks that are well exposed and visible in the bathymetric data. Bathymetric data, for anybody who’s not familiar with ocean research, is basically a topographic map of the seabed. It generally comes either as a 3D point cloud or as a raster image, which is essentially a 3D point cloud binned into a bunch of pixels; that raster form is what I was working with for my original dissertation research. We actually got access to the original point cloud data for one of the data sets I was working with, but it arrived too late to include in my dissertation, and that’s what I’ve been working with since then.
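To give a concrete sense of what that binning step looks like, here is a minimal Python sketch, assuming a plain-text file of easting/northing/depth points; the file name and the 1 m cell size are illustrative placeholders, not values from the actual surveys.

```python
# Minimal sketch: bin an XYZ bathymetry point cloud into a raster grid.
# Assumes one "easting northing depth" triple per line; file name and the
# 1 m cell size are placeholders, not from the original survey data.
import numpy as np

points = np.loadtxt("survey_points.xyz")          # columns: x, y, z
cell = 1.0                                        # grid cell size in metres

x, y, z = points[:, 0], points[:, 1], points[:, 2]
cols = ((x - x.min()) / cell).astype(int)
rows = ((y.max() - y) / cell).astype(int)         # row 0 at the northern edge

grid_sum = np.zeros((rows.max() + 1, cols.max() + 1))
grid_count = np.zeros_like(grid_sum)
np.add.at(grid_sum, (rows, cols), z)              # accumulate depths per cell
np.add.at(grid_count, (rows, cols), 1)

with np.errstate(invalid="ignore"):
    binned = grid_sum / grid_count                # mean depth per cell; NaN where empty
```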

The reason bathymetric data is useful in maritime archaeology is that its broad scale allows a much wider search area than you would get if you were trying to individually look at every site. This can be useful both for getting a broad picture of a wreck that you know is there and for searching for other wrecks in the area that you didn’t already know about. The fine resolution that you can get from bathymetric data allows pretty decent analysis of wrecks without having to do any onsite work, and often bathymetric data sets are already available because they’re so useful for navigational safety and for projects that involve construction in the ocean and things like that. These kinds of data sets are out there for a lot of places if you look for them.

The data sets I was using specifically were three consecutive data sets from the UKHO, the United Kingdom Hydrographic Office. They, as is kind of necessary in the Goodwin Sands, take regular bathymetric surveys to see how the sands are changing so that they can update their navigational safety charts, records, and things like that. I also had another survey that was taken for the UK Department for Environment, Food and Rural Affairs. I wasn’t able to find a whole lot of information about why that survey was taken, but it was available through the UKHO’s INSPIRE website, which there’s a link for up there. Basically, the UKHO is really awesome and puts most of their data sets up online, or at least binned versions of them. We sent off to the UKHO to get the original point cloud data set, which is what I’ve been working with recently.

As for the shipwreck data sets, we used several different data sets on historical shipwrecks in the Goodwin Sands. Basically, these had information about what the ships were, their lengths and widths, and things like that. How much information we had depended on the wreck, but for a few of them I had enough to compare them with my bathymetric data sets and say fairly definitively, yes, this is this wreck, so that we could confirm those locations.

For my data processing, I used Esri ArcMap for all of my analysis, but there are also similar products that are free to use and open source; QGIS is one of the ones that I know of. A point that I’ll be making throughout the talk is that basically everything I’ve been doing can be done in a lot of places, when the data is available, essentially for free, because the data’s already been collected and there’s a lot of open-source and free software that you can use for this stuff.

Anyway, I created slope maps, hillshades, and curvature maps for each of my bathymetric data sets. The slope gives you a good sense of where the edges of the shipwreck are, because it sticks up really well from the rest of the surrounding seabed. The hillshade, of those three, gives the most realistic view of what the ship looks like, because it’s as if you were shining a light on it, so you’re seeing a shaded image of it. And the curvature gives you a better idea of areas where there’s a lot of vertical change in the wreck; you can see several little points on the lower bit there where it’s going up and down in the center of the wreck piece. Then, for making pretty images, I layered the hillshade underneath the original data set to get a composite that looks sort of like a real wreck and also gives you some useful data about what the wreck looks like.
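For readers who want to reproduce those derivatives, a rough Python sketch using GDAL follows; the input file name and hillshade lighting angles are assumptions, and because GDAL has no built-in curvature mode, curvature is approximated here as a simple Laplacian of the binned surface.

```python
# Rough sketch of the three raster derivatives, assuming a binned bathymetry
# GeoTIFF called "goodwin_2012.tif"; file names and lighting angles are
# illustrative placeholders.
from osgeo import gdal
import numpy as np

src = "goodwin_2012.tif"
gdal.DEMProcessing("slope.tif", src, "slope")            # slope picks out wreck edges
gdal.DEMProcessing("hillshade.tif", src, "hillshade",    # shaded-relief "lit" view
                   azimuth=315, altitude=45)

# GDAL has no curvature mode, so approximate curvature as the Laplacian
# (sum of second derivatives) of the binned surface.
z = gdal.Open(src).GetRasterBand(1).ReadAsArray().astype(float)
curvature = (np.gradient(np.gradient(z, axis=0), axis=0) +
             np.gradient(np.gradient(z, axis=1), axis=1))
```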

My first step in analyzing the wrecks was to ignore all my shipwreck data sets and just look at the bathymetry, to make sure that I was getting an unbiased view and didn’t have any preconceptions about what might or might not be a wreck. Those candidates are shown as little red dots, and I then compared them to the different shipwrecks that I knew to exist in the area. For the most part they matched one to one, or else there were some things I wasn’t sure were wrecks but listed anyway, and they didn’t turn out to have wrecks associated with them, which is fine. There were a few instances where the historic record and the bathymetric data didn’t line up, in ways that I thought were interesting and were potentially areas where I could update and improve our knowledge of the area.
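The comparison step itself is simple enough to sketch: match each anomaly picked from the bathymetry to the nearest catalogued wreck and flag it if one lies within some threshold distance. The coordinates and the 100 m threshold below are made-up placeholders, not values from the study.

```python
# Hedged sketch of the comparison step: match seabed anomalies picked from the
# bathymetry against a known-wreck catalogue by distance. All values are
# placeholders.
import numpy as np

candidates = np.array([[371250.0, 5680110.0],      # x, y of features picked by eye
                       [372990.0, 5679420.0]])
known = np.array([[371230.0, 5680095.0],           # catalogued wreck positions
                  [375400.0, 5681800.0]])

for i, c in enumerate(candidates):
    d = np.hypot(*(known - c).T)                   # distance to every known wreck
    j = d.argmin()
    if d[j] < 100.0:                               # within 100 m: likely the same feature
        print(f"candidate {i} matches known wreck {j} ({d[j]:.0f} m away)")
    else:
        print(f"candidate {i} has no catalogued wreck nearby")
```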

Specifically, there were the Calamity Corner wrecks, which, as I said, we’d get back to. There were three wrecks here, here, and here, two of which were labeled as the S.S. Ira and the Luray Victory, and the third of which didn’t have a label. And there is also the wreck over here, which I’ve associated with the Val Salice, which we’ll get to in a minute.

A really quick aside on Calamity Corner, because it’s a fascinating area: it got the nickname during World War II because a lot of wrecks, including the ones over here, sank during that time period; there were enough of them that the area earned the name. Anyway, for the Val Salice, the location given in the historic record is listed as being for filing purposes only, because they didn’t really know exactly where the wreck went down. It sank during a major storm in 1916, and no one knew exactly where it had ended up. I took some measurements of the wreck that I found over here, and the length and beam of the vessel match up nearly exactly with the recorded length and beam of the Val Salice, so I think it’s the correct wreck, but there’s not much more I can say about it at the moment, because in this image it’s mostly buried, and this was the earliest of the three surveys that I had. In the later ones it’s still sort of visible, but it’s very quickly being covered over.

As for the other three wrecks: wreck 45, which is this pair here (I’ve imaged them separately, but they’re basically next to each other), was previously unidentified. Wreck 46 had been labeled as the Luray Victory, and wreck 48 had been labeled as the S.S. Ira. Based on the length and beam measurements, I figured out that if you combine those two wrecks, they match the length and beam of the Luray Victory, and similarly these pieces, if you add up all the lengths, match the S.S. Ira. That’s about as far as I was able to get with GIS and the binned raster data: saying yes, the lengths and widths match, and there are a couple of possibly defining features, like this one, which looks like it might be a mast or a cargo winch or something like that.
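A hedged sketch of that dimension check follows, with entirely illustrative numbers (they are not the measurements from the surveys or the historical record).

```python
# Sketch of the length-and-beam check described above: sum the lengths of the
# separated fragments and compare them with the recorded dimensions. All of
# the numbers here are made-up placeholders, not the actual measurements.
def matches(measured_length, measured_beam, recorded_length, recorded_beam, tol=0.05):
    """True if both dimensions agree within a fractional tolerance."""
    return (abs(measured_length - recorded_length) <= tol * recorded_length and
            abs(measured_beam - recorded_beam) <= tol * recorded_beam)

fragment_lengths = [62.0, 75.0]              # lengths of the two fragments (m)
measured_beam = 18.0                         # widest point across a fragment (m)
recorded = {"length": 139.0, "beam": 18.9}   # dimensions from the historical record

print(matches(sum(fragment_lengths), measured_beam,
              recorded["length"], recorded["beam"]))
```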

But then we looked at the point cloud data, and the point cloud data is much prettier. You can very clearly see a lot of things on it that you couldn’t see just from the binned raster data. For processing the point cloud data, it took me a while to find a piece of software that could do it, because at the point where I was trying to process it I was no longer a student at the University of Southampton, where I did my master’s degree, so I didn’t have access to the proprietary software they had for this sort of thing. But I found a really cool program called Pure File Magic, which is a great name by the way: the Pure File Magic Area-Based Editor, which was and is still being developed by the US Naval Oceanographic Office for processing and analyzing sonar and lidar data. It supports a lot of different file types, including GSF, or Generic Sensor Format, files, which is what I was working with, and it’s available for download from their handy website here.

Pure File Magic gives you a fairly straightforward list of steps to go through to get your data from the format that comes out of the sensors into something that can be read by most other 3D point cloud software. The 3D file editor, which I didn’t end up using because the data I was looking at had already been de-junked in a sense, basically lets you edit your point cloud before you convert it into whatever other file types you want, so that you can get rid of any outliers or anything that just looks like noise rather than good data. These next steps just go sequentially along the buttons there. The track line function creates a minimum bounding rectangle, which shows you what the area of each individual track line is. Track lines are how you collect bathymetry data: you take a boat and go back and forth, and each line you run is a track line, so you end up with a bunch of track lines that then need to be combined in this software, or other software, into one single point cloud.

Area check allows you to compare the individual bounding rectangles with the areas where you think the wrecks are going to be, so it’s useful to have the GIS, or binned, data as well, so that you can look at that and figure out what areas you want to focus on before going into the software to make your 3D point cloud pretty. Once you have an idea of which track lines you want to load, you use PFM load to load the track line data. In that step it gives you the option of selecting an area; it will look through all the track lines that you give it, but only keep the points that fall within the area you tell it to look at.

PFM view then gives you this kind of view and lets you look at the data that you’ve loaded into your Pure File Magic file, which, again, is a great name. From there you can extract it to a variety of file formats, and if you wanted to take it into GIS-type software, you could make it a georeferenced TIFF image. Instead, I exported the data as an ASCII XYZ file so that I could open it in CloudCompare, which is a 3D point cloud editing program that I’m more familiar with. It allows you to clip the point clouds so that you’re only looking at the relevant areas when you’re trying to visualize them. It has the ability to create viewports and videos and all kinds of stuff so that you can make really nice images and videos with your data, and it has the handy color-by-height function, which I used to make it actually look like a shipwreck instead of a giant green blob.
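As an illustration of that color-by-height view, here is a minimal Python sketch that loads an ASCII XYZ export, clips it to a bounding box, and colors the points by depth; the file name and clipping bounds are placeholders.

```python
# Minimal sketch of a colour-by-height view of an ASCII XYZ export; the file
# name and the clipping bounds are placeholders.
import numpy as np
import matplotlib.pyplot as plt

pts = np.loadtxt("wreck_export.xyz")                     # columns: x, y, z
keep = (pts[:, 0] > 371000) & (pts[:, 0] < 371300)       # clip to the area of interest
pts = pts[keep]

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(pts[:, 0], pts[:, 1], pts[:, 2],
           c=pts[:, 2], cmap="viridis", s=1)             # colour points by depth/height
plt.show()
```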

With this we were able to get a much better comparison to historic photographs and other records of the wrecks. On this one you can see, rather obviously, the helicopter pad and this mast, which I think might be a cargo winch; I forget exactly what it was. And then there are these two bits sticking up that I wasn’t able to identify, but they’re fairly obvious over there. Even more fun, it spins. It’s a little fuzzy for some reason; it looked better on my computer than it does now, but again you can see the two posts and some of the other features I was talking about. It makes really pretty spinning models. I like 3D spinning models; they’re very fun.

Similarly, the S.S. Ira has a similar level of visual detail. You can see a cargo hatch and a couple of what are potentially cargo winches. This was the first of the models that I brought up, and I was just amazed that you can see the individual posts of the railing on the side of the ship, which kind of blew my mind, because this was data that wasn’t even taken for archaeological purposes and we’re still able to get this level of detail. You can also see the rudder in the corner and the propeller of the ship. There’s a lot of good stuff there, and again we have pretty spinning models to show. You can see there’s a bulkhead underneath there, which is probably why that part is still being held up.

One thing that I haven’t done yet, but that I think would be interesting, especially for ships with good historical records of their dimensions and of specific parts of the ship, is to use another feature CloudCompare has: measuring between two points on a model. I could literally go in and ask, okay, is this the same distance as it is in the historical record of what the ship should look like?
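That check is a one-line calculation once two points have been picked off the model; the coordinates and the recorded length below are illustrative only.

```python
# One-line distance check between two picked points; coordinates and the
# recorded length are illustrative placeholders only.
import numpy as np

bow = np.array([371120.4, 5680071.2, -14.8])
stern = np.array([371248.9, 5680034.7, -15.3])

measured = np.linalg.norm(bow - stern)       # straight-line separation in metres
recorded_length = 134.0                      # hypothetical value from a historical record
print(f"measured {measured:.1f} m vs recorded {recorded_length:.1f} m")
```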

Unfortunately, the Val Salice model didn’t turn out quite as well. This is from 2015, the most recent of the surveys, and the model gives a little more data, but it still mostly looks like it’s been buried, which is what’s happening to the ship. That’s one downside of this method: it only really works on ships that are uncovered and exposed, because if they’re not exposed you’re just going to see seabed. For wrecks like this, you’re not getting much more out of the point cloud than you would from the GIS model, because there’s not really much more information to be had.

One of the next steps I’m looking at is trying to make mesh models out of my point clouds. I haven’t gotten very far with this yet, and my first couple of attempts have been a bit iffy, because it turns out that the way you get a point cloud from swath bathymetry is that at every time step it sends a [inaudible 00:19:02] ping out, so there’s a different distance between points in one direction than in the other. When you’re doing your [inaudible 00:19:11]-type calculations, that throws things off a bit and makes it hard to create a 3D model. But I’m still working on it, so fingers crossed.
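One possible way to attempt that meshing, sketched here with Open3D’s Poisson surface reconstruction (an assumption about tooling, not the method actually used in the project), is to estimate normals with a search radius large enough to bridge the uneven along-track versus across-track point spacing and then reconstruct.

```python
# Exploratory sketch of a meshing step using Open3D's Poisson reconstruction.
# The file name, search radius, and octree depth are assumptions.
import numpy as np
import open3d as o3d

pts = np.loadtxt("wreck_export.xyz")
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(pts)

# Normals are required by Poisson reconstruction; a generous radius helps
# bridge the uneven along-track vs across-track point spacing.
pcd.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=2.0, max_nn=30))

mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
o3d.io.write_triangle_mesh("wreck_mesh.ply", mesh)
```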

Anyway, the specific things I’ve managed to do with this project were correcting the identification of the S.S. Ira wreckage, recognizing both fragments of the Luray Victory, which are currently somewhat separated, and suggesting a possible location for the Val Salice wreck. More generally, this methodology could be used in other areas for similar situations where you have shipwrecks that you would like more data about but don’t have the funds or ability to do fieldwork on the actual site.

Does anyone have questions?

Speaker 2:           What’s the depth approximately?

Speaker 1:           It really depends. The Goodwin Sands has one of the largest tidal ranges in the world, so the depths you’re looking at are, at high tide, between 10 and 20 or so meters, and at low tide some of it is not actually underwater anymore. That’s the kind of tide range you’re looking at there. I don’t know about other places, but that’s what you’re looking at on the Goodwin Sands.

Speaker 3:           What were the earliest ships that you could see this way, and how did they differ in materials?

Speaker 1:           The ships that I was able to see in my data were mostly World War I and World War II wrecks. They’re mostly all made of metal at that point, but there are a few wrecks from the early 1700s that have already been studied more extensively, because they were exciting and interesting when they were found. Those were wooden shipwrecks.

Speaker 3:           [crosstalk 00:21:09].

Speaker 1:           And I was able to see one or two of those on my data …

Speaker 3:           Right.

Speaker 1:           But most of them, the longer they’re there, the more they sink into the sands, and the mass of sand can be up to 80 feet deep at its deepest. A lot of the wrecks that have been there for a while are just going to be under there somewhere and hard to actually see.

Speaker 4:           So as the sands move, are new wrecks being exposed, and, forgive me if you already said this, how frequently are the surveys made?

Speaker 1:           I had it in my notes but I think I forgot to actually say it: the three surveys that I had were from 2009, 2012, and 2015, so roughly every three years. I don’t know if the UK is planning to continue the three-year interval, but they’re fairly regular surveys of the area.

Was there more to your question? I feel like there was another part. Sorry.

Speaker 4:           Are wrecks being uncovered …

Speaker 1:           Ah, yes. There were maybe one or two wrecks that I saw becoming significantly uncovered, but during the time period I was looking at they were already somewhat exposed; I could already see them in the first data set, and they became more uncovered over time. A lot of the wrecks that were fairly obvious tend to be in areas that are becoming more covered, so it seems like there’s currently more of a tendency for wrecks to become covered than uncovered. But again, that also has to do with the geomorphology; if you’re interested in the geomorphology, I can talk to you about it.

Yes?

Speaker 5:           You said at high tide that this is about 20 meters.

Speaker 1:           Yeah, it depends on where on the sands you are, because it can vary from five to 10 meters at some of the higher areas to 20 to 30 meters down, depending on where on the sandbank you are.

Speaker 5:           Do you know if you could get the same kind of results at a much more significant depth [inaudible 00:23:22]?

Speaker 1:           I haven’t thought about doing this at a more significant depth, and I think that most swath bathymetry data sets tend to be in that range of depths. I think there is technology that lets you get bathymetry readings at greater depths, but I haven’t looked into it much.

Yes?

Speaker 6:           What is the turbidity like in the Goodwin Sands, and would that affect the data that you’re getting?

Speaker 1:           I have not looked into the turbidity a whole lot. I mostly just trusted that the surveys the UK chose to do were pretty reasonable.

Speaker 6:           I was wondering if this same kind of thing could be used in, say, the Mississippi River.

Speaker 1:           I suspect so, given that, from what I have seen, the Goodwin Sands is a really, really poor place to go diving: you just can’t see anything, because with the massive tidal range there’s always sediment everywhere. I would suspect it’s possible, but it probably would be somewhat noisy data. As with mine, you’d probably have to do some editing of the data, but that editing was done before I got hold of it, so I don’t have as much experience with that part.

Yep?

Speaker 7:           What is the technology? Is it lidar or is it something different?

Speaker 1:           The ones that I’m using or …

Speaker 7:           The bathymetry, is it …

Speaker 1:           Yeah, it’s multi-beam sonar data.

Speaker 7:           Gotcha.

Speaker 1:           Basically you have a towfish behind the boat that knows its location and sends out fans of sonar pings.

Is that … any other questions?

Speaker 8:           Any questions?

Alright.

Speaker 1:           Thank you.

Speaker 8:           Thank you.

 

Abstract

For many shipwrecks we have historic records of the location of sinking, but the precise current location is unconfirmed. Confirming these locations through diving or other onsite work is expensive and often hazardous. Bathymetric surveys provide a potential means to establish exact locations of wreck sites. Using this technique allows researchers to cover a large search area. This can be valuable in finding a specific wreck or in documenting a number of wrecks in a regional area. With repeated surveys it is possible to monitor the state of wreck sites even when it is not feasible to directly monitor a site via onsite work or reports from recreational divers. High-traffic maritime areas often have accessible bathymetry data available for navigation safety. The resolution of this data can sometimes be high enough to allow analysis at scales relevant to archaeological studies.

Here we present an example of such work, completed in September 2016, cataloging wrecks on the Goodwin Sands. This work utilized bathymetric surveys undertaken by the United Kingdom Hydrographic Office between 2009 and 2015. The Goodwin Sands comprises a pair of large sandbanks off the southeast coast of England and has long been a major hazard due to its proximity to active shipping routes and tendency to shift dramatically. The overall bank has been calculated to have drifted around 1km shoreward between 1887 and 2000 (Bates et al., 2007), and localized features of the bank were determined by this work to have shifted as much as 400m in the span of 6 years. Swath bathymetry data from the UKHO (UKHO, 2013a; 2013b; 2013c) was combined with a comprehensive catalog of known wrecks in the area (SeaZone Solutions Limited, 2011a; 2011b) to investigate the presence or absence of known wrecks and to locate previously undocumented ones. Slope analysis was applied to several of these wrecks to better illustrate the state of preservation of the wreck sites.

The repeated surveys allowed consideration of changes over time, both to the individual wrecks and the bank as a whole. Patterns of exposure and burial were investigated to assess where wrecks may become buried or exposed over time and to predict which wrecks may be at risk for damage due to the bank’s movement. The synthesis of historical information and modern spatial data allowed the identification of a number of formerly unidentified or misclassified wrecks, as well as the refinement of the location of several previously unlocated wrecks.

Swath bathymetric data is originally sampled as a 3D point cloud, but is often binned for ease of conventional analysis. Using the original high-resolution point cloud, we can construct a 3D model of the wreck and compare it to historic photographs and other documents. In addition to the obvious value in confirming a wreck’s identity, repeat swath surveys can be utilized to monitor a wreck for degradation.

Speaker Bio

Elizabeth Krueger is based in Boston, Massachusetts, where she volunteers with the Boston City Archaeology Program. She has also worked on projects with the Massachusetts Board of Underwater Archaeological Resources (BUAR). Elizabeth has an MSc in Maritime Archaeology from the University of Southampton and a BS in Archaeology and Materials from the Massachusetts Institute of Technology. Her interest is in using science and technology to facilitate cultural resource management. Specific interests include GIS, reflectance transformation imaging, and material conservation.
