As those responsible for the management of cultural resources become more concerned with the landscape perspective, developing new methods and techniques appropriate to that scale is essential.
Our project helps to meet this need by developing and refining methods for using remotely sensed imagery to analyze historic cultural landscapes and to document land cover change over time in map form. This research was made possible through a grant from the National Center for Preservation Technology and Training.
Because of the size and scope of rural landscapes, methods of analysis that allow economy of field work are particularly valuable. The analysis of historic landscapes presents several challenges that are, if not unique to resources of this scale, at least more prevalent than they are in historic resources at the building scale. The sheer size of many historic landscapes, the constantly changing nature of plant material, and the relative ease with which the spatial configuration of the landscape can be altered all raise issues for those engaged in documenting rural cultural landscapes. That is not to say that one is more or less difficult than the other, but the challenges posed are specific to the resource type.
This project describes two promising methods that manipulate and analyze remotely sensed data to provide information about change in rural cultural landscapes. One approach merges historical aerial photographs with modern imagery in a geo-referenced overlay to clarify changes in vegetation and other features that have taken place between the periods represented by the two photographs. The other uses light detection and ranging (LIDAR) topographic data to make visible terrain anomalies that may suggest the locations of historic roads, fences, and other landscape features no longer visible on the ground.
Both methods have their limitations, but they can be of significant assistance in identifying and interpreting changes in some of the less permanent features frequently found in rural cultural landscapes. The case study for this project is the historic estate of the Honorable Brutus Junius Clay. The farm, named Overn, is located in Bourbon County, Kentucky, along Winchester Road, approximately 20 miles northeast of downtown Lexington. Bourbon County is in the Inner Bluegrass region, in the central part of the state. Agriculturally, the area is defined by plentiful grassland and rich soil, thanks to the limestone, calcium, and phosphorus deposits that continue to support the horse and livestock industry. Early explorers described the area as woodland meadows. Bourbon County’s topography consists of soft, gently rolling hills and a few creeks, Stoner Creek being the largest in the county. Overn has remained in the same family for the site’s entire Anglo-American history, and the Clay family, including the current owner, Billy Clay, has kept excellent records of the site. Ownership can be traced to General Green Clay, who was granted the land following his service in the American Revolution.
His son, Brutus Junius Clay, inherited the land and moved to Bourbon County in 1830, where he became an important and respected member of the community, building the main house and outbuildings, making significant investments in the land, and turning Overn into a working agricultural landscape. The farm today is roughly 1,200 acres. The site was selected for several reasons, largely having to do with the combination of landscape features and data availability. One reason Overn was selected was that it represented a working agricultural landscape that had experienced the types of changes relevant to this research over an extended period of time. Roads had been cut and abandoned. Trees had been cleared and planted. Fences had been built and removed. The yellow line in this image, for example, highlights the original 1830s entrance to the farm, which is now all but invisible to the casual observer.
Another prominent landscape feature on this site, characteristic of the bluegrass region and a reflection of its geology and cultural influences, is the rock fence. This image shows a rock fence in good repair. This image shows a rock fence in the process of being overtaken by nature. Rock fences, even when they no longer serve their original purpose, can remain in the landscape for quite a long time.
The row of trees pictured in this image indicates the presence of another, less permanent, wooden fence. The trees grew up along the fence while it was still intact and remained on the landscape after the fence itself had been removed.
Another reason Overn was used for this project was the availability of significant historical documentation. In addition to black and white aerial photographs taken during the 1930s, the Clay family has maintained extensive historical and agricultural records that document the farm’s development over time, and a great deal of work had already been done documenting changes in the property’s boundaries through property records research. From these sources, the researchers could develop a generalized understanding of the landscape’s arrangement and use over time. The site also contained water, open fields, and areas of dense tree growth. And finally, contemporary aerial photographs and LIDAR imagery for the site were publicly available.
Data for the project included historical black and white aerial photographs, contemporary color aerial imagery, and LIDAR data. The black and white photographs were taken by the U.S. Department of Agriculture in 1937. They were obtained from the National Archives and scanned at 1200 dpi. The modern color imagery was obtained through the U.S.D.A. Geo-Spatial Gateway, and the LIDAR data was downloaded from the Kentucky Geological Survey website. In addition, preparatory data collection included visiting Overn to take GPS points for use in geo-referencing the historic imagery. The steps for pan sharpening the combined historic and modern aerial imagery and for preparing the LIDAR map are as follows:
The pan sharpening process takes a low resolution, multi-band color image (in this case the modern aerial photo) and a high resolution black and white image (in this case the historical black and white aerial) and combines them to produce a high resolution color image. Pan sharpening is image specific, so the values entered may vary from site to site. After the pan sharpened image is analyzed, the resulting image will show where the 1930s field boundaries were located in comparison to the modern landscape. It will also highlight the locations of since-removed landscape features such as fences, buildings, trees, and roads, and sometimes other agricultural features, such as haystacks or tobacco plants. This allows cultural resource management professionals and others to understand how the layout of the landscape has changed since the black and white photograph was taken.
Begin by opening ArcMap and navigating to the Add Data button at the top of the screen on the Standard toolbar. Click the Connect To Folder button on the pop-up box’s toolbar. Navigate to the folder where you are storing your historic and contemporary color images and click OK. Add the files. You will also want to create a file geodatabase for your files. Using the Catalog tab on the right side of the screen, navigate to the connected folder that contains the files you just added. Right click in this connected folder, scroll down, select New, and click File Geodatabase.
In the first phase, we geo-referenced the historic black and white photographs. To do this, we linked the black and white photographs to the modern aerials visually and then verified those links using the GPS control points taken during earlier field work. These points identified the boundaries of the property, along with key features within the property that were present in both the 1937 image and the present day. The GPS device used in the field was a Magellan Triton 500.
This image illustrates the process of linking a point on the two maps; links are established for all control points, as shown in this image. When that process is complete, the geo-referencing is updated, and the modern image is clipped to the extent of the historic image so that unnecessary imagery is not included.
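Conceptually, the first-order geo-referencing that ArcGIS performs from these control-point links is a least-squares fit of an affine mapping from photo pixels to ground coordinates. The NumPy sketch below uses hypothetical control points and coordinates to illustrate the idea; it is not ArcGIS’s internal implementation.

```python
import numpy as np

# Hypothetical control-point links: pixel (col, row) locations in the 1937
# photo paired with ground (easting, northing) coordinates from GPS points.
pixel = np.array([[100, 120], [850, 140], [130, 900], [880, 870]], dtype=float)
ground = np.array([[701220.0, 3960480.0],
                   [702120.0, 3960460.0],
                   [701256.0, 3959700.0],
                   [702156.0, 3959730.0]])

# First-order (affine) transform: [E, N] = [col, row, 1] @ coeffs,
# solved by least squares over all control-point links.
A = np.column_stack([pixel, np.ones(len(pixel))])
coeffs, *_ = np.linalg.lstsq(A, ground, rcond=None)

def to_ground(col, row):
    """Map a pixel location in the historic photo to ground coordinates."""
    return np.array([col, row, 1.0]) @ coeffs
```

With more than three links, the extra points let the fit average out small errors in any single GPS reading, which is why several well-spread control points are worth collecting in the field.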
Once this process is complete, the next stage is texturizing the historic aerial image. To texturize the image, we used a methodology developed by the United States Forest Service in 2005 and described in the document “Remote Sensing for Ranger Districts Using Image Analysis for ArcGIS,” which is available online. Texturizing increases the roughness of the image so that ArcGIS can distinguish differences in the pixels, as it would if the black and white image were a color image. The process looks for changes in texture across the image and produces lines showing changes in field vegetation, fences, and roads. In ArcToolbox, open the Spatial Analyst toolbox and select Focal Statistics in the Neighborhood toolset. In the Focal Statistics pop-up box, identify the Input Raster as the historic black and white photograph. In the Output Raster box, navigate to the file geodatabase and give the file a name (ideally, something with “texture” in it so that you recognize it later). Identify the neighborhood as Rectangle. We set the height and width to seven each and identified the units as Cell. The Statistics Type chosen was Standard Deviation. When finished, click OK. This slide shows what the result of the Focal Statistics run should look like. What you see on the screen is your texturized image. The next step is to filter this image.
To filter the image, return to the Focal Statistics tool in the Neighborhood toolset of the Spatial Analyst toolbox. In the Focal Statistics pop-up box, identify the Input Raster as the texturized image you just created. In the Output Raster box, navigate to the file geodatabase and give the file a name (ideally, something with “filter” in it so that you recognize it). Once again, the neighborhood is Rectangle, but this time the height is three and the width is three. The units remain Cell, but the Statistics Type changes to Minimum. Click OK. What you see is your filtered image.
The next step is to smooth this image. To smooth the image, again navigate to the Focal Statistics tool in the Neighborhood toolset of the Spatial Analyst toolbox. In the Focal Statistics pop-up box, identify the Input Raster as your filtered textured image. In the Output Raster box, navigate to the file geodatabase and give the file a name (ideally, something with “smooth” in it so it is easy to identify). Once again, the neighborhood is Rectangle, but this time the height is five and the width is five. The units remain Cell. The Statistics Type is Mean. Click OK. The image that you see is your smoothed image.
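The three Focal Statistics passes just described, a 7×7 standard deviation to texturize, a 3×3 minimum to filter, and a 5×5 mean to smooth, can be sketched as a rough NumPy stand-in for what the tool computes. This sketch assumes edge padding at the borders, which differs slightly from ArcGIS’s own edge handling, and uses random data in place of the scanned photograph.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def focal_statistic(raster, size, stat):
    """Square moving-window statistic, edge-padded so output keeps its shape."""
    pad = size // 2
    padded = np.pad(raster, pad, mode="edge")
    windows = sliding_window_view(padded, (size, size))
    return stat(windows, axis=(-2, -1))

# Stand-in for the scanned 1937 photograph (grayscale values 0-255).
historic = np.random.default_rng(0).integers(0, 256, size=(64, 64)).astype(float)

texture = focal_statistic(historic, 7, np.std)    # texturize: 7x7 standard deviation
filtered = focal_statistic(texture, 3, np.min)    # filter:    3x3 minimum
smooth = focal_statistic(filtered, 5, np.mean)    # smooth:    5x5 mean
```

The standard deviation pass is what turns tonal variation into a texture signal; the minimum and mean passes suppress speckle so that only coherent linear features such as fence lines and roads survive.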
The next step is to rescale this image. Right now the data is 32-bit floating point; however, it needs to be 8-bit signed integer. This reduces the file size and allows for further processing. To rescale the image, open the Properties box for the smoothed image. In the Layer Properties pop-up box, under the Source tab, write down (or copy and paste into Excel) the number to the right of Minimum or MIN and the number next to Maximum or MAX. Click OK.
Next, return to ArcToolbox and, in the Map Algebra toolset of the Spatial Analyst toolbox, select the Raster Calculator. In the Raster Calculator pop-up box, you will enter a new formula. This image shows the Raster Calculator pop-up box with the formula entered; the actual formula to enter is included on this slide. In this formula, A is the name for your new rescaled image, B stands for integer, C is the name you gave your smoothed textured image, D is the minimum value you recorded earlier, and E is the maximum value you recorded earlier. When you have entered the formula with the appropriate file names and values into the Raster Calculator, click OK. The rescaled image should appear on the left in your Table of Contents.
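In essence, the Raster Calculator formula performs a linear stretch of the 32-bit values into the 8-bit range. The sketch below assumes a standard min-max stretch to 0-255, with hypothetical MIN and MAX values standing in for the ones recorded from the Source tab; the exact constants in the USFS formula may differ slightly.

```python
import numpy as np

# Hypothetical values copied from the smoothed image's Source tab.
MIN, MAX = 0.8, 61.4

def rescale_to_8bit(smooth, vmin, vmax):
    """Linear stretch of 32-bit floating-point values into 0-255 integers."""
    scaled = (smooth - vmin) * 255.0 / (vmax - vmin)
    return np.clip(scaled, 0, 255).astype(np.uint8)

rescaled = rescale_to_8bit(np.array([0.8, 31.1, 61.4]), MIN, MAX)
```

The stretch is why the MIN and MAX must be recorded per image: the same formula with the wrong extremes would either clip texture detail or waste most of the 8-bit range.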
Next, double click your rescaled textured image in the Table of Contents and click the Source tab in the Layer Properties pop-up box. Write down (or copy and paste into Excel) the number to the right of MEAN and the number to the right of STANDARD DEVIATION. Using the formulas on this slide, calculate the low and high threshold values outside of ArcGIS. When you are finished, return to ArcToolbox. Open the Spatial Analyst toolbox, then the Map Algebra toolset, and finally the Raster Calculator. Enter the following formula in the formula area of the Raster Calculator, where A is the file name for your new textured layer, B is the function CON (meaning that you are creating a conditional statement), C is the file name for your rescaled texture image, D is the low texture threshold value, and E is the high texture threshold value. Click OK. The new textured image should appear. You can change the colors from black and white to whatever colors you find most appealing.
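The CON statement keeps only pixels whose rescaled texture value falls between the low and high thresholds. A NumPy equivalent is sketched below; because the USFS threshold formulas themselves are not reproduced here, the sketch uses placeholder thresholds of mean ± one standard deviation, which you should replace with the values those formulas actually give you.

```python
import numpy as np

# Hypothetical statistics copied from the rescaled image's Source tab.
MEAN, STD = 118.0, 32.0

# Placeholder thresholds; substitute the values from the USFS formulas.
low_threshold = MEAN - STD
high_threshold = MEAN + STD

def texture_mask(rescaled):
    """Equivalent of CON(low <= img <= high): 1 where texture is in range, else 0."""
    in_range = (rescaled >= low_threshold) & (rescaled <= high_threshold)
    return np.where(in_range, 1, 0).astype(np.uint8)

mask = texture_mask(np.array([50, 118, 200]))
```

Pixels outside the band, either too smooth (open pasture) or too rough (noise), drop out, leaving the linear texture features that mark fences, paths, and field edges.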
Once the image has been rescaled and textured, it is ready to be pan sharpened. To pan sharpen the image, return to ArcToolbox. Navigate to Data Management Tools, then Raster, then Raster Processing, and finally Create Pan Sharpened Raster Dataset. In the Create Pan Sharpened Raster Dataset pop-up box, identify the Input Raster as your color photograph, with the red channel as 3, the green channel as 2, and the blue channel as 1, and identify the Panchromatic Image as your textured historic image. In the Output Raster box, enter a name for your new pan sharpened image. Click OK. You may now view your pan sharpened image. This image shows the farm without the textured image, while this image shows the pan sharpened image with the texture profile. The line differentiating the right side of the composite image from the left is the result of differences in the original black and white images. As is clear in this picture, the upper left-hand corner of the photograph was slightly over-exposed while the bottom right corner was slightly under-exposed. Placing the brighter edge of one photograph next to the darker edge of another created this difference.
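The Create Pan Sharpened Raster Dataset tool offers several algorithms; to illustrate the general idea, here is a minimal Brovey-style sketch in NumPy that redistributes the high-resolution panchromatic intensity (here, the textured historic image would play that role) across the three color bands. This is a simplified stand-in for the concept, not the tool’s exact computation.

```python
import numpy as np

def brovey_pansharpen(rgb, pan, eps=1e-6):
    """Brovey transform: scale each color band by pan / intensity.

    rgb: (H, W, 3) float array of the (upsampled) color image, 0-255.
    pan: (H, W) float array of the high-resolution panchromatic image, 0-255.
    """
    intensity = rgb.mean(axis=2)          # per-pixel brightness of the color image
    ratio = pan / (intensity + eps)       # how much the pan band brightens or darkens it
    return np.clip(rgb * ratio[..., None], 0, 255)

# Tiny example: a uniform color patch re-lit by a darker pan band.
rgb = np.full((2, 2, 3), 100.0)
pan = np.full((2, 2), 50.0)
sharpened = brovey_pansharpen(rgb, pan)
```

Because the color ratios are preserved while the brightness comes from the panchromatic band, the fine texture of the historic photograph shows through in the color result, which is exactly the effect the composite images on these slides demonstrate.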
Returning to a close-up selection from the final pan sharpened image, it is clear that the modern aerial image now also includes various no-longer-extant features, including the tobacco plants, which form the regularly spaced dots on the north, south, and western edges of the image. Trees are represented by the irregularly shaped small white blobs spread throughout the image. Paths are the irregular lines located throughout, and evidence of non-extant fences is also easier to see, particularly in the northwest corner, just south of the property line. An extant row of trees in the modern photograph suggests a since-removed row of fence, and the pan sharpened image puts the shadow of that fence back on the landscape.
LIDAR was the second tool we tested for use on this historic landscape. The LIDAR-derived data was downloaded from the Kentucky Geological Survey’s website, and contour lines were built using the X, Y, and Z values stored in the raster file. Depending on your data source and the size of the historic landscape you are documenting, it may be necessary, as it was in this case, to use the ArcGIS Mosaic To New Raster tool to assemble the different pieces that make up the digital elevation model. Alternatively, if you already have your digital elevation model, or you do not have to assemble it from multiple pieces, you can simply use the Add Data button and add it to your map.
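When DEM tiles align on a simple grid, the assembly that Mosaic To New Raster performs amounts to stitching the elevation arrays together, sketched here with hypothetical same-size tiles. The ArcGIS tool also reconciles cell sizes, coordinates, and overlaps, which this sketch ignores.

```python
import numpy as np

# Four hypothetical DEM tiles covering a 2x2 grid of adjacent quadrangles,
# each holding a flat elevation value for simplicity.
nw, ne = np.full((50, 50), 280.0), np.full((50, 50), 285.0)
sw, se = np.full((50, 50), 275.0), np.full((50, 50), 278.0)

# Stitch the aligned tiles into a single elevation grid.
dem = np.block([[nw, ne],
                [sw, se]])
```

The result is one continuous raster, which is what the contouring step that follows expects as input.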
Again, you will want to create a file geodatabase for your files. Using the Catalog tab on the right side of the screen, navigate to the connected folder that holds your digital elevation model, right click, scroll down, select New, and then click File Geodatabase. Once added, your digital elevation model will look something like this, though the colors will probably be different. You can change the colors by clicking the color ramp next to the name of the digital elevation model in the Table of Contents.
To create the contour lines on the digital elevation model, navigate to the Customize tab and select Extensions. Ensure that the boxes for 3D Analyst and Spatial Analyst are checked and click Close. Using the ArcToolbox tab, open the Surface toolset in the Spatial Analyst toolbox and double click the Contour tool. Your input data is your digital elevation model file. In the Output box, navigate to your file geodatabase and give the file a name (perhaps something with “contour” in it). Set the contour interval to two, though this may differ depending on your data. Ignore the optional boxes. Click OK. The contour map should appear on top of the digital elevation model. The second image shows a close-up view of a portion of the Overn property.
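The Contour tool draws an isoline at every multiple of the interval between the DEM’s minimum and maximum elevations. As a sketch of which levels a 2-unit interval produces (with hypothetical elevations; the tool itself also traces the line geometry, which is omitted here):

```python
import numpy as np

def contour_levels(dem, interval=2.0, base=0.0):
    """Elevation values at which the Contour tool will draw isolines."""
    zmin, zmax = float(dem.min()), float(dem.max())
    first = base + np.ceil((zmin - base) / interval) * interval
    return np.arange(first, zmax + 1e-9, interval)

# Hypothetical elevations in feet for a small DEM patch.
dem = np.array([[271.3, 274.8],
                [279.1, 283.6]])
levels = contour_levels(dem, interval=2.0)
```

A smaller interval picks up subtler terrain anomalies, such as the traces of abandoned roads and rock fences discussed below, at the cost of a busier map.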
You may find having the digital elevation model underneath the contour lines to be a bit distracting, not to mention more expensive to print, so you may want to consider turning off the digital elevation model before using the map in the field.
We were pleased with the ability of LIDAR to indicate the presence of long-abandoned rock fences and roads, despite dense tree canopy and overgrowth. Non-extant wire or wood fence locations were less visible; their indications in the LIDAR would likely have been considered insignificant without the supporting presence of existing trees that had grown up along the fence line. We did find that remnant foundations and other collapsed materials had visible signatures as small bumps in the landscape. One condition that did seem to stump LIDAR was a 42″ high rock fence against which a row of large, round hay bales had been stacked. The LIDAR was unable to penetrate the hay bales, and thus they obscured the presence of the wall.
To conclude, we hope you find this presentation to be helpful for your work and we would, once again, like to thank the National Center for Preservation Technology and Training for supporting this research.