GIS Applications in Hazardous Waste Remediation

R.A. Smith, W.T. Dudley, and T.L. Rutherford

Black & Veatch, 6601 College Blvd, Overland Park, KS 66213, Phone: (913) 458-6606, FAX: (913) 458-6645

Abstract

The environmental consulting and remediation market was an estimated $12.5 billion in 1994 (Farkas 1995). A key project activity that impacts remediation decisions, cost recovery, and cleanup confirmation is data collection, analysis, and reporting. Hardware and software improvements in the personal computing industry have enabled more people to utilize powerful data analysis and display tools such as geographic information systems (GIS), 2-dimensional geostatistics (contouring), and 3-dimensional presentations of contaminant dispersion. Visualization and modeling tools help portray site characteristics and are used to predict future conditions. Multiple data types are combined to provide full-featured images, including aerial photographs, utilities, parcel tracts, contaminant contours, land use and zoning, soil types, and digital elevations.

Successful use of environmental software depends on ready access to accurate data in digital form. An electronic information management system (EIMS) is an efficient tool to manage environmental data and significantly reduce project costs. Black & Veatch has developed an EIMS, EnviroEDGE, to load and store data from laboratory electronic submittals, perform various data manipulations and maintenance, generate tabular data summaries, and provide data to environmental analysis software.

Keywords: information management, GIS, graphic display

Introduction

An electronic information management system (EIMS) can be an efficient tool to manage environmental data and significantly reduce project costs. According to The Environmental Business Report, "Data management and analysis will become the environmental manager's most important tool", and, "74% of environmental managers are actively using software to help manage the environmental function" (1995). The benefits of an EIMS over traditional manual data management methods include increased accuracy, reduced rework, and expanded data analysis capabilities.

Technology advancements in the last 10 to 15 years have enabled information to be moved from bulky and expensive mainframe systems to more accessible and cost-effective PCs or workstations. Furthermore, advancements in software development have created improved data analysis tools. A full suite of data management software typically consists of a database, a spreadsheet/statistical package, graphic generators, and other data analysis tools such as geographic information systems. A centralized relational database system in a client/server environment offers many benefits over traditional spreadsheets or word processing tables.

Data typically collected for contaminated sites include geologic information, such as lithology and soil characteristics; hydrogeologic information, such as hydraulic conductivities and specific yields; and chemical information on the types and concentrations of contaminants.

It is a difficult task for scientists and engineers to easily and accurately depict site conditions to a broad audience. Interested parties typically include the general public, government agencies, and private industry. Voluminous reports often contain scientific jargon not easily understood by decision makers and affected members of the public. At the same time, oversimplification of conclusions or estimations often leads to misconceptions and the need for further explanation for readers who understand the technical nuances.

Studies with large data sets, such as remedial investigations at federal military installations, often contain over 100,000 analytical results and may take hundreds of hours to manually organize and examine. An EIMS can automate labor-intensive tasks: loading and storage of laboratory analytical results, data retrieval (queries), generation of tabular and graphical summaries, and display of interpreted results in a map view. Automating key data management tasks can reduce labor costs by 20 to 50 percent over manual data evaluation techniques. One such example is the case study of a Texas refinery discussed later in this paper.
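The kind of automated tabular summary described above can be sketched in a few lines of modern scripting code. The record layout and names here are illustrative only, not EnviroEDGE's actual format:

```python
def summarize(results):
    """Roll a flat list of (sample_id, analyte, concentration) records
    into per-analyte detection counts and maximum concentrations --
    the kind of tabular summary otherwise compiled by hand."""
    table = {}
    for _sample_id, analyte, conc in results:
        count, cmax = table.get(analyte, (0, float("-inf")))
        table[analyte] = (count + 1, max(cmax, conc))
    return table

# Example: three hypothetical results from two monitoring wells.
summary = summarize([
    ("MW-01", "benzene", 120.0),
    ("MW-02", "benzene", 3.2),
    ("MW-01", "toluene", 7.5),
])
```

On a data set of 100,000 results, a pass like this runs in seconds, which is the source of the labor savings described above.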

Data Analysis and Presentation

The primary goals of conducting remedial investigations or site characterizations are to determine what contaminants, if any, exist in the surface or subsurface environment; to determine the nature and extent of contaminants; to reasonably depict physical and chemical surface or subsurface conditions; to identify health risks associated with contaminants; and to provide estimates of the fate and transport of these substances. Visual graphics such as GIS provide an excellent method for showing the types, locations, and concentrations of substances to a varied audience. The process of creating 2- or 3-dimensional graphics is part science and part artistry. As with all scientific work, conclusions must be clearly backed by valid data, the results must be reproducible, and documentation must exist to show that generally accepted scientific methods were utilized. Computer data analysis and display tools require the same, if not greater, levels of care and scientific documentation. Artistic approaches are employed when the author wishes to emphasize a point or draw attention to a particular feature.

Figure 1 shows a 3-dimensional interpretation of benzene in soil. Base map features are included on the top to relate contamination to topographic features. Figure 2 shows the placement of horizontal wells in the same area as Figure 1, but a large portion of the block is cut away to show the wells’ placement relative to contamination. The primary focus of Figure 3 is to show RDX dispersion in soil where two wastewater lagoons received ordnance-tainted "pink water". The surface depressions identify the lagoons, and the sampling grid, a primary feature in the majority of project drawings, is overlain to provide perspective.

Figure 4 provides a good example of how GIS can be utilized to demonstrate areas of contamination relative to geographic features, and of how various forms of digital information can be combined. The figure shows mercury dispersion in river sediments located in an industrial area. The georeferenced images and wastewater pipelines (colored light blue) were obtained from the City of Seattle’s Engineering GIS group as ARC/INFO coverages. The navigation channel was digitized in AutoCAD, the contours were generated in Surfer for Windows, and the sample points were obtained from an Access database.
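Contour lines like those generated in Surfer are interpolated from scattered sample points onto a regular grid. As a minimal sketch of one common gridding technique (inverse-distance weighting; Surfer's actual algorithms and defaults differ), an estimate at a query point can be computed as:

```python
def idw(points, xq, yq, power=2.0):
    """Inverse-distance-weighted estimate at (xq, yq) from a list of
    (x, y, value) sample points. Closer samples receive larger weights;
    `power` controls how quickly influence falls off with distance."""
    num = den = 0.0
    for x, y, value in points:
        d2 = (x - xq) ** 2 + (y - yq) ** 2
        if d2 == 0.0:
            return value  # query coincides with a sample point
        weight = d2 ** (-power / 2.0)
        num += weight * value
        den += weight
    return num / den

# Two hypothetical samples; the midpoint estimate is their average.
samples = [(0.0, 0.0, 10.0), (10.0, 0.0, 20.0)]
estimate = idw(samples, 5.0, 0.0)
```

Evaluating this function over every node of a regular grid produces the surface from which contour lines are drawn.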

The effort and cost required to produce these types of images can vary greatly. There are powerful workstations with complex data analysis software, as well as simple but useful desktop contouring packages. Prices for environmental software range from $250 to $50,000 for a single license. Prior to making a purchase, a needs assessment should be performed to find the right software fit as it relates to needs, costs, and usability. Black & Veatch utilizes ESRI’s ARC/INFO and Intergraph’s Environmental Resource Management Application (ERMA) for extremely large, complex projects with large data sets, such as watershed studies. However, the majority of projects can be accomplished with PCs and desktop software. To encourage usage, the software must be easily operated by most engineers, scientists, and even non-technical professionals. Black & Veatch selected the following software to meet our day-to-day needs based on price, ease of use, and interoperability with other applications:

All of these packages can read and write AutoCAD drawing exchange files (DXF), allowing existing CAD resources to be reused. The total cost for this software suite was approximately $6,500 retail [GMS being the largest expense].

Computer visualization limitations related to hardware and software are decreasing rapidly. Unfortunately, new limitations arise as the old ones diminish. The new obstacles are related to the information collection, storage, and management processes.

Data Collection

The timing of the decision to use color graphics on a contaminated site project is crucial. If the decision is made after the planned data collection effort is over, specific data types required by the software may be missing, or existing data may not be in the right format. Additional data collection and data conversions can add unexpected and unwanted costs to the project. If data collection is implemented with the knowledge that computer analysis will be involved, the collection process becomes more efficient; sometimes efficient enough to collect more information than was anticipated.

At least one member of the project team must understand basic concepts of cartography and the data’s spatial relationships. This person must be included early in the project, ideally at the scoping stage, and should ensure that the right information is collected in the proper format. As the availability of digital data increases, particularly through the Internet, and data sharing between government and industry grows, data users will benefit from greater choices at lower costs.

Information Management

The ability to economically and successfully utilize graphic software relies on the data management process. All modeling and graphic generators require digital input, either through direct connections to databases or individual ASCII text files.

Data management techniques vary but the ultimate goal is always the same: to store information so that it can be retrieved quickly, accurately, and in a useful format by all data users. During the 1980s, data management for contaminated sites usually meant the majority of site information was stored in hardcopy format, with limited use of spreadsheets for calculations and reporting. Until recently, "tradition" meant files full of drawings, reports, field log books, boring and well logs, calculations, and boxes of analytical reports. It is a difficult task to manage all of this paperwork and information so that nothing is ever lost and everything remains accessible.

Manual manipulation and storage of sample data was often accompanied by transcription errors and instances where not all of the data was used. Accessibility is another common problem associated with manual management. For example, can all individuals easily find and access hardcopy or electronic data simultaneously? Who has the most up-to-date data set? When using graphic generators such as GIS, a source file must be created from the main data set. Source files are typically ASCII text files with multiple rows of X, Y, and Z values. If one typographical error is introduced, such as a misplaced decimal, the interpretation and image can be greatly skewed.
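A simple automated check on a source file can catch the kinds of errors described above before they reach the contouring package. The sketch below (file layout and range limits are illustrative assumptions) parses whitespace-delimited X, Y, Z rows and flags malformed lines and out-of-range Z values, such as a misplaced decimal turning 12.5 into 1250.0:

```python
def parse_xyz(lines, z_range=None):
    """Parse ASCII X, Y, Z rows. Returns (points, problems), where
    `problems` is a list of (line_number, description) tuples for rows
    that are malformed or outside the expected Z range."""
    points, problems = [], []
    for n, line in enumerate(lines, start=1):
        fields = line.split()
        if len(fields) != 3:
            problems.append((n, "expected 3 fields"))
            continue
        try:
            x, y, z = (float(f) for f in fields)
        except ValueError:
            problems.append((n, "non-numeric value"))
            continue
        if z_range and not (z_range[0] <= z <= z_range[1]):
            problems.append((n, "Z out of expected range"))
            continue
        points.append((x, y, z))
    return points, problems

# Hypothetical file contents with three typical errors.
rows = [
    "100.0 200.0 12.5",     # valid
    "101.0 201.0",          # missing Z
    "102.0 202.0 abc",      # non-numeric Z
    "103.0 203.0 1250.0",   # misplaced decimal
]
points, problems = parse_xyz(rows, z_range=(0.0, 100.0))
```

Rejected rows are reported by line number so they can be corrected against the main data set rather than silently skewing the image.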

A client/server relational database provides the most accurate and accessible means for storing and retrieving site information, particularly analytical results. The primary challenges with a database approach are flexibility without complexity and ease of use. Black & Veatch designed an environmental database application to store, manage, and manipulate environmental data. The database serves as a central warehouse for all sample, analytical, and hydrogeologic information. Frequently used printouts or reports are automated and several features exist to easily obtain information from the database for use in GIS, contouring, statistical analysis, and report tables.
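A central warehouse of this kind can be illustrated with a minimal two-table layout: sample locations in one table and analytical results in another, joined by sample identifier. The schema, table names, and values below are hypothetical, not EnviroEDGE's actual design; SQLite stands in for the client/server database:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE samples (
    sample_id   TEXT PRIMARY KEY,
    location    TEXT,
    x REAL, y REAL,
    sample_date TEXT);
CREATE TABLE results (
    sample_id     TEXT REFERENCES samples(sample_id),
    analyte       TEXT,
    concentration REAL,
    units         TEXT,
    qualifier     TEXT);
""")
db.execute("INSERT INTO samples VALUES "
           "('MW-01', 'north lagoon', 512300.0, 91450.0, '1996-03-14')")
db.execute("INSERT INTO results VALUES "
           "('MW-01', 'benzene', 120.0, 'ug/L', NULL)")

# A single query produces the X, Y, Z source data for a contouring
# or GIS package, straight from the central data set.
xyz = db.execute("""
    SELECT s.x, s.y, r.concentration
    FROM samples s JOIN results r ON r.sample_id = s.sample_id
    WHERE r.analyte = 'benzene'""").fetchall()
```

Because every export is drawn from the same tables, all data users query one authoritative, up-to-date data set instead of copies of copies.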

Case Study: Clark Refining & Marketing, Inc. Environmental Due Diligence

Black & Veatch was contracted by Clark Refining & Marketing, Inc. (Clark) to perform an Environmental Due Diligence (EDD) at the Port Arthur Refinery, as part of a sales agreement with Chevron U.S.A. Products Company (Chevron). The primary focus of the EDD was to characterize the Refinery through sampling and provide remediation alternatives and costs.

The Refinery was built on manmade land using dredge material and construction debris as fill material. This facility, once the largest refinery in the world (approximately 3,800 acres), has been in operation since 1902. The products manufactured at the Refinery throughout its history include petroleum products and byproducts, such as gasoline, jet fuel, kerosene, diesel fuel, heating oil, motor oils, industrial oils, gear lubricants, grease, coke, and sulfur; and petrochemicals, such as ethylene, propylene, cyclohexane, benzene, cumene, and butadiene.

Objectives

The overall objective of the EDD was to establish a baseline for environmental contamination at the Refinery (Black & Veatch 1996). Specifically, subsurface and surface soil, sediment, and groundwater were investigated to determine the amount and type of contamination. Additionally, the presence of phase-separated hydrocarbons was documented and included in the final report.

Sampling Scheme

The large size of the Refinery and the many different waste streams that are and have been generated required the collection of over 2,500 environmental samples. Before collecting samples, however, careful planning was necessary to select the areas that historically presented the largest potential environmental concerns. Representatives from Clark with knowledge of each area were interviewed concerning the location of potential contamination. Sampling locations were established within these potential areas of contamination and at points where visible signs of contamination were observed.

Soil Investigation

Approximately 300 monitoring wells were installed during the EDD. Borings were drilled using eight-inch O.D. hollow stem augers and sampled using a split spoon. Over 700 geoprobe locations were drilled and sampled using a geoprobe continuous-barrel soil sampler with disposable acetate liners. The most contaminated soil sample from each boring was submitted for laboratory analysis based on visual observations of contamination and field readings from a photoionizing detector.

Over 700 surface soil samples were collected from within 2 feet of the surface. Eighty sediment samples were collected within 3 feet of the upper surface of the sediment.

Soil samples were analyzed using contract laboratory program (CLP) methodologies for target compound list (TCL) volatile and semivolatile organics; TCL pesticides/polychlorinated biphenyls (PCBs); target analyte list (TAL) metals; and total petroleum hydrocarbons (TPH).

Water Investigation

Approximately 500 groundwater samples were collected from new and existing monitoring wells. Over 100 groundwater samples were collected from geoprobes. The groundwater samples were collected using disposable bailers.

Groundwater samples were analyzed using CLP methodologies for TCL volatile and semivolatile organics; TAL metals, TPH, and total dissolved solids. For approximately ten percent of the groundwater samples, water quality parameters were also analyzed, including total suspended solids, alkalinity, sulfide, sulfate, and nitrate.

Analytical Data Management

Due to the large number of samples, two laboratories were used to perform the sample analyses. At the time of sample collection, samples were subdivided into sample designation groups (SDG). Each SDG consisted of 20 soil or 20 water samples. For quality control purposes, one field duplicate, a matrix spike (MS), and a matrix spike duplicate (MSD) were collected and analyzed for each SDG.
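With over 2,500 samples grouped into SDGs, completeness of each QC set is easily verified by machine before the data are loaded. The function below is a minimal sketch under assumed, illustrative type codes (FIELD, DUP, MS, MSD), not an actual EnviroEDGE routine:

```python
def check_sdg(samples):
    """Verify an SDG contains 20 field samples plus the required QC
    set: one field duplicate, one matrix spike (MS), and one matrix
    spike duplicate (MSD). `samples` is a list of (sample_id,
    sample_type) tuples; the type codes are illustrative."""
    counts = {}
    for _sample_id, stype in samples:
        counts[stype] = counts.get(stype, 0) + 1
    issues = []
    if counts.get("FIELD", 0) != 20:
        issues.append("expected 20 field samples, found %d"
                      % counts.get("FIELD", 0))
    for qc in ("DUP", "MS", "MSD"):
        if counts.get(qc, 0) != 1:
            issues.append("expected 1 %s, found %d"
                          % (qc, counts.get(qc, 0)))
    return issues

# A complete hypothetical SDG: 20 field samples plus the QC set.
complete = [("S%02d" % i, "FIELD") for i in range(1, 21)]
complete += [("S01-D", "DUP"), ("S02-MS", "MS"), ("S02-MSD", "MSD")]
```

An empty issue list means the SDG may proceed to loading; anything else is flagged back to the field team or laboratory.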

Upon completion of the requested analyses for each SDG, the laboratories submitted a hard copy and an electronic version of the analyses to Black & Veatch and Heartland Environmental Services, Inc. (Heartland). Heartland performed laboratory data validation services for the project, which included determining the usability of the results by checking instrument calibrations, trip blanks, method blanks, holding times, percent recovery of spike surrogates, and relative percent difference. A hard copy of Heartland’s report, containing qualifiers indicating data usability, was then submitted to Black & Veatch. In addition, Heartland made the necessary adjustments by adding qualifiers to the analytical data in electronic spreadsheets, then uploaded the corrected spreadsheets to Black & Veatch via FTP. Once the validated data were received from Heartland, Black & Veatch checked them for the correct format and uploaded them into EnviroEDGE.
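Two of the validation checks named above, percent recovery and relative percent difference (RPD), are simple calculations; the sketch below shows the standard formulas (acceptance limits, which are method-specific, are deliberately omitted):

```python
def percent_recovery(measured, spiked, native=0.0):
    """Spike recovery: recovered amount over the amount spiked,
    expressed as a percent. `native` is the pre-spike concentration."""
    return 100.0 * (measured - native) / spiked

def relative_percent_difference(a, b):
    """RPD between a sample and its duplicate: the absolute difference
    over the mean of the two results, expressed as a percent."""
    return 100.0 * abs(a - b) / ((a + b) / 2.0)

# Hypothetical values: 95 units recovered from a 100-unit spike,
# and a sample/duplicate pair of 10.0 and 12.5.
rec = percent_recovery(95.0, 100.0)
rpd = relative_percent_difference(10.0, 12.5)
```

Results falling outside the method's acceptance limits are the ones that receive usability qualifiers in the validated data set.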

Results of Investigation

EnviroEDGE was used to generate tables and maps that were used as the basis for conclusions and recommendations of the sampling investigation. Numerous tables were generated through EnviroEDGE and included in the final report. The key tables generated were:

Remediation Alternatives

Based on the results generated from EnviroEDGE, B&V was able to calculate the areas of contamination for the Refinery and generate remedial action alternatives. The alternative chosen for remediation was the Corrective Action Management Unit.

Conclusion

The key management tool used on the EDD project for Clark Refining & Marketing, Inc. was EnviroEDGE. Because of the large amount of data (over 90 megabytes of analytical data), it was critical to the success of the project. It was essential to set up EnviroEDGE prior to project commencement, which included having the laboratories provide the analytical data in the EnviroEDGE format. In conclusion, EnviroEDGE:

References

Black & Veatch Waste Science, Inc., Summary Investigation Environmental Due Diligence, Port Arthur Refinery, Prepared for Clark Refining & Marketing, Inc., Black & Veatch, 1996.

The BTI Consulting Group, Inc., "How Do Your Customers Spell Success?", The Environmental Business Report, Vol. 2, Number 9, 1995, pp. 2 - 8.

Farkas, A.L. 1995, "Overview of the Environmental Cleanup Industry," Presented at the Hazardous Waste Action Coalition’s 10th Anniversary Annual Meeting, Washington, D.C., June 12, 1995.

Figure 1. Benzene concentration in soil.

Figure 2. Geologic cross-section.

Figure 3. Ordnance pink water lagoons.

Figure 4. Mercury in sediments.