Environmental Chemistry

The environmental chemistry discipline encompasses a number of related fields of chemistry and is complementary to other disciplines offered by I2M Consulting, such as environmental geology, hydrogeology, mining and remediation, toxicology, risk assessment, microbiology, and bioremediation. This discipline also covers corrosion studies and related failure analysis, as well as health and safety issues.

Environmental chemistry is essentially the science of identifying and measuring chemical species in the environment, whether natural or man-made. It also includes the study of the fate and effects of these chemical species in the environment. It includes such tasks as defining the intended use of analytical data, preparing sampling plans to satisfy that use, selecting appropriate analytical methods, advising on the collection of samples in the field, interpreting laboratory analytical results, and assuring the validity and legal defensibility of analytical results.

In determining fate and effects, it often involves evaluation of organic and inorganic chemical reactions as well as physical processes such as volatilization, co-solvency effects, and soil adsorption. The broad area of environmental chemistry encompasses a number of related fields, including analytical chemistry, chemical engineering, organic chemistry, data quality assurance, radiation chemistry, and inorganic chemistry.

Data Quality Objectives

Environmental chemistry includes such tasks as defining the intended use of analytical data, preparing sampling plans to satisfy that use, selecting appropriate analytical methods, collecting samples in the field, interpreting laboratory results, and evaluating the fate and effects of chemicals in the environment. Intended uses of the data include site characterization, compliance monitoring, determination of the extent of contamination, toxicological risk assessment, personnel monitoring, remediation alternative studies, and remediation verification. The sampling and analytical methods selected must satisfy the applicable state and federal regulations and are determined during the Data Quality Objective (DQO) process.

DQOs are qualitative and quantitative statements derived from the outputs of the first six steps of the DQO process. DQOs clarify the study objective, define the most appropriate type of data to collect, determine the most appropriate conditions from which to collect the data, and specify tolerable limits on the data used to make decisions. DQOs are not restatements of laboratory limits for precision and accuracy, but statements that result from the DQO process.

The seven steps of the DQO process are (a sketch recording their outputs follows the list):

  • State the problem – concisely describe the problem to be studied,
  • Identify the decision – identify what the study will resolve,
  • Identify the inputs to the decision – identify the information that is needed,
  • Define the boundaries of the study – specify time periods and spatial areas,
  • Develop a decision rule – define the statistical parameter of interest,
  • Specify tolerable limits on decision errors – define the decision maker’s error limits, and
  • Optimize the design for obtaining data – generate alternative data collection designs.
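
To make the outputs of these steps concrete, the following minimal Python sketch records them as a simple data structure. The study details (benzene, the 5 µg/L MCL, the well count) are illustrative assumptions, not drawn from any actual project.

```python
from dataclasses import dataclass

@dataclass
class DataQualityObjectives:
    """Outputs of the seven-step DQO process for one study."""
    problem_statement: str          # Step 1: the problem to be studied
    decision: str                   # Step 2: what the study will resolve
    inputs: list[str]               # Step 3: information that is needed
    boundaries: str                 # Step 4: time periods and spatial areas
    decision_rule: str              # Step 5: statistical parameter and action level
    error_limits: dict[str, float]  # Step 6: tolerable decision-error rates
    sampling_design: str            # Step 7: optimized data collection design

# Hypothetical groundwater compliance-monitoring example.
dqo = DataQualityObjectives(
    problem_statement="Is benzene in site groundwater above the MCL?",
    decision="Determine whether remediation is required",
    inputs=["benzene concentrations", "MCL of 5 ug/L", "well locations"],
    boundaries="Shallow aquifer beneath the site, quarterly for two years",
    decision_rule="If the mean benzene concentration exceeds 5 ug/L, remediate",
    error_limits={"false_acceptance": 0.05, "false_rejection": 0.10},
    sampling_design="Quarterly samples from eight monitoring wells",
)
```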

By following the DQO process, one can improve the effectiveness, efficiency, and defensibility of decisions in a resource-effective manner. Sample collection must be carried out in a manner that will not compromise the intended use. Interpretation of laboratory data will involve working with the users of the data (geologists, hydrogeologists, toxicologists, and environmental engineers) to determine the suitability of the data for their intended use. Fate and effects studies address those reactions that may occur due to physical and chemical processes in the environment and often involve evaluation of the role of organic and inorganic chemical reaction mechanisms as well as physical processes such as volatilization, water transport, and soil adsorption.
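
The effect of soil adsorption on transport can be illustrated with the standard linear-sorption retardation factor, R = 1 + (ρb/θ)Kd. The soil parameters in the sketch below are illustrative assumptions.

```python
# Retardation of a dissolved contaminant by linear soil adsorption:
#   R = 1 + (rho_b / theta) * Kd
# where rho_b is the soil bulk density, theta is the effective porosity,
# and Kd is the soil-water distribution coefficient (Cs / Cw).
# The parameter values are illustrative assumptions.

rho_b = 1.6   # bulk density, g/cm^3
theta = 0.30  # effective porosity, dimensionless
kd = 0.5      # distribution coefficient, mL/g

retardation = 1 + (rho_b / theta) * kd
print(f"Retardation factor R = {retardation:.2f}")
# R = 3.67: the solute front migrates about 3.7 times slower than the water.
```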

Data Quality Assessment

Data Quality Assessment (DQA) is a formal, rigorous scientific and statistical evaluation to determine if environmental data are of the right type, quality, and quantity to support their intended use. The process involves review of data quality objectives (DQOs), sampling purpose, sampling design, sampling methods, documentation, analytical procedures, validation procedures, data reduction procedures, review of database procedures, and review of statistical methods used for decision making. The five steps in the DQA process are:

  • Review the DQOs and Sampling Design,
  • Conduct a Preliminary Data Review,
  • Select the Statistical Test,
  • Verify the Assumptions of the Statistical Test, and
  • Draw Conclusions from the Data (steps 3 through 5 are sketched below).
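
Steps 3 through 5 can be illustrated with a brief statistical sketch. The concentrations and the 5 µg/L action level below are invented for illustration; scipy's Shapiro-Wilk and one-sample t-test routines stand in for whatever test the DQOs actually call for.

```python
import numpy as np
from scipy import stats

# Hypothetical benzene results (ug/L) from eight wells; 5 ug/L action level.
concentrations = np.array([3.1, 4.8, 6.2, 5.5, 4.1, 7.0, 5.9, 4.4])
action_level = 5.0

# Step 4: check the normality assumption behind the t-test.
_, shapiro_p = stats.shapiro(concentrations)

# Steps 3 and 5: one-sided, one-sample t-test of H0: mean <= action level.
t_stat, p_value = stats.ttest_1samp(concentrations, action_level,
                                    alternative="greater")

print(f"Shapiro-Wilk p = {shapiro_p:.3f} (p > 0.05 supports normality)")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Conclude that the mean exceeds the action level.")
else:
    print("Cannot conclude that the mean exceeds the action level.")
```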

DQA frequently requires data validation. Data validation is the process of verifying that the laboratory has complied with all the requirements (Quality Control checks) of the specified analytical method. Data qualification is the process of qualifying (flagging) data to reflect any failures to meet those requirements according to pre-established functional guidelines. The USEPA has published generic functional guidelines for flagging environmental analytical data, much of which is applicable to forensic data. The qualifications of the data are considered with respect to the intended use of the data to determine its suitability (or technical validity), which can range from complete acceptance through partial restriction to complete rejection. Often, data that have been rejected during the validation process can be "rescued." Data rescue involves techniques to salvage data that appear unsuitable upon completion of qualification.

Quality Assurance (QA) is the total, integrated process for assuring the defensibility and reliability of decisions based on chemical analysis data. The principal goal of the QA process is to provide the documentation necessary for a comprehensive user Data Quality Assessment. The DQA determines whether the data quality is adequate for the intended use and involves a four-step process: (1) data validation, (2) data qualification, (3) data rescue, and (4) data suitability determination.

Radioactive Materials

Radioactive materials encountered in the environment arise from numerous sources, both man-made and natural. Naturally occurring radioactive material (NORM) contains radioactive nuclides from the decay of uranium and thorium. NORM frequently forms as scale in pipes or monitoring wells: as insoluble calcium compounds form, radium and other decay products of uranium and thorium co-precipitate, and daughter products of radium, such as various isotopes of radon, polonium, bismuth, and lead, are deposited in the scale. NORM is often associated with petroleum production, phosphate and titanium dioxide mining and processing, and coal ash. Treatment and disposal of materials contaminated with NORM are strictly controlled by the NRC and the EPA.
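
The persistence of such material follows first-order radioactive decay, A(t) = A0·e^(−λt) with λ = ln 2 / t½. The sketch below applies this to Ra-226 (half-life of roughly 1,600 years); the initial activity is an illustrative assumption.

```python
import math

# First-order radioactive decay: A(t) = A0 * exp(-lambda * t),
# where lambda = ln(2) / half_life. Ra-226 half-life ~ 1600 years;
# the initial activity below is an illustrative assumption.

half_life_yr = 1600.0
decay_const = math.log(2) / half_life_yr  # per year

a0_bq = 100.0  # initial activity, Bq
t_yr = 100.0   # elapsed time, years

activity = a0_bq * math.exp(-decay_const * t_yr)
print(f"Activity after {t_yr:.0f} years: {activity:.1f} Bq")
# ~95.8 Bq: Ra-226-bearing scale remains radioactive for centuries.
```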

Radiation detectors are extremely sensitive, with counting efficiencies close to 100% for some types of radiation. This is possible because of the large amount of energy each decay deposits in the detector. Selection of the detector type is based primarily on the type (and hence the penetrating power) of the radiation. Alpha particles have little penetrating ability and are best detected by thin-window or windowless gas proportional counting or by liquid scintillation. Beta particles have limited penetrating ability and are best detected by liquid scintillation. Gamma rays have high penetrating ability and are best detected with semiconductor detectors such as Ge(Li) or HPGe.
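
The relationship between detected counts, counting efficiency, and activity is a simple rate calculation, sketched below; all of the numbers are illustrative assumptions.

```python
# Converting a gross count to activity using the detector counting
# efficiency. All values are illustrative assumptions, not measured data.

gross_counts = 12500     # counts observed in the sample measurement
background_counts = 400  # counts observed in a matched background run
count_time_s = 600       # live counting time, seconds
efficiency = 0.35        # counting efficiency, counts per decay

net_count_rate = (gross_counts - background_counts) / count_time_s  # cps
activity_bq = net_count_rate / efficiency  # decays per second (Bq)

print(f"Net count rate: {net_count_rate:.3f} cps")
print(f"Activity: {activity_bq:.2f} Bq")
```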

Sampling Program Design

The design of a sampling scheme and the selection of sampling methods is a multidisciplinary process that often requires input from environmental chemistry, environmental geology, environmental microbiology, and risk assessment. The purpose of sampling is to obtain a fraction of some lot of material that accurately represents the characteristics of the entire lot. The "lot" is what we are sampling; it may be a landfill, a tank car, a drum, sediments under a pond, or a jar in the analytical laboratory. In practice, it is not possible to obtain a perfectly representative sample of soils or sediments. Sampling procedures should therefore be designed to minimize the sampling error and to document an estimate of the overall error, which includes sampling error, sample handling error, and analytical error. Some sources of sampling error cannot be eliminated; however, with a proper understanding of sampling theory and a clear understanding of the purpose of the sampling, error can be minimized and documented.
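
One common convention, assuming the error components are independent, is to combine them in quadrature to estimate the overall error. The relative standard deviations in the sketch below are illustrative assumptions.

```python
import math

# Combining independent error components in quadrature. The relative
# standard deviations (RSDs) below are illustrative assumptions.

sampling_rsd = 0.20    # error from sampling design and field heterogeneity
handling_rsd = 0.05    # error from sample handling and transport
analytical_rsd = 0.10  # error from the laboratory analysis

overall_rsd = math.sqrt(sampling_rsd**2 + handling_rsd**2 + analytical_rsd**2)
print(f"Overall relative error: {overall_rsd:.1%}")
# About 23%. Sampling error dominates here, so tightening the analytical
# method alone would do little to improve overall data quality.
```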

Incorrect sampling design or incorrect sampling procedures are just as damaging to the defensibility of data as are incorrect analytical methods. Planning for sampling and assessment of sampling methods should be given the same consideration as analytical data validation.

Inorganic Modeling

Powerful modeling programs (such as the U.S. Geological Survey's PHREEQC) allow one to calculate speciation, saturation indices, reaction paths and advective transport, mixing of solutions, mineral and gas equilibria, surface-complexation reactions, ion-exchange reactions, and inverse models (which find sets of mineral and gas transfers that account for compositional differences between aquifers).
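
For example, the saturation index a speciation code reports for each mineral is SI = log10(IAP/Ksp). The sketch below computes it for calcite; the ion activities are illustrative assumptions, and the log Ksp of about -8.48 is a commonly tabulated 25 °C value.

```python
import math

# Saturation index SI = log10(IAP / Ksp) for calcite,
# where IAP = {Ca2+} * {CO3^2-}. The ion activities are illustrative
# assumptions; log Ksp ~ -8.48 is a commonly tabulated 25 C value.

activity_ca = 1.0e-3   # activity of Ca2+
activity_co3 = 1.0e-5  # activity of CO3^2-
log_ksp_calcite = -8.48

si = math.log10(activity_ca * activity_co3) - log_ksp_calcite
print(f"SI(calcite) = {si:.2f}")
# SI > 0: oversaturated (calcite tends to precipitate);
# SI < 0: undersaturated (calcite tends to dissolve).
```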

Although PHREEQC’s database contains a large amount of thermodynamic data, complete understanding of the chemical composition of the aquifer is required. Interpretation of the modeling results from such programs requires a thorough knowledge of chemical equilibrium and physical/chemical processes. The handling and misuse of analytical data are common subjects of litigation.

Data Validation

Historically, the term “validation” has been used to denote the systematic review of analytical data. In CERCLA (Superfund) programs, the data are compared with the USEPA Contract Laboratory Program (CLP) Inorganic/Organic Statements of Work. In a similar manner, analytical data generated under RCRA are compared to Test Methods for Evaluating Solid Waste, SW-846. In both instances, qualifiers (such as U, J, or R) are added to the data. These data qualifier flags provide data users with information about the quality of the data.
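
A minimal sketch of how such qualifiers might be applied is shown below. The rules and thresholds are simplified illustrations, not the actual USEPA functional-guideline logic.

```python
def qualify(result: float, detection_limit: float,
            blank_hit: bool, calibration_ok: bool) -> str:
    """Return a data-qualifier flag for one analytical result."""
    if not calibration_ok:
        return "R"  # rejected: a critical QC requirement failed
    if result < detection_limit:
        return "U"  # analyte undetected at the reported detection limit
    if blank_hit:
        return "J"  # detected, but the reported value is an estimate
    return ""       # accepted without qualification

print(qualify(12.0, 5.0, blank_hit=False, calibration_ok=True))  # ""
print(qualify(2.0, 5.0, blank_hit=False, calibration_ok=True))   # "U"
print(qualify(12.0, 5.0, blank_hit=True, calibration_ok=True))   # "J"
```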

The data validation process has been enormously successful in improving the quality of analytical data. The review process determines the quality of a data set but does not improve it; rather, it is an audit program that has pushed laboratories to improve their procedures and their adherence to method requirements. Validation is the standard for data collected under litigious or potentially litigious circumstances.

To review the technical literature on these technologies and techniques, search "chemistry" and related keywords in the I2M Web Portal.

Feel free to contact us to discuss your project needs or to arrange a speaking engagement by one of our Associates for a professional training session, a technical conference, society meeting, or for a graduation ceremony or other function where the knowledge and experience of our Associates may be of interest to your group.

The I2M Principal responsible for this discipline’s activities is:

Michael D. Campbell, P.G., P.H.