To illustrate the importance of a data quality assessment, assume a small risk assessment is to be performed using GC/MS analytical data for semivolatile organics. The soil samples are collected and analyzed, and the data package is received from the laboratory. The only flags applied by the laboratory are J's and E's for sample results below or above the calibration range, respectively, and B's for analytes found in a blank, a standard industry practice. In the QA narrative, the only QC failures identified by the laboratory are a few low surrogate recoveries. Upon environmental data validation by a third-party data quality assessor, numerous internal standard response failures are found. This QC check was not included in the laboratory's QA review, another standard industry practice.
For a semivolatile GC/MS analysis, six internal standards are used to ensure the GC/MS can properly locate, identify, and quantify the surrogates and target analytes in the sample matrix. According to the EPA's National Functional Guidelines (NFG), a sample with a single extremely low or multiple low internal standard responses must have all detects qualified as estimated (J) and all non-detects qualified as rejected (R). A sample with a single low or high internal standard response must have all detects and non-detects qualified as estimated (J). Of the approximately 100 analyses performed, 35 fail the internal standard response check, with 25 having extremely low or multiple low responses. A suitability determination indicates that the data for detects can be used, assuming it may be off by a maximum factor of five. The non-detect data for those 25 samples are not usable as reported by the laboratory. At about $350 a sample, that's $8,750 in analysis costs, plus sampling costs that may well double the amount.
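The qualification logic described above can be sketched as a small decision function. This is only an illustrative rendering of the NFG rules as stated here; the response-ratio thresholds (50%/200% of the calibration response, with 25% taken as "extremely low"), the function name, and the data layout are assumptions, not taken from the Guidelines themselves.

```python
# Illustrative sketch of the NFG internal-standard qualification rules
# described in the text. All numeric cutoffs are assumed for the example.

def qualify_sample(is_responses, lower=0.5, upper=2.0, extreme_low=0.25):
    """Return (detect_flag, nondetect_flag) for one sample.

    is_responses: response ratios (sample IS area / calibration IS area)
    for the six internal standards. A ratio below `lower` is "low",
    above `upper` is "high", and below `extreme_low` is "extremely low".
    """
    low = [r for r in is_responses if r < lower]
    high = [r for r in is_responses if r > upper]
    extremely_low = [r for r in is_responses if r < extreme_low]

    if extremely_low or len(low) > 1:
        # Single extremely low or multiple low responses:
        # detects estimated (J), non-detects rejected (R).
        return ("J", "R")
    if low or high:
        # Single low or high response: all results estimated (J).
        return ("J", "J")
    return (None, None)  # all internal standards in range, no qualification

# One IS at 20% of its calibration response: detects J, non-detects R
print(qualify_sample([0.2, 0.9, 1.0, 1.1, 0.8, 1.0]))  # ('J', 'R')
```

In a real validation the check is applied per internal standard to the analytes quantified against it, but the sample-level form above matches how the rule is summarized in the text.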
Examining the chromatograms for the affected samples, an experienced data quality assessor recognizes that the low internal standard responses are due to background interference from residual petroleum-based contamination. The assessor also recognizes that the detection limits reported by the laboratory for non-detects have been taken directly from the analytical method and apply to clean water samples with good internal standard response. As the NFG suggests, these detection limits are not valid for a high-organic-content sample with low internal standard response. However, it is possible to calculate detection limits that are specific to a particular sample based on the level of background noise. This procedure, which is described in all dioxin/furan analytical methods, is used to determine valid detection limits for each of the affected samples and thus rescue 25% of the analytical data.
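A simplified version of that sample-specific detection limit calculation can be sketched as follows. The dioxin/furan methods compute an estimated detection limit (EDL) as the analyte amount equivalent to a multiple of the background noise, quantified against the internal standard; the 2.5x noise factor, parameter names, and units below are illustrative assumptions, not a reproduction of any one method's equation.

```python
def sample_specific_edl(noise_height, is_area, is_amount_ng, rrf,
                        sample_mass_g, noise_factor=2.5):
    """Sample-specific estimated detection limit (ng/g) for one analyte.

    Sketch of the approach described in dioxin/furan methods: convert
    `noise_factor` times the baseline noise at the analyte's retention
    time into an analyte mass via the internal standard, then divide by
    the sample mass. All names and the 2.5x factor are illustrative.

    noise_height : baseline noise at the analyte retention time
    is_area      : internal standard response (same units as noise)
    is_amount_ng : nanograms of internal standard spiked into the sample
    rrf          : analyte relative response factor versus the IS
    sample_mass_g: mass of sample extracted, in grams
    """
    # Analyte mass (ng) whose response would equal noise_factor * noise
    analyte_ng = noise_factor * noise_height * is_amount_ng / (is_area * rrf)
    return analyte_ng / sample_mass_g

# Heavy background noise against a weak IS response raises the EDL,
# which is exactly why the method-default limits were invalid here.
edl = sample_specific_edl(noise_height=1000, is_area=50000,
                          is_amount_ng=10, rrf=1.0, sample_mass_g=10)
print(edl)  # 0.05 ng/g
```

The key point the sketch captures is that the reported limit scales with the measured noise and the actual internal standard response in that sample, rather than assuming the clean-matrix values built into the method defaults.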
Additionally, an experienced data quality assessor may suggest procedural changes to improve future analyses. One such change is to require a minimum internal standard response in the daily calibration standard. Instituting this requirement, along with establishing regular communication with the laboratory, has been shown to improve completeness for the internal standard check from the 65% seen above to 90%.
It is easy to imagine the consequences of using this data without the benefit of an environmental data validation or qualification to identify the associated QC problems. Without a suitability determination and data rescue, the user would be stuck with analytical data that could not be used and would probably get more of the same from future sampling events. In contrast, a thorough data quality assessment provides legally defensible, technically valid data for the project. And even for a scenario this complex, a data quality assessment runs only about 15-20% of analytical costs, or 7-10% of the total sampling and analysis cost, an expense more than justified by averting disastrous decisions based on invalid data and by savings in re-sampling and re-analysis costs.