Quality Assurance Associates

PERFORMANCE BASED MEASUREMENT SYSTEMS (PBMS)

Donald A. Flory and Taryn G. Scholz

COMPLIANCE vs. PERFORMANCE

Quality assurance and quality control requirements for EPA regulatory analysis methods have been a mixture of compliance and performance criteria since they were first promulgated. These requirements are both procedural and numerical. Compliance criteria are mandated and often involve step-by-step procedures and the establishment of numerical control limits for calibration data. Procedural compliance criteria include such items as the number and concentrations of calibration standards, the length of time allowed between calibrations, the use of various quality control samples, the relative abundances of mass spectrometer tune ions, and the specific gas chromatographic conditions. Examples of numerical compliance control limits include calibration linearity, expressed as the relative standard deviation (RSD) of the relative response factors (RRFs), and minimum RRFs for all or some target analytes. There is considerable variation in these compliance criteria across the various EPA-accepted methods, and there have also been considerable differences in these criteria over the history of individual methods. Whether a result is "within compliance" therefore often depends on the specific analytical method employed and the date the analysis was performed, which is, of course, a ridiculous situation that is impossible to defend on any logical basis. Performance-based criteria, by contrast, are not procedural; they define the detection limit, accuracy, and precision that must be achieved for a particular analysis.
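The RSD-of-RRFs linearity limit mentioned above can be illustrated with a short calculation. The calibration levels, peak areas, and the 15% limit below are hypothetical values chosen for illustration, not figures from any particular EPA method:

```python
import statistics

def rrf(area_analyte, conc_analyte, area_is, conc_is):
    """Relative response factor of a target analyte versus its internal standard."""
    return (area_analyte * conc_is) / (area_is * conc_analyte)

# Hypothetical five-point initial calibration (concentrations in ppb).
levels = [10, 20, 50, 100, 200]                      # analyte concentrations
analyte_areas = [4900, 10100, 24800, 50500, 98000]   # analyte peak areas
is_areas = [25000, 25200, 24900, 25100, 24800]       # internal standard areas
IS_CONC = 50.0                                       # internal standard at 50 ppb

rrfs = [rrf(a, c, i, IS_CONC) for a, c, i in zip(analyte_areas, levels, is_areas)]
mean_rrf = statistics.mean(rrfs)
rsd = 100.0 * statistics.stdev(rrfs) / mean_rrf      # percent RSD

# A method might require, say, RSD <= 15% before the average RRF
# may be used for quantitation (illustrative limit only).
print(f"mean RRF = {mean_rrf:.3f}, RSD = {rsd:.1f}%")
```

A low RSD means the analyte's response is effectively linear over the calibration range, so a single average RRF can be used for quantitation.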

Performance-based criteria consist of numerical control limits for specific quality control samples and QC tests for target analytes, such as method detection limit studies, blank contamination, matrix spike/matrix spike duplicate recoveries and precision, laboratory control sample recoveries and precision, surrogate recoveries in samples, internal standard recoveries in samples, and target analyte recoveries in performance evaluation samples. In summary, compliance criteria tell the analyst how to do the analysis, while performance criteria define the detection limit, accuracy, and precision that must be achieved.
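Two of the performance measures listed above, matrix spike recovery (accuracy) and relative percent difference between duplicates (precision), reduce to simple arithmetic. The MS/MSD results below are hypothetical numbers for illustration only:

```python
def percent_recovery(spiked_result, native_result, spike_added):
    """Matrix spike recovery: fraction of the added analyte actually measured."""
    return 100.0 * (spiked_result - native_result) / spike_added

def rpd(x1, x2):
    """Relative percent difference between duplicate results (precision)."""
    return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

# Hypothetical matrix spike / matrix spike duplicate (MS/MSD) results in ppb.
native = 12.0        # analyte already present in the unspiked sample
spike_added = 50.0   # amount of analyte spiked into each aliquot
ms_result = 58.5     # matrix spike result
msd_result = 61.0    # matrix spike duplicate result

ms_rec = percent_recovery(ms_result, native, spike_added)
msd_rec = percent_recovery(msd_result, native, spike_added)
precision = rpd(ms_result, msd_result)

print(f"MS recovery = {ms_rec:.0f}%, MSD recovery = {msd_rec:.0f}%, "
      f"RPD = {precision:.1f}%")
```

The recoveries are compared against a method's accuracy control limits and the RPD against its precision limit; results outside those limits would be flagged.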

There has always been a small group of advocates of a strictly performance-based method of quality control, but they have only begun to prevail in the environmental field within the last few years. The major criticisms of the compliance criteria approach to QC are that it stifles the creativity of chemists, virtually prohibits the application of new instrumental techniques, has resulted in methods for identical analytes that have considerably different compliance criteria, and is extremely costly due to the inefficiency associated with antiquated methods. Simply put, the performance-based philosophy states that if a chemist can get the right answer for samples, then it makes no difference how it is done. The quality of the data is then assessed by continuous analysis and reporting of performance evaluation samples and the other performance criteria mentioned above.

GCMS TUNING

Mass spectrometer tuning is not verified in the PBMS approach now being used by the EPA Office of Solid Waste (OSW) for RCRA sites. An understanding of the tuning process helps explain why this approach is acceptable. The first mass spectrometers to gain wide usage were magnetic sector instruments, which use a magnetic field to achieve mass separation. Magnetic instruments were used to obtain most of the reference spectra that were entered into mass spectral library collections (such as the NIH library). These library spectra were published in hard copy and were used by mass spectrometrists in the manual interpretation of unknown compound spectra. The magnetic-sector spectra eventually made their way into electronic libraries and began to be used by computerized spectrum-matching software to identify unknown mass spectra. This procedure is called a forward library search, because the spectrum of each GC peak to be identified is searched forward against the mass spectral library.

A new type of mass spectrometer, the quadrupole mass analyzer, came into popular use in the early 1970s and has become the predominant mass analyzer in the environmental industry. This analyzer achieves mass separation with a combination of radio-frequency (RF) and DC voltages applied to a set of four parallel metal rods. When the quadrupole instruments came into use, it was noted that the spectra they produced could differ from those obtained with a magnetic instrument. Tuning procedures were therefore adopted for the quadrupole instruments to force them to produce magnetic-like spectra. It was later learned that the differences between magnetic and quadrupole spectra did not significantly affect the outcome of qualitative identifications made with forward library searching algorithms.

Tuning is a procedure that adjusts the mass analyzer operating conditions so that a specified mass spectrum is obtained for a reference compound introduced into the mass analyzer. Tuning can affect GCMS analysis in three ways: (1) qualitative identification (selectivity) of unknowns by mass spectral matching (i.e., the relative abundances of ions in the mass spectrum), (2) quantitation of sample concentrations (i.e., the relative response factor), and (3) sensitivity (the absolute response factor). We noted above that the algorithms used in forward searching are not greatly affected by the differences between magnetic and quadrupole spectra, and therefore would not be affected by similar differences caused by tuning. Possible errors in identification are further ameliorated by the reverse search procedure used in all EPA methods for target analytes. These methods use spectra generated by the instrument itself from the analysis of daily standards containing all the target analytes, and qualitative identification is done by reverse search against those spectra. This effectively corrects for any difference in mass spectra due to tuning (or the lack thereof); the successful analysis of MS/MSD and LCS/LCSD pairs confirms that this is true. Concerning quantitation, any change in the relative response factor (RRF) due to tuning differences between the initial calibration and the daily calibration will be reflected in, and can be evaluated by, the %D test. Likewise, any change in sensitivity will be reflected in, and can be evaluated by, the minimum RRF test. Finally, all potential errors due to tuning differences can be automatically corrected by using the daily standard for quantitation, although this may decrease the accuracy of reported values outside the 25-100 ppb range (the daily standard is 50 ppb) if linearity is a problem (as reflected in the RSD test).
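The %D test mentioned above compares the daily (continuing) calibration RRF with the mean RRF from the initial calibration, so any drift in tune or response between the two calibrations shows up directly in this number. The RRF values and the 20% limit below are hypothetical illustrations, not figures from any particular method:

```python
def percent_difference(daily_rrf, mean_initial_rrf):
    """%D between a daily calibration RRF and the mean RRF
    from the initial calibration."""
    return 100.0 * abs(daily_rrf - mean_initial_rrf) / mean_initial_rrf

# Hypothetical values: the daily RRF has drifted slightly below the
# mean RRF established during the initial calibration.
mean_initial_rrf = 0.994
daily_rrf = 0.930

pd = percent_difference(daily_rrf, mean_initial_rrf)
# A method might flag the analyte if, say, %D > 20 (illustrative limit only).
print(f"%D = {pd:.1f}")
```

Because %D is computed per analyte from the same daily standard used for quantitation, it captures tuning-related RRF changes without any separate tune verification.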

COMPLIANCE vs. PERFORMANCE BASED DATA QUALITY REVIEW

QAA believes that data quality review should be performance-based and has therefore developed a Standard Operating Procedure (SOP) for data quality review (DQR) based on performance. The DQR described in this SOP is performance-based in that it does not include checking procedural requirements, and it employs a set of universal data quality criteria rather than the specific criteria set forth in any one analytical method. Thus, analysis results will be flagged consistently regardless of the analytical method employed or the date the analysis was performed. Additionally, the data quality criteria and flagging procedures are defined such that a particular flag can easily be associated with a level of bias: no flag means the result is within the accuracy and precision typically achieved by a laboratory for a nominal sample, and as the level of flagging progresses, the bias associated with the result becomes more severe. This approach facilitates a usability determination by the data user. Although the DQR provides information on the quality of the data, usability can only be determined with a complete understanding of the intended use of the data.

 


Quality Assurance Associates
Last Update: June 20, 2005