The GC/MS methods, 8260B and 8270C, refer to relative retention time (RRT) windows of 0.06 units. Please clarify exactly what that means. As I understand it, the RRT is the ratio of the RT of the target to the RT of the internal standard. How can that ratio have units?

Your understanding of relative retention time (RRT) is essentially correct: it is the ratio of the retention time of the analyte to the retention time of the internal standard. Since both retention times carry the same units (whatever the lab chooses to use), the units cancel, and RRT is therefore a unitless quantity.

Beginning with the earliest EPA GC/MS methods, EPA has provided an acceptance limit for the agreement of the RRT between the sample and the standard. The language has always read very much like that in Sec. 18.104.22.168 of Method 8260B (PDF, 86 pp, 444K), namely:

"The relative retention time (RRT) of the sample component is within +/- 0.06 RRT units of the RRT of the standard component."

All one is doing here is comparing the RRT of the peak in the sample analysis to the RRT of the corresponding peak in the calibration standard; the two values must be within 0.06 of one another. For example, if the RRT of compound A in the calibration standard is 0.98, then the RRT of the peak in the sample analysis that you want to identify as compound A must fall within the range 0.92 to 1.04. If the peak in the sample run is outside that window, you may not identify it as compound A, because the RRT does not agree well enough.

We believe the intent of the original wording, "0.06 RRT units," was to keep readers from treating the acceptance limit of 0.06 as a simple 6% relative window. A 6% window would be slightly narrower: 6% of a standard RRT of 0.98 is 0.0588, which is less than 0.06.
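The ratio and the window check described above can be sketched in a few lines of Python. This is only an illustration; the function names and the retention-time values are made up for the example and are not taken from any EPA method text:

```python
def rrt(rt_analyte: float, rt_internal_std: float) -> float:
    """Relative retention time: a unitless ratio, since both retention
    times are expressed in the same units and those units cancel."""
    return rt_analyte / rt_internal_std

def within_rrt_window(rrt_sample: float, rrt_standard: float,
                      limit: float = 0.06) -> bool:
    """Acceptance check: the sample RRT must be within +/- 0.06 RRT
    units (an absolute window, not a percentage) of the standard RRT."""
    return abs(rrt_sample - rrt_standard) <= limit

# Example from the text: compound A has an RRT of 0.98 in the
# calibration standard, so a sample peak qualifies only if its RRT
# falls between 0.92 and 1.04. The 9.8 and 10.0 minute values below
# are hypothetical retention times chosen to give that ratio.
rrt_standard = rrt(9.8, 10.0)                  # 0.98, unitless

print(within_rrt_window(0.95, rrt_standard))   # True: inside 0.92-1.04
print(within_rrt_window(1.05, rrt_standard))   # False: outside the window

# The fixed 0.06-unit window is slightly wider than a 6% relative
# window: 6% of 0.98 is about 0.0588, which is less than 0.06.
print(0.06 * rrt_standard)
```

Note that the check uses an absolute tolerance (`abs(...) <= 0.06`) rather than a relative one, which is exactly the distinction the "0.06 RRT units" wording is meant to preserve.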