The most widely used method to quantify RNA is traditional UV spectroscopy. A diluted RNA sample is quantified by measuring its absorbance at 260 nm and 280 nm. The concentration is calculated using the equation:

[RNA] (µg/ml) = A260 × dilution factor × 40
where 40 is the standard conversion factor for RNA: one A260 unit corresponds to approximately 40 µg/ml of single-stranded RNA.
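The calculation above is straightforward to script. A minimal sketch (the function name and example readings are illustrative, not from the original protocol):

```python
def rna_concentration(a260, dilution_factor, conversion=40.0):
    """Return RNA concentration in ug/ml from an A260 reading.

    conversion: 40 ug/ml per A260 unit, the standard factor for RNA.
    """
    return a260 * dilution_factor * conversion

# Example: A260 = 0.05 measured on a 1:100 dilution
# 0.05 * 100 * 40 = 200 ug/ml
print(rna_concentration(0.05, 100))
```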

In addition, the A260/A280 ratio can be used to estimate RNA purity. An A260/A280 ratio between 1.8 and 2.1 indicates a highly pure RNA sample.
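The purity check can be expressed the same way. A small sketch, assuming the 1.8–2.1 acceptance window stated above (function names are hypothetical):

```python
def purity_ratio(a260, a280):
    """A260/A280 ratio; ~1.8-2.1 indicates highly pure RNA."""
    return a260 / a280

def is_pure(a260, a280, low=1.8, high=2.1):
    """True if the A260/A280 ratio falls within the accepted window."""
    return low <= purity_ratio(a260, a280) <= high
```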

UV spectroscopy is relatively simple to perform but has several drawbacks. It does not discriminate between RNA and DNA, so it is advisable to DNase-treat RNA samples before quantifying; DNA in the sample will lead to an overestimation of RNA concentration. Proteins and residual phenol from the purification can also interfere with absorbance readings, so it is important to remove these contaminants during purification. In addition, absorbance readings depend on pH and ionic strength. Dilute RNA samples in TE (pH 8.0) and use TE to blank the spectrophotometer before taking absorbance readings.

An alternative method for quantifying RNA samples is to use fluorescent dyes such as RiboGreen (Invitrogen). RiboGreen exhibits a strong fluorescent signal when bound to nucleic acids. Samples are quantified in a fluorescence microplate reader or standard fluorometer relative to a nucleic acid standard curve of known concentration. The linear range of quantification with RiboGreen spans three orders of magnitude, from 1 ng/ml up to 1 µg/ml. The major advantage of fluorescent dyes over absorbance-based methods is that they are not affected by contaminating proteins or organic solvents carried over from the purification process. DNase treatment is still recommended, as RiboGreen does not discriminate between RNA and DNA.