Abstract

Over the last few years, there has been a dramatic increase in the use of cDNA microarrays to monitor gene expression changes in biological systems. Data from these experiments are usually transformed into expression ratios between experimental samples and a common reference sample for subsequent data analysis. The accuracy of this critical transformation depends on two major parameters: the signal intensities and the normalization of experimental vs. reference signal intensities. Here we describe and validate a new model for microarray signal intensity that includes one multiplicative variation component and one additive background variation component. Using replicate experiments and simulated data, we found that signal intensity is the most critical parameter influencing normalization performance, ratio-estimate accuracy, and the reproducibility, specificity, and sensitivity of microarray experiments. Therefore, we developed a statistical procedure to flag spots with weak signal intensity based on the standard deviation (δ_ij) of background differences between a spot and its neighboring spots, i.e., a spot is considered too weak if its signal is weaker than c·δ_ij. Our studies suggest that normalization and ratio estimates are unacceptable when this threshold c is small. We further showed that when a reasonable compromise value of c (c = 6) is applied, normalization using the trimmed mean of log ratios performed slightly better than normalization by global intensity or by the mean of ratios. These studies suggest that reducing background noise is critical to improving the quality of microarray experiments.
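The abstract describes two computational steps: flagging spots whose background-subtracted signal falls below c·δ_ij, and normalizing by the trimmed mean of log ratios. The sketch below is a minimal, hypothetical Python illustration of these two steps; the function names, the neighbor-background array layout, the background-subtraction step, and the 5% trimming fraction are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def flag_weak_spots(signal, background, neighbor_background, c=6.0):
    """Flag spots weaker than c * delta_ij, where delta_ij is the standard
    deviation of background differences between a spot and its neighbors.
    neighbor_background is assumed to hold one row of neighbor backgrounds
    per spot (hypothetical layout)."""
    # Background differences between each spot and each of its neighbors.
    diffs = neighbor_background - background[:, None]
    # delta_ij: per-spot standard deviation of those differences.
    delta = diffs.std(axis=1, ddof=1)
    # Spot is "too weak" if its background-subtracted signal < c * delta_ij.
    return (signal - background) < c * delta

def trimmed_mean_log_ratio_normalization(exp_signal, ref_signal, trim=0.05):
    """Center log2(experimental/reference) ratios by subtracting their
    trimmed mean (a sketch of trimmed-mean-of-log-ratios normalization)."""
    log_ratios = np.log2(exp_signal / ref_signal)
    lo, hi = np.quantile(log_ratios, [trim, 1 - trim])
    trimmed = log_ratios[(log_ratios >= lo) & (log_ratios <= hi)]
    return log_ratios - trimmed.mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 1000
    exp_sig = rng.lognormal(mean=8, sigma=1, size=n)
    ref_sig = rng.lognormal(mean=8, sigma=1, size=n)
    bkg = rng.normal(200, 20, size=n)
    nbr_bkg = rng.normal(200, 20, size=(n, 8))  # 8 neighboring spots per spot
    weak = flag_weak_spots(exp_sig, bkg, nbr_bkg, c=6.0)
    norm = trimmed_mean_log_ratio_normalization(exp_sig[~weak], ref_sig[~weak])
    print(f"flagged {weak.sum()} weak spots; median normalized log2 ratio = "
          f"{np.median(norm):.3f}")
```

In this sketch, raising c flags more spots as weak (trading sensitivity for reliability), which mirrors the compromise at c = 6 discussed in the abstract.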

  • Publication date: 2001-10-10