Bruce Seligmann, PhD
Obstacles to measuring gene expression have hindered the impact of genomics on drug discovery and the life sciences. Among these are the need to extract and amplify RNA, the difficulty and unreliability of measuring gene expression from fixed tissue or whole blood, poor reproducibility between sample measurements, and poor day-to-day repeatability. Performance has not come close to that of biochemical enzyme and receptor assays, which have therefore remained the mainstay of modern drug discovery. These limitations hinder the exploitation of the polymerase chain reaction (PCR) for drug discovery and limit the power of whole-genome high-density array methods to identify new targets and biomarkers.
Fixed tissues represent a tremendous sample source for target and diagnostic biomarker identification and validation. The level of gene expression is essentially fixed in time in such specimens, and fixing tissues for pathology and archiving is standard clinical practice. However, the RNA is crosslinked and degraded by cuts in the sequence, making quantitative recovery and measurement problematic, as evidenced by quantitative differences between matched frozen and fixed tissue results and by the low yield and poor quality of RNA from fixed tissue. This led to the practice of collecting frozen tissues for gene expression studies.
The reproducibility and repeatability of PCR have also limited its usefulness. Modern drug discovery, and medicine in general, is all about dose-response effects and comparative differences, not only in efficacy at a single target between analogs but also in efficacy, selectivity, and safety across targets for a single compound. Biochemistry spawned the measurement of EC50 values (the characteristic concentration of a compound or stimulus producing a half-maximal effect), as well as modern medicinal chemistry and the concept of quantitative structure-activity relationship (QSAR) computational analysis and data. To construct a QSAR database, measurements made one day must be quantitatively identical to those made years later or in other laboratories. To measure a dose-response curve and calculate useful EC50 values, the variability of an assay between biological replicates must be <20%, yet most find gene expression differences measured by PCR that are less than 1.5-fold (a 50% increase) to be relatively unreliable, and the cut-off for whole-genome methods is typically twofold (a 100% increase). The failure of gene expression methods to date is evident simply by scanning the literature: tens of thousands of dose-response curves and EC50 values have been published for enzyme activity and receptor or protein binding assays, but practically none for gene expression.
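To make the EC50 concept concrete, the sketch below fits a four-parameter logistic (Hill) model to a dose-response series and reports the estimated EC50. The data points and the SciPy-based fitting routine are illustrative assumptions only; the article does not prescribe a particular curve-fitting method.

```python
# Minimal sketch: estimating an EC50 by fitting a four-parameter logistic
# (Hill) model to dose-response data. The doses and responses below are
# hypothetical values used purely for illustration.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ec50, hill):
    """Four-parameter logistic: response as a function of concentration."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill)

# Hypothetical normalized expression readout at each dose (uM)
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
resp = np.array([1.02, 1.10, 1.45, 2.30, 3.60, 4.20, 4.35])

# Initial guesses: bottom, top, EC50, Hill slope
p0 = [resp.min(), resp.max(), 0.3, 1.0]
params, _ = curve_fit(four_pl, conc, resp, p0=p0)
print(f"Estimated EC50 = {params[2]:.2f} uM")
```

At the fitted EC50 the model returns exactly the midpoint between the bottom and top plateaus, which is the half-maximal effect referred to above.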
Alternative assays are based on a lysis-only multiplexed nuclease protection assay protocol with high sample throughput that requires no extraction or gene amplification and measures whole sets (or signatures) of genes, not just one at a time. Just as enabling is the assay's reproducibility, which provides coefficients of variation of 2%. This performance is as good as it gets with any whole-cell biochemical assay. Retrospective studies using archived fixed tissues are enabled, including dose-response studies, as are novel diagnostic assays. Genes responsive to a stimulus or a drug can be clustered according to their EC50 values, reflecting differences in molecular mechanisms related to efficacy, nonspecific effects, and safety; hence, analogs can be quantitatively compared and these parameters optimized using a single assay across in vitro cell systems and in animals or humans.
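As an illustration of what a coefficient of variation of 2% means in practice, the short sketch below computes the CV across a set of replicate signals; the replicate values are invented for demonstration and are not data from the assay described here.

```python
# Minimal sketch: coefficient of variation (CV) across biological replicates,
# the metric cited above for assay reproducibility. Values are hypothetical.
import numpy as np

replicates = np.array([1010.0, 995.0, 1020.0, 988.0])  # hypothetical signals
cv_percent = 100.0 * replicates.std(ddof=1) / replicates.mean()
print(f"CV = {cv_percent:.1f}%")  # low CVs are what make reliable EC50 fits possible
```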
“Biochemical quality” gene expression assays and data enable better validation of targets and biomarkers, high-throughput screening (HTS), QSAR-based lead profiling, and lead optimization. Companies can now pursue “fast track” programs to identify second-generation drugs based on defining mode of action and side effects of first-generation drugs using clinical samples. The identification of novel clinical candidates can then be pursued through HTS and lead optimization using a complete “signature” of gene effects without ever having to develop an assay for the proteins involved.
About the Author
Seligmann is the founder, chairman, and chief scientific officer for High Throughput Genomics Inc., Tucson, Ariz.
This article was published in G & P magazine, Vol. 6, No. 1, January/February 2006, p. 32.