To Automate or Not to Automate?

By Drug Discovery Trends Editor | December 27, 2007

Beckman Coulter Biomek 3000

An automated method for spotting samples onto a MALDI plate was developed on Beckman Coulter’s Biomek 3000 Laboratory Automation Workstation. (Source: Beckman Coulter, Inc.)

Lab Automation Supplement

Knowing your needs—and what works—provides the answer for your proteomics lab.

Proteins are complex by nature, and analyzing and characterizing them can be even more complicated, involving techniques that are time-consuming and require both skill and experience. Hence, it’s no surprise that many proteomics researchers are tempted to get help from robots. “A lot of laboratories that want to get into proteomics and don’t have the expertise go out and buy a bunch of robots. They feel that it is going to simplify the analysis or help them get consistent results and that is oftentimes not the case,” says David Speicher, PhD, professor and chair of the Systems Biology Division and director of the Proteomics Laboratory at the Wistar Institute in Philadelphia, Pa.

Automation does seem like a very appealing option, especially for applications in clinical proteomics that involve routine analysis of hundreds of samples. “When you start working with clinical samples where you have a lot of interpersonal variation and you need to have large cohorts for doing discovery-based proteomics, then throughput really becomes an issue,” says David Friedman, PhD, a research assistant professor of Biochemistry and associate director of the Proteomics Laboratory and Mass Spectrometry Research Center at Vanderbilt University in Nashville, Tenn.

But many researchers turn to robotic solutions without putting much thought into whether they really need them and how best to use them. “In many cases people get into automation when they don’t need to, prematurely or with unrealistic expectations,” says Speicher. The investment typically pays off only when researchers have done their homework and are realistic in evaluating the present and future needs of their labs.

Automation Using Bioinformatics

Strictly speaking, automation relates to the hardware. However, improvements in instrument software and data analysis tools have led to automation in data handling and reporting. “A lot of what we do in proteomics is heavily dependent upon bioinformatics tools,” says Vanderbilt University’s David Friedman. For instance, all the database searching that follows a mass spectrometer run, to analyze and identify the spectral peaks, can now be done automatically using various commercial and in-house software solutions. “All this can be done manually, but it would take a tremendous amount of time and effort,” he says.
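As a rough illustration of this kind of hands-off data handling, the sketch below batches newly acquired runs through a command-line search engine. The tool name, file layout, and parameters are hypothetical placeholders, not any specific vendor’s pipeline.

```python
# Minimal sketch of automating post-acquisition database searching.
# The "search_engine" command, directories, and arguments are hypothetical
# placeholders, not a real vendor interface.
import subprocess
from pathlib import Path

RAW_DIR = Path("/data/ms_runs")            # where converted spectra land after each run
RESULTS_DIR = Path("/data/search_results")
FASTA = Path("/databases/human_swissprot.fasta")

def search_all_runs():
    """Submit every unprocessed run to the (hypothetical) search tool."""
    RESULTS_DIR.mkdir(parents=True, exist_ok=True)
    for mzml in sorted(RAW_DIR.glob("*.mzML")):
        out = RESULTS_DIR / (mzml.stem + ".pepXML")
        if out.exists():                   # skip runs that were already searched
            continue
        cmd = ["search_engine",            # placeholder CLI name
               "--spectra", str(mzml),
               "--database", str(FASTA),
               "--output", str(out)]
        print("searching", mzml.name)
        subprocess.run(cmd, check=True)

if __name__ == "__main__":
    search_all_runs()
```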

“Data analysis is clearly a big challenge because the data sets we work with have become much larger,” says David Speicher from the Wistar Institute. So even before the samples are analyzed, managing and tracking the samples and transferring various types of data throughout the analysis pipeline becomes essential. Most laboratories now use some sort of laboratory information management system (LIMS) to effectively track and store data, but they have their limitations. “The commercial systems cost a lot and don’t always offer all the functionalities that you are looking for,” says Speicher. So there is a lot of soul-searching and decision-making involved in finding the right tools for automating data management as well.
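For labs weighing a commercial LIMS against something home-grown, the sketch below shows what sample tracking boils down to at its simplest: an append-only log of which sample passed which stage, and when. The table layout and stage names are illustrative assumptions, not a description of any commercial system.

```python
# Minimal sketch of lightweight LIMS-style sample tracking with SQLite.
# Table layout and stage names are illustrative assumptions.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("proteomics_tracking.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS sample_events (
        sample_id TEXT NOT NULL,
        stage     TEXT NOT NULL,      -- e.g. received, digested, lc_ms, searched
        operator  TEXT,
        timestamp TEXT NOT NULL
    )
""")

def log_stage(sample_id: str, stage: str, operator: str = "robot"):
    """Record that a sample has completed a pipeline stage."""
    conn.execute(
        "INSERT INTO sample_events VALUES (?, ?, ?, ?)",
        (sample_id, stage, operator, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

def history(sample_id: str):
    """Return the full processing trail for one sample."""
    cur = conn.execute(
        "SELECT stage, operator, timestamp FROM sample_events "
        "WHERE sample_id = ? ORDER BY timestamp", (sample_id,))
    return cur.fetchall()

log_stage("PLASMA-0042", "received", operator="jdoe")
log_stage("PLASMA-0042", "digested")
print(history("PLASMA-0042"))
```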

Knowing what works
So when does one really need automation? Does it work well for all applications? What are the advantages and disadvantages of employing automated systems? “All processes can be automated in some way, shape, or form but it’s really the experiments that dictate its necessity and feasibility,” says Mike Pisano, PhD, chief scientific officer at NextGen PLC in Ann Arbor, Mich.

Most researchers hope that automating a process will help improve their throughput and lower the variability in their experimental protocols. In some cases it does. “If you are doing quantitative comparative analysis, then each experiment has to be reproducible and automation can help a lot there,” says Pisano. His company works extensively with clinical samples and uses a variety of proteomics tools for applications like biomarker discovery. Most of the processes in his lab are fully automated, and he finds the reproducibility and consistency that automation offers extremely valuable.

On the other hand, Richard Caprioli, PhD, at the Vanderbilt University School of Medicine, Nashville, has chosen to selectively automate only some of the experimental protocols in his laboratory. He has also met with remarkable success. Caprioli, a professor of Biochemistry and director of the Mass Spectrometry Research Center at Vanderbilt, is a pioneer in the field of matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS)-based imaging. One of the critical parts in the MALDI process is the application of the matrix, says Caprioli. “It is not that it is hard, but unless it’s done in a very reproducible way, you can get different results. We have found that using robotics has led to a very high reproducibility of the MALDI process.”

The other aspect where he has found automation to offer consistency is in picking the tissue sections for analysis. In the past, selecting the areas of a tissue section for analysis was a highly subjective process that varied from person to person. Automating it, so that a microscope takes a picture of the tissue sample and the analyst picks areas in exactly the same way each time using registration points on the slide, has proven very beneficial. “We found that automating the process gave us very high reproducibility in terms of going to exactly the area on the tissue that we wanted to,” says Caprioli.
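The registration-point approach Caprioli describes amounts to fitting a coordinate transform between the microscope image and the instrument stage. The sketch below fits an affine mapping from three fiducial points with NumPy; the coordinates are invented for illustration.

```python
# Minimal sketch of using slide registration points to map coordinates picked
# on a microscope image onto instrument stage coordinates. The fiducial points
# and the picked spot are invented for illustration.
import numpy as np

# Registration points: (x, y) in image pixels and the same fiducials in stage microns.
image_pts = np.array([[120.0,  95.0], [1850.0, 110.0], [130.0, 1400.0]])
stage_pts = np.array([[5000.0, 5000.0], [25000.0, 5100.0], [5100.0, 21000.0]])

# Fit an affine transform  stage = [x, y, 1] @ A  by least squares.
ones = np.ones((image_pts.shape[0], 1))
A, *_ = np.linalg.lstsq(np.hstack([image_pts, ones]), stage_pts, rcond=None)

def image_to_stage(xy):
    """Convert picked image coordinates to stage coordinates."""
    xy = np.asarray(xy, dtype=float)
    return np.hstack([xy, [1.0]]) @ A

# A spot the analyst picked on the tissue image, reproducibly mapped to the stage.
print(image_to_stage([640.0, 480.0]))
```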

Where automation seems to have worked best in proteomics is for MS. “The technology [MS] is at a point now where it is quite reliable and that then leads to the ability to automate it,” says Caprioli. “It doesn’t take a human being to tune and refine it for every single experiment.” For techniques like LC-MS, the use of autosamplers for moving and injecting hundreds of samples sequentially into the instrument has proven helpful and is now routine in most labs.

Protein Forest ProteomicChip 

Automation can also be achieved by working at a micro-scale. The digital ProteomeChip (dPC) separates proteins by isoelectric focusing at 0.05-pH-unit resolution in just 30-45 minutes. It is a high-resolution device for separating proteins for global and targeted proteomics, used for mass spectrometry sample preparation and for pI Western blot analysis. (Source: Protein Forest Inc.)
 

Weighing the pros and cons
However, automation does have its limitations. Friedman has found a definite trade-off between speed and quality. “With automation you do get a small hit in sensitivity,” he says. Speicher has also found that, most often, automated methods tend to use larger volumes as compared to manual methods.

“There are relatively few places where automation works well,” says Speicher. His lab works a lot with 2D-gel electrophoresis and automating 2D-gels can sometimes be very challenging. However, there are some parts of the gel electrophoresis process that are easier to automate than others, such as in-gel digestions. In-gel digests require many manipulative steps that can be easily automated using robotic liquid handlers. This task, if done manually, can give slightly better results, but it does take a very long time.
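To give a feel for why plate-based in-gel digestion lends itself to liquid handlers, the sketch below generates a simple reagent-transfer worklist for a 96-well plate. The volumes, reagent names, and CSV layout are illustrative assumptions, not a specific robot’s worklist format.

```python
# Minimal sketch of generating a liquid-handler worklist for a plate of in-gel
# digests. Reagents, volumes, and file layout are illustrative assumptions.
import csv
import string

STEPS = [("wash_buffer", 100), ("reduction_dtt", 50),
         ("alkylation_iaa", 50), ("trypsin", 25)]      # (reagent, microliters)

def plate_wells(rows=8, cols=12):
    """Well names A1..H12 for a standard 96-well plate."""
    return [f"{r}{c}" for r in string.ascii_uppercase[:rows] for c in range(1, cols + 1)]

with open("digest_worklist.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["well", "reagent", "volume_ul"])
    for reagent, volume in STEPS:          # one full pass of the plate per reagent
        for well in plate_wells():
            writer.writerow([well, reagent, volume])

print("wrote", len(STEPS) * 96, "transfer steps")
```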

Having a process in automated mode also raises the question of reliability. Is the robot going to perform reliably all the time? “Even if it performs well 99% of the time, there will be a couple of instances when things go wrong,” says Speicher. A column can get clogged in the middle of a run leading to a loss of time or sample or both. Or a column can degrade over time and the deterioration in the quality of information gathered may go unnoticed.

Labcyte Portrait 630 reagent multi-spotter

The Portrait 630 reagent multi-spotter uses non-contact acoustic droplet ejection to deposit MALDI matrix and other reagents onto tissue sections for proteomics and small molecule analysis by MALDI imaging mass spectrometry. (Source: Labcyte Inc.)
 

“The counter-argument is, will a person doing it manually do any better,” says Speicher. “At least the hope or expectation is that if a person working with a sample manually makes a mistake, they have a chance to catch it, whereas you are blind to what a robot has done.”

Most modern mass spectrometers are equipped to perform data-dependent acquisition. The instrument in this case makes the decision on what to isolate and analyze depending on the ions that are detected during the run. “This saves a tremendous amount of time and increases the throughput, but the downside is: what if there is a certain peptide ion in there that is of interest? You now have to go back and sift through the data and re-analyze it,” says Friedman. “If it were being done slowly and manually, you can look at the data and then decide how you want to proceed next.”
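The selection logic behind data-dependent acquisition can be summarized in a few lines: pick the most intense precursor ions from each survey scan and skip anything fragmented too recently (dynamic exclusion). The sketch below is a minimal illustration with invented parameters, not any instrument’s actual acquisition method.

```python
# Minimal sketch of top-N data-dependent acquisition with dynamic exclusion.
# Parameters and survey-scan data are illustrative, not a real instrument method.
TOP_N = 3
EXCLUSION_WINDOW = 30.0   # seconds
MZ_TOLERANCE = 0.01       # m/z units

recently_fragmented = {}  # m/z -> time it was last selected

def select_precursors(survey_scan, scan_time):
    """survey_scan: list of (mz, intensity) pairs from one MS1 scan."""
    # Drop exclusion entries that have expired.
    for mz in [m for m, t in recently_fragmented.items()
               if scan_time - t > EXCLUSION_WINDOW]:
        del recently_fragmented[mz]

    chosen = []
    for mz, intensity in sorted(survey_scan, key=lambda p: -p[1]):
        if len(chosen) == TOP_N:
            break
        if any(abs(mz - seen) < MZ_TOLERANCE for seen in recently_fragmented):
            continue                        # fragmented recently: skip this ion
        chosen.append(mz)
        recently_fragmented[mz] = scan_time
    return chosen

# Two consecutive survey scans: the second re-picks only ions not excluded.
print(select_precursors([(445.12, 9e5), (512.30, 7e5), (623.45, 3e5)], scan_time=10.0))
print(select_precursors([(445.12, 8e5), (512.30, 6e5), (700.10, 2e5)], scan_time=15.0))
```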

Automation Advice

  • If you are not running a lot of samples you probably don’t need to automate.
  • There is always a trade-off between speed and sensitivity.
  • Automated methods tend to use larger volumes than manual methods.
  • If it’s not reproducible then it’s not usable, especially for quantitative analysis.
  • Run internal standards intermittently to check that the instruments are operating correctly (a minimal drift check is sketched after this list).
  • Put in place standard operating protocols and ways to calibrate the robot and other ancillary equipment.
  • Keep track of the various sources of technical errors.
  • Set up manual interventions at key points in the process to catch errors instantaneously.
  • Don’t rely entirely on the vendor’s viewpoint since things operate differently on a smaller scale.
  • Pay attention to details.
  • The decision on what equipment to invest in should be driven by the application for which it is sought.
  • Buy the best piece of equipment for the specific job you want done. Sometimes dedicated systems work better than multi-purpose workstations.
  • You can build robots for specific needs rather than buying off-the-shelf equipment.
  • Check out semi-automated options.
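The advice above to run internal standards intermittently can be reduced to a simple control-chart check on each QC injection. The sketch below is a minimal version, assuming a three-standard-deviation acceptance band around the running baseline; the example intensities are invented.

```python
# Minimal sketch of flagging instrument drift from intermittent internal-standard
# injections. The acceptance band and example intensities are assumptions.
from statistics import mean, stdev

def check_internal_standard(history, new_value, n_sigma=3.0, min_points=5):
    """Return True if the new internal-standard reading is within limits."""
    if len(history) < min_points:
        return True                      # not enough baseline yet to judge
    mu, sigma = mean(history), stdev(history)
    return abs(new_value - mu) <= n_sigma * sigma

readings = [1.02e6, 0.98e6, 1.05e6, 1.01e6, 0.99e6]   # earlier QC injections
for latest in (1.03e6, 0.55e6):                        # the second should fail
    ok = check_internal_standard(readings, latest)
    print(f"{latest:.2e}", "OK" if ok else "OUT OF CONTROL - pause the queue")
    if ok:
        readings.append(latest)
```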

“It’s easy to automate, but there are times when things will go wrong,” admits Pisano.

While Pisano has had success operating proteomic protocols in his lab in a fully automated fashion, he agrees that human intervention cannot be completely eliminated. He recommends a manual intervention between key steps in the process so that, if there is a fault with the robot, there is a stop point before everything is lost. This intervention is important at both the experimental and the data-analysis stages. “The database searching and data-crunching are done automatically, but we also go through it manually to check the data,” says Pisano. “We monitor the mass spectrometers all the time, either from the lab or remotely.”
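The stop-point idea Pisano describes can be as simple as pausing the pipeline for an operator sign-off between stages. The sketch below illustrates the pattern with placeholder step names; it is not a description of his lab’s actual system.

```python
# Minimal sketch of inserting manual stop points between automated steps:
# the pipeline pauses for a human sign-off before committing the next
# (potentially sample-consuming) stage. Step names are placeholders.
def require_signoff(step_name):
    answer = input(f"Checkpoint after '{step_name}': continue? [y/N] ")
    if answer.strip().lower() != "y":
        raise SystemExit(f"Run halted by operator after '{step_name}'.")

def run_pipeline(steps):
    """steps: list of (name, callable) pairs; pause for review between them."""
    for name, step in steps:
        print(f"running {name} ...")
        step()
        require_signoff(name)

# Example with placeholder steps; real steps would drive the robot or search engine.
run_pipeline([
    ("in-gel digestion", lambda: None),
    ("LC-MS acquisition", lambda: None),
    ("database search",   lambda: None),
])
```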

Choosing what works best for you
It does not always have to be all-manual or all-automated. “One of the things that we have done as an alternative to using automation is to go the semi-automated route where you are working with 96-well plates and 8-channel pipettors to get a fairly large number of samples processed,” says Speicher.

The needs of the end-user and the demands of the application should ultimately drive the decision of whether or not to automate. “Analyze your needs on a continuing, long-term basis and determine the level of performance you want, before choosing your method of analysis,” advises Speicher. Also scrutinize the time required for the downstream analysis. “If it is going to take you weeks to months to analyze your samples, then it doesn’t matter if it takes you a couple of days to do those digestions,” he says.

Once the decision to automate is made, the various technologies have to be carefully evaluated before investments are made. “No one particular instrument does everything,” says Caprioli. Users should focus on the analytical endpoint they want to reach and then choose the right instrument, he says. Since robotics and analytical instruments are sold by different vendors, integrating the various robotic devices can be an issue. “Even though each vendor is very good, these instruments don’t necessarily talk easily with one another,” says Caprioli. “It’s awkward to sometimes get them functioning as an integral unit.” Caprioli suggests that one way this integration can be accomplished is by formatting instrument output in a standard manner rather than in proprietary formats. “The industry is moving towards that, but we are not there yet.”

About the Author
Tanuja Koppal, PhD, is a freelance writer specializing in life science and pharmaceutical topics.

This article was published in the Lab Automation supplement to Drug Discovery & Development and Bioscience Technology magazines, November 2007, pp. 10-14.

