Lab Automation Supplement
Knowing your needs—and what works—provides the answer for your proteomics lab.
Proteins are complex by nature, and analyzing and characterizing them can be even more complicated, involving techniques that are time-consuming and demand both skill and experience. Hence, it’s no surprise that many proteomics researchers are tempted to get help from robots. “A lot of laboratories that want to get into proteomics and don’t have the expertise go out and buy a bunch of robots. They feel that it is going to simplify the analysis or help them get consistent results, and that is oftentimes not the case,” says David Speicher, PhD, professor and chair of the Systems Biology Division and director of the Proteomics Laboratory at the Wistar Institute in Philadelphia, Pa.
Automation does seem like a very appealing option, especially for applications in clinical proteomics that involve routine analysis of hundreds of samples. “When you start working with clinical samples where you have a lot of interpersonal variation and you need to have large cohorts for doing discovery-based proteomics, then throughput really becomes an issue,” says David Friedman, PhD, a research assistant professor of Biochemistry and associate director of the Proteomics Laboratory and Mass Spectrometry Research Center at Vanderbilt University in Nashville, Tenn.
But many researchers turn to robotic solutions without putting much thought into whether they really need them and how best they can be used. “In many cases people get into automation when they don’t need to, prematurely or with unrealistic expectations,” says Speicher. The investment often pays off only when researchers have done their homework and are realistic in evaluating the present and future needs of their labs.
Knowing what works
So when does one really need automation? Does it work well for all applications? What are the advantages and disadvantages of employing automated systems? “All processes can be automated in some way, shape, or form but it’s really the experiments that dictate its necessity and feasibility,” says Mike Pisano, PhD, chief scientific officer at NextGen PLC in Ann Arbor, Mich.
Most researchers hope that automating a process will improve their throughput and lower the variability in their experimental protocols, and in some cases it does. “If you are doing quantitative comparative analysis, then each experiment has to be reproducible and automation can help a lot there,” says Pisano. His company works extensively with clinical samples and uses a variety of proteomics tools for applications like biomarker discovery. Most of the processes in his lab are fully automated, and he finds the reproducibility and consistency that automation offers extremely valuable.
On the other hand, Richard Caprioli, PhD, at the Vanderbilt University School of Medicine, Nashville, has chosen to selectively automate only some of the experimental protocols in his laboratory, and has also met with remarkable success. Caprioli, a professor of Biochemistry and director of the Mass Spectrometry Research Center at Vanderbilt, is a pioneer in the field of matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS)-based imaging. One of the critical steps in the MALDI process is the application of the matrix, says Caprioli. “It is not that it is hard, but unless it’s done in a very reproducible way, you can get different results. We have found that using robotics has led to a very high reproducibility of the MALDI process.”
The other area where he has found automation to offer consistency is in picking the tissue sections for analysis. In the past, selecting the area of a tissue section to pick was highly subjective and varied from person to person. Automating the process, so that a microscope photographs the tissue sample and the analyst uses registration points on the slide to pick areas in exactly the same way each time, has proven very beneficial. “We found that automating the process gave us very high reproducibility in terms of going to exactly the area on the tissue that we wanted to,” says Caprioli.
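Registration points make this repeatability concrete: a few fiducial marks visible both in the microscope image and on the instrument stage pin down a coordinate transform, so any area picked on screen maps to the same physical spot every time. As a minimal sketch of the idea (the function names and coordinate values are illustrative, not taken from any instrument’s software):

```python
import numpy as np

def fit_affine(image_pts, stage_pts):
    """Least-squares affine transform mapping image (x, y) points
    to stage coordinates, from three or more registration points."""
    A = np.hstack([image_pts, np.ones((len(image_pts), 1))])  # add bias column
    # Solve A @ M ~= stage_pts for the 3x2 transform matrix M
    M, *_ = np.linalg.lstsq(A, stage_pts, rcond=None)
    return M

def to_stage(M, pt):
    """Map one picked image point to stage coordinates."""
    return np.array([pt[0], pt[1], 1.0]) @ M

# Three fiducial marks located both in the image and on the stage
image_pts = np.array([[100.0, 120.0], [900.0, 110.0], [480.0, 850.0]])
stage_pts = np.array([[2.10, 3.05], [10.2, 3.00], [5.90, 10.4]])
M = fit_affine(image_pts, stage_pts)
print(to_stage(M, (500.0, 400.0)))  # stage position for a picked pixel
```

Because the transform is refit from the slide’s own registration points each time, the mapping stays consistent even if the slide is seated slightly differently from run to run.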
Where automation seems to have worked best in proteomics is for MS. “The technology [MS] is at a point now where it is quite reliable and that then leads to the ability to automate it,” says Caprioli. “It doesn’t take a human being to tune and refine it for every single experiment.” For techniques like LC-MS, the use of autosamplers for moving and injecting hundreds of samples sequentially into the instrument has proven helpful and is now routine in most labs.
Weighing the pros and cons
However, automation does have its limitations. Friedman has found a definite trade-off between speed and quality. “With automation you do get a small hit in sensitivity,” he says. Speicher has also found that automated methods most often tend to use larger volumes than manual methods.
“There are relatively few places where automation works well,” says Speicher. His lab works extensively with 2D-gel electrophoresis, and automating 2D-gels can be very challenging. Some parts of the gel electrophoresis process, however, are easier to automate than others. In-gel digestions, for instance, require many manipulative steps that can be readily automated with robotic liquid handlers; done manually, the task can give slightly better results, but it takes a very long time.
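To see why in-gel digestion lends itself to liquid handling, consider how its repetitive steps expand across a plate: the same handful of transfers, repeated for every well. A minimal sketch of generating a pipetting worklist (the step names, volumes, and CSV layout are hypothetical, not any vendor’s format):

```python
import csv

# Hypothetical in-gel digestion steps and per-well volumes (illustrative)
STEPS = [("wash", 100), ("reduce_DTT", 50), ("alkylate_IAA", 50),
         ("trypsin", 25), ("extract", 60)]

def worklist(plate="plate1", rows="ABCDEFGH", cols=range(1, 13)):
    """Yield one liquid-handler transfer per step per well of a 96-well plate."""
    for step, vol_ul in STEPS:
        for r in rows:
            for c in cols:
                yield {"plate": plate, "well": f"{r}{c}",
                       "step": step, "volume_ul": vol_ul}

with open("ingel_digest_worklist.csv", "w", newline="") as f:
    w = csv.DictWriter(f, fieldnames=["plate", "well", "step", "volume_ul"])
    w.writeheader()
    w.writerows(worklist())  # 5 steps x 96 wells = 480 transfers
```

Even this toy version produces 480 transfers per plate, which is exactly the kind of tedium a robot handles consistently and a person does not.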
Having a process run in automated mode also raises the question of reliability. Will the robot perform reliably all the time? “Even if it performs well 99% of the time, there will be a couple of instances when things go wrong,” says Speicher. A column can clog in the middle of a run, leading to a loss of time, sample, or both. Or a column can degrade over time, and the deterioration in the quality of the data gathered may go unnoticed.
“The counter-argument is, will a person doing it manually do any better,” says Speicher. “At least the hope or expectation is that if a person working with a sample manually makes a mistake, they have a chance to catch it, whereas you are blind to what a robot has done.”
Most modern mass spectrometers are equipped to perform data-dependent acquisition, in which the instrument decides what to isolate and analyze based on the ions detected during the run. “This saves a tremendous amount of time and increases the throughput, but the downside is: what if there is a certain peptide ion in there that is of interest? You now have to go back and sift through the data and re-analyze it,” says Friedman. “If it were being done slowly and manually, you can look at the data and then decide how you want to proceed next.”
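In outline, data-dependent acquisition is a simple greedy loop: from each survey scan, pick the most intense precursor ions that have not been fragmented recently, and queue them for MS/MS. A toy sketch of that logic (the function, parameters, and exclusion scheme are illustrative, not any instrument’s firmware):

```python
def data_dependent_acquisition(survey_scan, top_n=5, exclusion_s=30.0,
                               exclusion=None, now=0.0):
    """Toy top-N precursor selection with dynamic exclusion: from a
    survey (MS1) scan of (m/z, intensity) pairs, pick the N most
    intense ions not fragmented within the last `exclusion_s` seconds."""
    exclusion = exclusion or {}
    # Drop ions still on the exclusion list
    candidates = [(mz, inten) for mz, inten in survey_scan
                  if now - exclusion.get(round(mz, 2), -1e9) > exclusion_s]
    # Rank by intensity and take the top N for MS/MS
    selected = sorted(candidates, key=lambda p: p[1], reverse=True)[:top_n]
    for mz, _ in selected:
        exclusion[round(mz, 2)] = now  # exclude this m/z going forward
    return [mz for mz, _ in selected], exclusion

scan = [(445.12, 8e5), (512.30, 3e6), (622.07, 1e6), (785.84, 2e5)]
picked, excl = data_dependent_acquisition(scan, top_n=2)
print(picked)  # -> [512.3, 622.07]
```

The sketch also shows Friedman’s downside: anything that never ranks in the top N, such as the 785.84 ion above, is simply never fragmented, no matter how interesting it might have been.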
“It’s easy to automate, but there are times when things will go wrong,” admits Pisano.
While Pisano has had success operating proteomic protocols in his lab in a fully automated fashion, he agrees that human intervention cannot be completely eliminated. He recommends manual intervention between key steps in the process, so that if the robot faults there is a stop point before everything is lost. This intervention is important at both the experimental and data-analysis stages. “The database searching and data-crunching are done automatically, but we also go through it manually to check the data,” says Pisano. “We monitor the mass spectrometers all the time, either from the lab or remotely.”
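One way to build Pisano’s stop points into an automated workflow is to require a human go/no-go between stages, so a fault at one step cannot silently ruin everything downstream. A minimal sketch (the stage names and prompt are hypothetical):

```python
def run_with_checkpoints(sample, steps, confirm=input):
    """Run pipeline stages with a manual stop point between them,
    so a robot fault at one stage doesn't propagate to the next."""
    data = sample
    for name, step in steps:
        data = step(data)
        # Pause for a human go/no-go before committing to the next stage
        if confirm(f"{name} done. Continue? [y/n] ").strip().lower() != "y":
            raise RuntimeError(f"Pipeline halted after {name} for inspection")
    return data

steps = [
    ("digest", lambda s: s + ":digested"),
    ("lc_ms", lambda s: s + ":acquired"),
    ("db_search", lambda s: s + ":identified"),
]
# run_with_checkpoints("plasma_001", steps)  # prompts after each stage
```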
Choosing what works best for you
It does not always have to be all-manual or all-automated. “One of the things that we have done as an alternative to using automation is to go the semi-automated route where you are working with 96-well plates and 8-channel pipettors to get a fairly large number of samples processed,” says Speicher.
The needs of the end-user and the demands of the application should ultimately drive the decision of whether or not to automate. “Analyze your needs on a continuing, long-term basis and determine the level of performance you want, before choosing your method of analysis,” advises Speicher. Also scrutinize the time required for the downstream analysis. “If it is going to take you weeks to months to analyze your samples, then it doesn’t matter if it takes you a couple of days to do those digestions,” he says.
Once the decision to automate is made, the various technologies have to be carefully evaluated before any investment is made. “No one particular instrument does everything,” says Caprioli. Users should focus on the analytical endpoint that they want to accomplish and then choose the right instrument, he says. Since robotics and analytical instruments are sold by different vendors, integrating the various robotic devices can be an issue. “Even though each vendor is very good, these instruments don’t necessarily talk easily with one another,” says Caprioli. “It’s awkward to sometimes get them functioning as an integral unit.” Caprioli suggests that one way to accomplish this integration is to format instrument output in a standard manner rather than in a proprietary fashion. “The industry is moving towards that, but we are not there yet.”
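The standard-format idea Caprioli describes did eventually take hold: open interchange formats such as mzML let downstream software read spectra regardless of which vendor’s instrument produced them. As an illustrative sketch using the pyteomics library (both the format and the library postdate this article, and the file name is hypothetical):

```python
# Read spectra from a vendor-neutral mzML file rather than a proprietary format
from pyteomics import mzml

with mzml.read("run_001.mzML") as spectra:  # hypothetical file name
    for spectrum in spectra:
        mz = spectrum["m/z array"]
        intensity = spectrum["intensity array"]
        print(spectrum["id"], len(mz), intensity.max())
        break  # just inspect the first spectrum
```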
About the Author
Tanuja Koppal, PhD, is a freelance writer specializing in life science and pharmaceutical topics.
This article was published in the Lab Automation supplement to Drug Discovery & Development and Bioscience Technology magazines: November, 2007, pp. 10-14.