[Check out Part 1 of this feature on what constitutes a true cloud laboratory.]
The term “cloud lab” was first coined by Stephen Wolfram and Brian Frezza in 2012 to describe the lab being built by Emerald Therapeutics. The term came into public use in 2014, with the introduction of Emerald Cloud Lab.
“Cloud lab” as a technical term was intended to be analogous to the cloud computing paradigm in the IT world. The IT community also initially struggled to define what exactly is meant by the term “cloud computing” and how it was fundamentally different from other technologies of its day. Over time, a set of criteria evolved to distinguish cloud computing from prior, incomplete technologies such as local data centers or rentable shared access data centers.
The same is true of the term “cloud lab,” which is defined by five criteria:
- Remote, on-demand experimentation.
- On-demand control of every experiment.
- Comprehensive instrumentation.
- Comprehensive sample preparation.
- A single software interface for the entire lab.
In the first installment of this two-part series, we explored the concepts of remote on-demand experimentation and on-demand control of every experiment. This article will take a deeper look at the remaining three criteria to better understand why each matters and how they create a cohesive vision for a true cloud laboratory.
Comprehensive instrumentation
A cloud lab must provide users on-demand access to instrumentation covering the full scope of laboratory work they will be performing, obviating the need to perform any daily lab work outside of the cloud laboratory.
It is critical for a cloud lab to include all of the necessary instrumentation users might require in their research. As mentioned previously, flexibility is a fundamental value proposition of what a cloud lab can offer. Diversity of instrumentation enables scientists to work around scientific challenges that pop up during their research. This both helps de-risk a research program and allows scientists to creatively solve problems without the equipment access and technique constraints encountered in a traditional lab, leading to better science and outcomes.
On a more fundamental level, a cloud lab without the necessary instrumentation would force its users to run some or all of their methods elsewhere. This takes those methods outside the software-mediated, full remote control of the cloud lab, with the attendant downsides discussed above. In some cases, this may be manageable. In the great majority of cases, however, the downsides outweigh the benefits of using a cloud lab. For example, shipping samples to and from the cloud repeatedly during the execution of a single method to incorporate an unsupported instrument would be intolerably slow.
Additionally, having a complete feature set enables larger workflows to be conducted in a more automated manner. For example, many true cloud lab users bring together key subject matter experts (SMEs) across disciplines (synthesis, purification, analytical) to develop one step of a larger workflow. Each SME can develop their step in the process and has visibility into the work performed by their team members, and each can integrate their work into a larger workflow when they are ready. Without the diversity of instrumentation that a true cloud lab provides, the work would have to be performed at many different locations. The teams would also need to manage the logistics of sample shipments and external project management.
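To make the idea of SME-authored steps composing into one workflow concrete, here is a minimal sketch in Python. All names and structures are illustrative assumptions, not any vendor's actual API: each step is a self-contained function, and a composer runs them in sequence so every hand-off stays inside the software-mediated environment.

```python
# Hypothetical sketch: each SME authors one step as a parameterized
# function; steps compose into a single workflow executed end to end.
# Names are illustrative, not a real cloud lab API.

def synthesis_step(sample):
    sample["history"].append("synthesis")
    return sample

def purification_step(sample):
    sample["history"].append("purification")
    return sample

def analytical_step(sample):
    sample["history"].append("analytical")
    return sample

def run_workflow(sample, steps):
    # Running steps in sequence keeps every hand-off inside the
    # software-specified method, with no shipping between sites.
    for step in steps:
        sample = step(sample)
    return sample

result = run_workflow({"id": "S-001", "history": []},
                      [synthesis_step, purification_step, analytical_step])
print(result["history"])  # ['synthesis', 'purification', 'analytical']
```

The design point is that each SME can develop and test their step independently, then integrate it by adding one entry to the step list, which mirrors the visibility and integration described above.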
To be clear, a cloud lab does not need to support every conceivable instrument in existence to create substantial value, but it should be well-rounded enough to allow scientists to do most of their day-to-day research in the cloud. For example, in practical experience, 250 instrument models, or unique experimental capabilities, are required to support the needs of pharmaceutical or biotechnology R&D organizations. Other research domains may require fewer or more, depending on the scope and breadth of the science they need to support.
Comprehensive sample preparation
A cloud lab must allow customers to perform all aspects of sample preparation, storage and handling remotely. At a minimum, sample preparation capabilities must include liquid handling ranging from microliter to liter scale, solid handling from grams to kilograms, interoperability with any container form factor, indefinite sample storage in typical storage conditions, and operations in specific environments such as biosafety cabinets, fume hoods and glove boxes.
Though often overlooked, sample preparation is where every experiment starts, making it an essential component of any method. It follows that, to enable customers to fully express their methods in the software interface, a cloud lab must support a robust set of remote-controlled sample preparation techniques. Without this support, a cloud lab would not be useful as a daily driver of lab activities, and a customer would not have full remote control over their method.
This effectively reduces the cloud lab to a conventional CRO, as explained in part one of this series. Practically speaking, if samples had to be hand-processed in a traditional lab between each instrumentation step, the value of integrated instrumentation would be lost, as each step becomes a point of communication friction and potential failure outside the software-specified method. It would be akin to having a person in a data center retrieve a hard drive by hand and plug it into a server to download an email attachment from the cloud. Additionally, if the cloud lab were limited to certain form factors, scientists would need to route all samples through a traditional lab to transfer them into new containers before they could be used in a cloud lab, negating the efficiency gains a cloud lab offers.
Supporting all of the permutations of sample preparation required for a research and development organization is one of the most complex parts of a cloud lab. For example, operations in a true cloud lab are underpinned by automated liquid handling platforms. These platforms are complemented by many other methods, traditionally unfriendly to automation, for moving liquids and solids around the lab.
One software interface for the entire lab
A cloud lab interface must allow customers to script experiments (including connecting multiple experiments) and to process, analyze, visualize and interpret all of the data those experiments generate, without requiring third-party software or the involvement of software or automation engineers to reconfigure software or hardware.
One of the biggest frustrations for scientists trying to streamline or automate their work is the need to manage and move data across multiple systems. Code or RPA tools can help with well-defined and repetitive workflows. For the research or development scientists who are typically doing open-ended research, however, data transfer is not well-defined. Instead, it often employs whatever is easiest for the scientist at the time, such as USB drives, network storage and email. This means many critical steps in data management are done outside of the system, undermining data integrity and eliminating any ability to streamline or automate these tasks.
Additionally, supporting all of the instruments in a cloud lab would require hundreds of different software packages just for data processing and analysis. For this reason, the cloud lab must provide all of the tools necessary to process, visualize, interpret and report any experiment.
Combining all of the tools to design, run and analyze experiments in a single platform allows customers not only to perform experiments as they would in a traditional lab but also to script experiments and decision points together into larger, more comprehensive workflows. This frees them to focus on less well-defined areas of research and, as a result, greatly improves their capacity to perform research.
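A short sketch illustrates what scripting a decision point between experiments could look like. The function names and the purity model are hypothetical stand-ins, not a real platform's API: an analytical result gates whether the workflow repeats purification or advances.

```python
# Hypothetical sketch of a scripted decision point: an assay result
# decides whether to re-purify or proceed. All names are illustrative.

def measure_purity(sample):
    # Stand-in for an analytical experiment returning fraction purity.
    return sample["purity"]

def purify(sample):
    # Stand-in for a purification experiment that improves purity.
    sample["purity"] = min(1.0, sample["purity"] + 0.15)
    return sample

def purify_until(sample, threshold=0.95, max_rounds=5):
    # The decision point lives in code rather than in manual hand-offs:
    # repeat purification until the assay clears the threshold.
    for _ in range(max_rounds):
        if measure_purity(sample) >= threshold:
            break
        sample = purify(sample)
    return sample

s = purify_until({"id": "S-002", "purity": 0.70})
print(s["purity"])  # 1.0
```

Expressing the loop in the same interface that runs the experiments is what lets the platform automate iteration that would otherwise require a scientist to review each assay result by hand.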
The currently fragmented ecosystem that we all live with comes at a significant cost to science. Cloud labs provide an opportunity to unify equipment, data and services, allowing for automation for data processing, analysis and reporting.
The value of a true cloud lab
A true cloud lab frees scientists worldwide to follow their scientific questions wherever they may lead by giving them virtually unlimited access to a vast portfolio of instruments and allowing them to focus on experimental design instead of lab logistics. Cloud labs reduce costs, accelerate research and drug development, increase reproducibility, make research more inclusive, and will open science up to a whole new generation of thinkers who no longer have to be limited by technique, budget, space or time.
When addressing this new technology, everyone is faced with a buy versus build decision. Beyond the economic incentives to not reinvent the wheel, the experiments run on these cloud labs and the data processing, visualization and analysis must be as universally interoperable as possible across the broader scientific community (biopharma, startups and academia).
This interoperability, in most cases, may tip the balance of the buy versus build decision toward buying the services of a true cloud lab provider. Rather than building bespoke, individual cloud labs for each research organization, it would seem to make better sense, both operationally and scientifically, to consolidate around a commercial platform.
Toby Blackburn is the head of business development at Emerald Cloud Lab.
Filed Under: Drug Discovery