
According to a January 2024 McKinsey report, generative AI is poised to unlock tens of billions of dollars in new biopharma revenue each year. It will help scientists extract relevant information from scientific papers, predict which chemical compounds are good bets for further testing, design large molecules more effectively, mine real-world patient data for new drug indications, optimize clinical trial populations, and more.
Cloud vs. on-prem
Training custom large language models (LLMs) for generative AI demands enormous computing power up front, a task that in practice means working in the cloud. That said, companies with an on-premises approach can still access pre-built LLM tools: in March 2024, NVIDIA released generative AI tools aimed specifically at biopharmaceutical companies, available to both cloud-based and on-premises customers.
But that does not mean an on-premises approach is future-proof. For one thing, licensing costs vary greatly; solutions are often cheaper up front for cloud-based customers, so it is worth running the numbers. More importantly, the cost of using generative AI models is negligible compared to the cost of making internal data intelligible to them.
Is data from discovery, manufacturing, and trials clean, accessible, and well organized? Or is it stored in silos, on paper, or not at all? Companies with meaningful, accessible data will be able to unlock more value from AI solutions. Smart data management is where a cloud-first approach can be truly transformative: it is scalable, letting biopharmaceutical companies integrate data quickly and flexibly as digital transformation plans evolve.
Putting the data puzzle together
Despite various initiatives over the years to harmonize data, many companies still face a fragmented landscape: multiple laboratory information management systems (LIMSs), electronic lab notebooks (ELNs), and instruments that have yet to be connected into a coherent ontology. With an on-premises approach, integrating that data means planning ahead to add more servers. Either way, thoughtful data mapping and ongoing change management are needed before previously siloed data can feed LLMs; with a cloud-first approach, at least a server build will not become a project bottleneck. A minimal sketch of what such data mapping involves follows.
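To make the idea of data mapping concrete, here is a purely illustrative Python sketch that normalizes records from two hypothetical silos (a LIMS CSV export and an ELN JSON export) into one shared schema. The file contents, field names, and target schema are assumptions invented for the example; they do not describe any specific product or real ontology.

# Illustrative only: map records from two hypothetical silos into one shared schema.
# All field names, sample data, and the target schema are assumptions for this sketch.
import csv
import io
import json
from dataclasses import dataclass, asdict

# Target schema that every silo is mapped onto.
@dataclass
class Result:
    sample_id: str
    assay: str
    value: float
    unit: str
    source: str

# Hypothetical LIMS export (CSV) with its own column names.
LIMS_CSV = """SampleRef,Test,Reading,Units
S-001,pH,7.2,pH
S-002,Protein conc,1.8,mg/mL
"""

# Hypothetical ELN export (JSON) with a different vocabulary.
ELN_JSON = """[
  {"sample": "S-003", "experiment": "pH", "result": 6.9, "unit": "pH"}
]"""

def from_lims(text: str) -> list[Result]:
    rows = csv.DictReader(io.StringIO(text))
    return [Result(r["SampleRef"], r["Test"], float(r["Reading"]), r["Units"], "LIMS")
            for r in rows]

def from_eln(text: str) -> list[Result]:
    return [Result(r["sample"], r["experiment"], float(r["result"]), r["unit"], "ELN")
            for r in json.loads(text)]

if __name__ == "__main__":
    unified = from_lims(LIMS_CSV) + from_eln(ELN_JSON)
    # A consistent, machine-readable record set is what downstream AI pipelines need.
    print(json.dumps([asdict(r) for r in unified], indent=2))

Real projects involve far more fields, controlled vocabularies, and change management than this toy example, but the core task is the same: agree on a shared schema and translate each silo into it.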
Likewise, a cloud-first approach helps companies scale as they pursue other digital transformation strategies. One client of ours, for example, acquired another company and needed to onboard dozens of new users in short order; provisioning cloud licenses shaved months off the process. The converse also holds: as companies pivot, business units can wind down quickly, without sunk costs tied up in server space.
Other considerations
There are still plenty of reasons for companies to value an on-premises approach. Perhaps manufacturing hubs are located in areas with poor connectivity, capital outlays look better on the balance sheet, or the company is not ready for a big change. Thankfully, though, many other traditional hesitations about the cloud no longer apply.
In the past, for example, companies worried about keeping their data in the correct jurisdiction; now, cloud providers handle data residency by default and keep backups in multiple locations to safeguard against disasters. A useful side benefit is faster data access for global collaborators and for contract research organizations based elsewhere.
Security was another key concern. It remains a good reason to choose a trustworthy SaaS provider, but most data breaches actually originate inside companies. Physical access to a server room is a liability; cloud providers, by contrast, add layers of separation that make it much harder for unauthorized users to reach data. The cloud can also provide more robust disaster recovery and backup processes than can be delivered economically on-premises. We have seen organizations lose access to on-premises software for significant periods after an incident, whereas with cloud-based software, automatic failover and robust disaster recovery processes keep the impact on availability minimal, so the business can stay focused on the science.
Aligning tech + human capital
Moving to the cloud can also unlock human capital: IT teams can refocus on digital transformation instead of maintaining physical servers and managing updates. An on-premises approach often means a backlog of tickets, with bench scientists reverting to paper in the meantime and compromising data quality. With a cloud-based approach, partners can deliver updates on demand and resolve tickets seamlessly; IT can focus on priorities instead of roadblocks.
Moving to the cloud is not a panacea; likewise, keeping some functions on-premises does not spell doom. But companies that lean into cloud solutions now stand a much better chance of making generative AI a true competitive advantage and of bringing therapies to market faster.
About the author

Stuart Ward
Stuart Ward is Head of Platform Strategy at IDBS and is responsible for ensuring that IDBS’ software platform and products meet the needs of customers today and in the future. The Platform Strategy team provides the necessary business, technical and domain experience along with customer and market insights required to create software and solutions to enable BioPharma and other industries to achieve faster scientific breakthroughs. He led the creation and launch of The E-WorkBook GxP Cloud, which was IDBS’ first SaaS product for use in regulated (21 CFR Part 11, GxP) environments. Before starting this role in January 2014, he was Product Manager for E-WorkBook for four years and worked in IDBS Global Professional Services for five years, responsible for deploying IDBS’ products both from a technical and project management perspective. Prior to working at IDBS, Stuart completed a post-doctoral fellowship at the NIH and then worked for Ionix Pharmaceuticals. Stuart obtained his PhD in Pharmacology from the MRC National Institute for Medical Research (University of London).