Earlier this fall, the FDA detected potential issues in data generated by two contract research organizations (CROs). Specifically, the agency’s statement noted “significant instances of misconduct and violations of federal regulations, which resulted in the submission of invalid study data to FDA.” Accordingly, the agency advised that all affected sponsor organizations repeat any bioequivalence and bioavailability studies essential for approval, and, not surprisingly, that they do so through organizations other than the two in question or any others with “unresolved data integrity concerns.”
The above anecdote, which is not an isolated occurrence, highlights the importance of data integrity, the lack of which can derail research and set biopharma organizations back by months, if not years. My team’s research has found that up to 30% of work is subject to rework because of data issues, cases in which the data describing process execution and outcomes simply can’t be located.1 When you factor in the unpredictability and complexity of therapies like biologics, the time impact of poorly managed data is on the order of 6–18 months.
Sophisticated data management is critical for avoiding this kind of issue, which can hold up approval, force significant unnecessary rework, and all but ruin a company’s reputation. And it will only become more critical as research reaches greater levels of complexity and the amount of data generated grows. Unfortunately, the reality is that legacy systems, including laboratory information management systems (LIMS) and others that many companies still use, can no longer keep up with the pace of research. A fundamental issue is that LIMS and the other three- and four-letter acronyms describe software categories, not the work itself — a major disservice to the research a company is doing and the data it’s generating. This partially stems from the fact that some legacy systems are 40 years old and have origins in entirely different fields, such as manufacturing quality control. As a result, there are inherent mismatches between these legacy systems and biopharma’s increasingly complex R&D needs, and the systems can only evolve so far to close that gap.
What’s needed, then, both to avoid data integrity issues and to take research into the future, is a method that integrates data across the entire development ecosystem: biopharmaceutical lifecycle management (BPLM).2 This kind of end-to-end management supports research from early development to clinical and goes beyond the acquisition of data to place it in a critical context. Among the key benefits of a BPLM is automatic, seamless integration, where process and analytical data are captured right where the process is executed. When this happens, a contextualized data “backbone” emerges, illuminating the entire development lifecycle and enhancing the insight derived from data. Companies that have already switched over to such a system, or built their organization around it, experience a significant reduction in errors and rework, which translates into considerable cost and time savings.
There are also headier implications of such a system, beyond avoiding mishaps and even saving time and money. As mentioned, BPLM is particularly important for supporting the increasingly complex therapies that are emerging, particularly biologics, which will make up a greater and greater percentage of organizations’ research portfolios in the coming years. It also allows biopharma companies to fully leverage the power of data in advanced modeling and analytics endeavors, such as digital twins, artificial intelligence and machine learning. Those techniques will themselves play larger roles in research and in shortening the development lifecycle.
Finally, BPLM represents an interesting new point in the evolutionary timeline of computing. As computers got smaller and storage became non-localized, new services that didn’t exist before suddenly came into being, like remote hosting-as-a-service and software-as-a-service. With BPLM, we’ve now reached the point of workflow-as-a-service, since such products don’t just provide a new software program but an entirely new way of working. Risk becomes shared, which is appealing to organizations looking for more from a vendor than just a product. Now, the service provider shares in the process and the risk and has an even greater stake in the outcome.
Biopharma is embracing the digital revolution more and more and realizing the benefits of taking the plunge. As technologies become even more advanced and innovative, the divide between companies that have embraced transformation and those that haven’t will become wider. The risk calculation around digitization in biopharma is shifting. Rather than it being risky for an organization to invest in digital transformation, it has become unsustainably risky not to.
- IDBS Whitepaper: Why Biopharma Lifecycle Management? Available at: https://insights.idbs.com/l/468401/2021-02-02/83js2/468401/16122843440GchFStZ/IDBS_Whitepaper___Why_Biopharma_Lifecycle_Management.pdf
Graeme Dennis has been the Commercial Director, Preclinical Pharma at IDBS since 2018. Before IDBS, Graeme held scientific informatics leadership roles in academia and industry, including Accenture and Vanderbilt University, where he studied Chemistry (B.S. 1999). Graeme is interested in systems that help organizations position scientific data as an asset for operational and strategic use. He lives in Nashville, Tennessee.