Good data capture is crucial across the biopharmaceutical lifecycle, not only for internal process management but also for collaboration with external stakeholders.
At the close of 2022, there were 295 active drug shortages, according to a Senate report, a five-year high. Although the number has since declined, it underscores an ongoing and complex challenge. In 2023, as industry actors continue to face capacity shortages, there will be a growing need for more effective partnerships across external manufacturing networks and distributed research and development teams.
One solution for faster and smarter decisions throughout the biopharmaceutical lifecycle is to invest in better data management. Working toward digital maturity is key to creating transparency, facilitating collaboration, and enabling accurate forecasting among sponsors, collaborators, contract organizations, and IT service providers.
Digital maturity will require shared data standards, strategies for streamlining data management, and a collaborative approach that involves all players in the supply chain.
Shared data standards are the foundation of collaboration
Advanced computational techniques like artificial intelligence and machine learning can provide logistics insights for smoother supply chain management. In the drug development phase, they can also help zero in on new drug pathways and research areas.
But these techniques can only yield results if built on a foundation of usable data. There are four broad process areas where accurate and timely data exchange is fundamental: quality, supply chain, manufacturing, and tech transfer.
When different partners in drug development and manufacturing store data in incompatible ways, the result is missing information, a lack of visibility, and difficulty predicting needs and challenges along the supply chain.
A wealth of data is generated during each phase of the biopharma lifecycle, and the speed at which that data is generated is only growing. But data from different silos and partners is often incompatible or incomplete in ways that make it difficult to mine for useful insights. To overcome these challenges, many companies are building aggregated, contextualized data backbones using data lakes, or implementing commercially available digital systems that help them build these backbones at the point of data capture.
To enable machine learning techniques, it is critical that players along the supply chain embrace a common understanding of data management strategy. Thankfully, a movement is already underway to build this shared strategy: industry and academic researchers alike are moving to adopt principles for data interoperability.
First, the F.A.I.R. principles ask that data be Findable, Accessible, Interoperable, and Reusable. These principles are an important baseline, and the GO FAIR initiative offers roadmaps for working toward F.A.I.R. data.
ALCOA+ is another data standard that can help integrate data between teams and partners. First introduced by the FDA as the ALCOA guidance for data integrity, it outlines additional attributes of the most actionable data: data should be Attributable, Legible, Contemporaneous, Original, Accurate, Complete, Consistent, Enduring, and Available. These standards help data stay usable throughout the biopharma lifecycle.
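In practice, several ALCOA+ attributes come down to whether a record carries the right metadata at capture time. The sketch below illustrates the idea with a simple completeness check; the field names are hypothetical, not a standard schema, and a real system would enforce this at the point of data entry.

```python
# Illustrative check that a data record carries metadata supporting
# ALCOA+ attributes. Field names are hypothetical, not a standard schema.

ALCOA_FIELDS = {
    "recorded_by",    # Attributable: who created the record
    "recorded_at",    # Contemporaneous: when it was captured
    "source_system",  # Original: where the raw value came from
    "value",          # the measurement itself
    "units",          # Accurate/Complete: a value without units is ambiguous
}

def alcoa_gaps(record: dict) -> set:
    """Return the ALCOA+ metadata fields missing from a record."""
    return ALCOA_FIELDS - record.keys()

# Example batch record missing its source system
batch_record = {
    "recorded_by": "j.doe",
    "recorded_at": "2022-11-30T14:05:00Z",
    "value": 6.8,
    "units": "pH",
}
print(alcoa_gaps(batch_record))  # {'source_system'}
```

Flagging gaps like this at capture time is far cheaper than reconciling incomplete records when partners later try to aggregate data across the supply chain.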
Structured data drives value
Even with this foundation, shared data by itself is insufficient to unlock the opportunities that collaboration can provide. Scientists and process engineers can spend countless hours finding, reconciling, and assembling data maintained in disparate systems. To ensure interoperable data is actually leveraged, it should be accompanied by a semantic enrichment layer.
Digital workflows applied at the point of data creation help ensure data quality. By capturing process and analytical data in full context right from the start, they allow scientific terms to be automatically aligned with standard terminology and ontologies, and they provide the foundation for real process understanding by automatically mapping the complex relationships between material attributes, process parameters, and product quality across unit operations. This is where choosing the right tools and platforms comes in: a platform for streamlined workflow execution facilitates collaboration between internal and external stakeholders.
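The alignment step can be pictured as a simple normalization at capture time. The sketch below is a minimal illustration with a hand-written synonym table; a production system would instead draw on a managed ontology or terminology service.

```python
# Minimal sketch of aligning lab-specific parameter names to a shared
# vocabulary at the point of data capture. The synonym table is
# illustrative; real systems would use a managed ontology service.

SYNONYMS = {
    "temp": "temperature",
    "temperature (c)": "temperature",
    "do": "dissolved_oxygen",
    "dissolved o2": "dissolved_oxygen",
}

def normalize_parameter(raw_name: str) -> str:
    """Map a free-text parameter name onto its standard term."""
    key = raw_name.strip().lower()
    return SYNONYMS.get(key, key)

# One row of process data as entered by a lab scientist
row = {"Temp": 37.0, "Dissolved O2": 42.5}
standardized = {normalize_parameter(k): v for k, v in row.items()}
print(standardized)  # {'temperature': 37.0, 'dissolved_oxygen': 42.5}
```

Because every partner's data lands under the same standard terms, downstream aggregation and analysis no longer require manual reconciliation of naming conventions.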
Collaboration: A pattern for digital maturity
In BioPhorum’s Vision for digital maturity in the integration between biomanufacturers and partner organizations report, the authors lay out five patterns of digital maturity to describe how teams extract and manage their data: pre-digital, manual extract, auto-extract, auto-ingest, and near-real-time. Often, partners in the biopharmaceutical supply chain are in different places along this journey. [Editor’s Note: The author’s employer is a member of BioPhorum]
Realistically assessing partner capabilities and bringing everyone along is critical for a successful digital maturity roadmap. While it may be tempting to invest only in internal needs, improving by leaps and bounds ahead of partners will only create new bottlenecks. By moving along this maturity model together—while taking the time to bring everyone along—collaborators can achieve improvements in quality, speed and flexibility while simultaneously reducing cost.
Sponsors, collaborators, contract organizations, and IT service providers must all utilize standards-based digital engagement to realize the benefits that automated insights can unlock for patients and for our industry. Organizations must both generate and mobilize structured data so that data and best practices alike can be shared across the network.
Collaboration across the biopharmaceutical lifecycle has not reached digital maturity—but moving in this direction is crucial. Doing so will be key to improving patient outcomes: both through new discoveries and through better supply chain management.
Photo: Liana Nagieva, Getty Images