Featured Article
Ensuring the accuracy and completeness of biopharmaceutical data throughout the drug development pipeline is of paramount importance for biotech and pharmaceutical companies. However, as R&D workflows become more complex, and new high-throughput technologies generate multidimensional data on an ever-increasing scale, achieving the highest levels of data integrity and demonstrating this to regulatory authorities is a growing challenge.
Cloud-based informatics platforms offer a flexible and scalable solution for managing biopharmaceutical data, allowing laboratories to easily ensure the integrity of data throughout its lifecycle. The data aggregated into the platform can then be analyzed using data analytics and visualization tools. The end-to-end workflow visibility afforded by these systems is also driving enhanced process monitoring, facilitating faster and more robust decision-making, and resulting in increased operational output. By enabling laboratories to efficiently produce data of high scientific integrity, cloud-based informatics platforms are streamlining biopharmaceutical R&D and accelerating the process of bringing new drugs to market.
Ensuring data integrity: A growing challenge
To ensure new treatments are safe and effective, it is critical that the highest standards of data integrity are maintained throughout the R&D pipeline. While regulators focus on the later stages of development, the same principles should apply across the entire pipeline: researchers make critical go/no-go decisions at every stage, and good data integrity practices help ensure those decisions are based on sound scientific results. As compounds advance through the pipeline, laboratories may be required to provide assurances of scientific integrity to a range of individuals and organizations, including research leaders, patent reviewers, and regulatory agencies. As such, complete and comprehensive record-keeping is essential.
Nevertheless, making sure R&D workflow data is accurate, complete, and readily accessible is proving to be increasingly challenging in today’s rapidly evolving biopharmaceutical landscape. Advances in high-throughput techniques such as next-generation sequencing, qPCR, and mass spectrometry have led to the development of more complex and highly automated workflows that can generate huge amounts of multidimensional data. This data is then analyzed using advanced data analytics tools like Shiny from RStudio (Boston, MA) to help scientists identify promising targets and drug candidates. To achieve the highest standards of scientific integrity, data must be organized so that all samples and reagents are traceable, and information from all procedures must be readily available. Moreover, as R&D streams grow in complexity, it is becoming increasingly important not only to monitor specific processes, but also to have complete end-to-end visibility over all workflows within these complicated value chains. By managing the end-to-end sample analysis and linking each sample to the appropriate instruments and reagents, scientists can be confident that their data, and ultimately their decisions, have not been compromised by a failed reagent or instrument run.
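The kind of traceability check described above can be sketched in a few lines of code. The record layout, identifiers, and QC lists below are illustrative assumptions, not any particular vendor's schema; the point is simply that when every result is linked to its instrument run and reagent lot, compromised results can be flagged automatically.

```python
# Illustrative sketch (hypothetical schema): flag results whose provenance
# includes a failed reagent lot or instrument run.
from dataclasses import dataclass

@dataclass
class Result:
    sample_id: str
    instrument_run_id: str
    reagent_lot_id: str
    value: float

# Runs and lots known to have failed QC (illustrative IDs)
failed_runs = {"RUN-0042"}
failed_lots = {"LOT-7781"}

def is_trustworthy(result: Result) -> bool:
    """A result is trusted only if neither its instrument run
    nor its reagent lot has failed QC."""
    return (result.instrument_run_id not in failed_runs
            and result.reagent_lot_id not in failed_lots)

results = [
    Result("S-001", "RUN-0041", "LOT-7780", 1.23),
    Result("S-002", "RUN-0042", "LOT-7780", 0.98),  # failed run
    Result("S-003", "RUN-0041", "LOT-7781", 1.10),  # failed reagent lot
]

trusted = [r.sample_id for r in results if is_trustworthy(r)]
print(trusted)  # ['S-001']
```

Without end-to-end linkage, the two compromised results would be indistinguishable from the good one; with it, the exclusion is a trivial query.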
In addition to meeting the needs of evolving workflows, data management systems must be sufficiently flexible to ensure compliance with changing regulatory requirements. Wherever they intend to file applications, laboratories must demonstrate compliance with the latest regulations in those countries. As requirements change, manually reviewing systems to confirm compliance with regional updates is time-consuming, yet few conventional data management systems offer a faster, more convenient, and cost-effective alternative.
Proof of good data management practice is essential to demonstrate regulatory compliance in an audit, and to submit crucial applications such as a biologics license application (BLA) or a new drug application (NDA). If data submitted in these applications is deemed incomplete, laboratories may be required to search through old records to find key data or even repeat studies, leading to significant additional costs in terms of time and resources. Biopharmaceutical organizations must be able to provide evidence demonstrating an unbroken chain of custody for every sample, measurement, and analytical workflow. This includes information on sample source, storage conditions, and analysis methods, as well as instrument validation and calibration.
Traditionally, laboratories have employed on-site systems to manage their data pipelines. While digital systems have largely replaced the paper-based approaches of the past, workflows often rely on multiple disparate systems that have been introduced as point solutions, rather than as part of an integrated whole. Digital solutions such as electronic laboratory notebooks (ELNs), laboratory information management systems (LIMS), and scientific data management systems (SDMS) are helping to increase efficiency, but it is not always easy to coordinate the flow of data between these separate systems if they are used in a fragmented way across different facilities and workflows. Consequently, there is a need for better integration to guarantee data integrity in expanding high-throughput workflows.
Cloud-based solutions to improve and demonstrate data integrity
To achieve the robust standards of data integrity needed in biopharmaceutical R&D, laboratories are increasingly turning to cloud-based platforms. Cloud-based data management platforms overcome the pitfalls of fragmented on-site data management tools by providing a single integrated ecosystem for an organization’s data, making it easier to both enhance data integrity and ensure regulatory compliance. The most sophisticated platforms are able to integrate all stages of R&D workflows, including data acquisition and analysis, meaning that instruments and devices can be programmed to automatically upload their data to this platform, where it is stored in a highly organized format. This allows laboratories to streamline their entire operation, enabling data acquisition to be easily configured to comply with regulatory requirements, and organizing stored data into a format readily accessible for an audit.
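Automatic upload from an instrument typically means wrapping the raw acquisition file with the provenance metadata and an integrity checksum the platform needs for later audit. The sketch below is a generic illustration of that packaging step; the field names and schema are assumptions for the example, not the Platform for Science API.

```python
# Illustrative sketch (hypothetical schema): package raw instrument output
# with provenance metadata and a checksum before upload, so the platform
# can verify integrity on receipt.
import hashlib
import json
from datetime import datetime, timezone

def package_upload(raw_bytes: bytes, instrument_id: str, run_id: str) -> dict:
    """Wrap raw acquisition data with audit-ready metadata."""
    return {
        "instrument_id": instrument_id,
        "run_id": run_id,
        "acquired_at": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(raw_bytes).hexdigest(),
        "size_bytes": len(raw_bytes),
    }

payload = package_upload(b"raw chromatogram data", "LCMS-07", "RUN-0041")
print(json.dumps(payload, indent=2))
```

Because the checksum travels with the data, any corruption in transit or storage is detectable the moment the file is re-read during an audit.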
Some of the more advanced cloud-based platforms can even be adapted to meet specific workflow needs. Systems such as the Thermo Fisher Platform for Science software (Thermo Fisher Scientific, Philadelphia, PA), for example, are based on a modular framework, allowing workflow-specific applications to be installed onto the platform alongside elements such as ELN, LIMS, and SDMS to provide templates that support industry best practice and enhance regulatory compliance. Given their flexible and extensible nature, cloud-based platforms are able to easily integrate additional systems as workflows change and new technologies become available.
Cloud-based platforms are also well equipped to help laboratories deal with changing regulatory requirements: as guidelines change, updates are managed by the cloud service provider, meaning it is no longer necessary to conduct a time- and resource-intensive overhaul of on-site systems. This flexibility to accommodate future needs means that cloud-based platforms allow laboratories to guarantee data integrity while adapting quickly to the evolving needs of biopharmaceutical R&D.
In addition to ensuring data integrity, informatics platforms must allow laboratories to easily demonstrate regulatory compliance. Since data is integrated into a single digital ecosystem, cloud-based informatics platforms allow organizations to easily access the information required to demonstrate their adherence to regulatory requirements. With fragmented data management setups, manually searching through disparate systems for the information needed for an audit could be highly resource-intensive. In contrast, cloud-based platforms organize and cross-reference structured, unstructured, and reference data, making it much easier to search and mine this information.
By storing data in an organized and readily accessible format, cloud-based informatics platforms not only enable laboratories to achieve the highest levels of data integrity, but also allow them to monitor their workflows to increase efficiency and improve quality control. With immediate and easy data access, scientists can evaluate and adjust processes in real time, with reference to regulatory requirements where appropriate. The end-to-end visibility enabled by these cloud-based platforms permits processes to be streamlined, increasing productivity and allowing better decision-making. In the manufacturing stage of the pipeline, these systems can be used to detect trends indicative of deteriorating instrument performance, enabling corrective action to be taken before serious consequences develop. In this way, cloud-based solutions are ensuring high-quality data is managed and analyzed with greater efficiency throughout the R&D pipeline.
Conclusion
As the biopharmaceutical R&D landscape changes and high-throughput technologies continue to generate multidimensional data on an unprecedented scale, it is increasingly challenging to manage data in an efficient and regulatory-compliant way. Cloud-based informatics platforms are ideally suited to overcome these challenges, providing scalable systems that guarantee data integrity, with the flexibility to adapt as both workflows and regulatory requirements change. These platforms are providing laboratories with the fundamental infrastructure to easily achieve and demonstrate compliance, so changing regulatory requirements do not result in a growing burden on laboratory resources.
Trish Meek is director of marketing, Digital Science, Thermo Fisher Scientific, 1601 Cherry St., Ste. 1200, Philadelphia, PA 19102, U.S.A.; tel.: 215-964-6020; e-mail: [email protected]; www.thermofisher.com