Automation in Bioprocess Development: Near-, Mid-, and Long-Term Opportunities

Featured Article

Automation, combined with other technological advances such as the Internet of Things (IoT) and machine-learning analytics platforms, will be a major part of the future of bioprocess development. It will continue to make waves in the industry by removing mundane tasks from scientists' workloads while increasing accuracy and efficiency.

Liquid handling robots and high-throughput process equipment are already commonly found in development labs, and academic groups and technology vendors are pushing the boundaries of microfluidics to create “lab-on-a-chip” devices for process development applications. But, despite the interest and investment in these new technologies, very few labs can currently use them to their full potential.

Shifting the bottleneck

Bioprocess development is very labor-intensive, particularly compared with small-molecule development, and there are many opportunities to take some of the manual handling burden off lab scientists and technicians. When the speed of experiments is increased, however, the bottleneck shifts from equipment setup and manual operations to analytical assay turnaround and data analysis. This is especially true when processes are scaled down to bench or even microwell scale, because more experimental data is needed to account for potential variability.

Automation is more than robotics

Only a few years ago automation was synonymous with robotics. And, initially, lab automation focused on wet lab requirements such as plate handling and pipetting. This led to the creation of assay platforms designed to be run on robotic systems that all but removed the need for human intervention. These platforms were designed to connect with corporate inventory and sample management and dispensing systems, which meant scientists could design experiments and let the robotic system run the assay.

While this type of system clearly brought many advantages, such as the ability to run experiments overnight and on weekends without manual supervision, it tells only part of the story. The definition of automation has now broadened to cover all aspects of the process development workflow, including automated data analysis, automated decision-making, and progressive experiment design.

Advances in technology

As the IoT makes its way from our homes into the laboratory, we are seeing many opportunities to simplify and improve laboratory workflows. Companies such as BioBright (Boston, MA), TetraScience (Boston, MA), and Elemental Machines (Cambridge, MA) are leading the way in demonstrating what technology can deliver for next-generation “smart” laboratories. For example, BioBright's voice-recognition system and “hotfolder” software, which automatically collect all experimental data, including transcriptions of voice recordings, were used in a neuroscience lab at New York University.1 The system not only improved the accuracy of the delicate work more than 20-fold, but also meant that a single person could do the work of two scientists.

Most new instruments coming to market are now IoT-enabled, and vendors are using this technology to deliver better service through features such as automated preventative maintenance and just-in-time (JIT) reagent delivery. Advanced monitoring and alerting can be highly valuable when expensive long-term studies and instruments are involved.
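As an illustration of the kind of alerting such monitoring enables, the following sketch flags a drifting sensor reading. The sensor values, setpoint, and tolerance are all hypothetical, and a real deployment would stream data from the instrument rather than a hard-coded list:

```python
from statistics import mean

def check_drift(readings, setpoint, tolerance):
    """Return an alert string if the mean of the recent readings has
    drifted outside setpoint +/- tolerance, else None."""
    avg = mean(readings)
    if abs(avg - setpoint) > tolerance:
        return f"ALERT: mean {avg:.2f} outside {setpoint} +/- {tolerance}"
    return None

# Hypothetical incubator temperatures (deg C) from an IoT sensor
window = [37.1, 37.0, 38.9, 39.2, 39.5]
alert = check_drift(window, setpoint=37.0, tolerance=1.0)
```

In practice the windowing, thresholds, and escalation rules would be tuned per instrument, but the pattern of continuously comparing live readings against an expected envelope is the same.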

A large pharmaceutical company used TetraScience’s cloud-enabled monitoring technology to detect a potentially disastrous deviation in an accelerated lifetime experiment in time to take corrective action and avoid estimated costs of more than $30 million.2 There are other examples, too. After failing to troubleshoot problems with a high-performance liquid chromatography (HPLC) instrument using traditional methods, a lab in Massachusetts installed a cloud-connected temperature sensor from Elemental Machines near the instrument and soon discovered that the problem was caused by the building’s climate control system.3

So, while robotics will still form the foundation of automation, monitoring analytical instruments in real time, gathering data, and running automated analysis and feedback loops all come into play in the concept of the totally connected lab.

What comes after the connected lab?

With a fully connected and automated environment comes the next phase of development—a self-monitoring, self-regulating closed system that can make decisions on what to do next based on the results of the current experiment. Here, we see the importance of machine-learning techniques, where models can be developed and trained to allow systems to control themselves based on patterns they are seeing in real time.
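As a toy illustration of this closed-loop idea, the sketch below replaces the real assay with an invented response curve and uses simple hill-climbing in place of a trained machine-learning model; the response function, step size, and temperature values are all hypothetical:

```python
import random

def run_experiment(temperature):
    """Stand-in for a real assay: an invented titer response peaking
    near 34 deg C, plus measurement noise. In a real system this value
    would come from the instrument, not a formula."""
    return -(temperature - 34.0) ** 2 + random.gauss(0, 0.1)

def closed_loop(start=30.0, step=1.0, rounds=20):
    """Greedy hill-climbing: each round, test conditions on either side
    of the current best and move toward whichever result is better."""
    best_t, best_y = start, run_experiment(start)
    for _ in range(rounds):
        for cand in (best_t - step, best_t + step):
            y = run_experiment(cand)
            if y > best_y:
                best_t, best_y = cand, y
    return best_t

random.seed(0)
chosen = closed_loop()  # converges near the (invented) optimum of 34
```

A production system would use a proper surrogate model and experimental design rather than greedy search, but the loop structure is the same: run, observe, decide the next condition, repeat without human intervention.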

In addition to academic research, companies such as LabGenius (London, U.K.) are already applying machine-learning algorithms to protein engineering with therapeutic applications in mind. The company developed EVA, an artificial intelligence-driven evolution engine, and is using this technology to learn how to enhance therapeutic protein stability with the potential to address unmet clinical needs.4

The foundations of this approach are IoT and robotics, but their development will require considerable optimization and testing. It also requires the ability to integrate data acquisition and process control strategies across different systems and vendors. There is some standardization in the industry, such as the use of OPC (Open Platform Communications) for interoperability across process equipment, but more open-system architectures and better collaboration between vendors are still needed.
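One way to picture the missing integration layer is a vendor-neutral interface that each piece of equipment is adapted to. The interface, class, and tag names below are invented for illustration and are not part of OPC or any vendor's API:

```python
from typing import Protocol

class Instrument(Protocol):
    """Vendor-neutral acquisition interface (invented for illustration)."""
    def read_tags(self) -> dict[str, float]: ...

class VendorABioreactor:
    def read_tags(self) -> dict[str, float]:
        # In practice this would call the vendor's own client library
        return {"pH": 7.1, "dO2_pct": 42.0, "temp_C": 36.9}

def snapshot(instruments: list[Instrument]) -> dict[str, float]:
    """Merge tag readings from heterogeneous equipment into one record."""
    record: dict[str, float] = {}
    for inst in instruments:
        record.update(inst.read_tags())
    return record

data = snapshot([VendorABioreactor()])
```

Each new vendor then needs only a thin adapter to join the connected lab, rather than a bespoke point-to-point integration.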

Analysis, analysis, analysis

The final part of the system to consider for automation is the analysis and reporting aspects—all important parts of the decision-making process for scientists. Getting the fundamentals of data collection and analysis right is a key prerequisite to any automation effort, yet this isn’t as simple as it sounds.

Typically, the process of data aggregation, analysis, and reporting is still manual, despite automation of the instruments. There are several stages of data analysis, ranging from converting raw measured values into biologically meaningful parameters to comparing data across unit operations, across different analytical assays, and over extended time periods. Improvements at the instrument level can greatly enhance the quality and quantity of data analysis for individual unit operations, but a higher-level integration layer is still needed to link the results of each unit operation together and enable rapid, sophisticated root-cause analysis across the entire process.
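To make the first of those stages concrete, converting raw measurements into a biologically meaningful parameter might look like the following sketch, which fits a specific growth rate to hypothetical optical density readings:

```python
import math

def specific_growth_rate(times_h, od_readings):
    """Least-squares slope of ln(OD) versus time, i.e., the specific
    growth rate mu (1/h) during exponential phase."""
    ln_od = [math.log(od) for od in od_readings]
    n = len(times_h)
    t_bar = sum(times_h) / n
    y_bar = sum(ln_od) / n
    num = sum((t - t_bar) * (y - y_bar) for t, y in zip(times_h, ln_od))
    den = sum((t - t_bar) ** 2 for t in times_h)
    return num / den

# Hypothetical hourly OD600 readings, roughly doubling every 2 h
times = [0, 1, 2, 3, 4]
ods = [0.10, 0.14, 0.20, 0.28, 0.40]
mu = specific_growth_rate(times, ods)  # ~0.35 per hour
```

Automating even this single conversion per run, per assay, removes a repetitive spreadsheet step and yields parameters that can be compared consistently across unit operations.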

Great strides are being made in this area: the concepts of alerting, automated decision trees based on business rules, and research-and-development project progress reporting are starting to emerge. This has the potential not only to reduce the burden of data administration faced by many scientists, but also to give greater visibility of the dependencies between process parameters and quality attributes, leading to much more profound process understanding. The key on the analysis side is making sure that all the different parts of the system are integrated at the data level, and this, in turn, will require collaboration among biopharmaceutical companies, instrument vendors, and software vendors.
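A business-rules decision tree of the kind described above can be sketched in a few lines; the thresholds, field names, and dispositions here are invented for illustration:

```python
def disposition(run):
    """Toy business rules for progressing a development run; thresholds
    and field names are invented for illustration."""
    if run["contamination"]:
        return "discard"
    if run["titer_g_per_l"] >= 2.0 and run["viability_pct"] >= 80:
        return "promote"
    return "repeat"

runs = [
    {"titer_g_per_l": 2.4, "viability_pct": 91, "contamination": False},
    {"titer_g_per_l": 1.1, "viability_pct": 85, "contamination": False},
    {"titer_g_per_l": 2.8, "viability_pct": 88, "contamination": True},
]
decisions = [disposition(r) for r in runs]
```

Encoding such rules explicitly makes the decision criteria auditable and repeatable, and it is the data-level integration discussed above that supplies the fields the rules evaluate.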

By considering automation in this holistic manner, we can see great opportunities for new technology and a streamlining of research and development. The imperative to increase efficiency and reduce time to bring new biologics to market is forcing biopharmaceutical companies to reconsider the ways they interact with and leverage technology. With the advent of continuous processing and new types of biological products such as cell and gene therapies, these advances are now becoming more important than ever.

References

  1. https://www.businesswire.com/news/home/20170314005466/en/BioBright-Tools-Bring-%E2%80%98Superpowers%E2%80%99-Biology-Lab
  2. http://www.tetrascience.com/case-studies/drug-development
  3. https://www.the-scientist.com/bio-business/bringing-the-internet-of-things-into-the-lab-64265
  4. https://www.labgeni.us/#markets-sectionAnchor

Paul Denny-Gouldson is vice president, Strategic Solutions, and Claire Hill is solutions support manager, IDBS, 2 Occam Ct., Surrey Research Park, Guildford, Surrey GU2 7QB, U.K.; tel.: +44 1483 595 000; e-mail: [email protected]; www.idbs.com
