The New Era of the Laboratory

It’s no secret that data is the new oil – but the ultimate value for a life sciences company comes only when it can turn that data into knowledge that drives decision making. The Internet of Lab Things (IoLT) and accompanying software solutions give researchers ready access to complete and accurate data from lab equipment. But as data volumes grow and complexity deepens, that data is outgrowing the traditional means of collection – file transfers, and even manual collection and curation. These approaches are no longer practical and are contributing to the current laboratory productivity crisis. With new IoLT devices capturing data, labs now need comprehensive solutions in place to take that data and seamlessly connect it to decision support software.

To do this successfully, companies need a solution at the end of the chain that not only captures and interprets experimental results but also harmonizes the data and serves it up to project scientists and AI tools for analysis. Driving innovation through good decisions is paramount, and this link in the workflow between capturing and analyzing data is as much about enabling innovation as it is about lab efficiency. All sectors face this information challenge. Data production is exploding across industries, from manufacturing to retail to life sciences to finance – a staggering 2.5 quintillion bytes are created each day. Paired with this is the exponential growth in connectivity, with the market for the Internet of Things (IoT) expected to be worth $520 billion by 2021. While only a small fraction of this volume comes from scientific research settings, both trends are being felt acutely in the life sciences industry.

As we move towards Industry 4.0, the life sciences industry is grappling with the Lab of the Future (LoTF) as labs strive to become tech- and data-driven workplaces. Firms are having to embrace the digital transformation of data handling through lab informatics platforms, alongside the drive for greater automation, robotics, and the introduction of artificial intelligence (AI). The LoTF aims to modernize lab environments by embracing data and technology, including informatics, to extend and optimize operations. Network-capable laboratory devices with automation capabilities are indispensable for this new era, connected to software systems that can drive the data into electronic lab notebooks (ELNs) and on to analytics and decision support systems.

With this amplified focus on data, lab workflows within informatics applications need to be more efficient and controlled than ever before, so that data is valid, representative of what was measured, and held to robust standards. However, the abundance of legacy equipment in many labs creates a technical debt from the past that needs to be overcome. Because each instrument is a large capital investment, these assets necessarily have long working lives. It’s crucial that laboratories are upgraded to automate processes and greatly improve data reliability, especially as more researchers adopt AI, machine learning and deep learning systems, which rely on large volumes of accurate data. So how can labs effectively move towards the LoTF and enable these changes?

Removing the Technical Debt of the Past 

In almost every lab environment there is a wide variety of instruments from different vendors – such as chromatography systems, spectrometers and image analyzers – each with its own unique interface, method of data collection and storage, and level of connectivity to the laboratory infrastructure. Just collecting data from the machines and validating it in each format is a huge undertaking, and it results in an unmanageable deluge of instrument data that cannot be easily transferred to an informatics application for analysis. Because there are so many different instrument vendors, there is a lack of standardization, resulting in significant costs when moving data off legacy instruments into a usable system. Companies simply can’t keep up, and there is no incentive for vendors to develop one standardized format.
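To illustrate the kind of translation work this lack of standardization forces on labs, the sketch below shows how exports from two hypothetical instrument formats might be normalized into a single vendor-neutral record. The vendor names, file layouts, field names and units here are assumptions for illustration only, not any specific manufacturer’s format.

```python
# Hypothetical sketch: normalizing heterogeneous instrument exports into one schema.
# Vendor formats, column names and units are illustrative assumptions.
import csv
import json
from dataclasses import dataclass, asdict
from pathlib import Path


@dataclass
class Measurement:
    """A single, vendor-neutral result record."""
    instrument: str
    sample_id: str
    value: float
    unit: str


def parse_vendor_a_csv(path: Path) -> list[Measurement]:
    """Vendor A (assumed) exports CSV files with 'Sample' and 'Abs' columns."""
    with path.open(newline="") as handle:
        return [
            Measurement("vendor_a_reader", row["Sample"], float(row["Abs"]), "AU")
            for row in csv.DictReader(handle)
        ]


def parse_vendor_b_json(path: Path) -> list[Measurement]:
    """Vendor B (assumed) exports nested JSON keyed by well position."""
    payload = json.loads(path.read_text())
    return [
        Measurement("vendor_b_reader", well, float(reading["od"]), "OD600")
        for well, reading in payload["wells"].items()
    ]


# Dispatch table: one parser per proprietary format, one schema out.
PARSERS = {".csv": parse_vendor_a_csv, ".json": parse_vendor_b_json}


def normalize(path: Path) -> list[dict]:
    """Route a raw export to the right parser and emit uniform records."""
    records = PARSERS[path.suffix](path)
    return [asdict(record) for record in records]
```

Each new instrument format means another parser like these, which is exactly the maintenance burden the article describes.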

Instrument data needs to be fed into an informatics platform in a way that enables scientists to progress with their experiments, makes the data usable, and gives it the right context. Today, it is not uncommon for a scientist to walk around the lab from instrument to instrument with a USB drive and manually copy and paste data because the instruments were not networked when installed – a time-consuming, inefficient, and error-prone process. Often scientists have such low confidence in their data that they will repeat experiments unnecessarily rather than reuse the original results. Even when they do use original data, an estimated 50 percent of it is ‘bad’, i.e. it contains errors, is incomplete, or is of low quality. In research, bad data has bad consequences – for instance, if an AI algorithm ‘learns’ from bad data, its outputs will be significantly skewed.

One further reason labs suffer from bad data is the shortage of digital skills, as life sciences companies continue to compete with traditional tech companies for the best talent. The processes therefore need to be automated and trusted by researchers so they can be confident in their results. Organizations also need to embrace the LoTF to facilitate greater collaboration with partners such as contract research organizations (CROs), which is necessary for modern research. As researchers’ workloads continue to increase, they will look for ways to be more efficient with the time they have available. Connecting everything in the lab is the ultimate goal, yet it is a complex problem to solve, with informatics just one pillar within the overall automation and digitization of the lab.

One Smooth Workflow 

Dotmatics has partnered with TetraScience to offer a data integration solution that automates the end-to-end process of data collection, capture and analysis, moving towards solving these problems by creating a smooth workflow. Instrument outputs are automatically uploaded, standardized, error-checked and parsed by TetraScience, which secures the data in its database; the data is then automatically moved into Dotmatics’ experiment capture and informatics platform in real time for analysis and reporting. Today, up to 80% of the time a scientist spends on experiment analysis can actually go on data acquisition and preparation, so the time and cost savings from almost completely removing this burden from scientists can be huge.

When CROs are involved, the burden of data acquisition and preparation can be even higher. Anecdotal evidence from our conversations with customers suggests that every CRO FTE used by a company can generate 40-50% of an FTE of data processing work in the sponsor company. The Dotmatics-TetraScience integration has also been used in CRO collaborations, facilitating seamless data transfer from one organization to another and eliminating much of this additional time and cost.

Plate Reader Workflow

Scientists prepare and load plates onto an instrument and run a protocol, after which the result file is saved directly from the instrument to a network drive. The TetraScience solution, using the Egnyte data connector, recognizes that a new file has been uploaded and begins the integration process: the file is error-checked, standardized and tagged with the correct metadata, then moved into the TetraScience data lake. Through the TetraScience user interface a scientist can access a detailed explanation of what happened at each step. Once stored in TetraScience, the data is sent via automation to Dotmatics for upload into the ELN or assay data management system, and from there the results can be accessed in the decision support capabilities. The results are stored on the Dotmatics platform, either in the cloud or on-premise, allowing for easy searching, retrieval and integration with other data (a full example workflow is illustrated in Figure 1).

Figure 1: A full example workflow, from the instrument and CRO capturing experiment data, to storage, through the TetraScience solution, and into the Dotmatics platform where it can be analyzed and used to drive decision making.
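As a rough illustration of the automation step in this workflow, the sketch below shows a minimal watch-folder pipeline: it polls a network share for new result files, runs basic error checks, tags metadata, and pushes the standardized record to a downstream ELN over HTTP. The directory path, endpoint URL, field names and instrument identifier are all hypothetical; a real deployment would rely on the TetraScience and Egnyte connectors and their authentication rather than this hand-rolled polling loop.

```python
# Minimal sketch of a watch-folder integration, under assumed paths and endpoints.
import json
import time
import urllib.request
from pathlib import Path

WATCH_DIR = Path("/mnt/plate_reader_exports")         # assumed network share the instrument writes to
ELN_ENDPOINT = "https://eln.example.com/api/results"  # placeholder upload endpoint, not a real API


def validate(path: Path) -> dict:
    """Run basic error checks before the file enters the pipeline."""
    payload = json.loads(path.read_text())
    if not payload.get("wells"):
        raise ValueError(f"{path.name}: no well data found")
    return payload


def tag_metadata(payload: dict, path: Path) -> dict:
    """Attach the contextual metadata the downstream system needs."""
    payload["metadata"] = {
        "source_file": path.name,
        "instrument": "plate_reader_01",  # illustrative identifier
        "captured_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }
    return payload


def push_to_eln(payload: dict) -> None:
    """Send the standardized record to the ELN over HTTP."""
    request = urllib.request.Request(
        ELN_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)


def watch(poll_seconds: int = 30) -> None:
    """Poll the share for new exports and run each one through the pipeline."""
    seen: set[str] = set()
    while True:
        for path in WATCH_DIR.glob("*.json"):
            if path.name in seen:
                continue
            push_to_eln(tag_metadata(validate(path), path))
            seen.add(path.name)
        time.sleep(poll_seconds)
```

The value of the integrated solution is that none of this plumbing has to be written or maintained by the lab itself.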

The Dotmatics and TetraScience integration lays the groundwork for increased use of machine learning and AI in the future, enabling scientists to work more efficiently and spend more time analyzing results and making discoveries instead of going back and forth between resources. Issues with data management need to be addressed quickly, especially as increased collaboration with external partners and contract research organizations becomes the norm. When experiments are designed by one organization and carried out in the laboratory of another, the usability and accessibility of informatics software become imperative in developing the Lab of the Future.
