Reimagining Quality Testing

October 9, 2024

Daniela Pedersen, Ph.D.
VP of Product Marketing, TetraScience

What if quality testing could be easier, faster, and less prone to human errors?

Quality testing is an indispensable step in releasing a therapeutic for clinical use: it ensures the safety, efficacy, and quality of a drug. Extensive documentation must be generated to prove that the required tests have been performed and that the results comply with the specifications. Altogether, this is a tedious, time-consuming process that is often considered non-value-adding for the actual product.

At TetraScience, we have analyzed the quality testing workflows of many customers, identifying data-related bottlenecks and challenges. By reimagining and redefining these workflows with the Tetra Scientific Data and AI Cloud, we’re radically accelerating and improving quality testing.

Capturing instrument data

One common issue arises when lab teams capture data from instruments. Typically, data from small instruments like balances or pH meters is printed and entered manually into systems and documents. While data from more complex systems is usually captured electronically, results are often still transferred manually, sometimes even involving paper-based steps. 

Now, imagine you could collect data in a compliant manner by just taking a picture of the instrument’s display showing the measured value. That data would then be transferred automatically into your system of record without any manual intervention.

Transferring data between systems

Data transfer doesn’t just involve instruments. We see scientists spending a lot of time moving data between lab informatics systems, such as between electronic lab notebooks (ELNs) and analytics tools—sometimes via USB sticks! Or they simply re-enter the data manually.

Imagine having all your scientific data readily accessible in your most common applications, without the need for manual data handling. 

Scientists typically spend 50% of their time on manual data handling.1

Searching for data

Another time-consuming task for scientists is searching for data across hard drives and databases, even when file names are descriptive. Without proper context and labeling, retrieving the right data can be difficult, whether it was created by a colleague or by yourself. If you can’t find the data, you might need to repeat the experiments.

Imagine all your scientific data is automatically contextualized and easily searchable using common scientific metadata, regardless of who created it, where, or when.

Scientists often spend 4 hours or more per week searching for data.2

Analyzing data

Scientists often face challenges when analyzing data, especially when it is incompatible with the tools they prefer to use for gaining the required insights. Data generated by different instruments or vendors may not be comparable, requiring significant preparation just to make it usable for analytics.

Imagine your scientific data automatically engineered as it is captured, ready for immediate use in your analytics tools—no preparation required.

50-80% of scientists' and data scientists' time is spent on low-level data extraction, cleansing, and manipulation tasks in order to get raw or primary data ready for higher-value analysis.3

Creating reports

Creating reports involves gathering the relevant data from different siloed data sources. Teams need to know where to find the data, what it relates to, and how to organize it in the report. We see organizations dedicating immense time and effort to complete this process.

Imagine searching and accessing all the data right from your reporting application, without sifting through a plethora of data sources.

80% of manual documentation work and second-scientist review can be eliminated through digitalization.4

Investigating deviations

Investigations are an additional burden. When deviations occur, teams need to undertake time- and resource-consuming root cause analysis, which delays batch release. We’ve noticed that one of the main issues in this process is a lack of data visibility and the inability to drill down into the details.

Imagine having complete oversight of your data, with dashboards that allow you to quickly identify the reason for deviations and conduct investigations in a fraction of the time.

For most quality teams, the average investigation and closure cycle time hovers around 60 days.5

Validating software

Software validation and its documentation require significant effort, yet they are critical to a biopharmaceutical company’s business and compliance requirements. Having many disjointed systems only increases the burden of this process.

Imagine a single centralized platform for all your scientific data with industrialized, validated interfaces. And imagine you could accelerate your validation processes by leveraging computer software assurance (CSA) along with an extensive verification and validation documentation package.

Go one step further and imagine having a truly digital QC lab with AI/ML capabilities. In this lab, you could detect anomalies, predict and prevent potential deviations, implement corrections faster and easier, and perform real-time release testing or parametric release.

Digital QC labs can expect:

  • 3x increase in sample throughput
  • 80% reduction in deviations
  • 90% faster investigation closure times
  • 200% boost in lab productivity
  • 30-40% reduction in operational costs

Inspired? Check out our Quality Testing Solution Brief to see how this future can become a reality for your lab!

References

  1. 2022 State of Digital Lab Transformation Survey
  2. Increasing efficiency in purification process development
  3. AWS & TetraScience: 500 Pharma Executives Research Survey
  4. McKinsey: Digitization, automation, and online testing: Embracing smart quality control
  5. McKinsey: Making quality assurance smart