Could the Internet-of-Things be the next step in the evolution of Chemistry?

About the author: Theo Martinot is a skilled synthetic organic chemist with over 10 years of industry experience, including methodology development, total synthesis of natural products and pharmaceutical targets, and route evaluation and development. A strong proponent of scientific innovation across all channels, including laboratory automation and data-rich experimentation, he drives efficiency improvements through knowledge infrastructure (Lab Equipment Integration, etc.) and supports implementation of a Design of Experiments (DoE) approach to projects in all stages of development. Theo is currently an Associate Principal Scientist of Discovery Process Chemistry at Merck & Co., Inc., Kenilworth, NJ, USA.

Chemistry provides the foundation for the innovations society depends on every day: from new medicines to greener plastics to healthier foods. Since its inception as a scientific field (i.e., beyond alchemy), it has gradually evolved in its pursuits and techniques to become the mature science that it is today.

And yet, there is ever-growing pressure for chemistry to innovate further and to evolve in both practice and objective.

From my perspective, the typical output of synthetic chemists has been molecules. These molecules are designed with an explicit purpose or function. However, the one-off synthesis of a specific compound has only limited value unless the experiment can be replicated and scaled easily. This is true of science in general: if an experiment cannot be replicated and its results confirmed, its conclusions come into question.

One observation I have made is that the results of our investigations are rich: throughout discovery (i.e., inventing new molecules) and into development (i.e., bringing these molecules to patients), we produce not just a product but a process as well. In other words, the clear output is not only the drug but also the recipe. A thorough understanding and awareness of the synthetic process is critical to doing good chemistry.

Yet numerous challenges limit our ability to capture the relevant data easily and to perform experiments reproducibly.

Some of these challenges have been around forever, and others are compounded by current technology paradigms (e.g., the use of an Electronic Lab Notebook, or ELN). Here are a few examples.

Weighing

Since the beginning of chemistry, a critical component of experimentation has been to accurately and precisely record weights (e.g., of starting materials or products). While the weighing technology (the balance) has changed, the practice – conceptually – has not. This is an example of an old practice that has been complicated by technology.

Consider the way we perform experiments: scientists plan an experiment in their notebook and calculate the precise weight of each ingredient needed. In the “old” days, you would then just bring your notebook to the balance, weigh, and record everything right there. But the ELN is seldom anywhere near the balance, for a number of reasons.

In principle, this means that the scientist has to memorize what he/she has to weigh, weigh it, and then re-type the data into the ELN. In practice, this becomes challenging when multiple reagents are weighed together, so chemists use notes (e.g., a notepad, printed-out charging tables, post-it notes) as a temporary data-transfer step. If you survey an average chemistry lab, you will likely observe that the hood sashes are peppered with chemical structures and notes of various kinds (including weights, temperatures, or other pertinent reaction information).

However, in a regulated environment, companies (because of quality guidelines, including cGMP) forbid the use of temporary data transfer. Cognizant of this challenge, they establish procedures to remedy the problem in a number of ways. But the situation is not optimal:

  • Working in pairs: one scientist performs the operation at the balance and reads out the values to the other, who acts as the scribe. Have you ever played telephone?
  • Alternatively, the scientist performing the operation will repeatedly don and doff PPE while shuttling between the ELN and the balance until the task is completed. “Was that 75.38 g or 75.83 g?”

Inaccuracies? Possibly.
Wasted time? Certainly.
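It does not have to be this way. Many laboratory balances can already stream their readings over a serial (RS-232 or USB) connection. As a rough illustration only, here is a minimal Python sketch (using the pyserial library) of what direct electronic capture could look like. The port name, the assumption that the balance emits one plain-text reading per line, and the local CSV log standing in for an ELN interface are all placeholders that would differ with the actual instrument and software in use.

```python
# Minimal sketch: capture a reading directly from a balance's serial port and
# append it, timestamped, to a local log instead of a post-it note.
# Assumes pyserial is installed and that the balance streams one reading per
# line as plain text; real instruments use vendor-specific protocols.
import csv
import datetime

import serial  # pyserial


def capture_weight(port="/dev/ttyUSB0", baudrate=9600, log_path="weights.csv"):
    with serial.Serial(port, baudrate, timeout=5) as balance:
        raw = balance.readline().decode("ascii", errors="ignore").strip()

    # Parsing is vendor-specific; here we simply assume the line contains a
    # number followed by a unit, e.g. "75.38 g".
    value, unit = raw.split()[:2]

    record = {
        "timestamp": datetime.datetime.now().isoformat(timespec="seconds"),
        "weight": value,
        "unit": unit,
    }
    with open(log_path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=record.keys())
        if f.tell() == 0:  # write a header only for a brand-new file
            writer.writeheader()
        writer.writerow(record)
    return record


if __name__ == "__main__":
    print(capture_weight())
```

In a real deployment the CSV would be replaced by whatever interface the ELN exposes, but the principle is the same: the number goes from the balance to the record of the experiment without a human re-typing it.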

Re-doing experiments

As scientists, we are always running on tight deadlines. So while we may not be working 24 hours a day, we try to maximize our output by having our chemistry do so for us. Some of my greatest frustrations have come from experiments that failed unexpectedly in the middle of the night. Maybe the chiller shut down, the temperature deviated, or the compound began to precipitate. The bottom line is that we often don’t know what happened beyond the fact that the experiment failed for some reason. So we either concede that the experiment did not work or repeat it to troubleshoot. But it is frustrating (and a waste of time) to start over from square one and/or chase a root cause of failure that may never manifest itself again.

Another significant downside to performing overnight reactions is that the data captured is usually very lean (often only one data point – at the end). You can usually pick these experiments out when a procedure reads “reaction was stirred for 16 h”. But what happened after 2 h, 8 h, or 12 h? Is 16 h really necessary? Did an impurity develop over time that could have been controlled with a shorter reaction? Understanding a reaction profile is essential to controlling a process. And while the overnight reaction satisfies the need to turn experiments around quickly, it does so at the cost of the richer information that comes from capturing multiple data points along the way.
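As a thought experiment, the sketch below shows how little is needed to turn “stirred for 16 h” into an actual time-course: a hypothetical read_temperature() function (a stand-in for whatever probe or sensor interface a given setup exposes) is polled at a fixed interval, and each reading is written, timestamped, to a CSV file. It is a sketch under assumed hardware, not a finished solution, but it captures the idea of collecting many data points instead of one.

```python
# Minimal sketch: log a temperature profile overnight instead of capturing a
# single end-point. read_temperature() is a placeholder for a real probe.
import csv
import random
import time
from datetime import datetime


def read_temperature():
    """Placeholder sensor read; replace with a real probe/thermocouple call."""
    return 25.0 + random.uniform(-0.5, 0.5)


def log_profile(hours=16, interval_s=600, path="reaction_profile.csv"):
    """Record one temperature point every `interval_s` seconds for `hours` hours."""
    points = int(hours * 3600 / interval_s)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "temperature_C"])
        for _ in range(points):
            writer.writerow([datetime.now().isoformat(timespec="seconds"),
                             round(read_temperature(), 2)])
            f.flush()  # keep the file current even if the run dies mid-way
            time.sleep(interval_s)


if __name__ == "__main__":
    log_profile()
```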

Unfortunately, we have become numb to these pains day-to-day. Worse yet, some do not even recognize these inefficiencies as important enough to warrant a solution. We assume the consequences (project delays, irreproducibility, etc.) are simply the cost of doing business. While some solutions have emerged to address these problems (automated reactors, robotic weighing, etc.), few have truly lived up to expectations. They end up being too complicated, too expensive, too inflexible, and, ultimately, inadequate.

Where I see the greatest opportunity is in simple, elegant solutions that enhance the scientist.

The Internet-of-Things (IoT) provides a technical foundation to make this a reality. Whether capturing weights electronically from a balance, remotely pumping reagents, or logging a temperature and notifying me of a deviation, the IoT promises to improve the quality of our chemistry and bring the scientist closer to both the experiment and the data.
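To make that last point concrete, here is one more hedged sketch: a simple watcher that compares each temperature reading against a tolerance band and posts an alert to a hypothetical webhook URL when the reaction drifts out of range. The read_temperature() placeholder and the endpoint are assumptions, but the point stands: very little code sits between a sensor and a 2 a.m. notification.

```python
# Minimal sketch: notify the chemist when a monitored temperature drifts
# outside its allowed band. The webhook URL and read_temperature() are
# placeholders for whatever notification channel and sensor are really used.
import time

import requests

WEBHOOK_URL = "https://example.com/lab-alerts"  # hypothetical endpoint


def read_temperature():
    """Placeholder; replace with the actual probe interface."""
    return 24.7


def watch(setpoint=25.0, tolerance=2.0, interval_s=60):
    while True:
        temp = read_temperature()
        if abs(temp - setpoint) > tolerance:
            requests.post(WEBHOOK_URL, json={
                "message": f"Temperature deviation: {temp:.1f} C "
                           f"(setpoint {setpoint:.1f} ± {tolerance:.1f} C)",
            }, timeout=10)
        time.sleep(interval_s)


if __name__ == "__main__":
    watch()
```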

I am excited to see how the field of chemistry evolves. As industries seek to drive innovation more efficiently, the future will most certainly involve a greater emphasis on quality and productivity. I’m optimistic that technology will play a critical role in that future, and that the Internet-of-Things is the linchpin.
