
White Paper

Optimizing Chemical Development with DRE and Mechanistic Kinetic Modeling

Table of Contents
1. Introduction
2. Data-Rich Experimentation (DRE) and Modeling – The Basis for Optimized Process Development
3. Choosing the Right Model for the Right Problem
4. Streamlining Chemical Process Development With DRE and Mechanistic Kinetic Modeling
5. How to Apply DRE and Mechanistic Kinetic Modeling
 5.1 Common Applications
 5.2 Tips for Successful Implementation
6. Integrating DRE and Mechanistic Kinetic Modeling With the METTLER TOLEDO and Scale-Up Suite Ecosystem
7. Case Studies
8. A Look Toward the Future
9. In Summary – The Value of Combining Data-Rich Experimentation and Mechanistic Kinetic Modeling
10. References
1. Introduction
Efficient chemical process development and scale-up is becoming increasingly important across the
pharmaceutical, fine chemical, and agrichemical industries due to growing pressure to reduce timelines, prioritize
sustainability, and cut costs. Recent success stories, such as the COVID-19 vaccines, and technological advances
such as digitalization and automation have set the precedent of bringing high-quality products to market sooner.
Sustainability, encompassing environmental considerations and all aspects of product safety, has become a key
strategic goal of most companies. At the same time, the molecular structures, novel synthetic routes, and methods
of modern chemistry continue to become more complex, and often result in the need for additional processing time
and steps. Meeting the demand for frequently conflicting economic, safety, environmental, and quality goals
requires in-depth understanding and subsequent optimization of chemical reactions and processes at all stages
of development.

Data-rich experimentation (DRE) and modeling are powerful tools that are increasingly being used to enable
scientists to effectively and efficiently accomplish these tasks. Leveraging either technique individually can lead
to significant improvements in process efficiency, understanding, and optimization. Combining the two opens the
possibility for exponential benefits. The recent joining of Scale-up Systems and METTLER TOLEDO brings together
decades of experience in these two areas, combining knowledge and tools across advanced process analytical
technology, automated reactor systems, and process modeling.

This white paper shows how integrating DRE and mechanistic kinetic modeling into a seamless chemical process
development workflow enables significant gains in process understanding and efficiency, reducing the amount of
time and resources required for optimization and ensuring right first time scale-up. Data-rich experimentation and
different categories of modeling are introduced, and the tangible benefits of combining the two are outlined. Specific
focus is placed on the exponential value and benefits of a combined data-rich experimentation and mechanistic
kinetic modeling approach. Case studies drawn from industry and academia further illustrate the power of this
combined approach and show how METTLER TOLEDO and Scale-up Systems technologies have been successfully
applied to optimize various problems and processes at different stages of chemical development.

Figure 1: Accelerating the chemical process development workflow: Data-rich experiments (DRE) provide high-quality, time-course data used to build
robust mechanistic kinetic models that enable better design of subsequent experiments and next steps, effectively turbocharging the iterative
workflow and delivering more information with fewer experiments.



2. Data-Rich Experimentation (DRE) and Modeling –
The Basis for Optimized Process Development
Data-rich experimentation powered by the adoption of automated reactors, process analytical technology
(PAT), and automated sampling has become an increasingly popular technique for optimizing chemical
process development. Dense, often real-time data is systematically collected, processed,
and analyzed over the entirety of a reaction, allowing scientists to gain more information
from every experiment. The rich quality of information gained provides in-depth insight
into reaction performance, variables, and outcome, leading to significant improvement
in the efficiency and quality of scientific research.1,2 One recent example is the use of
data-rich experimentation followed by functional principal component analysis (FPCA)
to improve the process characterization of a fluorination reaction that is a crucial
step in the synthesis of the HIV drug, islatravir.3 Data-rich experimentation has
also been used in combination with automated sampling technologies and
modeling techniques to enable and optimize production of the cancer drug,
divarasib,4 and heart failure drug, omecamtiv mecarbil.5

Modeling is often seen as a separate activity from experimentation, but in practice, a model is only as robust as the
data used to build it. The quality and density of data available from existing resources such as databases is still
limited, and even when such data is available, it is unlikely to account for all variables and parameters involved in
the specific chemistry being studied. Data-rich experiments provide an ideal source of the dense, high-quality data
needed to build successful models. Given the right data, a well-designed and executed model can significantly
reduce the cost and resources required to fully understand and optimize chemical reactions and processes, and
even predict performance outside a predetermined design space. Combining this approach with FAIR (Findable,
Accessible, Interoperable, Reusable) digitalization initiatives can help ensure that the data collected provides
maximum value to your organization, and can enable automation of some or all of the process, including the
transfer of data between DRE and modeling steps.



3. Choosing the Right Model for the Right Problem
While the overarching argument and case studies in this white paper focus on the application of mechanistic kinetic
modeling, there are several different approaches to modeling chemical reactions and processes. These can be
systematically divided into three categories: static empirical models, dynamic empirical models, and mechanistic
kinetic models. The different approaches to modeling have different requirements and offer a range of different
benefits. A large part of a process model’s success involves choosing the right model for the right problem. The best
type of model to use depends on a variety of factors including the stage of development, the information required
from the model, and the amount of experimental resources available.

Static Empirical Models


Static empirical models are based on data collected at a single timepoint during a reaction. The most common
technique within this group is Design of Experiments (DoE), which uses statistical techniques to build a model
that empirically describes the relationship between measured experimental variables and the input and output of
a reaction. In most cases, DoE studies involve performing predefined experiments and collecting data at a single
timepoint, usually at the end of a reaction. Therefore, they do not provide direct information about the dynamics of
the system under study, although time can be a variable.

DoE studies are powerful for screening the performance of a chemical reaction over a range of variables and
conditions that are a mixture of continuous (e.g., concentration, temperature) and discrete (e.g., different reagents,
catalysts). DoE is therefore widely used in early-stage experimental design, when the focus is on identification of
the best combinations of materials and conditions to apply to a process. While DoE can also be used for reaction
optimization, it does not allow for predictions outside the measured design space and cannot provide reliable
predictions of process variables related to physical transformations such as mixing, heat transfer, or feed rates.
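To make the mechanics concrete, the following sketch builds a minimal static empirical model of the kind a DoE study produces: a two-level full factorial in two coded factors, fit by ordinary least squares. The factor names, levels, and yield values are invented for illustration and are not drawn from any study cited here.

```python
# Minimal sketch of a static empirical (DoE-style) model: a two-level full
# factorial in two coded factors, fit by ordinary least squares.
import itertools
import numpy as np

# Coded levels (-1 / +1) for temperature and reagent concentration
design = np.array(list(itertools.product([-1.0, 1.0], repeat=2)))

# Hypothetical end-point yields, measured once per run
yields = np.array([62.0, 71.0, 68.0, 85.0])

# Model: yield = b0 + bT*T + bC*C + bTC*T*C (main effects + interaction)
X = np.column_stack([
    np.ones(len(design)),         # intercept
    design[:, 0],                 # temperature main effect
    design[:, 1],                 # concentration main effect
    design[:, 0] * design[:, 1],  # interaction term
])
coef, *_ = np.linalg.lstsq(X, yields, rcond=None)
print(dict(zip(["b0", "bT", "bC", "bTC"], coef.round(2))))

# The fitted surface interpolates within the coded design space (-1..+1);
# extrapolating beyond it is unreliable, as noted in the text.
print("fitted yields:", (X @ coef).round(1))
```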

Dynamic Empirical Models


Dynamic empirical models use reaction profile data, such as that collected by DRE, to build a statistical model
that describes the empirical relationship between experimental input and output variables. Two recent examples
are Dynamic Response Surface Methodology (DRSM)6,7 and Knowledge Constrained Empirical Modeling (KCEM).8
Because dynamic empirical models are based on dynamic time-course data, this type of model can offer direct
insight into system change.

Dynamic empirical models are valuable when there is limited mechanistic understanding and can be used to
optimize dynamic experimental variables such as reaction time. However, because they are based on empirical
relationships between measured experimental data, dynamic empirical models have the same practical
disadvantages as DoE. They should not be used to predict outside the experimentally measured design space or for
processes involving physical transformations.
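The following sketch illustrates the dynamic empirical idea in miniature, loosely in the spirit of DRSM: time-course conversion profiles are fit with a polynomial basis in time whose coefficients depend linearly on a coded factor, so the fitted model can predict a complete profile at an unmeasured factor level inside the design space. The data are synthetic, and the basis choice is an assumption rather than the published DRSM formulation.

```python
# Minimal sketch of a dynamic empirical model: polynomial-in-time profiles
# whose coefficients vary linearly with a coded factor. Data are synthetic.
import numpy as np

t = np.linspace(0.0, 1.0, 20)          # normalized reaction time
levels = np.array([-1.0, 0.0, 1.0])    # coded temperature levels
DEG = 4                                # basis terms 1, t, t^2, t^3

# Synthetic "measured" conversion profiles, faster at higher factor level
profiles = [1.0 - np.exp(-(2.0 + 1.2 * x) * 3.0 * t) for x in levels]

# y(t, x) = sum_j (a_j + b_j*x) * t^j -> 2*DEG unknowns, one least-squares fit
rows, y = [], []
for x, prof in zip(levels, profiles):
    for ti, yi in zip(t, prof):
        basis = [ti ** j for j in range(DEG)]
        rows.append(basis + [x * b for b in basis])
        y.append(yi)
theta, *_ = np.linalg.lstsq(np.array(rows), np.array(y), rcond=None)

# Predict the full profile at an unmeasured level x = 0.5 (inside the design
# space; extrapolating beyond +/-1 would be unreliable, as noted in the text)
B = np.array([[ti ** j for j in range(DEG)] for ti in t])
pred = B @ theta[:DEG] + 0.5 * (B @ theta[DEG:])
print("predicted conversion at end of run, x=0.5:", round(float(pred[-1]), 3))
```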

Mechanistic Kinetic Models


Mechanistic kinetic models are based on scientific understanding of chemical and physical transformations
occurring in a process rather than statistical relationships between empirical variables and outcomes. This means
that kinetic parameters, usually in the form of individual chemical steps involved in a reaction mechanism, along
with any relevant physical transformations, such as mixing, heat transfer, rates of addition, and removal of a
material, must be defined prior to building the model. The relationship between these components is then modeled
mathematically. High-quality dynamic data, such as that provided by DRE, is necessary to gain the required level
of process understanding needed to build the model and ensure best fit. Mechanistic data analysis techniques
like Reaction Progress Kinetic Analysis (RPKA)9 and Variable Time Normalization Analysis (VTNA)10,11 can be
useful for gaining mechanistic understanding. However, these are not full mechanistic kinetic models in their own
right because they only highlight reaction driving forces in a qualitative manner. They do not contain an internal
representation of reaction steps and parameters and therefore do not enable reliable quantitative prediction of
reaction performance.

Full mechanistic kinetic models require more data and planning to build and optimize than empirical models, but
offer significantly greater insight and predictability outside the measured experimental design space. Thus, they
can be particularly useful for solving difficult reaction optimization challenges, understanding the impact of different
process design options including different equipment, predicting behavior outside a predefined design space, and
changing the scale of an operation. Mechanistic kinetic models offer the most value when applied to systems where
a significant level of effort is required to deliver a commercially viable product, for example if control of competing
reaction outcomes is challenging, especially as the scale of operation is changed.
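As a minimal illustration of what "mechanistic" means in practice, the sketch below writes a two-step mechanism (a desired coupling competing with an impurity-forming dimerization) as mass-action ODEs and integrates them numerically. The species and rate constants are assumed for illustration; tools such as Dynochem or Reaction Lab handle this bookkeeping, the parameter fitting, and the scale-dependent physics internally.

```python
# Minimal sketch of a mechanistic kinetic model: A + B -> P (desired) in
# competition with 2A -> D (dimer impurity), as mass-action ODEs.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 0.8, 0.05   # L/(mol*min); assumed values for illustration

def rhs(t, c):
    A, B, P, D = c
    r1 = k1 * A * B            # desired coupling
    r2 = k2 * A * A            # impurity-forming dimerization
    return [-r1 - 2.0 * r2, -r1, r1, r2]

c0 = [0.5, 0.6, 0.0, 0.0]      # initial concentrations, mol/L
sol = solve_ivp(rhs, (0.0, 120.0), c0)

# Because every species is represented, the same fitted model can predict
# end-of-reaction purity at conditions never run in the lab.
A, B, P, D = sol.y[:, -1]
print(f"conversion {1 - A / c0[0]:.1%}, product {P:.3f} M, dimer {D:.3f} M")
```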

Hybrid Mechanistic Kinetic Modeling Approaches


In some cases, taking a hybrid modeling approach works best. This might entail using different models at different
stages of the chemical development process. For example, DoE modeling may be all that is needed when first
developing a reaction, whereas a full mechanistic kinetic model offers more value for resources invested during
scale-up. Scientists at Merck recently combined DoE, dynamic empirical modeling, and mechanistic kinetic
modeling to optimize the process design of Nemtabrutinib.12 It may also be that some parts of a system are difficult
to model mechanistically, even if a full mechanistic kinetic model would be the ideal approach. Scientists at
AstraZeneca recently reported the use of a hybrid modeling approach to the design and optimization of a dynamic
kinetic resolution process for the synthesis of a chiral amine.13 This system is described in more detail later in this
document as a case study.

A Powerful Combination: DRE and Mechanistic Kinetic Modeling

Static and dynamic empirical models are widely used and often combined with DRE to optimize processes.
However, there is particular power in the combination of mechanistic kinetic modeling and DRE, especially when
solving complex chemical process development challenges. Because mechanistic kinetic models are based on
scientific understanding, they can be used:

• predictively to explore beyond the experimental design space,
• to account for physical transformations, and
• to run virtual experiments and develop digital twins.

DRE provides the ideal source of high-quality, time-course data needed to build successful models. Better data
yields better models, which help to build better process understanding and meet targets in less time and with fewer
experiments, effectively streamlining chemical process development.

► mt.com/scale-up


4. Streamlining Chemical Process Development With DRE and
Mechanistic Kinetic Modeling
An Integrated Chemical Process Development Workflow

Optimizing chemical reactions and processes typically follows a Design, Make, Test, and Analyze (DMTA)
workflow. This terminology, which stems from the pharmaceutical industry, essentially follows the scientific method.
Scientists must first design an experiment or series of experiments. They then carry out the experiment(s) to make
and test the desired compound. The resulting data is then analyzed to inform next steps. The workflow repeats until
the desired result is achieved. Using traditional tools and techniques, this workflow is typically manual,
labor-intensive, and requires many iterations (Figure 2a). The rate of progress and number of experiments needed
are greatly dependent upon the experience and knowledge of the individual scientist.

Figure 2a. Traditional Workflow: manual, labor-intensive, and requires many experiments.

Integrating DRE and mechanistic kinetic modeling into this workflow significantly reduces the amount of time and
resources required to achieve the level of chemical understanding needed to meet the specified goal, particularly
when optimizing complex, work-intensive chemical development processes. Even in simpler cases, adopting
a combined DRE and mechanistic kinetic modeling approach can increase efficiency and chemical understanding,
and highlight potential process scale-up and technology transfer risks.

Figure 2b shows how DRE and mechanistic kinetic modeling work synergistically to reduce the amount of time,
resources, risk, and iterations required to achieve the desired result. Data-rich experiments effectively combine and
enhance the make and test steps, providing greater value for each experiment performed. Automated lab reactor
systems in combination with on- or at-line sampling and analytical tools enable experiments to run efficiently,
gathering large amounts of high-quality data across the entirety of an experiment with minimal human intervention.
This rich, time-course data is then processed and analyzed, yielding in-depth chemical understanding from far
fewer experiments. Mechanistic kinetic modeling then leverages the high-quality data to permit rapid analysis and
exploration of the design space, including physical factors and the ability to predict the impact of changes outside
the bounds of what has been empirically measured. This information can then be used to identify gaps, design next
steps, and even run future experiments in a purely virtual environment.

Figure 2b. Combined DRE and Modeling Workflow: integrated, highly efficient, and at least partially automated.

Key Advantages and Benefits


Key advantages of integrating DRE and mechanistic kinetic modeling into the traditional chemical process
development workflow are summarized in Table 1. Integrating DRE increases data density and reduces the number
of resources and experiments required to optimize reactions and processes. Adding a mechanistic kinetic model
widens the accessible design space, accounts for physical transformations, and enables prediction and virtual
experimentation. Combining the two approaches effectively turbocharges the DMTA cycle, transforming it into a
highly efficient ecosystem that allows scientists to optimize processes and achieve targets in less time, with fewer
resources, and with greater assurance of success than a traditional workflow.



| | Typical Workflow (Many Exp., Few Data Points) | Data-Rich Experiments (Fewer Exp., Many Data Points) | Physical and Virtual Experiments (Infinite Data Points) |
| Experiment Number/Type | Many Physical | Fewer Physical | Few Physical, Many Virtual |
| Lab Time/Resources Required | Significant | Reduced | Reduced |
| Reduced Time-to-Information | • | • | • |
| Real-Time Insight | • | • | • |
| Simulate New Conditions | • | • | • |
| Optimize Physical Parameters | • | • | • |
| Reliably Predict Scale-up | • | • | • |
| Easy Tech-Transfer | • | • | • |

Table 1. Demonstration of technical advantages provided by data-rich experimentation and its combination with mechanistic kinetic modeling, compared
with typical low data density experimentation.

A well-designed and integrated DRE and mechanistic kinetic modeling workflow can be leveraged to reach the following
scientific goals and business outcomes:

Scientific Goal » Business Outcome

Identify knowledge gaps and optimal design space early on » Reduced development cost and timeline
Identify optimum solvent options and temperature ramps » Increased quality and yield
Understand and predict thermal process safety issues related to changing equipment and scale » Avoid failure, inherently safer processes
Predict downstream impact in silico » Reduced development timeline, less rework
Optimize reactions and processes for reduced waste » More sustainable development

One overarching advantage of a combined DRE and mechanistic kinetic modeling approach is flexibility. A well-planned
and implemented DRE-driven mechanistic kinetic model can be modified to account for additional parameters or design
constraints with minimal additional work. In most cases, strategic adjustments to the model and performing a handful of
additional experiments leveraging the already implemented DRE workflow will provide the information required to ensure
robust model expansion.

The same principle applies, whether optimizing individual reactions, entire processes, or stages in the product development
lifecycle. This flexibility is particularly useful for accelerating knowledge and tech transfer between subsequent stages
of development, for example when scaling up processes from lab to plant. A successfully implemented DRE-driven
mechanistic kinetic model used to optimize reactions and processes at lab scale can subsequently be used to predict and
virtually test how changes in conditions and environment will affect safety, critical quality attributes, and reaction outcomes
at lab and plant scale, thereby increasing the likelihood of right first time scale-up.



5. How To Apply DRE and Mechanistic Kinetic Modeling
Successful implementation of DRE and mechanistic kinetic modeling as part of a streamlined DMTA workflow
requires an initial investment of time and resources. However, the magnitude of that initial investment is small
compared to the overall benefits derived from being able to thoroughly understand, predict, and control process
performance. A well-designed set of relatively few DRE experiments can be used to define, build, and fit a
mechanistic kinetic model that leverages the full power of physical and virtual experimentation to solve complex
process design challenges.

The work required to successfully apply such an approach follows a logical step-by-step sequence:

1. Design Experiments (understand driving forces)
• Design DRE with varied parameters to understand the driving forces
• Include chemical variables and physical variables
• Select analytical technique

2. Run Experiments (collect and interpret data to gain understanding)
• Run initial experiments and collect data (generally 5-10 DRE)
• Compare reaction profiles of reactions run under different conditions
• Identify which parameters have the largest impact, either desired or undesired

3. Propose Mechanism and Build Model (propose and build base model)
• Propose one or more possible mechanisms based on understanding of driving forces (including impurity-forming pathways)
• Build a base model

4. Estimate Kinetic Parameters (test and modify model)
• Estimate kinetic parameters by fitting experimental data to the model (a minimal sketch of this fitting step appears after this list)
• If there is poor agreement between predicted and experimental outcomes, test alternative reaction mechanism(s)
• If conformity is still not achieved, gather additional experimental data based on knowledge gaps
• Repeat previous steps until conformity is achieved

5. Apply Model (validate and apply model)
• Validate the model experimentally by comparing the predicted outcome to the outcome from further experiment(s)
• If there is good agreement, the model can be used predictively
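A minimal sketch of the fitting step (step 4) under an assumed A → B → C mechanism follows: rate constants are regressed against time-course concentration data by nonlinear least squares. The "experimental" data here are synthetic profiles with added noise, standing in for DRE output.

```python
# Minimal sketch of kinetic parameter estimation for A -> B -> C by
# nonlinear least squares against (synthetic) time-course data.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

t_obs = np.linspace(0.0, 60.0, 13)

def simulate(k, t_eval):
    k1, k2 = k
    rhs = lambda t, c: [-k1 * c[0], k1 * c[0] - k2 * c[1], k2 * c[1]]
    return solve_ivp(rhs, (t_eval[0], t_eval[-1]), [1.0, 0.0, 0.0],
                     t_eval=t_eval).y

rng = np.random.default_rng(1)
data = simulate([0.10, 0.03], t_obs) + rng.normal(0.0, 0.01, (3, t_obs.size))

fit = least_squares(lambda k: (simulate(k, t_obs) - data).ravel(),
                    x0=[0.5, 0.5], bounds=(0.0, np.inf))
print("fitted k1, k2:", fit.x.round(3))   # should recover ~0.10, 0.03

# A poor fit here would prompt testing an alternative mechanism or gathering
# additional data targeted at the knowledge gap, as described in step 4.
```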

5.1. Common Applications

Once validated, a mechanistic kinetic model can be applied to a wide range of useful applications related to process
design, optimization, and scale-up, for example:

1. Reaction optimization based on defined process constraints.


2. Identification of reaction conditions to meet required process specifications.
3. Prediction of a suitable approach for running the reaction on larger scale or with alternative material contact
patterns (e.g., comparison of reagent dosing strategy options).
4. Prediction of performance in different process equipment (e.g., batch to flow, scale-up from pilot to manufacturing, etc.)
5. Ideas for future experiments required to extend the model predictivity.



5.2. Tips for Successful Implementation

1. Start early. The understanding gained from a few well-designed experiments used to propose and build a full
kinetic model can provide early insight into viable process design options and help avoid pursuing ‘dead ends,’
which will not deliver the desired results.
2. Understand driving forces. DRE to support modeling should be designed to understand the impact of key
driving forces on reaction profiles (e.g., reagent concentrations, temperature, mass transfer) rather than mirror
final processes. This often entails varying parameters by a significant extent, such as +/- 50-100% for material
concentrations and +/- 10-20 °C for temperature.
3. Understand mass balance. DRE should be targeted at understanding the overall mass balance of the process.
Besides being important for effective process scale-up, mechanistic kinetic modeling is most reliable when a
complete mass balance understanding is available.
4. Validate before use. The model should always be validated experimentally before being used predictively. Once
reasonable agreement between the model’s predictions and experimental outcomes has been established, the
model may be used predictively.
5. Consult experts when necessary. If no in-house expertise is available, METTLER TOLEDO AutoChem and
Scale-up Systems can provide extensive resources on how to achieve success in both DRE and mechanistic
kinetic modeling.



6. Integrating DRE and Mechanistic Kinetic Modeling With the
METTLER TOLEDO and Scale-up Suite Ecosystem
Successful implementation of a combined DRE and mechanistic kinetic modeling approach requires integrating
experimental data and insights gained into the model build. Using a range of hardware and software solutions from
different vendors is a barrier to the widespread adoption of this approach. This often involves complex, homemade
Excel solutions and requires a large number of manual interactions. In addition, mechanistic kinetic modeling
software has traditionally been seen as an ‘expert system,’ best used by dedicated physical organic chemists.

The combination of METTLER TOLEDO automated reactors and process analytical technology and
Scale-up Systems modeling software provides a unified, straightforward workflow enabling the easy execution of
data-rich experimentation and seamless import into mechanistic kinetic modeling software. The iC Software Suite
and Scale-up Suite already have significant levels of interoperability, and this functionality will continue to develop
rapidly.

METTLER TOLEDO provides a suite of integrated tools including:

Synthesis Workstations:
EasyMax and OptiMax automated reactors provide the precisely controlled environments required for obtaining
kinetic and thermodynamic data for modeling and simulation. Information provided by models can also be tested
and validated in these workstations, for example via scale-down experiments.

Process Analytical Technology:
ReactIR and ReactRaman spectrometers with associated probe and flow cell technology are the means to acquire
data-dense, real-time analytical measurements via molecular spectroscopy.

EasySampler provides timed sampling of reactions for analysis by chromatography and other offline techniques.

ParticleTrack and EasyViewer in-situ probe systems are real-time technologies for investigating particle and crystal
development and kinetics.

These synthesis workstation and PAT systems are all linked by the iC Software Suite, an integrated control and data
analysis system, which can provide a single source of data for process modeling.

Modeling and Simulation Software:
Dynochem and Reaction Lab are used to develop mechanistic kinetic models based on experimental data.
Dynochem supports all API process steps, while Reaction Lab is designed for reaction kinetic modeling by process
chemists, with an interface designed to be accessible to all levels of modeling expertise.



7. Case Studies
The following articles from scientists at AstraZeneca, Boehringer Ingelheim, Merck, and Siegfried AG, respectively,
demonstrate the importance and value of integrating physical and virtual experiments for optimizing processes and
scale-up, while reducing experimental burden.

7.1. Copper-Catalyzed Racemization-Recycle of a Quaternary Center and Optimization


Using a Combined Kinetics-DoE/MLR Modeling Approach
Cox, R. J., McCreanor, N. G., Morrison, J. A., Munday, R. H., & Taylor, B. A. (2023). Copper-Catalyzed Racemization-Recycle of a Quaternary
Center and Optimization Using a Combined Kinetics-DoE/MLR Modeling Approach. The Journal of Organic Chemistry, 88(9), 5275–5284.
https://doi.org/10.1021/acs.joc.2c02588


For a copper-catalyzed racemization and associated dimerization, mechanistic kinetic and statistical modeling
are combined using a single, common data set to yield an effective method to gain mechanistic and process
understanding. This approach yields a more complete model to support process development, and enables the
user to decide whether to prioritize mechanistic understanding, empirical understanding, or a combination of both.
Data acquired for this work was obtained using an EasyMax system equipped with EasySampler. Mechanistic
kinetic modeling was performed using Dynochem. The use of data-rich experimentation, combined with a hybrid
mechanistic and empirical model, allowed the efficient development of process understanding that underpinned the
identification of an optimized reaction space that had not been previously investigated experimentally.

"The resulting modeling/understanding-based approach led directly to a new and improved reaction space that was
accurately predicted before experimental work had been carried out. The combination of kinetic and statistical
modeling for interrogation of a single, common data set provided an experimentally efficient method of gaining
mechanistic and process understanding."

» In-depth process understanding and modeling offered insight into the mechanism and behavior of
copper-catalyzed racemization of a pharmaceutical API with a chiral amine center and associated dimerization.
» Mechanistic insight and process understanding of the racemization came from applying a combined
kinetics-statistical modeling approach, rather than separate kinetic and multiple linear regression modeling.
» This enabled the prediction of a new and more effective reaction space prior to performing the experimental
work, saving substantial time and resource expenditure.

Dynamic kinetic resolution (DKR) of chiral amines can provide high yield and enantiomeric efficiency; however,
this is typically performed on less complex amines and rarely on complex quaternary ring systems. The goal of
this work was to implement an effective strategy for racemization of the chiral center of an API, which has a chiral,
spirocyclic center [Figure 1].
Figure 1. The racemization strategy proposes treating S-1 with an azaphilic Lewis acid leading to C–N bond cleavage, rotation of the C–N bond,
and ring closure.



To effectively accomplish DKR for a late-stage intermediate requires optimization and in-depth process
understanding to minimize impurity formation and maximize process efficiency. To achieve this, a combined
kinetics-statistics modeling approach was used for development of the racemization step, giving a more versatile
approach than using kinetic and MLR modeling separately.
Through initial high-throughput screening studies, tetrakis(acetonitrile)copper(I) triflate in anisole was chosen for
further optimization of the racemization step [Figure 2]. A key challenge with the reaction was control of dimeric
side-products.

Figure 2. Copper catalyst and reaction conditions selected for racemization development.

An experimental design that would accommodate both kinetic and multiple linear regression (MLR) modeling was
pursued, with the intent to develop a full mechanistic kinetic model that would be useful regardless of the
equipment, scale, and factor ranges, and which utilized the combined strengths of both MLR and kinetic modeling.
To fit results with MLR, DoE was used to construct a set of experiments, and a Definitive Screening Design (DSD)
was selected for use in the combined approach. For S-1 racemization, three factors (temperature, S-1
concentration, and copper concentration) were chosen to understand how reaction rate and yield depend on
reaction conditions. In addition, the concentration of a ketone used in the previous reaction step was chosen as a
further parameter, since it was thought to impact dimer formation. A 6-factor DSD formed the design, which
included 2 "dummy factors" along with the 4 main factors. The dummy factors reduce confounding between factors
and enable understanding of design errors such as model and experimental error.

Reactions were performed in a 50 mL EasyMax vessel equipped with an EasySampler to provide time-course
samples for chiral HPLC and UPLC analysis. The latter technique was used to differentiate the mixture of dimers
formed during the reaction, and the automated sampling ensured accurate rate data for both the racemization and
dimer formation. Data analysis led to the finding that a statistical model for the dimerization could be combined with
the kinetic model for the racemization reaction, resulting in a well-fitting combined model. This was realized by
incorporating the MLR model equations, which supply variable rate constants and induction times, into a Dynochem
kinetic model. Whereas the statistical model predicts reaction outcome within the original design space, the kinetic
model can extrapolate outside the original range. In optimizing the reaction, the racemization end point can be
predicted, giving reaction end times that avoid yield loss and dimer formation. A simplified sketch of this hybrid
construction follows.
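The sketch below illustrates that hybrid construction in simplified form: an MLR-style expression supplies a condition-dependent dimerization rate constant that is inserted into a mechanistic racemization/dimerization ODE model, which can then be screened virtually. Every coefficient and rate law here is an invented placeholder, not the published model.

```python
# Simplified hybrid model: an empirical (MLR-style) rate-constant expression
# embedded in a mechanistic racemization/dimerization ODE model.
import numpy as np
from scipy.integrate import solve_ivp

def k_dimer(T_coded, cu_coded):
    # MLR surrogate: log rate constant linear in coded temperature and Cu
    return np.exp(-4.0 + 0.8 * T_coded - 0.5 * cu_coded)

def simulate(T_coded, cu_coded, s0=0.2, t_end=240.0):
    k_rac = 0.05 * np.exp(0.6 * T_coded)   # first-order racemization (assumed)
    k_dim = k_dimer(T_coded, cu_coded)     # empirically supplied constant

    def rhs(t, c):
        S, R, D = c                        # S-enantiomer, R-enantiomer, dimer
        return [-k_rac * (S - R) - 2.0 * k_dim * S * S,
                 k_rac * (S - R) - 2.0 * k_dim * R * R,
                 k_dim * (S * S + R * R)]

    return solve_ivp(rhs, (0.0, t_end), [s0, 0.0, 0.0]).y[:, -1]

# Virtual screen: the kinetic shell extrapolates the racemization, while the
# MLR part should only be trusted inside its original design space.
for T in (-1, 0, 1):
    S, R, D = simulate(T, cu_coded=1.0)
    print(f"T={T:+d}: ee={abs(S - R) / (S + R):.2f}, dimer={D:.4f} M")
```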

The highest yield was achieved by using a high concentration of Cu, low concentration of starting material S-1,
and high temperature. As higher temperatures were favored, the authors considered performing the reaction under
pressure in a flow reactor. To emulate the flow conditions, the reaction was performed under pressure using
microwave heating. This confirmed that at elevated temperature, dimer formation was reduced and yield enhanced,
while concurrently requiring less solvent and shorter reaction time, yielding a more efficient and sustainable large-
scale process.

7.2. Optimization of an Azaindazole Series of CCR1 Antagonists and Development of a


Semicontinuous-Flow Synthesis
Harcken, C., Grant, J., Razavi, H., Marsini, M. A., Buono, F. G., Lorenz, J. C., & Reeves, J. T. (2019). Optimization of an Azaindazole Series
of CCR1 Antagonists and Development of a Semicontinuous-Flow Synthesis. In J. A. Pesti, A. F. Abdel-Magid, & R. Vaidyanathan (Eds.), ACS
Symposium Series (Vol. 1332, pp. 185–238). American Chemical Society. https://doi.org/10.1021/bk-2019-1332.ch008

Experimental data from in-situ ReactIR measurements provided kinetic rate information for key species in a Curtius
rearrangement reaction used in the synthesis of a key carbamate intermediate. This information was then used to
build a mechanistic kinetic model in Dynochem, enabling the prediction of optimized conditions and guiding the
development of a semi-continuous flow process used to form the carbamate.

» A novel, eco-friendly synthesis featuring direct amide synthesis via semicontinuous Curtius rearrangement and
acid–isocyanate coupling was developed for use in a pilot-scale continuous process.
» In-situ FTIR analysis enabled understanding and control of the potentially hazardous acyl azide formation.
» In-depth mechanistic kinetic modeling provided mechanistic understanding to aid in development of the
continuous flow system.


In the synthesis of a cyclopropyl benzylamine-substituted 6-azaindazole under investigation as a chemotactic
cytokine receptor 1 (CCR1) antagonist, a Curtius rearrangement strategy was implemented for one of the synthetic
steps. The goal was the formation of a carbamate that could be deprotected to yield an amine salt for use in the
final amide coupling step. There was a significant safety concern in this approach due to acyl azide formation, and
therefore in-depth mechanistic understanding was required to enable an intrinsically safe manufacturing process.

Diphenylphosphoryl azide (DPPA) was used as the reagent for the formation of the acyl azide. In-situ FTIR
(ReactIR) PAT analysis enabled monitoring of the acyl azide intermediate (2), its rearrangement to an isocyanate
(3), and the eventual formation of the carbamate [Figure 1]. As a key goal was minimizing the acyl azide
concentration, a dose-controlled, semi-batch addition of DPPA at high temperature was proposed and investigated.

"…mechanistic guidance via PAT and in-house process development of continuous-flow technology allowed for the
execution of a concise and scalable synthesis of CCR1 antagonist… critical mechanistic understanding of each
process parameter allowed for accomplishment of the first example of direct amide synthesis semicontinuous
Curtius rearrangement and acid–isocyanate coupling…"

At <100 g scale, the semi-batch process afforded 60–65% yield with minimal side products (3-5%). However, at
kilogram scale, the yield was ~39% and the two major impurities, an amide and a symmetrical urea, were observed
at much higher levels. A detailed kinetic and mechanistic investigation showed that on scale-up the impurities
increased proportionally with the longer DPPA addition time required to minimize azide accumulation.

Figure 1. ReactIR data tracks DPPA, azide, and isocyanate intermediate concentrations vs. time to provide kinetic
rate information for use in the Dynochem kinetic model.


To investigate the effect of DPPA addition rate on the product-to-impurity ratio more fully, the rearrangement process
was modeled using Dynochem with the reaction rates for each step that had been determined using the FTIR
measurements. The kinetic modeling accurately predicted the reaction performance across different scales,
matching what had been observed in the physical experiments. The model also predicted that a higher DPPA
addition rate would result in a higher yield and purity [Figure 2]. A simplified dosing simulation of this trade-off is
sketched after Figure 2.

Figure 2. Dynochem model analysis: DPPA addition rate correlated with product yield and impurity level. The
prediction indicates that a fast addition rate results in better carbamate yield (4) and fewer impurities (5, 6).
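The sketch below reproduces the shape of that dosing question in a toy semi-batch model of the sequence 1 → 2 (acyl azide) → 3 (isocyanate) → 4 (carbamate), with an amide impurity channel from acid plus isocyanate, comparing fast versus slow DPPA feeds. All rate constants and feed quantities are assumed for illustration, not the published Dynochem parameters.

```python
# Toy semi-batch Curtius model: a fast DPPA feed raises the peak acyl azide
# (the safety constraint) but shortens acid/isocyanate coexistence (less
# amide impurity). All numbers are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def run(dose_time_min, k=(0.5, 0.08, 0.2, 0.05)):
    k1, k2, k3, k4 = k
    feed_total = 0.55                    # mol/L DPPA fed overall (assumed)

    def rhs(t, c):
        acid, dppa, azide, iso, carb, imp = c
        feed = feed_total / dose_time_min if t < dose_time_min else 0.0
        r1 = k1 * acid * dppa            # acyl azide formation (fast)
        r2 = k2 * azide                  # Curtius rearrangement (slow)
        r3 = k3 * iso                    # alcohol trapping (excess ROH lumped)
        r4 = k4 * iso * acid             # amide impurity from acid + isocyanate
        return [-r1 - r4, feed - r1, r1 - r2, r2 - r3 - r4, r3, r4]

    sol = solve_ivp(rhs, (0.0, 600.0), [0.5, 0, 0, 0, 0, 0], max_step=1.0)
    return sol.y[2].max(), sol.y[4, -1], sol.y[5, -1]

for dose in (30, 180):
    peak, carb, imp = run(dose)
    print(f"{dose:3d} min feed: peak azide {peak:.3f} M, "
          f"carbamate {carb:.3f} M, amide impurity {imp:.3f} M")
```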

However, this higher addition rate was not feasible under batch operation due to safety concerns. For this reason,
a continuous flow approach was investigated, as this was expected to minimize impurities, be safer because of the
smaller reactor footprint, and allow more control over reagent additions.

Initially a CSTR approach was trialled, but the concern of acyl azide accumulation in the vessel was a significant
drawback. For this reason, a fully continuous flow approach using a two-stream process in a single plug-flow
reactor was investigated. Further experimentation determined that when the flow-generated stream of isocyanate
was transferred into a hot solution of tert-amyl alcohol via batch trapping, an 85% yield of the carbamate was
obtained with minimal impurities. With this information in hand, an in-house modular system featuring both flow
and batch components was assembled to achieve rapid scale-up, higher throughput and yield.

Further modification of the process, including the use of p-methoxybenzyl alcohol (PMBOH) to afford the
corresponding PMB carbamate, a larger tube-in-tube reactor, and pump flow rates adjusted to achieve the optimum
residence time, resulted in the final optimized Curtius rearrangement process [Figure 3].

Figure 3. Schematic of the optimized process for formation of the carbamate. In-line FTIR continually monitors the
acyl azide levels.

7.3. Automated End-to-End Workflow for Volumetric Mass-Transfer Coefficient (kLa)


Characterization in Small-Molecule Pharmaceutical Development
Mattern, K., & Grosser, S. T. (2023). Automated end-to-end workflow for volumetric mass-transfer coefficient (kLa)
characterization in small-molecule pharmaceutical development. Organic Process Research & Development, 27(11),
1992–2009. https://doi.org/10.1021/acs.oprd.3c00191

This work integrates physical experimental data, obtained through in-depth experiments using METTLER TOLEDO
automated lab reactors, with in-silico calculations enabled by Scale-up Systems Dynochem modeling software. The
result is an automated workflow for characterization of the volumetric mass-transfer coefficient (kLa) in reactors,
and a comprehensive database enabling the scale-up of processes using kLa as the reaction scaling parameter.



» The utilization of biocatalytic aerobic oxidations as a green alternative to chemocatalyzed reactions is
dependent on a successful understanding of engineering issues such as mass transfer and reactor design,
and it is critical to understand these variables across different scales.
» In contrast with laborious kLa (mass transfer coefficient) trials, a fully automated workflow for the
characterization of kLa in reactors designed for small-molecule synthesis was developed, which uses
Dynochem for modeling the full heat and mass balance on a reactor and determination of kLa.
» A database of kLa values was assembled enabling mass-transfer studies in laboratory reactors as well as
the ability to use kLa as a reaction scaling factor.


Understanding the impact of mass transfer and reactor design is critical for the implementation of aerobic oxidations
using biocatalysts on large scale. A challenge is to identify and characterize appropriate scaling parameters to meet
current and future reactor requirements, thereby ensuring a successful scale-up strategy. The overall volumetric
gas–liquid mass-transfer coefficient (kLa) is a commonly used scaling parameter, but the complex physical
variables associated with stirred tank reactors complicate a general prediction of kLa in bioreactors. Here,
automated techniques and data-rich experimentation (DRE) were used to investigate the sources of error in
conventional mass-transfer characterization experiments and to develop modeling for more accurate kLa parameter
regression. A strategy and workflow for kLa reactor characterization was demonstrated, along with a validation trial
across process development scales.

"…we have developed a fully automated end-to-end workflow for experimental execution, data analysis, and
parameter regression for the characterization of kLa in relevant small-molecule centric reactors…"

To develop an accurate and robust regression model for kLa that takes into consideration measurement lag and
error, it is necessary to have a platform that controls and time-records all key process variables such as
temperature, pressure, air flow rate, and agitation. An EasyMax automated lab reactor provided the ideal platform,
enabling accurate control and recording of both the composition and flow rate of gas to the vapor and liquid phases
for kLa trials [Figure 1]. The EasyMax was equipped with sensors to measure liquid-phase dissolved oxygen,
temperature, and pH. More representative reactor hydrodynamics were obtained by use of dummy probes with the
same dimensions as common PAT probes.

The automated kLa trials performed across a range of pressure conditions generated very large data sets. METTLER
TOLEDO iControl software automatically exported the data stream for further refinement, enabling the kLa value to
be obtained via parameter regression in Dynochem. This resulted in the construction of a Dynochem model that
accounted for the full heat and mass balance in an experimental reactor and easily allows for regression of kLa.

Figure 1. The automated reactor characterization platform was developed in the laboratory using an EasyMax
Automated Laboratory Reactor platform with associated iControl software.

The authors commented that improved kLa regression techniques required determining uncaptured dynamics in the
trials across different scales. For example, a major source of lag is the delay in response of the dissolved oxygen
sensor (τ). Since kLa and agitation rate are strongly correlated, agitator start-up lag must also be considered.
Larger-scale vessels may use recycling loops for real-time analysis, and the residence time and dispersion
introduced by these loops can have significant effects on kLa characterization trials. Lastly, methods used to
describe the mass balance of oxygen between the liquid and vapor phases were considered. All of these error
sources were refined in the Dynochem model. This enabled characterization of kLa across variables such as air
flow rate, agitation, temperature, pressure, and fill volume, resulting in an extensive database of regressed kLa
values. This automated workflow was used for a variety of reactor geometries and process variables, resulting in an
extensive mass-transfer database using kLa as the reaction scaling parameter in laboratory, kilo-scale, and
pilot-scale reactors [Figure 2]. This database will deliver a very large saving in the future resources required for the
development of biocatalytic oxidation processes, as the impact of kLa on scale-up behavior can be predicted much
more accurately, and the appropriate agitation conditions selected without extensive experimental trials.

Figure 2. The automated workflow enabled the development of a comprehensive kLa database for typical
laboratory, kilo-scale, and pilot-scale reactors ranging in scale from 2 mL to 12,000 L. The 2000+ data points each
represent a specific processing condition for each reactor.
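A minimal sketch of the regression concept follows: a gassing-in experiment is modeled as dC/dt = kLa·(C* − C) together with a first-order dissolved-oxygen sensor lag τ, and both parameters are regressed from the (synthetic) sensor trace. The numbers are illustrative, and the model is far simpler than the full heat-and-mass-balance treatment described above.

```python
# Minimal kLa regression sketch: liquid-phase oxygen balance plus a
# first-order sensor lag, both fit from a synthetic gassing-in trace.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

C_SAT = 100.0                          # % saturation after the gas switch
t_obs = np.linspace(0.0, 300.0, 61)    # s

def sensor_trace(params, t_eval):
    kla, tau = params
    rhs = lambda t, y: [kla * (C_SAT - y[0]),    # liquid-phase DO
                        (y[0] - y[1]) / tau]     # lagged sensor reading
    return solve_ivp(rhs, (t_eval[0], t_eval[-1]), [0.0, 0.0],
                     t_eval=t_eval).y[1]

rng = np.random.default_rng(7)
data = sensor_trace([0.04, 15.0], t_obs) + rng.normal(0.0, 0.5, t_obs.size)

fit = least_squares(lambda p: sensor_trace(p, t_obs) - data,
                    x0=[0.01, 5.0], bounds=([1e-4, 0.1], [1.0, 60.0]))
kla, tau = fit.x
print(f"regressed kLa = {kla:.4f} 1/s, sensor lag tau = {tau:.1f} s")
# Ignoring tau would bias kLa low, which is why the lag is modeled explicitly.
```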

7.4. New Scale-Up Technologies for Multipurpose Pharmaceutical Production Plants: Use
Case of a Heterogeneous Hydrogenation Process
Furrer, T., Levis, M., Berger, B., Kandziora, M., & Zogg, A. (2023). New scale-up technologies for multipurpose pharmaceutical production
plants: Use case of a heterogeneous hydrogenation process. Organic Process Research & Development, 27(7), 1365–1376.
https://doi.org/10.1021/acs.oprd.3c00124


This case study demonstrates the power of integrating physical data with mechanistic kinetic modeling for
successful scale-up [Figure 1]. Methodology is described for effective process scale-up from the laboratory to
production scale, as well as for optimizing scale-dependent variables through in-depth process understanding. A
dynamic process model based on a digital twin is used with an innovative laboratory scale-down reactor replicating
a 4000 L production reactor. Using a reaction calorimeter and real-time PAT analysis, experiments are performed in
the laboratory that reproduce production-scale conditions and validate the acceptable ranges for process variables.
This dynamic model enables the scale-up of a heterogeneous hydrogenation process without the need for
pilot-scale trials.

"A sophisticated scale-up strategy has been validated based on a heterogeneously catalyzed hydrogenation
reaction. A laboratory-scale imitation (1.4 L SDR) of a production-scale reactor (4000 L) and a sophisticated in
silico process model were described as part of this strategy… This approach aims to reproduce defined
scale-dependent effects at the laboratory scale, which might otherwise remain unnoticed and may result in
deviations during the scale-up process."



» A novel, scale down reactor (SDR) was designed and constructed for the purpose of imitating the heat and
mass transfer performance of a large-scale reactor.
» A hydrogenation reaction was performed in the PAT-equipped SDR and an in-silico simulation based on a
mechanistic kinetic model developed in Dynochem, consisting of heat transfer, mass transfer and kinetic
model subsections, successfully identified scale-dependent and scale-independent process variables for
the production system.
» A scale-up strategy based on the SDR and the mechanistic kinetic model can be applied to other similar
processes in different production-scale reactors, thus saving time and resources and ensuring that product yield
and quality objectives are met.

Figure 1. A digital twin of the large-scale reactor combined with information about the chemical process enables
development of a dynamic process model, which is used to optimize parameters simulating the production-scale
process. Laboratory scale experiments that simulate production conditions enable validation of the proven acceptable
range for those process parameters.

The authors note that process understanding at production scale is essential for reliable prediction of process
safety, quality, and economics, and that modeling, digitalization, and automation are fundamental enablers for such
predictions. They also note that mechanistic models which are useful for homogeneous processes can be more
challenging for heterogeneous processes, due to the difficulty of predicting mass transfer between phases and
heat transfer. A strategy was developed that uses scale-down reactors (SDRs) in combination with data-rich
experimentation and mechanistic modeling tools to mimic large-scale production reactors. The authors refer to
the development of a toolbox of technology and methodology that includes scale-up and scale-down principles,
process modeling, well-characterized reactors of different scales, and meaningful laboratory experiments.

A key scale-up challenge for heterogeneous processes arises from the influence of mass transfer, partial
dissolution, and shear forces on particles, and hence on the kinetics, largely driven by mixing and stirring. In
complex heterogeneous reactions, mixing that is optimal for one of these variables may negatively affect another on
scale-up. To simulate the scale-up behavior of these reactions, a geometrically similar reactor with constant
volume-specific power dissipation due to agitation can be used. Another issue in scale-up prediction for both
homogeneous and heterogeneous reactions is the scale-dependence of heat transfer for reactors of different sizes
and geometry. Lab reactors have higher heating/cooling capacity per unit volume than plant-scale reactors, which
must be considered in scale-up behavior investigations.



To address this issue, a 1.4 L scale-down reactor was designed and built with similar heat and mass transfer
behavior to a 4000 L production-scale reactor [Figure 2], enabling the process performance of the production-scale
reactor to be replicated at laboratory scale.

Figure 2. A scale-down reactor was developed that included a novel insert device (H/C finger), which enables
preservation of the heat transfer area to volume ratio and the heat transfer coefficient of the production scale
vessel. The geometry of the internal elements of the reactor mimicked that of the larger vessel.

In addition to the SDR work, a dynamic process model was developed, enabling the in-silico simulation of the
hydrogenation reaction in a 4000 L stirred pressure vessel.

The kinetic parameters, including reaction orders, activation energies, and rate constants, as well as the reaction
enthalpies of all relevant reactions, were determined for the multiple chemical reactions of the hydrogenation.
Data-rich experiments were run in an RC1 reaction calorimeter, and a combination of heat flow measurements,
in-situ ReactIR data, and off-line HPLC measurements on samples collected using EasySampler was used to
evaluate the reaction kinetics. The in-silico model developed using Dynochem included all the chemical reactions
occurring during the hydrogenation process and the mass transfer of the laboratory-scale reaction calorimeter.

The Dynochem model takes into account process parameters such as the agitation speed, fill level, starting
temperature and concentrations, jacket inlet temperature, and gas-phase pressure, and allows optimization of
variables for the hydrogenation process. With this information, the process was then performed at production scale
with a reaction time reduction from 22 to 12 h. The use of this dynamic process model permits direct scale-up
without the requirement for traditional pilot-scale runs. Furthermore, the workflow developed will allow significant
resource savings for the scale-up of future catalytic hydrogenation reactions through the use of well-designed
laboratory experiments and accurate in-silico simulations of the full-scale process. A minimal sketch of the
mass-transfer/kinetics coupling at the heart of such a model follows.
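The sketch below captures, in highly simplified form, the coupling such a model exploits: hydrogenation kinetics limited by gas–liquid hydrogen transfer, where changing only the scale-dependent kLa changes the predicted batch time. All parameter values are assumptions for illustration, not the published model.

```python
# Toy gas-liquid hydrogenation: substrate consumption limited by H2 transfer.
# Only kLa (the scale-dependent variable) differs between the two cases.
import numpy as np
from scipy.integrate import solve_ivp

H2_SAT = 0.05    # mol/L, assumed H2 solubility at operating pressure
K_RXN = 200.0    # L/(mol*h), assumed intrinsic rate constant

def batch_time(kla_per_h, s0=0.5, target=0.01):
    """Hours to reach 98% conversion for a given gas-liquid kLa."""
    def rhs(t, c):
        s, h2 = c
        r = K_RXN * s * h2                        # intrinsic hydrogenation rate
        return [-r, kla_per_h * (H2_SAT - h2) - r]

    done = lambda t, c: c[0] - target             # event: substrate depleted
    done.terminal, done.direction = True, -1.0
    sol = solve_ivp(rhs, (0.0, 50.0), [s0, 0.0], events=done, rtol=1e-8)
    return sol.t_events[0][0]

for label, kla in [("lab-scale SDR", 400.0), ("4000 L vessel", 10.0)]:
    print(f"{label}: kLa = {kla:.0f} 1/h -> batch time {batch_time(kla):.2f} h")
```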



8. A Look Toward the Future
The Potential for Autonomous Workflows
Integrating DRE with process modeling is paving the way for more
sophisticated innovations leveraging related technologies such as
artificial intelligence, machine learning, and advanced robotics.
While still early in their development, visionaries in the industry are
taking steps towards creating closed-loop process development
systems that enable seamless, autonomous execution of the DMTA
loop. High-quality, real-time data fed into advanced empirical
and mechanistic kinetic models allows for rapid exploration of
the process space and predictive analysis of changes in process
conditions. This workflow forms the foundation of a self-optimizing
system, in which machine learning and artificial intelligence can
autonomously propose optimal conditions and execute physical or
virtual experiments to validate and refine the process.

Several pioneering examples illustrate the potential of such autonomous systems. For example, a modular flow
system integrating online analysis and chemometric modeling has been used for rapid self-optimization of multi-
step reaction systems.15 Similarly, continuous-flow electrochemical synthesis platforms have demonstrated the self-
optimization of organic reactions using real-time monitoring techniques like FTIR and GC.14

A significant reason that these closed-loop autonomous systems remain rare in implementation is the amount of
development and customization required to build one. Self-optimization approaches require a central system that
can interface between logging data, performing analysis, modeling processes, and controlling all components
autonomously. To achieve this, researchers are often required to use custom-developed solutions to bridge across
different hardware and software, and to solve complex compatibility and connectivity issues.
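As a minimal illustration of the closed-loop idea, the sketch below lets a generic optimizer drive a stand-in run_experiment() function; in a real platform, that function would be replaced by instrument control and analysis, or by a kinetic-model "virtual experiment". The yield surface and all names are hypothetical.

```python
# Toy closed-loop DMTA iteration: an optimizer proposes conditions, a
# stand-in "experiment" returns a yield, and the loop repeats autonomously.
import numpy as np
from scipy.optimize import minimize

def run_experiment(x):
    temp, dose_rate = x
    # Stand-in for hardware + analysis: an assumed yield surface with an
    # interior optimum near 80 C and 0.6 eq/h
    return -(90.0 - 0.02 * (temp - 80.0) ** 2 - 15.0 * (dose_rate - 0.6) ** 2)

history = []
def logged(x):
    y = run_experiment(x)
    history.append((x.copy(), -y))    # log every "experiment" executed
    return y

best = minimize(logged, x0=np.array([60.0, 0.2]), method="Nelder-Mead",
                options={"xatol": 0.5, "fatol": 0.1})
print(f"experiments used: {len(history)}")
print(f"proposed optimum: T={best.x[0]:.1f} C, dose={best.x[1]:.2f} eq/h, "
      f"yield={-best.fun:.1f}%")
```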

Looking to the future, continued development and integration of rapidly advancing techniques holds tremendous
potential for the industry. We can envision solutions that allow seamless data exchange and control between
significant portions of the instrumentation and software, helping to make the tools and knowledge required to build
and successfully implement a fully autonomous workflow more accessible. The combined portfolio of
METTLER TOLEDO solutions for data-rich experimentation and the Scale-up Suite of modeling and simulation
software are already taking steps to make this a reality.



9. In Summary – The Value of Combining Data-Rich
Experimentation and Mechanistic Kinetic Modeling
Integrating DRE and mechanistic kinetic modeling into the process development workflow significantly increases
process understanding, prediction, and insight. This combined approach streamlines process development,
enabling more efficient delivery of desired targets when compared to traditional workflows based on large amounts
of physical experiments and end-point analysis only. A carefully designed and well-implemented DRE and
mechanistic kinetic modeling workflow can deliver significant overall resource savings, including:

» More Value From Every Experiment: Leading to more targeted process development, with model outputs
used to guide experimental design for the next stages of development and scale-up.
» Fewer Wasted Experiments: Predicting areas of chemical space that are unlikely to deliver a successful
outcome, thus saving large amounts of wasted experimentation.
» Efficient Scale-up: Providing a robust framework for ‘right first time’ scale-up and technology transfer,
thereby preventing the need for extensive piloting studies or batch failures, which are a large and avoidable
resource drain.
» Strategic Goal Achievement: Supporting strategic targets, such as design of sustainable and resource-
efficient processes, or ensuring that process design and optimization efforts are directed towards feasible
and impactful objectives.

Modeling is often seen as a separate activity from experimental work, but the greatest impact is achieved when
data-rich experiments are designed and run to support the application of mechanistic kinetic models. A comprehensive solution
from METTLER TOLEDO that includes PAT, automated laboratory reactors, advanced sampling technology, and
sophisticated modeling and simulation software can effectively integrate physical and virtual experimentation, and
greatly accelerate the development, optimization, and scale-up of chemical processes.



10. References

[1] Caron, S., & Thomson, N. M. (2015). Pharmaceutical Process Chemistry: Evolution of a Contemporary Data-
Rich Laboratory Environment. The Journal of Organic Chemistry, 80(6), 2943–2958.
https://doi.org/10.1021/jo502879m
[2] Jurica, J. A., & McMullen, J. P. (2021). Automation Technologies to Enable Data-Rich Experimentation: Beyond
Design of Experiments for Process Modeling in Late-Stage Process Development. Organic Process Research &
Development, 25(2), 282–291. https://doi.org/10.1021/acs.oprd.0c00496
[3] McMullen, J. P., Wyvratt, B. M., Hong, C. M., & Purohit, A. K. (2024). Integrating Functional Principal
Component Analysis with Data-Rich Experimentation for Enhanced Drug Substance Development. Organic
Process Research & Development, 28(3), 719–728. https://doi.org/10.1021/acs.oprd.3c00379
[4] Kelly, S. M., Lebl, R., Malig, T. C., Bass, T. M., Kummli, D., Kaldre, D., Orcel, U., Tröndlin, L., Linder, D.,
Sedelmeier, J., Bachmann, S., Han, C., Zhang, H., & Gosselin, F. (2023). Synthesis of a Highly
Functionalized Quinazoline Organozinc toward KRAS G12C Inhibitor Divarasib (GDC-6036), Enabled Through
Continuous Flow Chemistry. Organic Process Research & Development.
https://doi.org/10.1021/acs.oprd.3c00164
[5] Murray, J. I., Silva Elipe, M. V., Cosbie, A., Baucom, K., Quasdorf, K., & Caille, S. (2020). Kinetic Investigations
to Enable Development of a Robust Radical Benzylic Bromination for Commercial Manufacturing of AMG 423
Dihydrochloride Hydrate. Organic Process Research & Development, 24(8), 1523–1530.
https://doi.org/10.1021/acs.oprd.0c00256
[6] Klebanov, N., & Georgakis, C. (2016). Dynamic Response Surface Models: A Data-Driven Approach for the
Analysis of Time-Varying Process Outputs. Industrial & Engineering Chemistry Research, 55(14), 4022–4034.
[7] Dong, Y., Georgakis, C., Mustakis, J., Hawkins, J. M., Han, L., Wang, K., McMullen, J. P., Grosser, S. T.,
& Stone, K. (2019). Constrained Version of the Dynamic Response Surface Methodology for Modeling
Pharmaceutical Reactions. Industrial & Engineering Chemistry Research, 58(30), 13611–13621.
https://doi.org/10.1021/acs.iecr.9b00731
[8] Griffin, D., & Huggins, S. (2018). Applying Automation and Data-Driven Modeling to Perform Rapid Reaction
Optimization. In 2018 AIChE Annual Meeting. AIChE. Griffin, D., Partopour, B., & Huggins, S. (2019).
Knowledge-Constrained Machine Learning: A Strategy for Producing Predictive Process Models in the Absence
of Mechanistic Understanding and Large Data Sets [Conference abstract]. FOPAM.
[9] Blackmond, D. G. (2005). Reaction Progress Kinetic Analysis: A Powerful Methodology for Mechanistic Studies
of Complex Catalytic Reactions. Angewandte Chemie International Edition, 44(28), 4302–4320.
https://doi.org/10.1002/anie.200462544
[10] Burés, J. (2016). Variable Time Normalization Analysis: General Graphical Elucidation of Reaction Orders From
Concentration Profiles. Angewandte Chemie International Edition, 55(52), 16084–16087.
https://doi.org/10.1002/anie.201609757
[11] Nielsen, C. D.-T., & Burés, J. (2019). Visual Kinetic Analysis. Chemical Science, 10(2), 348–353.
https://doi.org/10.1039/C8SC04698K
[12] Crawford, M. (2024, April 15). Late-Stage Pharmaceutical Development of Nemtabrutinib (MK-1026) at MSD
[Webinar]. Mettler-Toledo.
[13] Cox, R. J., McCreanor, N. G., Morrison, J. A., Munday, R. H., & Taylor, B. A. (2023). Copper-Catalyzed
Racemization-Recycle of a Quaternary Center and Optimization Using a Combined Kinetics-DoE/MLR Modeling
Approach. The Journal of Organic Chemistry, 88(9), 5275–5284. https://doi.org/10.1021/acs.joc.2c02588



[14] Ke, J., Gao, C., Folgueiras-Amador, A. A., Jolley, K. E., De Frutos, O., Mateos, C., Rincón, J. A., Brown, R. C.
D., Poliakoff, M., & George, M. W. (2022). Self-Optimization of Continuous Flow Electrochemical Synthesis
Using Fourier Transform Infrared Spectroscopy and Gas Chromatography. Applied Spectroscopy, 76(1), 38–50.
https://doi.org/10.1177/00037028211059848
[15] Sagmeister, P., Ort, F. F., Jusner, C. E., Hebrault, D., Tampone, T., Buono, F. G., Williams, J. D., & Kappe, C. O.
(2022). Autonomous Multi-Step and Multi-Objective Optimization Facilitated by Real-Time Process Analytics.
Advanced Science, 9(10), 2105547. https://doi.org/10.1002/advs.202105547

Want to Learn More?

Hear From Industry Experts
Visit our webinar collection and listen to your peers present case studies demonstrating how they are leveraging the
combined power of data-rich experiments and kinetic models to optimize chemical development.
mt.com/ac-ModelingCollection

Learn From Our Extensive Library
Sign up for Dynochem Resources, our comprehensive library of models and user knowledge, to learn from our
experts and the community, and build your expertise.
dcresources.scale-up.com

Contact Our Experts
Our team of technical and application consultants is ready to discuss your application and how you can take steps
to start accelerating your development processes.
mt.com/ac-contactus

METTLER TOLEDO Group
Automated Reactors and In-Situ Analysis
For more information: www.mt.com/AutoChem

Subject to technical changes.
© 09/2024 METTLER TOLEDO. All rights reserved.
