Optimizing Chemical Development With DRE & Kinetic Modeling
White Paper
DRE and Mechanistic Kinetic Modeling
Table of Contents
1. Introduction
2. Data-Rich Experimentation (DRE) and Modeling – The Basis for Optimized Process Development
3. Choosing the Right Model for the Right Problem
4. Streamlining Chemical Process Development With DRE and Mechanistic Kinetic Modeling
5. How to Apply DRE and Mechanistic Kinetic Modeling
   5.1 Common Applications
   5.2 Tips for Successful Implementation
6. Integrating DRE and Mechanistic Kinetic Modeling With the METTLER TOLEDO and Scale-Up Suite Ecosystem
7. Case Studies
8. A Look Toward the Future
9. In Summary – The Value of Combining Data-Rich Experimentation and Mechanistic Kinetic Modeling
10. References
1. Introduction
Efficient chemical process development and scale-up is becoming increasingly important across the
pharmaceutical, fine chemical, and agrichemical industries due to growing pressure to reduce timelines, prioritize
sustainability, and cut costs. Recent success stories, such as the COVID vaccine, and technological advances
such as digitalization and automation have set the precedent of bringing high-quality products to market sooner.
Sustainability, encompassing environmental considerations and all aspects of product safety, has become a key
strategic goal of most companies. At the same time, the molecular structures, novel synthetic routes, and methods
of modern chemistry continue to become more complex, and often result in the need for additional processing time
and steps. Meeting the demand for frequently conflicting economic, safety, environmental, and quality goals
requires in-depth understanding and subsequent optimization of chemical reactions and processes at all stages
of development.
Data-rich experimentation (DRE) and modeling are powerful tools that are increasingly being used to enable
scientists to effectively and efficiently accomplish these tasks. Leveraging either technique individually can lead
to significant improvements in process efficiency, understanding, and optimization. Combining the two opens the
possibility for exponential benefits. The recent joining of Scale-up Systems and METTLER TOLEDO brings together
decades of experience in these two areas, combining knowledge and tools across advanced process analytical
technology, automated reactor systems, and process modeling.
This white paper shows how integrating DRE and mechanistic kinetic modeling into a seamless chemical process
development workflow enables significant gains in process understanding and efficiency, reducing the amount of
time and resources required for optimization and ensuring right first time scale-up. Data-rich experimentation and
different categories of modeling are introduced, and the tangible benefits of combining the two are outlined. Specific
focus is placed on the exponential value and benefits of a combined data-rich experimentation and mechanistic
kinetic modeling approach. Case studies drawn from industry and academia further illustrate the power of this
combined approach and show how METTLER TOLEDO and Scale-up Systems technologies have been successfully
applied to optimize various problems and processes at different stages of chemical development.
Figure 1: Accelerating the chemical process development workflow: Data-rich experiments (DRE) provide high-quality, time-course data used to build
robust mechanistic kinetic models that enable better design of subsequent experiments and next steps, effectively turbocharging the iterative
workflow and delivering more information with fewer experiments.
Process models can be systematically divided into three categories: static empirical models, dynamic empirical models, and mechanistic
kinetic models. The different approaches to modeling have different requirements and offer a range of different
benefits. A large part of a process model’s success involves choosing the right model for the right problem. The best
type of model to use depends on a variety of factors including the stage of development, the information required
from the model, and the amount of experimental resources available.
DoE studies are powerful for screening the performance of a chemical reaction over a range of variables and
conditions that are a mixture of continuous (e.g., concentration, temperature) and discrete (e.g., different reagents,
catalysts). DoE is therefore widely used in early-stage experimental design, when the focus is on identification of
the best combinations of materials and conditions to apply to a process. While DoE can also be used for reaction
optimization, it does not allow for predictions outside the measured design space and cannot provide reliable
predictions of process variables related to physical transformations such as mixing, heat transfer, or feed rates.
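The screening logic described above can be sketched in a few lines. The following is a minimal illustration of a two-level full factorial design; the factors, levels, and catalyst names are hypothetical and purely illustrative:

```python
from itertools import product

# Hypothetical two-level factors: two continuous, one discrete.
factors = {
    "temperature_C": (25, 45),
    "conc_M": (0.1, 0.2),
    "catalyst": ("Pd/C", "Pt/C"),
}

# Full factorial screen: every combination of levels (2^3 = 8 runs).
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for i, run in enumerate(runs, 1):
    print(f"run {i}: {run}")
```

A response surface fitted over these runs interpolates within the screened ranges only; extrapolation beyond them, or prediction of physical effects such as mixing and heat transfer, requires a mechanistic treatment.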
Dynamic empirical models are valuable when there is limited mechanistic understanding and can be used to
optimize dynamic experimental variables such as reaction time. However, because they are based on empirical
relationships between measured experimental data, dynamic empirical models have the same practical
disadvantages as DoE. They should not be used to predict outside the experimentally measured design space or for
processes involving physical transformations.
Full mechanistic kinetic models require more data and planning to build and optimize than empirical models, but
offer significantly greater insight and predictability outside the measured experimental design space. Thus, they
can be particularly useful for solving difficult reaction optimization challenges, understanding the impact of different
process design options including different equipment, predicting behavior outside a predefined design space, and
changing the scale of an operation. Mechanistic kinetic models offer the most value when applied to systems where
a significant level of effort is required to deliver a commercially viable product, for example if control of competing
reaction outcomes is challenging, especially as the scale of operation is changed.
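The extrapolation advantage described above can be illustrated with a toy mechanistic model: two competing first-order pathways whose rate constants follow the Arrhenius law. All parameters below are illustrative assumptions, not values from any case study; the point is that, because the model encodes rate laws rather than a fitted response surface, it can be queried at temperatures outside the measured design space:

```python
import numpy as np
from scipy.integrate import solve_ivp

R = 8.314  # gas constant, J/(mol*K)

def arrhenius(A, Ea, T):
    """Rate constant from an assumed pre-exponential factor and activation energy."""
    return A * np.exp(-Ea / (R * T))

def rhs(t, y, T):
    """Competing parallel pathways: A -> P (desired) and A -> X (impurity)."""
    a, p, x = y
    k1 = arrhenius(1e6, 60e3, T)   # illustrative parameters
    k2 = arrhenius(1e9, 85e3, T)   # higher Ea: impurity accelerates with T
    return [-(k1 + k2) * a, k1 * a, k2 * a]

def selectivity(T):
    """Fraction of converted material ending up as desired product P."""
    sol = solve_ivp(rhs, (0, 5 * 3600), [1.0, 0.0, 0.0], args=(T,), rtol=1e-8)
    a, p, x = sol.y[:, -1]
    return p / (p + x)

# The embedded rate laws let the model be queried at conditions never run in the lab:
for T in (298, 318, 338):
    print(f"T = {T} K: selectivity = {selectivity(T):.3f}")
```

Because the impurity pathway here is assigned the higher activation energy, the model predicts selectivity eroding as temperature rises, exactly the kind of competing-outcome trade-off such models help control.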
                               Low Data Density    DRE                 DRE + Kinetic Modeling
Experiment Number/Type         Many Physical       Fewer Physical      Few Physical, Many Virtual
Reduced Time-to-Information    •                   •                   •
Real-Time Insight              •                   •                   •
Simulate New Conditions        •                   •                   •
Optimize Physical Parameters   •                   •                   •
Reliably Predict Scale-up      •                   •                   •
Easy Tech-Transfer             •                   •                   •
Table 1. Demonstration of technical advantages provided by data-rich experimentation and its combination with mechanistic kinetic modeling, compared
with typical low data density experimentation.
A well-designed and integrated DRE and mechanistic kinetic modeling workflow can be leveraged to reach the following
scientific goals and business outcomes:
» Identify optimum solvent options and temperature ramps → Increased quality and yield
» Optimize reactions and processes for reduced waste → More sustainable development
One overarching advantage of a combined DRE and mechanistic kinetic modeling approach is flexibility. A well-planned
and implemented DRE-driven mechanistic kinetic model can be modified to account for additional parameters or design
constraints with minimal additional work. In most cases, making strategic adjustments to the model and performing a handful of
additional experiments leveraging the already implemented DRE workflow will provide the information required to ensure
robust model expansion.
The same principle applies, whether optimizing individual reactions, entire processes, or stages in the product development
lifecycle. This flexibility is particularly useful for accelerating knowledge and tech transfer between subsequent stages
of development, for example when scaling up processes from lab to plant. A successfully implemented DRE-driven
mechanistic kinetic model used to optimize reactions and processes at lab scale can subsequently be used to predict and
virtually test how changes in conditions and environment will affect safety, critical quality attributes, and reaction outcomes
at lab and plant scale, thereby increasing the likelihood of right first time scale-up.
The up-front effort required is modest compared to the overall benefits derived from being able to thoroughly understand, predict, and control process
performance. A well-designed set of relatively few DRE experiments can be used to define, build, and fit a
mechanistic kinetic model that leverages the full power of physical and virtual experimentation to solve complex
process design challenges.
The work required to successfully apply such an approach follows a logical step-by-step sequence:
1. Design Experiments (understand driving forces)
   • Design DRE with varied parameters to understand the driving forces
   • Include chemical variables and physical variables
   • Select analytical technique
2. Run Experiments (collect and interpret data to gain understanding)
   • Run initial experiments and collect data (generally 5-10 DRE)
   • Compare reaction profiles of reactions run under different conditions
   • Identify which parameters have the largest impact, either desired or undesired
5. Apply Model (validate and apply model)
   • Validate the model experimentally by comparing predicted outcome to the outcome from further experiment(s)
   • If there is good agreement, the model can be used predictively
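The fit-and-validate loop in the steps above can be sketched with a hypothetical first-order time course and SciPy's curve_fit; the data points, rate law, and validation point are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical DRE time course for first-order consumption of A,
# e.g. from automated sampling with offline HPLC (illustrative numbers).
t = np.array([0.0, 600.0, 1200.0, 2400.0, 3600.0, 7200.0])  # s
conc = np.array([1.00, 0.74, 0.55, 0.30, 0.17, 0.03])       # mol/L

def first_order(t, k, a0):
    return a0 * np.exp(-k * t)

# Build/fit: regress the rate constant against the measured profile.
(k_fit, a0_fit), _ = curve_fit(first_order, t, conc, p0=[1e-4, 1.0])

# Validate: compare the prediction with a held-out experiment before
# using the model predictively.
t_check, conc_check = 1800.0, 0.41  # illustrative validation point
pred = first_order(t_check, k_fit, a0_fit)
rel_err = abs(pred - conc_check) / conc_check
print(f"k = {k_fit:.2e} 1/s, validation error = {rel_err:.1%}")
```

A real model would involve several species and rate constants regressed simultaneously, but the structure, regress against rich time-course data, then check against an experiment the model has not seen, is the same.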
Once validated, a mechanistic kinetic model can be applied to a wide range of useful applications related to process design, optimization, and scale-up.
1. Start early. The understanding gained from a few well-designed experiments used to propose and build a full
kinetic model can provide early insight into viable process design options and help avoid pursuing ‘dead ends,’
which will not deliver the desired results.
2. Understand driving forces. DRE to support modeling should be designed to understand the impact of key
driving forces on reaction profiles (e.g., reagent concentrations, temperature, mass transfer) rather than mirror
final processes. This often entails varying parameters by a significant extent, such as +/- 50-100% for material
concentrations and +/- 10-20 °C for temperature.
3. Understand mass balance. DRE should be targeted at understanding the overall mass balance of the process.
Besides being important for effective process scale-up, mechanistic kinetic modeling is most reliable when a
complete mass balance understanding is available.
4. Validate before use. The model should always be validated experimentally before being used predictively. Once
reasonable agreement between the model’s predictions and experimental outcomes has been established, the
model may be used predictively.
5. Consult experts when necessary. If no in-house expertise is available, METTLER TOLEDO AutoChem and
Scale-up Systems can provide extensive resources on how to achieve success in both DRE and mechanistic
kinetic modeling.
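Tip 2's suggested temperature spans can be motivated with a two-point Arrhenius estimate: rate constants measured 10-20 °C apart differ enough for the activation energy to be regressed reliably. The rate constants below are hypothetical:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def ea_two_point(k1, T1, k2, T2):
    """Two-point Arrhenius estimate: Ea = R*ln(k2/k1) / (1/T1 - 1/T2)."""
    return R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)

# Hypothetical rate constants measured 20 C apart; a ~4x change in rate
# gives a well-conditioned estimate of the activation energy.
T1, k1 = 298.15, 2.0e-4   # 25 C
T2, k2 = 318.15, 8.0e-4   # 45 C
Ea = ea_two_point(k1, T1, k2, T2)
print(f"estimated Ea ~ {Ea / 1000:.0f} kJ/mol")
```

With only a ±2 °C span, the ratio k2/k1 would sit close to 1 and the same estimate would be dominated by measurement noise, which is why deliberately wide parameter swings pay off.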
Transferring experimental data and the insights gained into the model build has historically been a barrier to the widespread adoption of this approach, particularly when hardware and software solutions from different vendors must be combined. This often involves complex, home-built Excel solutions and requires a large number of manual interactions. In addition, mechanistic kinetic modeling
software has traditionally been seen as an ‘expert system,’ best used by dedicated physical organic chemists.
The combination of METTLER TOLEDO automated reactors and process analytical technology and
Scale-up Systems modeling software provides a unified, straightforward workflow enabling the easy execution of
data-rich experimentation and seamless import into mechanistic kinetic modeling software. The iC Suite software
and Scale-up Suite already have significant levels of interoperability, and this functionality will continue to develop rapidly.
“The resulting modeling/understanding-based approach led directly to a new and improved reaction space that was accurately predicted before experimental work had been carried out. The combination of kinetic and statistical modeling for interrogation of a single, common data set provided an experimentally efficient method of gaining mechanistic and process understanding.”

For a copper-catalyzed racemization and associated dimerization, mechanistic kinetic and statistical modeling are combined using a single, common data set to yield an effective method to gain mechanistic and process understanding. This approach yields a more complete model to support process development, and enables the user to decide whether to prioritize mechanistic understanding, empirical understanding, or a combination of both. Data acquired for this work was obtained using an EasyMax system equipped with EasySampler. Mechanistic kinetic modeling was performed using Dynochem. The use of data-rich experimentation, combined with a hybrid mechanistic and empirical model, allowed the efficient development of process understanding that underpinned the identification of an optimized reaction space that had not been previously investigated experimentally.
» In-depth process understanding and modeling offered insight into the mechanism and behavior of
copper-catalyzed racemization of a pharmaceutical API with a chiral amine center and associated dimerization.
» Mechanistic insight and process understanding of the racemization came from applying a combined
kinetics-statistical modeling approach, rather than separate kinetic and multiple linear regression modeling.
» This enabled the prediction of a new and more effective reaction space prior to performing the experimental
work, saving substantial time and resource expenditure.
Dynamic kinetic resolution (DKR) of chiral amines can provide high yield and enantioselectivity; however, this is typically performed on less complex amines and rarely on complex quaternary ring systems. The goal of this work was to implement an effective strategy for racemization of the chiral, spirocyclic center of an API [Figure 1].
[Scheme: racemization of S-1 through ring-opened intermediates S-2 and R-2 to R-1, mediated by the metal center M.]
Figure 1. The racemization strategy proposes treating S-1 with an azaphilic Lewis acid leading to C–N bond cleavage, rotation of the C–N bond,
and ring closure.
Reactions were performed in a 50 mL EasyMax vessel equipped with an EasySampler to provide time-course
samples for chiral HPLC and UPLC analysis. The latter technique was used to differentiate the mixture of dimers
formed during the reaction, and the automated sampling ensured accurate rate data for both the racemization and
dimer formation. Data analysis led to the finding that a statistical model for the dimerization can be combined with
the kinetic model for the racemization reaction, resulting in a well-fitting combined model. This was realized by
incorporating the MLR model equations, which supply variable rate constants and induction times, into a Dynochem
kinetic model. Whereas the statistical model predicts reaction outcome within the original design space, the kinetic
model can extrapolate outside the original range. In optimizing the reaction, the racemization end point can be
predicted, giving reaction end times that avoid yield loss and dimer formation.
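The hybrid structure described above, an empirical model supplying rate constants to a kinetic model, can be sketched as follows. The linear coefficients and the simple first-order racemization rate law are illustrative assumptions, not the published model:

```python
import math

def k_racemization(T_C, cu_equiv):
    """Hypothetical MLR-style surrogate supplying the rate constant as a
    linear function of temperature and Cu loading (coefficients are
    illustrative, not taken from the paper)."""
    return 1.0e-4 + 3.0e-6 * (T_C - 60.0) + 5.0e-4 * cu_equiv  # 1/s

def ee(t, T_C, cu_equiv, ee0=1.0):
    """First-order racemization: enantiomeric excess decays as exp(-2kt)."""
    return ee0 * math.exp(-2.0 * k_racemization(T_C, cu_equiv) * t)

def end_point(target_ee, T_C, cu_equiv, ee0=1.0):
    """Predicted end time to reach the target ee, so the batch can be
    stopped before extended hold times promote dimer formation."""
    k = k_racemization(T_C, cu_equiv)
    return math.log(ee0 / target_ee) / (2.0 * k)

t_end = end_point(0.05, T_C=80.0, cu_equiv=0.2)
print(f"predicted racemization end point: {t_end / 3600:.1f} h")
```

Embedding the empirical correlation inside the kinetic framework is what lets the combined model both interpolate within the design space and extrapolate the time dimension to predict end points.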
The highest yield was achieved by using a high concentration of Cu, low concentration of starting material S-1,
and high temperature. As higher temperatures were favored, the authors considered performing the reaction under
pressure in a flow reactor. To emulate the flow conditions, the reaction was performed under pressure using
microwave heating. This confirmed that at elevated temperature, dimer formation was reduced and yield enhanced,
while concurrently requiring less solvent and shorter reaction time, yielding a more efficient and sustainable large-
scale process.
Experimental data from in-situ ReactIR measurements provided kinetic rate information for key species in a Curtius
rearrangement reaction used in the synthesis of a key carbamate intermediate. This information was then used to build a mechanistic kinetic model relating DPPA addition rate to product yield and impurity levels.
» A novel, eco-friendly synthesis featuring direct amide synthesis, semicontinuous Curtius rearrangement, and acid–isocyanate coupling was developed for use in a pilot-scale continuous process.
» In-situ FTIR analysis enabled understanding and control of the potentially hazardous acyl azide formation.
» In-depth mechanistic kinetic modeling provided mechanistic understanding to aid in development of the
continuous flow system.
“…mechanistic guidance via PAT and in-house process development of continuous-flow technology allowed for the execution of a concise and scalable synthesis of CCR1 antagonist… critical mechanistic understanding of each process parameter allowed for accomplishment of the first example of direct amide synthesis semicontinuous Curtius rearrangement and acid–isocyanate coupling…”

In the synthesis of a cyclopropyl benzylamine-substituted 6-azaindazole under investigation as a chemotactic cytokine receptor 1 (CCR1) antagonist, a Curtius rearrangement strategy was implemented for one of the synthetic steps. The goal was the formation of a carbamate that could be deprotected to yield an amine salt for use in the final amide coupling step. There was a significant safety concern in this approach due to acyl azide formation, and therefore in-depth mechanistic understanding was required to enable an intrinsically safe manufacturing process.

Diphenylphosphoryl azide (DPPA) was used as the reagent for the formation of the acyl azide. In-situ FTIR (ReactIR) PAT analysis enabled monitoring of the acyl azide intermediate (2), its rearrangement to an isocyanate (3) and eventual formation of the carbamate [Figure 1]. As a key goal was minimizing acyl azide concentration, a dose-controlled, semi-batch addition of DPPA at high temperature was proposed and investigated.
[Scheme: substrate 1 + DPPA (slow addition) → acyl azide 2 (k1, fast) → isocyanate 3 via loss of N2 (k2, slow) → carbamate 4 with ROH (k3); competing impurity formation (k4, slow).]
Figure 2. Dynochem model analysis - DPPA addition rate correlated with product yield and impurity level. Prediction indicates a fast addition rate results in better carbamate yield (4) and fewer impurities (5,6).
However, this higher addition rate was not feasible under batch operation due to safety concerns. For this reason,
a continuous flow approach was investigated, as this was expected to minimize impurities, be safer because of the
smaller reactor footprint, and allow more control over reagent additions.
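The dosing-rate trade-off underlying this decision can be illustrated with a toy semi-batch model of the consecutive reactions in Figure 2; the rate constants and dose policy below are assumed for illustration and are not the published values:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative rate constants, not the published values.
K1 = 0.05   # substrate 1 + DPPA -> acyl azide 2, L/(mol*s), fast
K2 = 5e-4   # 2 -> isocyanate 3, 1/s, slow
K3 = 1e-3   # 3 + ROH -> carbamate 4, 1/s (alcohol in excess)

def rhs(t, y, dose_time):
    """Semi-batch sketch of the Figure 2 network: one equivalent of DPPA
    is dosed at a constant rate over dose_time, then 1 -> 2 -> 3 -> 4."""
    c1, cD, c2, c3, c4 = y
    feed = 1.0 / dose_time if t < dose_time else 0.0  # mol/(L*s)
    r1 = K1 * c1 * cD
    return [-r1, feed - r1, r1 - K2 * c2, K2 * c2 - K3 * c3, K3 * c3]

def peak_azide(dose_time):
    """Maximum concentration reached by the hazardous acyl azide 2."""
    sol = solve_ivp(rhs, (0.0, 6 * 3600.0), [1.0, 0, 0, 0, 0],
                    args=(dose_time,), max_step=30.0)
    return sol.y[2].max()

# Slower dosing caps the azide inventory; faster dosing lets it accumulate.
for dt in (600, 7200):
    print(f"dose over {dt/60:.0f} min -> peak azide {peak_azide(dt):.2f} M")
```

This is the batch-mode tension the authors faced: kinetics favor fast addition, safety favors low azide inventory, and continuous flow resolves it by keeping the instantaneous reacting volume small.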
Initially a CSTR approach was trialled, but the concern of acyl azide accumulation in the vessel was a significant
drawback. For this reason, a fully continuous flow approach using a two-stream process in a single plug-flow
reactor was investigated. Further experimentation determined that when the flow-generated stream of isocyanate
was transferred into a hot solution of tert-amyl alcohol via batch trapping, an 85% yield of the carbamate was
obtained with minimal impurities. With this information in hand, an in-house modular system featuring both flow
and batch components was assembled to achieve rapid scale-up, higher throughput and yield.
Further modification of the process, including the use of p-methoxybenzyl alcohol (PMBOH) to afford the corresponding PMB carbamate, a larger tube-in-tube reactor, and adjusted pump flow rates to achieve the optimum residence time, resulted in the final optimized Curtius rearrangement process [Figure 3].
[Scheme: optimized flow process - substrate 1 with DPPA/Et3N in toluene at 135 °C and 40 bar (3 min residence time), then trapping with PMBOH at RT to give the PMB carbamate 5.]
Figure 3. Schematic of optimized process for formation of carbamate. In-line FTIR continually monitors the acyl azide levels.
This work integrates physical experimental data obtained through in-depth experiments using METTLER TOLEDO automated lab reactors with in-silico calculations enabled by Scale-up Systems Dynochem modeling software. The result is an automated workflow for characterization of the volumetric mass-transfer coefficient (kLa) in reactors, and a comprehensive database enabling the scale-up of processes using kLa as the reaction scaling parameter.
“…we have developed a fully automated end-to-end workflow for experimental execution, data analysis, and parameter regression for the characterization of kLa in relevant small-molecule centric reactors… The combination of kinetic and statistical modeling for interrogation of a single, common data set provided an experimentally efficient method of gaining mechanistic and process understanding.”

Understanding the impact of mass transfer and reactor design is critical for the implementation of aerobic oxidations using biocatalysts on large scale. A challenge is to identify and characterize appropriate scaling parameters to meet current and future reactor requirements, thereby ensuring a successful scale-up strategy. The overall volumetric gas–liquid mass-transfer coefficient (kLa) is a commonly used scaling parameter, but the complex physical variables associated with stirred tank reactors complicate a general prediction of kLa in bioreactors. Here, automated techniques and data-rich experimentation (DRE) were used to investigate the sources of error in conventional mass-transfer characterization experiments and to develop modeling for more accurate kLa parameter regression. A strategy and workflow for kLa reactor characterization was demonstrated, along with a validation trial across process development scales.
To develop an accurate and robust regression model for kLa,
which takes into consideration measurement lag and error, it
is necessary to have a platform that controls and time-records
all key process variables, such as temperature, pressure,
air flow rate, and agitation. An EasyMax automated
lab reactor provided the ideal platform, enabling accurate
control and recording of both the composition and flow rate
of gas to the vapor and liquid phases for kLa trials [Figure
1]. The EasyMax was equipped with sensors to measure
liquid-phase-dissolved oxygen, temperature, and pH. More
representative reactor hydrodynamics were obtained by use
of dummy probes with the same dimensions as common
PAT probes.
Figure 1. The automated reactor characterization platform was developed in the laboratory using an EasyMax Automated Laboratory Reactor platform with associated iControl software.

The automated kLa trials performed across a range of pressure conditions generated very large data sets. METTLER TOLEDO iControl software automatically exported the data stream for further refinement, enabling the kLa value to be obtained via parameter regression in Dynochem. This resulted in the construction of a Dynochem model that accounted for full heat and mass balance in an experimental reactor and easily allows for regression of kLa. The authors commented that improved kLa regression techniques required determining uncaptured dynamics in the measurement system.
7.4. New Scale-Up Technologies for Multipurpose Pharmaceutical Production Plants: Use
Case of a Heterogeneous Hydrogenation Process
Furrer, T., Levis, M., Berger, B., Kandziora, M., & Zogg, A. (2023). New scale-up technologies for multipurpose pharmaceutical production
plants: Use case of a heterogeneous hydrogenation process. Organic Process Research & Development, 27(7), 1365–1376.
https://doi.org/10.1021/acs.oprd.3c00124
“A sophisticated scale-up strategy has been validated based on a heterogeneously catalyzed hydrogenation reaction. A laboratory-scale imitation (1.4 L SDR) of a production-scale reactor (4000 L) and a sophisticated in silico process model were described as part of this strategy… This approach aims to reproduce defined scale-dependent effects at the laboratory scale, which might otherwise remain unnoticed and may result in deviations during the scale-up process.”

This case study demonstrates the power of integrating physical data with mechanistic kinetic modeling for successful scale-up [Figure 1]. Methodology is described for effective process scale-up from the laboratory to production scale, as well as optimizing scale-dependent variables through in-depth process understanding. A dynamic process model based on a digital twin is used with an innovative, laboratory scale-down reactor replicating a 4000 L scale production reactor. Using a reaction calorimeter and real-time PAT analysis, experiments are performed in the laboratory that reproduce production-scale conditions and validate the acceptable ranges for process variables. This dynamic model enables the scale-up of a heterogeneous hydrogenation process without the need for pilot-scale trials.
Figure 1. A digital twin of the large-scale reactor combined with information about the chemical process enables
development of a dynamic process model, which is used to optimize parameters simulating the production-scale
process. Laboratory scale experiments that simulate production conditions enable validation of the proven acceptable
range for those process parameters.
The authors emphasize that process understanding at production scale is essential for reliable prediction of process safety, quality, and economics, and highlight modeling, digitalization, and automation as fundamental enablers of such predictions. They also note that mechanistic models that work well for homogeneous processes can be more challenging to apply to heterogeneous processes, due to the difficulty of predicting mass transfer between phases and heat transfer. A strategy was developed that uses scale-down reactors (SDR) in combination with data-rich
experimentation and mechanistic modeling tools to mimic the large-scale production reactors. The authors refer to
the development of a toolbox of technology and methodology that includes scale-up and scale-down principles,
process modeling, well-characterized reactors of different scales, and meaningful laboratory experiments.
A key scale-up challenge for heterogeneous processes arises from the influence on kinetics of mass transfer, partial
dissolution, and shear forces on particles, largely due to mixing and stirring. In complex heterogeneous reactions,
mixing that is optimal for one of these variables may negatively affect another on scale-up. To simulate the
scale-up behavior for these reactions, a geometrically similar reactor with constant volume-specific power
dissipation due to agitation is applicable. Another issue in scale-up prediction for both homogeneous and
heterogeneous reactions is the scale-dependence of heat transfer for reactors of different sizes and geometry. Lab
reactors have higher heating/cooling capacity per unit volume than plant scale reactors, which must be considered
in scale-up behavior investigations.
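Under geometric similarity, the constant volume-specific power dissipation criterion mentioned above reduces to a simple relation for agitation speed; the impeller diameters and lab speed below are illustrative numbers only:

```python
def plant_stir_speed(n_lab_rpm, d_lab_m, d_plant_m):
    """Constant volume-specific power dissipation under geometric
    similarity: P/V ~ N^3 * D^2, hence N2 = N1 * (D1/D2)^(2/3)."""
    return n_lab_rpm * (d_lab_m / d_plant_m) ** (2.0 / 3.0)

# Illustrative: a 0.05 m lab impeller at 600 rpm scaled to a 0.5 m plant
# impeller (roughly a 1000x volume change under geometric similarity).
n_plant = plant_stir_speed(600.0, 0.05, 0.5)
print(f"plant agitation ~ {n_plant:.0f} rpm")
```

Matching P/V preserves average turbulence intensity, but it does not preserve heat-transfer area per unit volume, which is exactly the scale-dependence of heat transfer that must be checked separately.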
Figure 2. A scale-down reactor was developed that included a novel insert device (H/C finger), which enables
preservation of the heat transfer area to volume ratio and the heat transfer coefficient of the production scale
vessel. The geometry of the internal elements of the reactor mimicked that of the larger vessel.
The Dynochem model takes into account process parameters such as the agitation speed, fill level, starting
temperature and concentrations, jacket inlet temperature, and gas-phase pressure, and allows optimization of
variables for the hydrogenation process. With this information, the process was then performed at production scale
with a reaction time reduction from 22 to 12 h. The use of this dynamic process model permits direct scale-up
without the requirement for traditional pilot scale runs. Furthermore, the workflow developed will allow significant
resource savings for the scale-up of future catalytic hydrogenation reactions through the use of well-designed
laboratory experiments and accurate in-silico simulations of the full-scale process.
Several pioneering examples illustrate the potential of such autonomous systems. For example, a modular flow
system integrating online analysis and chemometric modeling has been used for rapid self-optimization of multi-
step reaction systems.14 Similarly, continuous-flow electrochemical synthesis platforms have demonstrated the self-
optimization of organic reactions using real-time monitoring techniques like FTIR and GC.15
A significant reason that these closed-loop autonomous systems remain rare in implementation is the amount of
development and customization required to build one. Self-optimization approaches require a central system that
can interface between logging data, performing analysis, modeling processes, and controlling all components
autonomously. To achieve this, researchers are often required to use custom-developed solutions to bridge across
different hardware and software, and to solve complex compatibility and connectivity issues.
Looking to the future, continued development and integration of rapidly advancing techniques holds tremendous
potential for the industry. We can envision solutions that allow seamless data exchange and control between
significant portions of the instrumentation and software, helping to make the tools and knowledge required to build
and successfully implement a fully autonomous workflow more accessible. The combined portfolio of
METTLER TOLEDO solutions for data-rich experimentation and the Scale-up Suite of modeling and simulation
software are already taking steps to make this a reality.
Combining data-rich experimentation with mechanistic kinetic modeling delivers step-change gains in process understanding, prediction, and insight. This combined approach streamlines process development,
enabling more efficient delivery of desired targets when compared to traditional workflows based on large amounts
of physical experiments and end-point analysis only. A carefully designed and well-implemented DRE and
mechanistic kinetic modeling workflow can deliver significant overall resource savings, including:
» More Value From Every Experiment: Leading to more targeted process development, with model outputs
used to guide experimental design for the next stages of development and scale-up.
» Fewer Wasted Experiments: Predicting areas of chemical space that are unlikely to deliver a successful
outcome, thus saving large amounts of wasted experimentation.
» Efficient Scale-up: Providing a robust framework for ‘right first time’ scale-up and technology transfer, thereby reducing the need for extensive piloting studies and preventing batch failures, which are a large and avoidable resource drain.
» Strategic Goal Achievement: Supporting strategic targets, such as design of sustainable and resource-
efficient processes, or ensuring that process design and optimization efforts are directed towards feasible
and impactful objectives.
Modeling is often seen as a separate activity from experimental work, but the greatest impact can be achieved
when DRE are designed and run to support the application of mechanistic kinetic models. A comprehensive solution
from METTLER TOLEDO that includes PAT, automated laboratory reactors, advanced sampling technology, and
sophisticated modeling and simulation software can effectively integrate physical and virtual experimentation, and
greatly accelerate the development, optimization, and scale-up of chemical processes.
[1] Caron, S., & Thomson, N. M. (2015). Pharmaceutical Process Chemistry: Evolution of a Contemporary Data-
Rich Laboratory Environment. The Journal of Organic Chemistry, 80(6), 2943–2958.
https://doi.org/10.1021/jo502879m
[2] Jurica, J. A., & McMullen, J. P. (2021). Automation Technologies to Enable Data-Rich Experimentation: Beyond
Design of Experiments for Process Modeling in Late-Stage Process Development. Organic Process Research &
Development, 25(2), 282–291. https://doi.org/10.1021/acs.oprd.0c00496
[3] McMullen, J.P., Wyvratt, B.M., Hong, C.M. and Purohit, A.K. (2024). Integrating Functional Principal
Component Analysis with Data-Rich Experimentation for Enhanced Drug Substance Development. Organic
Process Research & Development, 28(3), 719−728. https://doi.org/10.1021/acs.oprd.3c00379
[4] Kelly, S.M., Lebl, R., Malig, T.C., Bass, T.M., Kummli, D., Kaldre, D., Orcel, U., Tröndlin, L., Linder, D.,
Sedelmeier, J., Bachmann, S., Han, C., Zhang, H. and Gosselin, F. (2023). Synthesis of a Highly
Functionalized Quinazoline Organozinc toward KRAS G12C Inhibitor Divarasib (GDC-6036), Enabled Through
Continuous Flow Chemistry. Organic Process Research & Development.
https://doi.org/10.1021/acs.oprd.3c00164
[5] Murray, J. I., Silva Elipe, M. V., Cosbie, A., Baucom, K., Quasdorf, K., & Caille, S. (2020). Kinetic Investigations
to Enable Development of a Robust Radical Benzylic Bromination for Commercial Manufacturing of AMG 423
Dihydrochloride Hydrate. Organic Process Research & Development, 24(8), 1523–1530.
https://doi.org/10.1021/acs.oprd.0c00256
[6] Klebanov, N., & Georgakis, C. (2016). Dynamic Response Surface Models: A Data-Driven Approach for the
Analysis of Time-Varying Process Outputs. Industrial & Engineering Chemistry Research, 55(14), 4022–4034.
[7] Dong, Y., Georgakis, C., Mustakis, J., Hawkins, J. M., Han, L., Wang, K., McMullen, J. P., Grosser, S. T.,
& Stone, K. (2019). Constrained Version of the Dynamic Response Surface Methodology for Modeling
Pharmaceutical Reactions. Industrial & Engineering Chemistry Research, 58(30), 13611–13621.
https://doi.org/10.1021/acs.iecr.9b00731
[8] Griffin, D., & Huggins, S. (2018). Applying Automation and Data-Driven Modeling to Perform Rapid Reaction
Optimization. In 2018 AIChE Annual Meeting. AIChE; Griffin, D., Partopour, B., & Huggins, S. (2019).
Knowledge-Constrained Machine Learning: A Strategy for Producing Predictive Process Models in the Absence
of Mechanistic Understanding and Large Data Sets [Conference Abstract]. FOPAM.
[9] Blackmond, D. G. (2005). Reaction Progress Kinetic Analysis: A Powerful Methodology for Mechanistic Studies
of Complex Catalytic Reactions. Angewandte Chemie International Edition, 44(28), 4302–4320.
https://doi.org/10.1002/anie.200462544
[10] Burés, J. (2016). Variable Time Normalization Analysis: General Graphical Elucidation of Reaction Orders From
Concentration Profiles. Angewandte Chemie International Edition, 55(52), 16084–16087.
https://doi.org/10.1002/anie.201609757
[11] Nielsen, C. D.-T., & Burés, J. (2019). Visual Kinetic Analysis. Chemical Science, 10(2), 348–353.
[12] Crawford, M. (2024, April 15). Late-Stage Pharmaceutical Development of Nemtabrutinib (MK-1026) at MSD.
Late-Stage Pharmaceutical Development of Nemtabrutinib (MK-1026) [Webinar]. Mettler-Toledo.
[13] Cox, R. J., McCreanor, N. G., Morrison, J. A., Munday, R. H., & Taylor, B. A. (2023). Copper-Catalyzed
Racemization-Recycle of a Quaternary Center and Optimization Using a Combined Kinetics-DoE/MLR Modeling
Approach. The Journal of Organic Chemistry, 88(9), 5275–5284. https://doi.org/10.1021/acs.joc.2c02588
[15] Sagmeister, P., Ort, F. F., Jusner, C. E., Hebrault, D., Tampone, T., Buono, F. G., Williams, J. D., & Kappe, C. O.
(2022). Autonomous Multi-Step and Multi-Objective Optimization Facilitated by Real-Time Process Analytics.
Advanced Science, 9(10), 2105547. https://doi.org/10.1002/advs.202105547
» Visit our webinar collection and listen to your peers present case studies demonstrating how they are leveraging the combined power of data-rich experiments and kinetic models to optimize chemical development.
» Sign up for Dynochem Resources, our comprehensive library of models and user knowledge, to learn from our experts and the community, and build your expertise.
» Our team of technical and application consultants are ready to discuss your application and how you can take steps to start accelerating your development processes.