
Realizing the Promise of Digital Engineering:

Planning, Implementing, and Evolving the Ecosystem

William D. Schindel, ICTT System Sciences


[email protected]
Copyright © 2022 by William D. Schindel. Permission granted to INCOSE to publish and use.

Abstract. Gaining the benefits of Digital Engineering is not only about implementing digital technologies. An ecosystem for innovation is a system of systems in its own right, only partly engineered, subject to the risks and challenges of evolving socio-technical systems. This paper summarizes an aid to planning, analyzing, implementing, and improving innovation ecosystems. Represented as a configurable model-based reference pattern used by collaborating INCOSE working groups, it was initially applied in targeted INCOSE case studies, and subsequently elaborated and applied to diverse commercial and defense ecosystems. By explicating the recurrent theme of Consistency Management underlying all historical engineering, it reveals Digital Engineering's special promise and enhances understanding of historical as well as future engineering and life cycle management. It addresses preparing human and technical resources to effectively consume and exploit digital information assets (not just create them), enhancing capabilities over incremental release trains, and steering evolution using feedback and group learning.
Keywords: digital ecosystem; digital engineering; digital thread; digital twin; collaboration; MBSE

Introduction
Many large-scale human endeavors have grown up and proliferated through the evolutionary
forces of large-scale interactions and selection processes; however, as interacting systems of
systems, they have not been consciously human-engineered in the traditional sense.
Human-performed systems of innovation include interacting elements such as competitive
markets, scientific research, engineering, production, distribution, sustainment, and regulatory
processes, and other life cycle management familiar to the systems engineering community (ISO
2015), (INCOSE 2015). In the natural world, systems of innovation provide a much longer history
for discovery and study than the more recent human-performed cases (Schindel 2013). For this
paper’s interest in human-performed cases for human use, we define “innovation” as delivery of
significantly increased stakeholder value (Schindel, Peffers, et al 2011).
The term “ecosystem”, borrowed from the life sciences, is increasingly applied to label the human-performed case, in recognition of its vast extent, complexity, and dynamic evolution. Systems engineers less familiar with MBSE details are
encouraged to view this approach as a systems view of that ecosystem and systemic impacts of
information, not the details of models. The descriptive backbone of this article is the formal
INCOSE Innovation Ecosystem Reference Model, configurable across diverse specific cases.
(Since this paper is about that formal reference model, terms which are modeled class names from
that reference model are shown in title case as they appear in the named model components.)
The engineering community is certainly not without high value historical models of at least
portions of the human-performed Innovation Ecosystem. The above-referenced ISO standard and
INCOSE Handbook, the ubiquitous “Vee” model, DoD and enterprise-specific models, new
model-based standard efforts to describe the Model-Based Enterprise, and others provide vital
guidance. Out of respect for those historical assets and the importance of building upon them, they
are accommodated within and mate up with the larger-scale Innovation Ecosystem reference
model’s configurations referenced in this article.
Why is an ecosystem-level model needed? Smaller scale models have served to inform teams
about what work needs to be done, coordinate flows of information, plan information systems, and
other purposes. Is there really a need for an ecosystem level reference? Do our innovation
ecosystems work well enough, and do we understand them well enough? Consider the following.
Ecosystem-level efforts and issues are arising that challenge our group-level abilities to effectively
understand (individually and together) and communicate about the innovation ecosystem across
life cycles, and particularly so while that ecosystem itself is evolving and the stakes are rising. We
are increasingly interested in how to understand the basis of performance of the ecosystem as a
whole (as in its timely delivery of competitive solutions) through its system components and their
organization—for performance improvement, robustness, pathology, and security reasons. How
do we integrate across supply chains? Are there other effective architectures besides historical
OEM and captive supplier relationships? How can we improve the real effectiveness of those or
other combinations? Can we even effectively communicate about this subject without a shared
neutral reference model? What is the connection of the engineering community’s interest with the
business management community’s interest in “business ecosystems” (Jacobides 2017)?
Growing conversation about “digital engineering”, “digital twins”, and “digital threads” illustrates a need for foundational insight to support the “buzz” and to better connect to
history even where departures are needed. The Innovation Ecosystem Reference Model described
in this paper focuses on such a set of ecosystem issues. Following a brief introduction to the
structure of the reference model, this article summarizes selected aspects which related experience
has shown provide important insight and understanding worthy of increased attention:
1. Ecosystem-level capabilities’ connection to underlying interactions;
2. Connecting historically understood business processes to evolving digital infrastructure;
3. Consistency Management’s connection to realizing the promise of digital engineering;
4. Effectiveness of distributed, multi-level group learning across an ecosystem;
5. Group trust in the credibility of models;
6. Managing the proliferation of virtual model diversity and instances;
7. Effective evolution of the ecosystem itself—including implementation challenges.

Selected Aspects of the Innovation Ecosystem Pattern


The reference model was proposed in a series of papers to describe adaptive purpose-seeking
innovation ecosystems (Beihoff and Schindel 2011) (Schindel 2013). It was then elaborated
during a multi-year INCOSE joint project of the Agile Systems Engineering and MBSE Patterns
Working Groups to study agility across a range of aerospace and defense programs by leading
enterprises (Schindel and Dove, 2016), (Dove, Schindel and Scrapper 2016), (Dove and Schindel
2017), (Dove, Schindel, and Hartney 2017), (Dove, Schindel, and Garlington 2018), (Dove and
Schindel 2019). Since that time, it has been further elaborated by the MBSE Patterns Working
Group to study issues listed in the introduction across other enterprises, and migrated into a ge-
neric configurable S*Pattern expressed in OMG SysML. At the time of this writing, it is also being
applied as a reference model in joint publication projects by AIAA, INCOSE, and others to study a
series of Digital Twin and Digital Thread cases and principles. This article summarizes aspects of
the reference pattern translated from its more detailed OMG SysML version, using accurate but
less formal graphic renditions, for ease of comprehension.
Reference Model Structure. Figures 1-3 informally summarize Levels 0-2, the first three decomposition levels of the formal model's logical architecture.

Figure 1. Level 0 Logical Architecture, Systems 1, 2, and 3


By Level 2, these separate the roles played by ecosystem information classes from the business
and technical processes that produce and consume that information. The blocks shown represent
generic configurable logical roles (behaviors), not specific methods, until they are configured.
Prominent in this decomposition are three reference boundaries, for defined Systems 1, 2, and 3:
System 1--The Engineered System of Interest: Viewed at any and all times in its life cycle.
System 2--The Life Cycle Domain System: The environment with which the Engineered
System interacts, across its life cycle. This includes all Life Cycle Management systems re-
sponsible for the Engineered System (research, engineering, manufacturing, distribution,
markets, operations, sustainment). System 2 is responsible to observe and learn about System 1
and its environment, not just engineer and deploy it. A model or artifact describing System 1 is
a subsystem of System 2, which also includes collaborating users of that information.
System 3—The Innovation Ecosystem: Includes the system responsible to plan, deploy, and
evolve System 2, responsible to observe and learn about System 2 and its environment.
Writing and reading this article are System 3 activities, as are many other technical society
activities intended to improve the future System 2’s of the world.
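The nesting of these three reference boundaries can be sketched as a minimal data structure. This is a hypothetical illustration only; the class and attribute names below are invented for clarity and are not drawn from the reference model itself:

```python
from dataclasses import dataclass, field

@dataclass
class System1:
    """The Engineered System of Interest, viewed at any point in its life cycle."""
    name: str
    life_cycle_stage: str  # e.g. "engineering", "production", "operation"

@dataclass
class System2:
    """Life Cycle Domain System: manages, observes, and learns about System 1.
    Note that models describing System 1 are subsystems of System 2."""
    system1: System1
    life_cycle_processes: list = field(default_factory=list)  # research, engineering, ...
    models_of_system1: list = field(default_factory=list)     # artifacts describing System 1

@dataclass
class System3:
    """Innovation Ecosystem: plans, deploys, evolves, and learns about System 2."""
    system2: System2
    observations_of_system2: list = field(default_factory=list)

# A hypothetical configuration; a paper like this one is itself a System 3 activity.
s1 = System1("uncrewed aerial vehicle", "engineering")
s2 = System2(s1, life_cycle_processes=["research", "engineering", "production"])
s3 = System3(s2)
```

The key structural point the sketch captures is containment of concern, not physical containment: System 3 holds a description of System 2, which in turn holds descriptions of System 1.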
As an MBSE S*Pattern (a reusable, configurable MBSE model), the reference model has more
components than just logical architecture, including stakeholder features (Figure 4) describing
configurable ecosystem capabilities, functional interactions between functional roles, interfaces
and systems of access, allocations to design components, attributes, and other components,
mapped into OMG SysML. The details of the pattern methods of representation are beyond the
scope of this ecosystem model article, but are described further in (Schindel and Peterson 2016),
(INCOSE Patterns WG 2019b), and (Patterns WG 2020a).

Figure 2. Level 1 Logical Architecture--Separates Learning from Applying What Is Learned


Figure 3. Level 2 Logical Architecture--Process Roles versus Information Roles
1. Ecosystem-level capabilities’ connection to underlying interactions. Our first concern for
an Innovation Ecosystem is for its capabilities. Figure 4 summarizes the modeled Stakeholder
Features built into the configurable reference model. For a given current or planned ecosystem of
interest, these are configured by variably populating them (multiple instances in some cases) or
not, and setting their attribute values, similarly to viewing the Innovation Ecosystem as a config-
urable Product Line Engineering model—but as a product line of configurable ecosystems. The
resulting configured feature model represents the overall capabilities of an innovation ecosystem
of interest—whether past, current, or future, whether favorable or unfavorable, for analysis,
planning, communication, or other purposes. A series of these configurations represents a planned
or real trajectory of ecosystem capabilities evolution over time. Figure 4 shows sample capabilities
(features and their attributes) from ISO15288 systems engineering, along with agile engineering
capabilities, digital threads and twins, and other capabilities at a stakeholder level. The feature
attributes (properties) shown include Feature Primary Key attributes whose configured values
invoke modeled population of specific ecosystem interactions of the roles from Figure 3,
providing technical behaviors delivering the configured capabilities.
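A rough sketch of how such feature configuration might be represented follows, in the spirit of a Product Line Engineering feature model. The feature names, primary key values, and attributes here are hypothetical examples for illustration, not the actual Figure 4 content:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FeatureInstance:
    """One populated Stakeholder Feature in a configured ecosystem model."""
    feature: str          # modeled Stakeholder Feature class name
    primary_key: str      # Feature Primary Key value selecting behavior variants
    attributes: tuple     # (name, value) pairs for other feature attributes

# One configuration of the reference model = the set of populated feature instances.
# Omitted features are simply absent; some features may be multiply populated.
configuration = {
    FeatureInstance("Digital Thread", primary_key="requirements-to-test",
                    attributes=(("coverage", "partial"),)),
    FeatureInstance("Digital Twin", primary_key="operations",
                    attributes=(("fidelity", "reduced-order"),)),
    FeatureInstance("Agile Release Train", primary_key="quarterly",
                    attributes=(("cadence_weeks", 12),)),
}

def has_capability(config, feature_name):
    """An ecosystem 'has' a capability if any instance of that feature is populated."""
    return any(f.feature == feature_name for f in config)
```

A planned trajectory of ecosystem evolution would then be a time-ordered series of such configurations, each differing in which features are populated and in their attribute values.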
2. Connecting historically understood business processes to evolving digital infrastructure.
The System Life Cycle Business Processes shown in the upper sections of Figures 3 and 5 represent the ecosystem's (supply chain partners, enterprises, etc.) existing or planned business processes, whether traditional or evolving, for research, engineering, production, distribution, sustainment, and other life cycle management. It is these processes (typically some targeted subset of them) that the Digital Engineering enhancements shown
in other blocks are to advance, as discussed in the following sections. The important point here is
that the advanced digital engineering roles to be discussed next are by this means connected to the
more familiar existing, traditional, or planned local reference business process framework they are
to serve and enhance. We are now ready to connect those business processes to the digital engi-
neering promise, using the key insight of the Consistency Management role introduced in Figure 3.
Figure 4. Configurable Stakeholder Features (Innovation Ecosystem System 2 Capabilities)

Figure 5. Business Processes of the Ecosystem Appear in the Configurable Reference Model
3. Consistency Management’s connection to realizing the promise of digital engineering.
The traditional systems engineering “Vee diagram” in the lower left of Figure 5, along with the
other adjacent USDoD and enterprise models, all remind us that all engineering methods in one
way or another inherently manage a series of “gaps” into acceptable “consistencies”:
• Consistency of formally recorded system requirements with stakeholder needs
• Consistency of system designs with system requirements
• Consistency of virtual simulations with empirical measurements (model VVUQ)
• Consistency of system component production with system design
• Consistency of system performance with system requirements
• Consistency of system operation with system requirements and design
• Consistency of system sustainment with system requirements and design
• Consistencies of many aspects with applicable technical standards, regulation, and law
• Consistencies of many aspects with learned experiences, formal patterns of requirements and
design, physical science, product line rules, architectural frameworks, shared ontologies,
domain specific languages, and model semantics
• Managed consistencies of the Digital Thread and Digital Twin
• Many other types of consistencies
Nearly all of these were also required consistencies in the traditional, more “tolerant” human-performed ecosystems lacking as much digital technology, even if not recognized as such.
The Consistency Management Role in Figure 3 represents the configurable set of process roles
responsible for consistency management—whether performed by humans or automated, and
whether performed well or not. It is understandable that much of this role has historically been
performed by humans, because of required skills, judgement, experience, and information forms.
The digital engineering and modeling community finds itself in frequent conversations about a
perceived need for a “single source of truth” or “authority”, reflecting frustrations with diverse and
inconsistent information about systems. Figure 6 reminds us this situation is not as simple as might
be assumed, showing the three main sources of information in any ecosystem:
T1. What the stakeholders say (market and sponsor truths);
T2. What experience says (accumulated, hard-won past discoveries; includes physical science);
T3. What empirical observation says (observation, measurement, experiment).
The challenge is that these three sources will frequently be inconsistent (disagree with each other).
The Figure 3 Consistency Management Roles of engineering and other life cycle management
processes historically must recognize (detect) those inconsistencies and reconcile them. While the
resulting reconciliations may be considered “authoritative” or “single”, they are short-lived.
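A toy illustration of the detect-and-reconcile cycle follows. The three-source structure mirrors Figure 6, but the parameter, values, and tolerance rule are invented placeholders, not anything prescribed by the reference model:

```python
def detect_inconsistencies(t1, t2, t3, tolerance):
    """Flag parameters on which the three truth sources disagree beyond a tolerance.

    t1: stakeholder claims, t2: accumulated experience, t3: empirical observation;
    each is a dict mapping parameter name -> numeric value.
    """
    gaps = {}
    for key in set(t1) & set(t2) & set(t3):
        values = (t1[key], t2[key], t3[key])
        if max(values) - min(values) > tolerance:
            gaps[key] = values
    return gaps

# Hypothetical values for a single parameter, e.g. required endurance (hours):
t1 = {"endurance_h": 10.0}   # what the sponsor asks for
t2 = {"endurance_h": 7.5}    # what past designs of this class achieved
t3 = {"endurance_h": 7.8}    # what the latest flight test measured

gaps = detect_inconsistencies(t1, t2, t3, tolerance=0.5)
# A reconciliation (e.g., a negotiated requirement change) would then be recorded,
# but remains "authoritative" only until one of the three sources moves again.
```

The point of the sketch is that "single source of truth" is really a managed reconciliation across three independently moving sources, which is why any reconciled value is short-lived.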
The rise of interest in digital thread and digital twin methods fits into this consistency management
perspective. This is currently being applied in a series of industry case studies by AIAA with
INCOSE support. In the case of the digital twin, it reminds us of the importance of (1) managing
both consistency between the virtual simulation model and the real system it simulates, and (2)
managing the consistency of business processes and their information with what the trusted digital
twin virtual model tells them. In the case of the digital thread (Figure 7), the central issue of the
“thread” is managed consistency between a range of information objects along that thread. (Even
sources external to the thread generate information samples within it.) Historical predecessors to
the digital thread bring important perspective to this evolution. Depending on industry domain,
these include (SAE 2016), (AIAG 2006), (ISO 2016).

Figure 6. Roots of the Consistency Management Challenge

Figure 7. The Consistency Thread—Antecedent of the Digital Thread


Because consistency gaps are often rooted in conflicting interests of different parties, the Con-
sistency Management role is the potential site for impactful multi-party collaboration across the
ecosystem or supply chain. Enabling this collaboration with explicit models of the respective
parties’ collaboration configuration spaces makes it easier to understand it as a problem of dif-
ferential or modular games (Schindel and Seidman 2021), (Schindel 2021), (Leitmann 1975).
The history of consistency management across the product life cycle has seen varied gap sizes at
some stages versus others. This has meant that production, logistics, sustainment, and operation
consistency gaps may be larger or longer-lived until reconciled. The ASELCM (Agile Systems Engineering Life Cycle Model) analysis framework helps us to see that these may be viewed not just as consistency gaps in System 1’s life cycle
(as viewed by System 2), but also as consistency gaps in the description of System 2 (as observed
and modeled by System 3). This suggests another way to recognize and head off these gaps sooner
and at less cost.
Many benefits sought through transformation to Digital Engineering have been discussed widely,
such as basic issues of improved information accessibility, early virtual verification through sim-
ulation, and other gains. The Innovation Ecosystem Pattern reminds us, through the Consistency
Management Role, of the wider promise that a variety of Consistency Management issues at the
heart of every life cycle stage may ultimately be attacked more effectively through the aid of
digital information technologies that assist in Consistency Management. These include semantic
web technologies, machine learning, consistency thread signatures, configurable patterns, and
pattern-based model metadata. (Herzig and Paredis 2014), (Herzig, Qamar and Paredis 2014),
(Kerstetter and Woodham 2014), (Redman, 2014), (Patterns WG 2020b).
4. Effectiveness of distributed, multi-level, group learning across an ecosystem. The promise
of digital engineering should not be to optimize single program outcomes while “forgetting” what
is learned when the next program starts, nor to arbitrarily isolate one team’s learning from other
teams within a shared community. Traditional descriptions of the SE life cycle processes (e.g.,
ISO 15288, INCOSE SE Handbook, etc.) describe all the processes a program should follow to
generate all the information needed across the life cycle, but are relatively silent on the questions:
“What about what we already know?” and “What about the impact on future programs of what we
learned the hard way on past programs?” This raises the question of what is really meant by “what we learned” and “what we know”—what is group knowledge?
The balancing of acquiring and validating new information against exploiting existing information is also frequently omitted in those descriptions, left for separate consideration. This is in ironic contrast to one of the great successes of modern signal processing and control
theory—the optimal mixing of past experience with new information, in the presence of uncer-
tainty, discussed further in a later section below.
The Innovation Ecosystem model views effective learning as not just accumulation of information
as IP assets, but instead as improvement of future performance across the ecosystem based on past
experience. This especially includes more effective application of learned results that were ac-
quired by different people at different times. The Ecosystem Pattern makes explicit the two roles
of learning and subsequent application, and their integration—refer to Figure 2. That integration
can include starting new program executions by configuring general learned System 1 and 2 pat-
terns in a form specific to the new program. To the degree it is performed, this capability is referred
to by the MBSE Patterns Working Group as “pattern-based systems engineering” (PBSE) (Patterns WG 2020a).
Key System 2 capabilities that, if present, contribute to that performance include:
- Synthesizing Generalization: Distillation of learning as model-based abstractions, cu-
rated at the abstraction hierarchy level where they can have the greatest future impact. The
“up” (Learn) arrows in Figure 2;
- Validation for Context of Use: Reusable configurable model verification, validation, and
uncertainty quantification, credibility assessment, establishment of pattern metadata on
provenance, credibility, and intended range of use. More on this in next section;
- Configuring Specialization: Harvesting of accumulated learned patterns at the place and
time (and by the people for whom) they are impactful, through their configuration into new
projects as part of the initiation of those projects. The “down” (Apply) arrows in Figure 2.
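The learn-up and apply-down cycle of Figure 2 can be sketched as two complementary operations on a shared pattern. The pattern content, placeholder names, and lessons below are invented for illustration and are not taken from any actual S*Pattern:

```python
# A general learned pattern (a System 2 asset) is specialized to start a new program.
general_pattern = {
    "requirements": {"max_mass_kg": None, "endurance_h": None},  # placeholders
    "design_rules": ["modular payload bay", "common avionics bus"],
    "verification_plan": ["bench test", "flight test"],
}

def configure(pattern, program_specifics):
    """Configuring Specialization: fill the pattern's placeholders for a new program."""
    configured = {k: (v.copy() if isinstance(v, (dict, list)) else v)
                  for k, v in pattern.items()}
    configured["requirements"].update(program_specifics)
    return configured

def generalize(pattern, lesson):
    """Synthesizing Generalization: fold a learned rule back into the shared pattern."""
    if lesson not in pattern["design_rules"]:
        pattern["design_rules"].append(lesson)

program_a = configure(general_pattern, {"max_mass_kg": 25, "endurance_h": 8})
generalize(general_pattern, "thermal margin 20%")   # learned the hard way on program A
program_b = configure(general_pattern, {"max_mass_kg": 40, "endurance_h": 12})
# Program B inherits the lesson at initiation, without having paid for it.
```

The design point the sketch makes explicit is that learning lands in the shared pattern, not in any single program's artifacts, so its benefit accrues to every subsequent configuration.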
After it is understood that configuration space is not “flat”, but organized by evolving patterns at
different abstraction levels, two challenging opportunities can be better understood:
- The dynamic evolutionary nature of semantic interoperability: Domain-specific on-
tologies will continue to spring up as long as new system interactions and interaction levels
are pursued describing new phenomena—and this is forever. One of the competencies
required of the digital ecosystem is continuous collaborative synthesis of new, often
higher-level, semantic frameworks for interoperability (Schindel 2020).
- The opportunities for sharing and ownership at different levels: Shared frameworks
across large ecosystems can lift all boats, as in the case of pre-competitive
standards shared by competitors—but can be perceived as counter to the interests of indi-
vidual suppliers, customers or employees who wish to own, control, or be differentiated by
less shared models. Non-flat pattern hierarchy allows for mixing of shared ecosystem-wide
generic patterns with compatible specializations controlled or licensed by competitive
ecosystem members, providing simultaneous differentiation and compatibility.
Digital Engineering offers special promise in the above areas through the use of information
technologies that empower virtual models, their generalization and configuration, and related
processes with capacity exceeding human performance alone. But it also demands new human
skills and orchestration on the human side of the Digital Engineering partnership. Model-based
group learning is also related to issues of trust in model credibility, discussed next.
5. Group trust in the credibility of models. Model credibility involves the verification and
validation of a model’s fitness for use for a stated purpose (ASME 2018), explicit tracking of
related uncertainties (NAE 2012), and larger issues of propagation of trust (Rhodes 2018). The
growing proliferation of model instances, types, and uses means that more uniform model
metadata approaches are becoming important to describe those diverse assets in more uniform
ways—somewhat like the emergence of bar code labels on supermarket products. Because there
are a variety of model credibility factors that may be applied, Credibility Assessment Frameworks
(CAFs) can serve a useful purpose as part of that model metadata. (Kaizer 2018) The INCOSE
MBSE Patterns Working Group has developed a Model Characterization Pattern (MCP) descrip-
tive of models of all types (Patterns WG 2019a), building in enterprise-configured CAFs.
Many aspects of the engineering cycle are concerned with determining whether aspects of related
information are worthy of trust for use in a given context. When this interest is translated to operate
with virtual models, it is bolstered by the powerful technical toolset developed over the longer
history of the (model-based) scientific revolution, in which the credibility of candidate models,
and their repeated uses across different instances are both central. Computational model verifi-
cation, validation, and uncertainty quantification (VVUQ) is a vital portion of this infrastructure.
Group trust in model credibility is not just a technical matter of the fidelity of the models them-
selves. Group trust is a socially-transmitted property, in which additional credibility factors such
as trust in intermediate messengers and interpreters carry great weight (Rhodes 2018). Models of
how credibility (or doubts of credibility) are propagated through ecosystems can illustrate the
contest of multiple factors impacting group trust, distrust, confidence, or doubt. The above
Credibility Assessment Frameworks (CAFs) preserve for future reference the basis on which
credibility was assessed for a given model, whether it later proves to be valid or not.
6. Managing the proliferation of model diversity and instances. Such model credibility information is a special case of a larger class of model metadata—information outside a virtual model
that describes the virtual model. Model metadata can variously include description of a model’s
focal subject, structure, algorithms, intended model use and context of that use, model provenance,
model credibility, the nature and scope of the virtual model, and refer to related model artifacts,
datasets, and life cycle maintenance history. Figure 3 graphically notes the role that model
metadata plays within the innovation ecosystem, describing diverse virtual models (and datasets)
to their potential users, as a kind of uniform “labeling wrapper” of evolving virtual models. While
it has been common to consider many aspects of information technology in planning Digital Engineering, the broader roles of virtual model metadata deserve expanded awareness.
The diversity of types of virtual models includes computational models (simulations of all kinds)
and descriptive MBSE models, but can also include other forms of formalized standards-based data
structures. Simulations alone may include physics-based FEA and CFD discretized continuum
simulations, ordinary differential equation-based simulations, machine learning models and other
forms of data-driven models, and others. Adding to this diversity are varied model author styles,
computing environments, and methodologies for model verification, validation, uncertainty
quantification, and credibility management. The resulting explosion of model diversity as well as
model quantities is exacerbated by increasing separation between model authors and model users.
The Model Wrapper generic metadata role shown in Figure 3 serves purposes similar to the
package labeling, inserts, and supplemental downloads common to consumer products. Imagine
walking into a modern supermarket, big box store, or distributor web site, and finding that all the
package and shelf labeling and explanations have disappeared except for the ability to directly
view the products (remember earlier open-air market bazaars). This conveys some idea of the
current situation concerning proliferation of thousands of models within an enterprise, and even
more pronounced across a future multi-enterprise ecosystem in which exchange of models occurs.
Generic metadata frameworks for engineering models, such as the Model Characterization Pattern
(Patterns WG 2019a) and Model Identity Card (MIC) (Goknur 2015) are key enablers of the effectiveness of digital engineering in the Innovation Ecosystem.
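A minimal sketch of such a “labeling wrapper” record follows. The fields track the metadata categories listed above, but the schema itself is invented for illustration; it is not the actual Model Characterization Pattern or Model Identity Card:

```python
from dataclasses import dataclass, field

@dataclass
class ModelWrapper:
    """Metadata describing a virtual model to its potential users."""
    model_id: str
    subject: str              # focal subject the model describes
    model_type: str           # e.g. "FEA", "ODE simulation", "descriptive MBSE"
    intended_use: str         # context of use the model was validated for
    provenance: str           # author, enterprise, toolchain, version
    credibility: dict = field(default_factory=dict)  # CAF factor -> assessed level
    related_artifacts: list = field(default_factory=list)  # datasets, documents

def fit_for(wrapper, use):
    """A consumer-side check: was this model validated for my intended use?"""
    return use == wrapper.intended_use

# A hypothetical wrapper for one of thousands of enterprise models:
wing_model = ModelWrapper(
    model_id="M-042", subject="wing structure", model_type="FEA",
    intended_use="static load verification", provenance="Structures group, v3 mesh",
    credibility={"VVUQ": "documented", "peer review": "completed"},
)
```

Like the supermarket label in the analogy above, the wrapper lets a model user who never met the model author decide, from uniform metadata alone, whether a model is fit for a contemplated use.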
7. Effective evolution of the ecosystem itself—including implementation. Among the promises
of the digital engineering ecosystem is its own adaptability, as future environments and market
situations demand. An essential capability described by the Innovation Ecosystem Pattern is that
adaptability. In Figures 1 and 2, System 3 is concerned with adaptability of System 2, beginning by
observing and representing it, followed by analyzing and deploying adaptations to System 2 in-
stances. Viewing System 2 through the lens of systems engineering, this includes implementation.
The deployed or updated “design components” of System 2 are collaborating people, enterprises,
information systems, equipment, and facilities of System 2, and how they are organized (interact
with each other), planned over agile release train configurations of the System 2 pattern. In addi-
tion to the challenges of engineering, such adaptation implementation also carries all the chal-
lenges of enterprise organizational change management (OCM) (Kotter 2014). Just as the forces of
multi-stage selection operate over the life cycles of the engineered products of System 1, (other)
multi-stage selection forces also shape the evolution of System 2 (Patterns WG 2020a). Under-
standing those forces is essential to the conscious design of (or at least influence on) the evolution
of System 2. For complex business ecosystems involving multiple partners, not only is the
alignment of their technical capabilities vital, but also the alignment of their business interests and
incentives. These issues should remind us that successful collaboration across System 2 requires
more than just a digital medium for that collaboration. Heeding the wisdom of the lengthy related
literature (e.g., (Kotter 2014)) on organizational change is a key part of implementation planning.
In the language of the business management community, “business ecosystem” has come to refer to
particular ecosystem architectures (for System 2) which operate flexibly as small “markets” in
which modularity of the System 1 technical approach encourages a more dynamic (and accord-
ingly less stable) arrival and departure of competing candidate System 2 partners offering con-
tributions to solutions (Jacobides 2017). This contrasts with the traditional linear supply chain model of an OEM plus captive smaller suppliers. Both the advantages and disadvantages of such
approaches can be seen in the real history of the personal computer (PC). Early PCs were propri-
etary closed architectures from competing end product suppliers. This picture was disrupted when
IBM opened the digital product’s electronic circuit card bus specification and business ecosystem
to third party suppliers who could directly supply add-in circuit cards to the end user. The market
dramatically expanded through innovative add-ons, lifting all boats, but eventually driving the
originator (IBM) of that approach out of the market. These are not just stories of the System 1
architecture, but also of the System 2 architecture.
Selection processes performed by System 2 and 3 can be understood as cycles of their Consistency
Management Roles (see Figure 3), selecting opportunities, requirements, candidate designs, and
other aspects of both System 1 products and System 2 enterprise designs. In those cycles, Digital
Engineering offers special promise for exploiting the following “Goldilocks” insight from the
successful history of engineering certain challenging systems:
- More consideration of empirical inputs: When more agility was needed to converge sooner
on the real needs of stakeholders and real solutions to them, the pioneers of agile engineering
introduced cycles that paid earlier, more frequent, and ongoing attention to incoming reality
signals from System 2 experiment and empirical measurements involving real world signals
instead of isolated planning. The upside of this produces early minimum viable products
(MVPs), rapid learning by individuals and small teams, and successful “pivots”. On the
downside, it may miss exploitation of what was already discovered and can produce
ill-conceived course changes chasing noisy data.
- More consideration of patterns of experience: When more instances of variant products
proliferated to address different market segments, the pioneers of design patterns and product
line engineering introduced cycles that paid more attention to shared historical patterns of
product designs, requirements, and other common but configurable assets. The upside includes
increased IP leverage and flexibility. Overdone, it risks constraints that miss external
shifts and trends, dragging along too much of the past.
- Goldilocks as Kalman: optimally mixing observation and experience: Formal systems
engineering process descriptions often tell us all the things we should do to learn what is
needed for good life cycles, but may be silent on the questions “what about what we already
know?” and “how can we discover new things sooner?”, addressed by the two complementary
points above. In one of the most impactful examples of breakthrough engineering through
applied mathematics, Rudolf Kalman introduced an approach to optimally mixing these two
sources in the presence of uncertainty: the Kalman Filter approach to Bayesian estimation,
which powered navigation for the Moon landings, world-wide personal communication systems,
countless industrial control systems, and other applications of this combination. Digital
Engineering offers a medium in which the Consistency Management Role of Figure 3 can be
advanced to leverage those insights in support of human decision-making (Schindel 2017b).
Improving ontological patterns and their use can improve the meaning and understanding of
empirical data from improved sensory and observational networks. Collaborative ecosystem
efforts to create capabilities such as JADC2 can benefit from these historical insights (CRS 2021).
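To make the Kalman-style blending concrete, the following is a minimal one-dimensional sketch (illustrative only, not from the paper; all function names and numeric values are the author's assumptions): the filter weights a prediction based on prior experience against each new noisy observation in proportion to their respective uncertainties, mirroring the mix of "patterns of experience" and "empirical inputs" described above.

```python
# Minimal 1-D Kalman filter sketch: optimally blending a prior estimate
# ("experience") with noisy empirical observations, weighted by variance.
# Illustrative values only; a real application would model its own noise.

def kalman_update(x_prior, p_prior, z, r):
    """Blend prior estimate (x_prior, variance p_prior) with a
    measurement z (variance r); return posterior estimate and variance."""
    k = p_prior / (p_prior + r)           # Kalman gain: trust in new data
    x_post = x_prior + k * (z - x_prior)  # correct prediction toward data
    p_post = (1 - k) * p_prior            # uncertainty shrinks after update
    return x_post, p_post

def kalman_1d(z_series, x0=0.0, p0=1.0, q=0.01, r=0.25):
    """Track a slowly varying scalar from noisy measurements z_series.
    q: process noise (how fast reality drifts); r: measurement noise."""
    x, p = x0, p0
    estimates = []
    for z in z_series:
        p = p + q                         # predict: uncertainty grows
        x, p = kalman_update(x, p, z, r)  # update: fold in the observation
        estimates.append(x)
    return estimates

# Noisy observations of a true value near 1.0
zs = [1.2, 0.8, 1.1, 0.9, 1.05, 0.95]
est = kalman_1d(zs)
print(est[-1])  # settles near the true value, neither chasing noise
                # nor clinging to the (wrong) initial prior of 0.0
```

The design point is the gain `k`: high prior uncertainty pushes it toward 1 (trust the data, the agile pole), while low prior uncertainty pushes it toward 0 (trust accumulated experience, the pattern pole), giving the "Goldilocks" mix automatically.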

Conclusions, next steps, and an invitation


The seven selected aspects of the Ecosystem Pattern discussed in this paper demand greater
community-wide attention in planning and analyzing digital ecosystems, and the neutral descrip-
tive framework described offers a means of doing so. The systems engineering community has a
shared interest in the network benefits of community-wide advancement of ecosystems for digital
engineering. The INCOSE MBSE Patterns Working Group continues to pursue the discovery and
expression of explicit model-based patterns, which fuel digital ecosystems as “water through their
pipes”, but which also represent those ecosystems themselves (Patterns WG 2021).
The Patterns Working Group conducts most of its activities as collaborations with other INCOSE
and additional technical society groups, to advance awareness and the state of practice. Interested
readers are invited to participate in this progress and learn along with us about use of the related
aids and examples that this reference pattern supports:
• Details of the Ecosystem Pattern, now being tested in its OMG SysML form
• The Ecosystem Pattern as a digital engineering capability planning aid (Patterns WG 2020c)
• Basics of S*Models, S*Patterns, and the S*Metamodel (Patterns WG 2019b)
• Domain specific applications of model-based patterns (Patterns WG 2021)

Acknowledgements
The Ecosystem Pattern is informed by the practices and ideas from numerous pioneers and prac-
titioners. The encouragement, suggestions, and inspiration from Rick Dove, chair of the INCOSE
Agile Systems Engineering Working Group, the lead team of the INCOSE Agile Systems Dis-
covery Project, and the membership of INCOSE MBSE Patterns Working group, along with the
anonymous reviewers of this paper, are all acknowledged with gratitude.

References
AIAG 2006, ‘APQP & PPAP Requirements for Automotive’, Automotive Industry Action Group,
Southfield, MI (US).
<https://www.techstreet.com/standards/aiag-ppap-4?product_id=1257705>
ASME 2018, ‘VV40: Assessing Credibility of Computational Modeling through Verification and
Validation: Application to Medical Devices’, American Society of Mechanical Engineers,
New York, NY (US).
<https://www.asme.org/codes-standards/find-codes-standards/v-v-40-assessing-credibility-computational-modeling-verification-validation-application-medical-devices>
Beihoff, B & Schindel, W 2011, ‘Systems of Innovation I: Summary Models of SOI Health and
Pathologies’, in Proc. of 2011 International Symposium on Systems Engineering,
International Council on Systems Engineering, San Diego, CA (US).
CRS 2021, ‘Joint All-Domain Command and Control (JADC2): Background and Issues for
Congress’, Congressional Research Service, Washington, DC (US).
<https://crsreports.congress.gov/product/pdf/R/R46725/2>
Dove, R & Schindel, W 2019, ‘Agile Systems Engineering Life Cycle Model for Mixed Discipline
Engineering’, in Proc. of 2019 International Symposium on Systems Engineering,
International Council on Systems Engineering, San Diego, CA (US).
Dove, R & Schindel, W 2017, ‘Case Study: Agile SE Process for Centralized SoS Sustainment at
Northrop Grumman’, in Proc. of 2017 International Symposium on Systems Engineering,
International Council on Systems Engineering, San Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:is2017--northrup_grumman_case_study_dove_and_schindel_bp.pdf>
Dove, R, Schindel, W & Garlington, K 2018, ‘Case Study: Agile Systems Engineering at
Lockheed Martin Aeronautics Integrated Fighter Group’, in Proc. of 2018 International
Symposium on Systems Engineering, International Council on Systems Engineering, San
Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:is2018_-_aselcm_lmc_case_study.pdf>
Dove, R, Schindel, W & Hartney, W 2017, ‘Case Study: Agile Hardware/Firmware/Software
Product Line Engineering at Rockwell Collins’, in Proc. of 11th Annual IEEE International
Systems Conference, Institute of Electrical and Electronic Engineers, New York, NY (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:pap170424syscon-casestudyrc.pdf>
Dove, R, Schindel, W & Scrapper, C 2016, ‘Agile Systems Engineering Process Features
Collective Culture, Consciousness, and Conscience at SSC Pacific Unmanned Systems
Group’, in Proc. of 2016 International Symposium on Systems Engineering, International
Council on Systems Engineering, San Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:is2016_--_autonomous_vehicle_development_navy_spawar.pdf>
Göknur, S, Paredis, C, Yannou, B, Coatanéa, E & Landel, E 2015, ‘A Model Identity Card to
Support Engineering Analysis Model (EAM) Development Process in a Collaborative
Multidisciplinary Design Environment’, IEEE Systems Journal, IEEE, New York, NY
(US). <https://hal.archives-ouvertes.fr/hal-01184938/document>
Herzig, S & Paredis, C 2014, ‘A Conceptual Basis for Inconsistency Management in Model-Based
Systems Engineering’, in Proc. of CIRP 2014 Design Conference, International Academy
for Production Engineering, Paris, France.
<https://www.sciencedirect.com/science/article/pii/S2212827114007586/pdf?md5=c9bdd8aba94e820ec43b56330225daa6&pid=1-s2.0-S2212827114007586-main.pdf>
Herzig, S, Qamar, A & Paredis, C 2014, ‘Inconsistency Management in Model-Based Systems
Engineering’, in Proc. of 2014 Global Product Data Interoperability Summit, Southfield,
MI (US).
<http://gpdisonline.com/wp-content/uploads/past-presentations/AC45_GeorgiaTech-SebastianHerzig-InconsistencyManagementInMBSE.pdf>
ISO 2015, ‘ISO 15288:2015 Systems and Software Engineering — System Life Cycle Processes’,
International Organization for Standardization, Geneva, Switzerland.
<https://www.iso.org/standard/63711.html>
ISO 2016, ‘ISO 13485:2016 Medical Devices — Quality Management Systems — Requirements
for Regulatory Purposes’, International Organization for Standardization, Geneva,
Switzerland. <https://www.iso.org/standard/59752.html>
Jacobides, M 2017, ‘Towards a Theory of Ecosystems (with Phenomenological Preamble)’,
keynote presentation at the 5th International Conference of the Armand Peugeot Chair,
Paris, France.
<https://chairgovreg.fondation-dauphine.fr/sites/chairgovreg.fondation-dauphine.fr/files/attachments/JCG%20CAP%20Paris%202017%20presentation%20S.pdf>
Kaizer, J 2018, ‘Credibility Assessment Frameworks - Personal Views’, presented at ASME
Symposium on Verification and Validation, American Society of Mechanical Engineers,
New York, NY (US).
<https://cstools.asme.org/csconnect/FileUpload.cfm?View=yes&ID=54674>
Kerstetter, M & Woodham, K 2014, ‘SAVI Behavior Model Consistency Analysis’, in Proc. of
2014 Global Product Data Interoperability Summit, Southfield, MI (US).
<http://gpdisonline.com/wp-content/uploads/past-presentations/AVSI-Kerstetter-SAVIBehaviorModelConsistencyAnalysis-CAE-Open.pdf>
Kotter, J 2014, Accelerate: Building Strategic Agility for a Faster-Moving World, Harvard
Business Review Press, Cambridge, MA (US).
Leitmann, G 1975, ‘Cooperative and Non-Cooperative Differential Games’, in Leitmann, G &
Marzollo, A (eds), Multicriteria Decision Making, International Centre for Mechanical
Sciences (Courses and Lectures), vol. 211, Springer, Vienna, Austria.
<https://doi.org/10.1007/978-3-7091-2438-3>
NAE 2012, ‘Assessing the Reliability of Complex Models: Mathematical and Statistical
Foundations of Verification, Validation, and Uncertainty Quantification’, National
Academy of Engineering, Washington, DC (US).
<https://www.nap.edu/catalog/13395/assessing-the-reliability-of-complex-models-mathematical-and-statistical-foundations>
Patterns WG 2019a, ‘The Model Characterization Pattern: A Universal Characterization &
Labeling S*Pattern for All Computational Models’, V1.9.3, INCOSE Patterns Working
Group web site, International Council on Systems Engineering, San Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:model_characterization_pattern_mcp_v1.9.3.pdf>
Patterns WG 2019b, ‘Methodology Summary: Pattern-Based Systems Engineering (PBSE), Based
On S*MBSE Models’, INCOSE Patterns Working Group web site, International Council
on Systems Engineering, San Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:pbse_extension_of_mbse--methodology_summary_v1.6.1.pdf>
Patterns WG 2020a, ‘ASELCM Reference Pattern: Reference Configuration Stages for Models,
Model Patterns, and the Real Systems They Represent’, INCOSE Patterns Working Group
web site, International Council on Systems Engineering, San Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:configuration_stages_v1.4.5.pdf>
Patterns WG 2020b, ‘Consistency Management as an Integrating Paradigm for Digital Life Cycle
Management with Learning’, INCOSE Patterns Working Group web site, International
Council on Systems Engineering, San Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:aselcm_pattern_--_consistency_management_as_a_digital_life_cycle_management_paradigm_v1.2.2.pdf>
Patterns WG 2020c, ‘Example Use of ASELCM Pattern for Analyzing Current State, Describing
Future State, and Constructing Incremental Release Roadmap to Future’, INCOSE Patterns
Working Group web site, International Council on Systems Engineering, San Diego, CA
(US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:example_evolutionary_roadmap_v1.3.3a.pdf>
Patterns WG 2021, INCOSE MBSE Patterns Working Group web site:
<https://www.omgwiki.org/MBSE/doku.php?id=mbse:patterns:patterns>
Redman, D 2014, ‘Importance of Consistency Checking in the SAVI Virtual Integration Process
(VIP)’, in Proc. of 2014 Global Product Data Interoperability Summit, Southfield, MI
(US).
<http://gpdisonline.com/wp-content/uploads/past-presentations/SE_67_AVSI-Redman-ConsistencyCheckingInSAVI.pdf>
Rhodes, D 2018, ‘Interactive Model-Centric Systems Engineering (IMCSE) Phase 5 Technical
Report SERC-2018-TR-104’, Systems Engineering Research Center, Hoboken, NJ (US).
<https://apps.dtic.mil/sti/pdfs/AD1048003.pdf>
SAE 2016, ‘AS9145: APQP & PPAP Requirements for Aerospace and Defense’, SAE
International, Warrendale, PA (US).
<https://www.sae.org/standards/content/as9145/>
Schindel, W 2013, ‘Systems of Innovation II: The Emergence of Purpose’, in Proc. of 2013
International Symposium on Systems Engineering, International Council on Systems
Engineering, San Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:systems_of_innovation--the_emergence_of_purpose_v1.3.6.pdf>
——— 2017a, ‘MBSE Maturity Assessment: Related INCOSE & ASME Efforts, and ISO 15288’,
presented at MBSE World Symposium, No Magic, Inc., Allen, TX (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:model_based_maturitiy_planning_asme_incose_may_2017.pdf>
——— 2017b, ‘Innovation, Risk, Agility, and Learning, Viewed as Optimal Control &
Estimation’, in Proc. of 2017 International Symposium on Systems Engineering,
International Council on Systems Engineering, San Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:risk_and_agility_as_optimal_control_and_estimation_v1.7.2.pdf>
——— 2020, ‘SE Foundation Elements: Implications for Future SE Practice, Education,
Research’, P21-22, INCOSE Vision 2035 Project, International Council on Systems
Engineering, San Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:science_math_foundations_for_systems_and_systems_engineering--1_hr_awareness_v2.3.2a.pdf>
——— 2021, ‘Variational Forces of Modularity: Coupled Macro and Micro Patterns in the
Innovation Ecosystem’, presented at PLE Momentum 2021 Conference, Big Lever Inc,
Austin, TX (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:the_forces_of_modularity_v1.3.3.pdf>
Schindel, W & Dove, R 2016, ‘Introduction to the Agile Systems Engineering Life Cycle MBSE
Pattern’, in Proc. of 2016 International Symposium on Systems Engineering, International
Council on Systems Engineering, San Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:is2016_intro_to_the_aselcm_pattern_v1.4.8.pdf>
Schindel, W, Peffers, S, Hanson, J, Ahmed, J & Kline, W 2011, ‘All Innovation Is Innovation of
Systems: An Integrated 3-D Model of Innovation Competencies’, in Proc. of the 2011
Conference of the American Society for Engineering Education (ASEE), Vancouver, BC
(Canada).
Schindel, W & Peterson, T 2016, ‘Introduction to Pattern-Based Systems Engineering (PBSE):
Leveraging MBSE Techniques’, in Proc. of 2016 INCOSE Great Lakes Regional
Conference on Systems Engineering, INCOSE, San Diego, CA (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:pbse_tutorial_glrc_2016_v1.7.4.pdf>
Schindel, W & Seidman, M 2021, ‘Applying Digital Thread Across the Product Life Cycle’,
presented at Indiana Defense Network Technical Interchange Meeting, June 9-10, 2021,
Indianapolis, IN (US).
<https://www.omgwiki.org/MBSE/lib/exe/fetch.php?media=mbse:patterns:team_top_gun_idn_presentation_06.09.2021_v2.1.1.pdf>
Walden, D (ed.) 2015, Systems Engineering Handbook: A Guide for System Life Cycle Processes
and Activities, 4th edn, revised by D Walden, G Roedler, K Forsberg, R Hamelin & T
Shortell, INCOSE, San Diego, CA (US).

Biography
William D. (Bill) Schindel is president of ICTT System Sciences. His engineering career began
in mil/aero systems with IBM Federal Systems, and included faculty service at Rose-Hulman
Institute of Technology and the founding of three systems enterprises. He is an INCOSE Fellow,
chair of the MBSE Patterns Working Group of the INCOSE/OMG MBSE Initiative, and was a
member of the lead team of the INCOSE Agile Systems Engineering Life Cycle Discovery
Project. Bill co-led a 2013 project on Systems of Innovation in the INCOSE System Science
Working Group.
