Integration TechTalk Session 1 - Final (Publish)
Co-Presenters/Q&A:
Akshar Singh, Sr. Solution Architect
Michele Mazzucco, Sr. Solution Architect
TechTalk Series
• Session 1: Introduction & concepts – October 2, 2023
• Session 2: Integration patterns for Dataverse – November 4, 2023
• Session 3: Integration patterns for Finance and Operations applications – November 11, 2023
• Session 4: Complex integration scenarios – December 4, 2023
Agenda
• Introduction
• Fundamental principles
• Integration patterns
• Common applications capabilities
• Common Azure integration tools
• Security
• Roadmap
• Q/A
Introduction
Dataverse and Dynamics 365 applications provide a rich set of integration options to address different business and technical scenarios. The different approaches allow for a flexible design to increase automation, improve process optimization, reduce costs, and increase security.
At a very high level, the integration options can be grouped by the application they can be used for:
- Dynamics 365 Finance and Operations apps (e.g., DMF)
- Dynamics 365 Customer Engagement / Dataverse (e.g., plug-ins)
- Both (e.g., events)
Another popular categorization is based on the flow direction (inbound vs. outbound) or the decoupling pattern (synchronous vs. asynchronous).
The scope of this presentation is Dataverse and Dynamics 365 apps. We will explore the integration options in more detail in the following slides.
Integration components - Overview
Event Grid, Service Bus, WebJobs, Data flow, Azure Functions, Data Factory, Logic Apps, API Management, Power Automate
Factors to consider:
• Business requirements
• Technical requirements/limitations
• Performance requirements
• Security/regulatory requirements
• Existing strategy
Align with the company’s integration strategy and always keep the bigger picture in mind.
• Consider the company’s integration platform
• Stay aligned with modern cloud integration approaches
• Consider existing patterns/tools and middleware
• Every critical component must have high availability
• Monitoring should be an integral part of the design
• Error handling must be designed, tested, and documented
• Notification should be considered, especially for unattended systems
Simplification
Keep it as simple as reasonably possible
• Use low-code/no-code capabilities from Power Platform and Microsoft Azure
• Define uniform integration patterns
• Consider using a middleware
• Centralize error handling and notification systems
Scale
Build for growth
• Scalability and extensibility: consider the impact of additional requirements in the long term and the expansion effort required
• Consider the impact and cadence of updates
• Selected components should support ALM tools such as Azure DevOps
• Consider parallelism to overcome latency and service limits
Integration patterns
Factors to consider when choosing a pattern

Latency
• Synchronous: Integration is triggered with an immediate response required.
• Asynchronous: Integration is triggered with a delayed response required.
  – Near Real-Time: Minimal latency (<1 min) is allowed between trigger and transmission.
  – Scheduled Batch: Integration will occur on a scheduled basis with a pre-determined recurrence.

Message Routing
• Point-to-Point: Dynamics 365 to a legacy system.
• Enterprise Service Bus: BizTalk Server, Azure Integration Services, etc.
• Broker, Hub & Spoke, or Extract, Transform, Load (ETL): Azure Synapse, Azure Data Factory, SQL Server Integration Services, Event Grid, etc.

Frequency
• Integration requests can be classified into the following frequencies:
  – High: seconds or minutes
  – Medium: hours or days
  – Low: weeks or months

Trigger
• On-Demand/Manual: Integration is manually initiated by either an end user or an IT user.
• Event Triggered: Integration is triggered based on an event or condition in the source or destination system.
• Time/Date Scheduled: Integration is triggered based on a pre-determined schedule.
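The latency distinction above can be sketched in a few lines of Python (an illustrative stand-in, not a Dynamics API): a synchronous integration blocks the caller until the response comes back, while an asynchronous one enqueues the request and lets a worker process it later.

```python
import queue
import threading

def process(record):
    """Stand-in for the downstream integration endpoint (hypothetical)."""
    return {**record, "status": "processed"}

# Synchronous: the caller blocks and gets an immediate response.
def sync_integration(record):
    return process(record)

# Asynchronous: the caller enqueues the request and continues;
# a worker drains the queue and processes records with a delay.
work_queue = queue.Queue()
results = []

def worker():
    while True:
        record = work_queue.get()
        if record is None:            # sentinel: stop the worker
            break
        results.append(process(record))

t = threading.Thread(target=worker)
t.start()
work_queue.put({"id": 1})             # returns immediately, no response yet
work_queue.put(None)
t.join()                              # by now the worker has processed it
```

The same trade-off applies at system scale: synchronous patterns give immediate feedback but couple the caller to the endpoint's availability, while queued patterns absorb spikes at the cost of latency.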
Factors to consider when choosing a pattern

Interaction/Operation
• Create: Record will be created in the destination system.
• Read: Integration will query the source and return a specifically requested piece of data.
• Update: Integration will update an existing record in the destination system.
• Delete: Results in the deletion of a record in the destination system.
• Action: Integration will trigger a system event, e.g., calculate sales order price.

Batching
• Un-batched: Individual records are sent in the integration request.
• Batched: Records are consolidated for transmission through the integration request.
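The batching trade-off can be illustrated with a toy request counter (the `send` function is a hypothetical stand-in, not a real endpoint): un-batched, every record costs one integration request; batched, records are consolidated so the request count drops.

```python
requests_sent = 0

def send(payload):
    """Stand-in for one integration request (hypothetical endpoint)."""
    global requests_sent
    requests_sent += 1

records = [{"id": i} for i in range(10)]

# Un-batched: one request per record -> 10 requests.
for record in records:
    send(record)
unbatched_cost = requests_sent

# Batched: consolidate records into chunks of 5 -> 2 requests.
def chunks(items, size):
    for i in range(0, len(items), size):
        yield items[i:i + size]

for batch in chunks(records, 5):
    send(batch)
batched_cost = requests_sent - unbatched_cost
```

Fewer round trips means less per-request overhead and fewer throttling events, at the cost of coarser error granularity within a batch.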
Pattern | App | Direction | Mode | Operations | Batching | Volume | Error handling | Description
Recurring Integration | FO | Inbound/Outbound | Asynchronous | CRU | Y | High | Yes | High volume asynchronous import/export.
Business/Data Events | FO, DV | Outbound | Asynchronous | R | N | High | Yes | High volume status event notifications to subscribers, workflows, and outbound integrations.
Synapse Link | FO, DV | Outbound | Asynchronous | R | Y | High | Log | High volume data integration for analytics.
Virtual table | FO, DV | Inbound/Outbound | Synchronous | CRUD | N | N/A | No | Integration of data residing in external systems without data replication.
SQL/TDS endpoint | DV | Outbound | Synchronous | R | N | Low/Medium | No | Read-only access, respects Dataverse security. Should be used for analytics with Power BI.
Plug-in | DV | Outbound | Synchronous/Asynchronous | CRUD | Y | Low/Medium (sync), Medium/High (async) | Yes | Event handler that executes in response to a specific event raised during processing of a Dataverse data operation. In sync mode, it executes as part of the database transaction.
Webhook | DV | Outbound | Synchronous & Asynchronous | CUD | N | Low/Medium | Yes | Sends POST requests with a JSON payload to an external service.
Legend
Volume: Low – 0 to thousands; Medium – thousands to tens of thousands; High – hundreds of thousands to millions.
Operations: C – Create, R – Read, U – Update, D – Delete, A – Action.
Batching available: the endpoint can be used on a set of records.
Error handling: No – errors are raised but not saved; Yes – errors are captured and saved; Log – errors are logged; Extensible – third-party tools or internal extension tools can be used for handling errors.
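To make the webhook row concrete, here is a minimal sketch of the receiving side using only the Python standard library: the platform delivers events as HTTP POSTs with a JSON body, and the external service parses and acknowledges them. The payload shape below is invented for illustration; a real Dataverse webhook body carries the serialized execution context.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

received = []

class WebhookHandler(BaseHTTPRequestHandler):
    """Minimal receiver for webhook-style POSTs with a JSON body."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        received.append(json.loads(self.rfile.read(length)))
        self.send_response(200)       # acknowledge delivery
        self.end_headers()
    def log_message(self, *args):     # silence default request logging
        pass

server = HTTPServer(("127.0.0.1", 0), WebhookHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate the platform delivering an event (hypothetical payload shape).
body = json.dumps({"entity": "account", "operation": "Update"}).encode()
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/", data=body,
    headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req):
    pass
server.shutdown()
```

In production the receiver would also validate the caller (webhook authentication key or header) before trusting the payload.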
Common apps capabilities
Dual-write
Provides tightly coupled, bidirectional integration between Dataverse and finance and operations apps.
• Low volume
• Outbound/Inbound, Synchronous/Asynchronous
• Tightly coupled, bi-directional integration for master and reference data.
• Data replication across Dataverse and finance and operations apps for Create and Update events. Play, pause, and catch-up modes are also available to support the system during online and offline/asynchronous modes.
• Dual-write initial sync can be leveraged as a data migration tool (see the guidance matrix).
• Dual-write live sync works in the context of one transaction; with high numbers of cascading records (e.g., customers with multiple related addresses), performance testing is recommended.
OData
OData endpoints are available in both Dataverse and finance and operations apps.
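As a sketch of how such an endpoint is queried, the helper below composes a Dataverse Web API (OData v4) URL with common query options. The environment URL is hypothetical, and a real call must carry an `Authorization: Bearer <token>` header obtained from Microsoft Entra ID.

```python
from urllib.parse import urlencode

def build_odata_query(base_url, entity_set, select=None, filter_=None, top=None):
    """Compose a Dataverse Web API (OData v4) query URL."""
    options = {}
    if select:
        options["$select"] = ",".join(select)
    if filter_:
        options["$filter"] = filter_
    if top:
        options["$top"] = str(top)
    url = f"{base_url}/api/data/v9.2/{entity_set}"
    # keep '$' literal so the options read like standard OData
    return f"{url}?{urlencode(options, safe='$')}" if options else url

url = build_odata_query(
    "https://contoso.crm.dynamics.com",   # hypothetical environment URL
    "accounts",
    select=["name", "accountnumber"],
    filter_="statecode eq 0",
    top=10,
)
# Send the URL with any HTTP client, adding the OAuth bearer token header.
```

Finance and operations apps expose a similar OData surface under their own root path, so the same query-option pattern applies there too.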
Common technologies: Azure Service Bus

What is it?
• Decouple your applications from each other
• Distribute messages to multiple independent back-end systems
• Protect your application from temporary spikes in traffic
• Scale out ordered messaging to multiple readers
• Connect your existing on-premises systems to cloud solutions

How to start?
• Introduction to Azure Service Bus
• Configure Microsoft Azure (SAS) for integration
• Integrate Microsoft Dataverse Azure solutions - Training | Microsoft Learn
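The decoupling that Service Bus topics provide can be sketched with an in-memory stand-in (a toy model, not the azure-servicebus SDK): each subscription receives its own copy of every published message, so independent back-end systems never talk to the publisher directly.

```python
from collections import defaultdict

class InMemoryTopic:
    """Toy topic/subscription model mirroring Service Bus semantics:
    each subscription gets its own copy of every published message."""
    def __init__(self):
        self.subscriptions = defaultdict(list)

    def subscribe(self, name):
        self.subscriptions[name]          # create the subscription queue

    def publish(self, message):
        for q in self.subscriptions.values():
            q.append(message)             # fan out to every subscription

    def receive(self, name):
        q = self.subscriptions[name]
        return q.pop(0) if q else None

topic = InMemoryTopic()
topic.subscribe("billing")
topic.subscribe("warehouse")
topic.publish({"event": "SalesOrderConfirmed", "orderId": 42})
# Both back-end systems now receive the same event independently.
```

The real service adds durability, ordering, sessions, and dead-lettering on top of this basic fan-out model; the quick-start links above cover the actual SDK and configuration.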
Common technologies : Azure Logic Apps
Common technologies: Azure API Management

How to start?
• Export APIs from Azure API Management to Microsoft Power Platform | Microsoft Learn
• Enable CORS policies for Azure API Management custom connector | Microsoft Learn
Poll
Are you familiar with the previous tools?
https://forms.office.com/r/46iEFTXp4h
Integration – Security
General guidance
Authentication
Authorization
External Security
• Encrypted channels
• External tools with dedicated security
• Hybrid security (On Premises Data Gateway)
Compliance
• Regulatory requirements
• Internal policies
Roadmap
Virtual Entities (Finance and Operations applications):
• Performance enhancements
• Simplified deployment experience
• Improved development experience

Synapse Link:
• Simplified deployment experience
• Simplified, tighter integration with Microsoft Fabric
• Reduce the time to initialize a table (parallel initialization)
• Enable incremental folders (change feeds)
• Enable Managed Lake for F&O tables
• Enum numeric values

Dual-write:
• Asynchronous processing for dual-write maps
• General quality improvements

Plug-ins:
• Network isolation to connect to private endpoint-enabled resources in Azure or resources within their network.
Resources
• Learning path
• Technical Talks
Q/A
Q: In terms of centralized monitoring, can we expect that in the future all Dynamics 365 standard integration patterns will drop a relevant status in, for example, Azure Application Insights?
A: Microsoft is taking initiatives to extend the use of Application Insights and increase the out-of-the-box options in the applications. For example, you can already see this happening in Warehouse management: https://learn.microsoft.com/en-us/dynamics365/supply-chain/warehousing/application-insights-monitor-usage-performance.
This effort will continue, as we want to give the same level of service to all the applications and tools, including integration scenarios; however, we don’t have an ETA for that yet.
Q: Why is licensing never among the factors to consider when choosing an integration platform?
A:We included the licensing in the larger "cost" factor (which should include other costs as well). This is
fundamental for the upfront evaluation of the optimal design.
Please also consider that some of the components required by integration patterns may have a "pay as you
go" licensing model, so costs will also vary depending on actual transaction volumes and cannot be inferred
without analysing the actual requirements.
Q: Is it true that Microsoft will eventually replace dual-write with other means of integration?
A: No, there is no such plan for the foreseeable future.
Q: When are you planning to replace all these connectors for D365 with one and only one connector?
A: Microsoft is looking to simplify the integration landscape as much as possible, but there is no plan for a unified single connector at the moment. Any suggestions from the community to simplify and optimize the integration landscape are very welcome.
Q: Why are there no direct options for JSON (as a source data format) using integration patterns like recurring or the data management API (in FO)? Does Microsoft have any plans?
A: There are currently no plans to directly support the JSON format in the DMF asynchronous patterns. Those patterns were designed to deal with the most common file formats used in customer scenarios (XML, CSV, plain text, etc.). If JSON is required, it can easily be converted from/into XML. One simple approach is to use Logic Apps, which provides the functions xml() and json() to parse and convert between formats. Logic Apps also provides easy ways to contact DMF endpoints.
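As an illustration of the conversion the answer mentions, here is a stdlib Python sketch that mirrors in spirit what the Logic Apps xml() function does for a flat JSON object (simplified: no nesting, arrays, or attributes; the payload field names are invented):

```python
import json
import xml.etree.ElementTree as ET

def json_to_xml(obj, root_tag="root"):
    """Convert a flat JSON object to an XML string (flat dicts only)."""
    root = ET.Element(root_tag)
    for key, value in obj.items():
        child = ET.SubElement(root, key)   # one element per JSON property
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

payload = json.loads('{"CustomerId": "C001", "Amount": 250}')
xml_doc = json_to_xml(payload, root_tag="Invoice")
# -> <Invoice><CustomerId>C001</CustomerId><Amount>250</Amount></Invoice>
```

Inside a Logic Apps workflow the equivalent is a one-line expression rather than custom code, which is why it is the suggested approach for DMF scenarios.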
Q: Is there any plan for D365FO to have native support for external event consumption?
A: A simplified integration with the Azure eventing platform is part of the overall considerations for the Dynamics 365 and Dataverse integration optimization. At the moment, no plan has been created to provide this out-of-the-box capability.
Q: With the One Dynamics One Platform vision, if I can handle an integration on the F&O side or the Dataverse side, which one is more advisable?
A: As in many of these cases, there is no general answer. The ODOP vision doesn’t dictate how to deal with the current integration options; it’s rather a north star for how we are going to shape future features. Your answer will depend on the requirements. It is very rare that Dataverse and FO integrations are perfectly equivalent. At the very least, the business process involved tends to be focused more on one of the applications, which suggests where to integrate.
Q: If I have a use case that can be fulfilled by X++ and Power Platform, which one do you recommend?
A: In general, we advise prioritizing the low-code/no-code approach when the solutions are completely equivalent. Low-code/no-code is generally easier to create and maintain. That said, as described in the presentation, many factors and requirements must be considered before a final decision. Even a single detail could change the pattern of choice significantly.
Q: Will all integrations with FnO eventually go through Dataverse? My understanding is that eventually FnO will use Dataverse as its 'database'?
A: We have no plan to remove the Dynamics 365 Finance and Operations endpoints in the short/mid term. The long-term plan is to add more common, simplified ways to create integrations with Dynamics 365 and Dataverse without having to worry about the platform. Any sunsetting of existing endpoints would eventually be a very long-term scenario, and it would be slowly implemented.
Q: Are you giving "Export to Data Lake" on the F&O side and "Synapse Link for Dataverse" a new name, "Synapse Link for D365"?
A: No, the official name of the feature is “Azure Synapse Link for Dataverse”; we have only extended it to include finance and operations apps data. Export to Data Lake is a similar but separate feature available in finance and operations apps. We recommend that new implementations focus on Synapse Link and that customers already on Export to Data Lake start looking into the migration patterns to Synapse Link. Please follow the dedicated Yammer group for more information: https://www.yammer.com/dynamicsaxfeedbackprograms/#/threads/inGroup?type=in_group&feedId=32768909312
Q: How do I move a business event created in F&O from one environment to another environment?
A: Copying business events from one environment to another is neither possible nor advisable. Even when a database is moved between environments, event endpoints get disabled to prevent any unwanted external connections (for example, events from test environments could be intercepted by production listeners and vice versa). Business events must be bound and activated per environment.
Q: Do CUD operations via virtual tables require a license? Do I get into multiplexing issues?
A: Yes, any interaction with Dynamics 365 applications always requires licenses. This is not a multiplexing scenario, but the users are directly working with finance and operations data, so they need to be a user and have a security role in finance and operations. The exact license depends on the role assigned. For more information, download the most recent licensing guide here: https://www.microsoft.com/en-us/licensing/product-licensing/dynamics365
Q: What's the difference between a webhook and a plugin that sends a payload to an external HTTP endpoint, and which one should I use?
A: Plugins are a very specific concept of Dataverse, while webhooks can be seen as a more generic HTTP callback approach. A webhook can be considered a more lightweight approach; however, it is limited to the context of the event, while a plugin can execute additional operations through the provided SDK. Plugins also allow you to interact with pre-event stages. If you need to execute additional logic, such as execution context transformations or validations, before calling the external endpoint, a plugin works better. Also consider factors such as the security of your external endpoint: is it compatible with the plugin and webhook authentication options?
Q: Should we anticipate enhancements around the Field Service area of dual-write?
A: Yes, there is a roadmap to integrate Field Service and finance and operations apps built on top of the dual-write framework. There is no ETA yet.
Q: When considering integrations between MS tools, I am deliberating over whether to prioritize the control and centralization provided by Azure Durable Functions or to opt for the visual tracking of low code using multiple interconnected Logic Apps, despite the constraints in code control. Given the necessity to frequently modify and deploy components across DEV, UAT, and PROD, could you provide some insights or recommendations on which approach would be more beneficial in terms of scalability, maintainability, and overall efficiency in the development lifecycle?
A: Azure Durable Functions and Logic Apps have different strengths and weaknesses that come from their fundamental purpose. Azure Functions is a serverless compute service, whereas Azure Logic Apps is a serverless workflow integration platform. Simply put, Azure Durable Functions is an enhancement of Azure Functions that expands the original purpose to include orchestration capabilities. Even so, Azure Functions remains a serverless solution for building applications and services by writing code. Logic Apps' main purpose instead is workflows and orchestration, while providing some programmability with the low-code approach. The best route is usually a combination of the two, where Logic Apps handles overall orchestration and simple transformations while delegating complex business logic to specialized tools, including services developed on Azure Functions. Durable Functions can be used to further optimize the computational patterns (e.g., using fan-out patterns). From an ALM perspective, a mixed approach would probably guarantee the best results: Logic Apps would create the more stable and easier-to-manage framework that orchestrates the flows, while the detailed business logic can be created and handled using standard Azure Functions ALM. Logic Apps can also be changed and managed focusing on the logical workflow, without involving developers, while the application contents must be carefully designed by experienced developers. Of course, there are many more considerations to keep in mind when you choose the integration architecture, including the requirements collected, the current integration scenarios, the company's integration plans, etc. What would be more beneficial cannot be answered by looking only at the technology. Still, you can refer to the documentation for an immediate comparison: https://learn.microsoft.com/en-us/azure/azure-functions/functions-compare-logic-apps-ms-flow-webjobs#compare-azure-functions-and-azure-logic-apps
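The fan-out pattern mentioned in the answer can be sketched with plain Python threads (a conceptual stand-in for a Durable Functions orchestrator calling activity functions, not the Durable SDK itself): work is split into chunks, processed in parallel, and the partial results are fanned back in.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    """Stand-in activity function (hypothetical): process one slice of work."""
    return sum(chunk)

data = list(range(100))
chunks = [data[i:i + 25] for i in range(0, len(data), 25)]

# Fan-out: start all activities in parallel; fan-in: aggregate the results.
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(process_chunk, chunks))
total = sum(partials)
```

In Durable Functions the orchestrator plays the role of the executor here, with the added benefits of checkpointing and replay, so a crash mid-fan-out does not lose completed activities.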
Q: How can we restrict access to an individual custom API or OData endpoint? For example: expose some endpoints to one third-party application and restrict access to other endpoints.
A: Both Dataverse and Dynamics 365 F&O apps are built on and depend on Microsoft Entra ID (Azure AD) for identification and authentication. For access control, features such as single sign-on, multi-factor authentication, and conditional access can be used. Secondly, any third-party app that needs to access Dataverse or Dynamics 365 F&O apps endpoints after Azure AD authentication will need to be authorized; for this, an app user with the appropriate security role will need to be created in Dataverse or Dynamics 365 F&O, depending on your scenario.
Custom APIs for Dataverse, besides RBAC for the app user, can also have an additional required privilege. Learn more: https://learn.microsoft.com/en-us/power-apps/developer/data-platform/custom-api#secure-your-custom-api-with-a-privilege.
In addition to Microsoft Entra ID and platform RBAC security, Dataverse as a platform offers network isolation features such as the IP firewall, which is currently in preview: https://learn.microsoft.com/en-us/power-platform/admin/ip-firewall
Q: Why is network isolation only for plugins? It should be available for any outbound calls on both platforms, including business events.
A: The teams started with plugins as they are the most used connectors. The plan is to expand to more areas soon.
Q: When should we use the data package API vs. recurring integrations? I heard there are parallel execution issues with the data package API; can you comment on those?
A: The differences between the package and recurring APIs are described here: https://learn.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/data-management-api#choosing-an-integration-api.
For parallel package execution, please make use of the Enhanced parallel package import option from the Framework parameters > Compatibility options tab > Data project and job compatibility option. Please find more details here: https://learn.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/data-para
Dankie Faleminderit Shukran Chnorakaloutioun Hvala Blagodaria
Thank you!
감사합니다 Paldies Choukrane Ačiū Благодарам ありがとうございました
Ďakujem Tack Nandri Kop khun Teşekkür ederim Дякую Xвала Diolch