D365FO Interview Questions
LCS:
What are the main components and tools available within LCS, and how do they
support project management?
Can you explain the process of data migration using LCS tools? What factors
influence your migration strategy?
How would you troubleshoot a failing deployment using LCS, and what best
practices would you follow for issue resolution?
Environment Metrics: Displays key metrics like CPU usage, memory consumption,
SQL query performance, and disk I/O.
Activity Monitoring: Monitors user activity, including long-running processes and peak
usage times, to identify potential performance bottlenecks.
Telemetry: Collects data on various events and processes running within the
environment. This data is invaluable for diagnosing errors and performance issues.
SQL Insights: Allows you to review the performance of SQL queries, identify slow
queries, and optimize database performance.
Raw Logs: Provides raw telemetry logs and error reports that administrators can use to
perform detailed troubleshooting.
Run History: Logic Apps provide a detailed run history for each workflow
execution. This includes information about each trigger and action, execution
time, and the data passed between steps.
Diagnostic Logs: You can enable diagnostic logs and send them to Azure
Monitor, Application Insights, or Azure Log Analytics for deeper insights and
custom alerting.
Resubmitting Failed Runs: In case of failures, you can manually resubmit a
failed run after fixing the underlying issue, avoiding the need to rerun the
entire workflow.
Batching: Logic Apps allow you to process data in batches, reducing the load
on the D365 FO instance.
Paging: For large datasets, the Logic App can retrieve data in pages, processing a limited number of records in each batch to avoid timeouts (see the OData paging example after this list).
Chunking Data: When posting data to D365 FO or external systems,
splitting the data into smaller chunks can improve reliability and prevent
overloading the API.
Asynchronous Processing: For long-running processes, using asynchronous
workflows can help avoid timeouts and ensure that the data processing
happens in the background.
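A hedged illustration of the paging idea against a D365 FO OData entity (the entity name and page size are assumptions): the client requests fixed-size pages with $top and $skip, for example
http://[baseURI]/data/CustomersV3?$top=1000&$skip=0
http://[baseURI]/data/CustomersV3?$top=1000&$skip=1000
and keeps increasing $skip until a page comes back empty. Alternatively, the client can follow the @odata.nextLink value returned in each response page until no further link is returned.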
Upgrade
Explain the end-to-end process of upgrading from AX 2012 to Dynamics 365
FO. What are the key stages in this upgrade?
How do you handle historical data during an upgrade? Do you
migrate everything, or do you apply a data archival strategy?
What tools or methodologies do you use to ensure smooth data migration between
AX 2012 and Dynamics 365 FO?
How do you handle customizations in AX 2012 during an upgrade to Dynamics
365 FO?
How would you perform data validation post-upgrade to ensure that no data is lost
or corrupted during migration?
What is the role of extensions in Dynamics 365 FO, and how do you
convert AX 2012 overlayered code into D365 extensions?
Performance Optimization
Q: What are some key performance tuning techniques in D365 FO, especially
for batch jobs and reports?
Q: How do you diagnose and resolve SQL performance issues in D365 FO?
Alerts
How do change-based and due date alerts differ in their use cases and configuration?
Can you walk through the process of setting up an alert for a specific business event
(e.g., an inventory level falling below a threshold)?
How can you set up alerts for changes in custom fields, and what steps are
required to handle custom objects or extensions?
How do you configure and monitor batch jobs to ensure email alerts are
processed efficiently in D365 FO?
How do you handle scenarios where email batch jobs fail or emails are not
sent? What steps do you take to diagnose and resolve the issue?
Explain how you would send multilingual alert emails based on the
recipient’s language preferences.
Can you explain how you would troubleshoot a batch job for alerts that is
running but not sending emails? What are the key areas to investigate?
Business events
Business events in Dynamics 365 Finance and Operations (D365 FO) enable
systems and external applications to communicate and respond to changes or
specific conditions within D365 FO, often integrating with external services through
Azure, Power Automate, or other enterprise systems.
What are business events in Dynamics 365 FO, and how do they differ from
workflows and alerts?
What are the core components of a business event, and how do they interact
with external systems? – Using endpoints
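As a hedged sketch of those core components, the outline below shows a custom business event class and the pattern it follows; the class, label, and field names (MyOrderConfirmedBusinessEvent and its contract) are assumptions, not standard objects. The payload is defined by a companion contract class that extends BusinessEventsContract and exposes [DataMember] parm methods.

// Hypothetical sketch of a custom business event (all names are assumptions).
[BusinessEvents(classStr(MyOrderConfirmedBusinessEventContract),
    'Order confirmed', 'Raised when a sales order is confirmed', ModuleAxapta::SalesOrder)]
public final class MyOrderConfirmedBusinessEvent extends BusinessEventsBase
{
    private SalesTable salesTable;

    public static MyOrderConfirmedBusinessEvent newFromSalesTable(SalesTable _salesTable)
    {
        MyOrderConfirmedBusinessEvent businessEvent = new MyOrderConfirmedBusinessEvent();
        businessEvent.salesTable = _salesTable;
        return businessEvent;
    }

    // The contract becomes the payload that the configured endpoint receives.
    public BusinessEventsContract buildContract()
    {
        return MyOrderConfirmedBusinessEventContract::newFromSalesTable(salesTable);
    }
}

In this pattern the event is raised from the relevant business process, for example MyOrderConfirmedBusinessEvent::newFromSalesTable(salesTable).send(), and the activated endpoint (Azure Service Bus, Event Grid, Power Automate, and so on) receives the serialized contract.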
Change Tracking
What is change tracking in Dynamics 365 FO, and why is it important
in data synchronization scenarios?
Is it good to enable change tracking on all tables? What is the
impact?
Can you explain the underlying architecture of change tracking in D365 FO?
How is data tracked at the table level?
In a scenario where change tracking is enabled for a large number of tables,
how do you manage system performance and ensure that it doesn’t degrade?
DMF
Can you describe how DMF uses composite entities and the scenarios where
composite entities are beneficial?
How do you debug issues with data entities when data imports or
exports fail or when performance is suboptimal?
How do you configure a data project for a complex migration that involves
multiple entities, dependencies, and large data volumes?
Can you explain how you use staging tables in DMF during data imports and
exports? What are the key advantages of using staging tables?
What techniques do you use to optimize the performance of data
imports and exports in DMF, especially for high-volume transactions?
Use Batch Jobs: DMF allows data imports and exports to be processed in
batch jobs. This minimizes the impact on the front-end performance and
ensures that large datasets are processed asynchronously.
Parallel Processing: Configure the DMF to process data in parallel by
increasing the number of batch threads or parallel tasks. This allows
multiple records to be processed simultaneously, significantly reducing the
overall processing time.
Dual-write
Can you explain the architecture of dual-write in D365 FO and how it
integrates with Dynamics 365 Customer Engagement (CE) applications?
How do you configure dual-write for a new environment? Can you walk
through the process of enabling dual-write for D365 FO and CE?
How do you troubleshoot common dual-write errors like data sync
failures, misaligned field mappings, or performance slowdowns?
Batch Jobs
How do you configure batch job priorities and what impact do they
have on system performance?
How do you handle failed batch jobs? What steps do you take to investigate and resolve
issues?
What strategies do you use for managing long-running batch jobs, and how do you
identify potential bottlenecks?
What techniques do you employ to optimize the performance of batch jobs in
D365 FO, especially for large data sets?
How do you troubleshoot batch jobs that are stuck or taking longer than
expected to complete?
Upgrade AX 2009 to D365FO: Explain the steps.
ARM (Azure Resource Manager) helps you deploy, manage, and monitor all the resources for an application or solution as a group.
Microsoft 365 admin center – Microsoft 365 admin center is the subscription
management portal that Microsoft 365 provides for administrators. It's used to
provide management functions for users (Microsoft Entra ID) and subscriptions.
Power Platform admin center – Under Resources > Capacity you can see storage usage. If you want to see a breakdown of how much space each table or index occupies, request JIT access and then check directly in SQL.
If a table holds a large amount of data, run the cleanup routines to archive the data.
Data Entities:
Recently I came across a scenario in which I was required to modify the behavior of a data entity so that when an empty column is present in the source file, it is ignored, i.e., the existing value in that column of the record is kept (defaulted).
Ans: First, I suggested using the 'Ignore blank values' check box on the field-to-data source mapping.
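If the flag alone doesn't cover the requirement, the behavior can also be handled in code. The following is a minimal, hypothetical sketch (entity, data source, and field names are assumptions) of defaulting a blank incoming value from the record already in the database, inside the entity's mapEntityToDataSource override:

// Hypothetical override on a custom entity (MyCustomerEntity) built over CustTable.
public void mapEntityToDataSource(DataEntityRuntimeContext _entityCtx,
    DataEntityDataSourceRuntimeContext _dataSourceCtx)
{
    super(_entityCtx, _dataSourceCtx);

    if (_dataSourceCtx.name() == dataEntityDataSourceStr(MyCustomerEntity, CustTable))
    {
        CustTable existing = CustTable::find(this.CustomerAccount);

        // If the source file left the column blank, keep the value already on the record.
        if (!this.CreditLimit && existing.RecId)
        {
            this.CreditLimit = existing.CreditMax;
        }
    }
}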
LCS
What is LCS?
Lifecycle Services (LCS) for Microsoft Dynamics is a collaboration portal that
provides an environment and a set of regularly updated services that can
help you manage the application lifecycle of your implementations of finance
and operations apps.
If you pause updates to the production environment, all updates to other sandbox
environments are paused too.
Shared asset library – The Shared asset library is used by Microsoft and
Partners to share assets across multiple tenants, projects, and environments
in Lifecycle Services. This library can be accessed by any user who signs in
to Lifecycle Services.
Reporting:
SSRS:
A DP (Data Provider) class that will provide three data sets for the
SSRS report. The same class will also provide the information about
report parameters to the report’s RDL through an attribute that links a
DP class with the corresponding report Data Contract.
A Data Contract class that will define and carry the values of the
report parameters. We will have one hidden parameter: a sales
agreement’s RecId.
Three temporary tables to define and carry the report data. They
will be filled and exposed to the SSRS report by the DP class.
A Controller class that will handle the report dialog form, setting the
SSRS report design and the value of the hidden parameter.
A Data Contract class defines and stores the values of the report parameters.
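As a hedged sketch of that piece, a minimal Data Contract carrying the hidden sales agreement RecId parameter could look like the following (class and parameter names are assumptions); the DP class is then linked to it through the SRSReportParameterAttribute.

// Hypothetical data contract for the report's hidden parameter.
[DataContractAttribute]
class MySalesAgreementReportContract
{
    RecId salesAgreementRecId;

    [DataMemberAttribute('SalesAgreementRecId')]
    public RecId parmSalesAgreementRecId(RecId _salesAgreementRecId = salesAgreementRecId)
    {
        salesAgreementRecId = _salesAgreementRecId;
        return salesAgreementRecId;
    }
}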
Basic Authentication: You can include the username and password in the
request headers using the "Authorization" header.
OAuth 2.0: Postman has built-in OAuth 2.0 support, allowing you to
configure and authenticate using various OAuth flows, such as
Authorization Code or Client Credentials.
In Postman, the query parameters for a GET request are passed as part of the request URL.
What is the HTTP response code for a POST request with incorrect
parameters?
The correct response code for a request with incorrect parameters is 400
Bad Request
When you successfully create a resource using a POST or PUT request, the 201 Created status code denotes that the resource has been created. The response uses the Location header to return a link to the newly created resource.
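A hypothetical response that illustrates this behavior (the entity and key are made up):

HTTP/1.1 201 Created
Location: http://[baseURI]/data/Customers('US-001')
Content-Type: application/json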
a) Limit the number of data connections in the app. Connecting the same app to more than 30 data sources can increase the time it takes to load the app.
c) Using the Concurrent function to load data sources simultaneously can cut
the time it takes for an app to load data in half.
d) Using the Set function to avoid continually retrieving data from the source
can improve the performance if the data is likely to stay the same.
e) Delegating data processing to the data source can speed up the app's performance, because processing the data locally on the device demands more processing power, memory, etc.
f) Avoid repeating the same formula. Consider setting the formula once and
referencing the outcome of the first property in future ones if many
properties run the same formula.
What are the three different views in Power BI and
briefly explain each one.
Microsoft Power BI has three views, each of which is unique and serves a purpose.
a) Report View: This is for adding visualizations and building reports. This view also allows for publishing.
b) Data View: This is for viewing the data loaded into the model, for example to inspect tables and add calculated columns.
c) Model View: This is for viewing and managing the relationships between the tables in the data model.
Business events
Business events provide a mechanism that lets external systems receive
notifications from finance and operations apps.
A business action that a user performs can be either a workflow action or a non-
workflow action. Approval of a purchase requisition is an example of a workflow
action, whereas confirmation of a purchase order is an example of a non-workflow
action.
Prerequisites
Business events can be consumed using Microsoft Power Automate and
Azure messaging services.
Power Automate
Explain the different types of flow on Power Automate.
Business process flows: Guide users through a defined set of stages and steps so that work is completed consistently and produces the best output.
Cloud flows: Flows that run automatically, instantly, or on a schedule when their trigger fires.
Desktop flows: Robotic process automation (RPA) flows that automate repetitive web or desktop tasks.
How do you analyze a trace file? What do you check in the trace file?
A scheduling priority is defined for batch groups, but it can be overridden for
specific batch jobs. The scheduling priority classifications are used to declare
relative priorities, and to determine the processing order of jobs and business
processes. The available values for the scheduling priority
are Low, Normal, High, Critical, and Reserved capacity.
Batch concurrency control feature in Feature management. This feature lets you
set a limit on the number of tasks that can run concurrently in a specific batch job.
Therefore, it helps you prioritize your batch jobs and optimize the use of your
resources. For example, by limiting the number of tasks for a low-priority batch job,
you can avoid overloading the system and affecting the performance of other,
higher-priority batch jobs.
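As a hedged X++ sketch of where that task limit applies, the snippet below builds one batch job that contains ten tasks; MyBatchTask stands in for an assumed RunBaseBatch (or SysOperation) class, and with the batch concurrency control feature enabled only the configured number of these tasks run at the same time.

// Hypothetical: one batch job containing ten tasks.
BatchHeader batchHeader = BatchHeader::construct();
batchHeader.parmCaption('Nightly recalculation');

for (int i = 1; i <= 10; i++)
{
    batchHeader.addTask(MyBatchTask::construct()); // MyBatchTask is an assumed batch-enabled class
}

batchHeader.save();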
The example omits the logic to create a token, etc., for authorization, because the API in use is a public one and you don't need any authorization.
Base enums are a fixed set of values. In the database those values are saved as integers, but they also have a name (as referenced from X++ code) and a label (visible to users). You can have up to 255 values for base enums; the integers in the database take on the values 0 through 254.
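A small X++ illustration of those three aspects, using the standard NoYes base enum:

NoYes flag = NoYes::Yes;

info(strFmt("%1", enum2int(flag)));  // 1 – the integer value stored in the database
info(enum2Symbol(flag));             // 'Yes' – the element name referenced from X++ code
info(enum2str(flag));                // the label that users see on forms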
The AOT in D365FO apps contains many existing EDTs and base
enums that can be extended for use in your project, or you can
create new data types.
· Query
· Dialog, with persistence of the last values entered by the user
· Validate
What is an Entity Store? What is its significance in Dynamics 365 Finance and
Operations?
An Entity Store is a database that stores data from Dynamics 365 Finance
and Operations in a format that is optimized for reporting and analytics. The
Entity Store is updated regularly with new data from Finance and Operations,
so it is always up-to-date and can be used for reporting and analytics
purposes. The Entity Store is a key part of the Finance and Operations
platform, and it is used by many organizations to improve their reporting and
analytics capabilities.
Enable the integration for cloud-hosted
development environments
Because of the architecture differences between cloud-hosted development
environments and the sandbox or production environments, Power Platform
Integration can't be set up after the developer environment is created.
Therefore, you can set up Power Platform Integration only during the
deployment of your cloud-hosted environment. In addition, you can connect
a new cloud-hosted environment only with a new Power Platform
environment. You can't connect an existing environment of any type via
Lifecycle Services.
By default, OData returns only data that belongs to the user's default company. To see data from outside the user's default company, specify the ?cross-company=true query option. This option will return data from all companies that the user has access to.
Example: http://[baseURI]/data/FleetCustomers?cross-company=true
To filter by a particular company that isn't your default company, use the
following syntax:
http://[baseURI]/data/FleetCustomers?$filter=dataAreaId eq 'usrt'&cross-company=true
URL – Description
[Your organization's root URL]/data/Customers?$select=FirstName,LastName – List all the customers, but show only the first name and last name properties.
[Your organization's root URL]/data/Customers?$format=json – List all the customers in a JSON format that can be used to interact with JavaScript clients.
The OData protocol supports many similar filtering and querying options on
entities. For the full set of query options, see Windows Communication
Foundation.
Using Enums
Enums are under the namespace Microsoft.Dynamics.DataEntities. Enums can be included in an OData query by using the following syntax:
Microsoft.Dynamics.DataEntities.Gender'Unknown'
Microsoft.Dynamics.DataEntities.NoYes'Yes'
An example query for using the above enum values is shown below.
https://environment.cloud.onebox.dynamics.com/data/CustomersV3?$filter=PersonGender eq Microsoft.Dynamics.DataEntities.Gender'Unknown'
https://environment.cloud.onebox.dynamics.com/data/Currencies?$filter=ReferenceCurrencyForTriangulation eq Microsoft.Dynamics.DataEntities.NoYes'No'
To ensure that old data isn't inserted, a data mart reset can be started only
after existing tasks are completed. If you try to reset the data mart before all
tasks are completed, you might receive a message such as, "The data mart
reset was unable to be processed because of an active task. Please try again
later."
The USMF demo company has a Balance sheet report that not all Financial
reporting users should have access to. You can use tree security to restrict
access to a single report so that only specific users can access it.
1. Sign in to Financial Reporter Report Designer.
2. Select File > New > Tree Definition to create a new tree
definition.
3. Double-tap (or double-click) the Summary line in the Unit
Security column.
4. Select Users and Groups.
5. Select the users or groups that require access to the report.
6. Select Save.
7. In the report definition, add your new tree definition.
8. In the tree definition, select Setting. Then, under Reporting unit
selection, select Include all units.
Current rate – This type is typically used with balance sheet accounts.
It's usually known as the spot exchange rate and can be the rate on the
last day of the month or another predetermined date.
Average rate – This type is typically used with income statement
(profit/loss) accounts. You can set up the average rate to do either a
simple average or a weighted average.
Historical rate – This type is typically used with retained earnings,
property, plant and equipment, and equity accounts. These accounts
might be required, based on FASB or GAAP guidelines.
Dimension sets
When you run the year-end close, each dimension set balance is rebuilt. This
behavior has a direct impact on performance. Some organizations create
dimension sets unnecessarily, because they were used at one point or might
be used at some point. Because these unnecessary dimension sets are
rebuilt during the year-end close, time is added to the process. Take the time
to evaluate your dimension sets and delete any that are unnecessary.
The year-end close template lets organizations select the financial dimension
level to maintain when transferring profit and loss balances to retained
earnings. The settings allow an organization to maintain the detailed
financial dimensions (Close all) when moving the balances to retained
earnings or choose to summarize the amounts to a single dimension value
(Close single). This can be defined for each financial dimension. For more
information on these settings, see the Year-end close article.
Degenerate dimensions
SQL JOIN
A JOIN clause is used to combine rows from two or more tables, based on a
related column between them.
Orders (excerpt):
OrderID  CustomerID  OrderDate
10308    2           1996-09-18
10309    37          1996-09-19
10310    77          1996-09-20
Notice that the "CustomerID" column in the "Orders" table refers to the
"CustomerID" in the "Customers" table. The relationship between the two tables
above is the "CustomerID" column.
Then, we can create the following SQL statement (that contains an INNER JOIN),
that selects records that have matching values in both tables:
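A minimal version of that statement, assuming the standard Customers and Orders sample tables:

SELECT Orders.OrderID, Customers.CustomerName, Orders.OrderDate
FROM Orders
INNER JOIN Customers ON Orders.CustomerID = Customers.CustomerID;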
What is LCS?
DB Point In Time Restore
Package deployment (What are steps Involved? Sync Issues)
Can customizations and standard code be merged into a single package?
What are PQUs (Proactive Quality Updates)?
How do you use Trace Parser?
Why do we enable Power Platform integration in LCS?