D365FO Interview Questions

Uploaded by babu thum

Can you explain your experience with the Dynamics 365 Finance and Operations platform?

Use Debugging Tools: Dynamics 365 Finance and Operations provides various debugging tools that help me identify and resolve issues. For example, I use the "Debug" option to trace the execution of code and the "Call Stack" window to understand the sequence of method calls.

Use Logging and Tracing: I make use of logging and tracing in my code to log messages and trace the execution of code. This helps me identify any issues during testing and debugging.
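For example, a simple tracing pattern with the standard X++ Infolog API might look like this (a minimal sketch; the method name and selection logic are illustrative):

```xpp
// Minimal Infolog-based tracing sketch (method name and logic are illustrative).
public static void processCustomers()
{
    CustTable custTable;

    info('Starting customer processing'); // informational trace message

    try
    {
        while select AccountNum from custTable
        {
            // Trace each record to help pinpoint where a failure occurs
            info(strFmt('Processing customer %1', custTable.AccountNum));
        }
    }
    catch (Exception::Error)
    {
        error('Customer processing failed'); // error-level Infolog message
    }
}
```

The info(), warning(), and error() calls write to the Infolog, which makes it easy to see how far execution progressed before a failure.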

LCS:
 What are the main components and tools available within LCS, and how do they
support project management?

 Can you explain the process of data migration using LCS tools? What factors
influence your migration strategy?

 How would you troubleshoot a failing deployment using LCS, and what best
practices would you follow for issue resolution?

What are the key components of the LCS Environment Monitoring tool?
Answer:
The Environment Monitoring tool in LCS provides a real-time overview of system
performance and helps troubleshoot issues. Key components include:

 Environment Metrics: Displays key metrics like CPU usage, memory consumption,
SQL query performance, and disk I/O.
 Activity Monitoring: Monitors user activity, including long-running processes and peak
usage times, to identify potential performance bottlenecks.
 Telemetry: Collects data on various events and processes running within the
environment. This data is invaluable for diagnosing errors and performance issues.
 SQL Insights: Allows you to review the performance of SQL queries, identify slow
queries, and optimize database performance.
 Raw Logs: Provides raw telemetry logs and error reports that administrators can use to
perform detailed troubleshooting.

What are the different integration options available for Power Platform with D365 Finance and Operations?
Answer: Microsoft Power Platform can integrate with D365 F&O using several methods, including:

 Dual-write: Real-time bi-directional data synchronization between Dataverse (Power Platform) and D365 F&O, allowing shared data across applications.
 Power Automate (Flow): Automate workflows between F&O and other
applications or services using pre-built or custom connectors.
 Custom Connectors: Build custom APIs in F&O and expose them via Azure
API Management for use in Power Platform.
 Dataverse Integration: Leverage Dataverse as a common data service to
store and share data between Power Platform apps and F&O.
 Business Events: F&O can trigger business events that Power Automate or
Power Apps can respond to, allowing for event-driven integrations.
 OData API: Use F&O’s OData API to expose data entities for integration into
Power Apps or Power BI.
 Data Management Framework (DMF): Export or import large amounts of
data between F&O and Power Platform via DMF.

What is BYOD (Bring Your Own Database) in D365 F&O, and how does it enhance Power BI reporting?
Answer: BYOD in D365 F&O allows organizations to export data from D365 F&O to their
own Azure SQL Database for advanced reporting and integration purposes.
Explain the concept of dual-write in Power
Platform and how it integrates with Finance
and Operations apps.
Answer: Dual-write is a key integration technology that synchronizes data between Dynamics
365 Finance & Operations (F&O) and Dataverse, which is the foundation of the Power
Platform. It ensures data consistency across finance, operations, and other business applications
by providing real-time, bidirectional data flow between D365 F&O and Dataverse.

 Real-time Integration: Changes made in F&O apps (e.g., Customer data) are immediately reflected in Dataverse and vice versa.
 Entity Mapping: Pre-built entity maps (e.g., customers, vendors, products)
allow for quick setup. These maps can also be customized.

Setting up Data export from D365 FO using Synapse Link

Prerequisites for setting Synapse Link for D365 FO Data

 A D365FO environment that's on version update 10.0.34 (PU 58) or later
 Microsoft Power Platform integration for the D365FO environment
 The Sql row version change tracking configuration key enabled
 Access to an Azure subscription with the following resources provisioned:
 Azure Gen 2 Storage account in the same region as
the D365FO environment
 Azure Synapse Analytics workspace in the same
region as the D365FO environment
 Azure Synapse Spark pool with version 3.1 or later
How can you monitor and troubleshoot a
Logic App integrated with D365 FO?
Answer:

 Run History: Logic Apps provide a detailed run history for each workflow
execution. This includes information about each trigger and action, execution
time, and the data passed between steps.
 Diagnostic Logs: You can enable diagnostic logs and send them to Azure
Monitor, Application Insights, or Azure Log Analytics for deeper insights and
custom alerting.
 Resubmitting Failed Runs: In case of failures, you can manually resubmit a
failed run after fixing the underlying issue, avoiding the need to rerun the
entire workflow.

How do you handle large data volumes when integrating D365 FO with Logic Apps?
Answer: When dealing with large data volumes, performance optimization strategies include:

 Batching: Logic Apps allow you to process data in batches, reducing the load
on the D365 FO instance.
 Paging: For large datasets, the Logic App can retrieve data in pages,
processing a limited number of records in each batch to avoid timeouts.
 Chunking Data: When posting data to D365 FO or external systems,
splitting the data into smaller chunks can improve reliability and prevent
overloading the API.
 Asynchronous Processing: For long-running processes, using asynchronous
workflows can help avoid timeouts and ensure that the data processing
happens in the background.

Upgrade
 Explain the end-to-end process of upgrading from AX 2012 to Dynamics 365
FO. What are the key stages in this upgrade?
 How do you handle historical data during an upgrade? Do you
migrate everything, or do you apply a data archival strategy?

 What tools or methodologies do you use to ensure smooth data migration between
AX 2012 and Dynamics 365 FO?
 How do you handle customizations in AX 2012 during an upgrade to Dynamics
365 FO?

 How would you perform data validation post-upgrade to ensure that no data is lost
or corrupted during migration?

 What is the role of extensions in Dynamics 365 FO, and how do you
convert AX 2012 overlayered code into D365 extensions?

Security & Role-Based Access


 Q: How do you configure role-based security in D365 FO? Can you explain the
difference between roles, duties, and privileges?
 Q: How would you troubleshoot an issue where a user cannot access certain
data or forms in D365 FO?

Performance Optimization
 Q: What are some key performance tuning techniques in D365 FO, especially
for batch jobs and reports?
 Q: How do you diagnose and resolve SQL performance issues in D365 FO?

Alerts and emails


 How do alerts in D365 FO work at a high level? Can you explain the different types of
alerts that can be configured?

 Follow-up: How do change-based and due date alerts differ in their use cases and
configuration?

 Can you walk through the process of setting up an alert for a specific business event
(e.g., an inventory level falling below a threshold)?

 How can you set up alerts for changes in custom fields, and what steps are
required to handle custom objects or extensions?

 How do you configure and monitor batch jobs to ensure email alerts are
processed efficiently in D365 FO?

 How do you handle scenarios where email batch jobs fail or emails are not
sent? What steps do you take to diagnose and resolve the issue?
 Explain how you would send multilingual alert emails based on the
recipient’s language preferences.

 How do you integrate alerts with workflows in D365 FO to ensure that users receive notifications at critical steps in the workflow process?

 Can you explain how you would troubleshoot a batch job for alerts that is
running but not sending emails? What are the key areas to investigate?

Business events
Business events in Dynamics 365 Finance and Operations (D365 FO) enable
systems and external applications to communicate and respond to changes or
specific conditions within D365 FO, often integrating with external services through
Azure, Power Automate, or other enterprise systems.

 What are business events in Dynamics 365 FO, and how do they differ from
workflows and alerts?
 What are the core components of a business event, and how do they interact
with external systems? – Using endpoints

Change Tracking
 What is change tracking in Dynamics 365 FO, and why is it important
in data synchronization scenarios?
 Is it good to enable change tracking on all tables? What is the
impact?
 Can you explain the underlying architecture of change tracking in D365 FO?
How is data tracked at the table level?
 In a scenario where change tracking is enabled for a large number of tables,
how do you manage system performance and ensure that it doesn’t degrade?

DMF
 Can you describe how DMF uses composite entities and the scenarios where
composite entities are beneficial?
 How do you debug issues with data entities when data imports or
exports fail or when performance is suboptimal?
 How do you configure a data project for a complex migration that involves
multiple entities, dependencies, and large data volumes?
 Can you explain how you use staging tables in DMF during data imports and
exports? What are the key advantages of using staging tables?
 What techniques do you use to optimize the performance of data
imports and exports in DMF, especially for high-volume transactions?

Batch Processing and Parallelism:

 Use Batch Jobs: DMF allows data imports and exports to be processed in
batch jobs. This minimizes the impact on the front-end performance and
ensures that large datasets are processed asynchronously.
 Parallel Processing: Configure the DMF to process data in parallel by
increasing the number of batch threads or parallel tasks. This allows
multiple records to be processed simultaneously, significantly reducing the
overall processing time.

Dual-write
 Can you explain the architecture of dual-write in D365 FO and how it
integrates with Dynamics 365 Customer Engagement (CE) applications?
 How do you configure dual-write for a new environment? Can you walk
through the process of enabling dual-write for D365 FO and CE?
 How do you troubleshoot common dual-write errors like data sync
failures, misaligned field mappings, or performance slowdowns?

Batch Jobs
 How do you configure batch job priorities and what impact do they
have on system performance?

 How do you handle failed batch jobs? What steps do you take to investigate and resolve
issues?

 What strategies do you use for managing long-running batch jobs, and how do you
identify potential bottlenecks?
 What techniques do you employ to optimize the performance of batch jobs in
D365 FO, especially for large data sets?
 How do you troubleshoot batch jobs that are stuck or taking longer than
expected to complete?

Upgrade AX 2009 to D365FO: Explain the steps

DMF Parallel imports

What are data entities?

Walk through the steps to import or export data.

To speed up the import of data, parallel processing of importing a file can be enabled if the entity supports parallel imports.

Azure cloud services

What are the advantages of the Azure Resource Manager?

Azure Resource Manager enables users to manage their usage of application resources. A few of the advantages of Azure Resource Manager are:

 ARM helps deploy, manage and monitor all the resources for an
application, a solution or a group

 Users can be granted access to resources they require

 It obtains comprehensive billing information for all the resources in the group

 Provisioning resources is made much easier with the help of templates

To assign the Contributor role to the Dynamics Deployment Services [wsfed-enabled] application:

1. In the Azure portal, on the Subscription tab, select the Azure subscription, and then select the Access Control (IAM) line item.

2. Select Add, then select Add role assignment. In the dialog box, set Role to Contributor and set Assign access to to Microsoft Entra user, group, or service principal. In the Select field, search for and select Dynamics Deployment Services [wsfed-enabled]. Select Save.

Subscription – A subscription to finance and operations apps gives you an online cloud environment (or multiple environments) and experience.

Licenses – Customers must purchase subscription licenses (SLs) for
their organization, or for their affiliates' employees and on-site agents,
vendors, or contractors who directly or indirectly access finance and
operations apps. These apps are licensed through Microsoft Volume
Licensing and the Microsoft Cloud Solution Provider (CSP) program. For
more information, download the latest Microsoft Dynamics 365
Licensing Guide from Dynamics 365 pricing.

Tenant – In Microsoft Entra ID, a tenant represents
an organization. It's a dedicated instance of the Microsoft Entra service
that an organization receives and owns when it creates a relationship
with Microsoft (for example, by signing up for a Microsoft cloud service,
such as Azure, Microsoft Intune, or Microsoft 365). Every Microsoft
Entra tenant is distinct and separate from other Microsoft Entra
tenants. For more information about Microsoft Entra tenants, see How
to get a Microsoft Entra Tenant.

Microsoft 365 admin center – Microsoft 365 admin center is the subscription
management portal that Microsoft 365 provides for administrators. It's used to
provide management functions for users (Microsoft Entra ID) and subscriptions.

Where do we check our F&O storage consumption?

In the Power Platform admin center, under Resources > Capacity. To see a breakdown of how much space each table or index occupies, request JIT access and then check in SQL.

If a table holds excessive data, run the cleanup routines to archive or purge it.

Data Entities:

Recently I came across a scenario in which I was required to modify the behavior of a data entity so that when an empty column is present in the source file, it is ignored, i.e., the existing values for that column of the record are retained.

Ans: First, I suggested using the 'Ignore blank values' check on the field-to-datasource mapping.

LCS
What is LCS?
Lifecycle Services (LCS) for Microsoft Dynamics is a collaboration portal that
provides an environment and a set of regularly updated services that can
help you manage the application lifecycle of your implementations of finance
and operations apps.

LCS is available to customers and partners as part of their support plans.

What tools are provided in LCS?

Projects, Methodologies, Business process modeler, Cloud-hosted environments, Customization analysis, Issue search, Asset library, etc.

How to pause service updates through Lifecycle Services (LCS)?
As of February 19, 2024, the maximum number of consecutive updates that can be paused is reduced from three to one.

Who can pause service updates?
Only users (customers or partners) who are assigned to the project
owner role in Lifecycle Services can pause updates. Updates can be paused
only for implementation projects.

If you pause updates to the production environment, all updates to other sandbox
environments are paused too.

Can I pause updates to my additional sandbox environments only?
No, you can't pause updates to additional sandbox environments only.

What if the update to the default sandbox environment is paused?
If the update to the default sandbox environment is paused, the updates to
the production environment and all additional sandbox environments are
also paused.

Batch jobs that are stuck in either an Executing or Canceling state and don't complete?
How to capture changes done on tables?
Enable Change Tracking

How to troubleshoot form or report slowness?
If anyone reports a batch job issue, what inputs do you collect to troubleshoot it?

What are cleanup routines in Dynamics 365 FO? When do we recommend them?

Notification clean up, Batch job history clean-up, and Clean up log.

Prerequisites for configuring D365 FO with Power BI?

 You must have Azure Active Directory (AAD) administrator privileges.
 You must be assigned to the System Administrator role in Dynamics 365 for Operations.
 You must have had at least one user sign in to PowerBI.com before.

How do I deploy Power BI files in Dynamics 365?

Navigate to System administration > Setup > Deploy Power BI files. Select your report and click Deploy Power BI report files.

Asset library in Lifecycle Services


The Asset library is a storage location for the various assets that are
associated with a tenant in Microsoft Dynamics 365 Lifecycle Services. Two
types of Asset library are available in Lifecycle Services: the Shared asset
library and the project-level Asset library.

Shared asset library – The Shared asset library is used by Microsoft and
Partners to share assets across multiple tenants, projects, and environments
in Lifecycle Services. This library can be accessed by any user who signs in
to Lifecycle Services.

Project-level Asset library – The project-level Asset library is used to
share assets across environments within a project in Lifecycle Services. This
library can be accessed by all users within a project.

Suppose you are working on any functionality/process and you see a performance issue, deadlocks, slow-running queries, etc. Where can you see this information, and from where will you monitor these?

Ans: Environment monitoring from LCS

Reporting:
SSRS :
 A DP (Data Provider) class that will provide three data sets for the
SSRS report. The same class will also provide the information about
report parameters to the report’s RDL through an attribute that links a
DP class with the corresponding report Data Contract.
 A Data Contract class that will define and carry the values of the
report parameters. We will have one hidden parameter: a sales
agreement’s RecId.
 Three temporary tables to define and carry the report data. They
will be filled and exposed to the SSRS report by the DP class.
 A Controller class that will handle the report dialog form, setting the
SSRS report design and the value of the hidden parameter.

How to create the Controller class?

A Controller class extends SrsReportRunController

How to Create the Data Contract class?

A Data Contract class defines and stores the values of the report parameters.

How to create the Data Provider class?

A Data Provider class should extend SRSReportDataProviderBase (or SrsReportDataProviderPreProcess for pre-processed reports) and implement at least the processReport() method.
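Putting the pieces together, a skeleton of the three classes might look like this (a hedged sketch; the names MyReportContract, MyReportDP, MyReportTmp, MyReport, and MyReportController are hypothetical, while the attributes and base classes are the standard SSRS framework ones):

```xpp
// Hypothetical Data Contract: defines and carries the report parameters.
[DataContract]
class MyReportContract
{
    RefRecId agreementRecId;

    [DataMember('AgreementRecId')]
    public RefRecId parmAgreementRecId(RefRecId _recId = agreementRecId)
    {
        agreementRecId = _recId;
        return agreementRecId;
    }
}

// Hypothetical Data Provider: linked to the contract, fills a temporary table.
[SRSReportParameterAttribute(classStr(MyReportContract))]
class MyReportDP extends SRSReportDataProviderBase
{
    MyReportTmp myReportTmp; // temporary table carrying the report data

    [SRSReportDataSetAttribute(tableStr(MyReportTmp))]
    public MyReportTmp getMyReportTmp()
    {
        select myReportTmp;
        return myReportTmp;
    }

    public void processReport()
    {
        MyReportContract contract = this.parmDataContract() as MyReportContract;

        // Fill the temporary table based on the hidden parameter
        myReportTmp.AgreementRecId = contract.parmAgreementRecId();
        myReportTmp.insert();
    }
}

// Hypothetical Controller: sets the design and runs the report.
class MyReportController extends SrsReportRunController
{
    public static void main(Args _args)
    {
        MyReportController controller = new MyReportController();
        controller.parmReportName(ssrsReportStr(MyReport, Design));
        controller.startOperation();
    }
}
```

The SRSReportParameterAttribute is what links the DP class to its Data Contract, and each SRSReportDataSetAttribute-decorated method exposes one data set to the report's RDL.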

What are the advantages of using DirectQuery in Power BI?
DirectQuery allows users to create reports that query data in real time from data
sources, ensuring that reports always reflect the most up-to-date information.

What are some best practices for optimizing report performance in Dynamics 365 Finance and Operations?

Best practices include optimizing data models, using appropriate visuals, minimizing the use of custom calculations, and considering data caching strategies.
How do you handle authentication and authorization in
Postman?
Postman provides multiple ways to handle authentication and authorization
in API requests. Some commonly used methods include:

 Basic Authentication: You can include the username and password in the
request headers using the "Authorization" header.

 Token-based Authentication: Postman allows you to include tokens (such as JWT or OAuth) in the request headers or as query parameters.
 API Key: If an API requires an API key, you can pass it as a request header
or query parameter.

 OAuth 2.0: Postman has built-in OAuth 2.0 support, allowing you to
configure and authenticate using various OAuth flows, such as
Authorization Code or Client Credentials.

 Custom Authentication: Postman's scripting capabilities enable you to implement custom authentication mechanisms by modifying request headers or using specialized libraries.

Where are query parameters stored in a GET request?

The query parameters for a GET request are appended to the request URL, which Postman saves.

What is the HTTP response code for a POST request with incorrect parameters?

The correct response code for a request with incorrect parameters is 400 Bad Request.

Define status code 201.

When you successfully create a resource using a POST or PUT request, status code 201 denotes that the resource has been created. The Location header returns a link to the newly created resource.

What are Postman methods?

Postman methods, also known as HTTP methods or HTTP verbs, represent the actions that can be performed on a resource through an API. Some common methods include:
 GET: Retrieves information from a specified resource.

 POST: Submits data to be processed by a specified resource.

 PUT: Updates a specified resource with new data.

 DELETE: Removes a specified resource.

 PATCH: Partially updates a specified resource.

What are some data sources that Power Apps supports?
SharePoint, Microsoft Excel, and Office 365 apps like Word, Excel, and
OneDrive are some of the most common data sources supported by Power
Apps. Apart from this, Power Platform applications also support SQL Server,
Dynamics 365, and other non-Microsoft CRM databases as data sources.

Suggest some ways to improve the performance and responsiveness of Power Apps.
The performance of Microsoft Power Apps can be improved in the following
ways:

a) Limit the data connection to the same app. Connecting the same app to
more than 30 sources can increase the time it takes to load the program.

b) Reduce the number of controls added to a single app. Power Apps creates an HTML document object model to render each control, and the more controls you include, the longer it takes to generate.

c) Using the Concurrent function to load data sources simultaneously can cut
the time it takes for an app to load data in half.

d) Using the Set function to avoid continually retrieving data from the source
can improve the performance if the data is likely to stay the same.

e) Delegating data processing to the data source can speed up the app's
performance, as retrieving the data from a local device can demand more
processing power, memory, etc.

f) Avoid repeating the same formula. Consider setting the formula once and
referencing the outcome of the first property in future ones if many
properties run the same formula.
What are the three different views in Power BI and
briefly explain each one.
Microsoft Power BI has three views, each one of which is unique and serves a
purpose.

a) Report View: This is for adding visualizations and building reports. This view also allows for publishing.

b) Data View: Query Editor tools can be used to quickly inspect and edit the data.

c) Model View: This view is used to manage the relationships between complex data sets.

Business events
Business events provide a mechanism that lets external systems receive
notifications from finance and operations apps.
A business action that a user performs can be either a workflow action or a non-
workflow action. Approval of a purchase requisition is an example of a workflow
action, whereas confirmation of a purchase order is an example of a non-workflow
action.

Prerequisites
Business events can be consumed using Microsoft Power Automate and
Azure messaging services.

Power Automate
Explain the different types of flow on Power Automate.

We can use three types of flows, they are:

 Business process flows: It helps users to get scheduled work done. Then, it
breaks the work into steps to get the best output.
 Cloud flows: Here, the flows are automatically executed by triggering up the
workflow in schedule or currently.
 Desktop flows: It is from a powerful platform that helps to automate web
tasks or desktop tasks

Dev and Customizations:

Difference between edit and display methods?

Ans: Display indicates that the method's return value is to be displayed on a form or a report. The value cannot be altered in the form or report.
Edit indicates that the method's return type is to be used to provide information for a field that is used in a form. The value in the field can be edited.
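As an illustration, a display method might look like this (a sketch; the method name and formatting are hypothetical):

```xpp
// Hypothetical display method, e.g. on CustTable or a table extension class.
// The returned value is shown read-only on a form or report.
public display Name customerDisplayName()
{
    return strFmt('%1 (%2)', this.name(), this.AccountNum);
}
```

Declaring the same signature with the edit keyword instead (plus a boolean "set" parameter and a value parameter) would make the field editable in the form.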

Why do we use virtual companies?
Ans: Virtual company accounts contain data in certain tables that are shared by
any number of company accounts. This allows users to post information in one
company that will be available to another company.

What is the sequence of events while a report is generated?

Ans: Init, Run, Prompt, Fetch, Send, Print.
How many types of data validation methods are written on
the table level?
Ans: validateField(), validateWrite(), validateDelete(), aosValidateDelete(), aosValidateInsert(), aosValidateRead(), aosValidateUpdate().
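For instance, a table-level validateField() override could look like this (a sketch; the table name MyTable and its Qty field are hypothetical):

```xpp
// Sketch of a validateField override on a hypothetical MyTable.
public boolean validateField(FieldId _fieldIdToCheck)
{
    boolean ret = super(_fieldIdToCheck);

    switch (_fieldIdToCheck)
    {
        case fieldNum(MyTable, Qty):
            if (this.Qty < 0)
            {
                // checkFailed() logs the message to the Infolog and returns false
                ret = checkFailed('Quantity cannot be negative.');
            }
            break;
    }

    return ret;
}
```

validateField() runs when a single field loses focus, whereas validateWrite() runs once before the whole record is saved.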

Is multiple inheritance possible or not? If not, how can we overcome that?

Ans: In X++, a new class can only extend one other class; multiple inheritance is not supported. If you extend a class, it inherits all the methods and variables in the parent class (the superclass).
We can use interfaces instead of multiple inheritance in AX.

Where is the best place to write code to perform a filter in a form?

Ans: Override executeQuery() on the FormDataSource, and call this method from the relevant control on the form's design (for example, from a filter field's modified() method).
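A sketch of this pattern (the CustTable data source, the CustGroup field, and the customerGroupFilter control value are all hypothetical choices for illustration):

```xpp
// Override executeQuery() on the form data source to apply the filter.
public void executeQuery()
{
    QueryBuildDataSource qbds = this.query().dataSourceTable(tableNum(CustTable));

    // findOrCreateRange reuses one range so repeated calls don't stack filters
    QueryBuildRange filterRange = SysQuery::findOrCreateRange(qbds, fieldNum(CustTable, CustGroup));
    filterRange.value(queryValue(customerGroupFilter)); // value taken from a form control

    super();
}
```

Calling executeQuery() again from the filter control's modified() method re-runs the query with the updated range.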

How to capture a trace file?

How to analyze the trace file? What we will check in the trace file?

Batch priority-based scheduling feature.

A scheduling priority is defined for batch groups, but it can be overridden for
specific batch jobs. The scheduling priority classifications are used to declare
relative priorities, and to determine the processing order of jobs and business
processes. The available values for the scheduling priority
are Low, Normal, High, Critical, and Reserved capacity.

Batch concurrency control feature in Feature management. This feature lets you
set a limit on the number of tasks that can run concurrently in a specific batch job.
Therefore, it helps you prioritize your batch jobs and optimize the use of your
resources. For example, by limiting the number of tasks for a low-priority batch job,
you can avoid overloading the system and affecting the performance of other,
higher-priority batch jobs.

Calling a REST API from D365 using X++

The sample method below uses the following classes:
 EMWebRequest class to create the request
 EMCommonWebAPI class to call the API
 EMWebResponse class to get the response
 FormJsonSerializer to convert the data contract classes to JSON

The example omits the logic to create a token for authorization, as the API in use is a public one and you don't need any authorization.

private static void callRestAPIPost()
{
    EMCommonWebAPI webAPI = EMCommonWebAPI::construct();

    // Prepare the request
    EMWebRequest webRequest = EMWebRequest::newUrl('https://api.restful-api.dev/objects');

    // Set the method to use
    webRequest.parmMethod('Post');

    System.Text.UTF8Encoding encoder = new System.Text.UTF8Encoding();

    // Initialize the contract to prepare the body
    NSHEContract contract = new NSHEContract();
    contract.parmName('NShe MacBook Pro 16');

    NSHEContractData contractData = new NSHEContractData();
    contractData.parmCPUModel('Fast model x');
    contractData.parmhardDiskSize('225 GB');
    contractData.parmPrice(2000);
    contractData.parmYear(2023);
    contract.parmData(contractData);

    // Convert the data contract to binary
    System.IO.MemoryStream stream = new System.IO.MemoryStream();
    System.Byte[] encodedBytes = encoder.GetBytes(FormJsonSerializer::serializeClass(contract));
    stream.Write(encodedBytes, 0, encodedBytes.get_Length());
    Binary content = Binary::constructFromMemoryStream(stream);
    stream.Close();

    // Set the body of the request
    webRequest.parmContent(content);
    webRequest.parmContentType('application/json');

    // Get the response by calling the API
    EMWebResponse webresponse = webAPI.getResponse(webRequest);
    if (webresponse && webresponse.RequestSucceeded())
    {
        info('Succeeded');
    }
}

What is the concept of extension in D365?

An extension is a way to add new functionality to an existing object in D365FO without modifying the base code of that object. Microsoft
has added the concept of extension because they don’t want to
modify the code base anymore. In this way, it will be easier to
upgrade the application code base in the future by Microsoft.

What is EDT and Base Enum?

EDT (Extended Data Type) and Base Enumerations (Enums) are data types. They are created and managed in the development environment. Extended data types can be based on primitive data types like integers, strings, real numbers, and booleans. An EDT extends the original properties of the data type from which it inherits, and some extra properties are added.

Base enums are a fixed set of values. Those values are saved in the database as integers, but they also have a name (as referenced from X++ code) and a label (visible to users). You can have up to 255 values for base enums. The integers in the database take on the values 0 through 254.

The AOT in D365FO apps contains many existing EDTs and base
enums that can be extended for use in your project, or you can
create new data types.

Difference between RunBase and RunBaseBatch class – AX 2012 or Dynamics 365 F&O

RunBase: To create a job or an action class – a program that carries out processes, such as accepting parameters from the user and then updating records in the database – you use the RunBase framework.
The framework is implemented by the RunBase application class and supplies many features, which include the following:

· Query

· Dialog, with persistence of the last values entered by the user

· Validate

The RunBase application framework runs or batches an operation.
An operation is a unit of work, such as the posting of a sales order or
calculation of a master schedule.
The RunBase framework uses the Dialog framework to prompt
a user for data input.
It uses the SysLastValue framework to persist usage data and
the Operation Progress framework to show operation progress.
The RunBase class is a framework for classes that need a dialog for
user interaction and that need the dialog values to be saved per
user.

RunBaseBatch: You can design your own batch job by extending the RunBaseBatch class. You can also write code to schedule the batch to run. The batch runs on the Application Object Server (AOS).
RunBaseBatch is an extension of RunBase – it adds support for batch processing.
The SysOperation framework is a newer framework that replaces RunBase (and its extensions such as RunBaseBatch).
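A minimal RunBaseBatch skeleton might look like this (a sketch; the class name MyBatchJob and the work done in run() are illustrative):

```xpp
class MyBatchJob extends RunBaseBatch
{
    public container pack()
    {
        return conNull(); // no dialog values to persist in this sketch
    }

    public boolean unpack(container _packedClass)
    {
        return true;
    }

    public boolean canGoBatchJournal()
    {
        return true; // allow the job to appear in the batch journal
    }

    public void run()
    {
        // The actual unit of work, executed on the batch server when scheduled
        info('Batch work executed.');
    }

    public static void main(Args _args)
    {
        MyBatchJob job = new MyBatchJob();

        if (job.prompt()) // shows the dialog, including the Batch tab
        {
            job.run();
        }
    }
}
```

pack() and unpack() are required because RunBase declares them abstract; they let the SysLastValue framework persist the user's dialog values between runs.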

Dynamics 365 Finance & Operations – How to use Chain of Command (CoC) in X++

Chain of Command (CoC) enables strongly typed extension capabilities of public and protected methods. It is an amazing piece of development capability that allows technical consultants to extend the application while avoiding over-layering.

Microsoft has implemented Chain of Command across classes, tables, forms, form data sources, and data field methods.
Before we dive into the Chain of Command methods, remember that
in order to use CoC you must declare your class as final, and your
methods should always contain the next keyword.
The next keyword behaves like a super, and it will define when
your extended logic executes. The next call after your code
behaves like a Pre-event handler, your logic executes first, and later
on, the logic residing in the original method gets executed.
The next call before your code behaves like a Post event handler,
your logic executes after the code residing in the original method
gets executed.
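A minimal sketch of a CoC extension (the table, class name, and messages are illustrative) of a table method:

```xpp
[ExtensionOf(tableStr(CustTable))]
final class DemoCustTable_Extension
{
    public void insert()
    {
        // Code before next() runs first (pre-event behavior)
        info("Before the standard insert logic");

        next insert();

        // Code after next() runs once the original method has executed (post-event behavior)
        info("After the standard insert logic");
    }
}
```

The next insert() call hands control to the standard implementation (and any other extensions in the chain); omitting it causes a compile error.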

What is the difference between a package and a model in D365?
Answer: A model is a group of elements, such as metadata and
source files, that typically constitute a distributable software
solution and includes customizations of an existing solution.
A package is a deployment and compilation unit of one or more
models. It includes model metadata, binaries, and other
associated resources.

What is an Entity Store? What is its significance in Dynamics 365 Finance and
Operations?

An Entity Store is a database that stores data from Dynamics 365 Finance and Operations in a format optimized for reporting and analytics. The Entity Store is updated regularly with new data from Finance and Operations, so it is always up to date for reporting and analytics purposes. It is a key part of the Finance and Operations platform, and many organizations use it to improve their reporting and analytics capabilities.
Enable the integration for cloud-hosted
development environments
Because of the architecture differences between cloud-hosted development
environments and the sandbox or production environments, Power Platform
Integration can't be set up after the developer environment is created.
Therefore, you can set up Power Platform Integration only during the
deployment of your cloud-hosted environment. In addition, you can connect
a new cloud-hosted environment only with a new Power Platform
environment. You can't connect an existing environment of any type via
Lifecycle Services.

Migration from AX 2012 R3 TO D365 !


You need to
1. Create a D365FO project in LCS
2. Create a ADO (formerly known as VSTS) project
3. Link your ADO project to your LCS project
4. Run code upgrade tool in LCS (you will upload your AX2012 modelstore, and
the tool will migrate the code to D365FO format and check in to ADO)
5. Do the actual manual code migration work, such as refactoring your over-layered code into extensions. You do this with Visual Studio on a D365FO development machine.

Can you describe your experience with developing custom reports in Dynamics 365 Finance and Operations?
Developing custom reports in Dynamics 365 Finance and Operations involves using reporting tools such as SQL Server Reporting Services (SSRS) and Power BI to create tailored reports that meet specific business requirements. The reports can be created from data stored in the Dynamics 365 Finance and Operations database, as well as from other data sources such as Excel or SharePoint. The process typically involves defining the report requirements, designing the report layout, and coding the report logic using languages such as Transact-SQL, Visual Basic, or C#. Once a report is developed, it can be deployed to the Dynamics 365 Finance and Operations environment and made available to users through the report viewer or integrated within the Dynamics 365 user interface.
How do you troubleshoot and resolve
performance issues in Dynamics 365 Finance
and Operations?
To troubleshoot and resolve performance issues in Dynamics 365 Finance
and Operations, the following steps can be followed:

1. Monitor System Performance: Use the built-in performance analysis tools such as SQL Server Profiler, Perfmon, and Dynamics 365 workspaces to monitor the system performance.
2. Identify the Root Cause: Analyze the data collected from the performance
monitoring tools to identify the root cause of the issue.
3. Optimize Database Queries: Optimize database queries by using appropriate
indexes, reducing query complexity, and avoiding redundant calculations.
4. Implement Caching Strategies: Implement caching strategies such as data
caching, query caching, and view caching to improve performance.
5. Scale Hardware Resources: Scale hardware resources such as CPU, memory,
and disk space to meet the demands of the system.
6. Tune System Parameters: Tune system parameters such as max worker
threads, tempdb size, and memory limits to optimize system performance.
7. Monitor System Health: Regularly monitor the system health by using the
Dynamics 365 Workspace and Health Diagnostics to identify and resolve
potential performance issues.
8. Seek Professional Assistance: If the performance issues persist, seek
professional assistance from Microsoft or a Dynamics 365 partner to resolve
the issue.

OData Cross-company behavior

By default, OData returns only data that belongs to the user's default company. To see data from outside the user's default company, specify the ?cross-company=true query option. This option returns data from all companies that the user has access to.

Example: http://[baseURI]/data/FleetCustomers?cross-company=true

To filter by a particular company that isn't your default company, use the
following syntax:

http://[baseURI]/data/FleetCustomers?$filter=dataAreaId eq 'usrt'&cross-company=true

Querying or browsing an OData endpoint


OData enables an SQL-like language that lets you create rich queries against
the database, so that the results include only the data items that you want.
To create a query, append criteria to the resource path. For example, you
can query the Customers entity collection by appending the following query
options in your browser.

· [Your organization's root URL]/data/Customers – List all the customers.

· [Your organization's root URL]/data/Customers?$top=3 – List the first three records.

· [Your organization's root URL]/data/Customers?$select=FirstName,LastName – List all the customers, but show only the first name and last name properties.

· [Your organization's root URL]/data/Customers?$format=json – List all the customers in a JSON format that can be used to interact with JavaScript clients.

The OData protocol supports many similar filtering and querying options on
entities. For the full set of query options, see Windows Communication
Foundation.

Using Enums
Enums are under the namespace Microsoft.Dynamics.DataEntities. Enums can be included in an OData query by using the following syntax.

Microsoft.Dynamics.DataEntities.Gender'Unknown'

Microsoft.Dynamics.DataEntities.NoYes'Yes'

An example query for using the above enum values is shown below.

https://environment.cloud.onebox.dynamics.com/data/CustomersV3?$filter=PersonGender eq Microsoft.Dynamics.DataEntities.Gender'Unknown'

https://environment.cloud.onebox.dynamics.com/data/Currencies?$filter=ReferenceCurrencyForTriangulation eq Microsoft.Dynamics.DataEntities.NoYes'No'

The operations supported for enums are eq and ne.

What is a data mart reset?


A data mart reset will disable the integration tasks, delete all the data mart
data, and then re-enable integration.

To ensure that old data isn't inserted, a data mart reset can be started only
after existing tasks are completed. If you try to reset the data mart before all
tasks are completed, you might receive a message such as, "The data mart
reset was unable to be processed because of an active task. Please try again
later."

When do I have to do a data mart reset?


If one or more of the following statements apply to your situation, your
organization can benefit from a data mart reset:

 The application database was restored


 You opened a support ticket - A support engineer instructed
you to reset the data mart as part of a troubleshooting step.
 Large percentage of stale records – Stale records by themselves don't necessarily justify a data mart reset. However, high percentages of stale data can degrade overall report generation and integration performance, and incur extra database space usage. We recommend that you complete a data mart reset to remove the stale data when more than 80% of the data in the data mart is stale.

Do all users have to exit the system before I


can reset the data mart?
No. Users can continue to work in the system during a data mart reset.
However, until the reset is completed, users won't be able to access any
reports that were created by using Financial Reporter.

How do I restrict access to a report by using


tree security?
The following example shows how to restrict access to a report by using tree
security.

The USMF demo company has a Balance sheet report that not all Financial
reporting users should have access to. You can use tree security to restrict
access to a single report so that only specific users can access it.
1. Sign in to Financial Reporter Report Designer.
2. Select File > New > Tree Definition to create a new tree
definition.
3. Double-tap (or double-click) the Summary line in the Unit
Security column.
4. Select Users and Groups.
5. Select the users or groups that require access to the report.
6. Select Save.
7. In the report definition, add your new tree definition.
8. In the tree definition, select Setting. Then, under Reporting unit
selection, select Include all units.

When I design a report in Report Designer, or when I generate a financial report, I receive the following message: "The operation could not be completed due to a problem in the data provider framework." How should I respond?
The message indicates that an issue occurred when the system tried to
retrieve financial metadata from the data mart while you were using
Financial reporting. There are two ways to respond to this issue:

 Review the integration status of the data by going to Tools > Integration status in Report Designer. If the integration is incomplete, wait for it to be completed. Then retry what you were doing when you received the message.
 Contact Support to identify and work through the issue. There
might be inconsistent data in the system. Support engineers can
help you identify that issue on the server and find the specific
data that might require an update.

How does the selection of historical rate translation affect report performance?
The historical rate is typically used with retained earnings, property, plant
and equipment, and equity accounts. The historical rate might be required,
based on guidelines of the Financial Accounting Standards Board (FASB) or
generally accepted accounting principles (GAAP). For more information,
see Currency capabilities in financial reporting.
How many types of currency rate are there?
There are three types:

 Current rate – This type is typically used with balance sheet accounts.
It's usually known as the spot exchange rate and can be the rate on the
last day of the month or another predetermined date.
 Average rate – This type is typically used with income statement
(profit/loss) accounts. You can set up the average rate to do either a
simple average or a weighted average.
 Historical rate – This type is typically used with retained earnings,
property, plant and equipment, and equity accounts. These accounts
might be required, based on FASB or GAAP guidelines.

How does historical currency translation work?
Rates are specific to the transaction date. Therefore, each transaction is individually translated, based on the closest exchange rate.
For historical currency translation, individual transaction details are used instead of the pre-calculated period balances. This behavior differs from the behavior for current rate translation.

General ledger: What can be changed to help enhance the performance of year-end processing?
You can make several changes to help improve the performance of the year-
end close. We recommend that you evaluate these suggested changes to
determine whether they're appropriate for your organization.

Optimize year-end close service

The Optimize year-end close service empowers Microsoft Dynamics 365 Finance customers to accelerate their year-end close by moving the heavy year-end processing to a microservice. The time that's saved through an efficient year-end close enables each Finance team to react in a timely manner to required adjustments, ending in the generation of the financial reports. By processing the year-end close on a microservice, valuable resources are freed up. The processing elevation minimizes the load on the SQL server and gives customers an opportunity to accelerate the year-end close processing.

The Optimize year-end close service is available in version 10.0.31, so that more customers can use the new service for the 2022 year-end season. Additionally, the service has been backported to versions 10.0.30 and 10.0.29. For more information, see Optimize year-end close.

Dimension sets

When you run the year-end close, each dimension set balance is rebuilt. This
behavior has a direct impact on performance. Some organizations create
dimension sets unnecessarily, because they were used at one point or might
be used at some point. Because these unnecessary dimension sets are
rebuilt during the year-end close, time is added to the process. Take the time
to evaluate your dimension sets and delete any that are unnecessary.

The unnecessary dimension sets also impact the batch job BudgetDimensionFocusInitializeBalance (General ledger > Chart of accounts > Dimensions > Financial dimension sets).

Year-end close template configuration

The year-end close template lets organizations select the financial dimension
level to maintain when transferring profit and loss balances to retained
earnings. The settings allow an organization to maintain the detailed
financial dimensions (Close all) when moving the balances to retained
earnings or choose to summarize the amounts to a single dimension value
(Close single). This can be defined for each financial dimension. For more
information on these settings, see the Year-end close article.

We recommend that you evaluate your organization's requirements and, if possible, close as many dimensions as possible using the Close single year-end option to improve performance. By closing to a single dimension value (which can also be a blank value), the system calculates less detail when determining the balances for retained earnings account entries.

Degenerate dimensions

A degenerate dimension provides little to no reuse by itself and in combination with other dimensions. There are two types of degenerate dimensions. The first type is a dimension that is individually degenerate. Typically, this type of degenerate dimension will appear on only a single transaction, or on small sets of transactions. The second type is a dimension that becomes degenerate in combination with one or more additional dimensions that exhibit the same potential based on the possible permutations that can be generated. A degenerate dimension can have a significant impact on the performance of the year-end close process. To minimize performance issues, define all degenerate dimensions as Close single in the year-end close setup as described in the preceding section.

SQL JOIN
A JOIN clause is used to combine rows from two or more tables, based on a
related column between them.

Let's look at a selection from the "Orders" table:

OrderID  CustomerID  OrderDate
10308    2           1996-09-18
10309    37          1996-09-19
10310    77          1996-09-20

Then, look at a selection from the "Customers" table:

CustomerID  CustomerName                         ContactName
1           Alfreds Futterkiste                  Maria Anders
2           Ana Trujillo Emparedados y helados   Ana Trujillo
3           Antonio Moreno Taquería              Antonio Moreno

Notice that the "CustomerID" column in the "Orders" table refers to the
"CustomerID" in the "Customers" table. The relationship between the two tables
above is the "CustomerID" column.

Then, we can create the following SQL statement (that contains an INNER JOIN),
that selects records that have matching values in both tables:

Example:
SELECT Orders.OrderID, Customers.CustomerName, Orders.OrderDate
FROM Orders
INNER JOIN Customers ON Orders.CustomerID=Customers.CustomerID;

Write a query to display the highest salary in each department.

SELECT Department, MAX(Salary) AS HighestSalary
FROM salaries
GROUP BY Department;
How would you display the records of the top 15 students in the most recent exam?

-- Assumes students_table has a score column for the exam
SELECT * FROM students_table
ORDER BY score DESC
LIMIT 15;

Find the last record in a table of daily transactions.


SELECT * FROM transactions_table
ORDER BY transaction_time DESC
LIMIT 1

How would you write a query to find the second or third-highest bonuses paid out the previous month?

-- Second-highest bonus:
SELECT MAX(Bonus) FROM employee_bonuses
WHERE Bonus < (SELECT MAX(Bonus) FROM employee_bonuses);

-- Third-highest (or any Nth): a ranking function is simpler
SELECT DISTINCT Bonus
FROM (SELECT Bonus, DENSE_RANK() OVER (ORDER BY Bonus DESC) AS rnk
      FROM employee_bonuses) ranked
WHERE rnk = 3;

(Add a filter on the bonus payment date column to limit results to the previous month.)

What is LCS?
DB point-in-time restore
Package deployment (What are the steps involved? Sync issues)
Can customizations and standard code be merged into a single package?
What are PQU (Proactive Quality Update) updates?
How do you use Trace Parser?
Why do we enable Power Platform integration in LCS?
