Dynamics Truncate/Delete Cleanup for Export/Import

The document provides instructions for cleaning up and optimizing a Dynamics 365 Finance and Operations environment before exporting the UAT database. It includes truncating tables to remove old data, deleting temporary tables, using the D drive to improve import/export performance, and cleaning up logs and workflow tracking tables. It also provides a link to Microsoft documentation on best practices for database movement and references other cleanup routines in Dynamics 365 to evaluate.


#1 Truncate staging tables
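A minimal sketch for this step, assuming the common convention that data management (DMF) staging tables have names ending in "Staging"; this only generates the TRUNCATE statements, so review the list it produces before executing any of them.

-- Generate TRUNCATE statements for data management staging tables.
-- Assumption: staging table names end in 'Staging'; confirm against your schema.
SELECT 'TRUNCATE TABLE [dbo].[' + t.name + '];' AS TruncateStmt
FROM sys.tables t
WHERE t.name LIKE '%Staging'
ORDER BY t.name;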

#2 Delete EVENTCUD

This table stores alerts waiting to be sent. A large number of records in the EVENTCUD table usually indicates that the batch job that delivers the alert messages is not running or is not scheduled. If the alerts queued in the system are not critical, the EVENTCUD table can be truncated. This prevents all the old alerts from being delivered in one shot to the users when the batch job is scheduled again and runs.
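A hedged example of that truncation, assuming the queued alerts can be discarded. EVENTCUDLINES is included on the assumption that this companion detail table exists in your version; verify that before running the second statement.

-- Discard queued (undelivered) alert events. Only run this if the pending
-- alerts are not critical, since they will never be delivered afterwards.
TRUNCATE TABLE [dbo].[EVENTCUD];
-- Assumption: the companion detail table exists in this version; verify first.
TRUNCATE TABLE [dbo].[EVENTCUDLINES];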

#3 Review the Microsoft documentation on performance.

https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/database/dbmovement-scenario-exportuat

Performance

The following guidelines can help you achieve optimal performance:


 Always import the .bacpac file locally on the computer that runs the SQL Server instance. Don't import it from Management Studio on a remote machine.
 In a one-box environment that is hosted in Azure, put the .bacpac file on drive D when you import it. (A one-box environment is also known as a Tier 1 environment.) For more information about the temporary drive on Azure virtual machines (VMs), see the Understanding the temporary drive on Windows Azure Virtual Machines blog post.
 Grant the account that runs the SQL Server Windows service Instant File Initialization rights. In this way, you can help improve the speed of the import process and the speed of a restore from a *.bak file. For a developer environment, you can easily make sure that the account that runs the SQL Server service has these rights by setting SQL Server to run as the axlocaladmin account.
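As a quick sanity check for the last point, a sketch that reads the service-level DMV (available on SQL Server 2016 SP1 and later) to confirm Instant File Initialization is actually enabled; this is a verification step only, not part of the cleanup itself.

-- Check whether Instant File Initialization is enabled for the SQL Server service
-- (column available on SQL Server 2016 SP1 and later).
SELECT servicename, instant_file_initialization_enabled
FROM sys.dm_server_services;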

#4 Table cleanup (TRUNCATE) based on the Top-tables query below, targeting log and transaction tables that will not affect AX usage >> BEFORE EXPORTING THE UAT DATABASE

USE [YourDBName] -- replace with your database name
GO

-- List the 30 largest tables by allocated size to identify cleanup candidates.
SELECT TOP 30
    s.name AS SchemaName,
    t.name AS TableName,
    p.rows AS RowCounts,
    CAST(ROUND(SUM(a.used_pages) / 128.00, 2) AS NUMERIC(36, 2)) AS Used_MB,
    CAST(ROUND((SUM(a.total_pages) - SUM(a.used_pages)) / 128.00, 2) AS NUMERIC(36, 2)) AS Unused_MB,
    CAST(ROUND(SUM(a.total_pages) / 128.00, 2) AS NUMERIC(36, 2)) AS Total_MB
FROM sys.tables t
INNER JOIN sys.indexes i ON t.object_id = i.object_id
INNER JOIN sys.partitions p ON i.object_id = p.object_id AND i.index_id = p.index_id
INNER JOIN sys.allocation_units a ON p.partition_id = a.container_id
INNER JOIN sys.schemas s ON t.schema_id = s.schema_id
GROUP BY t.name, s.name, p.rows
ORDER BY Total_MB DESC
GO
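A hedged follow-up sketch: once the query above has identified the largest tables, truncate the ones that hold only log or history data. The names below are examples drawn from tables this document itself flags as cleanup candidates, not from this environment's results, so validate each one against your own Top-30 output before truncating.

-- Example truncations for typical log/history tables; confirm each table appears
-- in your Top-30 results and is safe to clear before running these statements.
TRUNCATE TABLE [dbo].[SYSDATABASELOG];
TRUNCATE TABLE [dbo].[BATCHJOBHISTORY];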
#5 Delete temp tables that may happen to exist in UAT >> BEFORE EXPORTING THE UAT DATABASE

-- Delete temp tables for this specific environment.

BEGIN
    DECLARE @tableName nvarchar(256);
    DECLARE @sqlStmtTable nvarchar(512);

    -- Cursor over the AOS temp tables left behind in this database.
    DECLARE tempTableCursor CURSOR
        FOR SELECT DISTINCT Table_Name
            FROM Information_Schema.Tables
            WHERE Table_Name LIKE 't%Aos%_%';

    -- Open the cursor and drop every table it returns.
    OPEN tempTableCursor
    FETCH NEXT FROM tempTableCursor INTO @tableName

    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sqlStmtTable = 'DROP TABLE [dbo].[' + @tableName + '];'
        EXEC (@sqlStmtTable);
        FETCH NEXT FROM tempTableCursor INTO @tableName
    END

    CLOSE tempTableCursor
    DEALLOCATE tempTableCursor
END

#6 Use the D drive for export and import, with the /mp:16 parameter

The import speed is related to the CPU cores, RAM, and disk performance of your machine. 

You can try to increase the speed by putting the .bacpac on the D:\ drive and passing the /mp:16 parameter to your import to raise the thread count to 16. 16 is an arbitrary number; it should match the number of cores available on your machine.

https://docs.microsoft.com/en-us/sql/tools/sqlpackage?view=sql-server-ver15
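A hedged command-line sketch of such an import; the file path, target server, and target database name below are placeholders to adapt, and the /mp value should match your core count.

REM Example only: adjust the source file, target server/database, and /mp value.
SqlPackage.exe /a:Import /sf:D:\Bacpac\uat_export.bacpac /tsn:localhost /tdn:AxDB_fromUAT /p:CommandTimeout=1200 /mp:16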

#7 Truncate the workflow tracking tables listed below (a sketch follows the list)

WORKFLOWTRACKINGARGUMENTTABLE,
WORKFLOWTRACKINGTABLE,
WORKFLOWTRACKINGCOMMENTTABLE,
WORKFLOWTRACKINGWORKITEM
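A hedged sketch of this step; if a TRUNCATE fails because another table references it through a foreign key, fall back to a DELETE on that table.

-- Clear workflow tracking history before the export.
TRUNCATE TABLE [dbo].[WORKFLOWTRACKINGARGUMENTTABLE];
TRUNCATE TABLE [dbo].[WORKFLOWTRACKINGTABLE];
TRUNCATE TABLE [dbo].[WORKFLOWTRACKINGCOMMENTTABLE];
TRUNCATE TABLE [dbo].[WORKFLOWTRACKINGWORKITEM];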

#8 Review the AX clean-up routines. This link lists suggested "table cleanup" routines that can be evaluated.

#9 Additional processes.

 sysoutgoingemail
https://community.dynamics.com/ax/f/microsoft-dynamics-ax-forum/372592/table-size-sysoutgoingemaildata-is-huge
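A hedged sketch for this item, assuming the outgoing e-mail history can be discarded in UAT; the exact table names can differ by version, so confirm that SYSOUTGOINGEMAILTABLE and SYSOUTGOINGEMAILDATA exist in your database before running it.

-- Clear outgoing e-mail history and its data table (verify table names first).
TRUNCATE TABLE [dbo].[SYSOUTGOINGEMAILDATA];
TRUNCATE TABLE [dbo].[SYSOUTGOINGEMAILTABLE];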

 The EVENTINBOX and EVENTINBOXDATA tables can be cleaned using the standard batch job under System Administration > Periodic > Notification Cleanup; the key is to specify the date filter as described below so that the job runs correctly.
 In the "Alert created date and time" field, specify (LessThanDate(-NumOfDays)); for example, (LessThanDate(-30)) removes notifications older than 30 days. When the batch job runs, the query takes the execution date, subtracts the number of days, and deletes all existing records older than that date. However, depending on how many records the customer currently has, the process can take a few hours to complete, so it is strongly recommended to set this batch job to run on a weekly basis.

 As with the "Notification Cleanup", there is another job to clean up the BatchJobHistory table: System Administrator > Periodic Task > Batch job history clean-up (custom).

TABLE SYSDATABASELOG

Use the Database log cleanup tab to determine when to run the log cleanup task [System Administrator > Inquiries > Database > Database log].
