Tag: erp

  • Record-Level Error Logging in D365 F&O

    Learn how to capture and display detailed error logs for each record in asynchronous integrations using a custom utility class, DFTGetInfologMessages.

    Hello Folks,

    In many real-world D365 Finance & Operations implementations, we often deal with asynchronous integration patterns, where incoming data is first landed in a staging table and then processed by a batch job.

    Typical business scenarios include:

    • Creation of Sales Orders or Purchase Orders
    • Posting of Invoices

    During this process, it’s common to encounter data issues that need to be handled gracefully. Instead of failing the entire batch, it’s often better to log errors at the individual record level, especially when dealing with high-volume integrations.

    This blog explores how to:

    • Capture record-specific failure details during staging table processing
    • Store those details efficiently
    • Display them in a user-friendly format, similar to how batch job logs appear in the D365 F&O UI

    This approach not only improves error transparency but also helps functional consultants and support teams quickly identify and troubleshoot failed records without digging through raw logs or the infolog.


    Step-by-Step: Implementing Record-Level Error Logging in D365 F&O

    Let’s now walk through the technical implementation to capture and display error logs for each record processed via batch jobs.

    1. Create a New EDT:

    • Define a new Extended Data Type (EDT) named DFTIntegrationStatusMessage with Memo as its base type.
    • This will hold detailed error messages.

    2. Update Your Log Table:

    • Add a string-type field named Message using the new EDT DFTIntegrationStatusMessage.
    • Add a container-type field named ErrorMessagesContainer using the EDT InfologData.
    • This container will store error messages in a structured format for later retrieval.

    3. Add Utility Class for Capturing Infologs:

    • Create a class named DFTGetInfologMessages in your project; a minimal sketch of the class follows this list, and the full version is available in the code files at the end of this post.
    • To avoid compilation errors, ensure that the TestEssentials model is referenced by navigating to: Model Management > Update Model Parameters.
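
    The full DFTGetInfologMessages class is attached as a code file at the end of this post. For readers who want to follow along before downloading it, below is a minimal sketch of what such a helper can look like. Only the member names and signatures (construct, getAllMessages, getExceptionMessagesContainer) are taken from the sample code later in this post; the bodies shown here are illustrative and rely only on the standard SysInfologEnumerator / SysInfologMessageStruct pattern rather than anything from TestEssentials.

    public class DFTGetInfologMessages
    {
        public static DFTGetInfologMessages construct()
        {
            return new DFTGetInfologMessages();
        }

        /// <summary>
        /// Flattens captured infolog data into a single newline-separated string,
        /// suitable for the Message (memo) field on the log table.
        /// </summary>
        public str getAllMessages(InfologData _infologData)
        {
            str                  allMessages;
            SysInfologEnumerator enumerator = SysInfologEnumerator::newData(_infologData);

            while (enumerator.moveNext())
            {
                SysInfologMessageStruct msgStruct = new SysInfologMessageStruct(enumerator.currentMessage());

                if (allMessages != '')
                {
                    allMessages += '\n';
                }

                allMessages += msgStruct.message();
            }

            return allMessages;
        }

        /// <summary>
        /// Returns the stored infolog data in a shape that Infolog.import() accepts.
        /// The attached class may additionally filter the container down to error and warning lines.
        /// </summary>
        public container getExceptionMessagesContainer(container _errorMessagesContainer)
        {
            return _errorMessagesContainer;
        }
    }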

    4. Sample Batch Job Service Class:

    • Refer to the sample implementation provided below.
    • The batch job processes data from a custom table to create Purchase Orders and logs errors for each failed record.
    • Lines marked with an //*** important comment relate to capturing and storing the error logs; a sketch of the confirmation helper follows the class.
    /// <summary>
    /// Create and confirm purchase orders.
    /// </summary>
    /// <remarks>Developed for batch failure logging by RhushikeshR</remarks>
    public class DFTPurchOrderService extends SysOperationServiceBase
    {
        DFTPurchOrderDataContract   contract;
        RefRecId                    currentBatchJobId;
    
        public void processOperation(DFTPurchOrderDataContract _contract)
        {
                #define.RetryNum(5)
    
                Query                   query;
                int                     recordCounter;  
                DFTPurchaseOrderHeader  purchaseOrderHeader;
                InfologData             infologData; //*** important
                int                     retryCount;
                QueryBuildDataSource    qbdsPurchOrderHeader;
                QueryBuildRange         qbrIntStatus;
    
                if (this.isExecutingInBatch())
                {
                    currentBatchJobId = this.getCurrentBatchHeader().parmBatchHeaderId();
                }
    
                this.generatePurchOrderHeaderMappingTable();
    
                DFTGetInfologMessages getInfologMessages = DFTGetInfologMessages::construct(); // Instantiate customized infolog class; Code snippet of class is attached separately.
    
                contract    = _contract;
    
                query       = contract.getQuery();
                
                if (!query)
                {
                    query = new Query(queryStr(DFTPurchaseOrderHeaderQuery));
    
                    qbdsPurchOrderHeader = query.dataSourceTable(tableNum(DFTPurchaseOrderHeader));
    
                    qbrIntStatus = qbdsPurchOrderHeader.addRange(fieldNum(DFTPurchaseOrderHeader, IntegrationStatus));
    
                    qbrIntStatus.value(strFmt('%1', SysQuery::value(DFTIntegrationStatus::InProcessing)));
                    qbrIntStatus.status(RangeStatus::Locked);
                }
                else
                {
                    qbdsPurchOrderHeader = query.dataSourceTable(tableNum(DFTPurchaseOrderHeader));
                }
    
                qbdsPurchOrderHeader.addSortField(fieldNum(DFTPurchaseOrderHeader, PurchId), SortOrder::Ascending);
                qbdsPurchOrderHeader.addSortField(fieldNum(DFTPurchaseOrderHeader, RecId), SortOrder::Ascending);
    
                QueryRun queryRun = new QueryRun(query);
    
                while (queryRun.next()) // this query iterates all purchase orders one after another
                {
                    boolean     isProcessed;
                    int         infologStartIndex   = infolog.num()+1; //*** important for each new record
    
                    purchaseOrderHeader             = queryRun.get(tableNum(DFTPurchaseOrderHeader));
                    
                    recordCounter++;
    
                    try
                    {
                        ttsbegin;
                        retryCount = xSession::currentRetryCount();
                        this.createPurchOrderHeader(purchaseOrderHeader);
                        this.confirmPurchOrder(purchaseOrderHeader.PurchId);
                        info(strFmt("Purchase order %1 created and confirmed successfully", purchaseOrderHeader.PurchId));
                        ttscommit;

                        // Normalize the retry count; the log fields themselves are written once,
                        // in the finally block below, so they are not updated twice for processed records.
                        retryCount  = retryCount == 0 ? 1 : retryCount;
                        isProcessed = true;
                    }
                    catch (Exception::Deadlock)
                    {
                        if (appl.ttsLevel() == 0)
                        {
                            if (xSession::currentRetryCount() >= #RetryNum)
                            {
                                error(strFmt("There is still a deadlock after retrying %1 times for purchase order %2. Try again later.", #RetryNum, purchaseOrderHeader.PurchId));
    
                            }
                            else
                            {
                                retry;
                            }
                        }
                        else
                        {
                            error(strFmt("There is a deadlock for purchase order %1. Try again later.", purchaseOrderHeader.PurchId));
                            retryCount = retryCount == 0 ? 1 : retryCount;
                        }
                    }
                    catch (Exception::CLRError)
                    {
                        System.Exception ex = ClrInterop::getLastException();
                        if (ex != null)
                        {
                            ex = ex.get_InnerException();
                            if (ex != null)
                            {
                                error(ex.ToString());
                            }
                        }
    
                        warning(strFmt("An exception occurred for purchase order %1.", purchaseOrderHeader.PurchId));
                        retryCount = retryCount == 0 ? 1 : retryCount;
                    }
                    catch
                    {
                        warning(strFmt("An exception occurred for purchase order %1.", purchaseOrderHeader.PurchId));
                        retryCount = retryCount == 0 ? 1 : retryCount;
    
                    }
                    finally
                    {
                        infologData = infolog.copy(infologStartIndex, infologLine()); //*** important
                        DFTIntegrationStatusMessage mergedInfo = getInfologMessages.getAllMessages(infologData); // get all messages in string format //*** important
    
                        if (isProcessed)
                        {
                            this.updatePurchaseOrderLogFields(purchaseOrderHeader, retryCount, mergedInfo, DFTIntegrationStatus::Processed, infologData);
                        }
                        else
                        {
                            this.updatePurchaseOrderLogFields(purchaseOrderHeader, retryCount, mergedInfo, DFTIntegrationStatus::Error, infologData);
                        }
                    }
                }
    
                info (strFmt("Batch job ended. Number of processed records: %1", recordCounter));
        }
    
    
    private void updatePurchaseOrderLogFields(DFTPurchaseOrderHeader _purchaseOrderHeader, int _retryCount, DFTIntegrationStatusMessage _mergedInfo, DFTIntegrationStatus _integrationStatus, InfologData _infologData)
        {
            ttsbegin;
            _purchaseOrderHeader.selectForUpdate(true);
    
            _purchaseOrderHeader.RetryCountReference = _purchaseOrderHeader.RetryCountReference + _retryCount;
            _purchaseOrderHeader.RetryCount = _purchaseOrderHeader.RetryCountReference;
    
            if (_purchaseOrderHeader.RetryCount > 0)
            {
                _purchaseOrderHeader.RetryDateTime = DateTimeUtil::utcNow();
            }
    
            _purchaseOrderHeader.IntegrationStatus = _integrationStatus;
            _purchaseOrderHeader.Message = _mergedInfo; //*** important
            _purchaseOrderHeader.ErrorMessagesContainer = _infologData; //*** important
            _purchaseOrderHeader.BatchJobId = currentBatchJobId;
            _purchaseOrderHeader.StatusChangeDateTime = DateTimeUtil::utcNow();
            _purchaseOrderHeader.update();
            ttscommit;
        }
    
    }
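
    The helper methods createPurchOrderHeader, confirmPurchOrder, and generatePurchOrderHeaderMappingTable are specific to the integration and are not part of the error-logging pattern itself, so they are not shown above. As an illustration of where record-level errors typically originate, a confirmation step based on the standard PurchFormLetter API could look roughly like the sketch below; it assumes the staging record has already been turned into a PurchTable order and is not taken from the attached code files.

    private void confirmPurchOrder(PurchId _purchId)
    {
        PurchTable purchTable = PurchTable::find(_purchId);

        // Any validation or posting error raised during confirmation lands in the infolog
        // and is captured per record by the calling batch job above.
        PurchFormLetter purchFormLetter = PurchFormLetter::construct(DocumentStatus::PurchaseOrder);
        purchFormLetter.update(purchTable, strFmt("%1", purchTable.PurchId));
    }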

    5. Displaying Errors in the UI:

    • Add logic on your form to allow users to view error logs per record.
    • You can leverage both:
      • DFTGetInfologMessages class (to extract structured messages)
      • ErrorMessagesContainer field (to store and retrieve messages)

    6. Enhance the Form with a Log Viewer:

    • Add a new button on your form named “Show log”.
    • Implement the required logic in the clicked() method to fetch and display error details using the DFTGetInfologMessages class.
    public void clicked()
    {
        super();
        
        DFTGetInfologMessages getInfologMessages = DFTGetInfologMessages::construct();
        container errorCon = getInfologMessages.getExceptionMessagesContainer(DFTPurchaseOrderHeader.ErrorMessagesContainer);
    
        if (conLen(errorCon) > 1)
        {
        setPrefix(strFmt("Infolog for purchase order %1", DFTPurchaseOrderHeader.PurchId));
            infolog.import(errorCon);
            infolog.import([[1], [0, 'End of log.']]);
        }
        else
        {
            Message::Add(MessageSeverity::Informational, "No error log for selected record.");
        }
    }

    The DFTGetInfologMessages class is a versatile utility that can be reused across various customizations—anywhere detailed error logging is required, from integrations to validations.
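
    As a quick illustration of that reuse, the same capture pattern from the batch job can wrap any piece of logic that writes to the infolog, inside any method or runnable class. The sketch below uses a hypothetical CustTable record and account number purely for illustration; it simply turns the messages produced by validateWrite() into a string.

    CustTable   myRecord = CustTable::find('US-001'); // hypothetical record; any table buffer works here
    int         startIdx = infolog.num() + 1;         // remember where the infolog stood before validation
    boolean     isValid;

    try
    {
        isValid = myRecord.validateWrite();  // any logic that writes to the infolog
    }
    catch
    {
        isValid = false;
    }

    // Copy only the lines produced above and flatten them into one string.
    InfologData capturedLog = infolog.copy(startIdx, infologLine());
    str         details     = DFTGetInfologMessages::construct().getAllMessages(capturedLog);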

    Code files:

    📬 Stay tuned!
    In the next blog post, we’ll demonstrate how to extend this class to send automated email alerts for batch job failures—making your monitoring smarter and more proactive.

  • Import Database .bacpac file on D365 DEV Environment

    Hello folks, this blog will help you import a database bacpac file into a D365 DEV environment. Whenever we export the database from a UAT environment to the Asset Library, it is always saved in bacpac format. It can easily be imported into any DEV environment using the steps below.

    Step 1: Stop the following services –
    a. World Wide Web Publishing Service
    b. Management Reporter 2012 Process Service
    c. Microsoft Dynamics 365 Unified Operations: Batch Management Service

    Step 2: Rename the existing AxDB database using the SQL query below –

    USE master;
    GO
    ALTER DATABASE AxDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    GO
    ALTER DATABASE AxDB MODIFY NAME = AxDB_Old;
    GO
    ALTER DATABASE AxDB_Old SET MULTI_USER;
    GO

    Step 3: Open a command prompt as administrator and run the commands below one after another –

    cd C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin
    SqlPackage.exe /a:import /sf:J:\UATDbBackup\UATBackup_27Apr\AXDB.bacpac /tsn:localhost /tdn:AxDB_New /p:CommandTimeout=1200

    In the second command, /sf specifies the path where the bacpac file is stored and /tdn specifies the name of the new database that will be created by the import.
    Wait until the command completes successfully. The import operation takes approximately 2-3 hours.

    Step 4: Once the command in Step 3 completes successfully, connect to SQL Server and rename the newly imported database using the SQL query below –

    USE master;
    GO
    ALTER DATABASE AxDB_New SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    GO
    ALTER DATABASE AxDB_New MODIFY NAME = AxDB;
    GO
    ALTER DATABASE AxDB SET MULTI_USER;
    GO

    Step 5: Once the imported database has been renamed to AxDB, run the following SQL query against it –

    CREATE USER axdeployuser FROM LOGIN axdeployuser
    EXEC sp_addrolemember 'db_owner', 'axdeployuser'
    
    CREATE USER axdbadmin FROM LOGIN axdbadmin
    EXEC sp_addrolemember 'db_owner', 'axdbadmin'
    
    CREATE USER axmrruntimeuser FROM LOGIN axmrruntimeuser
    EXEC sp_addrolemember 'db_datareader', 'axmrruntimeuser'
    EXEC sp_addrolemember 'db_datawriter', 'axmrruntimeuser'
    
    CREATE USER axretaildatasyncuser FROM LOGIN axretaildatasyncuser
    EXEC sp_addrolemember 'DataSyncUsersRole', 'axretaildatasyncuser'
    
    CREATE USER axretailruntimeuser FROM LOGIN axretailruntimeuser
    EXEC sp_addrolemember 'UsersRole', 'axretailruntimeuser'
    EXEC sp_addrolemember 'ReportUsersRole', 'axretailruntimeuser'
    
    CREATE USER axdeployextuser FROM LOGIN axdeployextuser
    EXEC sp_addrolemember 'DeployExtensibilityRole', 'axdeployextuser'
    
    CREATE USER [NT AUTHORITY\NETWORK SERVICE] FROM LOGIN [NT AUTHORITY\NETWORK SERVICE]
    EXEC sp_addrolemember 'db_owner', 'NT AUTHORITY\NETWORK SERVICE'
    
    UPDATE T1
    SET T1.storageproviderid = 0
        , T1.accessinformation = ''
        , T1.modifiedby = 'Admin'
        , T1.modifieddatetime = getdate()
    FROM docuvalue T1
    WHERE T1.storageproviderid = 1 --Azure storage
    
    DROP PROCEDURE IF EXISTS SP_ConfigureTablesForChangeTracking
    DROP PROCEDURE IF EXISTS SP_ConfigureTablesForChangeTracking_V2
    GO
    -- Begin Refresh Retail FullText Catalogs
    DECLARE @RFTXNAME NVARCHAR(MAX);
    DECLARE @RFTXSQL NVARCHAR(MAX);
    DECLARE retail_ftx CURSOR FOR
    SELECT OBJECT_SCHEMA_NAME(object_id) + '.' + OBJECT_NAME(object_id) fullname FROM SYS.FULLTEXT_INDEXES
        WHERE FULLTEXT_CATALOG_ID = (SELECT TOP 1 FULLTEXT_CATALOG_ID FROM SYS.FULLTEXT_CATALOGS WHERE NAME = 'COMMERCEFULLTEXTCATALOG');
    OPEN retail_ftx;
    FETCH NEXT FROM retail_ftx INTO @RFTXNAME;
    
    BEGIN TRY
        WHILE @@FETCH_STATUS = 0 
        BEGIN 
            PRINT 'Refreshing Full Text Index ' + @RFTXNAME;
            EXEC SP_FULLTEXT_TABLE @RFTXNAME, 'activate';
            SET @RFTXSQL = 'ALTER FULLTEXT INDEX ON ' + @RFTXNAME + ' START FULL POPULATION';
            EXEC SP_EXECUTESQL @RFTXSQL;
            FETCH NEXT FROM retail_ftx INTO @RFTXNAME;
        END
    END TRY
    BEGIN CATCH
        PRINT error_message()
    END CATCH
    
    CLOSE retail_ftx; 
    DEALLOCATE retail_ftx; 
    -- End Refresh Retail FullText Catalogs

    Step 6: Once the query above has been executed, run the following statements one after another –

    ALTER DATABASE [AxDB] SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 6 DAYS, AUTO_CLEANUP = ON);
    UPDATE DMFDataSource
    SET ENTITYSTORECONNECTIONSTRING = ''
    WHERE ENTITYSTORECONNECTIONSTRING <> '';
    TRUNCATE TABLE IntegrationActivityTable;

    Step 7: Start all the services that were stopped in Step 1.

    Step 8: Open Visual Studio and synchronize the database using the Dynamics 365 menu.

    That’s it !!!

    To export the database from UAT to the Asset Library, you can refer to the Microsoft Docs link given below.

    Thank you!