Tag: cloud

  • Enhancing Batch Job Monitoring in D365 F&O with a Custom Email Alert Solution

    Monitoring batch jobs is a critical task in any Dynamics 365 Finance & Operations environment. One effective way to stay informed about issues is by setting up email alerts specifically for failed batch jobs. This blog walks you through the setup process and highlights an optional customization to capture detailed failure information.


    Prerequisites

    Before enabling batch email alerts, ensure the following configurations are complete:

    1. Email Parameters Configuration
      First, configure the system-wide email parameters by following the steps outlined in the official Microsoft documentation:
      Configure email parameters in D365 F&O
    2. User Email Provider Setup
      Once the global email setup is complete, navigate to:
      User Options > Account > Email provider selection
      Here, configure:
      • Email Provider ID
      • Sender Email (typically your logged-in user email)

    This implementation leverages a utility class named DFTGetInfologMessages, detailed in an earlier blog post on record-level error logging.


    Enabling Batch Email Alerts

    With the above configurations in place, you can now enable alerts for batch jobs:

    1. Navigate to System administration > Inquiries > Batch jobs
    2. Select the batch job you want to monitor
    3. Click on the Alerts button in the Action Pane
    4. Enable relevant conditions; for failure notifications, set the Error and Email flags to Yes

    You can enable alerts for both new and already scheduled batch jobs.

    More details on configuring alerts are available here:
    👉 Manage alerts in Dynamics 365 F&O


    Standard vs Custom Email Alert Details

    By default, standard email alerts in D365 F&O provide only high-level information about batch job failures—often not enough for deep troubleshooting. If your business requires more granular details (like error messages or failed task data), consider implementing a small customization to enhance the alert payload. This approach allows tailoring the email content based on specific business requirements.

    Step-by-Step Customization Guide

    1. Create a Temporary Table

    This table will store detailed failure messages to be included in the email body; a population sketch follows the field list.

    • Name: DFTBatchFailureDetailsTmp
    • Fields:
      • Title
      • TaskLog
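
    As a rough illustration, this table might be filled from the batch job's task history. The sketch below is an assumption-heavy example, not the original implementation: it presumes the BatchHistory table still exposes each task's caption and infolog data (the Info field, as in earlier versions) and reuses the DFTGetInfologMessages utility to flatten the log.

    // Sketch: collect failure details for one batch job into the temp table.
    // Assumption: BatchHistory.Info carries the task infolog data.
    private DFTBatchFailureDetailsTmp buildFailureDetails(RefRecId _batchJobId)
    {
        DFTBatchFailureDetailsTmp detailsTmp;
        BatchHistory              batchHistory;
        DFTGetInfologMessages     getInfologMessages = DFTGetInfologMessages::construct();

        while select batchHistory
            where batchHistory.BatchJobId == _batchJobId
               && batchHistory.Status     == BatchStatus::Error
        {
            detailsTmp.Title   = batchHistory.Caption;
            detailsTmp.TaskLog = getInfologMessages.getAllMessages(batchHistory.Info);
            detailsTmp.insert();
        }

        return detailsTmp;
    }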

    2. Extend System Parameters Table

    • You can use any global configuration table or create a dedicated extension to manage email settings.
    • Add the following fields to your table extension:
      • EnableBatchEmailTemplate
        • Type: Enum (NoYes)
        • EDT: NoYesId
        • Label: Enable custom email template
        • Help Text: Enabling this flag sends emails in the custom template format for failed batch jobs.
      • BatchEmailText
        • EDT: smmCampaignBroadcastText
        • Help Text: Email body text for batch job alert

    3. Expose Custom Fields to the UI

    • Extend the relevant form to include these new fields, allowing system admins to configure the alert settings directly from the user interface.

    Add your HTML body text in BatchEmailText and set the EnableBatchEmailTemplate flag to Yes.

    4. Customize the Email Notification Logic

    • Create a CoC (Chain of Command) extension for the EventNotificationBatch class.
    • Override the logic to replace the default email body with a custom HTML/text body using the values from your temporary table and configuration; a minimal sketch follows.
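
    The sketch below shows only the general CoC shape; which method actually assembles the alert email varies by version, so treat the wrapped create() method as an assumption and verify it against EventNotificationBatch in your environment before wiring in the replacement. Note that X++ requires next to be called unconditionally in a CoC wrapper.

    // CoC sketch (assumptions: create() is the interception point, and the
    // flag/template fields from step 2 were added to SystemParameters).
    [ExtensionOf(classStr(EventNotificationBatch))]
    final class DFTEventNotificationBatch_Extension
    {
        public void create()
        {
            SystemParameters parameters = SystemParameters::find();

            if (parameters.EnableBatchEmailTemplate)
            {
                // Populate DFTBatchFailureDetailsTmp for the failed job and
                // merge the rows into parameters.BatchEmailText to build the
                // HTML body used for the outgoing alert email.
            }

            next create(); // CoC: next must always be called
        }
    }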

    Once this setup is in place and the custom email template option is enabled, the system will automatically send detailed failure information through email, offering greater insight than standard alerts.

    This solution empowers admins and support teams to act quickly and efficiently by providing actionable failure details—without having to manually investigate each batch job issue.


    Keep your operations running smoothly by proactively monitoring failed batch jobs—and don’t hesitate to customize when the out-of-box solution falls short!

    #Xpp #D365FO #ERP #BatchJob #Integration

  • Record-Level Error Logging in D365 F&O

    Learn how to capture and display detailed error logs for each record in asynchronous integrations using the custom utility class DFTGetInfologMessages.

    Hello Folks,

    In many real-world D365 Finance & Operations implementations, we often deal with asynchronous integration patterns—where incoming data is first landed into a staging table, and then processed via a batch job.

    Typical business scenarios include:

    • Creation of Sales Orders or Purchase Orders
    • Posting of Invoices

    During this process, it’s common to encounter data issues that need to be handled gracefully. Instead of failing the entire batch, it’s often better to log errors at the individual record level, especially when dealing with high-volume integrations.

    This blog explores how to:

    • Capture record-specific failure details during staging table processing
    • Store those details efficiently
    • Display them in a user-friendly format, similar to how batch job logs appear in the D365 F&O UI

    This approach not only improves error transparency but also helps functional consultants and support teams quickly identify and troubleshoot failed records without digging through raw logs or the infolog.


    Step-by-Step: Implementing Record-Level Error Logging in D365 F&O

    Let’s now walk through the technical implementation to capture and display error logs for each record processed via batch jobs.

    1. Create a New EDT:

    • Define a new Extended Data Type (EDT) named DFTIntegrationStatusMessage with Memo as its base type.
    • This will hold detailed error messages.

    2. Update Your Log Table:

    • Add a string-type field named Message using the new EDT DFTIntegrationStatusMessage.
    • Add a container-type field named ErrorMessagesContainer using the EDT InfologData.
    • This container will store error messages in a structured format for later retrieval.

    3. Add Utility Class for Capturing Infologs:

    • Create a class named DFTGetInfologMessages in your project; a reconstruction sketch follows below.
    • To avoid compilation errors, ensure that the TestEssentials model is referenced by navigating to: Model Management > Update Model Parameters.
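
    The original class files are attached separately; the reconstruction below is a minimal sketch inferred from how the class is called in the service code that follows. SysInfologEnumerator ships with the Test Essentials model, which is why the model reference above is required.

    // Reconstruction sketch of DFTGetInfologMessages (inferred, not the
    // original source). Flattens and re-exposes captured infolog data.
    public class DFTGetInfologMessages
    {
        public static DFTGetInfologMessages construct()
        {
            return new DFTGetInfologMessages();
        }

        // Flatten captured infolog data into one newline-separated string.
        public str getAllMessages(InfologData _infologData)
        {
            str                  messages;
            SysInfologEnumerator enumerator = SysInfologEnumerator::newData(_infologData);

            while (enumerator.moveNext())
            {
                messages += (messages == '' ? '' : '\n') + enumerator.currentMessage();
            }

            return messages;
        }

        // Return the stored data as a container that infolog.import() can
        // consume; shown here as a pass-through (assumption).
        public container getExceptionMessagesContainer(InfologData _infologData)
        {
            return _infologData;
        }
    }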

    4. Sample Batch Job Service Class:

    • Refer to the sample implementation provided below.
    • The batch job processes data from a custom table to create Purchase Orders and logs errors for each failed record.
    • Lines marked with the //*** important comment in the code relate to capturing and storing error logs.
    /// <summary>
    /// create and confirm purchase order
    /// </summary>
    /// <remarks>Developed for batch failure logging by RhushikeshR</remarks>
    public class DFTPurchOrderService extends SysOperationServiceBase
    {
        RefRecId                  currentBatchJobId;
        DFTPurchOrderDataContract contract; // data contract cached for the current run
    
        public void processOperation(DFTPurchOrderDataContract _contract)
        {
                #define.RetryNum(5)
    
                Query                   query;
                int                     recordCounter;  
                DFTPurchaseOrderHeader  purchaseOrderHeader;
                InfologData             infologData; //*** important
                int                     retryCount;
                QueryBuildDataSource    qbdsPurchOrderHeader;
                QueryBuildRange         qbrIntStatus;
    
                if (this.isExecutingInBatch())
                {
                    currentBatchJobId = this.getCurrentBatchHeader().parmBatchHeaderId();
                }
    
                this.generatePurchOrderHeaderMappingTable();
    
                DFTGetInfologMessages getInfologMessages = DFTGetInfologMessages::construct(); // Instantiate customized infolog class; Code snippet of class is attached separately.
    
                contract    = _contract;
    
                query       = contract.getQuery();
                
                if (!query)
                {
                    query = new Query(queryStr(DFTPurchaseOrderHeaderQuery));
    
                    qbdsPurchOrderHeader = query.dataSourceTable(tableNum(DFTPurchaseOrderHeader));
    
                    qbrIntStatus = qbdsPurchOrderHeader.addRange(fieldNum(DFTPurchaseOrderHeader, IntegrationStatus));
    
                    qbrIntStatus.value(strFmt('%1', SysQuery::value(DFTIntegrationStatus::InProcessing)));
                    qbrIntStatus.status(RangeStatus::Locked);
                }
                else
                {
                    qbdsPurchOrderHeader = query.dataSourceTable(tableNum(DFTPurchaseOrderHeader));
                }
    
                qbdsPurchOrderHeader.addSortField(fieldNum(DFTPurchaseOrderHeader, PurchId), SortOrder::Ascending);
                qbdsPurchOrderHeader.addSortField(fieldNum(DFTPurchaseOrderHeader, RecId), SortOrder::Ascending);
    
                QueryRun queryRun = new QueryRun(query);
    
                while (queryRun.next()) // this query iterates all purchase orders one after another
                {
                    boolean     isProcessed;
                    int         infologStartIndex   = infolog.num()+1; //*** important for each new record
    
                    purchaseOrderHeader             = queryRun.get(tableNum(DFTPurchaseOrderHeader));
                    
                    recordCounter++;
    
                    try
                    {
                        ttsbegin;
                        retryCount = xSession::currentRetryCount();
                        this.createPurchOrderHeader(purchaseOrderHeader);
                        this.confirmPurchOrder(purchaseOrderHeader.PurchId);
                        info(strFmt("Purchase order %1 created and confirmed successfully", purchaseOrderHeader.PurchId));
                        infologData = infolog.copy(infologStartIndex, infologLine());
                        str mergedInfo = getInfologMessages.getAllMessages(infologData);
    
                        retryCount = retryCount == 0 ? 1 : retryCount;
    
                        this.updatePurchaseOrderLogFields(purchaseOrderHeader, retryCount, mergedInfo, DFTIntegrationStatus::Processed, infologData);
                        ttscommit;
    
                        isProcessed = true;
                    }
                    catch (Exception::Deadlock)
                    {
                        if (appl.ttsLevel() == 0)
                        {
                            if (xSession::currentRetryCount() >= #RetryNum)
                            {
                                error(strFmt("There is still a deadlock after retrying %1 times for purchase order %2. Try again later.", #RetryNum, purchaseOrderHeader.PurchId));
    
                            }
                            else
                            {
                                retry;
                            }
                        }
                        else
                        {
                            error(strFmt("There is a deadlock for purchase order %1. Try again later.", purchaseOrderHeader.PurchId));
                            retryCount = retryCount == 0 ? 1 : retryCount;
                        }
                    }
                    catch (Exception::CLRError)
                    {
                        System.Exception ex = ClrInterop::getLastException();
                        if (ex != null)
                        {
                            ex = ex.get_InnerException();
                            if (ex != null)
                            {
                                error(ex.ToString());
                            }
                        }
    
                        warning(strFmt("An exception occurred for purchase order %1.", purchaseOrderHeader.PurchId));
                        retryCount = retryCount == 0 ? 1 : retryCount;
                    }
                    catch
                    {
                        warning(strFmt("An exception occurred for purchase order %1.", purchaseOrderHeader.PurchId));
                        retryCount = retryCount == 0 ? 1 : retryCount;
    
                    }
                    finally
                    {
                        infologData = infolog.copy(infologStartIndex, infologLine()); //*** important
                        DFTIntegrationStatusMessage mergedInfo = getInfologMessages.getAllMessages(infologData); // get all messages in string format //*** important
    
                        if (isProcessed)
                        {
                            this.updatePurchaseOrderLogFields(purchaseOrderHeader, retryCount, mergedInfo, DFTIntegrationStatus::Processed, infologData);
                        }
                        else
                        {
                            this.updatePurchaseOrderLogFields(purchaseOrderHeader, retryCount, mergedInfo, DFTIntegrationStatus::Error, infologData);
                        }
                    }
                }
    
            info(strFmt("Batch job ended. Number of processed records: %1", recordCounter));
        }
    
    
        private void updatePurchaseOrderLogFields(DFTPurchaseOrderHeader _purchaseOrderHeader, int _retryCount, DFTIntegrationStatusMessage _mergedInfo, DFTIntegrationStatus _integrationStatus, InfologData _infologData)
        {
            ttsbegin;
            _purchaseOrderHeader.selectForUpdate(true);
    
            _purchaseOrderHeader.RetryCountReference = _purchaseOrderHeader.RetryCountReference + _retryCount;
            _purchaseOrderHeader.RetryCount = _purchaseOrderHeader.RetryCountReference;
    
            if (_purchaseOrderHeader.RetryCount > 0)
            {
                _purchaseOrderHeader.RetryDateTime = DateTimeUtil::utcNow();
            }
    
            _purchaseOrderHeader.IntegrationStatus = _integrationStatus;
            _purchaseOrderHeader.Message = _mergedInfo; //*** important
            _purchaseOrderHeader.ErrorMessagesContainer = _infologData; //*** important
            _purchaseOrderHeader.BatchJobId = currentBatchJobId;
            _purchaseOrderHeader.StatusChangeDateTime = DateTimeUtil::utcNow();
            _purchaseOrderHeader.update();
            ttscommit;
        }
    
    }
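
    For completeness, a service like this is typically launched through a SysOperation controller. The minimal sketch below is illustrative (the controller class is not part of the original post) and runs the service synchronously; schedule it as a batch job from the standard dialog to populate currentBatchJobId.

    // Illustrative controller sketch for running DFTPurchOrderService.
    public class DFTPurchOrderController extends SysOperationServiceController
    {
        public static void main(Args _args)
        {
            DFTPurchOrderController controller = new DFTPurchOrderController(
                classStr(DFTPurchOrderService),
                methodStr(DFTPurchOrderService, processOperation),
                SysOperationExecutionMode::Synchronous);

            controller.parmDialogCaption("Create and confirm purchase orders");
            controller.startOperation();
        }
    }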

    5. Displaying Errors in the UI:

    • Add logic on your form to allow users to view error logs per record.
    • You can leverage both:
      • DFTGetInfologMessages class (to extract structured messages)
      • ErrorMessagesContainer field (to store and retrieve messages)

    6. Enhance the Form with a Log Viewer:

    • Add a new button on your form named “Show log”.
    • Implement the required logic in the clicked() method to fetch and display error details using the DFTGetInfologMessages class.
    public void clicked()
    {
        super();
        
        DFTGetInfologMessages getInfologMessages = DFTGetInfologMessages::construct();
        container errorCon = getInfologMessages.getExceptionMessagesContainer(DFTPurchaseOrderHeader.ErrorMessagesContainer);
    
        if (conLen(errorCon) > 1)
        {
            setPrefix(strFmt("Infolog for purchase order %1", DFTPurchaseOrderHeader.PurchId));
            infolog.import(errorCon);
            infolog.import([[1], [0, 'End of log.']]);
        }
        else
        {
            Message::Add(MessageSeverity::Informational, "No error log for selected record.");
        }
    }

    The DFTGetInfologMessages class is a versatile utility that can be reused across various customizations—anywhere detailed error logging is required, from integrations to validations.


    📬 Stay tuned!
    In the next blog post, we’ll demonstrate how to extend this class to send automated email alerts for batch job failures—making your monitoring smarter and more proactive.

  • Import Database .bacpac file on D365 DEV Environment

    Hello folks, this blog will help you import a database .bacpac file into a D365 DEV environment. Whenever we export the database from a UAT environment to the Asset Library, it is always saved in .bacpac format. It can easily be imported into any DEV environment using the steps below.

    Step 1: Stop the following services –
    a. World Wide Web Publishing Service
    b. Management Reporter 2012 Process Service
    c. Microsoft Dynamics 365 Unified Operations: Batch Management Service

    Step 2: Rename the existing AxDB database using the SQL query below –

    USE master;
    GO
    ALTER DATABASE AxDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    GO
    ALTER DATABASE AxDB MODIFY NAME = AxDB_Old;
    GO
    ALTER DATABASE AxDB_Old SET MULTI_USER;
    GO

    Step 3: Run Command Prompt as administrator and execute the commands below one after another –

    cd C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin
    SqlPackage.exe /a:import /sf:J:\UATDbBackup\UATBackup_27Apr\AXDB.bacpac /tsn:localhost /tdn:AxDB_New /p:CommandTimeout=1200

    In the second command, /sf specifies the path where the .bacpac file is stored and /tdn specifies the name of the new database created by the import (the 140 in the path varies with the installed SQL Server/DAC version).
    Wait until the command completes successfully; the import operation takes approximately 2-3 hours.

    Step 4: Once the command from Step 3 completes successfully, navigate to SQL Server and rename the newly imported database using the SQL query below –

    USE master;
    GO
    ALTER DATABASE AxDB_New SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    GO
    ALTER DATABASE AxDB_New MODIFY NAME = AxDB;
    GO
    ALTER DATABASE AxDB SET MULTI_USER;
    GO

    Step 5: Once the imported database is renamed to AxDB, run the SQL query below against it to re-create the service users and reset environment-specific data –

    CREATE USER axdeployuser FROM LOGIN axdeployuser
    EXEC sp_addrolemember 'db_owner', 'axdeployuser'
    
    CREATE USER axdbadmin FROM LOGIN axdbadmin
    EXEC sp_addrolemember 'db_owner', 'axdbadmin'
    
    CREATE USER axmrruntimeuser FROM LOGIN axmrruntimeuser
    EXEC sp_addrolemember 'db_datareader', 'axmrruntimeuser'
    EXEC sp_addrolemember 'db_datawriter', 'axmrruntimeuser'
    
    CREATE USER axretaildatasyncuser FROM LOGIN axretaildatasyncuser
    EXEC sp_addrolemember 'DataSyncUsersRole', 'axretaildatasyncuser'
    
    CREATE USER axretailruntimeuser FROM LOGIN axretailruntimeuser
    EXEC sp_addrolemember 'UsersRole', 'axretailruntimeuser'
    EXEC sp_addrolemember 'ReportUsersRole', 'axretailruntimeuser'
    
    CREATE USER axdeployextuser FROM LOGIN axdeployextuser
    EXEC sp_addrolemember 'DeployExtensibilityRole', 'axdeployextuser'
    
    CREATE USER [NT AUTHORITY\NETWORK SERVICE] FROM LOGIN [NT AUTHORITY\NETWORK SERVICE]
    EXEC sp_addrolemember 'db_owner', 'NT AUTHORITY\NETWORK SERVICE'
    
    UPDATE T1
    SET T1.storageproviderid = 0
        , T1.accessinformation = ''
        , T1.modifiedby = 'Admin'
        , T1.modifieddatetime = getdate()
    FROM docuvalue T1
    WHERE T1.storageproviderid = 1 --Azure storage
    
    DROP PROCEDURE IF EXISTS SP_ConfigureTablesForChangeTracking
    DROP PROCEDURE IF EXISTS SP_ConfigureTablesForChangeTracking_V2
    GO
    -- Begin Refresh Retail FullText Catalogs
    DECLARE @RFTXNAME NVARCHAR(MAX);
    DECLARE @RFTXSQL NVARCHAR(MAX);
    DECLARE retail_ftx CURSOR FOR
    SELECT OBJECT_SCHEMA_NAME(object_id) + '.' + OBJECT_NAME(object_id) fullname FROM SYS.FULLTEXT_INDEXES
        WHERE FULLTEXT_CATALOG_ID = (SELECT TOP 1 FULLTEXT_CATALOG_ID FROM SYS.FULLTEXT_CATALOGS WHERE NAME = 'COMMERCEFULLTEXTCATALOG');
    OPEN retail_ftx;
    FETCH NEXT FROM retail_ftx INTO @RFTXNAME;
    
    BEGIN TRY
        WHILE @@FETCH_STATUS = 0 
        BEGIN 
            PRINT 'Refreshing Full Text Index ' + @RFTXNAME;
            EXEC SP_FULLTEXT_TABLE @RFTXNAME, 'activate';
            SET @RFTXSQL = 'ALTER FULLTEXT INDEX ON ' + @RFTXNAME + ' START FULL POPULATION';
            EXEC SP_EXECUTESQL @RFTXSQL;
            FETCH NEXT FROM retail_ftx INTO @RFTXNAME;
        END
    END TRY
    BEGIN CATCH
        PRINT error_message()
    END CATCH
    
    CLOSE retail_ftx; 
    DEALLOCATE retail_ftx; 
    -- End Refresh Retail FullText Catalogs

    Step 6: Once the query above has executed, run the statements below one after another –

    ALTER DATABASE [AxDB] SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 6 DAYS, AUTO_CLEANUP = ON);

    UPDATE DMFDataSource
    SET ENTITYSTORECONNECTIONSTRING = ''
    WHERE ENTITYSTORECONNECTIONSTRING <> '';

    TRUNCATE TABLE IntegrationActivityTable;

    Step 7: Start all the services that were stopped in Step 1.

    Step 8: Open Visual Studio and synchronize the database using the menu under Dynamics 365.

    That’s it !!!

    To export the database from UAT to the Asset Library, you can refer to the Microsoft docs link given below.

    Thank you!