Saturday, December 3, 2022

Downloading Files from FTP and Doing Something With Them In D365FO

Although D365FO is now thoroughly modern and web-based, with all the good things that entails, we still have to interface with some less modern systems. Sometimes those systems are literally older than we are. There are some middleware solutions that can help bridge the gap, but we have quite a few tools in D365FO that can help us. Let's talk about FTP for a bit. The tried and true battle horse of Odin himself, this is never going to go away, ever. So let's see what we can do from inside D365FO with no C# in play, just some standard tools available from X++. Consider this very simple example:

using System.Net;

class AAX_FTPDownloadSimple
{
    public static void main(Args args)
    {
        // WebClient does all of the FTP plumbing for us
        System.Net.WebClient webClient = new System.Net.WebClient();

        // Hard-coded credentials and paths - fine for a quick test, not for real code
        webClient.Credentials = new NetworkCredential("userName", "password");

        // Download straight to a local path on the box (dev box only)
        webClient.DownloadFile("ftp://nadummy.sharefileftp.com/test/test.txt", @"C:\temp\test.txt");

        info("complete");
    }
}

Three lines of code to download a file and place it in an arbitrary location. It's not particularly helpful as we have to know the source of the file and we have lots of hard coding. However, we now have a file that we could use TextIo on, if we wanted. I'm positive this is a dev-box-only example, as you wouldn't have that kind of access to the underlying server disk on anything that isn't a one box.
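As a quick aside, here's a minimal sketch of what that TextIo follow-up could look like, reading the just-downloaded file back line by line (same hard-coded path as above, dev box only):

// Read the file we just downloaded with TextIo
TextIo reader = new TextIo(@"C:\temp\test.txt", 'r');
container line = reader.read();

while (reader.status() == IO_Status::Ok)
{
    info(con2Str(line));
    line = reader.read();
}

reader.finalize();

With that out of the way, let's look at something a little more feature rich.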

using System.IO;
using WebClient = System.Net.WebClient;
using NetworkCredential = System.Net.NetworkCredential;

class AAX_FileDownloadIntermediate
{
    public static void main(Args args)
    {
        TextIo textIo;

        str inputfilepath = System.IO.Path::GetTempPath() + System.IO.Path::GetRandomFileName();
        str ftphost = "nadummy.sharefileftp.com";
        str ftpfilepath = "/test/test.txt";

        str ftpfullpath = "ftp://" + ftphost + ftpfilepath;

        using (WebClient request = new WebClient())
        {
            request.Credentials = new NetworkCredential("userName", "password");

            // Pull the file down as a byte array rather than writing it straight to disk
            System.Byte[] fileData = request.DownloadData(ftpfullpath);

            // ...then persist those bytes to a temp file ourselves
            using (System.IO.FileStream file = System.IO.File::Create(inputfilepath))
            {
                file.Write(fileData, 0, fileData.Length);
                file.Close();
            }
            info("download Complete");
        }

        // Open the temp file with TextIo - familiar territory for X++ developers
        textIo = new TextIo(inputfilepath, "r");

        info(con2Str(textIo.read()));

        textIo.finalize();

        // Clean up the temp file once we're done with it
        System.IO.File::Delete(inputfilepath);

        info("Complete");
    }

}

Similar to the last one, but this time we download the data rather than a file directly, plus a few other tweaks. First, we're using webClient.DownloadData rather than webClient.DownloadFile and then writing the result to a FileStream ourselves. Next, that FileStream goes to temporary (.NET) disk storage, which we then open with TextIo; we've had TextIo for years, so we're working with something familiar. Once done, we delete the temp file. I'm not sure if this would work in a non-onebox environment; .NET may be aware of its deployment type and sort itself out, but I can't really say, so definitely test in a Tier 2 environment. Finally, we're using the using statement in a way most X++ people may be unfamiliar with: it makes sure the WebClient and FileStream get disposed when their blocks end.
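If that pattern is new to you, here's a minimal standalone sketch of the idea (the file path is just an assumption for illustration):

// 'using' guarantees Dispose() is called on the .NET object when the block ends,
// even if an exception is thrown inside it
using (System.IO.StreamReader reader = new System.IO.StreamReader(@"C:\temp\test.txt"))
{
    info(reader.ReadToEnd());
}   // reader is disposed here automatically

Next, let's keep on adding stuff: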

using FtpWebRequest = System.Net.FtpWebRequest;
using WebRequest = System.Net.WebRequest;
using NetworkCredential = System.Net.NetworkCredential;
using WebRequestMethods = System.Net.WebRequestMethods;
using Stream = System.IO.Stream;

class AAX_FileDownloadAdvanced
{
    public static void main(Args args)
    {
        TextStreamIo textStreamIo;

        System.Byte[] buffer = new System.Byte[10240]();
 
        System.IO.Stream fileStream = new System.IO.MemoryStream();

        str targetFile = System.IO.Path::GetRandomFileName();
        FtpWebRequest request = WebRequest::Create("ftp://nadummy.sharefileftp.com/test/test.txt");
        request.Credentials = new NetworkCredential("userName", "password");    

        using (Stream ftpStream = request.GetResponse().GetResponseStream())
        {
            int read = 0;

            // Read the response in 10 KB chunks so we can log progress as we go
            while (true)
            {
                read = ftpStream.Read(buffer, 0, buffer.Length);
                if(read > 0)
                {
                    fileStream.Write(buffer, 0, read);
                    info(strFmt("Downloaded %1 bytes", fileStream.Position));
                }
                else
                {
                    break;
                }
            }
        }

        // Rewind the stream before reading it back - we just finished writing to it
        fileStream.Seek(0, System.IO.SeekOrigin::Begin);

        textStreamIo = TextStreamIo::constructForRead(fileStream);

        info(con2Str(textStreamIo.read()));

        textStreamIo.finalize();
        fileStream.Dispose();

        info("Complete");
    }

}

Here we have quite a few differences. First, we're using a WebRequest rather than a WebClient. WebRequests support encryption, so FTPS becomes an option (SFTP runs over SSH and still needs a third-party library). Next, we're grabbing chunks from our source file so we can report on its progress; if we had a large file, this would be helpful for logging what is happening. Next, we're taking the data we placed in a Stream and interacting with it without placing it in any type of local storage, temporary or otherwise. You could use File::SendFileToTempStore() and File::UseFileFromURL() to store/retrieve the data in/from blob storage if you like. Finally, we're passing the stream to TextStreamIo, the stream-based counterpart of TextIo.
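As a rough sketch of that blob storage option (hedged: I'm assuming the SendFileToTempStore overload that takes a memory stream plus a file name and returns a URL, and UseFileFromURL returning a stream for that URL - check the signatures on your version):

// Park the downloaded bytes in temporary blob storage instead of local disk
// (fileData here stands for the downloaded byte array, as in the intermediate example)
System.IO.MemoryStream memoryStream = new System.IO.MemoryStream(fileData);
str tempStoreUrl = File::SendFileToTempStore(memoryStream, 'test.txt');

// ...and pull it back later as a stream, still with no local storage involved
System.IO.Stream blobStream = File::UseFileFromURL(tempStoreUrl);
info(con2Str(TextStreamIo::constructForRead(blobStream).read()));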

D365 – Data Entity Method Call Sequence

 




1. initValue
2. validateField
3. validateWrite
4. update
4.1. doUpdate
4.1.1. persistEntity
4.1.1.1. doPersistEntity
4.1.1.1.1. initializeDataSources
4.1.1.1.1.1. initializeEntityDataSource
Note: initializeEntityDataSource is called once for each data source in the entity.
4.1.1.1.2. mapEntityToDataSources
Note: mapEntityToDataSource is called once for each data source in the entity.
4.1.1.1.3. saveDataSources
4.1.1.1.3.1. updateEntityDataSource
4.1.1.1.4. mapEntityToDataSource (maybe for another record)
4.1.1.1.5. saveDataSources
4.1.1.1.5.1. updateEntityDataSource for an update operation (insertEntityDataSource for an insert)
4.1.1.1.5.1.1. mapDataSourceToEntity
4.1.1.1.5.1.2. doSaveDataSource
4.1.1.1.5.1.2.1. updateDataSource
4.1.1.1.5.1.2.1.1. preupInsertDataSource
4.1.1.1.5.1.2.1.1.1. validateWrite of table
Plus:
postLoad
This method is called during export to set values for unmapped (virtual) fields after the entity record has been populated from its data sources.
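A minimal sketch of how postLoad is typically used on an entity (the field names here are hypothetical):

// On the data entity: populate an unmapped (virtual) field after the record is loaded
public void postLoad()
{
    super();

    // Hypothetical virtual field, derived from mapped data after the record is read
    this.FullName = this.FirstName + ' ' + this.LastName;
}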

EXPORT:
       Entity- postLoad()
       staging - insert()
       Entity- postLoad() - depends on records

IMPORT:
       staging - postLoad()
       Entity - postLoad()
       Entity - initValue()
       Entity - validateField() - depends on no of fields
       Entity - validateWrite()
       Entity - insert() / update()
       Entity - persistEntity()
       Entity - initializeEntityDataSource()
       Entity - mapEntityToDataSource()
       Entity - insertEntityDataSource() / updateEntityDataSource()
       Entity - mapDataSourceToEntity()
       staging - postLoad()

Here are some additional methods called during import:

defaultCTQuery
copyCustomStagingToTarget
postGetStagingData
preTargetProcessSetBased
postTargetProcess

My notes:

  1. The postLoad method is also called during import! Since postLoad is the recommended place to set values for virtual fields, this can potentially slow down the import process unnecessarily.
  2. Be careful when using the postTargetProcess method! It is called at the end of EACH thread/task bundle if you are using the “Import threshold record count” option.
  3. Previously, you could add postTargetProcess only to a newly created entity, but now you can extend any entity using CoC, as shown below.

The extension is simple:

[ExtensionOf(tableStr(DataEntity))]    // DataEntity = the entity you want to extend
final public class DataEntity_Extension
{

    public static void postTargetProcess(DMFDefinitionGroupExecution _dmfDefinitionGroupExecution)
    {
        // Do not call next - postTargetProcess is a static method, not a regular CoC wrapper
        // logic here
    }
}

Please note that this works only in data management (DMF) scenarios, not via OData, because OData inserts and updates records row by row and there is no post event/method to hook into. You might wish to use an OData action to perform post actions on records.
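For completeness, a hedged sketch of such an OData action, defined as a method directly on the entity (the action name and field are hypothetical):

// Instance-bound OData action: callers invoke it on a specific entity record after their writes
[SysODataActionAttribute("markAsProcessed", true)]
public boolean markAsProcessed()
{
    this.ProcessedFlag = NoYes::Yes;
    this.update();
    return true;
}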

Saturday, August 27, 2022

Running x++ scripts and jobs in an environment with 0 downtime

 

This blog post explores the new Microsoft feature that allows you to run X++ jobs on PROD and non-PROD environments without a release and with zero downtime.

In 10.0.25 you can run simple X++ scripts on a production environment without any downtime. This feature lets you run custom X++ scripts without having to go through Dynamics LCS or suspend your system. Therefore, you can correct minor data inconsistencies without causing any disruptive downtime.


To be able to use this feature, you need to create a deployable package with ONLY ONE runnable class in it. To do this, you can create a new model and reference the models whose objects you will use in the X++ script.

Dynamics 365 > Model Management > Create model (e.g. CustomModel)

Then create the runnable class and generate a deployable package from Visual Studio containing only your new model with its single runnable X++ class.


class SLWrongVendorRunnableClass
{
    /// <summary>
    /// Class entry point. The system will call this method when a designated menu
    /// is selected or when execution starts and this class is set as the startup class.
    /// </summary>
    /// <param name = "_args">The specified arguments.</param>
    public static void main(Args _args)
    {
        VendInvoiceJour vendInvoiceJour;
        PurchTable      purchTable;

        // Guard: only run in the intended legal entity
        if (curExt() != 'XXX')
        {
            throw error("This script must run in the XXX company!");
        }

        ttsbegin;

        // Correct the order account on the posted vendor invoice journal
        select forUpdate vendInvoiceJour
            where vendInvoiceJour.PurchId == 'PO/000/000XXX' && vendInvoiceJour.InvoiceAccount == '64';
        if (vendInvoiceJour)
        {
            vendInvoiceJour.OrderAccount = '64';
            vendInvoiceJour.doUpdate();
        }

        // Correct the order account on the purchase order header
        select forUpdate purchTable
            where purchTable.PurchId == 'PO/000/000XXX' && purchTable.InvoiceAccount == '64';
        if (purchTable)
        {
            purchTable.OrderAccount = '64';
            purchTable.doUpdate();
        }

        // Sanity checks: abort the transaction if we did not hit exactly one row per table
        if (vendInvoiceJour.RowCount() != 1)
        {
            throw error("OrderAccount update did not affect the expected row in VendInvoiceJour!");
        }

        if (purchTable.RowCount() != 1)
        {
            throw error("OrderAccount update did not affect the expected row in PurchTable!");
        }

        info("Success");

        ttscommit;
    }
}

After generating the deployable package you can go to System administration – Periodic tasks – Database – Custom scripts.
Here you can upload your deployable package.

If your deployable package does not consist of exactly one runnable class, you won't be able to upload it and a warning message will be displayed.
After successfully uploading the deployable package, another user (different from the one who uploaded it) must approve the script in the process workflow.

After the approval, you should follow the workflow and run a test.

If the test log is okay and contains no errors, you need to accept the test log and then you will be able to run the script.


After the execution of the script has completed, the log is updated and you need to end the workflow by marking the purpose as resolved, marking it as unresolved, or abandoning it.

This is a helpful feature from Microsoft if you need to do some small data corrections or delete data from certain tables. This way you don't have to raise a Microsoft ticket to do small data corrections on the PROD database or connect to the ACC or other Tier 2 environment databases.
Also, runnable classes with dialogs can be uploaded here, not just simple data correction scripts.
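As an illustration of that last point, a small sketch of a runnable class with a dialog (the class name and prompt are hypothetical):

class SLDialogFixRunnableClass
{
    public static void main(Args _args)
    {
        // Ask the operator which purchase order to touch before doing anything
        Dialog      dialog       = new Dialog("Data correction");
        DialogField purchIdField = dialog.addField(extendedTypeStr(PurchId), "Purchase order");

        if (dialog.run())
        {
            info(strFmt("Would correct purchase order %1", purchIdField.value()));
        }
    }
}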












Friday, August 26, 2022

Restore/Import Database with RETAIL functionality

 Restoring a Dynamics 365 finops database has become much easier with the export database functionality in LCS. More details here

 

With the new feature in LCS, the database is exported out of Azure SQL and a bacpac file is added to the asset library.

After the database is exported the database file can be seen in the Asset library.

The following steps can be used to import the database to a development environment for Retail testing/debugging:

 

1) Restore the bacpac file

 

SqlPackage.exe /a:import /sf:D:\EXPORTED-DB.bacpac /tsn:localhost /tdn:AxDB_Copy /p:CommandTimeout=1200

 

2) Stop the World Wide Web Publishing Service, the Batch service, DIXF, and the Management Reporter service.

 

3) Rename the original database to AxDB_Orig and the imported database to AxDB.

 

4) Start the services stopped in step 2.

 

5) In the imported database, change tracking will automatically be turned off at the database level. Enable change tracking on AxDB. This can easily be done from the database properties in a Tier 1 environment.

 

Note: For Retail, the change tracking details are updated after running Initialize retail scheduler (this is done in step 9).

 

6) Run a build and a database sync. Ensure there are no errors.

 

7) Run the script below to add users. This script is needed for running the environment retarget tool. If the users already exist in the database, you can ignore the errors when running the script.

 

CREATE USER [axdbadmin] FOR LOGIN [axdbadmin]

CREATE USER [axdeployuser] FOR LOGIN [axdeployuser]

CREATE USER [axmrruntimeuser] FOR LOGIN [axmrruntimeuser]

CREATE USER [NT AUTHORITY\NETWORK SERVICE] FOR LOGIN [NT AUTHORITY\NETWORK SERVICE]

CREATE USER [axretaildatasyncuser] FOR LOGIN [axretaildatasyncuser]

CREATE USER [axretailruntimeuser] FOR LOGIN [axretailruntimeuser]

 

ALTER ROLE [db_owner] ADD MEMBER [axdbadmin]

ALTER ROLE [db_owner] ADD MEMBER [axdeployuser]

ALTER ROLE [db_datareader] ADD MEMBER [axmrruntimeuser]

ALTER ROLE [db_datawriter] ADD MEMBER [axmrruntimeuser]

ALTER ROLE [db_datareader] ADD MEMBER [NT AUTHORITY\NETWORK SERVICE]

ALTER ROLE [db_datawriter] ADD MEMBER [NT AUTHORITY\NETWORK SERVICE]

ALTER ROLE [db_ddladmin] ADD MEMBER [NT AUTHORITY\NETWORK SERVICE]

ALTER ROLE [DataSyncUsersRole] ADD MEMBER [axretaildatasyncuser]

ALTER ROLE [ReportUsersRole] ADD MEMBER [axretailruntimeuser]

ALTER ROLE [UsersRole] ADD MEMBER [axretailruntimeuser]

 

8) Run the environment retarget tool from LCS.

In your project's Asset Library, in the Software deployable packages section, click Import.

From the list of shared assets, select the Environment retarget Tool.

On the Environment details page for your target environment, select Maintain > Apply updates.

Select the Environment retarget tool that you uploaded earlier, and then select Apply to apply the package.

Monitor the progress of the package deployment.

 

9) Initialize the retail scheduler with "Delete existing configuration data" set to Yes. Setting this to Yes deletes the existing retail change tracking configuration and creates the base change tracking configuration for the sync jobs again. It also picks up any new mappings from the resource files.

 

Retail > Headquarters setup > Retail scheduler > Initialize retail scheduler

 

10) Update the identity providers to point to your tenant.

 

Retail > Headquarters setup > parameters > Retail shared parameters

 

11) Update the identity on the worker. The identity on the worker is required for device activation.

https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/retail-modern-pos-device-activation

  

12) Run the 9999 sync job to sync all data with the channel database.

Export a copy of the standard user acceptance testing (UAT) database

 https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/database/dbmovement-scenario-exportuat


Import the database:

  • Keep a copy of the existing AxDB database, so that you can revert to it later if you must.
  • Import the new database under a new name, such as AxDB_fromProd.

Open a Command Prompt window, and run the following commands from the sqlpackage .NET Core folder.

Download sqlpackage .NET Core for Windows from Get sqlpackage .NET Core for Windows

SqlPackage.exe /a:import /sf:D:\Exportedbacpac\my.bacpac /tsn:localhost /tdn:<target database name> /p:CommandTimeout=1200


Update the database


Remember to edit the final ALTER DATABASE statement so that it uses the name of your database.

CREATE USER axdeployuser FROM LOGIN axdeployuser
EXEC sp_addrolemember 'db_owner', 'axdeployuser'

CREATE USER axdbadmin FROM LOGIN axdbadmin
EXEC sp_addrolemember 'db_owner', 'axdbadmin'

CREATE USER axmrruntimeuser FROM LOGIN axmrruntimeuser
EXEC sp_addrolemember 'db_datareader', 'axmrruntimeuser'
EXEC sp_addrolemember 'db_datawriter', 'axmrruntimeuser'

CREATE USER axretaildatasyncuser FROM LOGIN axretaildatasyncuser

CREATE USER axretailruntimeuser FROM LOGIN axretailruntimeuser

CREATE USER axdeployextuser FROM LOGIN axdeployextuser

CREATE USER [NT AUTHORITY\NETWORK SERVICE] FROM LOGIN [NT AUTHORITY\NETWORK SERVICE]
EXEC sp_addrolemember 'db_owner', 'NT AUTHORITY\NETWORK SERVICE'

UPDATE T1
SET T1.storageproviderid = 0
    , T1.accessinformation = ''
    , T1.modifiedby = 'Admin'
    , T1.modifieddatetime = getdate()
FROM docuvalue T1
WHERE T1.storageproviderid = 1 --Azure storage

DROP PROCEDURE IF EXISTS SP_ConfigureTablesForChangeTracking
DROP PROCEDURE IF EXISTS SP_ConfigureTablesForChangeTracking_V2
GO
-- Begin Refresh Retail FullText Catalogs
DECLARE @RFTXNAME NVARCHAR(MAX);
DECLARE @RFTXSQL NVARCHAR(MAX);
DECLARE retail_ftx CURSOR FOR
SELECT OBJECT_SCHEMA_NAME(object_id) + '.' + OBJECT_NAME(object_id) fullname FROM SYS.FULLTEXT_INDEXES
    WHERE FULLTEXT_CATALOG_ID = (SELECT TOP 1 FULLTEXT_CATALOG_ID FROM SYS.FULLTEXT_CATALOGS WHERE NAME = 'COMMERCEFULLTEXTCATALOG');
OPEN retail_ftx;
FETCH NEXT FROM retail_ftx INTO @RFTXNAME;

BEGIN TRY
    WHILE @@FETCH_STATUS = 0 
    BEGIN 
        PRINT 'Refreshing Full Text Index ' + @RFTXNAME;
        EXEC SP_FULLTEXT_TABLE @RFTXNAME, 'activate';
        SET @RFTXSQL = 'ALTER FULLTEXT INDEX ON ' + @RFTXNAME + ' START FULL POPULATION';
        EXEC SP_EXECUTESQL @RFTXSQL;
        FETCH NEXT FROM retail_ftx INTO @RFTXNAME;
    END
END TRY
BEGIN CATCH
    PRINT error_message()
END CATCH

CLOSE retail_ftx; 
DEALLOCATE retail_ftx; 
-- End Refresh Retail FullText Catalogs

--Begin create retail channel database record--
declare @ExpectedDatabaseName nvarchar(64) = 'Default';
declare @DefaultDataGroupRecId BIGINT;
declare @ExpectedDatabaseRecId BIGINT; 
IF NOT EXISTS (select 1 from RETAILCONNDATABASEPROFILE where NAME = @ExpectedDatabaseName)
BEGIN 
select @DefaultDataGroupRecId = RECID from RETAILCDXDATAGROUP where NAME = 'Default'; 
insert into RETAILCONNDATABASEPROFILE (DATAGROUP, NAME, CONNECTIONSTRING, DATASTORETYPE)
values (@DefaultDataGroupRecId, @ExpectedDatabaseName, NULL, 0); 
select @ExpectedDatabaseRecId = RECID from RETAILCONNDATABASEPROFILE where NAME = @ExpectedDatabaseName; 
insert into RETAILCDXDATASTORECHANNEL (CHANNEL, DATABASEPROFILE)
select RCT.RECID, @ExpectedDatabaseRecId from RETAILCHANNELTABLE RCT
inner join RETAILCHANNELTABLEEXT RCTEX on RCTEX.CHANNEL = RCT.RECID
        update RETAILCHANNELTABLEEXT set LIVECHANNELDATABASE = @ExpectedDatabaseRecId where LIVECHANNELDATABASE = 0
END; 
--End create retail channel database record

Turn on change tracking


To turn on change tracking, use the ALTER DATABASE command.

ALTER DATABASE [your database name] SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 6 DAYS, AUTO_CLEANUP = ON);

Start to use the new database

To switch environments and use the new database, first stop the following services:

  • World Wide Web Publishing Service
  • Microsoft Dynamics 365 Unified Operations: Batch Management Service
  • Management Reporter 2012 Process Service

After these services have been stopped, rename the AxDB database to AxDB_orig, rename your newly imported database to AxDB, and then restart the three services.

To switch back to the original database, reverse this process. In other words, stop the services, rename the databases, and then restart the services.



Post steps for Commerce environments


If you are using Commerce channels, when importing a database that was originally exported from a self-service sandbox into a developer environment, the following additional steps must be performed on the destination developer environment. Without completing these steps, Commerce channels will not function.

  1. To restore Commerce channels functionality, apply the latest Microsoft service update or quality update, which will create the channel database.
  2. To restore any previously deployed channel database extensions, re-apply the corresponding Retail self-service deployable package.

Reprovision the target environment


To run the Environment re-provisioning tool on the destination environment, follow these steps:

  1. In your project's Asset Library, in the Software deployable packages section, select Import.
  2. From the list of shared assets, select the Environment Reprovisioning Tool.
  3. On the Environment details page for your destination environment, select Maintain > Apply updates.
  4. Select the Environment Reprovisioning tool that you uploaded earlier, and then select Apply to apply the package.
  5. Monitor the progress of the package deployment.

The transaction log for database 'AXDB' is full due to 'LOG_BACKUP'

USE AXDB;
GO

-- Truncate the log by changing the database recovery model to SIMPLE.
ALTER DATABASE AXDB SET RECOVERY SIMPLE;
GO

-- Shrink the truncated log file to 50 MB (use the logical name of your database's log file).
DBCC SHRINKFILE (ProdAXDB_log, 50);
GO

-- Reset the database recovery model.
ALTER DATABASE AXDB SET RECOVERY FULL;
GO