Please!… Implement Different Behavior in Production versus Sandbox Dynamics 365 F&O Environments

Write code that behaves differently in production versus sandbox D365 environments.

From the customer's or user's side, it can be a big problem when customizations or ISV code do things like calling external web services or sending alert emails. Frequently, there's some kind of configuration option to change that behavior: for example, a URL is stored in the database and can be changed through the GUI.

But, I can almost guarantee that every customer will be bitten at least once by doing a database refresh and accidentally sending test data out as though it were production data. Many administrators will eventually find themselves with a fairly lengthy checklist when they do a database refresh. For customers trying to do careful testing or frequent training, this can be time consuming and possibly error-prone, even if automated as much as possible.

As developers, I feel we have an obligation to do more than meet the bare minimum of requirements; we should write code that is as easy as possible for end users, saving them time when possible. It should be a matter of professionalism, if not pride, for us. Therefore, I urge you to write code that can be configured once and will work the same in both sandbox and production environments.

How do we do that?… I recommend a pattern built around the UrlUtility::GetUrl() method:

str prodAXUrlPattern = 'myprod.operations.dynamics.com';
str testAXUrlPattern = 'mytest.sandbox.operations.dynamics.com';
str prodWebService = 'production.isvwebservice.com';
str testWebService = 'test.isvwebservice.com';

if (strContains(UrlUtility::GetUrl(), prodAXUrlPattern))
{
    MyISVClass::CallWebservice(prodWebService);
}
else if (strContains(UrlUtility::GetUrl(), testAXUrlPattern))
{
    MyISVClass::CallWebservice(testWebService);
}
else
{
    throw error('This D365 environment has not been configured.');
}

Implementing something like this will require more work up front, but every administrator will thank you for foolproofing things against costly mistakes.


Automatically upload (and deploy) D365FO builds

This snuck by me until now: you can now set up Azure DevOps to, after building your Dynamics 365 for Finance & Operations code, automatically upload it to Lifecycle Services (LCS). You can even make it automatically deploy it.

Rather than repeat information that is covered elsewhere, let me point you to the announcement (from several months ago!):

https://community.dynamics.com/365/financeandoperations/b/newdynamicsax/posts/azure-devops-task-to-deploy-code

…and a nice blog post by J.A. Estevan on how to set it up:

https://www.jaestevan.com/release-your-dynamics-365-for-finance-and-operation-packages-to-lcs-with-azure-devops-pipelines-en/

“Access to the path … is denied” error when trying to get DLLs or netmodules in Visual Studio in a D365F&O development VM onebox

If you use ISVs that ship netmodules or DLLs, they can be a little tricky to merge, build, and distribute to multiple developers. Besides making sure that you “include” all the right files when you first check in their models, checking in changes later when the ISV ships an update can cause problems for other developers.

A developer trying to “get latest” in Visual Studio might get errors, telling them that they cannot overwrite existing DLLs. The error might read something like “Access to the path … is denied.” Here’s an example I get when I try to get a DLL from the ISV Avalara:

Example error: “Access to the path ‘K:\AosService\PackagesLocalDirectory\AVAModel\bin\Dynamics.AX.AVAModel.dll’ is denied.”

There are a few ways to solve this problem:

(1) Stop Services

I keep a batch file that I can “run as administrator” to do this, or you can do it manually in Services. (A second batch file changing “NET STOP” to “NET START” can restart them.)

NET STOP "MR2012ProcessService"
NET STOP "DynamicsAxBatch"
NET STOP "Microsoft.Dynamics.AX.Framework.Tools.DMF.SSISHelperService.exe"
NET STOP "W3SVC"
PAUSE
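
If you prefer PowerShell over batch files, a rough equivalent (run as administrator) might look like the sketch below; swap the Stop-Service line for the commented Start-Service line to bring everything back up afterwards.

# Hedged PowerShell sketch: stops the same services as the batch file above.
$services = 'MR2012ProcessService',
            'DynamicsAxBatch',
            'Microsoft.Dynamics.AX.Framework.Tools.DMF.SSISHelperService.exe',
            'W3SVC'
Get-Service -Name $services -ErrorAction SilentlyContinue | Stop-Service -Force
# To restart them later:
# Get-Service -Name $services -ErrorAction SilentlyContinue | Start-Service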

(2) Stop IIS Express

If stopping services isn’t good enough, you might need to stop IIS Express, which Visual Studio automatically launches every time you start it. It isn’t a service, though; you need to close it from a little tray icon. Right-click it and choose “Exit.” You’ll probably need to restart Visual Studio before you can debug again.

Right-click the IIS Express icon and choose “Exit.”

(3) Advanced “get”

Do an “advanced get” and be sure to click the checkbox to overwrite existing files.

Right-click the branch; instead of Get Latest Version, select Advanced > Get Specific Version.
Check the box for “Overwrite writable files that are not checked out” before clicking “Get.”

(4) Delete & Get

A final desperate move, after stopping the services and IIS Express, is to outright delete the files (via Windows Explorer) that it refuses to overwrite. Then it might let you get them since it doesn’t need to overwrite them.


Did these tips help? Do you know an easier way, or something I missed? Let me know in the comments!

HTTP 500 errors running Dynamics 365 for Finance and Operations in locally hosted onebox / development VM

If you are hosting your own onebox / development VM (instead of hosting it in LCS), you might suddenly discover that you get HTTP 500 errors trying to run the local instance of D365. As of this writing, it should have just started cropping up for 7.3 VMs.

This is because the certificates used in these VMs expire. Why did Microsoft include certificates that expire so quickly?… I guess they were overly optimistic about our ability to move on to later versions. But it was a problem in previous VMs, and continues to be.

Fortunately, fixing this problem is documented. (https://blogs.msdn.microsoft.com/lcs/2018/04/24/rotate-the-expired-or-nearly-expired-certificates-on-your-downloadable-vhd/) Unfortunately, that document has some minor errors, and leaves out some useful details. (It’s also hard to find sometimes.) So, to save you some of the trouble we’ve had, here are my slightly improved steps for fixing the problem. (EDIT/NOTE: Check the comments on this post for a solution that should do all of this in a single Powershell script; it’s almost definitely easier than my original steps posted here.)

(1) Run “Windows Powershell” AS AN ADMINISTRATOR from the Start menu, and paste in this command:

Get-ChildItem -Path Cert:\LocalMachine\My | Where {$_.NotAfter -lt $(Get-Date).AddMonths(2)} | Sort NotAfter | Format-Table Subject, Thumbprint, NotAfter

(2) You’ll get a list of several certificates. Record them in the first column below, leaving the second column blank until the next step. (Fortunately, the Powershell window allows you to select text in a box shape.) You might want to create your own copy of this in Notepad in your VM.

Old cert thumbprint (listed by step 1)     New cert thumbprint (generated in step 3)
-----------------------------------------  -----------------------------------------
2C669BD45BB8A0305F4F962A175FF4FD6B4CAD7C   266DB189E56BE7E036019D34994480071C47B92B
217695F9CE7033413ECB5F059FC4C4B0F439C102   EF84A2876504B1AE2AE86408733FBA8002BA0CF9
4C82C05E452D08A2BE1CC4F92DA24CF98E493F1D   193539F1E914DD9888D29AFD70E83F008FE7709F

(Note that the above thumbprints are examples. Don’t use them!)

(3) One at a time, generate a new certificate that is a copy of the expired certificate, but dated for far in the future. (999 months, to be precise.) Use this script (but replace the thumbprint obviously) for each old cert you found above:

cls
Set-Location -Path "cert:\LocalMachine\My"
$OldCert = (Get-ChildItem -Path 25BE4ECAA604B626327C071772AD02E1D3889520)
New-SelfSignedCertificate -CloneCert $OldCert -NotAfter (Get-Date).AddMonths(999)

Record the thumbprint generated for the new certificate next to the old one. You’ll need both shortly.
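
If you’d rather not clone them one at a time, a hedged sketch like this should combine steps 1 and 3: it clones every machine certificate expiring within two months and prints the old/new thumbprint pairs you need for step 4. (This is my own consolidation, not the script from the comments; run it as administrator and still record the pairs.)

# Hedged sketch combining steps 1 and 3: clone every soon-to-expire machine certificate
# and print the old/new thumbprint pairs needed for step 4. Run PowerShell as administrator.
$expiring = Get-ChildItem -Path Cert:\LocalMachine\My | Where-Object { $_.NotAfter -lt (Get-Date).AddMonths(2) }
foreach ($oldCert in $expiring)
{
    $newCert = New-SelfSignedCertificate -CloneCert $oldCert -CertStoreLocation Cert:\LocalMachine\My -NotAfter (Get-Date).AddMonths(999)
    [PSCustomObject]@{
        Subject       = $oldCert.Subject
        OldThumbprint = $oldCert.Thumbprint
        NewThumbprint = $newCert.Thumbprint
    }
}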

(4) Launch Notepad AS AN ADMINISTRATOR three times. Open the C:\AOSSERVICE\WEBROOT *.config files (web.config, wif.config, wif.services.config). Search/replace each old cert thumbprint with the new cert thumbprint.

You won’t find every thumbprint in every file. Be careful; accidentally including a trailing space, or cutting off a single character, can lead to this not working and being difficult to debug later. Sorry, this step is pretty tedious!
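
If the manual search/replace makes you nervous, a hedged sketch like this should handle one old/new pair at a time across all three files; the two thumbprints are placeholders, and the webroot path must match your VM. Run PowerShell as administrator.

# Hedged sketch for step 4: replace one old thumbprint with its new one in the three config files.
# The thumbprints are placeholders; adjust the webroot path to match your VM.
$oldThumb = 'OLD_THUMBPRINT_HERE'
$newThumb = 'NEW_THUMBPRINT_HERE'
$webroot  = 'C:\AOSSERVICE\WEBROOT'
foreach ($file in 'web.config', 'wif.config', 'wif.services.config')
{
    $path = Join-Path $webroot $file
    (Get-Content -Path $path -Raw) -replace $oldThumb, $newThumb | Set-Content -Path $path
}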

(5) Restart IIS by running the iisreset command in your Powershell window. If it works, you’re done, and should never need to repeat this for this virtual machine. (Unless you are still using it 999 months from now, which I’d hope you are not!)

D365FO pu20 union query problem: “There is a field mismatch in the union query. Field [Field Name] is not compatible with field .”

I found an exciting and obscure bug (? – “feature?”) in Dynamics 365 for Finance and Operations. It doesn’t seem to rear its head in platform update 15, but does in platform update 20. If you get the run time error message (notice the intentional space that looks like a missing word before the period at the end):

There is a field mismatch in the union query. Field [Field Name] is not compatible with field .

…then you might have made the same mistake we made.

If you are creating a union Query and you’ve set “Derived Table,” I think it was ignored in the past. However, as of platform update 20, if you set it (and there is a mismatch within the fields of a data source), you will get no error when you compile/build, but you WILL get the above error at run time.

Here’s an example showing the mistake we had:

(Screenshot: the fields of XskProjVendTransQuery and their “Derived Table” values.)

Look closely. MORE closely. For most of the columns under ProjEmplTransactionsView in XskProjVendTransQuery, the “Derived Table” is set to ProjEmplTransactionView. (No pluralizing “s.”) It is ProjEmplTransactionsView (with the “s”) for XskVendInvoiceTransRecId.

I’m not sure that “Derived Table” is even needed (what does it do?), but once it was set consistently to ProjEmplTransactionsView for all the columns where it had a value, the run time errors went away.

AXUG / D365 User Group

This will be a less focused, less problem/solution-oriented blog post than usual for me. I’m currently taking a short break while attending the Dynamics AX / 365 User Group Summit. Midway through, we’ve heard the keynote, I’ve hit a few sessions, and I’ve gotten an idea of how this thing works and what to expect next year (in Orlando, Florida, if you want to register for next year).

Attendance seems to be roughly 1/3rd IT people, with enough developers to fill sessions pretty well.

There have been sessions focused on X++ coding, but mostly higher level (you can only do so much in an hour-long session). The best session I’ve attended so far went over the difference between “overlayering” (what you’d call “customization” in older versions) and “extensions,” which I already knew, but with a useful explanation of the new-ish Chain of Command functionality. There was also a great (but again high-level) session on automated testing, something I’ve never looked at as a newer developer; I now have a rough idea of how to set up Unit Tests and Type Provider / Integration Tests, which I look forward to diving into.

Common Data Model is big. (Is this the same as Common Data Service? I’m sure Microsoft doesn’t mind any confusion. See also renaming VSTS to “Azure DevOps,” a confusing name that I really hope does not stick.) Power BI and Flow are huge topics. Although ostensibly friendly to end users, developers should not ignore these tools. You WILL be integrating Power BI into D365 as a developer, especially if there are custom reports; and you’ll find Flow helpful with some integrations.

There is no shortage of resources for learning that part, at least. Microsoft mentioned https://microsoft.com/learn during their keynote (in between basically demoing the whole family of products and how easily they talk to each other, in a perfect world at least…), which should be a good starting point for learning many things… still no real path for X++ programming, though. The expectation still seems to be that either you’ve been working in AX 2012 and earlier and need refreshers, or you’re coming from C++ and learning from a generous employer; third parties are starting to fill the gap by offering training, but that is your only real option if you can’t get on-the-job training.

Importing an Azure BACPAC to a development VM

As a Dynamics 365 for Finance and Operations developer, you will probably find it easier to develop, test, and debug problems if you have a recent database available to you in your development VM. Although general documentation on this process exists, as of this writing I’ve found it less than ideal for developers: it has a couple of minor errors and some extra steps, and it lacks a few tips for problems you might run into.

Prerequisites

For the sake of this post, I’m going to assume that a system administrator has tackled providing you with an exported BACPAC file, and that you are able to get it copied to your development VM.

You should install the latest version of SQL Server Management Studio (or at least some 17.x version), which is an easy way to get the updated SQL tools you will need to work with BACPAC files from current versions of Azure SQL. Honestly, I recommend this as part of the initial VM setup.

If you’ve been running Dynamics 365 for Finance and Operations for a while, your VM hard drive might not be big enough. You might need to make the virtual drive larger.

Import the database

You need to run a command prompt as administrator. If you don’t run the command prompt as administrator, you will probably get an error like “Unable to connect to master or target server …” At the administrator command prompt, use the following:

cd "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin"
SqlPackage.exe /a:import /sf:C:\TEMP\BackupFilename.bacpac /tsn:localhost /tdn:AxDB_YYYYMMDD /p:CommandTimeout=1200

You should note three parts of these commands:

  1. If the first command gives you an error, and/or you only have a “C:\Program Files (x86)\Microsoft SQL Server\130\DAC\bin” directory, and/or you get an error similar to “Error importing database:Could not read schema model header information from package. The model version ‘3.5’ is not supported.“: you should, as mentioned in the prerequisites, install the latest version of SQL Server Management Studio (or at least some 17.x version), which is an easy way to get the updated SQL tools you need.
  2. For the second command, you need to change the /sf: switch to point to the BACPAC file.
  3. For the second command you also need to change the /tdn: switch to whatever temporary name you want to use to import the new database. You will change it later, but the name you choose will determine the filenames used for the database, so it’s best to choose something you haven’t used before. I recommend using an AxDB_YYYYMMDD format. If importing gives you an error about the filename being in use, the easiest thing to do is pick a different database name. Remember, it’s just temporary!

Once you have it running, be prepared to wait a while. Like, a couple of hours is very possible.

Update the database

This is pretty much just running code in a query window of SQL Server Management Studio (SSMS). I offer some minor tweaks (and comments) to what Microsoft does.

-- Following is the SQL script you use after importing the database.
-- You might want to run it one "chunk" (up to each "go" statement) at a time.
-- You need to change the next line to use the "temporary" database name you picked.
use AxDB_YYYYMMDD
go


CREATE USER axdeployuser FROM LOGIN axdeployuser
EXEC sp_addrolemember 'db_owner', 'axdeployuser'
go

CREATE USER axdbadmin FROM LOGIN axdbadmin
EXEC sp_addrolemember 'db_owner', 'axdbadmin'
go

-- might error - deprecated or retail?
CREATE USER axmrruntimeuser FROM LOGIN axmrruntimeuser
EXEC sp_addrolemember 'db_datareader', 'axmrruntimeuser'
EXEC sp_addrolemember 'db_datawriter', 'axmrruntimeuser'
go

CREATE USER axretaildatasyncuser FROM LOGIN axretaildatasyncuser
EXEC sp_addrolemember 'DataSyncUsersRole', 'axretaildatasyncuser'
go

CREATE USER axretailruntimeuser FROM LOGIN axretailruntimeuser
EXEC sp_addrolemember 'UsersRole', 'axretailruntimeuser'
EXEC sp_addrolemember 'ReportUsersRole', 'axretailruntimeuser'
go


CREATE USER [NT AUTHORITY\NETWORK SERVICE] FROM LOGIN [NT AUTHORITY\NETWORK SERVICE]
EXEC sp_addrolemember 'db_owner', 'NT AUTHORITY\NETWORK SERVICE'
go


-- Not everybody needs this, but it doesn't hurt.
UPDATE T1
SET T1.storageproviderid = 0
 , T1.accessinformation = ''
 , T1.modifiedby = 'Admin'
 , T1.modifieddatetime = getdate()
FROM docuvalue T1
WHERE T1.storageproviderid = 1 --Azure storage
go


-- It is very unlikely you need this in a development environment. Microsoft includes it, but I do not recommend it for developers.
ALTER DATABASE CURRENT SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 6 DAYS, AUTO_CLEANUP = ON)
GO


-- Begin Refresh Retail FullText Catalogs
-- If you do not use retail components, you do not need this section, but it doesn't hurt.
DECLARE @RFTXNAME NVARCHAR(MAX);
DECLARE @RFTXSQL NVARCHAR(MAX);
DECLARE retail_ftx CURSOR FOR
SELECT OBJECT_SCHEMA_NAME(object_id) + '.' + OBJECT_NAME(object_id) fullname FROM SYS.FULLTEXT_INDEXES
 WHERE FULLTEXT_CATALOG_ID = (SELECT TOP 1 FULLTEXT_CATALOG_ID FROM SYS.FULLTEXT_CATALOGS WHERE NAME = 'COMMERCEFULLTEXTCATALOG');
OPEN retail_ftx;
FETCH NEXT FROM retail_ftx INTO @RFTXNAME;

BEGIN TRY
 WHILE @@FETCH_STATUS = 0 
 BEGIN 
 PRINT 'Refreshing Full Text Index ' + @RFTXNAME;
 EXEC SP_FULLTEXT_TABLE @RFTXNAME, 'activate';
 SET @RFTXSQL = 'ALTER FULLTEXT INDEX ON ' + @RFTXNAME + ' START FULL POPULATION';
 EXEC SP_EXECUTESQL @RFTXSQL;
 FETCH NEXT FROM retail_ftx INTO @RFTXNAME;
 END
END TRY
BEGIN CATCH
 PRINT error_message()
END CATCH

CLOSE retail_ftx; 
DEALLOCATE retail_ftx; 
go
-- End Refresh Retail FullText Catalogs


-- Microsoft does not tell you to do this, but in a development environment, it is very likely that you want to change the database "recovery model."
-- Otherwise, you need to set up backups and/or clean up the transaction log, or the database can grow at an unmanageable rate, eating up your whole VM drive.
ALTER DATABASE CURRENT SET RECOVERY SIMPLE;
go

-- You might want to shrink the size of the transaction log, just to keep room free on your VM's drive.
-- You can do it in the SSMS GUI, or use the following statement (with a change to reflect your DB name).
DBCC SHRINKFILE(AxDB_YYYYMMDD_Log, 50);
go

Re-provision the target environment & Reset the Financial Reporting database

I do not use Retail components or Financial Reporting, especially not in my development environment, so I have no reason to deviate from the Microsoft documentation here; follow it exactly for these steps.

Start to use the new database

To switch to using the new database, you must first stop these three services (the Microsoft documentation is out of date on the service names; a PowerShell sketch for stopping and restarting them follows the list):

  • Management Reporter 2012 Process Service
  • Microsoft Dynamics 365 for Operations – Batch Management Service
  • World Wide Web Publishing Service
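
If you’d rather script this step than click through the Services console, a hedged sketch like the following should work (run as administrator); it uses the underlying service names rather than the display names listed above.

# Hedged sketch (run as administrator): stop the three services before swapping databases.
$services = 'MR2012ProcessService', 'DynamicsAxBatch', 'W3SVC'
Get-Service -Name $services | Stop-Service -Force

# After renaming the databases (see the T-SQL below), start them again:
# Get-Service -Name $services | Start-Service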

Then rename the databases so your new one is named AxDB (the old one can be named AxDB_ORIG or AxDB_old or whatever; eventually you will want to delete it, to save space). Then you can re-start the three services above, and you are good to go!

You can rename the databases using the SSMS GUI (in Object Explorer) or by using T-SQL commands similar to:

USE master;
go
ALTER DATABASE AxDB MODIFY NAME = AxDB_OLD;
ALTER DATABASE AxDB_YYYYMMDD MODIFY NAME = AxDB;
go

Build/synchronize the new database

You should launch Visual Studio, go to Dynamics 365 > Build Models, and do a full build (with the “Synchronize Database” option) of all your customized and/or extended models.

After that, you are ready to go!

Calling an X++ method on a Data Entity through OData

Today I learned of a documented but little-known (well, to me) way to put an X++ method on a data entity, and make it available via an OData call. This is a great way to build more powerful integrations through Flow or other external tools and programs.

I won’t belabor it since it is documented, but something like this:

[SysODataActionAttribute("CalcMaintenanceDuration", true)]
public int CalculateMaintenanceDuration()
{
 //do something
 return 0;
}

…will let you make a method available through an OData call; note that the first parameter of the attribute (“CalcMaintenanceDuration”) is the action name exposed through OData. You’d use a URL that looks something like this to call it:

https://MYCOMPANY.operations.dynamics.com/data/MYDATAENTITY.CalcMaintenanceDuration

This needs to be done as a POST request, so you can’t easily test it in Internet Explorer the way you can by just getting the data entity (which returns a JSON file full of data). Calling and using OData from outside Dynamics 365 for Finance and Operations is outside the scope of this blog post… and, honestly, outside the scope of my expertise at this point. You’ll want to look at the OData standards (where the methods you can call are usually called “actions”) and/or documentation for your language. Tools like Flow can handle all this in the background for you, just letting you browse the available “actions.”
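
If it helps, here is a minimal PowerShell sketch of what that POST might look like, assuming you have already obtained an Azure AD bearer token for the environment (acquiring the token is the hard part and is beyond this post); the URL is the hypothetical example above.

# Hedged sketch: POST to the OData action. Assumes $token already holds a valid
# Azure AD bearer token for this environment; the URL is the hypothetical example above.
$url = 'https://MYCOMPANY.operations.dynamics.com/data/MYDATAENTITY.CalcMaintenanceDuration'
$headers = @{ Authorization = "Bearer $token" }
Invoke-RestMethod -Method Post -Uri $url -Headers $headers -ContentType 'application/json'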

Data Entities “versus” OData

The language and visibility of Data Entities and OData can be a little confusing for users who are not sure what they can see in the Dynamics 365 for Finance and Operations web GUI versus what is available to them via OData (in Power BI, Excel, etc.). I made this graphic to explain it, in case it helps you.

Data Entities


Expanding the hard drive on your development VM (“onebox”)

In some development shops, the development VMs will be cloud hosted and centrally managed. But I try to help out others who are managing their own development VMs; and if you’re restoring copies of a growing production database in order to debug a tricky issue, you might find that the “as shipped” VM doesn’t have a big enough hard drive. Here are the steps to help you expand it (a PowerShell sketch of the same process follows the numbered steps).

  1. In Hyper-V Manager, stop the VM and delete any checkpoints. (Unfortunately, you can’t expand the drive for a VM with checkpoints.)
  2. In Hyper-V Manager, in the “Settings” for the VM, go to “Hard Drive” and find the “Edit” button under the virtual hard disk:
    Edit virtual hard drive in Hyper-V Manager
  3. Use the wizard to “Expand” to the desired size. (Unfortunately, it’s hard to know exactly how much space you’ll need. A database bacpac is significantly compressed, and you know better than me how much space you are likely to need.)
  4. Start & connect to the VM. You will not see the new space available yet.
  5. In the VM, run Disk Management. It is in Control Panel, or will come up if you type diskmgmt in the Start menu:

    diskmgmt
    Typing “diskmgmt” in the Start menu should bring up something that looks like this.

     

  6. In Disk Management, right-click the OsDisk (C:) partition and choose “Extend Volume.”
    Extend volume in Disk Management
  7. You should be able to just keep clicking “Next” through the wizard without changing any defaults; it will expand to fill the newly available space.
  8. That’s all, you’re done! When you are done restoring your bacpac and building it (or whatever else you wanted to expand the disk for), don’t forget that you’ve deleted all your checkpoints.
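
If you prefer PowerShell to the GUI, a hedged sketch of the same process follows. The VM name, VHD path, and target size are placeholders you must change; the first part runs on the Hyper-V host, and the partition resize runs inside the VM.

# On the Hyper-V host (run as administrator). VM name, VHD path, and size below are placeholders.
# Checkpoints still have to go first; wait for any checkpoint merge to finish before resizing.
Stop-VM -Name 'MyDevBox'
Get-VMSnapshot -VMName 'MyDevBox' | Remove-VMSnapshot
Resize-VHD -Path 'D:\VMs\MyDevBox\OsDisk.vhdx' -SizeBytes 300GB
Start-VM -Name 'MyDevBox'

# Inside the VM (run as administrator): extend C: into the newly available space.
Resize-Partition -DriveLetter C -Size (Get-PartitionSupportedSize -DriveLetter C).SizeMax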

 

Although this isn’t a process specific to Dynamics 365, if you aren’t used to managing a VM, I hope it helps you get back to developing a little sooner.