Monday, September 25, 2017

Initial steps to troubleshoot failed environment servicing

On the topic of patching and updating an existing D365 Operations environment, I will refer to the online documentation.
There are also some great community posts that aim to help you, and you may want to check those out.
I expect more posts to show up. As of writing this, installing updates can be a bit tedious and cumbersome.

I will use this post to share a recent update that failed on a Platform Update. A Platform Update is expected to be a fairly straightforward and safe operation. You simply import the update to your assets in LCS and apply it to your environment (assuming you're running your environment in the cloud). I will not discuss On-Premise in this post.

I had an environment running application 1611 with Platform Update 7 and was trying to install Platform Update 10. After it failed on several attempts, I started investigating why.

Here are the steps I took.

1) Identify which step failed. In my case it was step 13. (Not exactly my lucky number.)

2) Find the runbook output (normally under C:\RunbookOutput) and locate the PowerShell script that fails. I simply searched the log for "13".

3) Open PowerShell ISE in Admin mode and open the PowerShell script. You will find the script in the J:\DeployablePackages folder, and you can match the GUID from the log with the runbook folder. The scripts are located in a standardized folder path.

4) Run the script and let it fail. From there you can add breakpoints, run it again and step through to see why it failed. Use whatever you find as information when you contact Microsoft Support. Some updates fail when they should not, and it is important that anyone with a support agreement reports findings back to Microsoft.
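
The log hunt in steps 2 and 3 can be sketched in PowerShell. The folder paths are the ones mentioned above; the step number and the GUID filter are placeholders you would swap for your own values.

```powershell
# Sketch: locate the failing step in the newest runbook log and find the
# matching script. 'step 13' and 'YOUR-RUNBOOK-GUID' are placeholders.
$log = Get-ChildItem 'C:\RunbookOutput' |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

# Show the log lines around the failing step
Select-String -Path $log.FullName -Pattern 'step 13' -Context 3,3

# List candidate PowerShell scripts in the matching runbook folder
Get-ChildItem 'J:\DeployablePackages' -Recurse -Filter '*.ps1' |
    Where-Object { $_.FullName -like '*YOUR-RUNBOOK-GUID*' } |
    Select-Object FullName
```

From there, open the matching script in PowerShell ISE (Admin mode) and step through it as described in step 4.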

Now, in my particular case, the script did not fail when I ran it manually. It succeeded. I can only guess why that is, but after going back to LCS and letting the update "Resume", it eventually finished all the upgrade steps successfully.

In any case, the initial steps above can help you push through a failing update and potentially lead you to the answer as to why it unexpectedly failed.

Wednesday, August 23, 2017

Consider changing your password Pop Up

Currently, the machines deployed through LCS run with an Account Policy where passwords have a maximum age of 42 days. Interestingly, according to the published guidelines, you should not change the password for these deployed servers.

So if you get annoyed by the reminder to change the password and do not plan to republish the box any time soon, why not go ahead and get rid of the pop-up?

Click the start button and type in "local". You should find the Local Security Policy Console.

From there it is just a matter of changing the expiration of the password to something other than 42, or simply setting it to 0 for "never expire".

Quick and easy.

Alternatively you can use a Command Prompt (Run as Admin) with the statement:

net accounts /maxpwage:unlimited
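
You can confirm the change took effect from the same elevated prompt: running `net accounts` with no switches prints the current account policy, including the maximum password age.

```powershell
# Run from an elevated prompt. Set the maximum password age to unlimited...
net accounts /maxpwage:unlimited
# ...then print the current policy to confirm "Maximum password age: Unlimited"
net accounts
```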

Wednesday, August 9, 2017

Excel Applet not loading when working with D365 Office Add-in

This post is somewhat related to the post by Ievgen (AX MVP) on the Office Word integration not working with Dynamics 365 for Finance and Operations (Enterprise Edition).

If you try to connect Excel to an existing environment using the Microsoft Dynamics Office Add-in and all you see is the text "Load applets" after signing in, then it might very well be because the applets need to be initialized from within the environment.

If you click the little flag at the bottom right, you will be able to open the messages and see the error "No applet registrations found".

The solution is simple. Open the D365 environment in your favorite browser (assuming your favorite browser is on the compatible list - hehe) and either search for the form (i.e. type "Office") or navigate directly through System Administration, Setup, Office app parameters.

If you see an empty grid, the settings have not been initialized, and that is the problem. Most likely you are missing settings for Office apps in general, so go ahead and initialize all parameters for all the grids accordingly.

Head back to Excel and try to make it reload the applets (simply try adding a trailing slash to the URL). Hopefully you will get the expected result.

Sunday, July 9, 2017

Error when installing Reporting Extensions for AX2012 R3 on SQL Server 2016

In my previous post I wrote about installing Reporting Extensions for AX2012 R3 on SQL Server 2014. In this post I want to emphasize that the same hotfix needed for SQL 2014 is also needed for SQL 2016.
The error behaves slightly differently on SQL 2016 if you do not have the patch. The setup experience simply crashes during install, and while the component is ticked as "installed" the next time you run setup, it is only "half-baked". You need to start over, this time with the hotfix ready.

Here is a screenshot of the installer crash with "AxSetup.exe has stopped working". Ignore that it is on the SSAS step, I simply chose to install both extensions at the same time. The error actually relates to Reporting Extensions.

And if you open the setup logs for further inspection, you will see it ends while trying to setup the SSRS bits. Here is an excerpt from the install log:

2017-07-05 11:30:56Z Setting the SQL Server Reporting Services service account to the Microsoft Dynamics AX .Net Business Connector account.
2017-07-05 11:30:56Z Generating database rights script.
2017-07-05 11:30:56Z Opening connection to the database using the connection string server=SERVER\INSTANCE;Integrated Security=SSPI.
2017-07-05 11:30:56Z Writing the database rights script to C:\Users\USERID\AppData\Local\Temp\3\tmpADC0.tmp.
2017-07-05 11:30:56Z Executing database rights script.

I got this error even though the installation was slipstreamed with CU12, which is a later version than the hotfix.

So if you're planning on installing these bits for SQL 2016 (or SQL 2014), do yourself the favor of downloading KB3216898 and slipstreaming your install by extracting it into your installation's Updates folder.
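
As a sketch of the slipstream step - the paths below are assumptions you would adjust to wherever your installation media and the downloaded hotfix live (LCS typically delivers hotfixes as a .zip), and Expand-Archive requires PowerShell 5 or later:

```powershell
# Assumed paths - adjust for your environment.
$media  = 'D:\AX2012R3'                           # installation media root
$target = Join-Path $media 'Updates\KB3216898'    # Updates folder Setup scans
New-Item -ItemType Directory -Path $target -Force | Out-Null

# Extract the downloaded hotfix package into the Updates folder
Expand-Archive -Path 'C:\Temp\KB3216898.zip' -DestinationPath $target

# Then re-run AxSetup.exe from the media root; it picks up the slipstreamed fix.
```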

Thursday, April 20, 2017

Error when installing Reporting Extensions for AX2012 R3 on SQL 2014 SSRS


I could not really find any posts on this out there, so I decided to just share this here.

You may experience installation errors when you try to install Reporting Extensions for AX2012 R3 on SQL 2014. The setup crashes internally, rolls back the installation and fails.
In the installation log you will see the error "Version string portion was too short or too long".

The solution is available on LCS as a downloadable hotfix, KB3216898 (released 10 January 2017).

Unpack the content of the hotfix and slipstream it as part of your AX2012 R3 installation and run the installation again. Now it will work.

Just to make sure people find this post if they search for the errors, I'll add the full call stack below:

An error occurred during setup of Reporting Services extensions.
Reason: Version string portion was too short or too long.
System.ArgumentException: Version string portion was too short or too long.
  at System.Version..ctor(String version)
  at Microsoft.Dynamics.AX.Framework.Reporting.Shared.SrsWmi.get_ReportManagerUrl()
  at Microsoft.Dynamics.Setup.ReportsServerInstaller.GetOrCreateServerConfiguration(String instanceName, String sharePointServiceApplicationSite, Boolean& createdConfiguration)
  at Microsoft.Dynamics.Setup.Components.ReportingServicesExtensions.InstallConfigurationFiles(String instanceName, String sharePointServiceApplicationSite, Boolean& createdConfiguration)
  at Microsoft.Dynamics.Setup.Components.ReportingServicesExtensions.RunReportingSetupManagerDeploy()

Monday, December 26, 2016

Managing your Azure Subscriptions created through CSP portal

Let me start off with a disclaimer: Microsoft may change the behavior, which would render this post obsolete. In that case, I'll try to come back and make the necessary amendments.

If you have worked with managing your Azure resources through PowerShell, you will notice that Azure Subscriptions created through the Cloud Solution Provider (CSP) portal behave slightly differently. This post from August 2016 goes into detail on how to migrate from "traditional" Azure Subscriptions to "CSP" Subscriptions.

In my post, I want to just quickly show you some key points.

Azure Portal navigation

One thing you will quickly notice is that if you access the CSP portal and open the Azure Portal from there, all of the classic resource types in Azure are completely hidden. You can only create and operate on Azure Resource Manager (ARM) types of resources. So basically, this prevents you from using the Azure Service Management API and any interface that assumes ASM, or "Classic Azure" as it is also called.

Another thing you'll notice is that if you navigate to the Azure Portal directly, you do not necessarily see the Azure Subscriptions from your tenants in the list of Directories. I say "necessarily" because if your user has been explicitly granted the "owner" role on the tenant, that is a different story. One of the core features of the CSP program is that the partner already is "owner" through the Foreign Principal role - more specifically, all users who have the AdminRights permission within the CSP portal.

So in order to navigate to the customer's Azure resources, you need to explicitly go to the tenant through the URL. That will open the tenant's context and off you go. The URL will typically include the tenant's domain name (or the customer's own domain, if it is fully set up).

Azure PowerShell management

What about PowerShell? Is that any different? YES!

If you run Login-AzureRmAccount without setting a context, you'll only see the Azure Subscriptions you have explicit access to - and even then, Azure Subscriptions created through CSP will behave differently.

The solution is rather easy, even if you could argue it's a bit cumbersome.
You need to explicitly set the context.

Here are some options available:

  • You either explicitly log in to the tenant and subscription:
    Login-AzureRmAccount -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID
  • Or log in "normally" and then select the tenant and subscription:
    Select-AzureRmSubscription -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID
  • Or you could log in and set the context using the following command:
    Get-AzureRmSubscription -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID | Set-AzureRmContext

If you do not set the context explicitly, you will not be able to operate on the Azure resources.
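
Once the context is set, you can verify which tenant and subscription the session operates against before running anything else - a quick sketch using the same AzureRM module:

```powershell
# Inspect the active context (account, tenant and subscription)
Get-AzureRmContext

# With the context in place, resource cmdlets now work as expected, e.g.:
Get-AzureRmResourceGroup | Select-Object ResourceGroupName, Location
```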

Now, some readers may have noticed that Azure Subscriptions created through CSP are inaccessible in the old Classic Azure Portal, which in turn disconnects such Subscriptions from being available in Lifecycle Services (LCS). LCS does support ARM by now, so I believe the solution should be just around the corner. We're just missing one minor piece for all of this to work together properly.

Have a nice Christmas holiday, everyone!

Sunday, October 23, 2016

Using the On-Premise Gateway to connect your AX2012 data to the Power BI Portal

PowerBI has been around for a long time by now, so there is tons of information out there on how to connect your data sources to the powerful PowerBI Portal. Now, getting all the moving parts to connect properly might have been difficult at times, but I'm making this post to reassure you that it is currently very easy to set up.

Before I begin, I just want to add a precaution:
Consider the implications around security and performance when setting this up.

I prefer to use a common service (or setup) account for this, and not my own consultant login. This makes it a little easier if someone else needs to step in and maintain the setup. Furthermore, it allows for the customer to lock down the credentials after I've completed the setup.
As for performance, you should pay attention to how data refresh adds load to your servers - both the one hosting the gateway itself and the server hosting the data source (SQL Server and/or Analysis Services). You don't want to cause a full system overload while pulling data from your sources.

I will use the standard Dynamics AX SSAS OLAP as an example, but the point here is less the data source, and more how easy it is to connect to the PowerBI Portal.

Before we begin, I want to list some prerequisites, or at least how I would set it up:

  • You are using a dedicated setup account and this account is a domain user
  • You are local admin on the server where you plan to set up the gateway. Basically, your setup account is listed in the Administrators group (under Computer Management, Local Users and Groups, Groups, Administrators).
  • You have access to the SQL Server Analysis Services (SSAS) with your setup account. Check by right-click SSAS instance, choose Properties and look at the list of users under Security.
  • You have a user who is Global Admin in Azure AD. This could be the setup user, synced to Azure AD from the On-Premise domain, but it's not necessary. The point is this user will have access to setup things on PowerBI which currently requires Office 365 Global Admin rights. This may change in the near future, hopefully.

Given all of the above, you'll simply start by logging on to the PowerBI portal using the Office 365 Global Admin user and downloading what's called the "Data Gateway". The download link is at the top and takes you to the download page. Press Download and get the latest and finest version.

When you run this installer, it will ask you to login using the Office 365 Global Admin user (which will have access to register the gateway). Also, I am using the "Enterprise Gateway" option when installing. This allows me to schedule refresh from data sources based on SSAS.
The gateway has its own set of prerequisite software, so have a look at those before you begin.

When the gateway is installed successfully, it can be used to connect to ANY of the SSAS instances on the domain, given that the network traffic is allowed and you connect with a user who has access to the SSAS instance. So your LIVE, TEST, DEV, and so on. How cool is that?

Next you would use the PowerBI Admin Portal to configure the Gateway and add your data sources.
Head over to the Manage gateways and click "Add Data Source".

Fill in the form. Notice I am using the name of the server where SSAS is running and the name of the SSAS instance. I also use the domain user who has access to the SSAS server itself, and I put in the name of the OLAP database, Dynamics AX Initial.

The data source should connect and confirm everything looks good for you to connect the data source and whatever it contains. Great!
A lot of people get this far fine, but the next part is something which was added just recently - well, actually some months ago, in the April 2016 update.

Why is this update important?

Given the scenario where you're trying to connect some on-premise SSAS with PowerBI in the cloud, who's to say you're fully synchronizing your on-premise Active Directory with Azure Active Directory? What if your local domain doesn't map the users perfectly to the usernames in Azure AD? This is where "Map User Names" comes into play. We can actually add string-replace rules for the usernames, so if your users are not perfectly mapped between Azure AD and the On-Premise domain, you can still get this to work.

So in this example, I will assume the On-Premise domain uses a different domain name than the one used by Office 365 and Azure AD: On-Premise, imagine CONTOSO is fully qualified with a local domain suffix, while in the cloud users sign in with the Azure AD domain.

Click the Data Source you need to be mapped. Right now, these settings are not shared across data sources, but hopefully they will add further administrative improvements to this.
Open the list of Users and look at the bottom for the Map User Names button.

This will slide in the setup for mapping of user names.

Notice in my example I am replacing the full cloud username of the powerbiadmin user with its on-premise counterpart. So anytime I am logged in at the PowerBI portal with this powerbiadmin user and try to access the data sources through the gateway, the user principal name will be "washed" through the mapping, and "magically" the credentials for that user will work On-Premise because the local domain sees a user it recognizes. Furthermore, I added another example of a user whose local username differs from the one in Azure AD. If this user also tries to update or refresh data sources, the credentials will work locally.

What next?

Well, you can click "Get Data", select "Database" and choose "SQL Server Analysis Services" and simply pick your preferred cube from one of your datasources and click "Connect". With the new dataset in place, you can schedule a refresh outside regular business hours. Like this:

A couple of follow-up questions:

Q) What happens if I map two completely different users, who actually both exist in Azure and On-Premise?
A) You're the admin, and while there are no features to prevent potentially illogical mappings, you can map yourself into complete chaos - to your own or someone else's despair.

Q) Do I need to map all users like this?
A) Since the mapping is a simple string replace, you can replace the similar parts of the usernames - like replacing the cloud domain suffix with the on-premise one. If you're lucky, this will be enough to fix most usernames. Also consider that there may be a number of users who will only load the reports, but who do not need access to actually reload the datasets with fresh data from the data sources. Surely, those users do not need to be mapped.
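
Conceptually, the rule is nothing more than a substring replace applied to the user principal name before the credentials hit the local domain. A small PowerShell illustration, with hypothetical domain names:

```powershell
# Hypothetical domains for illustration only.
$cloudSuffix = ''   # what users sign in with in Azure AD
$localSuffix = '@contoso.local'            # what the on-premise domain expects

$cloudUser = ''
$localUser = $cloudUser.Replace($cloudSuffix, $localSuffix)
# $localUser is now 'some.user@contoso.local', a name the local domain recognizes
```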

Q) How much time does it take to set this up?
A) With some practice, and if the users are set up with permissions as described in the beginning of this post, I bet you can get this up, connected and working within the hour. The rest is waiting for data to come through so you can start filling your beautiful reports and dashboards with powerful data.

Q) What if it fails horribly and I get stuck? :'-(
A) Use the community forum and make sure to tag your question with "BI".