Saturday, March 15, 2014

Automated Deployment with Windows Azure - AX 2012 R3


Hi there!

I hope everyone had a great and productive week. I certainly did.

In this post I would like to share what I learned about Automated Deployment with Windows Azure at the AX 2012 R3 Tech Conference. As always, I would like to extend my gratitude to TriBridge for taking me to this event.

As discussed at previous events, Microsoft said that it is moving to a cloud model in which it provides a service to host and run AX instances in the cloud. At this stage, Microsoft offers a wide variety of services that give an organization a well-designed infrastructure for development, testing, and small-scale production environments.
The following is the Azure hosting model (the blue boxes show what Azure provides us):

Azure Setup
Only AX 2012 R3 is certified and supported on Azure at this time. When asked about older cumulative updates (e.g., CU6), Microsoft said it could be possible to work with these versions, and a number of companies are doing so, but it is not supported.
A very cost-effective concept is that Azure takes care of all the back-end processes when creating a new Azure instance. Microsoft accomplished this with a set of automated scripts that install the instance and perform a basic configuration. However, customer-specific configuration and customer-specific network details are not part of the automated process, for obvious reasons.
Moving right along, Azure provides a very cost-effective hosting solution. Azure calls it “Pay-as-you-Go”, which means that users are charged only for the time they are actually using the Azure instance.
One of the main benefits of this solution is that any device with RDP capabilities can access the Azure instances.
Azure Deployment Services

Microsoft provides a robust framework for deploying Microsoft Dynamics AX Instances to Azure. The following are most of the steps needed to make use of these services.
  1. A user/organization must get an Azure subscription ID by signing in to the Azure website.
  2. A 3-month free trial is available.
  3. A new subscription includes 20 cores for a basic deployment of development and test environments.
  4. The Azure subscription service will set up the instance automatically.
  5. Azure provides different topologies (Development, Test, and Production; the latter needs more than 20 cores in a real business deployment scenario).
  6. By default, Azure creates 2 machines per instance to support maintenance.
  7. A typical Dev/Test deployment takes approximately 8-10 hours.
  8. The deployment process is an “intelligent” process that retries failed steps. In addition, a user can define the maximum number of retries, which helps reduce troubleshooting time.
  9. Azure provides a demo instance with Contoso data.
  10. Azure provides Lifecycle Services as a default feature in each instance.
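
The retry behavior in step 8 can be sketched conceptually as a bounded retry loop. This is only an illustration of the idea, not the actual Azure deployment code; the function name and delay value are assumptions.

```python
import time

def deploy_with_retries(step, max_retries=3, delay_seconds=5):
    """Run a deployment step, retrying on failure up to max_retries times.

    Conceptually mirrors how the Azure deployment process retries failed
    steps with a user-defined retry cap; the real logic is internal to
    the service.
    """
    attempts = 0
    while True:
        try:
            return step()
        except Exception as exc:
            attempts += 1
            if attempts > max_retries:
                raise RuntimeError(
                    f"Step failed after {max_retries} retries: {exc}")
            time.sleep(delay_seconds)
```

Capping the retries this way is what shortens troubleshooting: a persistently failing step surfaces quickly instead of looping forever.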

Post Deployment Considerations

Although Azure performs many of the configuration tasks automatically, there are a number of post-deployment actions we need to follow up on after each setup. The following describes the steps needed after deployment.

In addition, an important point to take into consideration is setting up TFS, Outlook, and Lync (if available for a customer). Microsoft can help a customer/partner set up these applications.

One important point is the SQL Server AlwaysOn feature, which takes SQL Server high availability and disaster recovery to a whole new level by allowing multiple copies of a database to be highly available. AlwaysOn Availability Groups let you fail over a group of databases as a single entity, unlike database mirroring, where you can only fail over one database at a time. Further, this architecture also offers a SQL witness, whose main task is to monitor the mirroring scenario and initiate automatic failover.

Azure also provides a REST interface instead of a SOAP one. REST is a simple stateless architecture that generally runs over HTTP, and is often used in mobile applications, social networking Web sites, and automated business processes.
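
To make the REST-versus-SOAP contrast concrete, here is a sketch comparing the two request styles for the same hypothetical operation. The endpoint URL, operation name, and field names are all illustrative and are not the actual Azure management API.

```python
import json
import xml.etree.ElementTree as ET

# REST style: a stateless request against a resource URL over HTTP;
# the payload is typically plain JSON. (Illustrative URL and fields.)
rest_url = "https://management.example.com/instances/ax-dev-01/status"
rest_response = json.loads('{"instance": "ax-dev-01", "state": "Running"}')

# SOAP style: the same operation would be wrapped in an XML envelope
# and POSTed to a single service endpoint.
soap_request = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetInstanceStatus>
      <InstanceName>ax-dev-01</InstanceName>
    </GetInstanceStatus>
  </soap:Body>
</soap:Envelope>"""

envelope = ET.fromstring(soap_request)
```

The lightweight JSON-over-HTTP shape is why REST is a natural fit for mobile apps and web integrations, as noted above.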

Finally, Microsoft recommends starting virtual machines in Azure in sequence; otherwise the IP addresses will not match the VMs’ subnets. Microsoft was asked to expand on this issue, and they are not sure why it happens. The good news, however, is that they are working on it.
The following is a possible Azure portal architecture that Microsoft is working on.

Check out TriBridge Cloud Services at TriBridge Concerto. We provide cloud hosting for all your needs. Also, check our TriBridge Careers page and get on our winning team.


Friday, March 7, 2014

Create Dynamics AX Builds using the X++ Server - AX 2012 R3

Hi There!

I hope everybody is doing great! I had the opportunity to attend the MS AX 2012 R3 Tech Conference thanks to TriBridge.
Microsoft has worked hard to improve the X++ compiler for the new release. The new compiler is also available in CU7.

X++ Compiler Performance

In past releases, the X++ compiler has been the bottleneck of build and installation scenarios across the board. To address this, Microsoft has recommended the following tips to improve compiler performance (CU7 and up only):
  • One single machine deployment
    • SQL Server
    • AOS
    • Client
  • 16GB of memory available.
  • Don’t constrain SQL Server memory consumption.
  • Installation of the KB2844240 Hotfix (AX 2012 R2) – Index Optimizations.
  • A few fast CPUs are a much better option than many slow ones.
  • Solid state drives, which are typically more resistant to physical shock, run silently, have lower access time, and less latency.

How the Microsoft Dynamics AX compiler works

The following depicts the phases of the X++ compiler in previous versions of Microsoft Dynamics AX.


It is important to note that in earlier versions of Microsoft Dynamics AX, build performance is affected by metadata moving between the client and the server. In addition, the long compile times are due to the deserialization of metadata and the in-memory cache.

The following is the architecture of the current compiler:

R3 Compiler Improvements

Microsoft enhanced the compiler by allowing us to use either the AXBuild.exe command or the client. However, from an architectural point of view, they removed the client portion of the compiler in the R3 release.

The following shows the new architecture improvements:


A few key points to underline are that the AOS now contains the logging information, so there is no in-memory cache, and logs are generated on each AOS. In a multi-AOS deployment scenario, the AXBuild.exe process automatically consolidates these into one log.

Finally, when using the parallel compiler, CPU usage is extremely high. In a multi-CPU scenario, the AXBuild.exe process automatically balances the load between CPUs. Also, it is important to understand that parallel does not mean multi-threaded: the new compiler is very much still a single-threaded process.
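
The distinction above — parallelism from many single-threaded workers rather than from multi-threading — can be sketched as a simple work distribution. This is a conceptual illustration, not AXBuild.exe's actual scheduling algorithm; the round-robin strategy is an assumption.

```python
def balance_load(compile_units, worker_count):
    """Distribute compile units across workers round-robin.

    Conceptual sketch: each worker stands for a separate single-threaded
    compiler process, so parallelism comes from running several of them
    side by side, not from threads within one compiler.
    """
    buckets = [[] for _ in range(worker_count)]
    for i, unit in enumerate(compile_units):
        buckets[i % worker_count].append(unit)
    return buckets
```

With 10 compile units and 4 workers, the first two workers get 3 units each and the rest get 2, which is why all CPUs stay busy (and hot) for the duration of the build.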

The following picture depicts what the parallel compiler output looks like:

Visit and learn about our Dynamics AX practice, services and focus, as well as our cloud services Concerto. Join our winning club!

Tuesday, February 18, 2014

Microsoft Dynamics AX 2012 R3 - New Data Import Export Framework Changes

Hi There!

I hope everybody is doing great! I had the opportunity to attend the MS AX 2012 R3 Tech Conference thanks to TriBridge.

In this post I would like to discuss the new Data Import Export Framework, or DIXF. As you probably know by now, this new framework ships with both CU7 and R3.

So, what can the new DIXF do? Well, to start, one of the key new features is that the DIXF runs on top of the SSIS service interface, allowing incremental runs (UPSERT). Of course, it can import/export data, and Microsoft added the capability to compare and copy data between instances as well. In addition, the new DIXF version ships with the ability to choose different data sources such as text, MS Excel, and XML files.

Further, the new DIXF can be used to extract data directly from various ODBC*** sources such as SQL, MS Access, and MS Excel. These new additions will help us streamline our data migrations and data transfers.

***For ODBC source types we have to provide a connection string in order to simplify the data selection process. One cool thing I saw was that we can create new rows under Mapping Details to add mandatory fields, i.e. ACCOUNTNUM, in case a specific legacy system does not include them.

When this scenario is true, the custom value provided can be automatically filled by a number sequence value (if we want) by choosing the “AUTO” option in that specific row, which would take a new AccountNum from the number sequence system. However, we can also choose to use default values, as in older versions.
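
As a rough sketch of the connection strings mentioned above, here is a small helper that assembles an ODBC string for each source type. The driver names and parameter keys are illustrative assumptions; the exact drivers depend on what is installed on the server, and DIXF itself only asks for the finished string.

```python
def odbc_connection_string(source_type, **params):
    """Build an ODBC connection string for a DIXF-style source.

    Driver names below are common examples, not a definitive list;
    verify against the ODBC drivers actually installed.
    """
    drivers = {
        "sql": "{SQL Server Native Client 11.0}",
        "access": "{Microsoft Access Driver (*.mdb, *.accdb)}",
        "excel": "{Microsoft Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)}",
    }
    parts = ["DRIVER=" + drivers[source_type]]
    parts += [f"{key}={value}" for key, value in params.items()]
    return ";".join(parts)

# Hypothetical legacy SQL source:
conn = odbc_connection_string("sql", SERVER="legacy01", DATABASE="ERP",
                              Trusted_Connection="yes")
```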

In terms of entities, the new DIXF ships with 150 entities, compared to the 78 (I think) it came with in earlier versions. These include master data, documents, journals, parameters, party, products, and configuration data.

Another cool addition is folder support. We will be able to move files around automatically (this needs to be pre-defined) to different folders in our domain based on the operations we are executing.

The following are a few other additions:

Parallel Execution: the ability to split data into bundles (e.g., 1,000 rows in 10 bundles = 100 rows per task).

This is particularly useful when large data loads need to take place. The tool provides the ability to allocate a group of records to tasks. This combination creates a bundle, and each bundle is independent of the others. See the following diagram for a visual representation:
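
The bundling arithmetic above can be sketched in a few lines. This is only a conceptual illustration of splitting rows into independent bundles; DIXF does the equivalent internally via SSIS tasks.

```python
def make_bundles(records, bundle_count):
    """Split records into roughly equal, independent bundles.

    e.g., 1,000 rows split into 10 bundles yields 100 rows per task;
    each bundle can then be processed by a separate staging task.
    """
    size, remainder = divmod(len(records), bundle_count)
    bundles, start = [], 0
    for i in range(bundle_count):
        end = start + size + (1 if i < remainder else 0)
        bundles.append(records[start:end])
        start = end
    return bundles
```

Because the bundles share no state, a failed bundle can be retried without touching the others, which is what makes this attractive for large loads.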

Role-Based Security: provides a security framework for the different levels of an organization; it is built on top of the existing security framework (e.g., Production cannot import HR data).

Mapper Control: allows flexible mapping between custom entities and staging objects. In addition, the mapping effort is reduced when using AX-friendly column names (e.g., ITEMID).

Custom Entity Wizard: we can compare data in different companies. This becomes especially interesting and useful for comparing parameter data between gold and test instances, for example. When using this tool to import data that contains discrepancies, the system inserts the data into a staging table, where it is compared by another process across a specific number of companies and/or instances, and finally it gets updated.

At this point, a user can use the Comparison form to move records between different instances.
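
The comparison step can be sketched as a simple diff over parameter values. This is an illustration of the idea only, not the Comparison form's actual implementation; the parameter names are made up.

```python
def compare_parameters(gold, test):
    """Report parameter values that differ between two instances.

    Sketch of a gold-vs-test comparison: returns a mapping of each
    differing key to its (gold, test) value pair, flagging keys that
    exist on only one side.
    """
    differences = {}
    for key in set(gold) | set(test):
        gold_value = gold.get(key, "<missing>")
        test_value = test.get(key, "<missing>")
        if gold_value != test_value:
            differences[key] = (gold_value, test_value)
    return differences
```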

See the process in the following diagram:

NOTE: Sometimes the Entity Wizard will only create a portion of the requirements, and a technical consultant will have to finish the rest.

System Configuration Data: BI-Analysis and Reporting Services, Project Server Integration, EP, Batch Settings, Number Sequences, Email Parameters, AIF, System Service Accounts.

DIXF Import Process

The import process is done using an abstraction layer that sits on top of SSIS behind the DIXF framework. Within this abstraction layer, we can add X++ customizations as needed.

I asked what the recommendation would be for migrating data from legacy systems – the following is what I could get from their answer (I was taking notes). There are two types of data migration architecture that consolidate both importing and cleansing data.

The first option is to have a middle tier that processes the data from a legacy system in an external system and cleans it before it goes to Microsoft Dynamics AX.

The second option is to import the data directly from a legacy system into Microsoft Dynamics AX.

Microsoft recommends keeping the data cleansing business logic inside AX. The reason is that Microsoft Dynamics provides a data migration framework that is both extensible and customizable. The framework provides entity classes that can be extended to meet process-specific needs. In addition to the entity classes, the framework also provides the ability to create custom staging entities for further processing prior to the final push to an entity. This is depicted in the following picture:
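
The extension idea — a base entity with a cleansing hook that customizations override — can be sketched as follows. The class and method names are hypothetical, not the actual DIXF X++ class hierarchy; the sketch only shows the pattern of keeping cleansing logic inside the framework.

```python
class MigrationEntityBase:
    """Hypothetical base entity with an overridable cleansing hook."""

    def cleanse(self, row):
        # Default: pass rows through unchanged; subclasses override this
        # hook to keep cleansing logic inside AX, as recommended.
        return row

    def import_rows(self, rows):
        # Stage each row, run the cleansing hook, and return the result
        # ready for the final push to the target entity.
        return [self.cleanse(dict(row)) for row in rows]


class CustomerEntity(MigrationEntityBase):
    def cleanse(self, row):
        # Example customer-specific rule: normalize account numbers.
        row["AccountNum"] = row["AccountNum"].strip().upper()
        return row
```

Because each entity owns its own `cleanse` override, new legacy sources only require a new subclass rather than changes to the import pipeline itself.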

The DIXF also provides a new error log preview function that allows a user to narrow down an error to the smallest unit possible to understand exactly where the error is occurring. This was not true in older versions of the DIXF. Further, the new DIXF also provides an Execution History function that allows a user to review and validate the staging data before the actual import to an entity.

DIXF Export Process

As mentioned earlier, because the DIXF also uses SSIS to export data from the framework, bulk exports can also be easily accomplished. In addition, as in older versions of DIXF, we can also generate our own source mapping and sample files. However, a cool new addition to the DIXF is that these files can now be of different types such as XLS, XML, text, tab-delimited, etc.

This approach sounds good and valid; however, to my mind it could be a double-edged sword, since XLS files might open the door to data consistency problems, as these files can contain formulas. I would suggest always understanding your source files, especially the XLS ones.

DIXF Architecture

The following is the new DIXF architecture for R3.

Visit and learn about our Dynamics AX practice, services and focus, as well as our cloud services Concerto.

Sunday, February 2, 2014

Microsoft Dynamics AX 2012 R3 Tech Conference

Hi there!
I will be attending the Dynamics AX 2012 R3 Tech Conference in Seattle with my team members from TriBridge. In this post I would like to share the courses I will be taking at the conference and give my readers a snapshot of what each course is going to cover.

Finally, my goal will be to write a blog entry for each of the courses in depth. So follow my blog for the next couple of weeks as I will be posting very interesting information from these amazing courses.
In a nutshell, the Microsoft Dynamics AX Technical Conference 2014 begins on Monday, February 3rd in Seattle (Bellevue, Washington). As far as I know, this is the biggest event yet for the upcoming Dynamics AX 2012 R3 release, which will introduce many new features in both the technical and functional dimensions of the application.
Below is a list of the courses I will be taking and a brief description of each. I will link each of the titles below to a new blog post in the next few days, so be sure to check this page for updated information (each title will turn blue when ready to be explored).

Day in the life of a Microsoft Dynamics AX Solution Architect
The main focus of this course will be to learn how the roles and responsibilities of a solution architect apply to the different phases of an implementation project. I will be enjoying this course.

Data import/export framework for Microsoft Dynamics AX 2012 R3
The focus of this course is to experience the new entity-based import/export framework, which will help us manage our import/export, integration, configuration, and data migration needs.

Automated deployment of AX 2012 R3 in Windows Azure using lifecycle services
I believe this course is critical to any solution and/or technical architect as well for any developers as it will allow us to learn about the new features that Microsoft has developed for lifecycle services that will automate the deployment of Microsoft Dynamics AX 2012 R3 environments with Windows Azure.

Building Microsoft AX services integration with the Microsoft Windows Azure Service Bus
Like the one above, this course will also be critical for architects and developers alike. It will cover the new Windows Azure Service Bus Adapter for the Application Integration Framework, which will allow us to deploy Microsoft Dynamics AX services through the Azure Service Bus, making them accessible from the cloud.

Data synchronization to multiple instances of Microsoft Dynamics AX
This course will help us learn about Master Data Management (MDM) in Microsoft Dynamics AX 2012 R3 and how it simplifies master data synchronization across multiple Microsoft Dynamics AX instances by utilizing conceptual business entities, metadata, and declarative configuration.

Optimizing the performance of Microsoft Dynamics AX deployment
I'm really looking forward to this course as it will cover some techniques that could help us gain higher performance in our Microsoft Dynamics AX deployment. A few key points that will be introduced are design patterns, parameter configuration, and implementation pitfalls.

Create Microsoft Dynamics AX builds using the new X++ server-side parallel compiler
In this course we are going to go through an overview of the new X++ server side parallel compiler as well as best practices on how to apply the new compiler in creating Microsoft Dynamics AX builds in the context of multiple development environments integrated via TFS version control.

Technical Deep Dive - warehouse management
This is a course that will be critical for me, and I'm sure for the majority of Microsoft Dynamics AX professionals out there, as it will dive into the technical challenges of the new warehouse management inventory reservation system by exploring the new data models as well as the interaction between work and reservations.

Warehouse and transportation management hands on experience
This will be a hands-on course to preview the functionality available in Microsoft Dynamics AX 2012 R3 for warehouse management and transportation management.

Tracking dimension at work! See the new item tracing, batch attributes-based pricing and batch merge within Microsoft Dynamics AX 2012 R3
This course will dive into understanding how the potency of products is taken into account when calculating consumption for batch orders, leveraging potency pricing to improve the sales process, and utilizing the new batch merge capability to achieve a potency level requirement from a customer during sales.

This is all for now. Follow me on this adventure in both my blog and twitter.