Tuesday, December 15, 2015

The forecast has not been generated. The Database with 'Name' = Demand Forecast initial Test' doesn't exist in the collection.

Hi There!

I hope everyone is having a good week so far.
On another note, I have been working on a few issues around demand forecasting in AX 2012 R3 CU8, and after our infrastructure team installed Analysis Services, users were getting the following error:

"The forecast has not been generated. The Database with 'Name' = Demand Forecast initial Test' doesn't exist in the collection."

The actual solution is pretty simple, but I thought the resolution from LCS was too general and it did not provide the details I was looking for. If you access LCS Issue Search for this specific issue (https://fix.lcs.dynamics.com/Issue/Solution/834451?bugId=1703068&qc=70e497218e943038e1e3d5bb836399fc9cfcbd2e3447897bcbd793236dcf12fd), you will see the following statement:

"Add the AOS service account to the OLAP Database Roles."
Great! Pretty straightforward, but which role and which database? The following steps will hopefully help someone save some time on this.


A user runs the Demand Forecast in AX and gets the following error message - The forecast has not been generated. The Database with 'Name' = Demand Forecast initial Test' doesn't exist in the collection.


To solve the problem, follow these steps:

1. Go to the Analysis Services server, find the Demand Forecast initial database, and expand its Roles node.
2. Right-click the Production Planner role and choose Properties.
3. Go to Membership and click Add.
4. Add the AOS service account.

That's all for now. Have a great week!


Saturday, January 24, 2015

TFS Online - Creating and Connecting to TFS Online with AX2012


Hi There,
I hope everyone is having a great weekend so far, and that you are ready for a great post about TFS Online and how to connect to it.
Before continuing with the step-by-step directions on how to connect to TFS Online, I wanted to share a few links that I think will be of interest to you if you are looking to go with this option in your Microsoft Dynamics AX implementation. In addition, really soon I'll post my view on branching and what might work best for a big implementation. Also, I'll be posting information on how to use the Scrum template: creating features, backlog items, and sprints, associating sprints with backlog items, and these with features, etc.
What is Visual Studio Online?
Visual Studio Online Pricing
With the above out of the way, let's jump into creating a new Visual Studio Online project and connecting it to an instance of AX 2012 R3. In the following steps I'm assuming you already have a Visual Studio Online account, and that you have your repository and branching implemented (although I will cover these topics very, very soon).
  • Go to https://YOURPROJECTNAME.visualstudio.com/DefaultCollection

  • Go to the control panel and click DefaultCollection. Once there, go to administration tasks and choose the option Create new team project, as shown below.
  • Give your project a name and choose a process template. In my case I chose a Scrum template. For more information on process templates go to this link.
  • Once you create your new Visual Studio Online project, you should see the following.
  • This is the step where you would create your branches and your local server (or workstation) repositories. The following is a quick guide to branching, simple but very effective; click this link to open it.

  • After your branching and repositories are good to go, open a development workspace in AX 2012 R3 (or earlier) and go to Version Control / Version Control Parameters.
  • Under General fill out the following information:

a. Version Control Status = Enable
b. Version Control System = Team Foundation Server
c. Repository Folder = [Choose a local repository]
d. Mark the two check boxes (as shown below)

  • Under Team Foundation Server, fill out the following information:

a. Team Foundation Server URL = https://YOUR-PROJECT-NAME.visualstudio.com/DefaultCollection
b. Team Foundation Project Name = YOUR PROJECT NAME
c. Branch Folder = YOUR BRANCH (i.e. Development, QA, UNT, etc.)

  • If you are using Visual Studio 2010 SP1, you'll be fine. If not, you'll get the following error.
  • To resolve this, download and install KB2662296.

  • Once you have installed the above KB, restart your AOS and open AX 2012 R3 again. You'll see the following screen, which means that after signing in, you are ready to start enjoying the benefits of TFS Online.
This is all for now. Remember to check my blog for updates on how to use (in my own way that is) the ALM features that TFS online offers.

Have a great week ...

Saturday, March 15, 2014

Automated Deployment with Windows Azure - AX 2012 R3


Hi there!

I hope everyone had a great and productive week. I certainly did.

On this post I would like to share what I learnt about Automated Deployment with Windows Azure at the AX 2012 R3 Tech Conference. As always, I would like to extend my gratitude to TriBridge for taking me to this event.

As discussed in previous events, Microsoft said that it is moving to a cloud environment where it will provide a service to host and run AX instances in the cloud. At this stage, Microsoft offers organizations a wide variety of services that provide a well-designed infrastructure for development, testing, and small-scale production environments.
The following is the Azure hosting model (the blue boxes are what Azure provides us).

Azure Setup
Only AX 2012 R3 is certified and supported on Azure at this time. When asked about older CUs (i.e. CU6), Microsoft said it could be possible to work with these versions, and a number of companies are doing it, but it is not supported.
A very cost-effective aspect is that Azure takes care of all the back-end processes when creating a new Azure instance. Microsoft accomplishes this with a set of automated scripts that install the instance and apply a light configuration. However, customer-specific configuration as well as customer-specific network details are not part of the automated process, for obvious reasons.
Moving right along, Azure provides a very cost-effective solution for hosting. Azure calls it "Pay-as-you-Go", which means that a user is charged only while the Azure instance is in use.
One of the main benefits of this solution is that any device with RDP capabilities can access the Azure instances.
Azure Deployment Services

Microsoft provides a robust framework for deploying Microsoft Dynamics AX Instances to Azure. The following are most of the steps needed to make use of these services.
  1. A user/organization must get an Azure subscription ID by signing into the Azure website.
  2. There is a 3-month free trial available.
  3. A new subscription includes 20 cores for a basic deployment of development and test environments.
  4. The Azure subscription service will set up the instance automatically.
  5. Azure provides different topologies (Development, Test, and Production; the latter needs more than 20 cores in a real business deployment scenario).
  6. By default, Azure creates 2 machines per instance to support maintenance.
  7. A typical Dev/Test deployment takes approximately 8-10 hours.
  8. The deployment process is an "intelligent" process that retries failures. In addition, a user can define the maximum number of retries, which helps reduce troubleshooting time.
  9. Azure provides a demo instance with Contoso data.
  10. Azure provides Lifecycle Services as a default feature in each instance.
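Point 8's retry behavior can be sketched roughly as follows. This is a hypothetical Python illustration of retrying a failing step up to a user-defined maximum, not the actual Azure deployment scripts; every name here is made up:

```python
import time

def run_with_retries(step, max_retries=3, delay_seconds=0):
    """Run a deployment step, retrying on failure up to max_retries times."""
    attempts = 0
    while True:
        attempts += 1
        try:
            return step()
        except Exception as error:
            if attempts > max_retries:
                # Give up and surface the last error for troubleshooting
                raise RuntimeError(f"step failed after {attempts} attempts: {error}")
            time.sleep(delay_seconds)  # back off before the next attempt

# Example: a flaky step that only succeeds on the third attempt
calls = {"count": 0}
def flaky_step():
    calls["count"] += 1
    if calls["count"] < 3:
        raise IOError("transient failure")
    return "deployed"

print(run_with_retries(flaky_step, max_retries=5))  # → deployed
```

A cap on retries like this is what keeps a stuck deployment from retrying forever, which is why being able to set the maximum shortens troubleshooting.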

Post Deployment Considerations

Although Azure does many of the configuration tasks automatically, there are a number of post-deployment actions we need to follow up on after each setup. The following describes the steps needed after deployment.

In addition, an important point to take into consideration is setting up TFS, Outlook, and Lync (if available for a customer). Microsoft can help a customer/partner set up these applications.

One important point is the SQL Server Always On feature, which brings SQL Server high availability and disaster recovery to a whole new level by allowing multiple copies of the database to be highly available. Always On Availability Groups allow you to fail over a group of databases as a single entity, unlike database mirroring, where you can only do so one database at a time. Further, this architecture also offers a SQL witness, whose main task is to monitor the mirroring scenario and initiate automatic failover.

Azure also provides a REST interface instead of a SOAP one. REST is a simple stateless architecture that generally runs over HTTP, and is often used in mobile applications, social networking Web sites, and automated business processes.
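REST's stateless, resource-oriented style can be seen in miniature below. This is a hedged Python sketch that only constructs a request object without sending it; the endpoint URL is made up for illustration and is not an actual Azure API:

```python
import urllib.request

# Hypothetical resource URL -- real Azure REST endpoints differ
url = "https://management.example.com/subscriptions/123/services?api-version=1.0"

# A REST call is a stateless HTTP request against a resource URL: the verb
# (GET/PUT/POST/DELETE) expresses the operation, and everything needed to
# serve it travels with the request itself -- no session state on the server.
request = urllib.request.Request(url, method="GET")
request.add_header("Accept", "application/json")

print(request.get_method())     # → GET
print(request.get_full_url())
```

That self-contained quality is what makes REST a natural fit for the mobile and automated-process scenarios mentioned above.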

Finally, Microsoft recommends starting virtual machines in Azure in sequence; otherwise the IP addresses will not match the VMs' subnets. Microsoft was asked to expand on this issue, and they are not sure why it happens. The good news, however, is that they are working on it.
The following is the possible Azure portal architecture that Microsoft is working on.

Check out TriBridge Cloud Services at TriBridge Concerto. We provide cloud hosting for all your needs. Also, check our TriBridge Careers page and get on our winning team.


Friday, March 7, 2014

Create Dynamics AX Builds using the X++ Server - AX 2012 R3


Hi There!

I hope everybody is doing great! I had the opportunity to attend the Microsoft Dynamics AX 2012 R3 Tech Conference thanks to TriBridge.
Microsoft has worked hard on improving the X++ compiler for the new release. The new compiler is also available in CU7.

X++ Compiler Performance

In past releases, the X++ compiler has been the bottleneck of build and installation scenarios across the board. To address this, Microsoft recommends the following tips to improve compiler performance (CU7 and up only):
  • One single machine deployment
    • SQL Server
    • AOS
    • Client
  • 16GB of memory available.
  • Don’t constrain SQL Server memory consumption.
  • Installation of the KB2844240 Hotfix (AX 2012 R2) – Index Optimizations.
  • A few fast CPUs are a much better option than multiple slow ones.
  • Solid state drives, which are typically more resistant to physical shock, run silently, have lower access time, and less latency.

How the Microsoft Dynamics AX compiler works

The following depicts the phases of the X++ compiler in previous versions of Microsoft Dynamics AX.


It is important to note that in earlier versions of Microsoft Dynamics AX, build performance was affected by metadata moving between the client and the server. In addition, the long compile times were due to the deserialization of metadata and the in-memory cache.

The following is the architecture for the current compiler

R3 Compiler Improvements

Microsoft enhanced the compiler by allowing us to use either the AXBuild.exe command or the client. However, from an architectural point of view, they removed the client portion of the compiler in the R3 release.

The following depicts the new architecture improvements.


A few key points to underline are that the AOS now contains the logging information, so there is no cache in memory, and logs are generated on each AOS. In a multi-AOS deployment scenario, the AXBuild.exe process automatically consolidates these into one log.

Finally, when using the parallel compiler, CPU usage is extremely high. In a multi-CPU scenario, the AXBuild.exe process will automatically balance the load between CPUs. Also, it is important to understand that parallel does not mean multi-threaded; the new compiler is very much still a single-threaded process.

The following picture depicts what a parallel compiler output looks like

Visit http://www.tribridge.com/ and learn about our Dynamics AX practice, services and focus, as well as our cloud services Concerto. Join our winning club!

Tuesday, February 18, 2014

Microsoft Dynamics AX 2012 R3 - New Data Import Export Framework Changes

Hi There!

I hope everybody is doing great! I had the opportunity to attend the Microsoft Dynamics AX 2012 R3 Tech Conference thanks to TriBridge.

In this post I would like to discuss the new Data Import Export Framework or DIXF. As you probably know by now, this new framework is shipped with both CU7 and R3.

So, what can the new DIXF do? Well, to start, one of the key new features is that DIXF runs on top of the SSIS service interface, allowing incremental runs (UPSERT). Of course, it can import/export data, and Microsoft added the capability to compare and copy data between instances as well. In addition, the new DIXF version ships with the ability to choose different data sources such as text, MS Excel, and XML files.

Further, the new DIXF can be used to extract data directly from various ODBC*** sources such as SQL, MS Access, and MS Excel. These new additions will help us streamline our data migrations and data transfers much better.

***For ODBC types we will have to provide a connection string in order to simplify the data selection process. One cool thing I saw was that we can create new rows under Mapping Details to add mandatory fields, i.e. ACCOUNTNUM, in case a specific legacy system does not include them.

When this scenario is true, the custom value provided can be automatically filled by a number sequence value (if we want to) by choosing the “AUTO” option in that specific row, which would take a new AccountNum from the numbering sequence system. However, we can also choose to have default values as in older versions.
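The "AUTO" behavior described above can be illustrated with a small sketch. This is plain Python of my own, not DIXF itself; the sequence format and field handling are made up for illustration. Rows missing the mandatory ACCOUNTNUM are filled from a toy number sequence, and a default value can be supplied instead, as in older versions:

```python
from itertools import count

# A toy number sequence, standing in for AX's number sequence framework
sequence = (f"ACC-{n:06d}" for n in count(1))

def fill_account_num(rows, default=None):
    """Fill a missing mandatory ACCOUNTNUM, AUTO-style, from the sequence."""
    for row in rows:
        if not row.get("ACCOUNTNUM"):
            row["ACCOUNTNUM"] = default if default is not None else next(sequence)
    return rows

legacy_rows = [
    {"NAME": "Contoso", "ACCOUNTNUM": "C-001"},
    {"NAME": "Fabrikam"},          # missing the mandatory field
    {"NAME": "Adventure Works"},   # missing the mandatory field
]

filled = fill_account_num(legacy_rows)
print([r["ACCOUNTNUM"] for r in filled])  # → ['C-001', 'ACC-000001', 'ACC-000002']
```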

In terms of entities, the new DIXF ships with 150 entities, in comparison to the 78 (I think) it came with in earlier versions. These include master data, documents, journals, parameters, party, products, and configuration data.

Another cool feature is folder support. We will be able to move files around automatically (this needs to be pre-defined) to different folders in our domain based on the operations we are executing.

The following are a few other additions:

Parallel Execution: Ability to dissect data into bundles (e.g. 1,000 rows / 10 bundles = 100 rows per task).

This is particularly useful when large data loads need to take place. The tool provides the ability to allocate a group of records to tasks. This combination will create a bundle, and each bundle is independent of each other. See the following diagram for a visual representation of it:
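The bundle arithmetic can be sketched as follows. This is a plain Python illustration of splitting a load into independent, near-equal bundles, not the DIXF implementation:

```python
def make_bundles(rows, bundle_count):
    """Split rows into bundle_count near-equal, independent bundles."""
    size, remainder = divmod(len(rows), bundle_count)
    bundles, start = [], 0
    for i in range(bundle_count):
        # The first `remainder` bundles absorb one extra row each
        end = start + size + (1 if i < remainder else 0)
        bundles.append(rows[start:end])
        start = end
    return bundles

rows = list(range(1000))          # 1,000 rows to import
bundles = make_bundles(rows, 10)  # 10 bundles of 100 rows each

print(len(bundles), len(bundles[0]))  # → 10 100
```

Because each bundle is independent of the others, they can be handed to separate tasks without coordination, which is what makes this useful for large data loads.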

Role-Based Security: Provides a security framework for the different levels of an organization; this is built on top of the existing security framework (i.e. a production user cannot import HR data).

Mapper Control: Allows flexible mapping between custom entities and staging objects. In addition, mapping effort is reduced when using AX-friendly column names (i.e. ITEMID).

Custom Entity Wizard: We can compare data in different companies. This becomes especially interesting and useful to compare parameter data between gold and test instances, for example. When using this tool to import data that contains discrepancies, the system inserts the data into a staging table, where it is compared by another process across a specific number of companies and/or instances, and finally it gets updated.

At this point, a user can use the Comparison form to move records between different instances.

See the process in the following diagram:

NOTE: Sometimes the entity Wizard will only create a portion of the requirements and a technical consultant would have to finish the rest.

System Configuration Data: BI-Analysis and Reporting Services, Project Server Integration, EP, Batch Settings, Number Sequences, Email Parameters, AIF, System Service Accounts.

DIXF Import Process

The import process is done through an abstraction layer that uses SSIS behind the DIXF framework. Within this abstraction layer, we can add X++ customizations where needed.

I asked what the recommendation would be for migrating data from legacy systems; the following is what I could get from their answer (I was taking notes). There are two types of data migration architecture that consolidate both importing and cleansing data.

The first option is to have a middle tier that processes the data from a legacy system in an external system and cleans it before it goes to Microsoft Dynamics AX.

The second option is to import the data directly from a legacy system into Microsoft Dynamics AX.

Microsoft recommends keeping the data-cleansing business logic inside AX. The reason is that Microsoft Dynamics provides a data migration framework that is both extensible and customizable. The framework provides entity classes that can be extended to meet process-specific needs. In addition to the entity classes, the framework also provides the ability to create custom staging entities for further processing prior to the final push to an entity. This is depicted in the following picture:

The DIXF also provides a new error log preview function that allows a user to narrow down an error to the smallest unit possible to understand exactly where the error is occurring. This was not true in older versions of the DIXF. Further, the new DIXF also provides an Execution History function that allows a user to review and validate the staging data before the actual import to an entity.

DIXF Export Process

As mentioned earlier, because the DIXF also uses SSIS to export data from the framework, bulk exports can also be easily accomplished. In addition, as in older versions of DIXF, we can also generate our own source mapping and sample files. However, a cool new addition to the DIXF is that these files now can be of different types such as XLS, XML, Text, Tab delimited, Etc.

This approach sounds good and valid; however, in my mind it could be a double-edged sword, since XLS files might open the door for a few data consistency problems, as these files can contain formulas. I would suggest always understanding your source files, especially the XLS ones.
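As a minimal guard along these lines (a sketch of my own, not part of DIXF), incoming cell values could be scanned for formula-like content before import:

```python
def find_formula_cells(rows):
    """Return (row_index, column, value) for cells that look like formulas."""
    suspicious = []
    for i, row in enumerate(rows):
        for column, value in row.items():
            # Spreadsheet formulas start with "=", e.g. "=SUM(B2:B9)"
            if isinstance(value, str) and value.lstrip().startswith("="):
                suspicious.append((i, column, value))
    return suspicious

rows = [
    {"ITEMID": "A100", "QTY": "50"},
    {"ITEMID": "A200", "QTY": "=SUM(B2:B9)"},  # a formula, not a literal value
]

print(find_formula_cells(rows))  # → [(1, 'QTY', '=SUM(B2:B9)')]
```

Flagging these cells before the load gives you a chance to replace formulas with their literal values in the source file.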

DIXF Architecture

The following is the new DIXF architecture for R3.

Visit http://www.tribridge.com/ and learn about our Dynamics AX practice, services and focus, as well as our cloud services Concerto.

Sunday, February 2, 2014

Microsoft Dynamics AX 2012 R3 Tech Conference


Hi there!
I will be attending the Dynamics AX 2012 R3 Tech Conference in Seattle with my team members from TriBridge. In this post I would like to share the courses I will be taking at the conference and give my readers a snapshot of what each course is going to cover.

Finally, my goal will be to write a blog entry for each of the courses in depth. So follow my blog for the next couple of weeks as I will be posting very interesting information from these amazing courses.
In a nutshell, the Microsoft Dynamics AX Technical Conference 2014 begins on Monday, February 3rd in Seattle (Bellevue, Washington). As far as I know, this is the biggest event yet for the upcoming Dynamics AX 2012 R3 release, which will introduce many new features in both the technical and functional dimensions of the application.
Below is a list of the courses I will be taking and a brief description of each. I will link each of the titles below to a new blog post in the next few days, so be sure to check this page for updated information (Each title will be in blue when ready to be explored)

Day in the life of a Microsoft Dynamics AX Solution Architect
The main focus of this course will be to learn how the roles and responsibilities of a solution architect apply to the different phases of an implementation project. I will be enjoying this course.

Data import/export framework for Microsoft Dynamics AX 2012 R3
The focus of this course is to experience the new entity-based import/export framework, which will help us manage our import/export, integration, configuration, and data migration needs.

Automated deployment of AX 2012 R3 in Windows Azure using lifecycle services
I believe this course is critical to any solution and/or technical architect, as well as for any developers, as it will allow us to learn about the new features that Microsoft has developed for lifecycle services that will automate the deployment of Microsoft Dynamics AX 2012 R3 environments with Windows Azure.

Building Microsoft AX services integration with the Microsoft Windows Azure Service Bus
Like the one above, this course will also be critical for architects and developers alike. This course will cover the new Windows Azure Service Bus Adapter for the Application Integration Framework, which will allow us to deploy Microsoft Dynamics AX Services through the Azure Service Bus, making Microsoft Dynamics AX Services accessible from the cloud.

Data synchronization to multiple instances of Microsoft Dynamics AX
This course will help us learn about Master Data Management (MDM) in Microsoft Dynamics AX 2012 R3, and how it simplifies master data synchronization across multiple Microsoft Dynamics AX instances by utilizing conceptual business entities, metadata, and declarative configuration.

Optimizing the performance of Microsoft Dynamics AX deployment
I'm really looking forward to this course as it will cover some techniques that could help us gain higher performance in our Microsoft Dynamics AX deployment. A few key points that will be introduced will be design-patterns, parameter configuration and implementation pit-falls.

Create Microsoft Dynamics AX builds using the new X++ server-side parallel compiler
In this course we are going to go through an overview of the new X++ server side parallel compiler as well as best practices on how to apply the new compiler in creating Microsoft Dynamics AX builds in the context of multiple development environments integrated via TFS version control.

Technical Deep Dive - warehouse management
This is a course that will be critical for me, and I'm sure for the majority of Microsoft Dynamics AX professionals out there, as it will dive into the technical challenges of the new warehouse management inventory reservation system by exploring the new data models as well as the interaction between work and reservations.

Warehouse and transportation management hands on experience
This will be a hands-on experience course to preview the functionality available in Microsoft Dynamics AX 2012 R3 for warehouse management and transportation management.

Tracking dimension at work! See the new item tracing, batch attributes-based pricing and batch merge within Microsoft Dynamics AX 2012 R3
This course will dive into understanding how the potency of products is taken into account when calculating consumption for batch orders, how to leverage potency pricing to improve the sales process, and how to utilize the new batch merge capability to meet a customer's potency level requirement during sales.

This is all for now. Follow me on this adventure in both my blog and twitter.


Tuesday, August 27, 2013

Create, Deploy, Troubleshoot and Consume external services in AX 2012 R2


Hi There!
I hope everyone is having a great week so far. Summer is almost over here in the US, and I feel like I haven't taken much advantage of it this year. The good thing, however, is that I have been able to really focus on service development lately, and a ton of other cool AX stuff.
On this post I would like to share with you how to Create, Deploy, Troubleshoot and consume an external service in AX 2012 R2. As we all know, this has changed dramatically from AX 2009 services. It used to be very easy to consume services in AX 2009 (you can see an example in my post Consume a Currency Exchange Rates Web Service from AX 2009 - WFC ).
In AX 2012 R2, however, this has become somewhat more involved. Services are not necessarily harder to create and consume, but they require a few more setup steps. Now, the great advantage is that you can resolve the business logic either in the client itself (C# project) or in AX 2012 R2 (server deployment). This comes in handy for businesses that don't necessarily want an AX developer in house, and/or for large-scale integration projects, among other reasons.
Let's get to it!
Open Visual Studio and create a new Class Library project. Give it a name and click OK.
Right-click the project's References folder and click Add Service Reference.
Paste the http://www.webservicex.net/genericbarcode.asmx?WSDL URL into the Address bar. This is a Barcode Generator service. Give it a name and click OK.
This will create a new Service Reference and a new AppConfig file where both the basic and custom bindings are automatically generated for you.
Right-click the project name and choose Add "Service Name" to AOT. This will add the C# project to the AOT under Visual Studio Projects/Csharp Projects.
Once the project has been added to the AOT, you can choose the following properties and deploy the service.
NOTE: If you choose to deploy to the server, you will need to enable Hot Swapping Assemblies on the server configuration file.  See the following for more info (http://msdn.microsoft.com/en-us/library/gg889279.aspx).  If you choose to do this, you will have to restart the AOS.
After it is deployed, you would add code similar to the following:
static void TestBarcodeGenService(Args _args)
{
    Ed_SampleBarcodeGenerator.EdGenBarcode.BarCodeSoapClient service;
    Ed_SampleBarcodeGenerator.EdGenBarcode.BarCodeData barCodeData;
    System.Exception ex;

    try
    {
        service = new Ed_SampleBarcodeGenerator.EdGenBarcode.BarCodeSoapClient();
        barCodeData = new Ed_SampleBarcodeGenerator.EdGenBarcode.BarCodeData();
        service.GenerateBarCode(barCodeData, "0000992882");
    }
    catch (Exception::CLRError)
    {
        // Surface the CLR exception details in the infolog
        ex = CLRInterop::getLastException();
        info(ex.ToString());
    }
}

 Well ... that's all for now folks. Stay tuned, there is going to be a huge load of useful information in the next few weeks.


Friday, August 23, 2013

Create a Transfer Journal using AX 2012 R2 Document Services and C#


Hi there,
On this post I would like to share some C# code to create a Transfer Journal. I have written a few posts in the past about services, and they will help you understand how to create a service, service groups, deployment, etc.
Create Counting Journals
How to choose the right service
AX 2012 Services and AIF
Services Types
Creating a service in AX 2012
Back to creating a Transfer Journal with C#: this is interesting code, as we need to instantiate two different instances of the InventDim table: InventDimIssue and InventDimReceipt.
InventDimIssue can be thought of as the From values and InventDimReceipt as the To values (i.e. From Warehouse ==> To Warehouse).
In addition, another interesting point is that AX uses InventJournalTable and InventJournalTrans for all the inventory journal entries, and we specify, in C#, which entity (AXD) we will be using.
The following is the code:
private void InventTransferJourTest()
{
    InventTransferJournal.CallContext callContext = new InventTransferJournal.CallContext();
    InventTransferJournal.TransferJournalServiceClient servClient = new InventTransferJournal.TransferJournalServiceClient();
    InventTransferJournal.AxdTransferJournal transjournal = new InventTransferJournal.AxdTransferJournal();

    // Journal header
    InventTransferJournal.AxdEntity_InventJournalTable journalheader = new InventTransferJournal.AxdEntity_InventJournalTable();
    callContext.Company = "CEU";
    journalheader.JournalNameId = "TransferJourId";
    journalheader.Description = "Transfer Journal";

    // Journal line
    InventTransferJournal.AxdEntity_InventJournalTrans journalLines = new InventTransferJournal.AxdEntity_InventJournalTrans();
    journalLines.ItemId = "123456";
    journalLines.Qty = 45;
    journalLines.TransDate = DateTime.Now;

    // Issue dimensions (the "From" values)
    InventTransferJournal.AxdEntity_InventDimIssue inventDimIssue = new InventTransferJournal.AxdEntity_InventDimIssue();
    inventDimIssue.InventBatchId = "RUT";
    inventDimIssue.InventLocationId = "21";
    inventDimIssue.InventSiteId = "1";
    journalLines.InventDimIssue = new InventTransferJournal.AxdEntity_InventDimIssue[1] { inventDimIssue };

    // Receipt dimensions (the "To" values)
    InventTransferJournal.AxdEntity_InventDimReceipt inventDimReceipt = new InventTransferJournal.AxdEntity_InventDimReceipt();
    inventDimReceipt.InventSiteId = "2";
    inventDimReceipt.InventLocationId = "11";
    inventDimReceipt.InventBatchId = "RSR";
    journalLines.InventDimReceipt = new InventTransferJournal.AxdEntity_InventDimReceipt[1] { inventDimReceipt };

    journalheader.InventJournalTrans = new InventTransferJournal.AxdEntity_InventJournalTrans[1] { journalLines };
    transjournal.InventJournalTable = new InventTransferJournal.AxdEntity_InventJournalTable[1] { journalheader };

    try
    {
        servClient.create(callContext, transjournal);
    }
    catch (Exception e)
    {
        // Handle or log the AIF fault details
        Console.WriteLine(e.Message);
    }
}

 That's all for today and stay tuned as in the next few weeks I will be talking about TFS and how to work with AX 2012 in a way that we utilize the TFS server to its max capacity.

Have a great weekend!

Thursday, August 22, 2013

Create Counting Journal in AX 2012 R2 using Document Services

Hi There,

It has been a long time since I created my last post. I have been very busy learning new things about AX 2012 R2 and other related technologies such as the Data Import/Export Framework, TFS and AX 2012, and SharePoint development for the Enterprise Portal, among others. Everything will come in its own time, and I'm planning on sharing a lot in the weeks to come, so stay tuned!

On this post I would like to share some C# code to create a Counting Journal in AX 2012 R2 using the InventCountingJournalService that ships with AX. Let's keep in mind that the AX 2012 R2 document services are an extremely low-cost option for providing these features to an external client with no AX development whatsoever.

So, I would like to start from the beginning:

1- Create a Service Group

2- Add the InventCountingJournalService to the Service Group

3- Deploy the Service Group. This will output the following.


4- Get the WSDL URI from the inbound ports form.

5- Go to Visual Studio, create a new windows form project, add a button and double click the button to create a button event.

6- Right-click Service References and choose Add Service Reference.

7- Paste the WSDL URI and click Go.

8- Give your service a name, i.e. InventCountingJournal.

9- Write the following code and test it.

private void InventCountingJournal()
{
    InventCountingJournal.CallContext callContext = new InventCountingJournal.CallContext();
    InventCountingJournal.CountingJournalServiceClient servClient = new InventCountingJournal.CountingJournalServiceClient();
    InventCountingJournal.AxdCountingJournal countJournal = new InventCountingJournal.AxdCountingJournal();

    // Journal header
    InventCountingJournal.AxdEntity_InventJournalTable journalHeader = new InventCountingJournal.AxdEntity_InventJournalTable();
    callContext.Company = "CEU";
    journalHeader.JournalNameId = "CountJour";
    journalHeader.Description = "Counting Journal";

    // Journal line
    InventCountingJournal.AxdEntity_InventJournalTrans journalLines = new InventCountingJournal.AxdEntity_InventJournalTrans();
    journalLines.ItemId = "12345";
    journalLines.Qty = 50;
    journalLines.TransDate = DateTime.Now;

    // Inventory dimensions for the line
    InventCountingJournal.AxdEntity_InventDim inventDim = new InventCountingJournal.AxdEntity_InventDim();
    inventDim.InventBatchId = "3";
    inventDim.InventLocationId = "1";
    inventDim.InventSiteId = "3";
    journalLines.InventDim = new InventCountingJournal.AxdEntity_InventDim[1] { inventDim };

    journalHeader.InventJournalTrans = new InventCountingJournal.AxdEntity_InventJournalTrans[1] { journalLines };
    countJournal.InventJournalTable = new InventCountingJournal.AxdEntity_InventJournalTable[1] { journalHeader };

    servClient.create(callContext, countJournal);
}

You can test this by clicking the button, which calls this method. A new counting journal will be created in AX. Then you can either have a batch job post all the journals or simply have a user do it manually.

That's all for now!