So what if we would like to export data to a file via the CalcScript DATAEXPORT function, and then feed the output file to FDMEE? How do we do that? Easy: call it from a MaxL script, wrap that into a batch file, and call it from the BefImport event script, so the file is there at the Import step. All this was done using the knowledge Francisco shared a while ago on the OTN.

But let’s look into the exact steps to take:
1) Create a simple MaxL script (which I encrypted because reasons, and so it’s called mxl.mxls):
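A minimal sketch of what that script could look like (server, application, database and credentials are placeholders; after encryption the login values are replaced by $key tokens):

/* run the Export calc script and get out */
login admin password on essbaseserver;
execute calculation 'MyApp'.'MyDb'.'Export';
logout;
exit;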

2) Create a CalcScript that looks something like this (and is named Export, as seen above):
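A minimal sketch of what the Export calc script could look like (the options, FIX members and drive path are placeholders; only the output.txt file name matches my setup):

SET DATAEXPORTOPTIONS
{
  DataExportLevel "Level0";
  DataExportColFormat ON;
  DataExportOverwriteFile ON;
};
FIX ("Actual", "Working", "FY16")
  DATAEXPORT "File" "," "D:\FDMEE\inbox\output.txt";
ENDFIX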

Why like that? ‘Cause I set up the data load rule (DLR) to use the file output.txt in the inbox:

3) Then make a batch file (mxl.bat) with the following content – just feeding Essbase back its private key, so it can decrypt the script for itself:
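A minimal sketch of mxl.bat (the MaxL shell location, script path and private key are all placeholders):

@echo off
rem decrypt and run the encrypted MaxL script with the private key generated at encryption time
call essmsh -D D:\Scripts\mxl.mxls 29671,23233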

4) As the last step, put the following code into the BefImport event script:
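A minimal Jython sketch for that BefImport script (the path to mxl.bat is a placeholder):

# run the batch file and wait for it, so output.txt is in the inbox before the import starts
import os
retCode = os.system("D:\\Scripts\\mxl.bat")
fdmAPI.logInfo("mxl.bat finished with return code: " + str(retCode))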

Quick and painless!

I was helping out a colleague of mine, so I was investigating why a script that would export data in 11.1.2.2 was not exporting anything in 11.1.2.4. I started by only using:
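Something along these lines; a simplified sketch where the member names and the file path are placeholders (KittenMbr is the dynamic calc member from the log below):

SET DATAEXPORTOPTIONS
{
  DataExportLevel "All";
};
FIX ("Actual", "FY16", "KittenMbr")
  DATAEXPORT "File" ";" "D:\Exports\EXP.txt";
ENDFIX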

Essbase would respond with:
Received Command [Calculate] from user [admin@Native Directory] using [EXP.csc]
DataExport detects Dynamic Calc member [KittenMbr] in the range. Exporting Dynamic Calc data may slow down performance.
This DataExport operation will export data from existing blocks only. Any FIX on sparse dynamic calc members will be ignored. Use DATAEXPORTNONEXISTINGBLOCKS ON option to export data from all potential blocks.
Data Export Completed. Total blocks: [15]. Elapsed time: [0.023].
Total Number of Non-Missing cells exported: [60].

Now this is a cube with many Sparse members set to Dynamic Calc – I have no influence on that part – which was mitigated in the old version by using:
DataExportDynamicCalc ON;
DATAEXPORTNONEXISTINGBLOCKS OFF;

Same result. Hmm interesting.

After 2 minutes on Metalink, we were able to find Document 2123909.1, where Oracle development explains the changed behaviour. Short story: you are supposed to use them as:
DataExportDynamicCalc ON;
DATAEXPORTNONEXISTINGBLOCKS ON;
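In the calc script that means the options block ends up looking roughly like this (a sketch; the export level is just a placeholder, the two lines that matter are the ones above):

SET DATAEXPORTOPTIONS
{
  DataExportLevel "All";
  DataExportDynamicCalc ON;
  DATAEXPORTNONEXISTINGBLOCKS ON;
};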

After using them as above, I got this response from Essbase:
Received Command [Calculate] from user [admin@Native Directory] using [ESC.csc]
DataExport Warning: This DataExport operation will export a total of [3958416] blocks. Exporting dynamic calc members from all blocks has significant performance overhead. Use DATAEXPORTNONEXISTINGBLOCKS OFF option to export data from existing blocks only.
Data Export Completed. Total blocks: [45]. Elapsed time: [508.234].
Total Number of Non-Missing cells exported: [180].

Do I have to add: be careful with this? If nothing else is an indication, just look at how much more time it took to export what is still such a small amount of data.

I’ve already written partially about this, but now I’m trying to solve it by adjusting the listing.xml that LCM uses to import artifacts. LCM moves most of the needed stuff between environments correctly, with a few exceptions, ordering the folders by date of creation being the main one.
Now this is a dummy case, but it could happen – especially in older versions than the one I’m testing this on – that when importing from Acceptance to Production you end up with a similar mess:

When you look at listing.xml, it will look like this:
(screenshot: listing.xml with the Forms folders in the incorrect order)

The circled lines are the only ones that need adjusting, both on the Folder side and on the Forms side. This is what they will look like once re-ordered:
(screenshot: listing.xml with the Forms folders in the correct order)

Once you have saved the changes, and after deleting the incorrectly ordered folders, simply import using the new listing file and TADA:
(screenshot: the problem solved)