This update is just a quick note and shout-out on a topic I ran into at a client. I’m working on their Planning application integration via FDMEE. Now this is a very big environment that has one Foundation server and two Essbase servers, which are not clustered but standalone.

Now somehow we had a situation like this across the two Essbase servers (it was fortunately discovered in the TEST environment):
Server 1:
– Workfrc
– Expense
– OtherDB
– OtherDB2
Server 2 – AppName:
– Expense
– Expen14
– Expen15

You might have guessed it: the application on Server 2 is in fact a copy, some kind of personal archive an administrator kept of the Hyperion Planning application. The second application was deployed as a native Essbase one.

But let’s rewind to what happened – I migrated the solution from my own VM, and all was good all the way to the “Third Fish”. When I clicked the “Third Fish” it would keep failing, as FDMEE was convinced some of the Entity members did not exist; after testing all locations, one finally complained about FY16, and at that point I knew this was REALLY weird. I still had no idea about the second server. After a bit of investigation, we found the rules FDMEE was deploying were not visible in EAS. So where were my rules? What was going on? At that point someone mentioned Essbase Server 2. This finally started making sense. We found a copy of the application, and all the rules that FDMEE had deployed for itself. We renamed the second application, and magic…everything worked again.

Lesson learned: be very careful with your application naming.

A while ago I was asked to come up with a solution for Smart Merge. Now I’m more a database developer than much else, so my first thought was: let’s do it in the DB. As I spent a lot of time coding this, and I have to say I’m not yet really happy with my Jython skills, I’m not sharing the code, just the idea. Hopefully it will help you get through developing it yourself.

I used two additional artefacts in the DB: one table and one view. The view is dropped and recreated during BefLoad. Basically, I created a table similar to TDATASEG, and before a new load into TDATASEG is done, the lines for the pertinent CATKEY, PERIODKEY and ENTITY are copied into that auxiliary table.

The overview of steps executed:
I. In BefImport: data from TDATASEG for the previous relevant load is stored in TDATASEG_SM, and the view TD_TO_TDSM (which holds: TDATASEG_SM minus TDATASEG) is created for the current CATKEY, PERIODKEY and ENTITY
II. In AftExportToDat: based on the view, we append the missing lines with NODATA

The additional candy is a function that sorts through the file and makes a list of all the entities in there, so the entities present in the file can be queried dynamically (the sketch below uses a hard-coded list instead).
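Since I’m keeping the actual code to myself, here is only a minimal SQL sketch of the view idea – the table and view names match the ones above, but the column list, the filter values and the entity list are simplified placeholders, not the real implementation:

-- Sketch only: TD_TO_TDSM holds the intersections that were in the previous load
-- (TDATASEG_SM) but are missing from the current TDATASEG slice, so they can be
-- appended to the export file as NODATA. CATKEY/PERIODKEY/ENTITY values are placeholders.
CREATE OR REPLACE VIEW TD_TO_TDSM AS
SELECT CATKEY, PERIODKEY, ENTITY, ACCOUNT, UD1
FROM   TDATASEG_SM
WHERE  CATKEY = 12
AND    PERIODKEY = DATE '2016-01-31'
AND    ENTITY IN ('E100', 'E200')
MINUS
SELECT CATKEY, PERIODKEY, ENTITY, ACCOUNT, UD1
FROM   TDATASEG
WHERE  CATKEY = 12
AND    PERIODKEY = DATE '2016-01-31'
AND    ENTITY IN ('E100', 'E200');

In the real thing the view is dropped and recreated on each run with the current keys and the dynamically detected entity list.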

Got questions about it? Just leave me a comment below and I will try to explain, without giving out the code 🙂

UPDATE: After a few months of testing I had to move the load of TDATASEG_SM into the AftLoad event script, to ensure the data loaded in there is the data that is relevant for the next load.

So what if we would like to export data to a file via the CalcScript DATAEXPORT function, and then feed the output file to FDMEE? How do we do that? Easy: call it from a MaxL script, wrap that into a batch file, and call it from BefImport, so the file is there at the Import step. All this was done using the knowledge Francisco shared a while ago on OTN.

But let’s look into the exact steps to take:
1) Create a simple MaxL script (which I encrypted because reasons, and so it’s called mxl.mxls):
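Something along these lines – a minimal sketch only; the host, application and database names are placeholders, and the encrypted tokens are obviously not real ones:

/* mxl.mxls - log in with encrypted credentials and run the Export calc script */
login $key 0968370283 $key 3458929852 on EssbaseHost;
execute calculation 'PlanApp'.'Plan1'.'Export';
logout;
exit;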

2) Create a CalcScript that looks something like this (and is named Export, as seen above):
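A sketch with placeholder dimensions, members and path – the only thing that matters is that the output lands as output.txt where the FDMEE data load rule expects it (see the next point):

/* Export - level-0 export of a small slice for FDMEE to pick up */
SET DATAEXPORTOPTIONS
{
DataExportLevel "LEVEL0";
DataExportColFormat ON;
DataExportOverwriteFile ON;
};
FIX ("Actual", "FY16", @RELATIVE("Entity", 0))
DATAEXPORT "File" "," "E:\FDMEE\inbox\output.txt" "#";
ENDFIX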

Why like that? ‘Cause I set up the DLR to use the file output.txt in the inbox:

3) Then make a batch file (mxl.bat) with the following content – just feeding Essbase back its private key, so it can decrypt the script for itself:
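A sketch of mxl.bat – the MaxL shell path, the script location and the key are placeholders; the two numbers after the script name are the private key generated when the script was encrypted:

@echo off
REM mxl.bat - run the encrypted MaxL script, handing Essbase its private key back
call E:\Oracle\Middleware\EPMSystem11R1\products\Essbase\EssbaseClient\bin\startMaxl.cmd -D E:\FDMEE\scripts\mxl.mxls 2352896202,1574739925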

4) As the last step, put the following code in the BefImport script:
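A minimal Jython sketch – the batch path is a placeholder; it just runs mxl.bat synchronously so output.txt is sitting in the inbox by the time the Import step starts:

# BefImport event script (sketch): run the batch that triggers the Essbase DATAEXPORT
import subprocess

batchFile = "E:\\FDMEE\\scripts\\mxl.bat"  # placeholder location
fdmAPI.logInfo("Running DATAEXPORT batch: " + batchFile)

retCode = subprocess.call(batchFile, shell=True)
if retCode != 0:
    fdmAPI.logError("DATAEXPORT batch failed with return code " + str(retCode))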

Quick and painless!

I was helping out a colleague of mine, investigating why a script that would export data was not exporting anything in the new version. We started by only using:

Essbase would respond with:
Received Command [Calculate] from user [admin@Native Directory] using []
DataExport detects Dynamic Calc member [KittenMbr] in the range. Exporting Dynamic Calc data may slow down performance.
This DataExport operation will export data from existing blocks only. Any FIX on sparse dynamic calc members will be ignored. Use DATAEXPORTNONEXISTINGBLOCKS ON option to export data from all potential blocks.
Data Export Completed. Total blocks: [15]. Elapsed time: [0.023].
Total Number of Non-Missing cells exported: [60].

Now this is a cube with many sparse members on Dynamic Calc – I have no influence over this part – which was mitigated in the old version by using:
DataExportDynamicCalc ON;

Same result. Hmm, interesting.

After 2 minutes on Metalink, we were able to find Document 2123909.1, where Oracle development explains the changed behaviour. Short story: you are supposed to use them together:
DataExportDynamicCalc ON;
DataExportNonExistingBlocks ON;

After using them as above, I get this response from Essbase:
Received Command [Calculate] from user [admin@Native Directory] using []
DataExport Warning: This DataExport operation will export a total of [3958416] blocks. Exporting dynamic calc members from all blocks has significant performance overhead. Use DATAEXPORTNONEXISTINGBLOCKS OFF option to export data from existing blocks only.
Data Export Completed. Total blocks: [45]. Elapsed time: [508.234].
Total Number of Non-Missing cells exported: [180].

Do I even have to add: be careful with this? If nothing else is an indication, just look at how much more time it took to export still such a small amount of data.

I’ve already written partially about this, but now I’m trying to solve it by adjusting the listing.xml that LCM uses to import artifacts. LCM moves most of the needed stuff between environments correctly, with a few exceptions, organizing the folders by date of creation being the main one.
Now this is a dummy case, but it could happen – especially in older versions than the one I’m testing this on – that when importing from Acceptance to Production you end up with a similar mess:

When you look at listing.xml, it will look like this:

The circled lines are the only ones that need adjusting, both on the Folder side as well as the Forms one. This is what they will look like once re-ordered:

Once you have saved the changes, and after deleting the incorrectly ordered folders, simply import using the new listing file and TADA:

With the introduction of the new version, it’s more and more relevant to use Firefox instead of IE with the EPM System. Now, while testing the new version, the only problem I have run into is that, while trying to use the “Open in SmartView” functionality, you might encounter this:

To fix it, for that one user you will need to add a registry key, something like the one below, pointing to the installation location of Smart View:
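As a .reg sketch only – Firefox picks up extensions registered under this key, but the exact value name (the extension ID) and the path have to match your Smart View installation; the ones below are just placeholders:

Windows Registry Editor Version 5.00

; value name = extension ID, value data = folder/xpi of the Smart View Firefox extension
[HKEY_CURRENT_USER\Software\Mozilla\Firefox\Extensions]
"smartview@oracle.com"="C:\\Oracle\\SmartView\\extensions\\smartview"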

For this to take effect, restart Firefox. After the restart it will ask you if it may install the extension, which should then be visible in the add-ons tab:

Hope this helps 🙂

Working with Hyperion Planning, sometimes you get to dig around in the Planning DB – here is my latest story. Might be good to mention: don’t do this at home alone, without supervision by an adult.

I have been dealing with an application sitting in an older Hyperion (x.2.1) version. And when using LCM, it decided to re-order folders as it wished. “Move” did not do much in the sense of letting me re-order them. Here is a view of me trying to import the data forms – btw you might notice it’s not the same version, as my demo lab is x.2.3:



In HP, after import with LCM (AppX*):

Which leaves me with a nice mess. So I decided to have a look at the background.

As I had to start somewhere, I first had a look at a select statement to get the counts of the most common object types:
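Something along these lines – a sketch against the standard Planning repository tables HSP_OBJECT and HSP_OBJECT_TYPE; adjust the schema prefix to your environment:

-- Count of repository objects per type, most common first
SELECT t.TYPE_NAME, COUNT(*) AS OBJ_COUNT
FROM   HSP_OBJECT o
       JOIN HSP_OBJECT_TYPE t ON t.OBJECT_TYPE = o.OBJECT_TYPE
GROUP  BY t.TYPE_NAME
ORDER  BY OBJ_COUNT DESC;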

On one of the apps I’m running in my x.2.3 lab, it gives this result:

What about the folder count per parent? To be able to compare two environments you need the name and not the ID, as the ID is probably different from app to app.

So let’s run the query, which just gives us the counts and the names. All the queries from here on were run against our AppX database:
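A sketch of that query – counting folders per parent, keyed on the parent’s name rather than its ID; the type name ‘Folder’ is an assumption, so check HSP_OBJECT_TYPE in your version:

-- Folder count per parent, by name so two apps can be compared
SELECT p.OBJECT_NAME AS PARENT_NAME, COUNT(*) AS FOLDER_COUNT
FROM   HSP_OBJECT c
       JOIN HSP_OBJECT p ON p.OBJECT_ID = c.PARENT_ID
       JOIN HSP_OBJECT_TYPE t ON t.OBJECT_TYPE = c.OBJECT_TYPE
WHERE  t.TYPE_NAME = 'Folder'
GROUP  BY p.OBJECT_NAME
ORDER  BY FOLDER_COUNT DESC;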

But let’s go back to the problem that brought me here. You might have already known, or noticed while reading this, that the important column is POSITION, as that is how Planning knows how to sort the folders. Let’s take the “two ends” of the folders, just for comparison – as you can see, the positions start at 1:

If you don’t know the “Forms” object ID, you can always do this to see the second generation:
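A sketch of what is meant here – anchoring on the folder named “Forms” and listing its direct children with their POSITION:

-- Second generation: direct children of the "Forms" root folder, in display order
SELECT c.OBJECT_ID, c.OBJECT_NAME, c.POSITION
FROM   HSP_OBJECT c
WHERE  c.PARENT_ID = (SELECT OBJECT_ID
                      FROM   HSP_OBJECT
                      WHERE  OBJECT_NAME = 'Forms')
ORDER  BY c.POSITION;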


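The fix itself is then just an update of that POSITION column – a hedged sketch only, with a placeholder folder name and position; keep sibling positions unique and sequential, and as said above, back things up and don’t do this unsupervised:

-- Move one folder to the desired slot by rewriting its POSITION
UPDATE HSP_OBJECT
SET    POSITION = 3
WHERE  OBJECT_NAME = 'Revenue Forms'
AND    PARENT_ID = (SELECT OBJECT_ID FROM HSP_OBJECT WHERE OBJECT_NAME = 'Forms');
COMMIT;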
After the update has run:

If we now check our AppX we see this:

Very easily you can do something similar with TaskLists:
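For task lists the pattern is the same – a sketch, assuming the root object is named ‘Task Lists’ (check the actual name in HSP_OBJECT for your app):

-- Direct children of the task list root, in display order
SELECT c.OBJECT_ID, c.OBJECT_NAME, c.POSITION
FROM   HSP_OBJECT c
WHERE  c.PARENT_ID = (SELECT OBJECT_ID
                      FROM   HSP_OBJECT
                      WHERE  OBJECT_NAME = 'Task Lists')
ORDER  BY c.POSITION;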


So, on the other hand, this is how it looks when creating folders in Planning:
Step 1:
Step 2:
Step 3:
And Step 4 looks like Step 1, so it’s working as intended; if you want to re-order, you just have to do it with some SQL skill.

Here is a teaser of a session I encourage you to join at COLLABORATE 15, as I helped a bit with the prep work 😀


I will post some tricks you might need to do this on your own, if you want to try it out – after the session though 😀

As promised the lessons learned:
I) The Essbase application on which we are reporting should roll up, ideally to the top (Gen1).
II) Name the generations in Essbase – for less confusion.
III) Don’t flatten the account dimension – keep “Fact” – and change the setting on the Account dimension.
IV) Tag one dimension as a time dimension.
V) (“the other option”): it can be done through a Data Model (BI Publisher) created from the SQL statement of reports created in Analysis (copied from under the “Advanced” tab); an example of the SQL statement (which you can already adjust):

SELECT s_0, s_1, s_2, s_3, s_4, s_5, s_6, s_7, s_8, s_9, s_10, s_11, s_12 FROM (
SELECT
0 s_0,
"PRS"."Account"."Accounts" s_1,
"PRS"."Department"."G2: AllDepts and Settings" s_2,
"PRS"."Department"."G3: Department" s_3,
"PRS"."Product"."G2: AllProd and NoProd" s_4,
"PRS"."Product"."G3: Product" s_5,
SORTKEY("PRS"."Account"."Accounts") s_6,
SORTKEY("PRS"."Department"."G2: AllDepts and Settings") s_7,
SORTKEY("PRS"."Department"."G3: Department") s_8,
SORTKEY("PRS"."Product"."G2: AllProd and NoProd") s_9,
SORTKEY("PRS"."Product"."G3: Product") s_10,
"PRS"."IncStmt"."Fact" s_11,
REPORT_AGGREGATE("PRS"."IncStmt"."Fact" BY "PRS"."Department"."G2: AllDepts and Settings","PRS"."Department"."G3: Department","PRS"."Product"."G2: AllProd and NoProd","PRS"."Account"."Accounts") s_12
FROM "PRS"
WHERE
(("Year"."Year - Default" = 'FY16') AND ("Scenario"."Scenario - Default" = 'Budget') AND ("Version"."Version - Default" = 'Working Version') AND ("Period"."G2: Total Year and BegBalance" = 'YearTotal') AND ("Product"."G2: AllProd and NoProd" <> 'NoProd') AND ("Department"."G2: AllDepts and Settings" <> 'Settings') AND ("Account"."Accounts" IN ('Price', 'Volume', 'Revenue', 'COGS', 'Discount', 'Bonus', 'SalesExp', '')))
)

A) Working with Data Model
Creating a Data Set

In the Data tab:
I) Click View
II) Click “Table View”
III) (Optional) Either in the SQL or in the Structure tab, change the column names to make them easier to identify in Mobile Designer

Ready to save and use

The normal way, by using Subject Areas:

Using the BI Publisher Data Model:

And this is how it looks at the end:

Just when I thought I was done with my installation, it seems I was wrong: while working with HPCM, I wanted to test “design aggregation” and found this:


Hopefully I get it sorted before Friday – I’ll keep you posted.

So I looked at it again and found something funny – when you read the following note from Oracle Doc ID 1608746.1, they say that if you want to obtain JavaFX, go here. No Linux edition there. So then you have a look at asconsole.jnlp and see this:

Why are all these Windows references jumping out now? 😀

So no news…which in this case is not good news.

Better late than never: I got to install the new version of Hyperion Planning. It’s still not patched, but that will be my next step. For now I got this:

In the installation I used: CentOS 5.11 and Oracle DB 12c.

Kernel info (you will be able to see it in the validation, later on as well):

The uninformed surprise (as I did not prep on the new things in the DB, this bit was quite a surprise – both a positive and a negative one):

I was surprised how easy the Hyperion bit was. The install was pretty much as always: Next, next, next…and the configuration, though I did it in a few iterations, was very simple and without any pain.

Validation (“Green…green everywhere”):

At this point I had to see a new application in the system…as seeing EAS empty made me a bit sad.

HCSample seemed a good idea. At this point I wanted to see an LCM error before anything else, so I imported CalcMgr first just to see it fail:

Sounds correct…I did try to trick you, CalcMgr. Now, importing it module per module works. But did they somehow improve the import process? Almost…almost.


…2 minutes later. Did I even configure FR? Nope 😀

Okay. Let’s try this again. Delete the existing app and rules…and just in case the DS as well, so at least I know I started with a clean environment.

And ta-da:

After a further check – I still had to create the FR connections (5 in this case).
The FR book still does not work as expected.

On a further note, it all looked good in Planning (& FR), until I tried to look into CalcMgr.
It turns out only some parts got imported. Reimporting it on its own, many more artifacts are present now.

Last thoughts on this point:
– As of now I would still NOT recommend importing the whole starter kit in one go…at least based on this experience.
– I’m using 8 GB on my virtual machine, running the DB, WebLogic and Hyperion, and it’s not slow – so kudos*.

*For Reference