Metadata on the fly

Oracle Data Management makes your life easier, both in the cloud and on-premises. However, data loads can fail, often because of missing metadata members. Resolving these missing members manually is a time-consuming task. Automating these steps saves both time and downtime.

Solving missing metadata

Done manually, solving the missing metadata members in the application takes a lot of time, since it requires the following steps:

  1. Check the error message from the load action to capture the rejected members for the different dimensions
  2. Edit the dimensions by adding the rejected metadata members in the right place in the tree structure
  3. Refresh the cube to make these new members visible to the application
  4. Rerun the failed Data Management loading task
  5. Validate whether the new load has executed successfully

Automating the missing metadata process in Oracle Data Management

Consequently, one of our clients asked Swap Support whether it is possible to automate this process within the environment, thereby saving time and reducing downtime during working hours. Luckily, the answer to this question is yes.

In this blog I will describe the functional steps of the data management data loads that have been automated. If you’re interested in the scripts and techniques used for this automation, please feel free to contact me directly.

The starting point for implementing this automated process is to use the standard functionality of the Hyperion environment as much as possible, and use the automation scripts only for executing already defined jobs. Therefore, we made use of the Metadata upload functionality in Data Management to create and map the new missing members. Furthermore, we used REST APIs of both Data Management and FCCS/PBCS (HFM/Planning) to execute and monitor the different loads.
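To give an idea of what executing an already defined job looks like, here is a minimal Python sketch of launching a Data Management data load rule through the REST API. The server URL, credentials, and rule name are placeholders; the `/aif/rest/V1/jobs` endpoint and the `DATARULE` job type come from Oracle's EPM REST API documentation, but verify the exact parameters against the version of your environment.

```python
import base64
import json
import urllib.request


def build_datarule_payload(rule_name, start_period, end_period):
    """Request body for running a Data Management data load rule."""
    return {
        "jobType": "DATARULE",
        "jobName": rule_name,
        "startPeriod": start_period,
        "endPeriod": end_period,
        "importMode": "REPLACE",
        "exportMode": "STORE_DATA",
    }


def run_data_rule(base_url, user, password, payload):
    """POST the job to the Data Management REST endpoint and return its JSON reply."""
    request = urllib.request.Request(
        f"{base_url}/aif/rest/V1/jobs",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic "
            + base64.b64encode(f"{user}:{password}".encode()).decode(),
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())
```

The response contains a job ID that the automation can poll until the load finishes, so the script knows whether to continue or to start the error-handling path described below.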

How does it work?

The best way to explain our adopted steps is to run the process from beginning to end, and show the functional steps that are needed to come to a successful load of the transactional data.

  1. Remotely execute a Data Management transactional data load rule that imports an uploaded flat file into the target application
  2. In case of failure, download the corresponding error log from the environment and parse this log file for rejected members.

  3. Identify the rejected data rows in the corresponding log file.

  4. To find which of these members is missing in the application, the automated script compares the rejected values in this error line with all base-level members of the corresponding dimensions, and creates a new flat file that contains all rejected members of the dimension.
  5. Upload the created metadata load file into a location within Data Management that is defined to upload dimension members to the corresponding Dimension, and run the automated metadata load in Data Management.
  6. A nice feature of updating metadata via Data Management is the option to refresh the database after each successful load.

  7. If the metadata load has been executed successfully and the database has been refreshed, the script reruns the initial transactional data load rule that failed previously.

  8. After the transactional data has been loaded successfully, an automated mail is sent to those responsible for the loaded location, containing all details of this automated run (including the newly created members).
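Steps 2 through 5 can be sketched as follows. The log line layout used here is a simplified assumption (the real Data Management error log format varies per version and target application), and the member names, parent member, and file layout are purely illustrative.

```python
import re


def extract_missing_members(log_lines, base_members):
    """Collect member names from rejected rows that are unknown to the application.

    Assumes each rejected row is logged as:
        Error: 3303 | <member> | <original data row>
    (a simplification; check the actual log layout of your environment).
    """
    missing = set()
    for line in log_lines:
        match = re.match(r"Error:\s*3303\s*\|\s*([^|]+?)\s*\|", line)
        if match:
            member = match.group(1)
            if member not in base_members:
                missing.add(member)
    return sorted(missing)


def write_metadata_load_file(members, parent, path):
    """Write a flat file that the metadata load location in Data Management imports."""
    with open(path, "w", encoding="utf-8") as handle:
        handle.write("Parent,Member\n")
        for member in members:
            handle.write(f"{parent},{member}\n")
```

The resulting flat file is then uploaded to the metadata location in Data Management (step 5), so the standard, already defined metadata load rule does the actual member creation and mapping.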

Saving both time and downtime

Automating data management loads, and thereby creating metadata on the fly, not only saves the administrator time: it can also be scheduled during non-working hours. This reduces downtime for end users, since the database needs to be refreshed after new members have been created.

More info?

Would you like to automate this process for your organisation? Please contact me. I would be happy to show you how it works in one of our test environments, remotely if you prefer.

Text: Reinder Riemersma