Wednesday 17 July 2013

Fusion Apps File Based Loader as Synchronization Tool

One of the limitations of FBL (till 11.1.4) is that you cannot update records created using other tools or application UIs. This is because FBL internally uses the Key Map table: it marks a record for Create if it does not find a corresponding Fusion ID for the GUID provided in the data file, and marks it for Update if it does. Basically, FBL loads the data file into the Load Batch Data staging tables. When a record is imported into the stage tables, the import process compares the record's source GUID and object-type values with values in the Key Mapping table:
  • If the values exist in the Key Mapping table, then the process replaces the source GUID in the stage tables with the Oracle Fusion ID.
  • If the values do not exist in the Key Mapping table, then an Oracle Fusion ID is generated for the record, recorded in the Key Mapping table, and used to replace the source GUID in the stage tables.
By the time the data in the stage tables is ready for loading to the Oracle Fusion application tables, all Oracle Fusion IDs have been allocated.
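The resolution logic described above can be sketched roughly as follows. This is a minimal Python sketch using an in-memory map; the names (`resolve_fusion_id`, `next_id`) are invented for illustration and are not the actual Fusion implementation.

```python
def resolve_fusion_id(key_map, object_type, source_guid, next_id):
    """Return (fusion_id, action) for a staged record.

    key_map maps (object_type, source_guid) -> Oracle Fusion ID.
    next_id is a callable that allocates a new Oracle Fusion ID.
    """
    key = (object_type, source_guid)
    if key in key_map:
        # Known GUID: reuse the existing Fusion ID -> record is an Update
        return key_map[key], "Update"
    # Unknown GUID: allocate a new Fusion ID, record the mapping -> Create
    fusion_id = next_id()
    key_map[key] = fusion_id
    return fusion_id, "Create"
```

This is why a record whose GUID never passed through FBL is always treated as a Create: nothing else ever writes its GUID into the key map.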
In release 11.1.5.0.0 Oracle introduced an extract, Business Object Key Map, through which you can generate GUIDs for records that were not created using FBL and populate them in the Key Map table. Through the extract you can also generate a file with the GUIDs and import it into the target system. You can then take the GUIDs and create your own data file to update those records. In this post I will explain how to generate the GUIDs and populate them in the Key Map table.

If you are on Release 5, check whether the extract is available under Navigator --> Data Exchange --> Manage HCM Extract Definitions as shown.
Search with Business. Oracle strongly suggests creating a copy of this extract and making all modifications to the copy, based on your requirements. So let's follow this. Click on Copy and give a name to your custom extract, say MSObject Key Map, as shown. Click OK.

 










Now let us configure the parameters that will be passed to this extract when it is submitted. Navigate to the Payroll Checklist using Navigator --> Checklist (under the Payroll menu).














All extracts are defined as flow patterns. Search for your extract, select it, and then click the Edit icon. Navigate to the Parameters tab.






Select Effective Date and click on Edit icon.
Specify the following:
Display: No
Parameter Basis: Post SQL Bind
Basis Value: select TO_CHAR(SYSDATE,'YYYY-MM-DD') from dual
Click on Save
Select Business Objects and click on Edit icon and specify the following
Display Format: Lookup Choice List
Lookup: HRC_LOADER_BO_EXTRACT
Sequence: 220
Click on Save

Select Extract Type and click on Edit icon and specify the following
Display Format: Lookup Choice List
Lookup: HRC_LOADER_BO_EXTRACT_TYPE
Sequence: 210
 Click on Save
The sequence value of Business Object must be greater than that of Extract Type.
Select Start Date and Click on Edit icon
Specify the following:
Display: No
Click on Save and then Submit.
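The Post SQL Bind for Effective Date simply defaults the parameter to today's date. As a quick sanity check, the same string that Oracle's 'YYYY-MM-DD' format mask returns can be produced in Python:

```python
# Python equivalent of: select TO_CHAR(SYSDATE,'YYYY-MM-DD') from dual
from datetime import date

effective_date = date.today().strftime("%Y-%m-%d")
print(effective_date)  # e.g. 2013-07-17
```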
Now let us configure the extract so that we can run it. For this, go to Navigator --> Data Exchange.
Click on Manage Extract Definitions in the regional area and search for your extract. Select it and click the Edit icon.





Click on the Business Object block. In the Extract Records section click on Key Mapping.



















Click Actions drop down and select Generate Formula








Once the formula has been generated Click on Actions drop down and select Compile Formula


















Click on Save and Close. Click the Business Object block again to verify the formula status. Ensure that the fast formula has compiled, indicated by a green tick.

















 
Click on Business Object List. In the Extract Records section select Business Objects.











From the Actions drop-down select Generate Formula; once the formula is generated, select Compile Formula from the same Actions drop-down.






Click Save and Close. Again select Business Object List and verify that the fast formula has compiled, indicated by a green tick.







Click on Save and Close. This completes the extract setup. Now let's run this extract.

To do this click on Submit an HCM Process as shown next

















Select your Extract and click on Next.










Give a name to your extract flow instance and select the Extract Type and Business Objects. Click on Next, Next, and then Submit.








Business Object Key Map Extract Parameters
The Business Object Key Map extract process has the following parameters:

  • Extract Type – has the following options:
    • Key Map – select this option to generate a report of the existing values in the Key Map table.
    • Global Unique Identification Generation and Key Map – select this option to create GUID values for those business objects that do not have a Key Map entry, and to generate the report.
  • Business Object – the list of business objects for which the Key Map report/GUIDs are to be generated. If Business Object is left blank, the extract runs for all business objects.
The business objects which are supported are (as mentioned by Oracle):



























Click the Refresh icon to see whether the process has completed.

In order to see the output you need to have a BI Publisher report. I will explain the same in the next post.





In summary, you now know how to generate Key Mapping information using the Business Object Key Map HCM extract.

Cheers....................

Wednesday 10 July 2013

OTBI - Duty Roles Required for Subject Areas

In order to access data in OTBI you need to have access to the Subject area that OTBI is accessing. Oracle has documented the steps to get the Duty Roles assigned by default in the RPD for different subject areas under MOS document ID 1487201.1. In my previous post - How to Know On Premise RPD File Name and Location? - I have explained how to get the RPD file. Now if you open that in BI Administration Client Tool and then follow the steps mentioned in the MOS note, you will get the duty roles required for each of the Subject Areas in FSCM, CRM and HCM. Oracle has also uploaded output of the Permission Report from RPD file in that note. You can get the duty roles from there and log on to Oracle Entitlement Server/Authorization Policy Manager to get the details of the Application Roles, Role Hierarchy and Job Role Mapping.

You can also get the sample file from here. Let's track the application role mapping in APM. As you can see from the next picture, the Authenticated User role does not have any access to the Workforce Management - Person Real Time subject area. To get data from this subject area you need to have the FBI_WORKFORCE_TRANSACTION_ANALYSIS_DUTY role.
If I log on to APM and search this duty role then I will get all the required details.
You can navigate to APM from Fusion through FSM task Manage Duties or Manage Role Templates.






Once you have the duty role click on Open and the screen should look like the next picture













From here you can see all the other roles this duty role inherits permissions from. You can also see which job roles it has been mapped to.









Role Hierarchy.













External Role mapping - As can be seen, this duty role is mapped to the Human Resource Analyst job role and the Line Manager abstract role. So if you have been provisioned with either of these roles, you can access the data for this subject area.




In my previous post I have also explained how to Assign BIAdministrator or BIAuthor to a user here.

So now you understand which job role you need to be provisioned with.

Cheers......................

Monday 8 July 2013

Fusion Apps - File Based Loader Part - III

In the first two blog posts (PART I & PART II) I explained how to create the HCM Business Objects mapping file and use it to create a data file for loading into Fusion. I also explained how to create the zip file and register it using the composite. As I mentioned in the last post, when you call and submit the composite to register the file, the composite actually moves the file to a path and then calls the hr2hrSqlldr.py Python script to load the data into the staging tables.

Let's walk through the series of actions it performs. If you open the Python script you will find that it first reads, from a table named HRC_XFER_JOB_PARAMS, the input details provided by the user when the composite was called. From this table it gets the list of LBOs that were loaded in the zip file. A user may load all or only a few of the LBOs supported by FBL. In order to maintain data integrity, it has to load the LBOs in a pre-defined sequence. To do this it gets the processing sequence from the table HRC_LOADER_METADATA_REGISTRY. The next screen shows the loading sequence from this table.
LBOs are processed starting with the one having the largest processing_order value. Oracle generates separate job IDs for each of the top-level LBOs that were specified in the composite parameters. It then creates a SQL*Loader control file for loading data into the HRC_LOADER_BATCH_HEADER and HRC_LOADER_JOB tables and sets LINE_STATUS = 'Ready To Process'. Once this is done, it gets all the data files for each of the LBOs and dynamically creates SQL*Loader control files for them to insert into the HRC_LOADER_BATCH_LINES_SRC table. Once all data files have been imported into that table, it calls a PL/SQL procedure, HRC_SQLLOADER.PROCESS_BATCH_TRANSFORM, to do the data transformation for each of the top-level LBOs.
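Putting the sequencing rule into code form: a hypothetical sketch of how the script could order the LBOs, where the registry dict stands in for HRC_LOADER_METADATA_REGISTRY and the processing_order numbers are made up for illustration.

```python
def order_lbos(requested_lbos, registry):
    """Return the LBOs in load order: largest processing_order first,
    as described above. registry maps LBO name -> processing_order."""
    return sorted(requested_lbos, key=lambda lbo: registry[lbo], reverse=True)

# Illustrative registry; the real values live in HRC_LOADER_METADATA_REGISTRY.
registry = {"Location": 30, "Job": 20, "Person": 10}
print(order_lbos(["Person", "Location", "Job"], registry))
# -> ['Location', 'Job', 'Person']
```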

Now, for some reason, it does not insert data into those tables in my instance. I don't know why; maybe I am doing something wrong. So I logged an SR with Oracle, and the initial response was that our version (11.1.4.0.0) is not the latest one, in which FBL works as expected. Let's do this instead by running Load HCM Data.
Click on Load HCM Data under Coexistence.























Click on Schedule and select the file you loaded. I have already loaded a file, Location28.zip, so let me show you the steps with this file.






Select the file, LBO, and run mode. Here I want to import into the staging tables and then load to the base tables in a single run, so I selected Import and Load Batch Data. You can also do the import first and then load the data through the Load Batch Data process. Click on Submit.


It submits an ESS job which then submits child processes.









Click Refresh to see whether the process is completed.











 Click Processes tab to see whether it has any error.











 Click on the Batch Name Link as shown.











As can be seen, it has one record and it failed. Click on the 1 under Failed Objects.






 
Once you do that, a screen like this opens. Expand the Location LBO and select the data to see the error.








Here the error is 'Location code already exists'. I intentionally chose this location to show the various features. I had earlier created this location from the Fusion application via Navigator --> Workforce Structures --> Manage Location.



 This picture shows the error details.











I used that location code to create the data file because I wanted to explore whether it is possible to update data that was not created by FBL. As you can see, with this process you cannot. From Release 5 onwards, Oracle has introduced the Business Object Key Map HCM Extract to generate GUIDs for business objects created manually in the application. I will explain that process in a separate post after completing this series.

Many of you may be wondering why it tried to insert instead of update.
Oracle Fusion HCM maintains a Key Mapping table HRC_LOADER_BATCH_KEY_MAP that records, for each business object, its type, source GUID, and Oracle Fusion ID (the ID by which it is identified in Oracle Fusion). For example:
OBJECT TYPE    SOURCE GUID    ORACLE FUSION ID
-----------    -----------    ----------------
Person         PERS123        006854
Person         PERS456        059832

When a record is imported to the Load Batch Data stage tables, the import process compares the record’s source GUID and object-type values with values in the Key Mapping table:

  • If the values exist in the Key Mapping table, then the process replaces the source GUID in the stage tables with the Oracle Fusion ID.
  • If the values do not exist in the Key Mapping table, then an Oracle Fusion ID is generated for the record, recorded in the Key Mapping table, and used to replace the source GUID in the stage tables.
So in my case, I did not know the GUID created by the application, and there was no entry in the Key Mapping table. So it generated a new Fusion ID and marked the record as Create, not Update.
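Replaying this against the sample key-map rows above makes the decision obvious. This is a toy sketch, not the real HRC_LOADER_BATCH_KEY_MAP lookup:

```python
# Sample rows from the Key Mapping table shown earlier.
key_map = {("Person", "PERS123"): "006854", ("Person", "PERS456"): "059832"}

def decide_action(object_type, source_guid):
    """FBL marks a record as Update only when the (type, GUID) pair is
    already in the key map; any unknown GUID becomes a Create."""
    return "Update" if (object_type, source_guid) in key_map else "Create"

print(decide_action("Person", "PERS123"))      # -> Update
print(decide_action("Location", "LOC_GUID_1"))  # -> Create (GUID never seen by FBL)
```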

Let's see this process. I am going to update the Location code from the application.
  












 The record details.












 MEHABUB_LOC is the location that exists in the system. Let's change it to MEHABUB_LOC1.









 Then set the Batch status to 'Ready to Process'











 Rerun the Load Batch Data Process.











 Click on Run.












 Click on Submit.












 Note down the Process ID. This is nothing but the ESS job ID.











 Wait till the process is completed.











Now you can see that it has created MEHABUB_LOC1 location in fusion.










 This picture shows that the location is now available in the application. I am going to use FBL to update the address details of this location.
 I changed just the address details, keeping the location code and GUID the same, created a new data file Location29.zip, and called the composite.


 This picture shows the composite call parameter details.










 Composite Flow.












 Submit Load HCM Data with Import and Load Batch Data for the file Mehabub29.zip.










 ESS Job id.












 Wait for the process to complete.











 All processes completed successfully.











This shows that one object has been changed.
Let's check from the application.









The marked address is what I changed in my data file.










So this completes the series on FBL data loading as a conversion tool.

In summary, from this post you now know:
  1. How the FBL Python script creates control files
  2. How Oracle derives the processing order of the LBOs
  3. How data transformation happens inside Fusion
  4. How to load data into the staging tables
  5. How to find errors and fix them from the application
  6. How to create a data file to update records created with FBL
In the next post I will discuss how to use FBL as an ongoing synchronization tool, using the Business Object Key Map Extract. Till then, keep trying FBL with different business objects. Please go through the MOS documents mentioned in PART I and PART II.

Cheers........................