Tuesday, September 26, 2017

Integration OPA Cloud and Oracle eBS (12.1.3) iProcurement - part 3 (close window)

In the release of Oracle Policy Automation we were using, we could not close the interview window at the end. You can, however, navigate to another window using a URL.
So what we did was create a simple HTML page with JavaScript that closes the window, and navigate to that page at the end of the interview.

Add a close link in your model

On the close button in the model we change the style and refer to one of the input parameters we load using the LoadRequest (see the previous blog).



The parameter (start_url) refers to the HTM page that closes our window.

Place the HTM file in a location you can reach

In order to do this in eBS, we place this file in $OA_HTML on the application server, fetch the base link using the profile APPS_SERVLET_AGENT, and concatenate it with our HTM file name to pass the full link to OPA.
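A minimal sketch of composing that link in PL/SQL; the file name close_window.htm is an assumption, use whatever name you deployed in $OA_HTML:

declare
  l_close_url varchar2(500);
begin
  -- Fetch the eBS web entry point from the profile and append our page.
  -- close_window.htm is a hypothetical name for the HTM file shown below.
  l_close_url := fnd_profile.value('APPS_SERVLET_AGENT') || '/close_window.htm';
  -- Pass l_close_url back as the start_url input parameter of the interview.
end;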

The HTM file contains the following logic

<html>
<head>
<script type="text/javascript">
// Closes the current browser window. Older versions of Internet Explorer
// refuse window.close() on windows they did not open themselves, so each
// IE version needs its own workaround.
function closeWP() {
 var Browser = navigator.appName;
 var indexB = Browser.indexOf('Explorer');

 if (indexB > 0) {
    // Internet Explorer: determine the major version from the user agent.
    var indexV = navigator.userAgent.indexOf('MSIE') + 5;
    var Version = navigator.userAgent.substring(indexV, indexV + 1);

    if (Version >= 7) {
        // IE7+: re-open the window on itself so IE treats it as script-opened.
        window.open('', '_self', '');
        window.close();
    }
    else if (Version == 6) {
        window.opener = null;
        window.close();
    }
    else {
        window.opener = '';
        window.close();
    }
 }
 else {
    // Other browsers close without a workaround.
    window.close();
 }
}
</script>
</head>
<body onload="closeWP();">
</body>
</html>

Saturday, September 23, 2017

Integration OPA Cloud and Oracle eBS (12.1.3) iProcurement - part 2

We described earlier some of the initial operations needed to integrate OPA with Oracle eBS. Below you see an overview of the remaining operations. So we have a model for OPA, which we create in the model designer. A model needs to be deployed to be usable at runtime.
In the model we invoke GetMetaData to get the data structure and the valuesets we would like to use in our interview.




So each time something changes in your data structure or valuesets, you need to do a GetMetaData in the model and deploy the model to the runtime version again.

During runtime you have two options:
1. Start
2. Resume
These options use their own URLs to start the interview and also invoke different services: Start invokes the LoadRequest, while Resume invokes the GetCheckpoint to resume an existing interview from the point you saved.

LoadRequest
The load request loads initial data from your data model, which can be used as input parameters for the interview. When you call the interview you can pass an initial parameter in the URL as well. We used that to indicate the user, the responsibility and a unique ID identifying the record in our table. Of course you want to encode those parameters, so we used DBMS_OBFUSCATION_TOOLKIT.DESEncrypt to encrypt these parameters into one connection string.
When you then get the LoadRequest, it passes back your initial parameter so you can decrypt it and identify the user, the authorization and the record we are creating/updating.
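A minimal sketch of that encryption; the key and parameter values are purely illustrative, and note that DESEncrypt requires the input length to be a multiple of 8 bytes:

declare
  l_input     varchar2(200);
  l_encrypted varchar2(200);
begin
  -- Concatenate user id, responsibility id and our record id into one string.
  l_input := '1001' || '|' || '50559' || '|' || '12345';
  -- Pad with spaces to a multiple of 8 bytes, as DESEncrypt requires.
  l_input := rpad(l_input, ceil(length(l_input) / 8) * 8);
  dbms_obfuscation_toolkit.desencrypt
  ( input_string     => l_input
  , key_string       => 'secret8!'  -- illustrative key of at least 8 bytes
  , encrypted_string => l_encrypted
  );
  -- Hex-encode the result (rawtohex(utl_raw.cast_to_raw(l_encrypted)))
  -- before putting it on the interview URL.
end;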

The first thing we do then is an fnd_global.apps_initialize, because the user calling our webservice through the Integrated SOA Gateway is a generic user.
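Something like this, where the ids are the values decrypted from the connection string (the numbers here are placeholders):

begin
  fnd_global.apps_initialize
  ( user_id      => 1001   -- user decrypted from the connection string
  , resp_id      => 50559  -- responsibility of that user
  , resp_appl_id => 201    -- application the responsibility belongs to
  );
end;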

Our LoadRequest procedure looks something like

procedure LoadRequest
(
  root                 IN  VARCHAR2
, region               IN  VARCHAR2
, language             IN  VARCHAR2
, timezone             IN  VARCHAR2
, request_context      IN  xxgr_opa_t_RequestContext
, seedDataDescription  IN  xxgr_opa_t_Tables
, loadData             OUT xxgr_opa_t_LoadData
, error_msg            OUT VARCHAR2
, Status               OUT varchar2
)

Here seedDataDescription contains a list of entities and fields the interview is requesting from us. So you have to check which specific fields the service wants and pass back values for exactly those. You cannot pass back more values (the service will fail).

We also use a translation for booleans: Y/N values in the database have to be translated to true/false values of type boolean for the service, since we cannot use booleans in our data structures directly.
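A sketch of such a translation; the function name is our own, purely illustrative:

function to_opa_boolean (p_flag in varchar2) return varchar2
is
begin
  -- Y/N flags from the database become the literals 'true'/'false' for OPA.
  return case when p_flag = 'Y' then 'true' else 'false' end;
end to_opa_boolean;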

What we actually did is draw a sequence number and pass that as a parameter on the interview. As soon as we got the LoadRequest back, we created our record in the database with that sequence (because we were creating entities using the interview, and until we get some feedback from the interview we do not actually need the record).
During SetCheckpoint and SaveRequest we continue processing our record.

Information we pass on the load includes things like the name of the user that called the interview, their organization, etc.

A drawback is that while we can return an error to the webservice, OPA cannot handle it: it will not show the error message we send back (just a generic error).


SetCheckPoint
During your interview you can save the data so you can resume later. Those saves are called checkpoints. The hub calls SetCheckpoint to save a base64-encoded zip file of the interview. That zip file contains an XML with the variables entered in the interview; so if you wanted to, you could unzip and decode the information and actually store structured data in between save points.

procedure SetCheckpoint
(
  request_context   IN xxgr_opa_t_RequestContext
, checkPointData    IN xxgr_opa_r_CheckPoint_Data
, checkpointId      OUT varchar2
, error_msg         OUT varchar2
, Status            OUT varchar2
)

So setting a checkpoint is merely saving the blob data given our context (request_context). That context again contains the parameter(s) we passed to the initial URL, which is our encrypted key with user information.
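The body of SetCheckpoint then boils down to something like this sketch; the table, column and attribute names are assumptions:

-- l_interview_id was decrypted from request_context earlier.
-- Table and the checkPointData attribute name are assumptions.
UPDATE xxgr_opa_interviews
SET    checkpoint_data  = checkPointData.checkpoint  -- the base64-encoded zip
,      last_update_date = SYSDATE
WHERE  interview_id     = l_interview_id;

checkpointId := l_interview_id;
Status       := 'S';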

In our specific case we would create purchase requisitions from our interviews, so this was the moment we actually created the requisition and related our interview (using a custom table) to it.


SaveRequest
When you are done with your interview you can submit the data using a save request operation.


procedure SaveRequest
(
  root              IN  VARCHAR2
, region            IN  VARCHAR2
, language          IN  VARCHAR2
, timezone          IN  VARCHAR2
, request_context   IN  xxgr_opa_t_RequestContext
, submitData        IN  xxgr_opa_t_submit_data
, attachments       IN  xxgr_opa_t_attachments
, auditReport       IN  xxgr_opa_t_audit_report_list
, updateData        OUT xxgr_opa_t_UpdateData
, error_msg         OUT VARCHAR2
, Status            OUT varchar2
)

This is the most complex operation, because now we get all the structured data in submitData, including attachments and audit reports.

There are some restrictions on attachments in the service. You can restrict the size of files in OPA, but the SOA Gateway may have its own restrictions as well. We also had a service bus in between with memory restrictions, so we had a limit of max 40MB per file. But note that each SetCheckpoint sends along any attachments you already uploaded, inside the base64-encoded zip. So it is good practice to add your attachments as late as possible in the interview to avoid a lot of data traffic.

The submitData contains two parts:
- The input fields
- A request for output fields

The latter is a request to pass back some information after the submit (which you can use in the OPA model). So we capture the fields that are requested, to pass them back later when we are done (for example a requisition number).

<<OutputFields>>
FOR i IN 1 .. submitData(l_det_ind).submitRow(1).outputfield.count
LOOP
  L_Outputs (submitData(l_det_ind).submitRow(1).outputfield(i).name) := submitData(l_det_ind).submitRow(1).outputfield(i).name;
END LOOP OutputFields;

Then we loop through all the input fields, validate the input and store the data. Of course we could store the names as indexes in a table, but we still need to decide per field what we want to do with it.

<<InputFields>>
FOR i IN 1 .. submitData(l_det_ind).submitRow(1).inputfield.count
LOOP
  IF  submitData(l_det_ind).submitRow(1).inputfield(i).name = 'DESCRIPTION'
  AND submitData(l_det_ind).submitRow(1).rowAction = 'update'
  THEN
    l_description := submitData(l_det_ind).submitRow(1).inputfield(i).data_val;
  END IF;
END LOOP InputFields;

Note that we can have multiple entities (l_det_ind), each with multiple rows and multiple fields. In our specific case we only had one main record, but multiple sub records. So we used submitRow(1) here, but otherwise we would have used a loop, as sketched below.
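The general shape of that nested loop would be:

<<Entities>>
FOR e IN 1 .. submitData.count
LOOP
  <<Rows>>
  FOR r IN 1 .. submitData(e).submitRow.count
  LOOP
    <<Flds>>
    FOR f IN 1 .. submitData(e).submitRow(r).inputfield.count
    LOOP
      -- Dispatch per field name here, as in the DESCRIPTION example above.
      null;
    END LOOP Flds;
  END LOOP Rows;
END LOOP Entities;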

Attachments

Now we get two types of attachments: you get the main report IN the data and a separate attachments parameter.

submitData(l_det_ind).submitRow(1).attachments

contains the attachments on our main entity.

So finally we validate our input, update our record and return a message. This message can be shown if you use the LoadAfterSubmit option in OPA: it loads information you can give back, so we can show error and warning messages if necessary.

In our current release you can only submit once. After that you need to close the interview and re-open it to make changes.


GetCheckPoint

Then finally the GetCheckpoint operation, which is used when you resume an interview. We simply read the base64-encoded string (zip file) and pass it back to the webservice.


procedure GetCheckpoint
(
  request_context   IN xxgr_opa_t_RequestContext
, checkPointData    OUT xxgr_opa_r_CheckPoint_Data
, error_msg         OUT varchar2
, Status            OUT varchar2
)
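The body is then little more than this sketch; again, the table, column and constructor names are assumptions:

-- l_interview_id was decrypted from request_context;
-- l_checkpoint_data is a local variable of the checkpoint column's type.
SELECT checkpoint_data  -- the stored base64-encoded zip
INTO   l_checkpoint_data
FROM   xxgr_opa_interviews
WHERE  interview_id = l_interview_id;

checkPointData := xxgr_opa_r_CheckPoint_Data (l_checkpoint_data);
Status         := 'S';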

So this is pretty straightforward. The only thing we also do, on all operations, is check whether the user is allowed to perform them. For example, you cannot do a Load if you are not logged in right now, and you cannot submit if you were not logged in today.

Wednesday, February 22, 2017

Integration OPA Cloud and Oracle eBS (12.1.3) iProcurement - part 1

In my current project we've created a custom integration between Oracle iProcurement (12.1.3) and Oracle Policy Automation in the cloud (release 12.2.5). I'd like to share some lessons learned from this project and help others who might need to build an integration between eBS and OPA Cloud.

First you need to understand the basic integration between OPA Cloud and its environment. You call the cloud environment using a URL with some parameters. Then OPA Cloud performs calls to your environment using the connection framework you have to develop, based on the predefined WSDLs OPA prescribes.

Your connection framework needs to be able to receive SOAP calls and answer them appropriately. Since we wanted to connect to Oracle eBS, we started out with creating APIs in the eBS environment (PL/SQL packages) which could be exposed to OPA.


To expose them we could of course have built BPEL processes using SOA Suite, calling our APIs directly, but since we could not use SOA Suite (domain restrictions), we've used the Integrated SOA Gateway in eBS.
Unfortunately the SOAP responses generated by the Integrated SOA Gateway were not exactly what OPA Cloud expects, and you are very limited in steering the response created by the SOA Gateway, because the WSDL is automatically generated based on your PL/SQL packages. We therefore needed a translation between the SOA Gateway and OPA: the ESB.

Integrated SOA Gateway
Now there are some points which are important when you use the Integrated SOA Gateway here. First, the input and output parameters of the procedures are complex datatypes: all kinds of nested tables of records of tables of records. You cannot define these as types WITHIN your PL/SQL package (otherwise you can't get your package deployed as a webservice), so you have to create object types like

create type xxx_opa_t_metatable is object
( name                 varchar2(80)
, can_be_input         varchar2(10)
, can_be_output        varchar2(10)
, description          varchar2(240)
, accepts_attachments  varchar2(10)
, table_fields         xxx_opa_t_MetaTableFields
, table_links          xxx_opa_t_MetaTableLinks
);

as separate types in the database.
For each service you can find the input and output parameters described in the OPA Documentation: http://documentation.custhelp.com/euf/assets/devdocs/august2016/PolicyAutomation/en/Default.htm#Guides/Developer_Guide/Connector_Framework/Expose_application_metadata.htm%3FTocPath%3DDeveloper%2520Guide%7CConnector%2520framework%7C_____2
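The nested collections referenced in table_fields and table_links are again standalone types. A sketch of what the fields collection could look like, matching the seven values we pass per field in the GetMetaData code further on (the attribute names are our own assumptions):

create type xxx_opa_r_MetaTableFields as object
( name          varchar2(80)
, data_type     varchar2(30)
, default_value varchar2(240)  -- assumption: the third constructor argument
, can_be_input  varchar2(10)
, can_be_output varchar2(10)
, is_required   varchar2(10)
, display_name  varchar2(240)
);

create type xxx_opa_t_MetaTableFields as table of xxx_opa_r_MetaTableFields;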

Our package header is stored in a .pls file with the following annotations:

create or replace package                xxx_opa_wsep_pkg as
/* $header: apps.apps.apps.xxx_opa_wsep_pkg $ */
/*#
* ebs opa webservice connector
* @rep:scope public
* @rep:product XXX
* @rep:lifecycle active
* @rep:displayname eBS OPA Webservice Connector EBS Endpoint
* @rep:compatibility S
* @rep:category BUSINESS_ENTITY XXX_OPA_WSEP_PKG
*/

The package includes functions for each of the operations:
CheckAlive
GetMetaData
LoadRequest
SaveRequest
SetCheckpoint
GetCheckpoint

which we will describe in more detail later.

CheckAlive
The CheckAlive function is used by the OPA Cloud environment to check whether there is a valid connection. It sends back a very simple SOAP response:

<S:Envelope xmlns:S="http://schemas.xmlsoap.org/soap/envelope/">
    <S:Body>
        <check-alive-response xmlns="http://xmlns.oracle.com/policyautomation/hub/12.2.2/metadata/types"/>
    </S:Body>
</S:Envelope>

To give you an idea of the required translation when using the Integrated SOA Gateway, this is what our CheckAlive function returns by default:

<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
 <env:Header/>
 <env:Body>
  <OutputParameters xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xmlns="http://xmlns.oracle.com/apps/xxgr/soaprovider/plsql/xxgr_opa_wsep_pkg/checkalive/">
   <ERROR_MSG xsi:nil="true"/>
   <STATUS>S</STATUS>
  </OutputParameters>
 </env:Body>
</env:Envelope>

So this needs to be translated into the response required by OPA.
If this service works, you can see the connection is green in the OPA Hub.



In our case we did not communicate directly with the service bus: we had to pass a load balancer, an open tunnel and some firewalls before reaching the service bus, which in turn connects through some firewalls to the Oracle eBS environment.

GetMetaData
The GetMetaData service exposes the data model from eBS that can be used in OPA to map fields. OPA is aware of a lot more types than Oracle, including currency, time, etc. We cannot create object types with booleans either, so if we want to indicate that some field is a boolean, we define it as text (which may hold values like true/false) and we have to tell OPA this field is a boolean.

The metadata also contains any lists of values you want to use in OPA. Note that OPA should not be used as a form to enter values (like selecting a supplier or an employee), but you can use select lists for smaller lists to choose an answer from.

In our example we added lists of values for line types, unit of measure and item categories, where we made dependent valuesets between the segments.
Important to understand with dependent valuesets is that the main valueset contains all its children. So if you have one segment Animals including Mammal, Hoofed animal, etc., and Hoofed animal includes Cow, Horse, etc., you would send one list Animals with child Hoofed animal and all its children, then a second list of Hoofed animals (and a third with Mammals), etc.

For each field in your data element you specify whether it is an input or output field, whether it is required, and its type. And as said, since we only have text types in Oracle for our booleans, we have to indicate to OPA that such a field is actually a boolean, so OPA can treat it as one and pass true/false back.

Just an example of how our code was built up: we fetched all columns from a given view and in our case passed them all as input/output, but of course you could make this more complex by defining in a lookup which fields can be inputs or outputs to OPA.

<<Fields>>
FOR F IN C_Fields (cp_table_name => 'XXX_REQUISITION_DETAILS_V')
LOOP
  nr_of_cols := nr_of_cols + 1;
  lt_tablefields.extend;

  l_can_be_input  := 'true';
  l_can_be_output := 'true';
  l_is_required   := F.Is_Required;

  l_data_type := F.Data_Type;
  IF F.Column_Name IN ( /* my list of boolean columns */ ) THEN l_data_type := 'boolean'; END IF;

  l_tablefields := xxx_opa_r_MetaTableFields (F.Column_Name, l_data_type, null, l_can_be_input, l_can_be_output, l_is_required, Initcap (F.Column_Name));
  lt_tablefields (nr_of_cols) := l_tablefields;
END LOOP Fields;

MetaTable := xxx_opa_t_metatables();
MetaTable.extend;
MetaTable (1) := xxx_opa_t_metatable ('XXX_REQUISITION_DETAILS_V','true','true','Request','true',lt_tablefields,null);
 
Next time we will describe the LoadRequest and SaveRequest in more detail, as well as the checkpoints.


Saturday, January 21, 2017

PAAS, the magic word

Since customization in Oracle Cloud applications is not possible (apart from personalization, of course), the new magic word I keep hearing is 'but you can use PAAS'. Sigh of relief, right?
But what does this mean?

PAAS is a nice acronym for 'a machine' that is located 'somewhere' with 'some kind of tooling on it'. In the case of PAAS it's actually a machine in the cloud, managed by an external party (Oracle in this case).

So what is the idea behind 'we can create customizations using PAAS'? Actually what we are saying is that you need a machine to build your customizations on, and that you can connect it to your SAAS application. You can fetch data from the SAAS application using the available web services or BI queries (also accessible as a web service) and you can write data back using web services. But only if the service is available! So if there is no service for it, you cannot automatically get data back into your SAAS application.



Now this machine we build our customization on could be anywhere. It could be a local machine, you can rent it at a hosting company, or you can rent it as a cloud service. Only in the latter case do we actually speak of PAAS.

And if I have this PAAS environment, what do we still need to do?
You've got to have the tooling on it to build the customization! So let's assume we want to build some kind of fancy front-end, so we need ADF. We probably need a local database, and we probably also need SOA Suite if we want to develop processes (although calling web services can be done in ADF as well, of course).
So you always need a Database Cloud Service (DCS) and next to it you could use a JCS (Java Cloud Service) and install your own ADF/SOA Suite on it. Of course you would have to maintain the applications yourself (keep them up to date with the versions you need, keep them running).
But you could also rent a SOA Cloud Service, which would have the latest SOA version. This may be a good or a bad thing, depending on what you want.
If the SOA Suite version changes, you may get into trouble with the services you've built to connect to SAAS, so you would have to keep track each time it changes underneath you.
Both methods have their advantages and drawbacks.

Of course, to build the customizations you don't specifically need a PAAS environment; you could also use your own datacenter or a hosted environment, as you might do now. It eventually depends on total cost of ownership.

And WHAT can I build once I have my environment?
You can build custom screens containing custom logic, fetch data from your SAAS application, manipulate it and write data back. As said, writing back is a bit restricted, because a service MUST be available to be able to do that. Of course there are more and more services coming up, but we notice that a lot of the services are still missing certain elements (for example you can create a contract, but not the risks and deliveries related to the contract).

So you can't actually change an existing form in SAAS, but you could copy it, rebuild it in your own application and skip the original form in the SAAS application. It's clear that would not be the reason why you bought the standard SAAS application in the first place ;-).
I would advise restricting whatever you build next to SAAS to additions, not replacements of functionality.

Anyway .. as a custom application developer I see enough opportunity to keep building customizations for SAAS applications, but I want to emphasize that PAAS is not the magic word that solves your problem. It's a fancy way of saying that you can build custom logic, forms, etc. on an external environment (either cloud PAAS or an on-premise/hosted environment).

:-)

Tuesday, December 6, 2016

eBS Customization and Conversion versus Cloud Customization and Conversion

In a traditional Oracle e-Business Suite environment, customization logic (forms, triggers, procedures, etc.) is coded directly in the Oracle e-Business Suite environment. Because of this the logic is directly integrated and all objects are immediately accessible.
Conversion and integration usually take place via open interfaces (there are more possibilities of course, like APIs, database links, webservices, the Integrated SOA Gateway, etc.). But usually we use a traditional conversion file, which is first loaded into a custom staging table where we do enrichment, validation and transformation before sending the information to the open interface.



How is this changed in a cloud environment?

In a cloud environment these open interfaces are still there. They are unlocked using UCM, the Universal Content Manager. Using standard import processes, which were available in Oracle eBS as well, the data from UCM is loaded into the open interfaces and imported into the Oracle tables.

The main difference with the traditional eBS environment is that all enrichment and transformation must have taken place before loading into UCM. You deliver the enriched and validated data, for example through Excel spreadsheets, which can be loaded using File Based Data Import.
Excel sheet templates can be downloaded from the Oracle Enterprise Repository.
(For example, templates for ERP Cloud can be found here: http://docs.oracle.com/cloud/latest/financialscs_gs/OEFBF/FBDIOverview.htm#FBDIOverview)



Next to File Based Data Import, the SAAS environment also provides several webservices which can be invoked directly from the external application. Usually this occurs through a middleware environment (like SOA Suite, Mulesoft, etc.), where enrichment and transformation take place (and where error handling with an error hospital is usually taken into account as well).
Note that File Based Data Import can also be invoked as a webservice. Using middleware you would do your enrichment and transformation and send the Excel files to UCM, which will load the open interfaces. You could even fetch some data from SAAS first, using a webservice that invokes a BI Report, to enrich the data.

Instead of developing customizations directly in the SAAS environment, customizations can be developed in a separate environment. This can be a PAAS environment, which can be linked directly into the SAAS environment using Integrated Pages. If the SAAS application is based on Alta UI, the user will not notice a difference between the SAAS application and the custom application providing some additional service.

To load data from the SAAS application you can again use web services. This could be a BI Report that fetches data from the SAAS application to expose to the PAAS environment; the BI Report can be invoked as a web service.
For example, you could fetch your item information based on scanned barcodes, including on-hand quantities, etc.
Next to this, actions in the SAAS environment can be triggered by web services, like creating an inventory transaction or an order through a web service invoked from the PAAS environment.

Next to these integration options, Oracle also offers the Integration Cloud Service. This is a service which allows you to easily integrate SAAS with other SAAS applications or on-premise applications through web services, and to maintain your integrations in this environment.



So .. still enough options to build customizations and do custom integration in cloud.


Saturday, October 15, 2016

eBS R12 Upload BLOB/URL as Attachment

I recently had a request on how to upload attachments in eBS from a file. So I dug into my old code and found this logic I once built for storing attachments (URLs and files), uploaded through an Apex interface, as eBS attachments. We linked them to a party id (stored in field :p300_party_id in Apex) and used a specific category.


Definition of entities
Log in as Application Developer and navigate to Attachments. In our case we wanted to link the attachment directly to the party, and we were also using Customers Online. So we used the IMC_ATTACHMENTS entity and the party_id as primary key. We used the hard-coded category Miscellaneous, but of course you can also choose the category.


Note that we did not have to show the attachments in a form in eBS again (just in Apex), but we wanted them to be available later in Contact Center, for example. In that case we should have chosen the entity Parties instead (which also uses the PARTY_ID as unique id).




Create attachment
Now this is the code to create the attachment. For a URL you don't need the actual blob, but for a file you do. We use APIs to create the attachment, except for the actual BLOB in FND_LOBS: apparently there is no public API for this.
declare
l_rowid rowid;
l_attached_document_id number;
l_document_id number;
l_media_id number;
l_category_id number := 1; -- MISC
l_pk1_value fnd_attached_documents.pk1_value%TYPE := :p300_party_id; -- Unique id of entity it is attached to
l_description fnd_documents_tl.description%TYPE := :p300_description;
l_filename fnd_documents_tl.file_name%TYPE;
l_seq_num number;
l_mime_type varchar2(240);
l_datatype_id number;

 -- This was specific for blobs we uploaded in APEX. Based on the name of the file
 -- we created an attachment in eBS. Of course you can also fetch your blobs
 -- from somewhere else like a file system or the database.
 cursor c_blob
 is
select blob_content
, filename
, mime_type
from wwv_flow_file_objects$
where name = :p300_file_name
;

l_blob c_blob%rowtype;

BEGIN

  -- In Apex we allowed for adding URLs and actual file upload. So depending
  -- on the entry (URL or not) we created different data types.

IF :p300_url IS NOT NULL
THEN
  l_datatype_id := 5; -- Web page
  l_media_id := null;
ELSE
  l_datatype_id := 6; -- File
  OPEN c_blob;
  FETCH c_blob INTO l_blob;
  CLOSE c_blob;

  l_filename := l_blob.filename;

  select fnd_lobs_s.nextval into l_media_id from dual;

-- Determine mime type
  l_mime_type := l_blob.mime_type;
  
END IF;

IF l_datatype_id IN (5, 6)
THEN

select FND_DOCUMENTS_S.nextval
into   l_document_id
from   dual;

select FND_ATTACHED_DOCUMENTS_S.nextval
into   l_attached_document_id
from   dual;

select nvl(max(seq_num),0) + 10
into   l_seq_num
from   fnd_attached_documents
where  pk1_value = l_pk1_value  -- Your unique ID, we used PARTY_ID
and  entity_name = 'IMC_ATTACHMENTS'; -- This depends on your setup of course

fnd_documents_pkg.insert_row
( X_ROWID                        => l_rowid
, X_DOCUMENT_ID                  => l_document_id
, X_CREATION_DATE                => sysdate
, X_CREATED_BY                   => fnd_global.user_id
, X_LAST_UPDATE_DATE             => sysdate
, X_LAST_UPDATED_BY              => fnd_global.user_id
, X_LAST_UPDATE_LOGIN            => 0
, X_DATATYPE_ID                  => l_datatype_id --5 -- Web Page
, X_CATEGORY_ID                  => l_category_id
, X_SECURITY_TYPE                => 2
, X_PUBLISH_FLAG                 => 'Y'
, X_USAGE_TYPE                   => 'O'
, X_LANGUAGE                     => 'US'
, X_DESCRIPTION                  => l_description 
, X_FILE_NAME                    => l_filename
, X_MEDIA_ID                     => l_media_id
-- R12
, X_URL                          => :p300_URL
);

fnd_documents_pkg.insert_tl_row
( X_DOCUMENT_ID                  => l_document_id
, X_CREATION_DATE                => sysdate
, X_CREATED_BY                   => fnd_global.user_id
, X_LAST_UPDATE_DATE             => sysdate
, X_LAST_UPDATED_BY              => fnd_global.user_id
, X_LAST_UPDATE_LOGIN            => fnd_global.login_id
, X_LANGUAGE                     => 'US'
, X_DESCRIPTION                  => l_description
-- Removed in R12
--, X_FILE_NAME                    => l_filename
--, X_MEDIA_ID                     => l_media_id
, X_TITLE                       => l_filename
);

-- dbms_output.put_line ('Document id: ' || l_attached_document_id);

fnd_attached_documents_pkg.insert_row
( X_ROWID    => l_rowid
, X_ATTACHED_DOCUMENT_ID         => l_attached_document_id
, X_DOCUMENT_ID                  => l_document_id
, X_CREATION_DATE                => sysdate
, X_CREATED_BY                   => fnd_global.user_id
, X_LAST_UPDATE_DATE             => sysdate
, X_LAST_UPDATED_BY              => fnd_global.user_id
, X_LAST_UPDATE_LOGIN            => fnd_global.login_id
, X_SEQ_NUM                      => l_seq_num
, X_ENTITY_NAME                  => 'IMC_ATTACHMENTS'
, X_COLUMN1                      => null
, X_PK1_VALUE                    => l_pk1_value
, X_PK2_VALUE                    => null
, X_PK3_VALUE                    => null
, X_PK4_VALUE                    => null
, X_PK5_VALUE                    => null
, X_AUTOMATICALLY_ADDED_FLAG     => 'N'
, X_DATATYPE_ID                  => l_datatype_id
, X_CATEGORY_ID                  => l_category_id
, X_SECURITY_TYPE                => 2
, X_PUBLISH_FLAG                 => 'Y'
, X_LANGUAGE                     => 'US'
, X_DESCRIPTION                  => l_description
, X_FILE_NAME                    => l_filename
, X_MEDIA_ID                     => l_media_id
-- R12
, X_URL                        => :p300_URL
, X_TITLE                        => l_description
);


IF l_media_id IS NOT NULL
THEN
INSERT INTO FND_LOBS
(
  File_Id
, File_Name
, File_Content_Type
, File_Data
, Upload_Date
, Expiration_Date
, Program_Name
, Program_Tag
, Language
, Oracle_Charset
, File_Format
)
VALUES
(
  l_media_id
, l_filename
, l_mime_type
, l_blob.blob_content
, sysdate
, null
, null
, null
, 'US'
, 'UTF8'
, 'binary'
);

END IF; -- URL has no file

commit;

END IF;

END;


Logic to fetch a file as BLOB from the file system
In our case the files were loaded into the Apex table wwv_flow_file_objects$, but in some cases, if you want to apply the logic from a database, you may need to fetch the files from the file system.
Note that in that case your code usually runs on the database server, while your files are most likely on the application server.
What we usually do is either use a shared directory, or use a host script to copy the file from the application server to $APPLPTMP (which is usually shared) and then call the SQL package/script from the host script.

To load the file from an accessible file location, use code like the block below. Note that YOUR_DIRECTORY is a directory in DBA_DIRECTORIES which refers to an actual file location; you create directories using the CREATE DIRECTORY statement.
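For example, a sketch of creating such a directory (the path and grantee are illustrative):

create or replace directory your_directory as '/u01/app/files';  -- illustrative path
grant read on directory your_directory to apps;                  -- or your own schema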

DECLARE

  l_bfile   BFILE;
  l_blob    BLOB;
  dir_name  VARCHAR2(240) := 'YOUR_DIRECTORY';
  dir_path  VARCHAR2(240) := '&1';
  file_name VARCHAR2(240) := '&2';

BEGIN

  l_bfile := BFILENAME(dir_name, file_name);

  IF (dbms_lob.fileexists(l_bfile) = 1) THEN
    dbms_output.put_line('File Exists');
    dbms_output.put_line('Size: ' || dbms_lob.getlength(l_bfile));

    dbms_lob.createtemporary(l_blob, true);

    dbms_lob.fileopen(l_bfile, dbms_lob.file_readonly);
    dbms_output.put_line('File is open for reading ...');

    dbms_lob.loadfromfile(l_blob, l_bfile, dbms_lob.getlength(l_bfile), 1, 1);
    dbms_output.put_line('Load from file ...');

    dbms_lob.fileclose(l_bfile);
    dbms_output.put_line('Close file ...');

  END IF;

  -- So now you have your file in l_blob and you can insert it into a table,
  -- for example, or create your attachment with the logic above.

END;

Sunday, September 25, 2016

Autoinvoice Import Fusion R11

In the past we've been using Autoinvoice Import for importing receivables invoices. You fill the RA_INTERFACE_LINES_ALL table (and RA_INTERFACE_DISTRIBUTIONS_ALL as well, if you want to create your own distributions) and run Autoinvoice Import to validate and import your transactions.

In Fusion we have several options to do the same.


First we have the File Based Data Import option. We can use this option manually (for conversion, for example) or create a process in your middleware layer to construct the zipped CSV files and upload them to UCM for processing.






Information on how to use File Based Data Import can be found in the Oracle Enterprise Repository (http://docs.oracle.com/cloud/latest/financialscs_gs/OEFBF/AutoInvoice_Import_4016_fbdi_1.htm#4016). You can download the Excel template there as well.

In our case, however, we wanted to use the web services that are available. We have been investigating the services processInterfaceLine and processInterfaceDistribution, next to createSimpleInvoice.
The latter can be used to directly create an invoice, but it is, as the name already says, a simple invoice with a header and some lines.

You can also use the createInterfaceLine or processInterfaceLine to create one or multiple lines in the open interface table.

Transaction Flexfield
Important is how to pass the correct context for grouping of the invoices. Usually you create a new transaction descriptive flex field context for each of your imported invoices.
You pass these in the element TransactionInterfaceLineDff.

<inv:TransactionInterfaceLineDff xsi:type="ns3:CPQCloud" xmlns:ns3="http://xmlns.oracle.com/apps/flex/financials/receivables/transactions/autoInvoices/TransactionLineInterfaceLineDff/">
   <ns3:__FLEX_Context>CPQ Cloud</ns3:__FLEX_Context>
   <!--Optional:-->
   <ns3:_FLEX_NumOfSegments>6</ns3:_FLEX_NumOfSegments>
   <!--Optional:-->
   <ns3:orderNumber>1100</ns3:orderNumber>
   <ns3:orderLineNumber>1</ns3:orderLineNumber>
...
</inv:TransactionInterfaceLineDff>

In order to use these elements you have to check your setup for the API Names of the context and the segments in your context. Check your setup in Manage Descriptive Flexfields. Search for flex field code RA_INTERFACE_LINES.

Open your context and note the API Name of the context.
Each segment also has an API name, which you must use in the flex field elements.


Line flex field
The line descriptive flex field can be passed in the element TransactionLineDff. Here the elements again equal the API names of the flex field attributes. For example, an attribute for Service Type could be passed using <tran5:serviceType>...</tran5:serviceType>

Header flex field
One thing we encountered is that the createSimpleInvoice service contains a transaction header, and thus the transaction header flex field, but the processInterfaceLine service does not have this element. So currently (R11) it does not seem to be possible to pass the header flex field using this service.

Distributions
Another thing we encountered is that the processInterfaceDistribution service does not contain elements to pass the individual segments of your accounting structure. It only accepts the code combination id. This means we either have to keep a duplicate record of the code combinations to be able to pass the internal id, or we fetch the code combination first using a service.


Now there is no standard webservice to fetch the code combination id using the segments and the chart of accounts id, so we need to create our own BI report/query to fetch the code combination given the segments, and pass that id to our processInterfaceDistribution service.
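The query behind that BI report is essentially a lookup like this sketch (which segment columns you need depends on your chart of accounts structure):

SELECT code_combination_id
FROM   gl_code_combinations
WHERE  chart_of_accounts_id = :p_coa_id
AND    segment1 = :p_segment1
AND    segment2 = :p_segment2
AND    segment3 = :p_segment3  -- and so on for your remaining segments
;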

A detailed explanation on how to achieve this can be found in this blog http://hakanbiroglu.blogspot.nl/2014/11/leverage-power-of-bi-publisher-in-cloud.html#.V-gR6Ds5ZHQ.

Note that you have to decode the output from Base64 before you can use the XML that is retrieved.

Submit Autoinvoice Import
Now you can either schedule the Autoinvoice process or submit it from your middleware layer. To find the job package name you need to pass, navigate to Manage Custom Enterprise Scheduler Jobs for Receivables and Related Applications.


If you open it you will see the name AutoInvoiceMasterEss and the path /oracle/apps/ess/financials/receivables/transactions/autoInvoices/Receivables

When you call the ERP Integration Service you pass parameters, for example, as follows:

<?xml version="1.0" encoding="UTF-8" ?>
<procesparam>
    <username>{Username}</username>
    <pswd>{Password}</pswd>
    <batchid>{batchid}</batchid>
    <businessunit>{BU Id}</businessunit>
</procesparam>

<essparam>
    <jobPackageName>/oracle/apps/ess/financials/receivables/transactions/autoInvoices/Receivables</jobPackageName>
    <jobDefinitionName>AutoInvoiceMasterEss</jobDefinitionName>
    <paramList>1</paramList>
    <paramList>{Your transaction source}</paramList>
    <paramList>19-09-2016</paramList>
</essparam>



References
Usage notes on the invoice service
http://www.oracle.com/webfolder/technetwork/docs/fbdi-r11/fbdi/supportingdocs/finArTrxnsInvoices-InvoiceService.pdf