
HOW TO: Use a Data Access Service to retrieve the results of a SQL execution and perform further actions?

Description:

In Cloud Application Integration (formerly ICRT), you can interact directly with a database source from within a Business Process Execution Language (BPEL) process by using the Data Access system service. With this service, you create an invoke activity that runs one or more SQL statements against a specified data source and receives a result set as the response.

 

The example here demonstrates how you can execute a SQL statement as is, retrieve the list of results, and perform further actions on it, for example, write the list of records to a CSV file or push the list of objects to an Amazon SQS queue. Note that you have to configure a delimited file writer connection or an Amazon SQS connection if you want to test the respective path.

 

 

Steps to Follow:

 

Process Developer "Data Access" projects:

 

1. Download and extract the attached zip file (KB520166_DataAccessForSQL.zip).

(Steps 1a and 1b are not mandatory, but it is good to look at the source to understand what has been done in it.)

 

1a. Import the attached BPEL projects (DataAccessForSQLProjects.zip) into Informatica Process Developer. Details on how to access Process Developer are available here.

1b. Look through the BPEL projects:

DataAccessCommon: Contains the WSDL and schema used for the Data Access Service. It is referenced by both of the projects below.

DataAccessForSQLAgent: Contains the BPEL process with the Data Access Service invoke activity that makes the database call. This will be deployed to the agent.

DataAccessForSQL: Contains the BPEL process that invokes the agent process through a process call. This process is exposed as an Automated Step so that it is accessible from a Service Call step in the process.

 

Deployment:

 

2. Deploy the DataAccessForSQLAgent\deploy\DataAccessForSQLAgent.bpr to the Agent, through the process console.

 

3. Deploy the DataAccessForSQL\deploy\DataAccessForSQL.bpr to the Cloud, through the process console.

 

Console Configuration:

 

4. Configure the URN mapping in the Cloud under Cloud Server Process Console > Admin > URN Mappings:

 

URN:   urn:DataAccessForSQLAgentSvc

URL:   agent:FILL_YOUR_AGENT_NAME:DataAccessForSQLAgentSvc

Example:

URN:   urn:DataAccessForSQLAgentSvc

URL:   agent:USW1PC0DWAEH:DataAccessForSQLAgentSvc

5. Configure the data source in the Agent Process Console (accessible through the ICS console) under Admin > Datasource Service, with the JNDI path jdbc/DB. Configure the Datasource Service as described in https://network.informatica.com/onlinehelp/icrt/current/en/index.htm#page/ee-icrt-admin/Configure_Services.html.

 

*Before validating and testing the data source, make sure that the appropriate database client JAR file is present in the Secure Agent directory: {agent.home}\apps\process-engine\ext
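
For example, for the Oracle database used in this example, the data source entry would typically contain values along these lines (the field labels vary by Process Console version, and the host, port, SID, and JAR name below are placeholders that you must replace with your own):

JNDI Name:        jdbc/DB
Driver Class:     oracle.jdbc.OracleDriver
Connection URL:   jdbc:oracle:thin:@<db_host>:1521:<SID>
User/Password:    credentials of a schema that can access the Employee table
Driver JAR:       for example, ojdbc8.jar, copied to {agent.home}\apps\process-engine\ext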

 

IPD Configuration:

6. Import the process designer project FetchDataThroughDAS.pd.xml (included in the .zip extract) into Informatica Process Designer.

 

 

7. This example continues by writing the extracted data either to a file or to an Amazon SQS queue. If these connections are not configured, you will see errors in the process. You can correct the errors by creating similar connections.

 

7a. File connection: Create a simple file connection named "CSVDelimitedFileConnection", add an event target of type "DelimitedContentWriter", and set a file directory path. Test and publish the connection. An example configuration is shown below.
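
For instance (the directory path is only an example; use any directory writable by the Secure Agent):

Connection Name:   CSVDelimitedFileConnection
Event Target:      DelimitedContentWriter
File Directory:    C:\temp\das-output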

 

7b. Similarly, you can create an Amazon SQS connection with the name shown in the sample, "AmazonSQSAEI". You will need the access keys for your Amazon account. If you do not have an Amazon SQS connection or do not intend to test this route, simply delete the activities in the SQS path and end the path.

 

At the end, make sure the processes are free of errors, and publish them.

 

Database Scripts:

8. You can use the script in SQLScrits.txt within the attachment to create a table (note: this example was tested with an Oracle database) and insert some records into it using the insert script provided in the same file.
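
The actual script is part of the attachment; as a rough sketch, a table matching the sample request and response shown later in this article could be created and populated as follows (the column types and values here are illustrative and may differ from the attached script):

CREATE TABLE EMPLOYEE (
    EMP_ID        VARCHAR2(10),
    FIRST_NAME    VARCHAR2(50),
    LAST_NAME     VARCHAR2(50),
    CREATED_DATE  TIMESTAMP
);

INSERT INTO EMPLOYEE (EMP_ID, FIRST_NAME, LAST_NAME, CREATED_DATE)
VALUES ('253', 'Peter', 'Judy', TO_TIMESTAMP('2014-06-19 20:01:41', 'YYYY-MM-DD HH24:MI:SS'));

INSERT INTO EMPLOYEE (EMP_ID, FIRST_NAME, LAST_NAME, CREATED_DATE)
VALUES ('254', 'Mike', 'Anderson', TO_TIMESTAMP('2014-06-19 20:01:41', 'YYYY-MM-DD HH24:MI:SS'));

COMMIT;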

 

Execution:

 

9. Send a request to the process endpoint FetchDataThroughDAS. You can get the endpoint URL from the process metadata information in Process Designer.

 

Example: https://ps1w2.rt.informaticacloud.com/active-bpel/public/rt/000123/FetchDataThroughDAS

 

You can send the input in the form of a JSON request. Example:

 

 

{
    "sqlStmt": "select * from Employee where EMP_ID='253' or FIRST_NAME='Mike'",
    "path": "CSV"
}
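
Put together, the call from any REST client would look roughly like the following (this assumes the process accepts an HTTP POST with a JSON body; if anonymous access is not enabled for the process, you also need to supply the appropriate authentication):

POST https://<your_pod>.rt.informaticacloud.com/active-bpel/public/rt/<org_id>/FetchDataThroughDAS
Content-Type: application/json

{"sqlStmt":"select * from Employee where EMP_ID='253' or FIRST_NAME='Mike'","path":"CSV"}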

 

Depending on the matching rows in the database, you would get a response similar to the following (this is only an example based on the database scripts used for this article; the database scripts and the DML are attached within the .zip file):

 

 

{
    "Response": [
        {
            "last_name": "Judy",
            "created_date": "2014-06-19 20:01:41",
            "first_name": "Peter",
            "emp_id": "253"
        },
        {
            "last_name": "Anderson",
            "created_date": "2014-06-19 20:01:41",
            "first_name": "Mike",
            "emp_id": "254"
        }
    ]
}

 

Since the path is specified as CSV in the request, the process returns the list of records to the caller and continues down the path that writes the content to a CSV file at the location specified in your CSV connection (step 7a).
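
For reference, with the two sample records above, the file produced by the DelimitedContentWriter would contain content along these lines (the actual header row, column order, and delimiter depend on how the writer and the field mapping in the process are configured):

emp_id,first_name,last_name,created_date
253,Peter,Judy,2014-06-19 20:01:41
254,Mike,Anderson,2014-06-19 20:01:41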

 

Related Documents

[a] How to connect to a Data Access Service from Process Developer: https://kb.informatica.com/howto/6/Pages/18/437515.aspx

[b] How to create an automated step in the ICRT Process Developer that can be used in an ICRT IPD process or guide: https://kb.informatica.com/howto/6/Pages/18/431031.asp

 

Data Access for SQL
https://network.informatica.com/docs/DOC-17822
