2 Replies Latest reply on Nov 25, 2021 5:16 AM by Jharana Patra

    Pass entire CSV contents as REST POST body

    Samuel Greene New Member

      I have a functioning REST V2 connection and have tested passing JSON as the body.  Now I need to do the same but with CSV, as it performs better and I won't have to create a hierarchy for each file type (the API offers this option as long as Content-Type is set to text/csv).  I have a mapping that queries a database and creates the needed CSV.  I have another task that uses the CSV as a source and then calls the REST target.  I have modified my Swagger file according to the HOW TO here, which defines the body parameter as payloadFromFile.


      The HOW TO states: "This will create an additional field in Request mapping with name "payloadFromFile" and entire body payload can be passed either via FlatFile/Expressions."


      Using the CSV as source and attempting to map the 'Group Name: CSV' field produces "Can map only to repeating element".  In my experience using JSON, the REST API is called multiple times - once for each row - which produced slow performance.  I need to batch data together and send a group of rows in each REST call.


      Is it possible to simply pass the entire content of the CSV?  Or must I use an expression to put the data into a field?  This is a first prototype, and I would like to add features in the future such as breaking up large datasets into smaller files.  I'm unable to find any examples of how to create an expression to do this.  I'm also concerned about the file size being too big and the data structure holding the expression value having issues.  I am running on a Secure Agent.


      Here is a doc page from the API provider: Updating Rows in Bulk with Upsert | Socrata
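      For reference, the shape of the call I'm aiming for, sketched in Python outside of IICS (the URL is a made-up placeholder and the batch size is arbitrary): split the CSV into row batches, repeat the header in each batch, and POST each batch with Content-Type: text/csv.

```python
def batch_csv(csv_text, rows_per_batch):
    """Split CSV text into batches, repeating the header row in each batch."""
    lines = csv_text.splitlines()
    header, rows = lines[0], lines[1:]
    return [
        "\n".join([header] + rows[i:i + rows_per_batch])
        for i in range(0, len(rows), rows_per_batch)
    ]

source = "id,name\n123,Donald Duck\n124,Scrooge Duck\n125,Genie Goose"
for body in batch_csv(source, rows_per_batch=2):
    # Each batch would go out in one call, e.g. with the requests library:
    # requests.post("https://example.com/resource/xxxx-xxxx.csv",  # placeholder URL
    #               data=body.encode("utf-8"),
    #               headers={"Content-Type": "text/csv"})
    print(body)
```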


      Thanks for your advice!

        • 1. Re: Pass entire CSV contents as REST POST body
          Nico Heinze Guru

          If I understand you correctly, you want to pass the contents of a whole CSV file as one long string to the REST connector.

          That can be done in two steps.

          First, read the CSV file as a "normal" text file (meaning one long text string per line) and concatenate all those lines into one single long string using an Aggregator.

          Second, make sure that this long string does not contain any double quotes, because the double quote (as you know better than I do) is the string delimiter character in JSON content.


          So, how to concatenate a whole file into one string?


          Start by setting up a flat file source with a delimiter string which never occurs in the CSV file. Honestly, I don't have any experience with IICS (I'm still a PowerCenter guy), but I seem to recall that in IICS you can set up not only a single character but also a string of characters as the field delimiter. For example, a CSV file with two columns ID and NAME normally looks like this:


          123,"Donald Duck"

          124,"Scrooge Duck"

          125,"Genie Goose"


          and so on.

          Now imagine this file uses the delimiter string $%&/() instead. Then the CSV file would look like this:


          123,$%&/()Donald Duck$%&/()

          124,$%&/()Scrooge Duck$%&/()

          125,$%&/()Genie Goose$%&/()


          and so on.


          What is this good for?


          The delimiter string $%&/() will probably never occur in any real-life CSV file you have to process. So it's safe to set up the field delimiter for your flat file source as $%&/() , meaning that each line of text will be read as one long string.

          So you set up the flat file source such that it has one long string (let's say, String(1000000)) and the field delimiter set to the string $%&/() .

          Now read from the CSV file, and each single line will be read as one single string.


          Next you forward this one long string into an Expression as input port i_String .

          The Expression is set up with one variable port of type String:

              v_new_content , String(1000000)

          Set up v_new_content to the following expression term:

              v_new_content || i_String

          Set up one output port o_long_string of type String(50000000) (yes, 50 million characters; as far as I know, that's the technical limit for a single string in IICS) with this expression term:

              v_new_content

          This construction will make sure that the EXP forwards the contents of lines 1, 1 plus 2, 1 plus 2 plus 3, 1 plus 2 plus 3 plus 4 and so on to the following Aggregator transformation.


          This AGG gets o_long_string as its I/O port and does not have any Group-By port. This way it will forward only the last input, meaning the last long string delivered by the EXP.
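          To make the EXP-plus-AGG mechanics concrete, here is the same accumulation pattern sketched in Python (not IICS syntax; note that a real mapping would also need to re-append a line separator, which the v_new_content || i_String term alone does not do):

```python
def concat_file_lines(lines):
    """Mimic the Expression + Aggregator pattern: a variable port grows with
    every input row, and an Aggregator with no group-by keeps only the last
    (and therefore complete) value."""
    v_new_content = ""                 # variable port, starts empty
    o_long_string = ""                 # what the AGG ends up forwarding
    for i_string in lines:             # one input row per CSV line
        # v_new_content || i_String, plus the line separator the term omits
        v_new_content = v_new_content + i_string + "\n"
        o_long_string = v_new_content  # AGG without group-by keeps the last row
    return o_long_string

lines = ['123,"Donald Duck"', '124,"Scrooge Duck"', '125,"Genie Goose"']
print(concat_file_lines(lines))
```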


          What next?

          You have to make sure that this string does not contain any double quotes. Now I don't know whether JSON payloads allow some "substitution" for double quotes, but I hope so; then you simply use the ReplaceStr() function to replace every double quote in the long string with that "substitution".
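          In JSON the standard "substitution" is the backslash escape \" , so the replacement step would look roughly like this in Python terms (ReplaceStr() would do the same inside an expression; letting a JSON library build the whole payload is safer still):

```python
import json

payload = '123,"Donald Duck"\n124,"Scrooge Duck"'

# Manual substitution, the moral equivalent of ReplaceStr() on the long string:
# escape backslashes first so existing ones are not double-escaped afterwards.
escaped = payload.replace('\\', '\\\\').replace('"', '\\"')

# Safer: let a JSON library handle quoting, escaping and newlines in one go.
body = json.dumps({"payloadFromFile": payload})
print(body)
```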


          Honestly I don't know whether this works in IICS as it works in PowerCenter. But I hope so. If not, I'm confident that some people more knowledgeable than myself can help out.




          • 2. Re: Pass entire CSV contents as REST POST body
            Jharana Patra Guru

            Hi Samuel,


            As a workaround you can use Cloud Application Integration and a service connector.

            There you can pass the file as an attachment and then pass the content as a whole to the API.

            The input field will be of type attachment.


            Test the behavior outside Informatica in a third-party application such as Postman to confirm whether the API handles the whole content.


            Designing Service Connectors