When you select a source or a target Amazon S3 V2 file, you can configure format options.
Select the code page that the Secure Agent must use to read or write data.
Please find the guide below for reference.
Let us know if the issue persists.
Hi Jharana - Thanks for your reply.
I verified the code page in Secure Agent properties. It is UNICODE only.
DataMovementMode - 'UNICODE'
This issue seems to be caused by incompatible code pages between the source and target, or by the data type of the field.
To make sure the code pages are set correctly, follow these steps:
1. Add another flat-file target to see how the data is coming from the source.
2. From that step, we will know whether the data is getting scrambled at the source or at the target.
3. If the characters in the target flat file are correct, i.e. the German characters display properly, then the code page of your database connection and/or your database client is incorrect.
-- Set the code page to UTF-8 in the Oracle connection.
-- Check the data type of the field that holds the German characters (it should be nvarchar; if it is not, you can change it in Informatica using the Edit Metadata option).
Note: When you open the CSV file generated by the newly added flat-file target in Notepad++, make sure the encoding is set to UTF-8 (navigate to Encoding --> UTF-8). A quick programmatic check is sketched below.
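For reference, here is a minimal sketch of that check in Python; the file name is a placeholder for the flat file produced in step 1:

    import csv

    path = "target_sample.csv"  # hypothetical path to the newly added flat-file target

    try:
        with open(path, encoding="utf-8") as f:
            for row in csv.reader(f):
                print(row)  # umlauts (ä, ö, ü, ß) should display correctly here
    except UnicodeDecodeError as err:
        # If this fires, the file was written in a different code page (e.g. Latin-1),
        # which points at the source/target code-page settings rather than the database.
        print("File is not valid UTF-8:", err)

If the rows print with the umlauts intact, the flat-file side is fine and the problem is on the database connection/client side.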
The code page being referred to here is the one in the file format options when selecting the source.
Kindly update it at the source level and check.
Try the following settings:
On Windows, create the INFA_CODEPAGENAME=UTF-8 environment variable in Windows System Properties.
On Linux, set the LC_LOCALE variable to UTF-8.
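As a quick sanity check, a small sketch like this (run from the same user/session that starts the Secure Agent) shows whether those variables are actually visible in the agent's environment; the variable names are the ones suggested above:

    import os

    # Print the code-page-related variables so you can confirm they are set
    # before restarting the Secure Agent.
    for var in ("INFA_CODEPAGENAME", "LC_LOCALE"):
        print(f"{var} = {os.environ.get(var, '<not set>')}")
    # Expect INFA_CODEPAGENAME=UTF-8 on Windows and a UTF-8 value on Linux.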
Hi Pavan - Instead of loading the data into the new target file, we manually inserted one German record into the database table; it was inserted and is displayed correctly. From this we know there is no issue at the Oracle DB end.
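For reference, that manual round-trip test could look like this in Python (a sketch only; it assumes the python-oracledb driver, and the table, column, and connection details are hypothetical):

    import oracledb

    conn = oracledb.connect(user="scott", password="tiger", dsn="dbhost/orclpdb1")
    with conn.cursor() as cur:
        # Insert one record containing a German umlaut, then read it back.
        cur.execute(
            "INSERT INTO customer_test (id, city) VALUES (:1, :2)",
            [1, "Düsseldorf"],  # German test value
        )
        conn.commit()
        cur.execute("SELECT city FROM customer_test WHERE id = :1", [1])
        print(cur.fetchone()[0])  # should print 'Düsseldorf' unchanged
    conn.close()

If the value comes back unchanged, the database character set is handling the German characters correctly.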
On the other option you mentioned, changing the source data type using the Edit Metadata option: all of the columns are currently of type string. As you suggested, we will change them to nvarchar and try to load the data. We will update you once we have verified the data.
Thanks for your reply.
Please change the columns to nvarchar, mainly the target columns in the mapping.
To change the column types in bulk, follow this: 151727
If the issue persists, run the task in verbose mode with just one record and check the log to see where the data is getting corrupted, while reading or writing.
The varchar data type in IICS doesn't support multibyte characters; you should change it to nvarchar to handle multibyte characters.
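As a quick illustration of why this matters, German umlauts and ß each take two bytes in UTF-8, so a column handled as single-byte characters truncates or mangles them:

    # Show the UTF-8 byte length of each German special character.
    for ch in "äöüß":
        print(ch, ch.encode("utf-8"), len(ch.encode("utf-8")))
    # ä b'\xc3\xa4' 2
    # ö b'\xc3\xb6' 2
    # ü b'\xc3\xbc' 2
    # ß b'\xc3\x9f' 2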
There is a feature request, IF-17770, open to add multibyte character support to the varchar data type in IICS; please cast your vote if you think it is needed.