Post-upgrade tasks for the October 2021 release

Perform the following tasks after your organization is upgraded to the October 2021 release.

Advanced properties in mappings

After you upgrade, existing mappings fail when the source and target advanced properties contain data type values that the fields do not support.
For example, when you run an existing Microsoft Azure Data Lake Storage Gen2 mapping that defines the Block Size source and target advanced property as the String value 1GB instead of an Integer value, the mapping fails with the following error message:
Exception occurred while converting blockSize value 1GB to Integer
Previously, the mappings ran successfully even if you specified a String or BigInt data type value as the block size.
To run these mappings successfully, modify them to include a data type value that the advanced source and target property fields support.
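The stricter behavior is equivalent to a plain integer parse of the property value. The following minimal Java sketch, with hypothetical class and method names that are not part of the product, shows why the String value 1GB now fails while a plain numeric value succeeds:

public class BlockSizeCheck {

    // Mimics a strict Integer conversion of the Block Size property value:
    // a plain numeric string parses cleanly, a unit-suffixed string does not.
    static int parseBlockSize(String value) {
        try {
            return Integer.parseInt(value.trim());
        } catch (NumberFormatException e) {
            throw new IllegalArgumentException(
                "Exception occurred while converting blockSize value "
                    + value + " to Integer", e);
        }
    }

    public static void main(String[] args) {
        System.out.println(parseBlockSize("1073741824")); // Integer value for 1 GB: passes
        try {
            parseBlockSize("1GB");                        // String value: fails after the upgrade
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}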

Custom query override in taskflows

After you upgrade, existing taskflows that override the custom query of a mapping task might need manual updates.
If a taskflow contains a Data Task step that uses a mapping task with a custom query, and you override the custom query in the taskflow with a query that exceeds 65 characters, the mapping task fails with an error.
To run the taskflow successfully, update the overridden custom query in the Data Task input field. To do this, delete the input field, reselect it, and then enter the custom query again. After this update, you can use a custom query that exceeds 65 characters.
Note: Before you reselect the Data Task input field, you must clear the cache or switch to incognito mode.
For more information about overriding a custom query in taskflows, see the following community article:
https://network.informatica.com/docs/DOC-19268

Sequence Generator transformation in mappings enabled for pushdown optimization

After you upgrade, existing tasks that are enabled for pushdown optimization run without pushdown optimization. This issue occurs when the NEXTVAL() port in a Sequence Generator transformation is linked to input ports of multiple downstream transformations in the mapping.
If the NEXTVAL() port in a Sequence Generator transformation is linked directly to a single input port or multiple input ports in a Target transformation, the mapping runs with pushdown optimization.
Previously, when the NEXTVAL() port was linked to input ports of multiple downstream transformations, the mappings ran successfully with pushdown optimization, but generated incorrect data.
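To picture why the data was incorrect, consider that under pushdown each branch that references the sequence can evaluate it independently. The following plain-Java analogy, not the generated pushdown SQL, contrasts evaluating NEXTVAL() once per row with re-evaluating it in each branch:

import java.util.concurrent.atomic.AtomicLong;

public class SequenceBranches {

    public static void main(String[] args) {
        AtomicLong sequence = new AtomicLong();

        // Intended behavior: NEXTVAL() is evaluated once per row and the
        // result is shared by every downstream transformation.
        long shared = sequence.incrementAndGet();
        long branchA = shared;
        long branchB = shared;
        System.out.println("shared evaluation:     " + branchA + " == " + branchB);

        // Pushdown expansion: each branch that references the sequence
        // re-evaluates it, so the branches disagree for the same row.
        long exprA = sequence.incrementAndGet();
        long exprB = sequence.incrementAndGet();
        System.out.println("per-branch evaluation: " + exprA + " != " + exprB);
    }
}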

File Processor Connector

After the upgrade, if you decrypt files that were encrypted in an earlier release, the mapping task runs successfully but the files are not decrypted properly. This applies when you use password-based encryption (PBE).
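For background, with PBE the cipher key is derived from the password and salt, so decryption produces correct output only when the derivation parameters match the ones used at encryption time. The following Java sketch is a generic PBE round trip with the JDK javax.crypto API, using assumed parameters (PBKDF2 with HMAC-SHA256, AES-CBC); it does not reflect the connector's internal implementation.

import java.nio.charset.StandardCharsets;
import java.security.spec.KeySpec;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;

public class PbeRoundTrip {

    // Derives an AES key from a password and salt with PBKDF2. If the
    // parameters differ between encryption and decryption, the files are
    // not decrypted properly.
    static SecretKey deriveKey(char[] password, byte[] salt) throws Exception {
        SecretKeyFactory factory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
        KeySpec spec = new PBEKeySpec(password, salt, 65536, 256);
        return new SecretKeySpec(factory.generateSecret(spec).getEncoded(), "AES");
    }

    public static void main(String[] args) throws Exception {
        char[] password = "example-password".toCharArray(); // hypothetical
        byte[] salt = new byte[16]; // fixed zero salt, for this demo only
        byte[] iv = new byte[16];   // fixed zero IV, for this demo only
        SecretKey key = deriveKey(password, salt);

        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] encrypted = cipher.doFinal("sample file contents".getBytes(StandardCharsets.UTF_8));
        System.out.println(Base64.getEncoder().encodeToString(encrypted));

        cipher.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
        System.out.println(new String(cipher.doFinal(encrypted), StandardCharsets.UTF_8));
    }
}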
To fix this issue, you must perform one of the following tasks:

Google BigQuery V2 Connector

After the upgrade, existing mappings fail in the following scenarios:

Hive Connector

After the upgrade, existing elastic mappings that read data from or write data to Hive on the Cloudera CDW 7.2 public cloud distribution fail with the following error:
java.lang.NoClassDefFoundError: org/apache/hadoop/io/Text
To access the required jars for Cloudera CDW 7.2 public cloud, run the Hadoop distribution script and specify the distribution version CDW_7.2 for the elastic job.
The script is located in the following location:
<Secure Agent installation directory>/downloads/package-Cloudera_6_1/package/Scripts
For more information on running the script, see the section "Cloudera 6.1 distribution" in the Changed behavior topic.
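As a quick diagnostic, you can verify whether the class named in the error is visible to a Java process that uses the same classpath as the failing job. The class name below comes from the error message; the rest of the snippet is illustrative.

import java.security.CodeSource;

public class HadoopClasspathProbe {

    public static void main(String[] args) {
        // Class named in the NoClassDefFoundError above.
        String className = "org.apache.hadoop.io.Text";
        try {
            Class<?> cls = Class.forName(className);
            CodeSource src = cls.getProtectionDomain().getCodeSource();
            System.out.println(className + " found"
                + (src != null ? " in " + src.getLocation() : ""));
        } catch (ClassNotFoundException e) {
            System.out.println(className + " is missing; run the Hadoop"
                + " distribution script with distribution version CDW_7.2.");
        }
    }
}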

Microsoft Azure Synapse SQL Connector

If an existing mapping is enabled for the truncate target option and also contains an override to the target table name or schema name in the mapping properties, the data output before and after the upgrade differs.
The data output differs after the upgrade because the agent truncates the table or schema specified in the override property.
Previously, the agent truncated the table or schema that you specified at design time.
If you do not want the agent to truncate the table or schema that you specify in the override property, disable the truncate target option and run the mapping again.
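The changed resolution order can be summarized in a few lines of Java. The names in this sketch are hypothetical; it only illustrates that the override, when present, now determines which object the agent truncates.

public class TruncateTargetResolution {

    // Post-upgrade behavior: the runtime override, when present, determines
    // which object the agent truncates.
    static String effectiveTruncateTarget(String designTimeTable, String overrideTable) {
        return (overrideTable != null && !overrideTable.isEmpty())
            ? overrideTable
            : designTimeTable;
    }

    public static void main(String[] args) {
        String designTime = "dbo.orders";        // table selected at design time
        String override = "staging.orders_load"; // override in the mapping properties
        // Previously the agent truncated dbo.orders; now it truncates the override.
        System.out.println("TRUNCATE TABLE "
            + effectiveTruncateTarget(designTime, override));
    }
}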