Snowflake Cloud Data Warehouse V2 Connector > Snowflake pushdown optimization > Pushdown optimization using the Snowflake ODBC connection
  

Pushdown optimization using the Snowflake ODBC connection

When you use a Snowflake ODBC connection and select the ODBC subtype as Snowflake, you can configure pushdown optimization in a mapping to push transformation logic to the Snowflake Cloud Data Warehouse source or target database. The ODBC connection must use the Snowflake ODBC driver.
When you run a task configured for pushdown optimization, the task converts the transformation logic to an SQL query. The task sends the query to the database, and the database executes the query. Use pushdown optimization to improve the performance of the task.
You can configure full and source pushdown optimization in a Snowflake Cloud Data Warehouse mapping. Use the Pushdown Optimization advanced session property in the mapping task to configure pushdown optimization.
For information about other advanced session properties that you can use with pushdown optimization through a Snowflake ODBC connection, such as Create Temporary View, Create Temporary Sequence, and Allow Pushdown Across Databases, see the topic "Advanced Session Properties" under Tasks > Mapping Tasks in the Informatica Cloud Data Integration Help.
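With full pushdown, the task does not stage rows on the Secure Agent; instead it issues a single SQL statement, such as an INSERT ... SELECT, that Snowflake executes. The following Python sketch illustrates the idea for a hypothetical filter mapping; the function and table names are illustrative, and the actual SQL that Data Integration generates may differ.

```python
# Illustrative sketch only: how a simple source-filter-target mapping can
# collapse into one SQL statement that runs entirely inside Snowflake.
# The names and the exact SQL shape are hypothetical.

def full_pushdown_sql(source, target, columns, filter_condition=None):
    """Build a single INSERT ... SELECT statement so that Snowflake,
    not the Secure Agent, moves and filters the rows."""
    select = f"SELECT {', '.join(columns)} FROM {source}"
    if filter_condition:
        select += f" WHERE {filter_condition}"
    return f"INSERT INTO {target} ({', '.join(columns)}) {select}"

sql = full_pushdown_sql(
    source="CQA_SCHEMA1.ORDERS",
    target="CQA_SCHEMA1.LARGE_ORDERS",
    columns=["ORDER_ID", "AMOUNT"],
    filter_condition="AMOUNT > 1000",
)
print(sql)
```

Because the filter runs in the WHERE clause, only the database processes the discarded rows, which is the source of the performance gain.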
Example
You work for a rapidly growing data science organization that develops software to analyze financials, building financial graphs that connect people profiles, companies, jobs, advertisers, and publishers. The organization stores its data in Snowflake. You want to apply some transformations in Data Integration and then write the results to Snowflake. To read or write this large volume of data efficiently, use full pushdown optimization for the mapping that uses the Snowflake ODBC connection type.
Using the ODBC connection type with pushdown optimization enhances performance.

Configuring a Snowflake ODBC connection

You can set pushdown optimization for an ODBC connection type that uses the Snowflake ODBC driver to enhance mapping performance. To use an ODBC connection to connect to Snowflake, you must first configure the Snowflake ODBC driver.
Snowflake supports Snowflake ODBC drivers on Windows and Linux systems. You must install the Snowflake ODBC 64-bit driver for your operating system.

Configuring a Snowflake ODBC connection on Windows

Before you establish an ODBC connection to connect to Snowflake Cloud Data Warehouse on Windows, you must configure the ODBC connection.
Perform the following steps to configure an ODBC connection on Windows:
    1. Download the Snowflake ODBC driver from your Snowflake Cloud Data Warehouse account.
    You must download the Snowflake ODBC 64-bit driver.
    2. Install the Snowflake ODBC driver on the machine where the Secure Agent is installed.
    3. Open the folder in which the ODBC data source file is installed.
    4. Run the odbcad32.exe file.
    The ODBC Data Source Administrator dialog box appears.
    5. Click System DSN.
    The System DSN tab appears. The following image shows the System DSN tab on the ODBC Data Source Administrator (64-bit) dialog box:
    6. Click Add.
    The Create New Data Source dialog appears. The following image shows the Create New Data Source dialog where you can select the Snowflake Cloud Data Warehouse data source:
    7. Select the SnowflakeDSIIDriver and click Finish.
    8. Click Configure.
    The Snowflake Configuration Dialog appears. The following image shows the Snowflake Configuration Dialog where you can configure the connection properties:
    9. Specify the following connection properties:
    Property | Description
    Data Source | Name of the data source.
    User | Username to access the Snowflake Cloud Data Warehouse database.
    Password | Password to access the Snowflake Cloud Data Warehouse database.
    Server | Domain name of your Snowflake Cloud Data Warehouse account.
    Database | Name of the Snowflake Cloud Data Warehouse database.
    Schema | Name of the Snowflake Cloud Data Warehouse schema.
    Warehouse | Name of the Snowflake Cloud Data Warehouse warehouse.
    Role | The Snowflake Cloud Data Warehouse role assigned to the user.
    Tracing (0-6) | Determines the amount of detail that appears in the log file. You can specify the following values:
    - 0. Disable tracing.
    - 1. Fatal error tracing.
    - 2. Error tracing.
    - 3. Warning tracing.
    - 4. Info tracing.
    - 5. Debug tracing.
    - 6. Detailed tracing.
    Note: It is important to specify the database, schema, and warehouse properties in the ODBC data source name.
    10. Click OK.
The Snowflake ODBC connection is configured successfully on Windows.
After you configure the Snowflake ODBC connection, you must create an ODBC connection to connect to Snowflake Cloud Data Warehouse.

Configuring a Snowflake ODBC connection on Linux

Before you establish an ODBC connection to connect to Snowflake Cloud Data Warehouse on Linux, you must configure the ODBC connection.
Note: You cannot use the Snowflake ODBC driver with SUSE Linux.
Perform the following steps to configure an ODBC connection on Linux:
    1. Download the Snowflake ODBC driver from your Snowflake Cloud Data Warehouse account.
    You must download the Snowflake ODBC 64-bit driver.
    2. Install the Snowflake ODBC driver on the machine where the Secure Agent is installed.
    3. Configure the odbc.ini file properties in the following format:
    [ODBC Data Sources]
    dsn_name=driver_name

    [dsn_name]
    Driver=path/driver_file
    Description=
    Database=
    Schema=
    Warehouse=
    Server=domain_name
    Role=role_name
    4. Specify the following properties in the odbc.ini file:
    Property | Description
    ODBC Data Sources | Name of the data source.
    Driver | Location of the Snowflake ODBC driver file.
    Description | Description of the data source.
    Database | Name of the Snowflake database.
    Schema | Name of the Snowflake schema.
    Warehouse | Name of the Snowflake warehouse.
    Server | Domain name of your Snowflake account.
    Role | The Snowflake role assigned to the user.
    Note: It is important to specify the database, schema, and warehouse properties in the ODBC data source name.
    5. Run the following command to set the ODBCINI environment variable to the odbc.ini file path:
    export ODBCINI=/<odbc.ini file path>/odbc.ini
    6. Restart the Secure Agent.
The Snowflake ODBC connection on Linux is configured successfully.
After you configure the Snowflake ODBC connection, you must create an ODBC connection to connect to Snowflake Cloud Data Warehouse.
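Because the database, schema, and warehouse properties must be present in the DSN, it can save a Secure Agent restart to check the odbc.ini contents first. The following Python sketch uses the standard configparser module; the function name and sample DSN are illustrative, not part of the product.

```python
import configparser

# The properties that the note above says must appear in the DSN section.
REQUIRED_KEYS = {"database", "schema", "warehouse"}

def missing_dsn_keys(odbc_ini_text, dsn_name):
    """Return the required Snowflake properties that the DSN section lacks.

    configparser lowercases option names by default, so the comparison
    is case-insensitive.
    """
    config = configparser.ConfigParser()
    config.read_string(odbc_ini_text)
    if dsn_name not in config:
        return REQUIRED_KEYS  # section absent: everything is missing
    present = {key.lower() for key in config[dsn_name]}
    return REQUIRED_KEYS - present

sample = """[snowflake_dsn]
Driver=/usr/lib64/snowflake/odbc/lib/libSnowflake.so
Database=DEMO_DB
Schema=PUBLIC
"""
print(missing_dsn_keys(sample, "snowflake_dsn"))  # reports that warehouse is missing
```

To check a real file, pass its contents, for example `missing_dsn_keys(open("/etc/odbc.ini").read(), "my_dsn")`.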

Create an ODBC connection

You must create an ODBC connection to connect to Snowflake Cloud Data Warehouse after you configure the ODBC connection.
Perform the following steps to create a Snowflake ODBC connection on the Connections page:
    1. In Administrator, click Connections.
    The Connections page appears.
    2. Click New Connection.
    The New Connection page appears. The following image shows the New Connection page:
    3. Configure the following connection details in the Connection Details section:
    Property | Description
    Connection Name | Name of the ODBC connection. For example, sf_odbc.
    Description | Description of the connection.
    Type | Type of the connection. Select ODBC.
    4. Configure the following connection details in the ODBC Connection Properties section:
    Property | Description
    Runtime Environment | Runtime environment that contains the Secure Agent that you use to access the system.
    User Name | Username to log in to the Snowflake Cloud Data Warehouse database.
    Password | Password to log in to the Snowflake Cloud Data Warehouse database.
    Data Source Name | Name of the ODBC data source that you created for the Snowflake Cloud Data Warehouse database.
    Schema | Name of the Snowflake Cloud Data Warehouse schema.
    Code Page | Code page of the database server or flat file defined in the connection.
    ODBC Subtype | Set the ODBC Subtype field to Snowflake.
    Driver Manager for Linux | The ODBC driver manager that sends database calls to the Snowflake ODBC driver.
The Snowflake ODBC connection is created successfully.

Configuring optimization for a Snowflake ODBC mapping task

Perform the following steps to configure pushdown optimization in a Snowflake ODBC mapping task:
    1. Create a mapping to read from or write to Snowflake:
        a. Use the Snowflake ODBC connection in the Source transformation.
        b. Use the Snowflake ODBC connection in the Target transformation.
    2. Create a mapping task:
        a. Select the configured mapping.
        b. In the Pushdown Optimization section on the Schedule tab, set the pushdown optimization value to Full or To Source.
        c. Save the task and click Finish.
When you run the mapping task, the transformation logic is pushed to the Snowflake database. To verify that pushdown optimization took place, check the session log for the job in Monitor.

Specifying a Snowflake target for a pushdown optimization task

When you create a mapping with pushdown optimization to a Snowflake database, you can use an existing Snowflake target or create a Snowflake target to hold the results of the mapping. If you choose to create the target, the agent creates the target when you run the task. To use the Create New at Runtime option, you must set the AddQuotesAlwaysPDO custom property and set the value to Yes in the advanced session properties of the mapping task.
To specify the target properties, perform the following tasks:
    1. Select the Target transformation in the mapping.
    2. On the Incoming Fields tab, configure field rules to specify the fields to include in the target.
    3. To specify the target, click the Target tab.
    4. Select the target connection.
    5. For the target type, choose Single Object or Parameter.
    6. Specify the target object or parameter.
    7. To specify a target object, perform the following tasks:
        a. Click Select and choose a target object. You can select an existing target object or create a target object at runtime.
        b. To create a target object at runtime, select Create New at Runtime.
        c. Enter the table name for the new target object in the Table Name field.
        d. Click OK.
    8. Specify the advanced properties for the target, if required.

Configuring an unconnected Lookup transformation

To configure an unconnected Lookup transformation, select the Unconnected Lookup option, add incoming fields, configure the lookup condition, and designate a return value. Then configure a lookup expression in a different transformation.
    1. Add a Lookup transformation in a mapping.
    2. On the General tab of the Lookup transformation, enable the Unconnected Lookup option.
    3. On the Incoming Fields tab of the Lookup transformation, create an incoming field for each argument in the :LKP expression.
    For each lookup condition you create, add an incoming field to the Lookup transformation.
    4. On the Lookup Object tab, import the lookup object.
    You must set the Multiple Matches property to Report error in the unconnected lookup for pushdown optimization to work.
    5. Designate a return value.
    You can pass multiple input values into a Lookup transformation and return one column of data. The Secure Agent returns the value from the lookup query.
    6. Specify the field that you want the Secure Agent to return.
    7. Configure a lookup expression in another transformation.
    Supply input values for an unconnected Lookup transformation from a :LKP expression in another transformation, such as an Expression transformation. The arguments are local input fields that match the Lookup transformation input fields used in the lookup condition.
    8. Map the fields with the target.
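For example, with a hypothetical unconnected lookup named lkp_ItemPrices that takes one input field, an expression field in an Expression transformation could call the lookup as follows (the lookup and field names are illustrative):

```
-- Expression field that calls the unconnected lookup and receives its
-- designated return value:
:LKP.lkp_ItemPrices(in_ItemId)
```

The argument in_ItemId is a local input field whose value is matched against the lookup condition.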

Configuring a Sequence Generator transformation

You can add a Sequence Generator transformation for a Snowflake ODBC mapping configured for pushdown optimization. Use the Sequence Generator to create a sequential range of numbers.
    1. Add a Sequence Generator transformation in a mapping.
    2. On the General tab of the Sequence Generator transformation, enter a name for the Sequence Generator transformation.
    3. On the Sequence tab, specify the required properties.
    The following image shows sample values for the Initial Value, End Value, and Increment By properties that generate a sequence of numbers.
    4. In the Output fields, map the NEXTVAL field to an input field in a Target transformation or other downstream transformation to generate a sequence of numbers.
    The Sequence Generator transformation provides two output fields, NEXTVAL and CURRVAL.
    5. In the mapping task, select Create Temporary Sequence in the Pushdown Optimization section.
    6. Click Finish.
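The Initial Value, End Value, and Increment By properties define a simple arithmetic progression. As a sketch of the semantics only (at run time the values come from a temporary sequence that is pushed to Snowflake, and this sketch ignores options such as Cycle), NEXTVAL yields:

```python
def sequence_values(initial_value, end_value, increment_by):
    """Yield the values that NEXTVAL would produce: an arithmetic
    progression from initial_value up to end_value, stepping by
    increment_by. Simplified model; ignores the Cycle option."""
    value = initial_value
    while value <= end_value:
        yield value
        value += increment_by

print(list(sequence_values(1, 10, 2)))  # [1, 3, 5, 7, 9]
```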

Cross-schema pushdown optimization

You can use cross-schema pushdown optimization for a mapping task to read from or write data to Snowflake Cloud Data Warehouse objects associated with different schemas within the same Snowflake Cloud Data Warehouse database.
To use cross-schema pushdown optimization, create two Snowflake Cloud Data Warehouse ODBC connections and specify the schema in each connection. Ensure that the schema in the source connection is different from the schema in the target connection and that both schemas belong to the same database. You must use the same ODBC DSN for both connections.
When you configure pushdown optimization for the mapping task, enable cross-schema pushdown optimization in the advanced session properties. By default, the check box is selected.
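With cross-schema pushdown enabled, the generated SQL can reference both schemas in a single statement because they belong to the same database. The following Python sketch composes such a statement from fully qualified names; the table and database names are hypothetical, and the real generated SQL may differ.

```python
# Illustrative sketch: one INSERT ... SELECT that spans two schemas of the
# same Snowflake database, so the copy runs entirely inside Snowflake.
# All names here are hypothetical.

def cross_schema_insert(database, src_schema, src_table,
                        tgt_schema, tgt_table, columns):
    """Build a cross-schema INSERT ... SELECT using fully qualified
    database.schema.table names."""
    cols = ", ".join(columns)
    return (
        f"INSERT INTO {database}.{tgt_schema}.{tgt_table} ({cols}) "
        f"SELECT {cols} FROM {database}.{src_schema}.{src_table}"
    )

print(cross_schema_insert("DEMO_DB", "CQA_SCHEMA1", "SRC",
                          "CQA_SCHEMA2", "TGT", ["ID", "NAME"]))
```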

Configuring cross-schema optimization for a Snowflake ODBC mapping task

For example, perform the following steps to configure cross-schema pushdown optimization for a Snowflake ODBC mapping task:
    1. Create the following two Snowflake ODBC connections, each defined with a different schema:
        a. Create an sf_odbc1 Snowflake ODBC connection and specify the CQA_SCHEMA1 schema in the connection properties.
        b. Create an sf_odbc2 Snowflake ODBC connection and specify the CQA_SCHEMA2 schema in the connection properties.
    2. Create a Snowflake ODBC mapping, m_sf_pdo_acrossSchema:
        a. Add a Source transformation that includes a Snowflake ODBC source object and the sf_odbc1 connection to read data through CQA_SCHEMA1.
        b. Add a Target transformation that includes a Snowflake ODBC target object and the sf_odbc2 connection to write data through CQA_SCHEMA2.
    3. Create a Snowflake ODBC mapping task:
        a. Select the configured Snowflake ODBC mapping, m_sf_pdo_acrossSchema.
        b. In the Pushdown Optimization section on the Schedule tab, select Full as the value for pushdown optimization.
        c. In the Advanced Session Properties section, select Enable cross-schema pushdown optimization.
        The following image shows the configured Enable cross-schema pushdown optimization property:
        d. Save the task and click Finish.
    When you run the mapping task, the Secure Agent reads data from the Snowflake ODBC source object associated with the CQA_SCHEMA1 schema and writes data to the Snowflake ODBC target object associated with the CQA_SCHEMA2 schema.

Cross-database pushdown optimization

You can use an ODBC connection in a Snowflake mapping to enable cross-database pushdown optimization to run queries on data spread across multiple Snowflake databases.
You can configure cross-database pushdown optimization in the mapping task. Ensure that the Snowflake source and target transformations in the mapping use two different ODBC connections that point to different DSN entries.

Configuring cross-database pushdown optimization

    1. Create a Snowflake mapping task.
    2. Go to the Schedule tab.
    3. In the Advanced Session Properties section, select Allow Pushdown Across Databases as the session property name and set the session property value to Yes.
    The following image shows the cross-database pushdown optimization configuration:
    4. Save the task and click Finish.

Supported functions and operators for Snowflake ODBC mappings

The following table summarizes the availability of pushdown functions in a Snowflake Cloud Data Warehouse database. An X indicates that the function can be pushed to the Snowflake Cloud Data Warehouse database by using source-side or full pushdown optimization. A dash (-) indicates that the function cannot be pushed to the database.
Function | Pushdown | Function | Pushdown | Function | Pushdown
ABS() | X | LN() | X | SOUNDEX() | -
ADD_TO_DATE() | X | LOG() | X | SQRT() | X
ASCII() | X | LOWER() | X | STDDEV() | X
AVG() | X | LPAD() | X | SUBSTR() | X
CEIL() | X | LTRIM() | X | SUM() | X
CHR() | X | MAX() | X | SYSDATE() | -
CONCAT() | X | MAX(NUMBER,DATE,STRING) | - | SYSTIMESTAMP() | -
COS() | X | MIN(NUMBER,DATE,STRING) | - | TAN() | X
COSH() | X | MEDIAN() | X | TANH() | X
COUNT() | X | MIN() | X | TO_BIGINT | X
DATE_COMPARE() | X | MEDIAN() | X | TO_CHAR(DATE) | -
DATE_DIFF() | - | MIN() | X | TO_CHAR(NUMBER) | X
DECODE() | X | MD5() | X | TO_DATE() | X
EXP() | X | MOD() | X | TO_DECIMAL() | X
FLOOR() | X | POWER() | X | TO_FLOAT() | X
GET_DATE_PART() | X | REPLACECHR() | - | TO_INTEGER() | X
IIF() | X | REPLACESTR() | X | TO_NUMBER() | X
IN() | - | ROUND(NUMBER) | X | TRUNC(DATE) | X
INITCAP() | X | RPAD() | X | TRUNC(NUMBER) | X
INSTR() | X | RTRIM() | X | UPPER() | X
ISNULL() | X | SIGN() | X | VARIANCE() | X
LAST_DAY() | X | SIN() | X | |
LENGTH() | X | SINH() | X | |
The following operators are supported in a Snowflake Cloud Data Warehouse database:
+, -, *, /, %, ||, >, =, >=, <=, !=, AND, OR, NOT, ^=
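When a mapping uses a function that cannot be pushed (for example, SYSDATE() or REPLACECHR()), the agent processes that logic itself instead of Snowflake. A quick lookup sketch based on the table above; the set below lists only the functions marked with a dash, and the helper name is illustrative:

```python
# Functions from the table above that CANNOT be pushed to Snowflake
# through the ODBC connection (the entries marked with a dash).
NOT_PUSHABLE = {
    "SOUNDEX", "SYSDATE", "SYSTIMESTAMP", "DATE_DIFF",
    "MAX(NUMBER,DATE,STRING)", "MIN(NUMBER,DATE,STRING)",
    "TO_CHAR(DATE)", "REPLACECHR", "IN",
}

def can_push(function_name):
    """Return True if the named function can be pushed down,
    according to the table above."""
    return function_name.upper() not in NOT_PUSHABLE

print(can_push("AVG"))      # True
print(can_push("SYSDATE"))  # False
```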

Supported transformations for Snowflake ODBC mappings

When you configure pushdown optimization, the Secure Agent tries to push the configured transformation to the database.
The following table shows the supported pushdown types for each transformation that you can push to the Snowflake database:
Transformation | Pushdown
Aggregator | Source, Full
Expression | Source, Full
Filter | Source, Full
Joiner | Source, Full
Lookup (connected and unconnected)1 | Source, Full
Sorter | Source, Full
Union | Source, Full
Router | Full
Update Strategy | -
Sequence Generator2 | Source, Full
1 See the limitations for configuring a Lookup transformation in the "Rules and guidelines for pushdown optimization" topic.

2 See the limitations for configuring the Sequence Generator transformation in the "Rules and guidelines for pushdown optimization" topic.

Rules and guidelines for pushdown optimization

Use the following rules and guidelines when you configure pushdown optimization to a Snowflake database:
Update override property
The update override property is applicable for all ODBC subtypes in the ODBC connection, except Snowflake.
Common fields in multiple sources
When you use a Snowflake ODBC connection in a mapping enabled with pushdown optimization to read data from two Snowflake sources that have fields with the same name and you define a filter condition for one of the common fields, the mapping fails.
Create Temporary View session property
Enable the Create Temporary View property in the session properties of the mapping task before you configure the following properties:
Not Null constraint
When you run a mapping to write data to a Snowflake target, and you define the primary key for the target table but do not explicitly define the Not Null constraint, the upsert, delete, or update operation fails. You must add the Not Null constraint for the target table and then run the upsert, delete, or update operation.
Sequence Generator transformation
When you configure a Sequence Generator transformation in a mapping, adhere to the following guidelines:
Lookup
When you configure a lookup, adhere to the following guidelines:
Create New at Runtime option
You can configure the Create New at Runtime option for a Target transformation in a mapping configured for pushdown optimization. When you use the Create New at Runtime option, adhere to the following guidelines:
Unicode characters
To run an SQL query that reads from tables whose names contain Unicode characters, you must add the OdbcUseUnicodeAPI custom flag and set the value to 1 for the service of type Data Integration Server in the Secure Agent properties.
Functions
When you push functions to Snowflake, adhere to the following guidelines:

Troubleshooting

When you select the truncate table option for a Snowflake target whose name contains special characters and enable pushdown optimization, the mapping fails.
To rectify this issue, see the Knowledge Base article: https://kb.informatica.com/howto/6/Pages/23/579219.aspx
How do you configure pushdown optimization using an ODBC connection in a mapping task for an upsert, update, or delete operation to Snowflake?
In the advanced session properties on the Schedule tab, select Create Temporary View as the session property name and select Yes as the session property value.
For more information, see the KB article: https://informatica.com/solution/23/Pages/71/579225.aspx