Parameter files

A parameter file is a list of user-defined parameters and their associated values.
Use a parameter file to define values that you want to update without having to edit the task. You update the values in the parameter file instead of updating values in a task. The parameter values are applied when the task runs.
You can use a parameter file to define parameter values in the following tasks:
Mapping tasks
Define parameter values for connections and data objects in the transformations that support them. You can also define values for parameters in data filters, expressions, and lookup expressions.
Note: Not all connectors support parameter files. To see if a connector supports runtime override of connections and data objects, see the help for the appropriate connector.
Synchronization tasks
Define values for parameters in data filters, expressions, and lookup expressions.
PowerCenter tasks
Define values for parameters and variables in data filters, expressions, and lookup expressions.
You cannot use a parameter file if the mapping task is based on an elastic mapping.
You enter the parameter file name and location when you configure the task.

Parameter file requirements

You can reuse parameter files across assets such as mapping tasks, taskflows, and linear taskflows. To reuse a parameter file, define local and global parameters within a parameter file.
You group parameters in different sections of the parameter file. Each section is preceded by a heading that identifies the project, folder, and asset to which you want to apply the parameter values. You define parameters directly below the heading, entering each parameter on a new line.
The following headings define each section in the parameter file and the scope of the parameters that you define in that section:
#USE_SECTIONS
Tells Data Integration that the parameter file contains asset-specific parameters. Use this heading as the first line of a parameter file that contains sections. Otherwise, Data Integration reads only the first global section and ignores all other sections.
[Global]
Defines parameters for all projects, folders, tasks, taskflows, and linear taskflows.
[project name].[folder name].[taskflow name]
-or-
[project name].[taskflow name]
Defines parameters for tasks in the named taskflow only.
If a parameter is defined in a taskflow section and in a global section, the value in the taskflow section overrides the global value.
[project name].[folder name].[linear taskflow name]
-or-
[project name].[linear taskflow name]
Defines parameters for tasks in the named linear taskflow only.
If a parameter is defined in a linear taskflow section and in a global section, the value in the linear taskflow section overrides the global value.
[project name].[folder name].[task name]
-or-
[project name].[task name]
Defines parameters for the named task only.
If a parameter is defined in a task section and in a global section, the value in the task section overrides the global value.
If a parameter is defined in a task section and in a taskflow or linear taskflow section and the taskflow uses the task, the value in the task section overrides the value in the taskflow section.
If the parameter file does not contain sections, Data Integration reads all parameters as global.
Precede the parameter name with two dollar signs, as follows: $$<parameter>. Define parameter values as follows:
$$<parameter>=value
$$<parameter2>=value2
For example, you have the parameters SalesQuota and Region. In the parameter file, you define each parameter in the following format:
$$SalesQuota=1000
$$Region=NW
The parameter value includes any characters after the equals sign (=), including leading or trailing spaces. Parameter values are treated as String values.
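As an illustration of the format rules above, the following Python sketch parses a single parameter line. This is a hedged sketch of the behavior described in this section, not Data Integration's actual parser: the line is split at the first equals sign only, the name keeps its $$ prefix, and everything after the first equals sign, including leading or trailing spaces, is kept as a string value.

```python
def parse_parameter_line(line):
    """Split a parameter file line at the first '=' only.

    Everything after the first '=' becomes the value, including any
    leading or trailing spaces, and is treated as a String value.
    """
    # partition() splits at the first '=' and keeps the rest intact
    name, _, value = line.partition("=")
    if not name.startswith("$$"):
        raise ValueError("parameter names must begin with two dollar signs")
    return name, value

# Only the first '=' splits, so '=' inside a value is preserved
print(parse_parameter_line("$$SalesQuota=1000"))   # ('$$SalesQuota', '1000')
print(parse_parameter_line("$$Region= NW "))       # ('$$Region', ' NW ')
```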

Parameter scope

When you define values for the same parameter in multiple sections in a parameter file, the parameter with the smallest scope takes precedence over parameters with larger scope.
In this case, Data Integration gives precedence to parameter values in the following order:
  1. Values defined in a task section.
  2. Values defined in a taskflow or linear taskflow section.
  3. Values defined in the #USE_SECTIONS section.
  4. Values defined in a global section.
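The precedence order can be sketched as a lookup that checks the most specific section first. This is an illustrative Python sketch of the rules described above, not Data Integration's implementation; the dictionary layout and section strings are assumptions made for the example.

```python
def resolve_parameter(name, task_section, taskflow_section, sections):
    """Return the value for a parameter, preferring the smallest scope.

    Scope order, most to least specific: task section, taskflow or
    linear taskflow section, #USE_SECTIONS parameters, global section.
    """
    for section in (task_section, taskflow_section, "#USE_SECTIONS", "[Global]"):
        values = sections.get(section, {})
        if name in values:
            return values[name]
    return None

# Parsed parameter file: a global value and a task-specific override
sections = {
    "[Global]": {"$$connection": "ff5"},
    "[Project1].[Folder1].[monthly_sales]": {"$$connection": "ff_jd"},
}

# The task section overrides the global value
print(resolve_parameter("$$connection",
                        "[Project1].[Folder1].[monthly_sales]",
                        None, sections))   # ff_jd
```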
For example, a parameter file contains the following parameter values:
[GLOBAL]
$$connection=ff5
[Project1].[Folder1].[monthly_sales]
$$connection=ff_jd
For the task "monthly_sales" in Folder1 inside Project1, the value for parameter $$connection is "ff_jd." In all other tasks, the value for $$connection is "ff5."
If you define a parameter in a task section and in a taskflow or linear taskflow section and the taskflow uses the task, Data Integration uses the parameter value defined in the task section.
For example, you define the following parameter values in a parameter file:
#USE_SECTIONS
$$source=customer_table
[GLOBAL]
$$location=USA
$$sourceconnection=Oracle
[Default].[Sales].[Task1]
$$source=Leads_table
[Default].[Sales].[Taskflow2]
$$source=Revenue
$$sourceconnection=ODBC_1
[Default].[Taskflow3]
$$source=Revenue
$$sourceconnection=Oracle_DB
Task1 contains the $$location, $$source, and $$sourceconnection parameters. Taskflow2 and Taskflow3 contain Task1.
When you run Taskflow2, Data Integration uses the following parameter values:
Parameter            Section                         Value
$$source             [Default].[Sales].[Task1]       Leads_table
$$sourceconnection   [Default].[Sales].[Taskflow2]   ODBC_1
$$location           [GLOBAL]                        USA
When you run Taskflow3, Data Integration uses the following parameter values:
Parameter            Section                         Value
$$source             [Default].[Sales].[Task1]       Leads_table
$$sourceconnection   [Default].[Taskflow3]           Oracle_DB
$$location           [GLOBAL]                        USA
When you run Task1, Data Integration uses the following parameter values:
Parameter            Section                         Value
$$source             [Default].[Sales].[Task1]       Leads_table
$$sourceconnection   [GLOBAL]                        Oracle
$$location           [GLOBAL]                        USA
For all other tasks that contain the $$source parameter, Data Integration uses the value customer_table.

Sample parameter file

The following example shows a sample parameter file entry:
#USE_SECTIONS
$$oracleConn=Oracle_SK
$$city=SF
[Global]
$$ff_conn=FF_ja_con
$$st=CA
[Default].[Accounts].[April]
$$QParam=SELECT * from con.ACCOUNT where city=LAX
$$city=LAX
$$tarOb=accounts.csv
$$oracleConn=Oracle_Src
$$state=$st

Parameter file location

When you use a parameter file, save the parameter file on a local machine or in a cloud-hosted directory based on the task type. You enter details about the parameter file on the Schedule tab when you create the task.
By default, Data Integration uses the following parameter file directory on the Secure Agent machine:
<Secure Agent installation directory>/apps/Data_Integration_Server/data/userparameters
When you use a parameter file in a synchronization task, save the parameter file in the default directory.
For mapping tasks, you can also save the parameter file in one of the following locations:
A local machine
Save the file in a location that is accessible by the Secure Agent.
You enter the file name and directory on the Schedule tab when you create the task. Enter the absolute file path. Alternatively, enter a path relative to a $PM system variable, for example, $PMSessionLogDir/ParameterFiles.
You can use the following system variables:
To find the configured path of a system variable, see the pmrdtm.cfg file located at the following directory:
<Secure Agent installation directory>\apps\Data_Integration_Server\55.0.<version>\ICS\main\bin\rdtm
You can also find the configured path of any variable except $PMRootDir in the Data Integration Server system configuration details in Administrator.
If you do not enter a location, Data Integration uses the default parameter file directory.
A cloud platform
You can use a connection stored with Informatica Intelligent Cloud Services. You can use the following connection types, each with its own configuration requirements:
Amazon S3 V2
You can use a connection that was created with the following credentials:
  • Access Key
  • Secret Key
  • Region
The S3 bucket must be public.
Azure Data Lake Store Gen2
You can use a connection that was created with the following credentials:
  • Account Name
  • ClientID
  • Client Secret
  • Tenant ID
  • File System Name
  • Directory Path
The storage point must be public.
Google Storage V2
You can use a connection that was created with the following credentials:
  • Service Account ID
  • Service Account Key
  • Project ID
The storage bucket must be public.
Create the connection before you configure the task. You select the connection and file object to use on the Schedule tab when you create the task.
Data Integration displays the location of the parameter file and the value of each parameter in the job details after you run the task.

Rules and guidelines for parameter files

Data Integration uses the following rules to process parameter files:

Parameter file templates

You can generate and download a parameter file template that contains mapping parameters and their default values. The parameter file template includes input and in-out parameters that can be overridden at runtime. Save the parameter file template and use it to apply parameter values when you run the task, or copy the mapping parameters to another parameter file.
When you generate a parameter file template, the file contains the default parameter values from the mapping on which the task is based. If you do not specify a default value when you create the parameter, the value for the parameter in the template is blank.
The parameter file template does not contain the following elements:
If you add, edit, or delete parameters in the mapping, download a new parameter file template.

Downloading a parameter file template

    1. On the Schedule tab in the mapping task, click Download Parameter File Template.
    The file name is <mapping task name>.param.
    2. If you want to use the file in subsequent task runs, save the parameter file in a location that is accessible by the Secure Agent.
    Enter the file name and directory on the Schedule tab when you configure the task.

Overriding connections with parameter files

If you used a connection parameter in a mapping, you can override the connection defined in the mapping task at runtime with values specified in a parameter file.
When you define a connection value in a parameter file, the connection type must be the same as the default connection type in the mapping task. For example, you create a Flat File connection parameter and use it as the source connection in a mapping. In the mapping task, you provide a flat file default connection. In the parameter file, you can only override the connection with another flat file connection.
When you override an FTP connection with a parameter, the local directory for the file must be the same.
You cannot use a parameter file to override a lookup with an FTP/SFTP connection.
Note: Some connectors support only cached lookups. To see which type of lookup a connector supports, see the help for the appropriate connector.
    1. In the mapping, create an input parameter:
    a. Select connection as the parameter type.
    b. Select Allow parameter to be overridden at runtime.
    2. In the mapping, use the parameter as the connection that you want to override.
    3. In the mapping task, define the parameter details:
    a. Select a default connection.
    b. On the Schedule tab, enter the parameter file directory and parameter file name.
    4. In the parameter file, define the connection parameter with the value that you want to use at runtime.
    Precede the parameter name with two dollar signs ($$). For example, you have a parameter with the name ConParam and you want to override it with the connection OracleCon1. You define the runtime value with the following format:
    $$ConParam=OracleCon1
    5. If you want to change the connection, update the parameter value in the parameter file.

Overriding data objects with parameter files

If you used a data object parameter in a mapping, you can override the object defined in the mapping task at runtime with values specified in a parameter file.
Note: You cannot override source objects when you read from multiple relational objects or from a file list. You cannot override target objects if you create a target at run time.
When you define an object parameter in the parameter file, the parameter in the file must have the same metadata as the default parameter in the mapping task. For example, if you override the source object ACCOUNT with EMEA_ACCOUNT, both objects must contain the same fields and the same data types for each field.
    1. In the mapping, create an input parameter:
    a. Select data object as the parameter type.
    b. Select Allow parameter to be overridden at runtime.
    2. In the mapping, use the object parameter as the object that you want to override.
    3. In the mapping task, define the parameter details:
    a. Set the type to Single.
    b. Select a default data object.
    c. On the Schedule tab, enter the parameter file directory and file name.
    4. In the parameter file, specify the object to use at runtime.
    Precede the parameter name with two dollar signs ($$). For example, you have a parameter with the name ObjParam1 and you want to override it with the data object SourceTable. You define the runtime value with the following format:
    $$ObjParam1=SourceTable
    5. If you want to change the object, update the parameter value in the parameter file.

Overriding source queries

If you used a source query or filter condition in a mapping, you can override the value specified in the mapping task with values specified in a parameter file. You can override source queries for relational and ODBC database connections.
When you define an SQL query, the fields in the overridden query must be the same as the fields in the default query. The task fails if the query in the parameter file contains fewer fields or is invalid.
If a filter condition parameter is not resolved in the parameter file, Data Integration uses the parameter name as the filter value, and the task returns zero rows.
    1. In the mapping, create a data object parameter.
    2. Select Allow parameter to be overridden at runtime.
    3. Use the parameter as the source object.
    4. In the mapping task, on the Sources tab, select Query as the source type.
    5. Enter a default custom query.
    6. On the Schedule tab, provide the parameter file name and location.
    7. In the parameter file, enter the values to use when the task runs.
    8. If you want to change the query, update the parameter value in the parameter file.
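For example, the steps above produce a parameter file entry similar to the following. The section heading, parameter name, and table and column names here are hypothetical, and the fields in the override must match the fields in the default query:

```
[MyProject].[SalesFolder].[daily_load]
$$srcQuery=SELECT ACCOUNT_ID, AMOUNT FROM SALES.ORDERS WHERE REGION='NW'
```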