Databricks Delta target in mappings

In a mapping, you can configure a Target transformation to represent a Databricks Delta object.
The following table describes the Databricks Delta properties that you can configure in a Target transformation:
Connection
Name of the target connection. Select a target connection or click New Parameter to define a new parameter for the target connection.
Target Type
Target type. Select one of the following types:
- Single Object.
- Parameter. Select Parameter to define the target type when you configure the task.
Object
Name of the target object.
Create Target
Creates a target. Enter a name for the target object and select the source fields that you want to use. By default, all source fields are used.
You cannot parameterize the target at runtime.
Operation
Defines the type of operation to be performed on the target table.
Select from the following list of operations:
- Insert (Default)
- Update
- Upsert
- Delete
- Data Driven
When you use the upsert operation, you must set the Update Mode advanced property to Update Else Insert.
If the key column receives a null value from the source, the operations behave as follows:
- Update. Skips the operation and does not update the row.
- Delete. Skips the operation and does not delete the row.
- Upsert. Inserts a new row instead of updating the existing row.
Update Columns
The fields to use as temporary primary key columns when you update, upsert, or delete data on the Databricks Delta target tables. When you select more than one update column, the mapping task uses the AND operator with the update columns to identify matching rows.
Applies to the update, upsert, delete, and data driven operations.
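For example, if you select order_id and region as update columns for an upsert, the matching logic is equivalent to a Databricks SQL MERGE statement that joins the target to the incoming rows on both columns. The following sketch is illustrative, with hypothetical table and column names; the SQL that the connector generates internally may differ:
MERGE INTO target_table AS t
USING staged_rows AS s
ON t.order_id = s.order_id AND t.region = s.region
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *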
Data Driven Condition
Flags rows for an insert, update, delete, or reject operation based on the expressions that you define.
For example, the following IIF statement flags a row for reject if the ID field is null. Otherwise, it flags the row for update:
IIF(ISNULL(ID), DD_REJECT, DD_UPDATE)
Required if you select the data driven operation.
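You can also flag rows for other operations. For example, the following expression, where STATUS is a hypothetical source field, flags rows marked 'D' in the source for deletion and flags all other rows for insertion:
IIF(STATUS = 'D', DD_DELETE, DD_INSERT)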
The following table describes the Databricks Delta advanced properties that you can configure in a Target transformation:
Target Database Name
Overrides the database name provided in the connection and the database selected in the metadata browser for existing targets.
Not applicable to elastic mappings.
Target Table Name
Overrides the table name at runtime for existing targets.
Not applicable to elastic mappings.
Write Disposition
Overwrites or adds data to the existing data in a table. You can select from the following options:
- Append. Appends data to the existing data in the table, even if the table is empty.
- Truncate. Overwrites the existing data in the table.
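In Databricks SQL terms, the two options roughly correspond to the following statements. The table and source names are hypothetical, and the SQL that the connector generates internally may differ:
-- Append
INSERT INTO target_table SELECT * FROM staged_rows
-- Truncate
INSERT OVERWRITE TABLE target_table SELECT * FROM staged_rows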
Staging Location
Relative directory path to store the staging files.
- If the Databricks cluster is deployed on AWS, use the path relative to the Amazon S3 staging bucket.
- If the Databricks cluster is deployed on Azure, use the path relative to the Azure Data Lake Storage Gen2 staging filesystem name.
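For example, with a hypothetical S3 staging bucket named my-staging-bucket and a staging location of dbdelta/staging, the agent writes staging files under s3://my-staging-bucket/dbdelta/staging. On Azure, the same relative path resolves against the staging filesystem name specified in the connection.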
Not applicable to elastic mappings.
Job Timeout
Maximum time in seconds for the Spark job to complete processing. If the job does not complete within the specified time, the Databricks cluster terminates the job and the mapping fails.
If you do not specify a job timeout, the mapping succeeds or fails based on the job completion.
Not applicable to elastic mappings.
Job Status Poll Interval
The interval in seconds at which the Secure Agent polls Databricks for the job completion status.
Default is 30 seconds.
Not applicable to elastic mappings.
DB REST API Timeout
Maximum time in seconds for which the Secure Agent retries REST API calls to Databricks when a network connection error occurs or the REST endpoint returns a 5xx HTTP error code.
Default is 10 minutes.
Not applicable to elastic mappings.
DB REST API Retry Interval
The interval in seconds at which the Secure Agent retries a REST API call when a network connection error occurs or the REST endpoint returns a 5xx HTTP error code.
This value does not apply to the job status REST API. Use the Job Status Poll Interval value for the job status REST API.
Default is 30 seconds.
Not applicable to elastic mappings.
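For example, with the default DB REST API Timeout of 10 minutes and the default DB REST API Retry Interval of 30 seconds, the Secure Agent can retry a failing call up to roughly 20 times (600 seconds / 30 seconds) before the call is treated as failed.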
Update Mode
Defines how rows are updated in the target tables. Select from the following options:
- Update As Update. Rows that match the selected update columns are updated in the target.
- Update Else Insert. Rows that match the selected update columns are updated in the target. Rows that do not match are appended to the target.
Not applicable to elastic mappings.
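In Databricks SQL terms, Update Else Insert corresponds to the upsert MERGE sketch shown earlier under Update Columns, which includes both WHEN MATCHED and WHEN NOT MATCHED clauses. For Update As Update, the equivalent statement would omit the WHEN NOT MATCHED THEN INSERT clause, so rows without a match are left unchanged. As before, this is an illustration rather than the exact SQL that the connector generates.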

Create a target table at runtime

You can use an existing target or create a target to hold the results of a mapping. If you choose to create the target, the Secure Agent creates the target when you run the task, if the target does not already exist.
To specify the target properties, perform the following tasks:
    1. Select the Target transformation in the mapping.
    2. To specify the target, click the Target tab.
    3. Select the target connection.
    4. For the target type, choose Single Object or Parameter.
    5. Specify the target object or parameter.
    6. To specify a target object, perform the following tasks:
        a. Click Select and choose a target object. You can select an existing target object or create a new target object at runtime and specify the object name.
        b. To create a target object at runtime, select Create New at Runtime. You must specify the target object name.
        c. Enter the name of the target table that you want to create in the Object Name field.
        d. Enter the location of the target table data in the Table Location field. The table location is relative to the data bucket or data filesystem name specified in the connection.
        e. In the Path field, specify the Databricks database name.
        f. Click OK.
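For example, with a hypothetical Object Name of sales_target, a Table Location of delta/sales, and a Path of demo_db, the agent creates the table sales_target in the demo_db database and stores the table data under delta/sales, relative to the data bucket or filesystem name specified in the connection.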

Rules and guidelines for create target at runtime

When you configure a mapping with the Create New at Runtime option, consider the following rules and guidelines: