Databricks Delta sources in mappings

In a mapping, you can configure a Source transformation to represent a Databricks Delta object.
The following table describes the Databricks Delta source properties that you can configure in a Source transformation:
Connection
Name of the source connection. Select a source connection or click New Parameter to define a new parameter for the source connection.
Parameterization is not applicable to elastic mappings.
Source Type
Type of the source object. Select one of the following source types:
  - Single Object
  - Parameter. Select Parameter to define the source type when you configure the task. Not applicable to elastic mappings.
Object
Name of the source object.
The following table describes the Databricks Delta query options that you can configure in a Source transformation:
Query Options
Filters the source data based on the conditions you specify. Click Configure to configure a filter option:
Filter
Filters records to reduce the number of rows that the Secure Agent reads from the source. Add conditions in a read operation to filter records. You can specify the following filter conditions:
  - Not parameterized. Use a basic filter to specify the object, field, operator, and value to select specific records.
  - Completely parameterized. Use a parameter to specify the filter query. Not applicable to elastic mappings.
  - Advanced. Use an advanced filter to define a complex filter condition.
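For example, an advanced filter can combine multiple conditions in one expression, such as order_date > '2023-01-01' AND order_amount > 1000, where order_date and order_amount are hypothetical source fields.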
The following table describes the Databricks Delta source advanced properties that you can configure in a Source transformation:
Note: Advanced source properties are not applicable to elastic mappings.
Database Name
Overrides the database name specified in the connection and the database name provided during metadata import.
Table Name
Overrides the table name used in the metadata import with the table name that you specify.
Staging Location
Relative directory path to store the staging files.
  - If the Databricks cluster is deployed on AWS, use the path relative to the Amazon S3 staging bucket.
  - If the Databricks cluster is deployed on Azure, use the path relative to the Azure Data Lake Storage Gen2 staging filesystem name.
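For example, if the staging bucket or filesystem specified in the connection is named dbstaging (a hypothetical name), a staging location of delta/stage directs the Secure Agent to store the staging files under dbstaging/delta/stage.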
Job Timeout
Maximum time in seconds that the Spark job can take to complete processing. If the job does not complete within the specified time, the Databricks cluster terminates the job and the mapping fails.
If you do not specify a job timeout, the mapping succeeds or fails based on the job completion.
Job Status Poll Interval
Interval in seconds at which the Secure Agent polls Databricks to check the job completion status.
Default is 30 seconds.
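For example, if you set the job timeout to 600 seconds and keep the default poll interval of 30 seconds, the Secure Agent checks the job status about 20 times before the timeout elapses.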
DB REST API Timeout
Maximum time in seconds for which the Secure Agent retries the REST API calls to Databricks when there is a network connection error or when the REST endpoint returns a 5xx HTTP error code.
Default is 10 minutes (600 seconds).
DB REST API Retry Interval
Time interval in seconds at which the Secure Agent retries the REST API call when there is a network connection error or when the REST endpoint returns a 5xx HTTP error code.
This value does not apply to the job status REST API. The Secure Agent uses the Job Status Poll Interval value for the job status REST API.
Default is 30 seconds.
Tracing Level
Sets the amount of detail that appears in the log file. You can choose terse, normal, verbose initialization, or verbose data.
Default is normal.
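The DB REST API Timeout and DB REST API Retry Interval properties together define a retry window: the Secure Agent keeps retrying a failed REST API call at the retry interval until the overall timeout is exhausted. The following Python sketch illustrates that general pattern under the default values. It is not the Secure Agent implementation, and request_fn and TransientError are hypothetical placeholders.

    import time

    class TransientError(Exception):
        # Stands in for a network connection error or a 5xx response from the REST endpoint.
        pass

    def call_with_retries(request_fn, timeout_seconds=600, retry_interval_seconds=30):
        # Retry the hypothetical request_fn until it succeeds or the overall
        # timeout (DB REST API Timeout) is exhausted, waiting for the retry
        # interval (DB REST API Retry Interval) between failed attempts.
        deadline = time.monotonic() + timeout_seconds
        while True:
            try:
                return request_fn()
            except TransientError:
                if time.monotonic() + retry_interval_seconds > deadline:
                    raise
                time.sleep(retry_interval_seconds)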