Using Parameters in Pipelines
In Data Integration, parameters in pipelines originate from parameters that are assigned in data flows and tasks.
For data flow operators whose configuration uses expressions, the values of any parameters used in those expressions are also available for reconfiguration in a pipeline.
See Using Data Flow Parameters and About Expression Parameters to learn how to use parameters and expression parameters (also known as scalar parameters) in data flows.
Parameters and expression parameters that are used in data flows and tasks have types and default values. If you don't change the incoming parameter values for tasks and associated data flows in a pipeline, the original default values are used at pipeline design time, or at runtime when you run the pipeline.
The supported incoming parameter data types are:
- NUMERIC
- VARCHAR
- DATE
- DATETIME
- DECIMAL
- DOUBLE
- FLOAT
- INTEGER
- TIMESTAMP
- TINYINT
- BOOLEAN
- LONG
- STRING
Parameters for the source and target resources in a data loader task can also be reconfigured when the task is used in a pipeline. To learn how to use parameters in data loader tasks, see Using Parameters for Source and Target.
Ways to Reconfigure Parameters in a Pipeline
When you add a parameterized task to a pipeline, you can use the Configuration tab in the Properties panel to reconfigure any of the parameter values specified in that task, or in the associated data flow that an integration task wraps.
To reconfigure an incoming parameter value for a task in a pipeline, you can assign to the input:
- A specific value directly
- The value of an output from a previous operator (for example, an output parameter)
- A user-defined parameter, which lets you set the value at pipeline runtime, or at design time when you create a pipeline task that wraps the pipeline.
- A system parameter
When you create a pipeline task for a parameterized pipeline, you can use the Configure Parameters page to change the default values of user-defined parameters, if any.
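Parameter values can also be reconfigured programmatically when a task is run. The following is a minimal sketch using the OCI Python SDK, assuming a published pipeline task and a user-defined pipeline parameter named PIPELINE_PARAM_REGION (all identifiers are placeholders); the CreateConfigProvider and ParameterValue binding shape shown here is an assumption to verify against the SDK reference for your version.

```python
# Minimal sketch: start a published pipeline task and override a
# user-defined pipeline parameter for that run (OCI Python SDK).
# Workspace, application, task, and parameter names are placeholders;
# verify model and field names against the SDK reference.
import oci

config = oci.config.from_file()  # default ~/.oci/config profile
client = oci.data_integration.DataIntegrationClient(config)

workspace_id = "ocid1.disworkspace.oc1..example"   # placeholder
application_key = "example-application-key"        # placeholder
task_key = "example-published-task-key"            # placeholder

# Bind a new value to the user-defined parameter for this run only;
# the default value saved in the pipeline is not changed.
bindings = {
    "PIPELINE_PARAM_REGION": oci.data_integration.models.ParameterValue(
        simple_value="us-phoenix-1"
    )
}

details = oci.data_integration.models.CreateTaskRunDetails(
    registry_metadata=oci.data_integration.models.RegistryMetadata(
        aggregator_key=task_key
    ),
    config_provider=oci.data_integration.models.CreateConfigProvider(
        bindings=bindings
    ),
)

task_run = client.create_task_run(workspace_id, application_key, details).data
print(task_run.key, task_run.status)
```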
Types of Pipeline Parameters
In Data Integration, the types of parameters you can use in pipelines are user-defined or system-generated.
System-generated parameters are further classified into system parameters and output parameters.
User-defined parameters
User-defined parameters are custom parameters that you create in pipelines to assign to the inputs of incoming parameters for task operators. A user-defined parameter is local to the pipeline where you created it, so you can't reuse a user-defined parameter across pipelines.
A user-defined parameter has a name, a type, and a value. You provide a name and a value when you create a user-defined parameter. The type depends on the type of the incoming parameter that you're assigning the user-defined parameter to. You can't change the type of a user-defined parameter.
The user-defined parameter types that you can create in a pipeline include:
- Data asset
- Connection
- Schema
- Data entity
- Condition (from a parameterized join or filter condition)
- Expression
You can also add parameters of data types such as NUMERIC, VARCHAR, DATE, DECIMAL, and STRING.
While designing a pipeline, you can use the Parameters panel or the Parameters tab on the Properties panel of the pipeline to see the user-defined parameters that have been added.
The user-defined parameters are also available in the Expression Builder when you define an expression in the pipeline using an expression operator.
System parameters
Data Integration generates system parameters that you can assign to the inputs of incoming parameters for task operators at runtime. A system parameter has a name and a type. You can bind the value of a system parameter to an incoming parameter of the same type, but you can't change the value.
The following table shows the system parameters that you can use in a pipeline.
| Name | Type | Description |
| --- | --- | --- |
| SYS.TASK_START_TIME | TIMESTAMP | Task start time in UTC. |
| SYS.TASK_RUN_KEY | STRING | Task run key of the task run. |
| SYS.TASK_RUN_NAME | STRING | Task run name of the task run. |
| SYS.TASK_KEY | STRING | Key of the task. |
| SYS.TASK_NAME | STRING | Name of the task. |
| SYS.PARENT_TASK_RUN_KEY | STRING | Parent task run key. |
| SYS.TASK_SCHEDULE_TRIGGER_TIME | TIMESTAMP | The trigger time for the schedule (for a task run that's initiated from a schedule). The scheduled time might be different from the task start time. |
| SYS.TASK_SCHEDULE_TIMEZONE | STRING | The time zone of the scheduled time, as defined in the task schedule. |
| SYS.RETRY_ATTEMPT | INTEGER | The number of run attempts of a task that's configured for retries when a run fails. |
| SYS.PIPELINE_NAME | STRING | Name of the pipeline. |
| SYS.PIPELINE_IDENTIFIER | STRING | Identifier of the pipeline. |
| SYS.TIMEOUT | INTEGER | The timeout value (in minutes) of the pipeline. The pipeline run fails when the timeout value is reached. |
| SYS.WORKSPACE_ID | STRING | Identifier of the workspace. |
| SYS.APPLICATION_KEY | STRING | Key of the application. |
| SYS.REGION | STRING | Region name. |
While designing a pipeline, you can use the Parameters tab on the Properties panel of the pipeline to see the system pipeline parameters that are available. The Used in column indicates where a parameter is used in the pipeline.
While adding an expression to an expression operator in a pipeline, you can use the Parameters tab in the Expression Builder to see the list of system parameters. You can use system parameters in expressions to log system information.
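Several of these system parameters (for example, SYS.TASK_RUN_KEY and SYS.TASK_RUN_NAME) surface run metadata that can also be read back outside the pipeline. As a rough sketch using the OCI Python SDK with placeholder identifiers, a monitoring script might look up a run by its task run key; verify the field names against the SDK reference.

```python
# Rough sketch: look up a task run by its run key with the OCI Python SDK.
# The workspace, application, and run key values are placeholders.
import oci

config = oci.config.from_file()
client = oci.data_integration.DataIntegrationClient(config)

workspace_id = "ocid1.disworkspace.oc1..example"   # placeholder
application_key = "example-application-key"        # placeholder
task_run_key = "example-task-run-key"              # for example, the value
                                                   # that SYS.TASK_RUN_KEY resolves to

task_run = client.get_task_run(workspace_id, application_key, task_run_key).data

# Fields such as status and error_message mirror the kind of run
# information that system parameters surface inside the pipeline.
print(task_run.name, task_run.status, task_run.error_message)
```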
Output parameters
Output parameters are system-generated parameters for operators that you can use to pass output values from one operator in a pipeline to another operator downstream.
The following table shows some output parameters that you can use.
| Name | Type | Description |
| --- | --- | --- |
| SYS.STATUS | STRING | Status of the operation (for example, a task run). |
| SYS.ERROR_MESSAGE | STRING | Error message for the operation. |
| SYS.NO_OF_INSERTS | INTEGER | Number of rows inserted and updated. |
| SYS.DATA_READ | INTEGER | Amount of data read. |
| SYS.DATA_WRITTEN | INTEGER | Amount of data written. |
| SYS.DURATION | INTEGER | Duration of the operation. |
| SYS.START_TIME_TIMESTAMP | TIMESTAMP | Start time of the task execution in Timestamp format. |
| SYS.END_TIME_TIMESTAMP | TIMESTAMP | End time of the task execution in Timestamp format. |
| SYS.TASK_RUN_KEY | STRING | Task run key of the task run. |
| SYS.TASK_RUN_NAME | STRING | Task run name of the task run. |
| SYS.TASK_NAME | STRING | Name of the task. |
| SYS.TASK_KEY | STRING | Key of the task. |
While designing a pipeline, you can use the Output tab on the Properties panel of a task operator to view the outputs that can be used as inputs for the next operator.
Note the following:
- For a SQL task operator associated with a stored procedure, the outputs include any output parameters from the SQL stored procedure.
- For a merge operator, the outputs available are a combination of output parameters from the system and the outputs of each operator that's connected to the merge operator.
  - Only the following outputs from the system are available: SYS.STATUS, SYS.START_TIME_TIMESTAMP, and SYS.END_TIME_TIMESTAMP.
  - Task outputs are qualified by the task operator name. For example: TASK_OPERATOR_1.SYS.STATUS, TASK_OPERATOR_2.SYS.STATUS
  - Outputs from an expression operator are qualified by the expression operator name and expression identifier. For example: EXPRESSION_1.EXPRESSION_ERROR_MESSAGE
  - Outputs from a merge operator are qualified by the merge operator name. For example: MERGE_3.SYS.START_TIME_TIMESTAMP
- For an OCI Data Flow task operator associated with an OCI Data Flow application, the outputs include any output parameters from the application, and the following output data types:
  - com.oracle.dos.task.model.customparameter.Arguments: args
  - Float: bytesProcessed and bytesWritten
  - String: applicationId and displayName
- For a REST task operator, the following data types of REST response outputs are available:
  - JSON or JSON_TEXT: SYS.RESPONSE_PAYLOAD_JSON and SYS.RESPONSE_HEADERS_JSON
  - String: SYS.RESPONSE_STATUS

Note
- To use a REST response output in downstream operations, pass the output through an expression operator, where you can apply string and regex operations, and then use the result in subsequent downstream task operators (the sketch after this note illustrates the kind of extraction involved).
- The String outputs SYS.RESPONSE_PAYLOAD and SYS.RESPONSE_HEADERS are deprecated. We recommend converting any existing usage to the JSON equivalents, SYS.RESPONSE_PAYLOAD_JSON and SYS.RESPONSE_HEADERS_JSON, respectively.
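For orientation only, the snippet below illustrates the kind of string and regex extraction described in the note. It is plain Python that runs outside Data Integration, with a made-up payload; inside a pipeline, you would express the equivalent logic with the expression operator's own string and regex functions.

```python
# Illustration only: the kind of string/regex extraction you would express
# with an expression operator when consuming SYS.RESPONSE_PAYLOAD_JSON.
# The payload below is made up; inside a pipeline, use the expression
# operator's own string and regex functions instead of Python.
import json
import re

response_payload = '{"job": {"id": "ocid1.job.oc1..abcd", "state": "SUCCEEDED"}}'

# Structured approach: parse the JSON and read the field directly.
job_id = json.loads(response_payload)["job"]["id"]

# Regex approach: extract the same field with a pattern match.
match = re.search(r'"id"\s*:\s*"([^"]+)"', response_payload)
job_id_from_regex = match.group(1) if match else None

print(job_id, job_id_from_regex)  # both print ocid1.job.oc1..abcd
```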
About the Parameters Panel
The Parameters panel shows all the user-defined pipeline parameters that have been created in a pipeline.
To access the Parameters panel when the pipeline is opened in the designer canvas, select Parameters from the View menu on the canvas toolbar.
Use the View tab to see the user-defined parameters in the pipeline. Used and unused parameters are shown. When you click the Used in link for a parameter that's used in the pipeline, the task operator referencing that parameter is brought into focus on the canvas.
Use the Config tab in the Parameters panel to delete a user-defined parameter or edit the default value.
When you delete a user-defined pipeline parameter, any configured values in operators associated with the parameter are removed. To reconfigure an input value, see Configuring Incoming Parameters for Tasks.
Adding a Parameter in a Pipeline
You can add and define user-defined parameters on incoming parameters of task operators in a pipeline.
How you define a user-defined parameter depends on the type of the incoming parameter. For all parameter types, you provide a name and an optional description, and set a default value for the user-defined parameter. You can't change the type, and you can't change the name and description after the parameter is created.
A user-defined parameter can be reused within the same pipeline to represent the configured value for another incoming parameter of the same type.
Viewing the Parameters in a Pipeline
You can view the system parameters that are available in a pipeline, and the user-defined parameters that you have created in a pipeline.
You can also use this procedure to see where a parameter is used in a pipeline.
Assigning a Parameter in a Pipeline
You can assign existing user-defined parameters and system parameters to incoming parameters of task operators in a pipeline.
Editing the Default Value of a User-Defined Parameter
At pipeline design time, you can quickly change the default value of an assigned user-defined parameter.
Unassigning a Parameter in a Pipeline
Unassigning a parameter removes only the reconfigured value of the incoming parameter. It doesn't delete the incoming parameter from the task or delete the user-defined parameter.
Deleting a Parameter in a Pipeline
You can delete user-defined pipeline parameters that you no longer need in a pipeline.
System parameters in a pipeline can't be deleted.