Work with Connections
Connections help you connect Data Transforms to various technologies reachable from your OCI network.
This section describes the generic steps to create a connection. The displayed connection detail options may vary depending on the selected connection type.
Apart from the connection types listed in Supported Connection Types, you can create custom connectors, which you can use to connect Data Transforms to any JDBC-supported data source. See Create Custom Connectors.
To create a new connection:
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
- Select Edit to edit the provided connection details.
- Select Test Connection to test the created connection.
- Click Export to export the connection. See Export Objects.
- Select Delete Schema to delete schemas.
- Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
- Name of the connection.
- Technology associated with the created connection.
- Supported Connection Types
  This topic lists the connection types that are supported for connecting to Data Transforms.
- Create Custom Connectors
  The Custom Connections page of the Administration tab of Oracle Data Transforms helps you to create custom connectors that point to any JDBC supported data sources.
- Create a Data Transforms Connection for Remote Data Load
  You can connect to an existing Data Transforms instance and run a data load remotely.
- Create a Delta Share Connection
  Databricks Delta Share is an open protocol for secure data sharing. Oracle Data Transforms integrates with Delta Share to load data to Oracle Autonomous Database. You can use the Delta Share connection to load data from Databricks or Oracle Data Share.
- Create an Oracle Business Intelligence Cloud Connector Connection
  Oracle Business Intelligence Cloud Connector (BICC) allows you to extract business data from a data source and load it into Autonomous Database.
- Create an Oracle Financials Cloud Connection
  You can fetch real time transactional data from Oracle Financials Cloud REST endpoints, import the data entities into Data Transforms, and use them as a source in a data flow.
- Create and Use an Oracle NetSuite Connection
  You can use the Oracle NetSuite JDBC Driver or OAuth 2.0 authentication to connect to the Oracle NetSuite application. For Oracle NetSuite connections, Data Transforms allows you to load pre-built dataflows and workflows that you can run to transfer data from NetSuite to your target schema.
- Create an Oracle Object Storage Connection
  You can use Data Transforms to upload data from Oracle Object Storage to Autonomous Database.
- Create a REST Server Connection
  You can connect to any REST service endpoint, import the data entities into Data Transforms, and use them as a source in a data flow.
Parent topic: The Data Transforms Page
Supported Connection Types
This topic lists the connection types that are supported for connecting to Data Transforms.
APPLIES TO: Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.
- For the connectors that require driver installation, you need to copy the jar files to the /u01/oracle/transforms_home/userlibs directory before you add the connection.
- Apart from the connection types listed here, you can create custom connectors, which you can use to connect Data Transforms to any JDBC-supported data source. See Create Custom Connectors.
Name | Type | Supported in Data Integrator: Web Edition | Supported in Data Transforms built into Autonomous Database | Supported in Data Transforms built into OCI GoldenGate | Supports Write Operation | Notes |
---|---|---|---|---|---|---|
Aha! | Application | Yes | Yes | Yes | No | |
Ahrefs | Application | Yes | Yes | Yes | No | |
Amazon Aurora | Database | Yes | Yes | Yes | Yes | |
Amazon EMR Hive | Database | Yes | Yes | Yes | No | |
Amazon Redshift | Database | Yes | Yes | Yes | Yes | |
Apache Hive | Database | Yes | Yes | Yes | No | |
Apache Spark SQL | Database | Yes | Yes | Yes | No | |
AWS S3 | Database | Yes | Yes | Yes | No | |
Azure Billing | Application | Yes | Yes | Yes | No | |
Azure Compute | Database | Yes | Yes | Yes | No | |
Azure Data Lake Storage | Database | Yes | Yes | Yes | No | |
Azure Reserved VM Instances | Database | Yes | Yes | Yes | No | |
Azure Resource Health | Application | Yes | Yes | Yes | No | |
Azure SQL Database | Database | Yes | Yes | Yes | Yes | |
Azure Synapse Analytics | Database | Yes | Yes | Yes | Yes | |
BigCommerce | Application | Yes | Yes | Yes | No | Requires driver installation |
Cassandra | Database | Yes | Yes | Yes | Yes | |
Cloudera CDH Hive | Database | Yes | Yes | Yes | No | |
Cloudera Impala | Database | Yes | Yes | Yes | No | |
Confluence Cloud | Database | Yes | Yes | Yes | No | |
Data Transforms | Service | Yes | Yes | Yes | No | For instructions on connecting to an existing Data Transforms instance, see Create a Data Transforms Connection for Remote Data Load. |
DataStax | Application | Yes | Yes | Yes | Yes | |
Delta Share | Application | Yes | Yes | Yes | No | For instructions on creating a connection using Delta Share, see Create a Delta Share Connection. |
DocuSign | Database | Yes | Yes | Yes | No | |
eBay | Application | Yes | Yes | Yes | No | Requires driver installation |
EnterpriseDB | Database | Yes | Yes | Yes | Yes | |
FinancialForce | Application | Yes | Yes | Yes | Yes | |
FourSquare | Application | Yes | Yes | Yes | No | |
Generic Rest | Application | Yes | Yes | Yes | No | For information about connecting to any REST service endpoint to create a connection, see Create a REST Server Connection. |
Generic Rest Config | Application | Yes | No | No | No | For information about connecting to any REST service endpoint to create a connection, see Create a REST Server Connection. |
GitHub | Application | Yes | Yes | Yes | No | |
Google Ads | Application | Yes | No | No | Depends on the driver | Requires driver installation |
Google AdSense | Application | Yes | Yes | Yes | No | |
Google Analytics | Application | Yes | Yes | Yes | No | |
Google BigQuery | Database | Yes | Yes | Yes | No | |
Google Calendar | Application | Yes | Yes | Yes | No | |
Google Campaign Manager | Application | Yes | Yes | Yes | No | |
Google Contacts | Application | Yes | Yes | Yes | No | |
Google Drive | Database | Yes | Yes | Yes | No | |
Google Search Ads 360 | Application | Yes | Yes | Yes | No | |
Greenplum | Database | Yes | Yes | Yes | No | |
Hortonworks Hive | Database | Yes | Yes | Yes | No | |
HubSpot | Application | Yes | Yes | Yes | No | |
Hypersonic SQL | Database | Yes | Yes | Yes | Yes | |
IBM BigInsights | Database | Yes | Yes | Yes | No | |
IBM DB2 Hosted | Database | Yes | Yes | Yes | Yes | |
IBM DB2 UDB | Database | Yes | Yes | Yes | Yes | |
IBM DB2 Warehouse | Database | Yes | Yes | Yes | Yes | |
IBM DB2/400 | Database | Yes | Yes | Yes | Yes | |
Informix | Database | Yes | Yes | Yes | No | |
Jira | Application | Yes | Yes | Yes | No | |
Klaviyo | Application | Yes | Yes | Yes | No | |
Magento | Application | Yes | No | No | Depends on the driver | Requires driver installation |
Mailchimp | Application | Yes | Yes | Yes | No | |
MapR Hive | Database | Yes | Yes | Yes | No | |
Marketo | Application | Yes | Yes | Yes | No | |
Microsoft Dynamics 365 | Application | Yes | Yes | Yes | Yes | |
Microsoft SharePoint | Application | Yes | Yes | Yes | Yes | |
Microsoft SQL Server | Database | Yes | Yes | Yes | Yes | |
Mongo DB | Database | Yes | Yes | Yes | Yes | |
MySQL | Database | Yes | Yes | Yes | Yes | Make sure that the system variable sql_require_primary_key is set to OFF. Otherwise, an ADW to MySQL mapping could fail with a "Table does not exist" error. |
MySQL Heatwave | Database | Yes | Yes | Yes | Yes | If the MySQL Heatwave database is created with high availability, then write operation is not supported. Make sure that the system variable sql_require_primary_key is set to OFF. Otherwise, an ADW to MySQL mapping could fail with a "Table does not exist" error. |
Netezza | Database | Yes | No | No | Depends on the driver | Oracle Data Transforms uses the Netezza JDBC driver to connect to an NCR Netezza database. This driver must be installed in your Data Transforms userlibs directory. See Download the Netezza JDBC driver for more information. |
Oracle | Database | Yes | Yes | Yes | Yes | For Data Integrator Web Edition, write operation is supported only on Oracle cloud database targets. For details refer to the Oracle terms of use before deploying the image from OCI marketplace. |
Oracle Analytics Cloud | Application | Yes | Yes | Yes | No | |
Oracle Business Intelligence Cloud (BICC) Connector | Application | Yes | Yes | Yes | No | For information about creating a connection using Oracle Business Intelligence Cloud (BICC) Connector, see Create an Oracle Business Intelligence Cloud Connector Connection. |
Oracle EBS | Application | Yes | Yes | Yes | Yes | |
Oracle Financials Cloud | Application | Yes | Yes | Yes | No | For information about creating a connection using Oracle Financials Cloud, see Create an Oracle Financials Cloud Connection. |
Oracle Fusion ERP | Application | Yes | Yes | Yes | No | |
Oracle Fusion Sales | Application | Yes | Yes | Yes | No | |
Oracle Fusion Service | Application | Yes | Yes | Yes | No | |
Oracle GoldenGate – OCI | Service | Yes | Yes | Yes | Yes | |
Oracle Marketing Cloud | Application | Yes | Yes | Yes | Yes | |
Oracle NetSuite | Application | Yes | Yes | Yes | No | For information about creating a connection using Oracle NetSuite, see Create and Use an Oracle NetSuite Connection. |
Oracle Object Storage | Database | Yes | Yes | Yes | Yes | For information about creating a connection using Oracle Object Storage, see Create an Oracle Object Storage Connection. |
Oracle People Soft | Application | Yes | Yes | Yes | No | |
Oracle Sales Cloud | Application | Yes | Yes | Yes | No | |
Oracle Service Cloud | Application | Yes | Yes | Yes | No | |
Oracle SIEBEL | Application | Yes | Yes | Yes | No | |
PayPal | Application | Yes | Yes | Yes | No | |
Pivotal HD | Database | Yes | Yes | Yes | No | |
Pivotal HDB | Database | Yes | Yes | Yes | No | |
PostgreSQL | Database | Yes | Yes | Yes | Yes | |
Qmetry | Application | Yes | Yes | Yes | No | |
QuickBooks Online | Application | Yes | Yes | Yes | No | |
QuickBooks Payments | Application | Yes | Yes | Yes | No | |
Quora Ads | Application | Yes | Yes | Yes | No | |
Sage | Application | Yes | Yes | Yes | No | |
Salesforce Chatter | Application | Yes | Yes | Yes | No | |
Salesforce.com | Application | Yes | Yes | Yes | Yes | |
SAP BW/4HANA | Database | Yes | Yes | Yes | No | |
SAP HANA | Application | Yes | Yes | Yes | No | |
SAP NetWeaver | Database | Yes | Yes | Yes | No | |
SAP S/4HANA Cloud | Application | Yes | Yes | Yes | No | |
Semrush | Application | Yes | Yes | Yes | No | |
ServiceNow | Service | Yes | Yes | Yes | No | |
Shopify | Application | Yes | Yes | Yes | No | Requires driver installation |
Snowflake | Database | Yes | Yes | Yes | Yes | |
Square | Application | Yes | Yes | Yes | No | |
Stripe | Application | Yes | Yes | Yes | No | |
Sybase As Anywhere | Database | Yes | Yes | Yes | Yes | |
Sybase as Enterprise | Database | Yes | Yes | Yes | Yes | |
Sybase AS IQ | Database | Yes | Yes | Yes | Yes | |
TeamCity | Application | Yes | Yes | Yes | No | |
Teradata | Database | Yes | No | No | Depends on the driver | Data Transforms uses the Teradata JDBC Driver to connect to a Teradata Database. To use Teradata as a data source, the Teradata Gateway for JDBC must be running and this driver must be installed in your Data Transforms userlibs directory. You will find the driver here: https://downloads.teradata.com/download/connectivity/jdbc-driver. |
Teradata 17+ | Database | Yes | No | No | Depends on the driver | Data Transforms uses the Teradata JDBC Driver to connect to a Teradata Database. To use Teradata as a data source, the Teradata Gateway for JDBC must be running and this driver must be installed in your Data Transforms userlibs directory. You will find the driver here: https://downloads.teradata.com/download/connectivity/jdbc-driver. |
Tumblr | Application | Yes | Yes | Yes | No | |
Twitter | Application | Yes | Yes | Yes | No | |
Veeva CRM | Application | Yes | Yes | Yes | Yes | |
Volusion | Application | Yes | Yes | Yes | No | |
Wistia | Application | Yes | Yes | Yes | No | |
WooCommerce | Application | Yes | No | No | Depends on the driver | Requires driver installation |
WordPress | Application | Yes | Yes | Yes | No | |
Workday | Application | Yes | No | No | Depends on the driver | Requires driver installation |
Xero | Application | Yes | Yes | Yes | No | Requires driver installation |
Yelp | Application | Yes | Yes | Yes | No | |
Zendesk | Application | Yes | Yes | Yes | No | Requires driver installation |
Zoho CRM | Application | Yes | Yes | Yes | No | |
Zoom | Application | Yes | Yes | Yes | No | |
Parent topic: Work with Connections
Create Custom Connectors
APPLIES TO: Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.
To create a new connector:
- In the left pane, click Administration.
A warning message appears.
- Click Continue.
- In the left pane, click Custom Connections.
Custom Connections screen appears.
- Click Create Connection Type.
The Create Connection Type page appears.
- From the Category drop-down, select the type of connection that you wish to create: database, application, or service.
- Enter a name for the connection.
- Enter the name of the JDBC Driver.
- Click OK.
The newly created custom connection appears in the list and is available in the Create Connection page.
Click the Actions icon next to the selected connection to perform the following operations:
- Select Edit to edit the provided connection details.
- Click Export to export the connection. See Export Objects.
- Select Delete to delete the created connection.
Note
You cannot delete custom connectors that have existing connections.
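For reference, a custom connector pairs a JDBC driver class name with a driver-specific URL. The sketch below uses hypothetical values (no real driver or host) purely to show the shape of these two entries:

```python
# Hypothetical example values for a custom JDBC connector definition;
# neither the driver class nor the host below is real.
driver_class = "org.example.jdbc.Driver"              # the JDBC Driver name
jdbc_url = "jdbc:example://db.example.com:5432/sales" # the connection URL

# A JDBC URL always starts with "jdbc:" followed by a driver-specific subprotocol.
scheme, subprotocol = jdbc_url.split(":", 2)[:2]
print(scheme, subprotocol)  # jdbc example
```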
Parent topic: Work with Connections
Create a Data Transforms Connection for Remote Data Load
You can connect to an existing Data Transforms instance and run a data load remotely.
To define a Data Transforms connection:
The newly created connections are displayed in the Connections page.
Click the Actions icon () next to the selected connection to perform the following operations:
- Select Edit to edit the provided connection details.
- Select Test Connection to test the created connection.
- Click Export to export the connection. See Export Objects.
- Select Delete Schema to delete schemas.
- Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
- Name of the connection.
- Technology associated with the created connection.
Parent topic: Work with Connections
Create a Delta Share Connection
Databricks Delta Share is an open protocol for secure data sharing. Oracle Data Transforms integrates with Delta Share to load data to Oracle Autonomous Database. You can use the Delta Share connection to load data from Databricks or Oracle Data Share.
To use Databricks as a source, you need to specify the URL of the Delta Sharing server along with the bearer token that lets you access the Delta Lake share server. To use Oracle Data Share as a source, you need to specify the URL for the token end point along with a client ID and the secret key.
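For reference, the values above typically come from a small JSON profile document. The sketch below shows the fields such a profile contains per the open Delta Sharing protocol; the endpoint and token values are placeholders, not real credentials:

```python
import json

# Placeholder Delta Sharing profile (fields per the open Delta Sharing
# protocol); the endpoint and bearerToken values are illustrative only.
profile_text = """
{
  "shareCredentialsVersion": 1,
  "endpoint": "https://sharing.example.com:443/delta-sharing/",
  "bearerToken": "example-token"
}
"""

profile = json.loads(profile_text)
print(profile["endpoint"])  # the Share Endpoint URL used by the connection
```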
Creating the Delta Share Connection
To define a Delta Share connection:
- From the left pane of the Home page, click the Connections tab.
Connections page appears.
- Click Create Connection.
Create Connection page slides in.
- Do one of the following:
- In the Select Type field, enter the name or part of the name of the connection type.
- Select the Databases tab.
- Select Delta Share as the connection type.
- Click Next.
- The Connection Name field is pre-populated with a default name. You can edit this value.
- In the Share Endpoint URL textbox, enter the URL of the Delta Sharing server. Enter the value in the <host>:<port>/<shareEndpoint>/ format.
- In the Connection section, do one of the following:
- Select Oracle Data Share and provide the Token Endpoint URL, Client ID, and Client Secret for accessing the share.
  You can get this information from the Delta Share Profile JSON document, which you can download from the activation link that the Data Share provider supplies to access their share. (This is also where the Share Endpoint URL comes from.)
- Select Databricks and in the Bearer Token text box enter the token for connecting to the Delta Sharing server.
- If you need to use a proxy to access the Delta Share Server or Delta Share Storage configure the following settings:
- In the Proxy Host textbox, enter the host name of the proxy server to be used for the connection.
- In the Proxy Port textbox, enter the port number of the proxy server.
- Select the following checkboxes depending on where the proxy is required:
- Use Proxy to access Delta Share Server
- Use Proxy to access Delta Share Storage
- Click Test Connection to test the established connection.
- After providing all the required connection details, click Create.
The new connection is created.
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
- Select Edit to edit the provided connection details.
- Select Test Connection to test the created connection.
- Click Export to export the connection. See Export Objects.
- Select Delete Schema to delete schemas.
- Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
- Name of the connection.
- Technology associated with the created connection.
Creating and Running a Delta Share Data Load
To load data from Delta Share into Oracle Autonomous Database, the Oracle connection user must be an Admin user. Admin privileges are required so that the Oracle user can create and insert data into tables in another schema.
When you run the data load, Data Transforms loads the data into a corresponding table in the target schema. The data load runs incrementally. The very first time you run a data load, Data Transforms copies all the data into new tables. For every subsequent data load run, it only uploads the changes. Any additions or deletions in the records are reflected in the target tables. Note that if there is any metadata change in the table, for example a column is added, Data Transforms creates a new table to load the data onto the target server. You can create a workflow, add the data load as a step, and create a schedule to run the workflow at a predefined time interval.
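The incremental behavior described above amounts to a keyed upsert plus delete. The following sketch is a conceptual illustration only, not Data Transforms internals:

```python
# Conceptual sketch only -- not Data Transforms internals. An incremental load
# upserts source rows into the target by key and removes rows that no longer
# exist in the source, so only changes move after the first full copy.
def incremental_load(target, source_rows, key="id"):
    source = {row[key]: row for row in source_rows}
    target.update(source)      # additions and updates
    for k in list(target):     # deletions
        if k not in source:
            del target[k]
    return target

target = {}
incremental_load(target, [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}])  # first run: full copy
incremental_load(target, [{"id": 1, "v": "a2"}])                      # next run: update + delete
print(target)  # {1: {'id': 1, 'v': 'a2'}}
```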
To create and run a Delta Share data load:
- Do one of the following:
- On the Home page, click Load Data. The Create Data Load wizard appears.
In the Create Data Load tab, enter a name if you want to replace the default value, add a description, and select a project from the drop-down.
- On the Home page, click Projects, and then the required project tile. In the left pane, click Data Loads, and then click Create Data Load. The Create Data Load wizard appears.
In the Create Data Load tab, enter a name if you want to replace the default value and add a description.
- Click Next.
- In the Source Connection tab,
  - From the Connection Type drop-down, select Delta Share.
  - From the Connection drop-down, select the required connection from which you wish to add the data entities.
  - From the Share drop-down, select the share that you want to load tables from. The drop-down lists all the shares for the selected connection.
- Click Next.
- In the Target Connection tab,
  - From the Connection Type drop-down, select Oracle as the connection type.
    Note
    This drop-down lists only JDBC type connections.
  - From the Connection drop-down, select the connection to which you wish to load the data entities.
  - Enter a unique name in the Schema textbox.
  - Click Save.
The Data Load Detail page appears, listing all the tables in the selected share with their schema names.
Note
For Delta Share data loads, you cannot apply different actions - incremental merge, incremental append, recreate, truncate, append - to the data entities before loading them to the target schema. This is to make sure that the data is consistent between the Delta Sharing server and the target schema.
- Click the run icon to run the data load.
A confirmation prompt appears when the data load starts successfully.
To check the status of the data load, see the Status panel on the right below the Target Schema details. For details about the Status panel, see Monitor Status of Data Loads, Data Flows, and Workflows. This panel shows links to the jobs that execute to run this data load. Click the link to monitor the progress on the Job Details page. For more information about jobs, see Create and Manage Jobs.
All the loaded data entities along with their details are listed in the Data Entities page. To view the statistics of the data entities, click the Actions icon next to the data entity, click Preview, and then select the Statistics tab. See View Statistics of Data Entities for information.
Parent topic: Work with Connections
Create an Oracle Business Intelligence Cloud Connector Connection
Oracle Business Intelligence Cloud Connector (BICC) allows you to extract business data from a data source and load it into Autonomous Database.
To create an Oracle BICC connection you need to first configure external storage using the OCI Object Storage Connection tab in the BICC Console. You need to specify these connection details when you define the connection in Oracle Data Transforms.
You can use the BICC connection to choose the offerings whose data stores you want to extract. Data Transforms uses the Oracle Object Storage data server that Oracle BICC uses to stage the extracted files, which you can then use as a source for mapping. Note that you cannot use an Oracle BICC connection as a target for mapping.
To define an Oracle BICC connection:
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
- Select Edit to edit the provided connection details.
- Select Test Connection to test the created connection.
- Click Export to export the connection. See Export Objects.
- Select Delete Schema to delete schemas.
- Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
- Name of the connection.
- Technology associated with the created connection.
Parent topic: Work with Connections
Create an Oracle Financials Cloud Connection
You can fetch real time transactional data from Oracle Financials Cloud REST endpoints, import the data entities into Data Transforms, and use them as a source in a data flow.
To define an Oracle Financials Cloud connection:
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
- Select Edit to edit the provided connection details.
- Select Test Connection to test the created connection.
- Click Export to export the connection. See Export Objects.
- Select Delete Schema to delete schemas.
- Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
- Name of the connection.
- Technology associated with the created connection.
Parent topic: Work with Connections
Create and Use an Oracle NetSuite Connection
You can use the Oracle NetSuite JDBC Driver or OAuth 2.0 authentication to connect to the Oracle NetSuite application. For Oracle NetSuite connections, Data Transforms allows you to load pre-built dataflows and workflows that you can run to transfer data from NetSuite to your target schema.
Creating the Oracle NetSuite Connection
You can create an Oracle NetSuite connection using JDBC connectivity or OAuth 2.0 authentication.
To define an Oracle NetSuite connection:
- From the left pane of the Home page, click the Connections tab.
Connections page appears.
- Click Create Connection.
Create Connection page slides in.
- Do one of the following:
- In the Select Type field, enter the name or part of the name of the connection type.
- Select the Applications tab.
- Select Oracle NetSuite as the connection type.
- Click Next.
- The Connection Name field is pre-populated with a default name. You can edit this value.
- To specify the connection details, do one of the following:
- To use JDBC connectivity, specify the following details:
- JDBC URL - Enter the URL of the SuiteAnalytics Connect server to be used for the connection.
- User - Enter the user name for connecting to the data server.
- Password - Enter the password for connecting to the data server.
- Account ID - Enter the account ID for connecting to the data server.
- Role ID - Enter the role ID for connecting to the data server.
- To use OAuth 2.0 authentication, click the OAuth 2.0 switch and then specify the following details:
- Username - Enter the name of the user who has role access to log in to NetSuite using an OAuth 2.0 connection.
- Account ID - Enter the account ID for connecting to the data server. You can get this information by logging into the NetSuite account and viewing the SuiteAnalytics connect information.
- Role ID - Enter the role ID for connecting to the data server. You can get this information by logging into the NetSuite account and viewing the SuiteAnalytics connect information.
- Client ID - Enter the client ID for connecting to the data server.
To obtain the client ID, create an Integration record in NetSuite by enabling OAuth 2.0 Client Credentials Flow. Copy and save the Client ID that is displayed when the Integration Record is successfully created.
- Public Certificate and Private Key - Use the OpenSSL commands to generate the key pair in the required PEM format. For example:
  openssl req -x509 -newkey rsa:4096 -sha256 -keyout auth-key.pem -out auth-cert.pem -nodes -days 730
  Paste the contents of auth-cert.pem in the Public Certificate field. Paste the contents of auth-key.pem in the Private Key field.
- Certificate ID - Enter the Certificate ID for connecting to the data server.
  To get the certificate ID, use the NetSuite OAuth 2.0 Client Credentials (M2M) Setup to add the public certificate file (auth-cert.pem) to the certificate key list and copy the generated Certificate ID.
- To use JDBC connectivity, specify the following details:
- If the source that you want to use for mapping is a saved search, you need to also specify the following details in Saved Search Extraction:
- Application ID: Enter the NetSuite Application ID for Data Transforms.
- Version: Enter the NetSuite version number.
- Select the checkbox in Build Data Model to install pre-built dataflows and workflows that you can run to extract data from NetSuite and move it to your Oracle target schema using the Build Data Warehouse wizard.
- Click Test Connection to test the established connection.
- After providing all the required connection details, click Create.
The new connection is created.
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
- Select Edit to edit the provided connection details.
- Select Test Connection to test the created connection.
- Select Build Data Warehouse to select the functional areas and create the NetSuite Data Warehouse in the target schema. See Using the Build Data Warehouse Wizard for more information.
- Click Export to export the connection. See Export Objects.
- Select Delete Schema to delete schemas.
- Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
- Name of the connection.
- Technology associated with the created connection.
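The Public Certificate and Private Key fields in the OAuth 2.0 steps each expect a specific PEM block. As a quick sanity check before pasting, this small hypothetical helper (not part of Data Transforms) reports the label of a PEM block:

```python
# Hypothetical helper (not part of Data Transforms): report which PEM block a
# file contains, so the right contents go into the right connection field.
def pem_label(pem_text):
    for line in pem_text.splitlines():
        if line.startswith("-----BEGIN ") and line.endswith("-----"):
            return line[len("-----BEGIN "):-len("-----")]
    raise ValueError("no PEM block found")

# Truncated placeholder PEM contents, for illustration only.
cert = "-----BEGIN CERTIFICATE-----\nMIIB...\n-----END CERTIFICATE-----"
key = "-----BEGIN PRIVATE KEY-----\nMIIE...\n-----END PRIVATE KEY-----"
print(pem_label(cert))  # CERTIFICATE
print(pem_label(key))   # PRIVATE KEY
```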
Using the Build Data Warehouse Wizard
Data in your NetSuite account is grouped into business or subject areas in the Analytics Warehouse. The Build Data Warehouse wizard allows you to select the areas that you want to include in the newly created Data Warehouse.
To use the Build Data Warehouse Wizard:
- On the Home page, click the Connections tab. The Connections page appears.
- Click the Actions icon next to the Oracle NetSuite connection that you want to use to build the data warehouse and click Build Data Warehouse.
The Build Data Warehouse wizard opens.
- From the Connection drop-down list, choose the Autonomous Database connection where your target schema resides.
- From the Staging Schema drop-down, select the staging schema. All schemas corresponding to the selected connection are listed in two groups:
  - Existing Schema (ones that you've imported into Oracle Data Transforms) and
  - New Database Schema (ones that you've not yet imported).
- Similarly, select the Target Schema.
- Click Next.
- Select the NetSuite Business Areas that you want to use to transfer data from the NetSuite Data Warehouse to the target schema.
- Click Save.
Data Transforms starts the process to build the data warehouse. Click Jobs on the left pane of the Home page to monitor the progress of the process. When the job completes successfully, Data Transforms creates a Project folder that includes all the pre-built workflows and dataflows, which you can run to transfer data from the NetSuite connection to your target schema. See Running the Pre-Built Workflows to Load Data into the Target Schema for more information.
Running the Pre-Built Workflows to Load Data into the Target Schema
When the Build Data Warehouse wizard completes successfully, Data Transforms creates a project that includes all the pre-built data flows and workflows that you can run to extract data from a NetSuite connection and load it into your target schema.
To view and run the pre-built workflows:
- Click Projects on the left pane of the Home page and select the newly created NetSuite project.
- Click Workflows in the left pane. The following pre-built workflows are listed in the Project Details page:
- Stage NetSuite Source to SDS
- Extract Transaction Primary Keys
- Load SDS to Warehouse
- Apply Deletes
- All Workflows
- Click the Actions icon next to the workflow you want to run and click Start.
Oracle recommends that you run All Workflows to execute all the pre-built workflows.
To see the status of the workflow, click Jobs from the left pane in the current project. When the job completes successfully, all the data from the NetSuite connection is loaded into the target schema.
Parent topic: Work with Connections
Create an Oracle Object Storage Connection
You can use Data Transforms to upload data from Oracle Object Storage to Autonomous Database.
The OCI Object Storage dedicated endpoints feature allows OCI customers to securely access their storage buckets. See Object Storage Dedicated Endpoints for more information. You need to use the new URL format when you create Object Storage connections in Data Transforms. For users who already have an Object Storage connection, the existing URL is automatically updated to the new URL format.
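As a rough sketch, Object Storage dedicated endpoints follow a namespace-and-region URL shape similar to the one below. The namespace and region values are placeholders, and you should verify the exact format for your tenancy and realm against the OCI Object Storage documentation:

```python
# Placeholder namespace and region; substitute your tenancy's values.
namespace = "mytenancynamespace"
region = "us-phoenix-1"

# Assumed dedicated-endpoint shape; confirm against the OCI docs for your realm.
url = f"https://{namespace}.objectstorage.{region}.oci.customer-oci.com"
print(url)
```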
To create an Oracle Object Storage connection you need to have an Oracle Cloud Infrastructure username and an auth token. See Getting an Auth Token for information about how to generate the auth token. You need to specify these details when you define the connection in Oracle Data Transforms.
Note the following:
- To use an Oracle Object Storage connection to import data into Data Transforms, you must use a public IP address to access the compute node. If you want to use a private IP address to access the Object Storage service, make sure that you have access to the Internet.
- The supported file format for loading data from Oracle Object Storage to Autonomous Database and vice versa is CSV.
- The supported data types are Numeric, Double, String, and Date.
- Data load is not supported for Oracle Object Storage connections.
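The CSV and data-type constraints above can be checked locally before you upload a file. A small sketch using Python's standard csv module; the column names and file contents are made up for illustration:

```python
import csv
import io
from datetime import datetime

# A sample of the kind of CSV content the Object Storage load path supports:
# only Numeric, Double, String, and Date columns.
sample = io.StringIO(
    "order_id,amount,customer,order_date\n"
    "1001,49.90,Alice,2024-03-15\n"
    "1002,15.25,Bob,2024-03-16\n"
)

rows = []
for row in csv.DictReader(sample):
    rows.append({
        "order_id": int(row["order_id"]),      # Numeric
        "amount": float(row["amount"]),        # Double
        "customer": row["customer"],           # String
        "order_date": datetime.strptime(row["order_date"], "%Y-%m-%d").date(),  # Date
    })

print(rows[0])
```

If a value fails its conversion (for example, a non-numeric string in the amount column), the script raises an error, which is a quick way to catch rows that would not fit the supported types.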
To define an Oracle Object Storage connection:
The newly created connections are displayed in the Connections page.
Click the Actions icon () next to the selected connection to perform the following operations:
- Select Edit to edit the provided connection details.
- Select Test Connection to test the created connection.
- Click Export to export the connection. See Export Objects.
- Select Delete Schema to delete schemas.
- Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
- Name of the connection.
- Technology associated with the created connection.
Parent topic: Work with Connections
Create a REST Server Connection
To create a generic REST connector, you need to provide the JDBC URL, username, and password to connect to the endpoint. You can also create and upload a config file that contains information such as the authentication methods, endpoints, and tables that you want to import data entities from.
The newly created connections are displayed in the Connections page.
Click the Actions icon () next to the selected connection to perform the following operations:
- Select Edit to edit the provided connection details.
- Select Test Connection to test the created connection.
- Click Export to export the connection. See Export Objects.
- Select Delete Schema to delete schemas.
- Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
- Name of the connection.
- Technology associated with the created connection.
Creating a Generic REST Connection
To create this connection, you need to specify the REST service URL and choose a temporary schema where Data Transforms can create data entities after the reverse-engineering operation.
To define a REST server connection:
- From the left pane of the Home page, click the Connections tab.
The Connections page appears.
- Click Create Connection.
The Create Connection page slides in.
- Do one of the following:
- In the Select Type field, enter the name or part of the name of the connection type.
- Select the Applications tab.
- Select Generic Rest as the connection type.
- Click Next.
- The Connection Name field is pre-populated with a default name. You can edit this value.
- In the REST Service URL text box, enter the URL of the endpoint that services the REST resources.
- In the Proxy Host text box, enter the host name of the proxy server to be used for the connection.
- In the Proxy Port text box, enter the port number of the proxy server.
- In the User text box, enter the user name for connecting to the REST endpoint.
- In the Password text box, enter the password for connecting to the REST endpoint.
- Choose a connection from the Staging Connection drop-down list. The list displays only existing Autonomous Database connections. To use a different connection, create the connection before you reach this page.
- After providing all the required connection details, click Test Connection to test the connection.
- Click Create.
The new connection is created.
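The values collected in the steps above (REST Service URL, Proxy Host, Proxy Port, User, and Password) correspond to an ordinary authenticated HTTP request routed through a proxy. A minimal sketch with Python's standard library, where every value is a hypothetical placeholder mirroring the wizard fields:

```python
import base64
import urllib.request

# Hypothetical placeholders mirroring the connection fields.
rest_service_url = "https://api.example.com/v1"     # REST Service URL
proxy_host, proxy_port = "proxy.example.com", 8080  # Proxy Host / Proxy Port
user, password = "dt_user", "s3cret"                # User / Password

# Route traffic through the proxy, as the connection settings would.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"https": f"http://{proxy_host}:{proxy_port}"})
)

# Basic authentication header built from the user/password fields.
token = base64.b64encode(f"{user}:{password}".encode()).decode()
request = urllib.request.Request(
    f"{rest_service_url}/resources",
    headers={"Authorization": f"Basic {token}"},
)

# Build (but do not send) the request to show what the connection amounts to.
print(request.full_url)
print(request.get_header("Authorization")[:10])  # "Basic ZHRf"
# To actually call the endpoint: opener.open(request)
```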
Creating a Generic Rest Connection Using a Config File
APPLIES TO: Data Transforms that is available as a separate Marketplace listing, Data Integrator: Web Edition.
To create a generic REST connector, you need the JDBC URL, username, password, and a config file. The config file is a model file with the file_name.rest naming convention that you upload when you create a REST Server connection. You need to specify the endpoints, table mappings, and the authentication methods to create the config file. You can create the config file using any text editor.
- From the left pane of the Home page, click the Connections tab.
The Connections page appears.
- Click Create Connection.
The Create Connection page slides in.
- Do one of the following:
- In the Select Type field, enter the name or part of the name of the connection type.
- Select the Applications tab.
- Select Generic Rest Config as the connection type.
- Click Next.
- The Connection Name field is pre-populated with a default name. You can edit this value.
- Use the Config File text box to upload the config file that you want to use.
- In the JDBC URL textbox, enter the URL to connect to the server.
- In the User and Password text boxes, enter the user name and password for connecting to the REST endpoint. You can leave these fields blank if these values are not applicable or are already specified in the JDBC URL.
- After providing all the required connection details, click Test Connection to test the connection.
- Click Create.
The new connection is created.
Parent topic: Work with Connections