About Managing Data Connections

You can connect to a variety of data sources and remote applications to provide the background information for reports. You can blend the additional data from the various data sources with the prebuilt datasets to enhance business analysis.

Oracle Fusion Data Intelligence can connect to other pre-validated data sources such as Oracle Object Storage, cloud applications such as Google Analytics, and on-premises applications such as Oracle E-Business Suite.

You can view the usage of capacity for custom data that's loaded into Oracle Fusion Data Intelligence through the connectors in the Custom Data Usage dashboard available in the Common folder. The dashboard shows data loaded daily and monthly from each of the activated external data sources.
Custom Data Usage dashboard in the Common folder

Create a Data Connection Type

Connection Type specifies the source to which you're connecting. A connection type can have multiple connections.

You can create a custom data source type for any remote data connection.
  1. Sign in to your service.
  2. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.


  4. On the Manage Connections page, click Create and then click Connection Type.


  5. In the Create Connection Type dialog, enter the Name, Identifier, and Prefix for warehouse for the connection type.


  6. Click Add Property and enter the parameters for each property that defines the connection.
  7. When you've finished adding the connection properties, you can reorder them as needed.
  8. Click Save.
The new connection type is available on the Manage Connections page.

Edit a Data Connection Type

If the properties or parameters for a data connection type change, you can edit them.

  1. Sign in to your service.
  2. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.
  4. On the Manage Connections page, click Connection Types and then click or search for the connection type you want to edit.
    You can't edit or delete Oracle-managed connections.
  5. Click the Action button next to the connection type you want to change.
  6. In the dialog box for the connection type, edit the details for your connection type, and then click Save.

Delete a Data Connection Type

You can delete a data connection type if you don't need it anymore.

Note

After you delete a connection type, you can't create new data connections to it.
  1. Sign in to your service.
  2. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.
  4. On the Manage Connections page, click Connection Types and then select or search for the connection type you want to delete.
  5. Click the Action menu for the connection type and select Delete.
  6. In the Delete Connection Type dialog box, click Delete.

Create a Data Connection

You create a data connection to a data source to load data from that source into Oracle Fusion Data Intelligence. You can create a connection for any available connection type.

While creating the connection, the system populates the connection name based on the connection source. You can't change the name while creating the connection or edit it later.

  1. Sign in to your service.
  2. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.
  4. On the Manage Connections page, click Create and then click Connection.
  5. In Create Connection, select the applicable option in Usage Type, and then click the connection type you want to create.


  6. In the dialog box for the connection, enter the details for your connection in the fields.
  7. Click Save.
The new connection is available on the Connections page.

Test a Data Connection

After you create a data connection, you should test it to ensure it works properly.

  1. Sign in to your service.
  2. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.
  4. On the Manage Connections page, click Connections, then select or search for the connection you want to test.
  5. Click the Action menu for the connection and select Test Connection.
  6. On the Request History page, check the status of the request to test the connection.

Update a Data Connection

When you first make a data connection, or when you make changes, you need to initialize and refresh it.

  1. Sign in to the Oracle Cloud Infrastructure Console.
  2. In Oracle Cloud Infrastructure Console, click the Navigation menu icon in the top left corner.
  3. Click Analytics & AI. Under Analytics, click Data Intelligence.
  4. Navigate to your service instances page.
  5. On the Instances page, click the instance for which you want to update the service.
  6. Click Connections, then select or search for the connection you want to update.
  7. Click the Action menu for the connection and select Initialize/Refresh Connection.

Delete a Data Connection

You can delete a custom data connection if you don't need it anymore.

Note

Ensure that you delete the functional areas, data augmentations, and custom data configurations related to the data connection before deleting it. You can't update or load data from deleted data connections to the warehouse.
  1. Sign in to your service.
  2. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click Manage Connections under Configurations.
  4. On the Manage Connections page, click Connections, then select or search for the connection you want to delete.
  5. Click the Action menu for the connection and select Delete.
  6. In the Delete Connection dialog box, click Delete.

Connect With On-premises Sources

Connect with your on-premises applications to provide the background information for reports.

You can blend the additional data from these data sources with the prebuilt datasets to enhance business analysis.

Set up the Remote Agent to Load Data into Fusion Data Intelligence (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises systems such as E-Business Suite, PeopleSoft, and JD Edwards, load data from these on-premises systems into Oracle Fusion Data Intelligence, and then use the on-premises data to create data augmentations.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle Fusion Data Intelligence instance. You can extract and load the on-premises data into Oracle Fusion Data Intelligence only once in 24 hours.
Note

After configuring the remote agent on the Data Configuration page, wait a few minutes, refresh the remote agent page, and when you see the Agent Initialised message, proceed with other operations such as testing the connection to the remote agent, testing the connection to the remote source such as E-Business Suite, and refreshing the metadata. This enables you to run these jobs without timeout failures because the data pipeline has a default timeout of 15 minutes.

Ensure that Remote Agent is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Download the remote agent Docker image from here.
  2. Identify a host to deploy the remote agent.
    The host that you identify must meet these minimum system requirements for the basic configuration of a single source agent:
    • CPU: 4 (CORE/CPU)
    • Memory: 8 GB
    • Storage: 8 GB
    Note

    Ensure that the host is able to make a JDBC connection to the applicable database.
  3. Depending on your host, complete these steps:
    • For a Linux machine, do the following:
    1. Copy the Docker image to the host and load it using this script:
      docker load -i <docker image zip>
      # List the images
      docker images
    2. Create and run the Docker container using this script:
      docker run -d -p 9091:9091 --name remoteagent -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>

      If the remote agent user interface isn't accessible, then run this script:

      sudo docker run -d -p 9091:9091 --name remoteagent --network host -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <docker image Id>
      Note

      Ensure that the logs directory in /faw/logs/RemoteAgent/ has write permissions and the config folder in /faw/software/remoteagent/config/ is present in case you need to add custom properties.
    3. Verify that the container has started successfully using this script:
      docker ps
    4. Configure the extract service URL to connect using this information:
      1. Sign in to the remote agent user interface using https://<host>:9091/extractservice-remoteagent/index.html.
      2. Configure the extract service URL that the remote agent connects to and configure any outgoing proxies if required using the applicable extract service end points. You can form the extract service url based on your Oracle Fusion Data Intelligence URL by replacing ui/oax/ with the extract service context path. For example, if your product URL is https://myinstance.example.com/ui/oax/ then the extract service URL would be https://myinstance.example.com/extractservice.
    5. In the remote agent user interface, click Configure to configure the agent.
    6. Copy the configuration details from the text box or download the configuration details.
      You use it to set up the connection on the Data Configuration page in Oracle Fusion Data Intelligence.
    7. Optional: If you need to upgrade the remote agent in the Linux host, then use the following script:

      # Stop the remoteagent container
      sudo docker stop remoteagent

      # Remove the remoteagent container
      sudo docker rm remoteagent

      # Load the image from the tar ball
      docker load -i <docker image zip>

      # Get the image ID
      sudo docker images

      # Run the image
      sudo docker run -d -p 9091:9091 --name remoteagent -v /faw/software/remoteagent/config/:/faw/software/remoteagent/config/:Z -v /faw/logs/RemoteAgent/:/faw/logs/RemoteAgent:Z <imageid>
    • For a Windows machine, do the following:
    1. Load the Docker image using this script:
      docker load -i .\RemoteAgentFrameworkDocker_19.5.tar.gz

      Replace "19.5" with the latest Docker image version number in the script.

    2. Provide permissions to these directories:
      icacls "C:/faw/software/remoteagent/" /grant Everyone:F /t
      icacls "C:/faw/logs/RemoteAgent/" /grant Everyone:F /t
      Note

      • icacls: This is a command-line tool used to manage file and directory access control lists (ACLs).
      • "C:/faw/software/remoteagent/": This specifies the target directory where ACL changes will be applied.
      • /grant Everyone:F: This grants the "Everyone" group Full Control (F) permissions to the specified directory and, because of the /t switch, to all subdirectories and files recursively. The user needs write permissions, so Full Control (F) is granted. To grant user-specific permissions, replace "Everyone" with the username.
    3. Run the docker using this script:
      docker run -d -p 9091:9091 --name remoteagent -v C:/faw/software/remoteagent/config/:/faw/software/remoteagent/config/ -v C:/faw/logs/RemoteAgent/:/faw/logs/RemoteAgent <imageid>
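
    The extract service URL used when configuring the agent can be derived from your product URL by swapping the ui/oax/ context path for extractservice. A minimal sketch, assuming a placeholder instance name:

    ```shell
    # Derive the extract service URL from the product URL (placeholder instance name).
    PRODUCT_URL="https://myinstance.example.com/ui/oax/"
    EXTRACT_SERVICE_URL="${PRODUCT_URL%ui/oax/}extractservice"
    echo "$EXTRACT_SERVICE_URL"   # prints https://myinstance.example.com/extractservice
    ```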
  4. You can either use the default TLS certificate provided in the keystore with the remote agent or provide your own keystore and TLS certificate.
    To provide your own keystore and TLS certificate, complete these steps:
    1. If you don’t have a keystore, then generate one using the keytool CLI command such as:
      keytool -genkeypair -alias springboot -keyalg RSA -keysize 4096 -storetype PKCS12 -keystore springboot.p12 -validity 3650 -storepass password -ext SAN=dns:test.example.com
      Instructions on how to use the keytool CLI command can be found here.
      2. After generating the keystore, place it in the /faw/software/remoteagent/config directory of your local instance. In this directory, create a startup-config.properties file. This properties file contains the keystore information and credentials that the remote agent needs to fetch the TLS certificate.
    3. In the startup-config.properties file, add the following key-value properties:
      server.ssl.enabled=true
      server.ssl.key-store=</PATH/TO/KEYSTORE_FILE>
      server.ssl.key-store-password=<KEYSTORE_PASSWORD>
      server.ssl.key-store-type=<KEYSTORE_TYPE>
      server.ssl.key-alias=<KEYSTORE_ALIAS>
      server.ssl.key-password=<KEY_PASSWORD>
      The values for each key pair are as follows:
      • </PATH/TO/KEYSTORE_FILE> - File location of the keystore file
      • <KEYSTORE_PASSWORD> - Password specified for the keystore
      • <KEYSTORE_TYPE> - Type specified for the keystore, should be either JKS or PKCS12
      • <KEYSTORE_ALIAS> - Alias specified for the keystore
      • <KEY_PASSWORD> - Certificate password, NOT the password for the keystore
    4. After specifying all the required properties in the startup-config.properties file, restart the remote agent docker. The remote agent uses your TLS certificate instead of the default.
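
      For example, if you generated the keystore with the sample keytool command above (alias springboot, PKCS12 type, store password password) and placed it in the config directory, the startup-config.properties file could look like the following. These values are illustrative and must match your own keystore:

      ```properties
      server.ssl.enabled=true
      server.ssl.key-store=/faw/software/remoteagent/config/springboot.p12
      server.ssl.key-store-password=password
      server.ssl.key-store-type=PKCS12
      server.ssl.key-alias=springboot
      server.ssl.key-password=password
      ```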
  5. Configure the remote agent on the Data Configuration page in Oracle Fusion Data Intelligence using these instructions:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Configuration in Usage Type, and then select Remote Agent as the connection type.
      Remote Agent connection option

    5. In the Create Connection Remote Agent dialog, in Connection Name, you can modify the default name and verify that Remote is displayed in Connectivity Type.
      Create Connection Remote Agent dialog

    6. Enter an email address to receive notifications in Notification Email, and provide the Identifier and Host. In Public Key, click Upload File or Drop Above to fill in the details of the remote agent, and then click Save. You can upload the configuration details file that you downloaded, or use the configuration details that you copied, after configuring the remote agent.

Load Data from On-premises E-Business Suite into Fusion Data Intelligence (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises Oracle E-Business Suite system.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle Fusion Data Intelligence instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle Fusion Data Intelligence only once a day. Ensure that the user credentials you provide have access to the specific tables within the E-Business Suite schema that they need to extract data from; you provide the connection URL while creating the connection.

Ensure that Oracle E-Business Suite On-Prem is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Set up the remote agent to load data from your on-premises Oracle E-Business Suite system into Oracle Fusion Data Intelligence.
  2. Configure the remote agent and E-Business Suite data source on the Data Configuration page in Oracle Fusion Data Intelligence using these instructions:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type, and select Oracle E-Business Suite On-Prem as the connection type.
      E-Business Suite connection option

    5. In Create Connection for Oracle E-Business Suite On-Prem, select Remote as connectivity type.
      Create Connection for Oracle E-Business Suite On-Prem dialog

    6. In the Remote Agent field, select the remote agent connection that you created, for example, EBS-Remote Agent. Enter an email address to receive notifications in Notification Email, and provide the credentials in User Name and Password. In URL, enter the E-Business Suite connection in JDBC format, such as jdbc:oracle:thin:@<HOST>:<PORT>/<DB_NAME/SID_NAME>. In Offerings, you must select the Application Object Library and General Ledger offerings, along with any other E-Business Suite offerings that you want to load data from.
    7. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection. You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for E-Business Suite On-Prem unless you perform a metadata extract.
    8. Confirm that you see the Remote Agent and E-Business Suite connections on the Manage Connections page.
    9. Test both the connections by selecting the Test Connection option in Actions. You can check the statuses of all these requests on the Data Configuration Request History page.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the E-Business Suite data. Select the applicable E-Business Suite source tables. See Augment Your Data.
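
The JDBC URL entered while creating the connection follows the Oracle thin-driver format. As a sketch with placeholder host, port, and service name values:

```shell
# Compose an Oracle thin-driver JDBC URL from its parts (placeholder values).
jdbc_url() {
  printf 'jdbc:oracle:thin:@%s:%s/%s\n' "$1" "$2" "$3"
}

jdbc_url ebs-db.example.com 1521 EBSDB   # prints jdbc:oracle:thin:@ebs-db.example.com:1521/EBSDB
```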

Load Data from On-premises MySQL Database into Fusion Data Intelligence (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises MySQL database.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle Fusion Data Intelligence instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle Fusion Data Intelligence only once a day.

Ensure that MySQL On-Prem is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Set up the remote agent to load data from your on-premises MySQL database into Oracle Fusion Data Intelligence.
  2. Check the port number for your on-premises MySQL database and create a service request with server host and port details to enable network connectivity to the on-premises MySQL server.
  3. Specify the remote agent and configure the on-premises MySQL database on the Data Configuration page in Oracle Fusion Data Intelligence by following these steps:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type, and then select MySQL On-Prem as the connection type.
      MySQL On-Prem connection option

    5. In Connectivity Type, select Remote.
    6. In the Remote Agent field, select the remote agent connection that you created, for example, MySQL-Remote Agent. Enter an email address to receive notifications in Notification Email, and provide these details:
      • Host Name: Enter the host name of MySQL server such as 100.111.252.64
      • Port Number: Enter the port number where the server is listening such as 3306
      • Database: Enter the database name you need to connect to such as airportdb
      • Credentials to access the database in User Name and Password
      • Last Update Date Column Pattern: Enter a pattern that matches your last update date column names, such as "%r%a%o%%"
        Note

        If a column name matches the provided pattern, then Fusion Data Intelligence marks that column with isLastUpdateDate = true and uses it for the incremental extract. For example, if the provided pattern is "%mo%fie%te%", then the column name modifiedDate is marked as isLastUpdateDate = true.
      • Initial Extract Date Column Pattern: Enter a pattern that matches your initial extract date column names, such as "%e%art%new"
        Note

        If a column name matches the provided pattern, then Fusion Data Intelligence marks that column with isCreationDate = true and uses it for the initial extract date extraction. For example, if the provided pattern is "%cr%ted%te%", then the column name createdDate is marked as isCreationDate = true.

      Create Connection for MySQL On-Prem dialog
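
      The column-name patterns in the notes above use SQL LIKE-style % wildcards. The sketch below is an assumption modeled on those examples; it approximates the matching with shell globs and is not the connector's actual implementation:

      ```shell
      # Approximate SQL LIKE-style % matching with shell globs (illustrative assumption).
      like_match() {
        # translate each % wildcard into a shell glob *
        pat=$(printf '%s' "$1" | tr '%' '*')
        case "$2" in
          $pat) echo "match" ;;
          *)    echo "no match" ;;
        esac
      }

      like_match "%mo%fie%te%" "modifiedDate"   # match: column would be flagged isLastUpdateDate = true
      like_match "%cr%ted%te%" "createdDate"    # match: column would be flagged isCreationDate = true
      like_match "%cr%ted%te%" "modifiedDate"   # no match
      ```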

    7. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection. You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for on-premises MySQL database unless you perform a metadata extract.
    8. Confirm that you see the Remote Agent and on-premises MySQL database connections on the Manage Connections page.
    9. Test both the connections by selecting the Test Connection option in Actions. You can check the statuses of all these requests on the Data Configuration Request History page.
  4. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the on-premises MySQL database data. Select the applicable on-premises MySQL database source tables. See Augment Your Data.

Load Data from On-premises PeopleSoft into Fusion Data Intelligence (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises Oracle PeopleSoft system.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle Fusion Data Intelligence instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle Fusion Data Intelligence only once in 24 hours.
Ensure that Remote Agent and, depending on the functional module you want to connect to, the applicable feature are enabled on the Enable Features page prior to creating this connection:
  • Oracle PeopleSoft On-Prem - Campus Solutions
  • Oracle PeopleSoft On-Prem - Financials
  • Oracle PeopleSoft On-Prem - Human Resources
  • Oracle PeopleSoft On-Prem - Learning Management
See Make Preview Features Available.
  1. Set up the remote agent to load data from your on-premises Oracle PeopleSoft system into Oracle Fusion Data Intelligence.
  2. Configure the remote agent and PeopleSoft data source on the Data Configuration page in Oracle Fusion Data Intelligence using these instructions:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type, and then select the connection type based on the functional module that you want to connect to. For example, to connect to the "Financials" module, select Oracle PeopleSoft On-Prem - Financials as the connection type.
      Oracle PeopleSoft On-Prem - Financials connection option

    5. In the Create Connection for Oracle PeopleSoft On-Prem - Financials dialog, in Connectivity Type, verify that Remote is selected automatically.
      Create Connection for Oracle PeopleSoft On-Prem - Financials dialog

    6. In Remote Agent, select the remote agent connection that you created earlier, for example, Remote Agent.
    7. Enter an email address to receive notifications in Notification Email, provide credentials for your PeopleSoft source in User Name and Password, and the URL of your PeopleSoft source in URL.
    8. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection. You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for PeopleSoft unless you perform a metadata extract.
    9. Click Save.
    10. On the Manage Connections page, select Actions for the PeopleSoft connection and then select Test Connection. You can check the statuses of all these requests on the Data Configuration Request History page.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the PeopleSoft data. Select the applicable PeopleSoft source tables. See Augment Your Data.

Load Data from On-premises JD Edwards into Fusion Data Intelligence (Preview)

As a service administrator, you can use an extract service remote agent to connect to your on-premises JD Edwards system and use the JD Edwards data to create data augmentations.

After connecting to your on-premises system, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle Fusion Data Intelligence instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the on-premises data into Oracle Fusion Data Intelligence only once in 24 hours.

Ensure that Remote Agent and Oracle JD Edwards On-Prem are enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Set up the remote agent to load data from your on-premises Oracle JD Edwards system into Oracle Fusion Data Intelligence.
  2. Configure the remote agent and JD Edwards data source on the Data Configuration page in Oracle Fusion Data Intelligence using these instructions:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type, and then select Oracle JD Edwards On-Prem as the connection type.
      Oracle JD Edwards On-Prem connection option

    5. In Create Connection for Oracle JD Edwards On-Prem, in Connectivity Type, verify that Remote is selected automatically.
      Create Connection for Oracle JD Edwards On-Prem

    6. In Remote Agent, select the remote agent connection that you created earlier, for example, Remote Agent.
    7. Enter an email address to receive notifications in Notification Email, provide credentials for your JD Edwards source in User Name and Password, and the URL of your JD Edwards source in URL.
    8. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection. You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for JD Edwards unless you perform a metadata extract.
    9. Confirm that you see the Remote Agent and JD Edwards connections on the Manage Connections page.
    10. Test both the connections by selecting the Test Connection option in Actions. You can check the statuses of all these requests on the Data Configuration Request History page.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the JD Edwards data. Select the applicable JD Edwards source tables. See Augment Your Data.

Load Data from SQL Server into Oracle Fusion Data Intelligence (Preview)

As a service administrator, you can use the Oracle Fusion Data Intelligence extract service to acquire data from SQL Server and use it to create data augmentations.

Ensure that SQL Server is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Set up the remote agent to load data from your SQL Server into Oracle Fusion Data Intelligence.
  2. In Oracle Fusion Data Intelligence, create the SQL Server data connection using these instructions:
    1. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type, and then select SQL Server as the connection type.

      SQL Server connection option

    5. In the dialog for the SQL Server connection, enter these details and click Save:
      • Connectivity Type: Select Remote.
      • Remote Agent: Select the remote agent that you had set up to load data from your SQL Server.
      • Notification Email: Enter an email address to receive notifications.
      • SQL Server Name: Enter the SQL server name.
      • Port Number: Enter the port number on which your SQL server is available.
      • Database Name: Enter the database name in your SQL Server instance.
      • Schema Name: Enter the name of the schema for the dataset you want to load to run analytics on.
      • User Name and Password: Enter the credentials for your SQL Server instance.
      • Initial Extract Date Column Pattern: Enter the date format used in your initial extract date column, such as MM/dd/yyyy; a sample date is 1/23/1998.
      • Last Update Date Column Pattern: Enter the pattern for the last update date column, which shows when the last update was done in your SQL Server database.

      Create Connection for SQL Server dialog

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for SQL Server unless you perform a metadata extract.
    7. Click Save.
  3. On the Manage Connections page, select Actions for the SQL Server connection and then select Test Connection.
  4. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the SQL Server data. Select the applicable SQL Server source tables. See Augment Your Data.

Connect with Cloud File Storage Sources

Connect with your file storage-based cloud sources to provide the background information for reports.

You can blend the additional data from these data sources with the prebuilt datasets to enhance business analysis. The file-based connectors support only UTF-8 encoding for data files that you upload.

About OpenCSV Standards

The CSV parser in the extract service for the file-based extractors uses OpenCSV. The CSV files that are processed by the extract service must be compliant with the OpenCSV standards.

See Opencsv File Standards. In addition to the OpenCSV parser, the extract service supports files that are compliant with the RFC 4180 specification. The RFC 4180 CSV parser enables you to ingest single-line and multi-line data within your .csv files, and supports data records with up to 99 line breaks. For more information on the RFC 4180 specification, see Common Format and MIME Type for Comma-Separated Values (CSV) Files.
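The multi-line support described above can be seen with any RFC 4180-compliant parser. Here Python's csv module stands in for the extract service's parser (illustrative only); a quoted field containing a line break comes back as a single record:

```python
import csv
import io

# An RFC 4180-style payload with an embedded line break inside a
# quoted field; the parser returns it as one two-column record.
data = 'ID,NOTE\r\n1,"line one\nline two"\r\n'
rows = list(csv.reader(io.StringIO(data)))
# rows[1] holds the ID and the full two-line NOTE value
```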

Keep in mind the following:
  • While using special characters:
    • For strings without special characters, quotes are optional.
    • For strings with special characters, quotes are mandatory. For example, if a string contains a comma, then you must enclose the string in quotes, such as "Abc, 123".
    • Escapes (the backslash character) are optional for ordinary strings.
    • Literal backslash characters must always be escaped. For example, if your data contains a backslash, use the following format: "Double backslash ( \\ ) abc".
    • To include quotes inside a quoted string, escape them with a backslash: "Asd \" asd".
  • The OpenCSV parser allows you to select one of these characters as a delimiter:
    • Comma (,)
    • Semicolon (;)
    • Pipe (|)
    • Tab
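The quoting rules above can be captured in a small helper. This is an illustrative sketch, not part of the product: it formats one field per the rules described, escaping backslashes and embedded quotes with a backslash and quoting the field when it contains the delimiter or other special characters:

```python
def format_csv_field(value: str, delimiter: str = ",") -> str:
    """Format a field per the OpenCSV rules described above (sketch).

    Backslashes are always escaped, embedded quotes are
    backslash-escaped, and quotes are added when the field contains
    the delimiter, an escape, or a line break.
    """
    escaped = value.replace("\\", "\\\\").replace('"', '\\"')
    needs_quotes = escaped != value or delimiter in value or "\n" in value
    return f'"{escaped}"' if needs_quotes else escaped
```

For example, `format_csv_field("Abc, 123")` yields the quoted form `"Abc, 123"` from the text above, while a plain string is returned unchanged.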

About Date and Timestamp Formatting for CSV File-based Extractors

Extractors such as Secure FTP (SFTP), Amazon Simple Storage Service (AWS S3), and Oracle Object Storage Service use CSV data files that have date and timestamp fields.

For the CSV file-based extractors, use the format examples to provide the values in the CSV Date Format, and CSV Timestamp Format fields while entering the source connection details.
Note

Ensure that the date and timestamp formats for the data files match the date and timestamp formats in your source; for example, if you've used MM/dd/yyyy and MM/dd/yyyy hh:mm:ss in your source, then you must specify the same formats while creating the applicable data connections.
Examples
Example                       Pattern
1/23/1998                     MM/dd/yyyy
1/23/1998 12:00:20            MM/dd/yyyy hh:mm:ss
12:08 PM                      h:mm a
01-Jan-1998                   dd-MMM-yyyy
2001-07-04T12:08:56.235-0700  yyyy-MM-dd'T'HH:mm:ss.SSSZ
The guidelines to define the format are:
Letter  Meaning
M       Month
d       Day
y       Year
h       Hour (0-12)
H       Hour (0-23)
m       Minute
s       Second
S       Millisecond
a       AM/PM
Z       Time zone
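The patterns above use Java SimpleDateFormat-style letters. As a sanity check before entering a pattern in CSV Date Format or CSV Timestamp Format, you can verify a sample value from your data file against an equivalent Python strptime format; the mapping below covers only the example patterns and is illustrative:

```python
from datetime import datetime

# Java-style pattern -> equivalent Python strptime format.
# Assumption: the connector patterns follow the SimpleDateFormat-style
# letters listed in the table above.
java_to_python = {
    "MM/dd/yyyy": "%m/%d/%Y",
    "MM/dd/yyyy hh:mm:ss": "%m/%d/%Y %I:%M:%S",
    "h:mm a": "%I:%M %p",
    "dd-MMM-yyyy": "%d-%b-%Y",
    "yyyy-MM-dd'T'HH:mm:ss.SSSZ": "%Y-%m-%dT%H:%M:%S.%f%z",
}

def matches(sample: str, java_pattern: str) -> bool:
    """Return True if the sample value parses under the pattern."""
    try:
        datetime.strptime(sample, java_to_python[java_pattern])
        return True
    except ValueError:
        return False
```

For example, `matches("1/23/1998", "MM/dd/yyyy")` is True, while a day-first value such as "23/01/1998" fails the same pattern.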

Load Data from Amazon Simple Storage Service into Oracle Fusion Data Intelligence (Preview)

As a service administrator, you can use the Oracle Fusion Data Intelligence extract service to acquire data from Amazon Simple Storage Service (AWS S3) and use it to create data augmentations.

Ensure that AWS S3 is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle Fusion Data Intelligence, create the AWS S3 data connection using these instructions:
    1. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type, and then select AWS S3 as the connection type.

      AWS S3 connection option

    5. In the dialog for the AWS S3 connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and provide the applicable details of your AWS S3 instance.

      Create Connection for AWS S3 dialog

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for AWS S3 unless you perform a metadata extract.
    7. Click Save.
  2. On the Manage Connections page, select Actions for the AWS S3 connection and then select Test Connection.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the AWS S3 data. Select the applicable AWS S3 source tables. See Augment Your Data.

Load Data from Oracle Object Storage into Fusion Data Intelligence

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from Oracle Object Storage Service and use it to create data augmentations.

The recommended approach is to create one augmentation from one source table after acquiring data from Oracle Object Storage Service. After the augmentation completes, Fusion Data Intelligence renames the source table; if you create more than one augmentation from the same source, the other augmentations may fail with a message that the source file wasn't found.

  1. Store the following details in a text file to use while creating the connection to Oracle Object Storage Service in Fusion Data Intelligence:
    1. In Oracle Object Storage Service, create the Remote Host Extract Files directory as the base folder in which you must place all your data files. Note down the name of this directory. See the "To create a folder or subfolder" section in Using the Console.
    2. Obtain the URL of the Oracle Object Storage Service by signing into the Oracle Cloud Infrastructure Console and navigating to the bucket to get the details of the region, namespace, and bucket name. For example, the URL must be in the https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<name of the bucket> format. See the "To view bucket details" section in Using the Console.
    3. Obtain a user’s OCID by navigating in the Oracle Cloud Infrastructure Console to Identity & Security, and then Users. On the Users page, search for a user who has access to the bucket used in the connector and copy the OCID. Obtain the tenancy ID by clicking your profile icon and then Tenancy in the Oracle Cloud Infrastructure Console. Under Tenancy information, copy the OCID. See Where to Get the Tenancy's OCID and User's OCID.
    4. Obtain the fingerprint for a user from the Oracle Cloud Infrastructure Console. Navigate to API Keys under Resources on the user page, and then click Add API Keys. In the Add API Keys dialog, ensure that Generate API Key Pair is selected. Download the private and public keys using the Download Private Key and Download Public Key options. You must copy the entire text of the private key along with the comments before and after the actual key. These comments could be as simple as "---------------Begin RSA Private Key --------" and "-----------End of RSA Private Key----------". Don't copy only the alphanumeric key without the header and footer comments. In the Add API Keys dialog, select Choose Public Key File to upload your file, or Paste Public Key if you prefer to paste it into a text box, and then click Add. Copy the fingerprint that you see after you upload the public key in the Console. It looks something like this: 12:34:56:78:90:ab:cd:ef:12:34:56:78:90:ab:cd:ef.
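The URL noted down in step 1 follows a fixed format, so it can be assembled programmatically as a double-check. This is a sketch; the region, namespace, and bucket values in the test are hypothetical:

```python
# Build the Oracle Object Storage Service URL in the documented format:
# https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>
def object_storage_url(region: str, namespace: str, bucket: str) -> str:
    return (f"https://objectstorage.{region}.oraclecloud.com"
            f"/n/{namespace}/b/{bucket}")
```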
  2. In Fusion Data Intelligence, create the Oracle Object Storage connection using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type, and then select Oracle Object Storage Service as the connection type.
      Oracle Object Storage Service connection option
    5. In the dialog for the Oracle Object Storage Service connection, select Standard in Connectivity Type and enter these details:
      • Connection Name: Object Storage
      • Connection Type: Standard
      • Notification Email: An email address to receive notifications
      • Remote Host Extract Files Directory: Name of the base folder in which you must place all your data files in Oracle Object Storage Service
      • URL: URL of the Oracle Object Storage Service that you noted down in a text file
      • User ID: OCID of a user that has access to the applicable bucket in Oracle Object Storage Service
      • Finger Print: The fingerprint that you saw and copied after you uploaded the public key in the Console. It looks something like this: 12:34:56:78:90:ab:cd:ef:12:34:56:78:90:ab:cd:ef
      • Tenant ID: Tenancy OCID from the Oracle Cloud Infrastructure Console that you noted down in the text file
      • Private Key: Paste the private key contents that you previously downloaded
      • File Type: csv
      • CSV Delimiter: Delimiter for the data files
      • Date format for the data files must match the date format in your Oracle Object Storage Service source; for example, if you've used MM/dd/yyyy (01/23/1998) in your source, then you must specify the same format in CSV Date Format. See About Date and Timestamp Formatting for CSV File-based Extractors.
      • Timestamp format for the data files must match the timestamp format in your Oracle Object Storage Service source; for example, if you've used MM/dd/yyyy hh:mm:ss (01/23/1998 12:00:20) in your source, then you must specify the same format in CSV Timestamp Format.
      Create Connection for Oracle Object Storage Service dialog
    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for the Oracle Object Storage Service unless you perform a metadata extract.
    7. Click Save.
  3. In Oracle Object Storage Service:
    1. Create the folder structure in the Bucket using these guidelines:
      Base folder
      • The base folder in the bucket must match with the details provided in the connection.
      • Inside the base folder, ensure that you place each file in its own folder.
      • Ensure that the prefix of the files in the target folder matches the Data_Store_Name (which is the same as the folder name) exactly.

      See the "To create a folder or subfolder" section in Using the Console.

    2. Inside the base folder, create the metadata file for the Data Store List. This file lists the supported data stores. Each data store is a folder that contains the actual file used in data augmentation, for example, ASSETS. Ensure that the file name and folder name match and that there aren't any special characters (including spaces) in the data store, folder, or file names.
      Base folder structure
    3. Create the metadata file for each data file under the data store folder using these guidelines:

      The META_DATASTORES.csv must have these columns:

      • DATA_STORE_NAME - A mandatory column to identify the data store name.
      • DATA_STORE_LABEL - A non-mandatory column that identifies the description of the data store.

      Each folder must have:

      • A data file that contains the actual data that gets loaded into Fusion Data Intelligence. This file name must be prefixed with the data store name.
      • A metadata file that lists the columns and contains all the column information for the data. This file name must be prefixed with META_DATASTORES_<DATA_STORE_NAME>_COL.
        • For the columns in this metadata, ensure the following:
        • If column name is ABC, then metadata can be ABC or “ABC” - the double quotes are ignored.
        • If column name is “ABC”, then metadata must be “”ABC”” – the first double quotes are ignored.

      Example

      In the image, the folder name is ACTIVITY_TYPES. Hence, the data store name is ACTIVITY_TYPES. You can confirm this from the META_DATASTORES.csv file. In this example, the file is named ACTIVITY_TYPES.xlsx or ACTIVITY_TYPES.csv. The metadata file must be META_DATASTORES_ACTIVITY_TYPES_COL.csv.

      Sample folder and metadata file

      The META_DATASTORES_ACTIVITY_TYPES_COL.csv has these columns:
      • DATA_STORE_NAME - This is a mandatory column.
      • COLUMN_NAME - This is a mandatory column.
      • COLUMN_LABEL - This is a non-mandatory column.
      • DATA_TYPE – This is a mandatory column.
      • WIDTH – This column identifies the string length.
      • PRECISION - This column value applies to the Numeric data type.
      • SCALE - This column value applies to the Numeric data type.
      • KEY_SEQUENCE - This is a mandatory column that identifies the Primary Key definition. If you’re using the composite primary key, then use column order numbers as values.
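The folder and metadata layout described above can be generated with a short script before you upload the files to the bucket. This is a sketch using hypothetical names; the ACTIVITY_TYPES store and its single column follow the example, so adapt them to your own data stores:

```python
import csv
import os

# Sketch: build the base folder layout described above for a single,
# hypothetical ACTIVITY_TYPES data store (illustrative names).
base = "base_folder"
store = "ACTIVITY_TYPES"
os.makedirs(os.path.join(base, store), exist_ok=True)

# Data store list (META_DATASTORES.csv) at the base-folder level.
with open(os.path.join(base, "META_DATASTORES.csv"), "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["DATA_STORE_NAME", "DATA_STORE_LABEL"])
    writer.writerow([store, "Activity types"])

# Column metadata file inside the data store folder, prefixed with
# META_DATASTORES_<DATA_STORE_NAME>_COL as required.
col_file = os.path.join(base, store, f"META_DATASTORES_{store}_COL.csv")
with open(col_file, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["DATA_STORE_NAME", "COLUMN_NAME", "COLUMN_LABEL",
                     "DATA_TYPE", "WIDTH", "PRECISION", "SCALE",
                     "KEY_SEQUENCE"])
    writer.writerow([store, "ACTIVITY_TYPE_ID", "Activity type ID",
                     "NUMBER", "", "10", "0", "1"])

# Data file whose name is prefixed with the data store name.
with open(os.path.join(base, store, f"{store}.csv"), "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["ACTIVITY_TYPE_ID"])
    writer.writerow(["100"])
```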
  4. In Fusion Data Intelligence, on the Manage Connections page, select Actions for the Oracle Object Storage Service connection and then select Test Connection.
  5. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the data from the Oracle Object Storage Service. Select the applicable source tables from the Oracle Object Storage Service data. See Augment Your Data.

Load Data from a Secure FTP Source into Fusion Data Intelligence (Preview)

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from a secure FTP source (SFTP) and use it to create data augmentations.

Ensure that SFTP is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Check the port number for your secure FTP database and create a service request to get the port opened.
    Note

    You must provide the IP address of the SFTP server; it must be a public IP address and can't be a hostname, a fully qualified domain name (FQDN), or a class A private IP address.
  2. In Fusion Data Intelligence, create the SFTP data connection using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type, and then select SFTP as the connection type.

      SFTP connection option

    5. In the dialog for the SFTP connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and provide applicable values in Remote Host, User Name, Private Key, Remote Host Extract Files Directory, File Type, CSV Delimiter, CSV Date Format, and CSV Timestamp Format. In Lock Metadata, specify whether you want to turn off the metadata extracts after the first refresh if the metadata isn't going to change. This option is useful if the flag to derive metadata from data files using the metadata utility is turned on in your source. In Remote Host, ensure that you specify an SFTP server that supports FIPS-compliant key exchange algorithms.
      Ensure the following:
      • The table name and file name in your SFTP source must be the same.
      • The private key you provide must be in the valid OpenSSH format with a minimum key length of 2048 bits.
      • Date format for the data files must match the date format in your SFTP source; for example, if you've used MM/dd/yyyy (01/23/1998) in your source, then you must specify the same format in CSV Date Format. See About Date and Timestamp Formatting for CSV File-based Extractors.
      • Timestamp format for the data files must match the timestamp format in your SFTP source; for example, if you've used MM/dd/yyyy hh:mm:ss (01/23/1998 12:00:20) in your source, then you must specify the same format in CSV Timestamp Format.

      Create Connection for SFTP dialog
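Before pasting the private key into the dialog, a quick header check can catch keys saved in the wrong format. This is an illustrative sketch only; it verifies the PEM/OpenSSH-style header, not the key length or cryptographic validity:

```python
# Minimal sanity check for a private key pasted into the SFTP dialog:
# the text must start with a "-----BEGIN ... PRIVATE KEY-----" header.
# This does not validate the key itself (sketch only).
def looks_like_private_key(key_text: str) -> bool:
    lines = key_text.strip().splitlines()
    return bool(lines) and lines[0].startswith("-----BEGIN") \
        and "PRIVATE KEY" in lines[0]
```

A public key line such as `ssh-rsa AAAA...` fails the check, which is the most common paste mistake.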

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for SFTP unless you perform a metadata extract.
    7. Click Save.
  3. In your SFTP source:
    1. Create the folder structure in the Bucket using these guidelines:
      Base folder
      • The base folder in the bucket must match with the details provided in the connection.
      • Inside the base folder, ensure that you place each file in its own folder.
      • Ensure that the prefix of the files in the target folder matches the Data_Store_Name (which is the same as the folder name) exactly.

      See the "To create a folder or subfolder" section in Using the Console.

    2. Inside the base folder, create the metadata file for the Data Store List. This file lists the supported data stores. Each data store is a folder that contains the actual file used in data augmentation, for example, ASSETS. Ensure that the file name and folder name match and that there aren't any special characters (including spaces) in the data store, folder, or file names.
      Base folder structure
    3. Create the metadata file for each data file under the data store folder using these guidelines:

      The META_DATASTORES.csv must have these columns:

      • DATA_STORE_NAME - A mandatory column to identify the data store name.
      • DATA_STORE_LABEL - A non-mandatory column that identifies the description of the data store.

      Each folder must have:

      • A data file that contains the actual data that gets loaded into Fusion Data Intelligence. This file name must be prefixed with the data store name.
      • A metadata file that lists the columns and contains all the column information for the data. This file name must be prefixed with META_DATASTORES_<DATA_STORE_NAME>_COL.
        • For the columns in this metadata, ensure the following:
        • If column name is ABC, then metadata can be ABC or “ABC” - the double quotes are ignored.
        • If column name is “ABC”, then metadata must be “”ABC”” – the first double quotes are ignored.

      Example

      In the image, the folder name is ACTIVITY_TYPES. Hence, the data store name is ACTIVITY_TYPES. You can confirm this from the META_DATASTORES.csv file. In this example, the file is named ACTIVITY_TYPES.xlsx or ACTIVITY_TYPES.csv. The metadata file must be META_DATASTORES_ACTIVITY_TYPES_COL.csv.

      Sample folder and metadata file

      The META_DATASTORES_ACTIVITY_TYPES_COL.csv has these columns:
      • DATA_STORE_NAME - This is a mandatory column.
      • COLUMN_NAME - This is a mandatory column.
      • COLUMN_LABEL - This is a non-mandatory column.
      • DATA_TYPE – This is a mandatory column.
      • WIDTH – This column identifies the string length.
      • PRECISION - This column value applies to the Numeric data type.
      • SCALE - This column value applies to the Numeric data type.
      • KEY_SEQUENCE - This is a mandatory column that identifies the Primary Key definition. If you’re using the composite primary key, then use column order numbers as values.
  4. On the Manage Connections page, select Actions for the SFTP connection and then select Test Connection.
  5. After the connections are successfully established, navigate to the Data Augmentation tile on the Data Configuration page, select the connection in Data Source, and create a data augmentation using the SFTP data. Select the applicable secure FTP source tables. See Augment Your Data.

Load Data from Azure Storage into Oracle Fusion Data Intelligence (Preview)

As a service administrator, you can use the Oracle Fusion Data Intelligence extract service to acquire data from Azure Storage and use it to create data augmentations.

Ensure that Azure Storage is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle Fusion Data Intelligence, create the Azure Storage data connection using these instructions:
    1. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type, and then select Azure Storage as the connection type.
    5. In the dialog for the Azure Storage connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, enter these Azure Storage instance details, and click Save:
      • Connectivity Type: Select Standard.
      • Notification Email: Enter an email address to receive notifications.
      • File Type: Select CSV.
      • Azure Blob Connection String: Enter your storage account's connection string.
      • Container: Specify the Azure container.
      • Remote Host Extract Files Directory: Name of the base folder in which you must place all your data files in Azure Storage.
      • CSV Delimiter: Delimiter for the data files.
      • CSV Date Format: Date format for the data files must match the date format in your Azure Storage source; for example, if you've used MM/dd/yyyy (01/23/1998) in your source, then you must specify the same format. See About Date and Timestamp Formatting for CSV File-based Extractors.

      Azure Storage connection dialog
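The Azure Blob connection string follows a documented key=value;... shape (DefaultEndpointsProtocol, AccountName, AccountKey, EndpointSuffix). This sketch with placeholder values parses one so you can double-check the account name before saving the connection:

```python
# Hypothetical connection string; replace with the value from your
# storage account (Access keys page in the Azure portal).
conn = ("DefaultEndpointsProtocol=https;AccountName=myaccount;"
        "AccountKey=abc123==;EndpointSuffix=core.windows.net")

# Split on ';' and then on the first '=' only, because AccountKey is
# base64 and may itself end with '=' characters.
parts = dict(p.split("=", 1) for p in conn.split(";") if p)
```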

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for Azure Storage unless you perform a metadata extract.
    7. Click Save.
  2. On the Manage Connections page, select Actions for the Azure Storage connection and then select Test Connection.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Azure Storage data. Select the applicable Azure Storage source tables. See Augment Your Data.

Connect With Cloud Sources

Connect with your cloud applications to provide the background information for reports.

You can blend the additional data from these data sources with the prebuilt datasets to enhance business analysis. To know about the date and timestamp formatting for the CSV file-based extractors, see About Date and Timestamp Formatting for CSV File-based Extractors.

Load Data from Azure SQL into Oracle Fusion Data Intelligence (Preview)

As a service administrator, you can use the Oracle Fusion Data Intelligence extract service to acquire data from Azure SQL and use it to create data augmentations.

Ensure that Azure SQL is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle Fusion Data Intelligence, create the Azure SQL data connection using these instructions:
    1. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select Azure SQL as the connection type.

      Azure SQL connection option

    5. In the dialog for the Azure SQL connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, enter your Azure SQL instance details, and click Save.

      Create Connection for Azure SQL dialog

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for Azure SQL unless you perform a metadata extract.
  2. On the Manage Connections page, select Actions for the Azure SQL connection and then select Test Connection.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Azure SQL data. Select the applicable Azure SQL source tables. See Augment Your Data.

Connect with Your Oracle Eloqua Data Source (Preview)

If you’ve subscribed for Oracle Fusion CX Analytics and want to load data from your Oracle Eloqua source into Fusion Data Intelligence, then create a connection using the Eloqua connection type.

The Oracle Eloqua data that you load into Fusion Data Intelligence enables you to augment the data in your warehouse and create varied customer experience-related analytics. Ensure that Oracle Eloqua is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
  2. On the Data Configuration page, click Manage Connections under Global Configurations.
  3. On the Manage Connections page, click Create and then click Connection.
  4. In Create Connection, select Data Extraction in Usage Type, and then select Oracle Eloqua as the connection type.

    Oracle Eloqua connection type

  5. In the dialog for the Eloqua connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and enter the credentials to connect with the Eloqua source in User Name and Password.
  6. In URL, enter the URL for your Eloqua server in this sample format: https://<your eloqua server>/api/odata.

    Description of fawag_eloqua_create_connection.png follows
  7. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
    You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
    Note

    You can’t create augmentations for Eloqua unless you perform a metadata extract.
  8. Click Save.
  9. On the Manage Connections page, select Actions for the Eloqua connection and then select Test Connection.
  10. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Eloqua data. Select the applicable Eloqua source tables. See Augment Your Data.
Load Data from Your Oracle Eloqua Data Source (Preview)

Create a data pipeline for the Marketing Campaign Analytics functional area to load data from your Oracle Eloqua source into Oracle Fusion Data Intelligence.

  1. Sign in to your service.
  2. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
  3. On the Data Configuration page, click your service. For example, under Applications, click Customer Experience.
  4. On the Customer Experience page, click Create.
  5. In the wizard, select Customer Experience Marketing Analytics in Offering and Marketing Campaign Analytics in Functional Area to transfer the data to the warehouse, and then click Next.

    Description of fawag_load_eloqua_data.png follows

  6. Review the parameters and click one of the options:
    • Cancel: To cancel the data pipeline for the functional area.
    • Save: To save the data pipeline for the functional area but not activate it.
    • Activate: To schedule when to run the data pipeline for the functional area. See Activate a Data Pipeline for a Functional Area.

Load Data from Enterprise Data Management Cloud into Fusion Data Intelligence (Preview)

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from the Enterprise Data Management Cloud instance and use it to create data augmentations.

The extracts created in the Enterprise Data Management Cloud service need to be public; you must promote your private extracts to public. If the metadata refresh fails for a private extract, review the documentation and the error messages. This connector supports only the CSV data format.

Ensure that Oracle Enterprise Data Management Cloud is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Fusion Data Intelligence, create the Enterprise Data Management Cloud data connection using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select Oracle Enterprise Data Management Cloud as the connection type.

      Oracle Enterprise Data Management Cloud connection option

    5. In the dialog for the Enterprise Data Management Cloud connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, the host name of the Oracle Enterprise Data Management server in Host Name, and the credentials to access the Enterprise Data Management Cloud instance in User Name and Password. In Extract Configuration, enter the list of extracts using only a comma as the delimiter. The extract configuration must be a single-line JSON without formatting for the quotation marks (" instead of \"), for example:
      [{"applicationName": "Account Reconciliation", "dimensionName": "Profiles","extractName": "Profiles"}]

      Create Connection for Oracle Enterprise Data Management Cloud dialog
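The single-line JSON requirement above is easy to satisfy by serializing the extract list programmatically. This sketch reproduces the documented example; json.dumps emits plain double quotes (no \" escaping) on one line:

```python
import json

# Extract list from the documented example; add one object per extract.
extracts = [
    {"applicationName": "Account Reconciliation",
     "dimensionName": "Profiles",
     "extractName": "Profiles"},
]

# Compact separators keep the whole configuration on a single line,
# ready to paste into the Extract Configuration field.
config = json.dumps(extracts, separators=(",", ":"))
```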

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for Enterprise Data Management Cloud unless you perform a metadata extract.
    7. Click Save.
  2. On the Manage Connections page, select Actions for the Enterprise Data Management Cloud connection and then select Test Connection.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Enterprise Data Management Cloud data. Select the applicable Enterprise Data Management Cloud source tables. See Augment Your Data.

Load Data from Enterprise Performance Management into Fusion Data Intelligence (Preview)

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from the Enterprise Performance Management (EPM) SaaS instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

You can connect to these functional modules of EPM:
  • Financial Close and Consolidation (FCCS)
  • Planning and Budgeting (PBCS)
  • Profitability and Cost Management (PCMCS)
If you have pivot tables in your source, then the metadata extract supports pivot table metadata extraction for the EPM connectors. You can only manually extract the incremental data because, for incremental extraction, you must update the results file in EPM before starting the next extraction for the updated data. Update the results file by running the integration using Data Exchange, and then access the new results file from the EPM connector in Fusion Data Intelligence. Ensure that you enter all the fields in accordance with your EPM nomenclature and specifics. For example, if you have defined Period in your EPM job as {June-23}, then include exactly the same value in the Create Connection dialog for the EPM source.
Note

The EPM connectors display the default datatype and size; you must edit these values as applicable while creating data augmentations.
Depending on the functional module you want to connect to, ensure that the applicable feature is enabled on the Enable Features page prior to creating this connection:
  • Oracle EPM - Financial Close and Consolidation
  • Oracle EPM - Planning and Budgeting
  • Oracle EPM - Profitability and Cost Management
See Make Preview Features Available.
  1. In EPM, create an integration, write out the results into a file whose name you provide in Download File Name, and then specify that same file name in List of Data Files while creating the connection to EPM in Fusion Data Intelligence to extract the data.
    Create an integration in EPM
  2. In EPM, when exporting data, use one of the following modes:
    • Standard mode: This built-in workflow process helps manage the flow of data in and out of EPM. For Standard mode, you specify the period when you run the integration.
    • Quick mode: This process bypasses most of the steps in the workflow and provides a significant improvement in the extract process. When using Quick mode, you specify the period in the integration definition filter, and this may include a substitution variable. When using the API, the same rules apply, except that when you need to specify a period, you can’t use a substitution variable to pull from Essbase; you must include the period or year on extract.

    See Exporting Data.

  3. In Fusion Data Intelligence, create the EPM data connection using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select the connection type based on the functional module that you want to connect to. For example, to connect to the "Financial Close and Consolidation (FCCS)" module, select Oracle EPM - Financial Close and Consolidation as the connection type.

      Oracle EPM - Financial Close and Consolidation connection option

    5. In Create Connection for the EPM source, enter these details and click Save:
      • Connectivity Type: Select Standard.
      • Notification Email: Enter an email address to receive notifications.
      • User Name and Password: Enter the credentials for your EPM source. Prefix the user name with the domain of your EPM source, such as domain.username.
      • URL: Enter the specific URL of your EPM source using the https://<DOMAIN_NAME>region.ocs.oraclecloud.com format. For example, https://epm7-test-a123456.epm.us6.oraclecloud.com. Ensure that you use the https:// protocol to avoid a timeout error.
      • List of Data Files: Specify the file name that you had entered in Download File Name while creating an integration in EPM.
      • List of Job Names: Provide the EPM integration job names. If you have multiple job names, ensure that they’re comma-separated. For example, FAWEPMTestingV2, FAWEXPORT,FAW Job Testing.
      • List of Period Names: Provide the period names for the corresponding job names. Ensure that multiple period names are comma-separated. You can leave this blank, in which case the connector uses the global point of view. Some examples of period names:
        • {Jan-22},{Oct-22}
        • {Nov-22}{Dec-22},{Jan-23}{Feb-23}
        • {Jan#FY20}{Mar#FY20},{Apr#FY20}{May#FY20}

        For elaborate examples, refer to Running Integrations.

      • Extract Configuration: Provide the EPM job name such as "Job 1" and period name such as "Quarter 1" corresponding to the given job name to extract data for specific periods. This enables you to run multiple EPM jobs.

      Create EPM Connection
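Based on the examples above, the List of Period Names value appears to be a comma-separated list of entries, each holding one or more {Period} tokens. A minimal parser sketch under that assumption (this is an illustrative helper, not an official EPM or Fusion Data Intelligence utility):

```python
import re

def parse_period_names(value):
    """Split a List of Period Names value into per-job groups of period tokens.

    Assumes each comma-separated entry holds one or more {Period} tokens,
    matching the examples shown in the connection dialog.
    """
    entries = [entry.strip() for entry in value.split(",") if entry.strip()]
    return [re.findall(r"\{([^}]+)\}", entry) for entry in entries]

# Each comma-separated entry corresponds to one job name.
print(parse_period_names("{Nov-22}{Dec-22},{Jan-23}{Feb-23}"))
# → [['Nov-22', 'Dec-22'], ['Jan-23', 'Feb-23']]
```

This is only a reading of the documented examples; always enter the period names exactly as defined in your EPM job.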

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for EPM unless you perform a metadata extract.
    7. Click Save.
  4. On the Manage Connections page, select Actions for the EPM connection and then select Test Connection.
  5. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the EPM data. Select the applicable EPM source tables. See Augment Your Data.

Load Data from EPM Export Data Instance into Fusion Data Intelligence (Preview)

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from EPM Export Data instance and use it to create data augmentations.

You can connect to these functional modules of EPM:
  • Financial Close and Consolidation (FCCS)
  • Planning and Budgeting (PBCS)
  • Profitability and Cost Management (PCMCS)
If you have a pivot table in your source, the metadata extract supports pivot table metadata extraction for the EPM connectors. You can extract incremental data only manually because, for incremental extraction, you must update the results file in EPM before starting the next extraction for the updated data. Update the results file by running the integration using Data Exchange, and then access the new results file from the EPM connector in Fusion Data Intelligence. Ensure that you enter all the fields in accordance with your EPM nomenclature and specifics. For example, if you have defined Period in your EPM job as {June-23}, then enter exactly the same value in the Create Connection for the EPM source dialog.
Note

The EPM connectors display the default datatype and size; you must edit these values as applicable while creating data augmentations.
Depending on the functional module you want to connect to, ensure that the applicable feature is enabled on the Enable Features page prior to creating this connection:
  • EPM Financial Close and Consolidation Data Export
  • EPM Planning and Budgeting Data Export
  • EPM Profitability and Cost Management Data Export
See Make Preview Features Available.
  1. In EPM, create a data exchange job, write out the results into a file whose name you provide in Download File Name, and then specify that same file name in List of Data Files while creating the connection to EPM in Fusion Data Intelligence to extract the data.
  2. In EPM, when exporting data, use one of the following modes:
    • Standard mode: This built-in workflow process helps manage the flow of data in and out of EPM. For Standard mode, you specify the period when you run the integration.
    • Quick mode: This process bypasses most of the steps in the workflow and provides a significant improvement in the extract process. When using Quick mode, you specify the period in the integration definition filter, and this may include a substitution variable. When using the API, the same rules apply, except that when you need to specify a period, you can’t use a substitution variable to pull from Essbase; you must include the period or year on extract.

    See Exporting Data.

  3. In Fusion Data Intelligence, create the data connection to the EPM Export Data instance using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select the connection type based on the functional module that you want to connect to. For example, to connect to the "Financial Close and Consolidation (FCCS)" module, select EPM Financial Close and Consolidation Data Export as the connection type.

      EPM Financial Close and Consolidation Data Export connection option

    5. In Create Connection for the EPM source, enter these details and click Save:
      • Connectivity Type: Select Standard.
      • Notification Email: Enter an email address to receive notifications.
      • URL: Enter the specific URL of your EPM source, ensuring that you use https://. For example, https://epm7-test-a123456.epm.us6.oraclecloud.com. If you provide http://epm7-test-a123456.epm.us6.oraclecloud.com or epm7-test-a123456.epm.us6.oraclecloud.com, then the system rejects it.
      • User Name and Password: Enter the credentials for your EPM source. Prefix the user name with the domain of your EPM source, such as domain.username.
      • EPM Application Name: Specify the file name that you had entered in Download File Name while creating the EPM data exchange job in EPM.
      • Comma Separated Job Names: Provide the EPM data exchange job names. If you have multiple job names, ensure that they’re comma-separated. For example, FAWEPMTestingV2, FAWEXPORT,FAW Job Testing.
      • Extract Configuration: Provide the EPM data exchange job name in Job Name and select Dimension Export if you want to extract the hierarchy (dimensional data) from your EPM Export Data instance.

      EPM Financial Close and Consolidation Data Export Create Connection dialog

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for EPM unless you perform a metadata extract.
    7. Click Save.
  4. On the Manage Connections page, select Actions for the EPM connection and then select Test Connection.
  5. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the EPM data. Select the applicable EPM source tables. See Augment Your Data.

Load Data from Google Analytics into Fusion Data Intelligence

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from the Google Analytics SaaS instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

Before connecting with the Google Analytics source, note these:
  • Fusion Data Intelligence supports the Google Analytics extractor for GA4 properties and doesn’t support the previous version, Google Universal Analytics (UA) properties.
  • DataStores are the list of GA4 properties.
  • DataStore columns are the list of Dimensions and Metrics for a GA4 property.
  • DataExtract runs the report based on user selection for a GA4 property as DataStore and Dimensions and Metrics as DataStore columns.
  • MetaExtract fetches metadata for all the available GA4 properties (DataStores) and their Dimensions and Metrics (DataStoreColumns).
  • This connector supports a limited number of Google Analytics metrics. See Dimensions Metrics Explorer to know what’s available.
  1. In Google Cloud (Analytics) Project, create a service account and download the credentials.json file.
  2. Add the service account to the Google Analytics 4 property.
  3. Enable Google Analytics APIs using these instructions:
    1. Using a text editor, open the credentials.json file that you had downloaded and search for the client_email field to obtain the service account email address.
    2. Use this email address to add a user to the Google Analytics 4 property you want to access through the Google Analytics Data API v1.
    Enable Google Analytics APIs
  4. Ensure that the Google Analytics Admin API and the Google Analytics Data API are available for your Google Analytics instance.
    View Google Analytics APIs
  5. In Fusion Data Intelligence, create the Google Analytics data connection using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select Google Analytics as the connection type.

      Google Analytics connection option

    5. In the dialog for the Google Analytics connection, select Standard as the connectivity type and enter these details:
      • Notification Email: An email address to receive notifications regarding this connection.
      • Service Account Credentials Json File: The Google Cloud Service Account credentials.json file that you had downloaded.
      • Account ID: Google Analytics account ID.
      • GA4 List of Property ID: A comma-separated list of GA4 property IDs.
      • Lookback Mode: Select either Full or Committed.
      • List of Lookback N days Ago: A comma-separated list of day values (integers), such as 7,21.
      Create Connection dialog
      Note these:
      • For the Lookback mode, if you don't provide a value, then Lookback mode isn't supported. The Full option requires a single day value; if you provide multiple values, the process uses the first value. You can provide multiple values for the Committed option.
      • For List Data Stores, the REST API returns a list of GA4 property IDs, either using the Account ID (if provided) or the list of properties configured in the source.
      • For List columns, the REST API returns a list of column metadata for the given GA4 Property ID.
    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for Google Analytics unless you perform a metadata extract.
      Metadata extract:
      • Retrieves metadata columns for each GA4 Property ID provided in the source configuration.
      • Prefixes the GA property columns with Dimension_ or Metric_, which Fusion Data Intelligence later uses while extracting data to differentiate between Dimension and Metric column types.
      • Leaves the payload dataStores array empty.
    7. Click Save.
  6. On the Manage Connections page, select Actions for the Google Analytics connection and then select Test Connection.
    Note

    The REST API signature is the same across sources. Test connection invokes the GA Common Metadata API, which returns the default version values; no calls are made to the source.
  7. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Google Analytics data. Select the applicable Google Analytics source tables. Ensure that you specify “Dimension_transactionId” in the source tables as the primary key and use it to join each of the data augmentation tables. You can select a maximum of nine dimensions for each data augmentation. See Augment Your Data.
    When you enable data extraction, you can schedule it to run when you choose. For data extraction, note these:
    1. Provide date ranges to run the report and fetch data.
    2. A regular data extract uses the initial or last ExtractDate as StartDate and the job RunDate as EndDate.
    3. Lookback mode includes additional date ranges along with the regular extract date range, fetching an additional data set in a single runReport call.
      • The Full option has a single date range: StartDate = ExtractDate - NdaysAgo, EndDate = RunDate.
      • The Committed option can have multiple date ranges. For each configured GA_LIST_OF_N_DAYS_AGO value: StartDate = ExtractDate - NdaysAgo, EndDate = RunDate - NdaysAgo.
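The lookback date-range arithmetic above can be sketched in Python. The function name and date handling here are illustrative assumptions, not the connector's actual implementation:

```python
from datetime import date, timedelta

def lookback_ranges(extract_date, run_date, mode, n_days_ago_list):
    """Compute the extra lookback date ranges described above.

    Full: a single range using the first configured day value.
    Committed: one range per configured N-days-ago value.
    """
    if not n_days_ago_list:
        return []  # No values configured: lookback mode isn't applied.
    if mode == "Full":
        n = n_days_ago_list[0]  # Multiple values: only the first is used.
        return [(extract_date - timedelta(days=n), run_date)]
    if mode == "Committed":
        return [(extract_date - timedelta(days=n), run_date - timedelta(days=n))
                for n in n_days_ago_list]
    raise ValueError(f"Unknown lookback mode: {mode}")

# Example: extract on June 1, job run on June 8, Committed mode with 7,21.
print(lookback_ranges(date(2023, 6, 1), date(2023, 6, 8), "Committed", [7, 21]))
```

Each returned (StartDate, EndDate) pair would be fetched alongside the regular extract range in the same runReport call.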

Load Data from Mongo Database into Fusion Data Intelligence (Preview)

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from the Mongo database and use it to create data augmentations.

Ensure that MongoDB is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Check the port number for your Mongo database and create a service request with server host and port details to enable network connectivity to the Mongo database server.
  2. In Fusion Data Intelligence, create the Mongo database connection using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select MongoDB as the connection type.

      MongoDB connection option

    5. In the dialog for the Mongo database connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and provide these details:
      • Connection Protocol: Enter the connection protocol such as mongodb+srv or mongodb
      • Credentials to access the database in User Name and Password
      • Host Name: Enter the host name of the Mongo database such as cluster0.example4.mongodb.net
      • Host Port: Enter the port number where the Mongo database is listening, such as 27017
      • Database Name: Enter a name such as Analytics
      • Last Update Date Column Pattern: Enter pattern such as "%mo%fie%te%"
        Note

        If a column name matches the provided pattern, then Fusion Data Intelligence marks that column with isLastUpdateDate = true and uses it for the incremental extract. For example, if the provided pattern is “%mo%fie%te%“, then the column name modifiedDate is marked as isLastUpdateDate = true.
      • Initial Extract Date Column Pattern: Enter pattern such as "%cr%ted%te%"
        Note

        If a column name matches the provided pattern, then Fusion Data Intelligence marks that column with isCreationDate = true and uses it for the initial extract date extraction. For example, if the provided pattern is “%cr%ted%te%“, then the column name createdDate is marked as isCreationDate = true.

      Create Connection for Mongo database dialog
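The column-pattern fields above appear to use % as a wildcard in the style of a SQL LIKE pattern. A minimal sketch under that assumption (the connector's actual matching rules may differ):

```python
import re

def matches_pattern(column_name, pattern):
    """Check a column name against a %-wildcard pattern, SQL-LIKE style.

    Assumes % matches any run of characters and matching is
    case-insensitive; this mirrors the documented examples only.
    """
    regex = ".*".join(re.escape(part) for part in pattern.split("%"))
    return re.fullmatch(regex, column_name, re.IGNORECASE) is not None

# "%mo%fie%te%" should flag modifiedDate as the last-update-date column.
print(matches_pattern("modifiedDate", "%mo%fie%te%"))   # → True
print(matches_pattern("createdDate", "%cr%ted%te%"))    # → True
```

A broader pattern matches more columns, so prefer patterns specific enough to single out the intended date column.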

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for Mongo database unless you perform a metadata extract.
    7. Click Save.
  3. On the Manage Connections page, select Actions for the Mongo database connection and then select Test Connection.
  4. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Mongo database data. Select the applicable Mongo database source tables. See Augment Your Data.

Load Data from MySQL Cloud Database into Fusion Data Intelligence (Preview)

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from the MySQL Cloud database and use it to create data augmentations.

Ensure that MySQL Cloud is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Check the port number for your MySQL Cloud database and create a service request with server host and port details to enable network connectivity to the MySQL server.
  2. In Fusion Data Intelligence, create the MySQL Cloud database connection using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select MySQL Cloud as the connection type.

      MySQL Cloud database connection option

    5. In the dialog for the MySQL Cloud database connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, and provide these details:
      • Host Name: Enter the host name of MySQL server such as 100.111.252.64
      • Port Number: Enter the port number where the server is listening such as 3306
      • Database: Enter the database name you need to connect to such as airportdb
      • Credentials to access the database in User Name and Password
      • Last Update Date Column Pattern: Enter format such as "%mo%fie%te%"
        Note

        If a column name matches the provided pattern, then Fusion Data Intelligence marks that column with isLastUpdateDate = true and uses it for the incremental extract. For example, if the provided pattern is “%mo%fie%te%“, then the column name modifiedDate is marked as isLastUpdateDate = true.
      • Initial Extract Date Column Pattern: Enter format such as "%cr%ted%te%"
        Note

        If a column name matches the provided pattern, then Fusion Data Intelligence marks that column with isCreationDate = true and uses it for the initial extract date extraction. For example, if the provided pattern is “%cr%ted%te%“, then the column name createdDate is marked as isCreationDate = true.

      Create Connection for MySQL Cloud database dialog

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for MySQL Cloud database unless you perform a metadata extract.
    7. Click Save.
  3. On the Manage Connections page, select Actions for the MySQL Cloud database connection and then select Test Connection.
  4. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the MySQL Cloud database data. Select the applicable MySQL Cloud database source tables. See Augment Your Data.

Load Data from Salesforce into Fusion Data Intelligence

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from the Salesforce SaaS instance and use it to create data augmentations.

  1. Run the following command in your terminal to generate a keystore file:
    openssl pkcs12 -export -inkey server.key -in server.crt -name YOUR_KEY_NAME -passout pass:YOUR_PASSWORD -out keystorefile.p12

    Replace YOUR_KEY_NAME and YOUR_PASSWORD with your desired values, and save the generated keystorefile.p12 file to later upload it while creating the connection to your Salesforce source.

  2. In Fusion Data Intelligence, create the Salesforce data connection using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select Salesforce as the connection type.

      Salesforce connection option

    5. In the dialog box for the Salesforce connection, select Standard in Connectivity Type and enter an email address to receive notifications in Notification Email.

      Create Connection for Salesforce dialog

    6. In Authorization Type, you can select Basic Authentication or OAuth as the type of authorization.
      If you select Basic Authentication, then:
      • Enter the credentials for your Salesforce source in User Name and Password. The user stated here must have access to all the data in the Salesforce system to extract it to the warehouse.
      • Copy and paste the security token from your Salesforce account into Security Token. This alphanumeric code may contain special characters; it isn't visible because it's encrypted and shown as ….
      • In Is Sandbox Environment, select Yes if your Salesforce source is a test or sandbox environment; else select No.
      If you select OAuth, then enter these additional values:
      • In Client ID, enter the unique identifier that represents the Salesforce application within the Salesforce instance and is visible when you sign into your Salesforce account.
      • In KeyStore File, provide the file that is in a PKCS#12 file format (denoted by .p12 extension), which can contain both private keys and certificates (such as the public key of the server or the signing certificate).
      • In KeyStore Password, enter the password for accessing the specified keystore file.
      • In Keystore Key name, provide the unique name (identifier or alias) of the specified keystore file.
    7. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for Salesforce unless you perform a metadata extract.
    8. Click Save.
  3. On the Manage Connections page, select Actions for the Salesforce connection and then select Test Connection.
  4. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Salesforce data. Select the applicable Salesforce source tables. See Augment Your Data.

Load Data from Shopify into Fusion Data Intelligence (Preview)

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from the Shopify SaaS instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

Ensure that Shopify is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Fusion Data Intelligence, create the Shopify data connection using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select Shopify as the connection type.

      Shopify connection option

    5. In the dialog for the Shopify connection, select Standard in Connectivity Type, enter an email address to receive notifications in Notification Email, enter the applicable token value in Access Token, enter the store name (such as myfawteststore.myshopify.com) in Store Name, and select True in Bulk Extract.

      Create Connection for Shopify dialog

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for Shopify unless you perform a metadata extract.
    7. Click Save.
  2. On the Manage Connections page, select Actions for the Shopify connection and then select Test Connection.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Shopify data. Select the applicable Shopify source tables. See Augment Your Data.

Load Data from Oracle Autonomous Database into Oracle Fusion Data Intelligence (Preview)

As a service administrator, you can use the Oracle Fusion Data Intelligence extract service to acquire data from Oracle Autonomous Database and use it to create data augmentations.

You can create connections to up to five autonomous databases. Depending on the number of connections, ensure that the applicable options, such as Oracle Autonomous Database 1 and Oracle Autonomous Database 2, are enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.
Note

Currently, you can't connect to a private autonomous transaction processing database (ATP database).
  1. In Oracle Fusion Data Intelligence, create the autonomous database connection using these instructions:
    1. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and, depending on the number of connections, select an option such as Oracle Autonomous Database 1 or Oracle Autonomous Database 2 as the connection type.

      Oracle Autonomous Database connection option

    5. In the dialog for the Oracle Autonomous Database connection, provide these details in Source Connection and then click Save and Next:
      • Select Standard in Connectivity Type.
      • Enter an email address to receive notifications in Notification Email.
      • Enter the credentials to access the database in User Name and Password.
      • Enter the database service details in Service.
      • In Wallet, drag and drop the database wallet details.

      Create Connection for Oracle Autonomous Database dialog displaying the Source Connection tab

    6. In the dialog for the Oracle Autonomous Database connection, provide these details in Extract Configuration and click Save:
      • In Incremental Strategy, select the incremental strategy (Flashback, Golden Gate, or RowSCN) that is compatible with your database configuration.
      • In Initial Extract Date Column Pattern, provide the initial extract date pattern that matches the pattern in your source.
      • In Last Update Date Column Pattern, provide the last update date pattern that matches the pattern in your source.
      • In Inclusion List, select Yes to include the listed datastores in the incremental strategy, or No to exclude them.
      • In List of Incremental datastores to include/exclude, enter a comma-separated list of datastore names.

        If you don't provide a list, then the connector uses the incremental strategy for all the datastores. If you provide a list and IS_INCLUSION_LIST=true, only the datastores in the list use the specified incremental strategy. If you provide a list and IS_INCLUSION_LIST=false, the datastores in the list don't use the incremental strategy. If the INCREMENTAL_STRATEGY property is available, then the connector uses IS_INCLUSION_LIST and INCR_DATASTORES_LIST for all strategies. If it isn't available, then for FLASHBACK the connector checks whether ENABLE_FLASHBACK_INCREMENTAL and FBA_INCR_DATASTORES_EXCEPTION_LIST are provided, and for ROWSCN it checks whether ENABLE_ORA_ROWSCN_INCREMENTAL and ROWSCN_INCR_DATASTORES_EXCEPTION_LIST are provided.

      • In Case sensitive Data Stores, select Yes or No.
      • In Schema Name, enter the schema name to extract data from.
      • In Data Store Name Pattern, specify the name pattern of the data stores that you want extracted. If you provide this value, then the connector extracts only data stores matching the pattern.
      • Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection. You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
        Note

        You can’t create augmentations for autonomous database unless you perform a metadata extract.

      Create Connection for Oracle Autonomous Database dialog displaying the Extract Configuration tab
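The inclusion/exclusion behavior described in the Extract Configuration step can be sketched as a small decision function. The parameter names follow the text, but the logic is an illustrative reading, not the connector's actual code:

```python
def uses_incremental_strategy(datastore, incr_list, is_inclusion_list):
    """Decide whether a datastore uses the configured incremental strategy.

    Mirrors the documented rules: with no list, every datastore uses the
    strategy; with a list, inclusion mode limits the strategy to the listed
    datastores, while exclusion mode exempts them.
    """
    if not incr_list:
        return True  # No list provided: all datastores use the strategy.
    names = [name.strip() for name in incr_list.split(",")]
    if is_inclusion_list:
        return datastore in names       # IS_INCLUSION_LIST=true
    return datastore not in names       # IS_INCLUSION_LIST=false

# With an inclusion list, only the listed datastores are incremental.
print(uses_incremental_strategy("ORDERS", "ORDERS,INVOICES", True))    # → True
print(uses_incremental_strategy("PAYMENTS", "ORDERS,INVOICES", True))  # → False
```

Datastores that fall outside the strategy would be extracted in full rather than incrementally.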

  2. On the Manage Connections page, select Actions for the autonomous database connection and then select Test Connection.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the autonomous database data. Select the applicable autonomous database source tables. See Augment Your Data.
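The inclusion/exclusion rule for the incremental strategy described above can be sketched as a small selection function. This is an illustrative sketch only (the function and parameter names are hypothetical, not the connector's actual implementation):

```python
def resolve_incremental_datastores(all_datastores, incr_list=None, is_inclusion_list=True):
    """Sketch of the documented rule:
    - no list provided          -> every datastore uses the incremental strategy
    - list + IS_INCLUSION_LIST  -> only the listed datastores use it
    - list, not an inclusion    -> the listed datastores are exempt from it
    """
    if incr_list is None:
        return set(all_datastores)
    listed = {name.strip() for name in incr_list.split(",")}
    if is_inclusion_list:
        return set(all_datastores) & listed
    return set(all_datastores) - listed

stores = ["ORDERS", "INVOICES", "ITEMS"]
print(resolve_incremental_datastores(stores))                         # no list: all stores
print(resolve_incremental_datastores(stores, "ORDERS,ITEMS", True))   # only the listed stores
print(resolve_incremental_datastores(stores, "ORDERS,ITEMS", False))  # all except the listed stores
```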

Load Data from Snowflake into Oracle Fusion Data Intelligence (Preview)

As a service administrator, you can use the Oracle Fusion Data Intelligence extract service to acquire data from a Snowflake instance.

You can later use this data to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases. Establish the connection from Fusion Data Intelligence to your Snowflake instance to start data acquisition followed by augmentation.
Note

Snowflake sometimes requires API calls to originate from a known IP address. If you're experiencing connection issues due to an unauthorized IP, then submit an Oracle Support ticket to obtain the necessary Oracle IP address for your Snowflake allowlist.

Ensure that Snowflake is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Fusion Data Intelligence, create the Snowflake data connection:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select Snowflake as the connection type.
      Snowflake connection option
    5. In Create Connection, enter these details and then click Save:
      • Connectivity Type: Standard.
      • Notification Email: An email address to receive notifications.
      • Auth Schema: Enter “BASICAUTH” if you’re using username and password to establish the connection. Enter “PRIVATE_KEY” if you’re using token-based authentication.
      • User Name: Enter the user name only if you're using basic authentication.
      • Password: Enter the password for the user name only if you're using basic authentication.
      • Host Name: Complete host name of your Snowflake instance.
      • Table Schema: Your Snowflake table schema such as TPCH_SF1.
      • Database: The database name, as shown in your Snowflake account under Data.
      • Warehouse: The compute resources in your Snowflake instance that you can find by running SHOW WAREHOUSES [ LIKE '<pattern>' ]. See SHOW WAREHOUSES.
      • Private Key: Generate the Private Key in Snowflake, if you don’t have one already, and paste it here. See Generate the Private Key.

      Create Snowflake connection

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for Snowflake unless you perform a metadata extract.
    7. Click Save.
  2. On the Manage Connections page, select Actions for the Snowflake connection and then select Test Connection.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Snowflake data. Select the applicable Snowflake source tables. See Augment Your Data.
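The Auth Schema choice above determines which other fields the connection needs: user name and password for BASICAUTH, or a private key for PRIVATE_KEY. A minimal sketch of that rule, with illustrative names that mirror the dialog fields (not the product's actual validation code):

```python
def snowflake_connection_fields(auth_schema, **fields):
    """Return the connection properties required for the chosen Auth Schema,
    raising ValueError if any are missing. Illustrative sketch only."""
    required = {"notification_email", "host_name", "table_schema", "database", "warehouse"}
    if auth_schema == "BASICAUTH":
        required |= {"user_name", "password"}      # basic authentication
    elif auth_schema == "PRIVATE_KEY":
        required |= {"private_key"}                # token-based authentication
    else:
        raise ValueError("Auth Schema must be BASICAUTH or PRIVATE_KEY")
    missing = required - fields.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return {k: fields[k] for k in sorted(required)}
```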

Load Data from Taleo into Fusion Data Intelligence (Preview)

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from the Taleo instance and use it to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases.

Ensure that Taleo is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Fusion Data Intelligence, create the Taleo data connection using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select Taleo as the connection type.

      Taleo connection option

    5. In Connectivity Type, select Standard, enter an email address to receive notifications in Notification Email, host name of your Taleo instance in Host Name, and credentials for your Taleo source in User Name and Password.

      Create Taleo Connection dialog

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for Taleo unless you perform a metadata extract.
    7. Click Save.
  2. On the Manage Connections page, select Actions for the Taleo connection and then select Test Connection.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Taleo data. Select the applicable Taleo source tables. See Augment Your Data.

Load Data from Oracle Analytics Publisher into Fusion Data Intelligence (Preview)

As a service administrator, you can use the Fusion Data Intelligence extract service to acquire data from the Analytics Publisher reports and use it to create data augmentations for various use cases.

Currently, the BI Publisher Reports connector supports only:
  • The Analytics Publisher in Oracle Fusion Cloud Applications for data augmentation.
  • Reports that complete within the Analytics Publisher report execution timeout limit, which is typically 300 seconds.

The BI Publisher Reports connector workflow must observe the security rules of Oracle Fusion Cloud Applications. Ensure that password rotation and updates are completed before executing the BI Publisher Reports connector pipeline. Otherwise, the pipeline jobs hang, eventually get deleted, and the data source is disabled until you update the password and resubmit the job.

Ensure that BI Publisher Reports is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In BI Publisher, build a comma separated values (CSV) report for the targeted tables. See Create Reports and Generate CSV Output.
  2. In Fusion Data Intelligence, create the data connection to the BI Publisher reports using these instructions:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select BI Publisher Reports as the connection type.

      BI Publisher connection option

    5. In the dialog for the BI Publisher reports connection, provide these details:
      • Select Standard in Connectivity Type.
      • Enter an email address to receive notifications in Notification Email.
      • Enter the host name of the BI Publisher in Host Name.
      • Enter the credentials to access your BI Publisher in User Name and Password.
      • Enter a comma-separated list of reports in List of Reports in the <reports directory>/<report names> format.
        Note

        Ensure that the reports directory and report names don't have spaces.
      • In Reports Configuration, provide the path of the report in Oracle Analytics Publisher. Select Run Asynchronously if the report isn't large; select Run Synchronously In Chunk if the report has a large volume of data (more than 7 MB), because this enables you to download the report in chunks.
        Note

        • If the report execution completes before the timeout limit but generates a large volume of data, then set runInChunkMode: true. The recommendation is to set runInChunkMode to true if the report output file is over 7 MB.
        • If the report has parameters defined, then provide the values in the "params" section of the report configuration array. If the value in the params array is set with placeholder __lastUpdateDate__, then the connector applies the initial extract date or last successful extract date to the param.
      • For CSV Date Format and CSV Timestamp Format, see About Date and Timestamp Formatting for CSV File-based Extractors.
        Note

        Ensure that the date formats used in Oracle Analytics Publisher and Fusion Data Intelligence match.

      Create Connection for BI Publisher dialog

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for the BI Publisher reports unless you perform a metadata extract.
    7. Click Save.
  3. On the Manage Connections page, select Actions for the BI Publisher reports connection and then select Test Connection.
  4. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the BI Publisher reports data. Select the applicable BI Publisher source tables. See Augment Your Data.
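The __lastUpdateDate__ placeholder behavior described above can be sketched as a simple substitution over a report's params array: the connector applies the last successful extract date, or the initial extract date on the first run. A hedged sketch (function name and date formatting are illustrative assumptions):

```python
from datetime import date

def resolve_report_params(params, initial_extract_date, last_extract_date=None):
    """Replace the __lastUpdateDate__ placeholder in a report's params array
    with the last successful extract date, falling back to the initial
    extract date on the first run. Illustrative sketch only."""
    effective = last_extract_date or initial_extract_date
    return [effective.isoformat() if p == "__lastUpdateDate__" else p
            for p in params]

# First run: the initial extract date is applied.
print(resolve_report_params(["__lastUpdateDate__", "US"], date(2023, 1, 1)))
# → ['2023-01-01', 'US']
```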

Load Data from Oracle Database Using JDBC into Fusion Data Intelligence (Preview)

As a service administrator, you can use an extract service remote agent to connect to an Oracle database using JDBC and use the data to create data augmentations.

After connecting to an Oracle database using JDBC, the remote agent extracts the data and loads it into the autonomous data warehouse associated with your Oracle Fusion Data Intelligence instance. The remote agent pulls the metadata through the public extract service REST API and pushes data into object storage using the object storage REST API. You can extract and load the data from an Oracle database into Oracle Fusion Data Intelligence only once every 24 hours.

Ensure that Remote Agent and Oracle JDBC are enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. Set up the remote agent to load data from your Oracle database into Oracle Fusion Data Intelligence.
  2. Configure the remote agent and Oracle database data source on the Data Configuration page in Oracle Fusion Data Intelligence using these instructions:
    1. On the Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type, and then select Oracle JDBC as the connection type.
      Oracle JDBC connection option

    5. In Create Connection for Oracle JDBC, in Connectivity Type, verify that Remote is selected automatically.
      Create Connection for Oracle JDBC

    6. In Remote Agent, select the remote agent connection that you created earlier, for example, Remote Agent.
    7. Enter an email address to receive notifications in Notification Email, provide credentials for the Oracle database source in User Name and Password, and the URL of the Oracle database source in URL.
    8. In Initial Extract Date Column Pattern, provide the initial extract date pattern that matches the pattern in your source.
    9. In Last Update Date Column Pattern, provide the last update date pattern that matches the pattern in your source.
    10. If your source has flashback support, then select Yes in Enable flashback incremental.
    11. In List of Flashback Incremental Exempt datastores, provide a comma-separated list of datastores that you want to exempt from the flashback incremental queries.
    12. If your source has ROWSCN support, then select Yes in Enable ROWSCN incremental.
    13. In List of ROWSCN Incremental Exempt datastores, specify a comma-separated list of datastores that you want to exclude from the automatic tracking of row changes based on system change numbers.
    14. In Case Sensitive Data Stores, select Yes or No to specify whether the datastores have case sensitive data.
    15. In Schema Name, enter the schema name to extract data from.
    16. In Data Store Name Pattern, specify the name pattern of the datastores that you want extracted. If you provide this value, then the connector extracts only datastores matching the pattern.
    17. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection. You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for Oracle database unless you perform a metadata extract.
    18. Click Save.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Oracle database data. Select the applicable Oracle database source tables. See Augment Your Data.
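The Data Store Name Pattern in step 16 restricts extraction to matching datastores only. A minimal sketch of that filtering step, assuming shell-style wildcard matching (the actual pattern syntax may differ; names are illustrative):

```python
import fnmatch

def filter_datastores(datastores, name_pattern=None):
    """If a Data Store Name Pattern is set, extract only matching datastores;
    otherwise extract all of them. Wildcard semantics are an assumption
    made for illustration."""
    if not name_pattern:
        return list(datastores)
    return [ds for ds in datastores if fnmatch.fnmatch(ds, name_pattern)]

print(filter_datastores(["GL_BALANCES", "GL_LEDGERS", "AP_INVOICES"], "GL_*"))
```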

Load Data from Oracle Transportation Management Cloud Service into Oracle Fusion Data Intelligence (Preview)

As a service administrator, you can use the Oracle Fusion Data Intelligence extract service to acquire data from an Oracle Transportation Management Cloud Service SaaS instance.

You can later use this data to create data augmentations for various Enterprise Resource Planning and Supply Chain Management use cases. Establish the connection from Fusion Data Intelligence to your Oracle Transportation Management Cloud Service instance to start data acquisition followed by augmentation.

Note

Oracle Fusion SCM Analytics is a prerequisite to use the "Oracle Transportation Management" connector.
  1. Create a confidential application in Oracle Identity Cloud Service. See IDCS Configuration.
  2. In Oracle Transportation Management Cloud Service, create a user and assign the "Integration" role using these instructions:
    1. Navigate to the User Manager with DBA.Admin privileges and click New in User Finder.
    2. In User Name, enter a name of your choice.
    3. Enter the client ID from the Oracle Identity Cloud Service confidential application in Nickname.
    4. In Domain Name, enter the name of the target domain.
    5. Enter a compliant Password and the password confirmation.
    6. In User Role ID, select INTEGRATION.
    7. In Access Control List, add the list named Data Replication Service - Rest with the Granted option selected.
    8. Select Finished.
  3. Obtain the OAuth 2 client credentials from your Oracle Transportation Management Cloud Service instance to provide while creating the data connection. See OAuth 2.
  4. In Fusion Data Intelligence, enable Oracle Transportation Management on the Enable Features page. See Make Preview Features Available.
  5. In Fusion Data Intelligence, create the Oracle Transportation Management Cloud Service data connection:
    1. In Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select Oracle Transportation Management as the connection type.

      Oracle Transportation Management Cloud Service connection option

    5. In the dialog for the Oracle Transportation Management Cloud Service connection, enter these details and then click Save:
      • Connectivity Type: Standard.
      • Notification Email: An email address to receive notifications.
      • Scope: The scope name defined in your Oracle Transportation Management Cloud Service's Oracle Identity Cloud Service application’s configuration.
      • Token URL: The URL from which to obtain the access token. See Runtime Configuration.
      • URL: Source application URL (hostname and protocol of the Oracle Transportation Management Cloud Service instance).
      • Client ID: The unique client identifier generated during the OAuth registration process.
      • Client Secret: The client secret generated during the OAuth registration process (a private key similar to a password that is generated when registering your Oracle Transportation Management Cloud Service's Oracle Identity Cloud Service application).
      • Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection. You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
        Note

        You can’t create augmentations for Oracle Transportation Management Cloud Service unless you perform a metadata extract.

      Create Oracle Transportation Management Cloud Service connection

  6. On the Manage Connections page, select Actions for the Oracle Transportation Management Cloud Service connection and then select Test Connection.
  7. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Oracle Transportation Management Cloud Service data. Select the applicable Oracle Transportation Management Cloud Service source tables. See Augment Your Data.
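The Scope, Token URL, Client ID, and Client Secret fields above feed a standard OAuth 2 client-credentials token request. The service performs this call for you; the sketch below only illustrates the shape of the request that those fields produce (function name is hypothetical):

```python
import base64
from urllib.parse import urlencode

def build_token_request(token_url, client_id, client_secret, scope):
    """Assemble the pieces of an OAuth 2 client-credentials token request:
    client ID and secret go into a Basic Authorization header, and the
    grant type and scope form the URL-encoded body. Illustrative sketch."""
    credentials = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {credentials}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urlencode({"grant_type": "client_credentials", "scope": scope})
    return token_url, headers, body
```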

Load Data from QuickBooks Online into Oracle Fusion Data Intelligence (Preview)

As a service administrator, you can use the Oracle Fusion Data Intelligence extract service to acquire data from QuickBooks Online and use it to create data augmentations.

Ensure that QuickBooks Online is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle Fusion Data Intelligence, create the QuickBooks Online data connection using these instructions:
    1. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type and then select QuickBooks Online as the connection type.

      QuickBooks Online connection option

    5. In the dialog for the QuickBooks Online connection, enter these details and click Save:
      • Connectivity Type: Select Standard.
      • Notification Email: Enter an email address to receive notifications.
      • URL: Enter the complete URL of your QuickBooks Online instance.
      • Client ID: This is the unique ID of your QuickBooks Online instance.
      • Client Secret: Copy and paste the entire client secret from your authentication.
      • Refresh Token: QuickBooks changes this token every day; enter your refresh token for the day you want to execute this pipeline.
      • Company ID: This is your company ID for the QuickBooks Online instance.

      Create Connection for QuickBooks Online dialog

    6. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for QuickBooks Online unless you perform a metadata extract.
    7. Click Save.
  2. On the Manage Connections page, select Actions for the QuickBooks Online connection and then select Test Connection.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the QuickBooks Online data. Select the applicable QuickBooks Online source tables. See Augment Your Data.
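The Company ID above scopes every QuickBooks Online API call: requests are addressed to a company-specific path under your instance URL. A minimal sketch, assuming the common /v3/company/<companyId>/ path layout (the function name and query are illustrative, not the connector's actual calls):

```python
def quickbooks_query_url(base_url, company_id, entity):
    """Build an illustrative QuickBooks Online query URL scoped to a
    company ID. Sketch only; the connector constructs its own requests."""
    return (f"{base_url.rstrip('/')}/v3/company/{company_id}"
            f"/query?query=select+*+from+{entity}")

print(quickbooks_query_url("https://quickbooks.api.intuit.com/", "123", "Invoice"))
# → https://quickbooks.api.intuit.com/v3/company/123/query?query=select+*+from+Invoice
```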

Load Data from Oracle Warehouse Management Cloud into Oracle Fusion Data Intelligence (Preview)

As a service administrator, you can use the Oracle Fusion Data Intelligence extract service to acquire data from Oracle Warehouse Management Cloud and use it to create data augmentations.

Ensure that Oracle Warehouse Management Cloud is enabled on the Enable Features page prior to creating this connection. See Make Preview Features Available.

  1. In Oracle Fusion Data Intelligence, create the Oracle Warehouse Management Cloud data connection using these instructions:
    1. In Oracle Fusion Data Intelligence Console, click Data Configuration under Application Administration.
    2. On the Data Configuration page, click Manage Connections.
    3. On the Manage Connections page, click Create and then click Connection.
    4. In Create Connection, select Data Extraction in Usage Type, and then select Warehouse Management Cloud as the connection type.

      Oracle Warehouse Management Cloud connection option

    5. In the dialog for the Oracle Warehouse Management Cloud connection, provide these details in Source Connection and then click Save and Next:
      • Connectivity Type: Select Standard.
      • Notification Email: Enter an email address to receive notifications.
      • URL: Enter the URL of your Oracle Warehouse Management Cloud instance in this format: protocol://domain/environment/app/lgfapi version/lgfapi module/resourcepath. For example, https://xxxxx.wms.ocs.oraclecloud.com/myenv/wms/lgfapi/v10/entity.
      • User Name and Password: Enter the credentials for your Oracle Warehouse Management Cloud instance.

      Create Connection for Oracle Warehouse Management Cloud dialog displaying the Source Connection tab.

    6. In the dialog for the Oracle Warehouse Management Cloud connection, in Extract Configuration, provide a value in Filter for Inventory History using at least one of the following combinations (in addition to any other field) and click Save:
      • company_id__code, facility_id__code, group_nbr
      • company_id__code, facility_id__code, history_activity_id, status_id
      • company_id__code, facility_id__code, history_activity_id, item_code
      • company_id__code, facility_id__code, history_activity_id, item_alternate_code
      • company_id__code, facility_id__code, history_activity_id, container_nbr

      If you don’t enter the filter information, then data extraction for the inventory history datastore won’t work. See Oracle Warehouse Management documentation for information about the filter for inventory history.


      Create Connection for Oracle Warehouse Management Cloud dialog displaying the Extract Configuration tab.

    7. Verify that the Refresh Metadata toggle is enabled to ensure that the metadata is refreshed when you save the connection.
      You can later refresh the metadata from the Actions menu on the Manage Connections page, if required.
      Note

      You can’t create augmentations for Oracle Warehouse Management Cloud unless you perform a metadata extract.
  2. On the Manage Connections page, select Actions for the Oracle Warehouse Management Cloud connection and then select Test Connection.
  3. After the connections are successfully established, navigate to the Data Configuration page, select the connection in Data Source, then click the Data Augmentation tile, and create a data augmentation using the Oracle Warehouse Management Cloud data. Select the applicable Oracle Warehouse Management Cloud source tables. See Augment Your Data.
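The inventory history filter rule in step 6 can be checked mechanically: the filter must contain at least one of the listed field combinations, or extraction for the inventory history datastore won't work. A hedged sketch of that validation (the function is illustrative; field names follow the combinations listed above):

```python
# The documented field combinations, any one of which makes the filter valid.
REQUIRED_COMBINATIONS = [
    {"company_id__code", "facility_id__code", "group_nbr"},
    {"company_id__code", "facility_id__code", "history_activity_id", "status_id"},
    {"company_id__code", "facility_id__code", "history_activity_id", "item_code"},
    {"company_id__code", "facility_id__code", "history_activity_id", "item_alternate_code"},
    {"company_id__code", "facility_id__code", "history_activity_id", "container_nbr"},
]

def filter_is_valid(filter_fields):
    """True if the filter contains at least one required combination
    (extra fields are allowed). Illustrative sketch of the documented rule."""
    provided = set(filter_fields)
    return any(combo <= provided for combo in REQUIRED_COMBINATIONS)

print(filter_is_valid(["company_id__code", "facility_id__code", "group_nbr"]))
```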