Export Data to Autonomous Database

This section details the process to export data to an Autonomous Database and manage the load pipelines.
Note

Setup and maintenance of the Autonomous Database are outside the scope of JMS. See Always Free Autonomous Database for more information.

Setup Prerequisites:

Before setting up and running load pipelines for data exported to your Autonomous Database, ensure the following prerequisites are met:

  1. Set Database User Privilege

    As an administrator, inside Oracle Database Actions - SQL Console, grant execute privilege on DBMS_CLOUD_PIPELINE to the user who will run the load pipeline script (a combined example of steps 1 and 2 appears after this list).

    GRANT EXECUTE ON DBMS_CLOUD_PIPELINE TO <user-name>;
  2. Enable Resource Principal

    Inside Oracle Database Actions - SQL Console, as an administrator, execute the following command to enable resource principal:

    EXEC DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL();

    If a user other than the administrator executes the pipeline script, as an administrator, enable the resource principal for that user by executing the following command:

    EXEC DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL(username => '<user-name>');
  3. Create Dynamic Group for Target Autonomous Database

    Create a dynamic group with the following matching rule:

    ALL {resource.type = 'autonomousdatabase', resource.id = '<adb-ocid>'} 

    where <adb-ocid> is the OCID of the Autonomous Database to which data will be loaded.

  4. Give Resource Principal Policy Access to Object Storage Bucket

    Create a policy that gives the Autonomous Database's resource principal access to files in the target Object Storage bucket. Add the following statement in the Policies field:

    ALLOW DYNAMIC-GROUP <dynamic-group-name> TO READ objects IN compartment <compartment-name> where target.bucket.name='<export-bucket-name>'
    
    where:
    • <dynamic-group-name> is the name of the dynamic group created in step 3
    • <export-bucket-name> is the name of the Object Storage bucket to which the JMS Fleet data is exported
    • <compartment-name> is the name of the compartment to which the bucket belongs
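
For illustration, the following is a minimal sketch of prerequisites 1 and 2 as an administrator might run them in Oracle Database Actions - SQL Console. The user name JMS_LOADER is a hypothetical placeholder; substitute the user who will run the load pipeline script.

-- Run as an administrator; JMS_LOADER is a placeholder user name.
GRANT EXECUTE ON DBMS_CLOUD_PIPELINE TO JMS_LOADER;

-- Enable resource principal for the database, then for the non-administrator user.
EXEC DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL();
EXEC DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL(username => 'JMS_LOADER');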

Get Load Pipeline Script

The pipeline script can be obtained from the Java Management Service console:
  1. In the Fleet dashboard, select the fleet whose exported data needs to be loaded into the Autonomous Database.
  2. Go to the Data Export Settings tab and download the Autonomous Database load pipeline script.

Execute Pipeline Script

Copy the contents of the script and execute them in Oracle Database Actions - SQL Console.

The load pipeline script performs four main tasks (an illustrative sketch of the underlying pipeline calls follows this list):

  1. Creates three tables, one for each export resource (if the tables already exist, no new tables are created):
    • Managed Instance
    • Managed Instance and Java Runtime
    • Managed Instance, Java Runtime, and Application
  2. Creates and configures three pipelines to continuously poll the target object storage bucket at regular intervals for new CSV files. When new files are detected, data from those files is loaded into the correct table. If the pipelines already exist, no new pipelines are created.
  3. Starts the newly created pipelines.
  4. Creates and enables a scheduled job that runs weekly to check for redundant export data and remove it.
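
The downloaded script is the source of truth for table names, pipeline names, and attributes. Purely as an illustrative sketch of how such a load pipeline could be defined with the DBMS_CLOUD_PIPELINE package and the resource principal credential, the following uses assumed values (pipeline name, table name, location, and interval) that are not the ones generated by JMS:

BEGIN
  -- Hypothetical pipeline; the JMS-generated script creates the actual ones.
  DBMS_CLOUD_PIPELINE.CREATE_PIPELINE(
    pipeline_name => 'JMS_EXAMPLE_PIPELINE',
    pipeline_type => 'LOAD',
    description   => 'Example: load exported JMS CSV files into a table'
  );

  -- Poll the export bucket with the resource principal credential and load new CSV files.
  DBMS_CLOUD_PIPELINE.SET_ATTRIBUTE(
    pipeline_name => 'JMS_EXAMPLE_PIPELINE',
    attributes    => JSON_OBJECT(
      'credential_name' VALUE 'OCI$RESOURCE_PRINCIPAL',
      'location'        VALUE 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<export-bucket-name>/o/',
      'table_name'      VALUE 'EXAMPLE_TABLE',
      'format'          VALUE '{"type": "csv", "skipheaders": 1}',
      'interval'        VALUE '20')
  );

  -- Begin polling on the configured interval (in minutes).
  DBMS_CLOUD_PIPELINE.START_PIPELINE(pipeline_name => 'JMS_EXAMPLE_PIPELINE');
END;
/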

Monitor and Manage Existing Pipelines

To monitor all active pipeline jobs, go to Oracle Database Actions, then Scheduling.

You can also run the following query inside Oracle Database Actions - SQL Console:

SELECT * FROM USER_CLOUD_PIPELINES WHERE PIPELINE_NAME LIKE 'JMS%';

For details on each executed job, see the $STATUS and $LOG tables associated with each pipeline; their names are listed in the USER_CLOUD_PIPELINES view.
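
For example, assuming the standard DBMS_CLOUD_PIPELINE views are available, recent runs of the JMS pipelines can be reviewed with a query such as:

SELECT PIPELINE_NAME, STATUS, ERROR_MESSAGE, START_DATE, END_DATE
FROM USER_CLOUD_PIPELINE_HISTORY
WHERE PIPELINE_NAME LIKE 'JMS%'
ORDER BY START_DATE DESC;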

You can control a pipeline by starting, stopping, resetting, or dropping it. For detailed steps, see Control Pipelines.
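
As a sketch of what those operations look like, the basic controls map to DBMS_CLOUD_PIPELINE procedure calls such as the following, where <pipeline-name> is a placeholder for a name returned by the USER_CLOUD_PIPELINES query above:

-- Stop the pipeline from scheduling new jobs.
EXEC DBMS_CLOUD_PIPELINE.STOP_PIPELINE(pipeline_name => '<pipeline-name>');

-- Reset pipeline state (purge_data => TRUE would also delete previously loaded data), then restart it.
EXEC DBMS_CLOUD_PIPELINE.RESET_PIPELINE(pipeline_name => '<pipeline-name>', purge_data => FALSE);
EXEC DBMS_CLOUD_PIPELINE.START_PIPELINE(pipeline_name => '<pipeline-name>');

-- Remove the pipeline entirely.
EXEC DBMS_CLOUD_PIPELINE.DROP_PIPELINE(pipeline_name => '<pipeline-name>');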