Monitor and Troubleshoot COPY_COLLECTION Loads
All data load operations you perform using the PL/SQL package DBMS_CLOUD are logged in the tables dba_load_operations and user_load_operations. Use these tables to monitor loading with DBMS_CLOUD.COPY_COLLECTION.
- dba_load_operations shows all load operations.
- user_load_operations shows the load operations in your schema.
You can query these tables to see information about ongoing and completed data loads. For example, the following SELECT statement with a WHERE clause predicate on the TYPE column shows load operations of type COPY:
SELECT table_name, owner_name, type, status, start_time, update_time, logfile_table, badfile_table
FROM user_load_operations WHERE type = 'COPY';
TABLE_NAME  OWNER_NAME  TYPE  STATUS     START_TIME           UPDATE_TIME          LOGFILE_TABLE  BADFILE_TABLE
FRUIT       ADMIN       COPY  COMPLETED  2020-04-23 22:27:37  2020-04-23 22:27:38  ""             ""
FRUIT       ADMIN       COPY  FAILED     2020-04-23 22:28:36  2020-04-23 22:28:37  COPY$2_LOG     COPY$2_BAD
The LOGFILE_TABLE column shows the name of the table you can query to view the log of a load operation. For example, the following query shows the log of the load operation with status FAILED and timestamp 2020-04-23 22:28:36:
SELECT * FROM COPY$2_LOG;
The BADFILE_TABLE column shows the name of the table you can query to review the rows that reported errors during loading. For example, the following query shows the rejected records for the load operation:
SELECT * FROM COPY$2_BAD;
Depending on the errors shown in the log and the rows shown in the BADFILE_TABLE table, you might be able to correct the errors by specifying different format options with DBMS_CLOUD.COPY_COLLECTION.
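For example, a retry of a failed load might look like the following sketch. The credential name, Object Storage URI, and the recorddelimiter format option are illustrative placeholders, not taken from this document; substitute the values that match your environment and the errors reported in the log table:

```sql
BEGIN
  DBMS_CLOUD.COPY_COLLECTION(
    collection_name => 'FRUIT',
    -- Placeholder credential; create one first with DBMS_CLOUD.CREATE_CREDENTIAL.
    credential_name => 'DEF_CRED_NAME',
    -- Placeholder Object Storage URI for the source JSON file.
    file_uri_list   => 'https://objectstorage.example.com/n/namespace/b/bucket/o/fruit.json',
    -- Example format option: treat each newline-terminated line as one document.
    format          => JSON_OBJECT('recorddelimiter' VALUE '''\n''')
  );
END;
/
```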
The LOGFILE_TABLE and BADFILE_TABLE tables are stored for two days for each load operation and then removed automatically.
See DELETE_ALL_OPERATIONS Procedure for information on clearing the user_load_operations table.
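As a sketch, a call clearing only the COPY entries from user_load_operations might look like the following; passing the type value is optional, and omitting it clears all load operations in your schema:

```sql
BEGIN
  -- Remove logged COPY operations (and their log/bad tables) for the current schema.
  DBMS_CLOUD.DELETE_ALL_OPERATIONS(type => 'COPY');
END;
/
```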
Parent topic: Load JSON on Autonomous Database