Upload Data Files to Your Object Store

Upload the data files that you want to load into your Autonomous Transaction Processing database to your cloud-based object store. This tutorial uses an object store in the Oracle Cloud Infrastructure Object Storage service.

    1. Log in to your Oracle Cloud Infrastructure console with the credentials provided by your Oracle Cloud administrator: cloud tenant, user name, and password.

    2. Select Object Storage from the menu at the top left of the Oracle Cloud Infrastructure console, and then select Object Storage again from the sub-menu.

    3. Select a compartment in which to create a bucket for uploading your database table data.

    4. Click Create Bucket to create the storage bucket to which you will upload your source files. You will later copy this staged data into tables in your Autonomous Transaction Processing database.

    5. Enter a bucket name, select the standard storage tier, and click Create Bucket.

    6. Click Upload Object to begin selecting the data files to upload to the bucket.

    7. Navigate to the location of the data files on your local computer. Drag and drop the files into the bucket, or click Upload Object and select each file individually.
      This example uploads the data files of the SH tables (sales history tables from an Oracle sample schema). Click here to download a zip file of the 10 SH data files to upload to the object store. Unzip the data files first, because zip files cannot be uploaded directly; upload each unzipped data file individually.
      Note: Alternatively, you can use curl commands to upload large numbers of files; a sketch follows this step.

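For example, each file can be sent to the bucket through the object store's Swift-compatible REST endpoint, authenticated with your user name and an Auth Token (you create one in the next section). A minimal shell sketch, assuming the us-ashburn-1 region; the tenant name, bucket name, user name, and token are placeholders for your own values:

        # Upload every .dat file in the current directory to the bucket.
        # All angle-bracketed values are placeholders.
        for f in *.dat; do
          curl -u '<your username>:<your Auth Token>' \
               --upload-file "$f" \
               "https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/<tenant name>/<bucket name>/$f"
        done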

    8. The data files are uploaded to the bucket. These files staged in the cloud are ready to be copied into the tables of your Autonomous Transaction Processing database. Remain logged in to Oracle Cloud Infrastructure Object Storage.

Create an Object Store Auth Token

To load data from an Oracle Cloud Infrastructure Object Storage object store, you need to create an Auth Token for your object store account. Your Autonomous Transaction Processing database authenticates to the object store with your user name and this Auth Token, which serves as the password.

    1. If you have logged out of Oracle Cloud Infrastructure Object Storage, log back in with the following credentials provided by your Oracle Cloud administrator: cloud tenant, user name, password.
    2. Hover your mouse cursor over the human figure icon at the top right of the console and select User Settings from the drop-down menu.

    3. Click Auth Tokens under Resources on the left of the console.

    4. Click Generate Token.

    5. A dialog appears. Generate the Auth Token by performing the following steps:
      1. In the dialog, enter a description.
      2. Click the Generate Token button.
      3. Copy the generated token to a text file; the token is not displayed again after you close the dialog.
      4. Click Close.

Create Object Store Credentials in your Autonomous Transaction Processing Schema

Now that you have created an object store Auth Token, store the credentials of the object store in which your data is staged in your Autonomous Transaction Processing atpc_user schema, so that the database can access the staged data.

    1. Open SQL Developer and connect to your Autonomous Transaction Processing database as user atpc_user. For the connection steps, see the previous tutorial in this series, Connecting SQL Developer and Creating Tables.
    2. In a SQL Developer worksheet, use the create_credential procedure of the DBMS_CLOUD package to store the object store credentials in your atpc_user schema.
      1. Choose a credential name. You will reference this credential name in the copy_data procedure later in this tutorial.
      2. Specify the credentials for your Oracle Cloud Infrastructure Object Storage service: the user name and the object store Auth Token you generated in the previous section.

        begin
          DBMS_CLOUD.create_credential (
            credential_name => 'OBJ_STORE_CRED',
            username        => '<your username>',
            password        => '<your Auth Token>'
          );
        end;
        /

After you run this script, your object store credentials are stored in your Autonomous Transaction Processing atpc_user schema.
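
As an optional sanity check (a minimal sketch, assuming atpc_user can query the ALL_CREDENTIALS view), confirm that the credential is now visible:

        select credential_name, username
          from all_credentials;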

 

Copy Data from Object Store to Autonomous Transaction Processing Database Tables

The copy_data procedure of the DBMS_CLOUD package requires that the target tables already exist in your Autonomous Transaction Processing database. In the previous tutorial in this series, Connecting SQL Developer and Creating Tables, you created all of the target tables in your Autonomous Transaction Processing atpc_user schema.

Now run the copy_data procedure to copy the data staged in your object store to your Autonomous Transaction Processing atpc_user tables. 

  1. In a SQL Developer worksheet, use the copy_data procedure of the DBMS_CLOUD package to copy the data staged in your object store.
    • For credential_name, specify the name of the credential you defined in the previous section, Create Object Store Credentials in your Autonomous Transaction Processing Schema.
    • For file_uri_list, specify the URL that points to the location of a file staged in your object store. The URL is structured as follows; the angle-bracketed values are placeholders for your own values:
      https://swiftobjectstorage.<region name>.oraclecloud.com/v1/<tenant name>/<bucket name>/<file name>
    • Click here for an example script; a sketch of such a script also appears after this list. In the script, use your own table names, region name, tenant name, bucket name, and file names.
      Note: The region name, tenant name, and bucket name can all be found in one place by clicking the ellipsis option menu for an uploaded file and viewing the object details.
      Note: If you receive an error message saying that atpc_user does not have read/write privileges on the object store, set up the required user privileges or contact your administrator to do so.
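
      A minimal sketch of such a script (assuming the SH CHANNELS table as the target and a pipe-delimited chan_v3.dat file staged in the bucket; substitute your own table, region, tenant, bucket, and file names):

        begin
          DBMS_CLOUD.copy_data (
            table_name      => 'CHANNELS',
            credential_name => 'OBJ_STORE_CRED',
            file_uri_list   => 'https://swiftobjectstorage.<region name>.oraclecloud.com/v1/<tenant name>/<bucket name>/chan_v3.dat',
            format          => json_object('delimiter' value '|')  -- the SH data files are pipe-delimited
          );
        end;
        /

      Repeat the call once per target table and data file.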
      For example, if file_uri_list points to an object that does not exist (here the file name is missing from the end of the URL), copy_data raises an error such as:

      ORA-20404: Object not found - https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/lgcnscorp/tutorial_load_atpc/
      ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 757
      ORA-06512: at "C##CLOUD$SERVICE.DBMS_CLOUD", line 1879
      ORA-06512: at line 1
  2. After you run the procedure, observe that the data has been copied from the object store to the tables in your Autonomous Transaction Processing database.

 

All data load operations done using the PL/SQL package DBMS_CLOUD are logged in the tables dba_load_operations and user_load_operations. These tables contain the following:

  • dba_load_operations: shows all load operations.
  • user_load_operations: shows the load operations in your schema.
  1. Query these tables to see information about ongoing and completed data loads. For example:
    SELECT table_name, owner_name, type, status, start_time, update_time,
           rows_loaded, logfile_table, badfile_table
      FROM user_load_operations
     WHERE type = 'COPY';
  2. Examine the results. The log and bad files are accessible as tables:
    TABLE_NAME  STATUS     ROWS_LOADED  LOGFILE_TABLE  BADFILE_TABLE
    ----------  ---------  -----------  -------------  -------------
    CHANNELS    FAILED                  COPY$1_LOG     COPY$1_BAD
    CHANNELS    COMPLETED            5  COPY$2_LOG     COPY$2_BAD
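
    For example, to see why the failed load was rejected, you can query its log table directly (the table name COPY$1_LOG comes from the output above):

        SELECT * FROM COPY$1_LOG;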

Next Tutorial 

Using Oracle Machine Learning with Autonomous Data Warehouse Cloud Service

 
