If you’ve worked with Autonomous Database before, you know that every now and then you need to get files from, or put files onto, the Autonomous Database so they can be processed.

For example, when you migrate data using Oracle Data Pump, the log files are written on the Autonomous Database side. Typically I’ve used OCI Object Storage for this, which is also available with ADB@GCP. However, in many implementations on Azure and GCP, I’ve seen that people don’t want to use OCI services at all, since everything else already runs within Azure or GCP.

Can we use GCP Cloud Storage for this with ADB? Short answer – ABSOLUTELY, and it’s really simple!

I’ll include an example of the setup below.

Setup High Level Summary

To set up access from ADB to Cloud Storage we need the following:

  • Create a Service Account in ADB (yes, you create the GCP service account in ADB!)
  • Set up a GCP Role (or use a pre-existing one) which will be assigned to our service account
  • Assign this Service Account in your GCP project and grant the necessary role to it
  • Create a Bucket and add the necessary principal authentication for this Service Account

Setting things up

We’ll start by creating the service account on my Autonomous Database, which is running on GCP. I won’t go into the specifics of provisioning the ADB; I’ve had some posts about that in the past.

Creating the service account is straightforward:

BEGIN
    DBMS_CLOUD_ADMIN.ENABLE_PRINCIPAL_AUTH(provider => 'GCP');
END;
/

That creates the service account. We’ll need its details so we can reference it on the GCP side.

select * from cloud_integrations where param_name like 'gcp%';

PARAM_NAME          PARAM_VALUE                                                
------------------- ---------------------------------------------------------- 
gcp_service_account GCP-SA-43501-58393@gcp-pa-project1.iam.gserviceaccount.com 


Elapsed: 00:00:00.007
1 rows selected.

PARAM_VALUE is what we will need on the GCP side when we assign the service account.

Now we need to create the Role that will be assigned to our service account/principal. In the GCP Console, navigate to IAM & Admin => Roles. I’ll create the role ADB-GCP-Role:

Giving necessary permissions for my IAM Role in GCP

I’ve given this role the Cloud Storage User permissions. Next, I’ll go to my bucket and grant access on the bucket so the service account with that role can do what it needs to.

Assigning principal (service account) and role to the principal for the bucket access

Now that the principal (service account) has a role on that bucket, we’re ready to try it out. I’ll log in to SQL Developer on my ADB and try to download a file from Cloud Storage to the Autonomous Database DATA_PUMP_DIR directory.

BEGIN
    DBMS_CLOUD.GET_OBJECT(
        credential_name => 'GCP$PA',
        object_uri      => 'https://gcpadb-bucket.storage.googleapis.com/file.txt',
        directory_name  => 'DATA_PUMP_DIR');
END;
/

PL/SQL procedure successfully completed.

Elapsed: 00:00:00.542

SELECT * FROM DBMS_CLOUD.LIST_FILES ( 'DATA_PUMP_DIR' );

OBJECT_NAME BYTES CHECKSUM CREATED                     LAST_MODIFIED               
----------- ----- -------- --------------------------- --------------------------- 
file.txt       32 null     2025-04-30T14:10:26.947559Z 2025-04-30T14:10:27.040355Z 


Elapsed: 00:00:00.024
1 rows selected.

A few notes on the above: the credential name can be found in DBA_CREDENTIALS, and the object URI is the virtual-hosted style URL, which you can copy from the file details in the GCP Console.
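For reference, here is a quick sketch of how to look up the credential that principal authentication created (the LIKE filter is just an assumption to narrow things down; DBA_CREDENTIALS also lists any other credentials you’ve defined):

SELECT credential_name, enabled
  FROM dba_credentials
 WHERE credential_name LIKE 'GCP%';

In my case this shows the GCP$PA credential used in the examples.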

Then use DBMS_CLOUD.LIST_FILES to see the files you’ve downloaded.

Putting files from ADB to Cloud Storage works the same way, just with a different procedure: DBMS_CLOUD.PUT_OBJECT is the one I use more frequently during migrations.

BEGIN
    DBMS_CLOUD.PUT_OBJECT(
        credential_name => 'GCP$PA',
        object_uri      => 'https://gcpadb-bucket.storage.googleapis.com/file.txt',
        directory_name  => 'DATA_PUMP_DIR',
        file_name       => 'file.txt');
END;
/

PL/SQL procedure successfully completed.

Elapsed: 00:00:00.125

After running that command, I can see the file uploaded to Cloud Storage.
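If you’d rather verify from the database side instead of the console, DBMS_CLOUD.LIST_OBJECTS can list the bucket contents (a sketch using the same example bucket and credential as above):

SELECT object_name, bytes
  FROM DBMS_CLOUD.LIST_OBJECTS(
         credential_name => 'GCP$PA',
         location_uri    => 'https://gcpadb-bucket.storage.googleapis.com/');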

Summary

Even though this was just about getting files from and to ADB@GCP using GCP Cloud Storage, I think it’s a good example of how we can start leveraging GCP services together with Oracle Database.

You could connect even some of the GCP AI services to ADB! (I just had to mention AI here :))

The database is placed within your GCP region, and having your data available to GCP services is just awesome: low latency and secure connectivity, without the data needing to traverse between different cloud providers.
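One cleanup note: if you no longer need the integration, principal authentication can be disabled again, which removes the service account (a sketch; check the DBMS_CLOUD_ADMIN documentation for details before running it):

BEGIN
    DBMS_CLOUD_ADMIN.DISABLE_PRINCIPAL_AUTH(provider => 'GCP');
END;
/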
