Since I have previously written a PL/SQL API for Amazon S3, uploading your backup files to S3 is not very difficult. The high-level steps are:
- Create an export (dump file) of your database schema using DBMS_DATAPUMP (or the DATAPUMP_UTIL_PKG wrapper described in my last blog post); see the sketch after this list.
- Save the export file to disk on the server.
- Upload the export file to S3.
- (Optionally) Delete the export file from the server.
- (Optionally) Send an email notification indicating whether the upload was successful.
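For reference, the export itself (the first two steps) can be done with DBMS_DATAPUMP directly. Here is a minimal sketch; YOUR_ORACLE_BACKUP_DIR is a placeholder for a directory object you have created, and the file names are arbitrary. The wrapper packages handle this, and more, for you:

declare
  l_handle number;
begin
  -- open a schema-mode export job for the current schema
  l_handle := dbms_datapump.open (operation => 'EXPORT', job_mode => 'SCHEMA');
  -- write the dump file and a log file to a directory object on the server
  dbms_datapump.add_file (handle => l_handle, filename => 'my_schema.dmp', directory => 'YOUR_ORACLE_BACKUP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
  dbms_datapump.add_file (handle => l_handle, filename => 'my_schema.log', directory => 'YOUR_ORACLE_BACKUP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
  -- start the job and detach; it runs in the background
  dbms_datapump.start_job (l_handle);
  dbms_datapump.detach (l_handle);
end;
/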
In the "demos" folder of the Alexandria Utility Library for PL/SQL, you can find a package called DATAPUMP_CLOUD_PKG, which does the work described above in a procedure called BACKUP_SCHEMA_TO_S3.
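Before scheduling anything, you may want to test the procedure with a one-off call. This is just the job definition from below, unwrapped; all values are placeholders for your own:

begin
  datapump_cloud_pkg.backup_schema_to_s3 ('your_amazon_id', 'your_amazon_password', 'your_amazon_bucket', 'your_amazon_folder', 'YOUR_ORACLE_BACKUP_DIR', 'your_backup_file_name_' || to_char(sysdate, 'yyyymmddhh24miss'), p_email_failure => 'your_email_address@domain', p_email_success => 'your_email_address@domain', p_version => '10.2', p_gmt_offset => 0);
end;
/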
Here is how you can set up a call to DBMS_JOB to schedule regular backups of your database schema to Amazon S3 (note the doubled single quotes around the string values, since the procedure call is itself embedded in a string):
declare
  l_job  number;
  l_what varchar2(4000);
begin
  -- the procedure call is stored as a string, which is why every string
  -- literal inside it needs doubled single quotes
  l_what := 'datapump_cloud_pkg.backup_schema_to_s3(''your_amazon_id'', ''your_amazon_password'', ''your_amazon_bucket'', ''your_amazon_folder'', ''YOUR_ORACLE_BACKUP_DIR'', ''your_backup_file_name_'' || to_char(sysdate, ''yyyymmddhh24miss''), p_email_failure => ''your_email_address@domain'', p_email_success => ''your_email_address@domain'', p_version => ''10.2'', p_gmt_offset => 0);';
  -- run the job at 23:00 at the end of each day
  dbms_job.submit (job => l_job, what => l_what, next_date => sysdate, interval => 'trunc(sysdate + 1) + 23/24');
  commit;
end;
/
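After submitting, you can verify that the job has been scheduled, and trigger an immediate run in your own session to check that the whole pipeline works:

select job, what, next_date, interval from user_jobs;

begin
  -- runs the job synchronously; replace 123 with the job number from the query
  dbms_job.run(123);
end;
/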
Security Tip: You may not want to expose your Amazon password in the job definition. To reduce the risk, you can use the AWS Identity and Access Management (IAM) tools to define a separate backup user that has only upload ("put") permissions on your S3 folder, and no privileges to list, download, or delete files. This limits the damage that someone with a stolen password can do.
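For reference, here is a minimal sketch of such an IAM policy, using the placeholder bucket and folder names from the job example above; it grants the backup user nothing but s3:PutObject on the backup folder:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::your_amazon_bucket/your_amazon_folder/*"
    }
  ]
}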