Wednesday, January 16, 2013

PL/SQL and Microsoft technology

Many companies have mixed environments, with Oracle databases (and PL/SQL) running business-critical back-office systems, and Microsoft products covering front-office needs such as email, word processing and collaboration/file sharing (i.e. Outlook, Office and Sharepoint), as well as the desktop operating system itself (Windows, with Internet Explorer and Internet Information Server, all integrated via Active Directory).



Working in these mixed environments means that you, as an Oracle PL/SQL developer, frequently need to integrate with various Microsoft technologies directly from PL/SQL.

Over the last couple of years, I've written a number of blog posts on this topic. This post is just a convenient collection of links to these previous posts:


PL/SQL and NTLM

Oracle Data Provider for .NET (ODP.NET)

Oracle Application Express (Apex) and IIS

Microsoft Office and Office Open XML (OOXML)

Microsoft Exchange (Email, Calendar and Contacts)

Microsoft Sharepoint

  • I have not yet had the need (or the desire...) to work with Sharepoint from PL/SQL, but it would be quite easy to create a PL/SQL library package to interface with the Sharepoint web services, similar to the Exchange package listed above. Perhaps I will revisit this topic sometime in the future...?


Tuesday, November 6, 2012

Apex Plugin: Execute PL/SQL code and return content to page

Here is a Dynamic Action plugin for Oracle Application Express (Apex) that lets you execute PL/SQL code in the database, and then send output generated on the server (via calls to the "HTP.P" procedure) back to any element on the web page.

A more specific use case would be to respond to a button click by generating a fragment of HTML using PL/SQL, and then refreshing a DIV on the page with that dynamically generated content. See the screenshot below for an example; notice that part of the first region has been updated with content generated on the server (using input from the user, in this case the name from a regular text item). Every time the user clicks the button, the line above it is updated via an Ajax call to the server.

The second region illustrates another use case: Automatically refreshing part of the page at a fixed interval.



This is how the page has been set up:



The first region has a standard text item and a button. The button has a Dynamic Action that fires when it is clicked and executes the PL/SQL code that generates the greeting.
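
For illustration, the PL/SQL behind this dynamic action could look something like the sketch below. This is an assumption about the code, not the exact source from the example; the page item name P1_NAME is hypothetical, and whatever you write with HTP.P is what the plugin returns to the page element:

begin
  -- build the HTML fragment that the plugin sends back to the page;
  -- escape the user input before embedding it in the markup
  htp.p('Hello, ' || sys.htf.escape_sc(:P1_NAME) || '! The time on the server is now '
        || to_char(sysdate, 'hh24:mi:ss') || '.');
end;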

The dynamic action is set up as follows:



We can see that the content generated by the PL/SQL code is returned into a page element called "my_div". This div is simply included as the static content of the HTML region:



The second dynamic action has been set up to trigger on "Page Load". The plugin is set up to call the PL/SQL code every 5 seconds (nice way to hammer your database with requests...) and return the content into another div, called "my_other_div". The mode has been set to "Append after", which means the content will be added to the end of the existing content, instead of replacing it.
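
The PL/SQL for this second dynamic action can be as trivial as writing a timestamp; a sketch (the markup is just an example):

begin
  -- each call appends one more line to "my_other_div"
  htp.p('Refreshed from the database at ' || to_char(systimestamp, 'hh24:mi:ss') || '<br/>');
end;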



Here is the setup for the second region:



Conclusion: This plugin combines the flexibility of Dynamic Actions with the power of PL/SQL for dynamic content generation. You can download the plugin here. Update 22.05.2016: This plugin is now in my plugin repository at GitHub.




Monday, August 13, 2012

Database (schema) backup to the cloud with PL/SQL

In my last blog post I described using the DBMS_DATAPUMP and DBMS_JOB packages to automate database schema backups from the database itself, rather than using batch files to do so. I also mentioned that "since the files are written to the database server itself, you need some kind of process to move the files to another server or storage location, so the backups don't disappear along with the database if the database server itself is somehow lost or corrupted. [One] option is to upload the backup file from the database to some online ("cloud") storage service, such as Amazon S3."

Since I have previously written a PL/SQL API for Amazon S3, uploading your backup files to S3 is not very difficult. The high-level steps are:


  1. Create an export (dump file) of your database schema using DBMS_DATAPUMP (or the DATAPUMP_UTIL_PKG wrapper described in my last blog post).
  2. Save the export file to disk on the server.
  3. Upload the export file to S3.
  4. (Optionally) Delete the export file from the server.
  5. (Optionally) Send an email notification indicating if the upload was successful or not.
In the "demos" folder of the Alexandria Utility Library for PL/SQL you can find a package called DATAPUMP_CLOUD_PKG which does the work described above in a procedure called BACKUP_SCHEMA_TO_S3.

Here is how you can set up a call to DBMS_JOB to schedule regular backups of your database schema to Amazon S3 (note that the single quotes inside the job's WHAT string must be doubled):

declare
  l_job number;
  l_what varchar2(4000);
begin

  l_what := 'datapump_cloud_pkg.backup_schema_to_s3(''your_amazon_id'', ''your_amazon_password'', ''your_amazon_bucket'', ''your_amazon_folder'', ''YOUR_ORACLE_BACKUP_DIR'', ''your_backup_file_name_'' || to_char(sysdate, ''yyyymmddhh24miss''), p_email_failure => ''your_email_address@domain'', p_email_success => ''your_email_address@domain'', p_version => ''10.2'', p_gmt_offset => 0);';

  -- run job at the end of each day

  dbms_job.submit (job => l_job, what => l_what, next_date => sysdate, interval => 'trunc(sysdate + 1) + 23/24');

  commit;

end;
/

Security Tip: You may not like to expose your Amazon password in the job definition. What you can do to reduce the risk is to use the AWS Identity and Access Management (IAM) tools to define a separate backup user which only has upload ("put") permissions on your S3 folder, and no privileges to list file contents, delete or download files. This will limit the damage that someone with a stolen password can do.

Monday, July 23, 2012

Simple database (schema) backup using PL/SQL

Usually, database backups are the responsibility of DBAs, and not something that developers really care or think too much about. However, if you are a lone developer, or part of a small team that doesn't have the luxury of having a dedicated DBA, then you might have to deal with backups yourself.

There are several ways to make Oracle backups; what I will be concentrating on here is the "Data Pump Export" method. You may already be familiar with the command-line "expdp" command which allows you to create a dump (.dmp) file containing your database objects (schemas, tables, procedures, etc.).

To perform a regular backup, you could create a simple batch file that runs "expdp" and schedule it using the Windows Task Scheduler or a Unix cron job. While this certainly works, it means you have one more "moving part" to think about (the OS scheduler), and you may have to hardcode database passwords in the batch file unless you use OS authentication.

However, the "expdp" command-line utility is actually just a "front-end" to a PL/SQL package called DBMS_DATAPUMP, as can be seen in this diagram:



This means that we can call the DBMS_DATAPUMP package directly from our own PL/SQL code to create dump files, and use DBMS_JOB to schedule this backup at regular intervals. We avoid the dependence on batch files, and we don't have to expose any passwords.

The API for DBMS_DATAPUMP is flexible and, as mentioned, supports everything that can be done from the command line (both exports and imports), but it can sometimes be a bit tricky to get the (filtering) syntax correct.
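
For reference, a bare-bones schema export written directly against DBMS_DATAPUMP might look like the sketch below. The directory, file and job names are placeholders, and error handling is omitted for brevity:

declare
  l_handle    number;
  l_job_state varchar2(30);
begin
  -- create a schema-mode export job
  l_handle := dbms_datapump.open (operation => 'EXPORT', job_mode => 'SCHEMA', job_name => 'MY_EXPORT_JOB');
  -- write the dump file and log file to an existing directory object
  dbms_datapump.add_file (l_handle, 'my_schema.dmp', 'DEVTEST_TEMP_DIR');
  dbms_datapump.add_file (l_handle, 'my_schema.log', 'DEVTEST_TEMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
  -- export only the current schema
  dbms_datapump.metadata_filter (l_handle, 'SCHEMA_EXPR', 'IN (''' || user || ''')');
  dbms_datapump.start_job (l_handle);
  -- block until the job has finished
  dbms_datapump.wait_for_job (l_handle, l_job_state);
end;
/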

I've written a wrapper package called DATAPUMP_UTIL_PKG, which you can download as part of the Alexandria PL/SQL Utility Library. This package makes it really easy to perform the most common use case: Exporting a single schema to a file on disk. This can be accomplished with the following code:



-- export current schema to file, use default file name, and make the export compatible with XE 10g
-- include a custom message


begin
  debug_pkg.debug_on;
  datapump_util_pkg.export_schema_to_file ('DEVTEST_TEMP_DIR', p_version => '10.2', p_log_message => 'it is possible to include custom messages in the log');
end;
/

To schedule this as a regular job, use the following code:

declare
  l_job number;
  l_what varchar2(4000);
begin

  l_what := 'datapump_util_pkg.export_schema_to_file(''DEVTEST_TEMP_DIR'', p_version => ''10.2'');';

  -- run job at the end of each day

  dbms_job.submit (job => l_job, what => l_what, next_date => sysdate, interval => 'trunc(sysdate + 1) + 23/24');

  commit;

end;
/

For this to work, you also need to create a directory object that points to a folder on the database server:

-- run as SYS
create directory devtest_temp_dir as 'c:\some_folder\backup\';
grant read, write on directory devtest_temp_dir to your_database_schema;


Obviously, since the files are written to the database server itself, you need some kind of process to move the files to another server or storage location, so the backups don't disappear along with the database if the database server itself is somehow lost or corrupted. Unlike the "hot" database files, these dump files can be safely copied by normal backup software.

Another option is to upload the backup file from the database to some online ("cloud") storage service, such as Amazon S3. I will talk about that in my next blog post...

PS. Don't forget to regularly test a restore (import) of your backup files. The only thing worse than not having a backup is having a backup that can't be restored...!
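
A restore test can be done with the "impdp" command-line utility, or directly from PL/SQL with a minimal DBMS_DATAPUMP import along the lines of the sketch below. The file, directory and schema names are placeholders; here the dump is imported into a separate scratch schema (which must already exist, and which requires sufficient import privileges), so the original data is left alone:

declare
  l_handle    number;
  l_job_state varchar2(30);
begin
  -- create a schema-mode import job that reads the dump file created earlier
  l_handle := dbms_datapump.open (operation => 'IMPORT', job_mode => 'SCHEMA', job_name => 'MY_IMPORT_TEST');
  dbms_datapump.add_file (l_handle, 'my_schema.dmp', 'DEVTEST_TEMP_DIR');
  -- import into a scratch schema instead of the original one
  dbms_datapump.metadata_remap (l_handle, 'REMAP_SCHEMA', 'YOUR_SCHEMA', 'YOUR_RESTORE_TEST_SCHEMA');
  dbms_datapump.start_job (l_handle);
  dbms_datapump.wait_for_job (l_handle, l_job_state);
end;
/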

Saturday, May 26, 2012

MS Exchange API for PL/SQL

As mentioned in my earlier post, I've been working on a PL/SQL wrapper for the Microsoft Exchange Web Services (EWS) API. The code is now ready for an initial release!



Features


Using this pure PL/SQL package, you can not only search for and retrieve emails and download attachments, but also create emails and upload attachments to existing emails. You can move emails between folders and delete emails. You can read and create calendar items. You can get the email addresses of the people in a distribution (mailing) list, and more.

Prerequisites



  • You need access to an Exchange server, obviously! I've done my testing against an Exchange 2010 server, but it should also work against Exchange 2007.
  • If the Exchange server uses HTTPS, then you need to add the server's SSL certificate to an Oracle Wallet. See this page for step-by-step instructions.


Usage



The API should be fairly self-explanatory. The only thing you need to do is call the INIT procedure at least once per session (but remember that in Apex, each page view runs in a new database session, so place the initialization code in a before-header process or a page-zero process).


begin
  ms_ews_util_pkg.init('https://thycompany.com/ews/Exchange.asmx', 'domain\user.name', 'your_password', 'file:c:\path\to\Oracle\wallet\folder\on\db\server', 'wallet_password');
end;


Note also that most functions come in two varieties: a pipelined function intended for use from plain SQL, and a twin function suffixed with "AS_LIST" that returns a list intended for use from PL/SQL code.

Code Examples



-- resolve names


declare
  l_names ms_ews_util_pkg.t_resolution_list;
begin
  debug_pkg.debug_on;
  l_names := ms_ews_util_pkg.resolve_names_as_list('john');
  for i in 1 .. l_names.count loop
    debug_pkg.printf('name %1, name = %2, email = %3', i, l_names(i).mailbox.name, l_names(i).mailbox.email_address);
  end loop;
end;




-- resolve names (via SQL)


select *
from table(ms_ews_util_pkg.resolve_names ('john'))




-- expand distribution list


declare
  l_names ms_ews_util_pkg.t_dl_expansion_list;
begin
  debug_pkg.debug_on;
  l_names := ms_ews_util_pkg.expand_public_dl_as_list('some_mailing_list@your.company');
  for i in 1 .. l_names.count loop
    debug_pkg.printf('name %1, name = %2, email = %3', i, l_names(i).name, l_names(i).email_address);
  end loop;
end;




-- get folder


declare
  l_folder ms_ews_util_pkg.t_folder;
begin
  debug_pkg.debug_on;
  l_folder := ms_ews_util_pkg.get_folder (ms_ews_util_pkg.g_folder_id_inbox);
  debug_pkg.printf('folder id = %1, display name = %2', l_folder.folder_id, l_folder.display_name);
  debug_pkg.printf('total count = %1', l_folder.total_count);
  debug_pkg.printf('child folder count = %1', l_folder.child_folder_count);
  debug_pkg.printf('unread count = %1', l_folder.unread_count);
end;




-- find up to 3 items in specified folder


declare
  l_items ms_ews_util_pkg.t_item_list;
begin
  debug_pkg.debug_on;
  l_items := ms_ews_util_pkg.find_items_as_list('inbox', p_max_rows => 3);
  for i in 1 .. l_items.count loop
    debug_pkg.printf('item %1, subject = %2', i, l_items(i).subject);
  end loop;
end;




-- get folders under a predefined folder


select *
from table(ms_ews_util_pkg.find_folders('inbox'))




-- get items in predefined folder, and search subject


select *
from table(ms_ews_util_pkg.find_items('inbox', 'the search term'))




-- get items in user-defined folder


select *
from table(ms_ews_util_pkg.find_items('the_folder_id'))




-- get items in user-defined folder, by name


select *
from table(ms_ews_util_pkg.find_items(
             ms_ews_util_pkg.get_folder_id_by_name('Some Folder Name', 'inbox')
            )
          )




-- get item (email message)


declare
  l_item ms_ews_util_pkg.t_item;
begin
  debug_pkg.debug_on;
  l_item := ms_ews_util_pkg.get_item ('the_item_id', p_include_mime_content => true);
  debug_pkg.printf('item %1, subject = %2', l_item.item_id, l_item.subject);
  debug_pkg.printf('body = %1', substr(l_item.body,1,2000));
  debug_pkg.printf('length of MIME content = %1', length(l_item.mime_content));
end;




-- get item (calendar item)


declare
  l_item ms_ews_util_pkg.t_item;
begin
  debug_pkg.debug_on;
  l_item := ms_ews_util_pkg.get_item ('the_item_id', p_body_type => 'Text', p_include_mime_content => true);
  debug_pkg.printf('item %1, class = %2, subject = %3', l_item.item_id, l_item.item_class, l_item.subject);
  debug_pkg.printf('body = %1', substr(l_item.body,1,2000));
  debug_pkg.printf('length of MIME content = %1', length(l_item.mime_content));
  debug_pkg.printf('start date = %1, location = %2, organizer = %3', l_item.start_date, l_item.location, l_item.organizer_mailbox_name);
end;




-- create calendar item


declare
  l_item ms_ews_util_pkg.t_item;
begin
  debug_pkg.debug_on;
  l_item.subject := 'Appointment added via PL/SQL';
  l_item.body := 'Some text here...';
  l_item.start_date := sysdate + 1;
  l_item.end_date := sysdate + 2;
  l_item.item_id := ms_ews_util_pkg.create_calendar_item (l_item);
  debug_pkg.printf('created item with id = %1', l_item.item_id);
end;




-- create task item


declare
  l_item ms_ews_util_pkg.t_item;
begin
  debug_pkg.debug_on;
  l_item.subject := 'Task added via PL/SQL';
  l_item.body := 'Some text here...';
  l_item.due_date := sysdate + 1;
  l_item.status := ms_ews_util_pkg.g_task_status_in_progress;
  l_item.item_id := ms_ews_util_pkg.create_task_item (l_item);
  debug_pkg.printf('created item with id = %1', l_item.item_id);
end;




-- create message item


declare
  l_item ms_ews_util_pkg.t_item;
begin
  debug_pkg.debug_on;
  l_item.subject := 'Message added via PL/SQL';
  l_item.body := 'Some text here...';
  l_item.item_id := ms_ews_util_pkg.create_message_item (l_item, p_to_recipients => t_str_array('recipient1@some.company', 'recipient2@another.company'));
  debug_pkg.printf('created item with id = %1', l_item.item_id);
end;




-- update item
-- item id and change key can be retrieved with following query:
-- select item_id, change_key, subject, is_read from table(ms_ews_util_pkg.find_items('inbox'))


declare
  l_item_id varchar2(2000) := 'the_item_id';
  l_change_key varchar2(2000) := 'the_change_key';
begin
  ms_ews_util_pkg.update_item_is_read (l_item_id, l_change_key, p_is_read => true);
end;




-- get list of attachments


select *
from table(ms_ews_util_pkg.get_file_attachments('the_item_id'))




-- download and save 1 attachment


declare
  l_attachment ms_ews_util_pkg.t_file_attachment;
begin
  debug_pkg.debug_on;
  l_attachment := ms_ews_util_pkg.get_file_attachment ('the_attachment_id');
  file_util_pkg.save_blob_to_file('DEVTEST_TEMP_DIR', l_attachment.name, l_attachment.content);
end;




-- create attachment (attach file to existing item/email)


declare
  l_attachment ms_ews_util_pkg.t_file_attachment;
begin
  debug_pkg.debug_on;
  l_attachment.item_id := 'the_item_id';
  l_attachment.name := 'Attachment added via PL/SQL';
  l_attachment.content := file_util_pkg.get_blob_from_file('DEVTEST_TEMP_DIR', 'some_file_such_as_a_nice_picture.jpg');
  l_attachment.attachment_id := ms_ews_util_pkg.create_file_attachment (l_attachment);
  debug_pkg.printf('created attachment with id = %1', l_attachment.attachment_id);
end;



Download


The MS_EWS_UTIL_PKG package is included in the Alexandria Utility Library for PL/SQL.

Known Issues


The CREATE_ITEM functions don't seem to return the ID of the created item (the item itself does get created in Exchange). I think the issue is with the XML parsing of the returned results; I will look into this later.

Please report any bugs found via the issue list.

Friday, May 25, 2012

A friend in need: Flashback Query


"It is sayd, that at the nede the frende is knowen."
- William Caxton: "Sonnes of Aymon" (1489)
"I say, that when you screw up, Flashback Query is a friend indeed!"
- Morten Braten: "SQLs of Oracle" (2012)

The Flashback Query feature of Oracle (10g) can be really helpful when you run that UPDATE or DELETE statement and then, after the COMMIT, you discover that you made a mistake!

Wouldn't it be nice to be able to look back and see what the data looked like before you messed things up? With a Flashback Query you can do just that:


-- see the data as it was 10 minutes ago
select *
from emp
as of timestamp systimestamp - interval '10' minute
where empno = 7369


Or you can see a history of changes:


-- history of all changes last 20 minutes
select empno, ename, sal, job,
  versions_operation,
  versions_starttime,
  versions_endtime,
  versions_xid
from emp
versions between timestamp systimestamp - interval '20' minute and systimestamp
where empno = 7369
order by versions_endtime nulls last
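

Once you have identified the good values, the same AS OF syntax can be used to put them back. A sketch, using the standard EMP demo table; adapt it to your own table and the actual point in time:


-- restore a column value from 10 minutes ago
update emp
set sal = (select sal
           from emp as of timestamp systimestamp - interval '10' minute
           where empno = 7369)
where empno = 7369;

commit;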


I've found this useful several times lately, so I'm adding it to the blog as a quick reminder, as I always struggle to remember that funky interval syntax thingy.

Monday, May 21, 2012

jQGrid Integration Kit for PL/SQL and Apex version 1.2 available

It's been over two years since I released the initial version of the jQGrid Integration Kit for PL/SQL and Apex.



Since then, I've added some additional features and fixed a few bugs:


Version 1.1


  • Procedure include_static_files_once will make sure required JS and CSS files are correctly included
  • Support for "popup_lov" column edit type
  • Procedure render_grid supports p_read_only parameter that can be used to set read-only dynamically
  • Support for some jQGrid built-in formatters based on Oracle column data type
  • Lists of values (select list variety, not Popup LOV) now support bind variables and Static LOVs
  • Date parser handles dates and times
  • Bug Fix: JSON generator now handles NULL parameter values (Thanks to Matt Nolan for spotting this one)


Version 1.2


  • Popup LOV now handles Apex bind variables and arbitrary LOV column names
  • Support for jQGrid grouping (experimental)


The latest version can be downloaded here.

New installation: Run the install.sql script. Then run all scripts in the upg folder. Copy the files in the js folder to your web server.

Upgrade from previous version: Run all scripts in the upg folder. Copy the files in the js folder to your web server (overwriting any previous version of the same files; or copy to a different folder and change references accordingly).

Note on jQuery and jQGrid versions: The Integration Kit (via the include_static_files_once procedure) references a specific jQuery version (1.3.2). If your web page includes a more recent version of jQuery, you should modify/remove the reference in the Integration Kit to avoid conflicts.

The version of jQGrid bundled with the Integration Kit is version 3.6. At the time of writing, the latest jQGrid version available is 4.3.2; this has not been tested with the Integration Kit.

Have fun, and please put any bug reports into the project issue list.