Comments on "Amazon S3 API for PL/SQL" (from the blog "ORA-00001: Unique constraint violated" by Morten Braten)

---
Morten Braten (2023-10-11):
@PC: The "duplicate headers" error is due to a change at Amazon. You can find the issue discussed (and the fix you need to make) here: https://github.com/mortenbra/alexandria-plsql-utils/issues/83

---
PC (2023-09-20):
I'm getting this error in every function. What have I missed?

Error at line 1
ORA-20000: Your request contains duplicate headers.
ORA-06512: at "AMAZON_AWS_S3_PKG", line 41
ORA-06512: at "AMAZON_AWS_S3_PKG", line 95
ORA-06512: at "AMAZON_AWS_S3_PKG", line 608
ORA-06512: at "AMAZON_AWS_S3_PKG", line 656
ORA-06512: at line 5

---
Daniel (2021-10-04):
Hi Morten, how are you? I know this post is a little old, but today I need to perform the AWS S3 integration using the packages you provided.
I do everything as described in your tutorial, but I get this error:

ORA-28239: no key provided
ORA-06512: at dbms_crypto_ffi
ORA-06512: at dbms_crypto

If you still read these comments and can help me, I would appreciate it.

Daniel
daniel@asinc.com.br

---
Anonymous (2021-09-06):
Hi Morten,
Would you be able to help with connecting to the Amazon Seller Partner API using PL/SQL? It would be a great help.

Thank you.

---
Anonymous (2020-08-17):
Hi Morten,
This is really helpful. Could you please advise whether we can use this package to transfer files from an RDS directory to an S3 bucket that is in a different account?

---
Morten Braten (2020-03-23):
@Nisha K: There is no concept of folders in S3, but you can use slashes in key names, which some clients present as folders; see http://stackoverflow.com/questions/1939743/amazon-s3-boto-how-to-create-folder

So using the PL/SQL API you would simply do it like this:

```sql
amazon_aws_s3_pkg.new_object ('my-bucket-name', 'my-new-folder/some-subfolder/the-file-name.gif', l_blob, 'image/gif');
```

- Morten

---
Nisha K (2020-03-17):
Hey, this blog is really awesome and was really useful in our project. But we are stuck creating subfolders within S3 using PL/SQL, so could you please tell us how to create a subfolder inside an S3 bucket? It would be really helpful.

Thanks and regards,
Nisha K

---
Michael Jernigan (2018-10-01):
How do you install without APEX?
AMAZON_AWS_S3_PKG gives this error:

Error: PLS-00201: identifier 'WWV_FLOW_UTILITIES.URL_ENCODE2' must be declared

---
Ed Siegle (2018-08-29):
I was having a problem writing a file to a subdirectory of a bucket. The following was giving errors:

```sql
amazon_aws_s3_pkg.new_object ('bucket_name/directory_name',
                              'test.csv',
                              l_blob,
                              'text/csv');
```

Once I realized that directories are in actuality not buckets, the solution was straightforward:

```sql
amazon_aws_s3_pkg.new_object ('bucket_name',
                              'directory_name/test.csv',
                              l_blob,
                              'text/csv');
```

This is the URL that it builds:
https://bucket_name.s3.amazonaws.com/directory_name/test.csv

---
Ed Siegle (2018-08-28):
I need to write to a bucket with a subdirectory, but when I attempt to do so I get the error below:

ORA-12545: Connect failed because target host or object does not exist

What would be the syntax to do this?
```sql
amazon_aws_s3_pkg.new_object ('bucket/directory',
                              'test.csv',
                              l_blob,
                              'text/csv');
```

---
Brian McGinity (2018-08-18):
Also, there is a bug in the get_object_list procedure.

When fetching multiple rows, if p_next_continuation_token contains certain characters, it errors with: "ORA-20000: The continuation token provided is incorrect". This is because utl_url.escape() does not work correctly (really this is an Oracle bug which has never been fixed).
To fix this, change this:

```sql
l_clob := make_request (get_url (p_bucket_name) || '?list-type=2&continuation-token=' || utl_url.escape (p_next_continuation_token) || '&max-keys=' || p_max_keys || '&prefix=' || utl_url.escape (p_prefix), 'GET', l_header_names, l_header_values, null);
```

to:

```sql
l_clob := make_request (get_url (p_bucket_name) || '?list-type=2&continuation-token=' || eUrl (p_next_continuation_token) || '&max-keys=' || p_max_keys || '&prefix=' || eUrl (p_prefix), 'GET', l_header_names, l_header_values, null);
```

and use this function:

```sql
create or replace function eUrl (p_str varchar2) return varchar2
is
  l_str  varchar2(32000) := regexp_replace (p_str, '[^[:print:]]');
  l_tmp  varchar2(32000);
  l_len  number default length (l_str);
  l_bad  varchar2(100) default ' >%}\~];?@&<#{|^[`/:=$+''"' || chr(10);
  l_char char(1);
begin
  if l_str is null then
    return null;
  end if;

  for i in 1 .. l_len loop
    l_char := substr (l_str, i, 1);
    if instr (l_bad, l_char) > 0 then
      -- percent-encode the character as %XX
      l_tmp := l_tmp || '%' || to_char (ascii (l_char), 'fm0X');
    else
      l_tmp := l_tmp || l_char;
    end if;
  end loop;

  return l_tmp;
end;
/
```

---
Brian McGinity (2018-08-18):
Great package. Very useful.
To make this work over HTTPS, install stunnel (https://www.stunnel.org/) and add this to its config:

```
[AWS-S3]
client = yes
accept = 127.0.0.1:1921
connect = s3.amazonaws.com:443
;sslVersion = TLSv1
;sslVersion = TLSv1.2
```

and then change:

```sql
g_aws_url_s3 constant varchar2(255) := 'http://s3.amazonaws.com/';
```

to:

```sql
g_aws_url_s3 constant varchar2(255) := 'http://localhost:1921/';
```

---
Anonymous (2018-05-09):
Hi Morten,
I did see issue 24. It sounds like everyone is having the same problem; namely, the answer to your question, "Have you implemented the full signing method as described in the Amazon docs for AWS signature version 4?", is "tried but did not succeed". So I think everyone was rather hoping you might have done it successfully. I certainly tried building the signature, but getting its complexity right all the way to the end was a problem.

---
Morten Braten (2018-04-26):
@Anonymous: Please see this issue regarding the AWS signature v4:

https://github.com/mortenbra/alexandria-plsql-utils/issues/24

- Morten

---
Anonymous (2018-04-23):
Hi Morten,
Writing files to S3 now requires sigv4. Replacing dbms_crypto.hmac_sh1 with hmac_sh256 doesn't seem to do the trick, as even a GET fails with "The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.". Looking at the sigv4 docs, it seems that what gets signed is also different, not just how it gets signed. I have tried to tinker with the code, but am not getting far. Any chance you might be working on fixing up the package so it can be operational again?
Thank you!

---
Morten Braten (2018-04-12):
@Anonymous: Downloading an object from S3 is explained in the "Downloading objects" section. You call amazon_aws_s3_pkg.get_object() and it returns a BLOB. You can do whatever you want with this BLOB, including saving it to a directory on your database server using utl_file.

- Morten
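[Editor's note] The get_object-plus-utl_file flow described above can be sketched as follows. This is a minimal, hedged sketch, not code from the library's documentation: the bucket name, key, directory object MY_DIR, and file name are all placeholders, and it assumes the Alexandria packages are installed and that a suitable directory object exists with write access.

```sql
declare
  l_blob   blob;
  l_file   utl_file.file_type;
  l_buffer raw(32767);
  l_amount binary_integer := 32767;
  l_pos    integer := 1;
begin
  -- fetch the object from S3 as a BLOB
  l_blob := amazon_aws_s3_pkg.get_object ('my-bucket-name', 'some-folder/the-file.pdf');

  -- write the BLOB to a directory object on the database server
  -- (MY_DIR is a placeholder: create directory my_dir as '/some/path';)
  l_file := utl_file.fopen ('MY_DIR', 'the-file.pdf', 'wb', 32767);

  begin
    loop
      -- read the next chunk; dbms_lob.read raises NO_DATA_FOUND past the end
      dbms_lob.read (l_blob, l_amount, l_pos, l_buffer);
      utl_file.put_raw (l_file, l_buffer, true);
      l_pos := l_pos + l_amount;
    end loop;
  exception
    when no_data_found then
      null; -- end of BLOB reached
  end;

  utl_file.fclose (l_file);
end;
/
```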
---
Anonymous (2018-03-26):
Hi Morten,
RDS now supports utl_file and utl_tcp and all that. If you have an object in S3 and want to push it into an Oracle directory in RDS (which is now also allowed), I'm struggling to adapt what you have in order to do that. Have you thought about giving this utility a once-over, now that what Oracle can do in RDS has been extended a fair bit? Pulling files into the database from S3 that way would be super helpful.
Thank you!

---
Karavan (2017-02-19):
Very useful library.
Thank you for your work! :)

---
Morten Braten (2016-07-23):
@ILSabbe: Thanks, glad the package is useful to you.

To do a minimal install, first install the "core" packages and types:

https://github.com/mortenbra/alexandria-plsql-utils/blob/master/setup/install_core.sql

Then install the Amazon (S3) packages:

https://github.com/mortenbra/alexandria-plsql-utils/blob/master/setup/install_amazon.sql

- Morten
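[Editor's note] The two-step minimal install described above can be run from SQL*Plus. A sketch, assuming you have a local clone of the repository, are in its root directory, and are connected as the schema that should own the packages; the sanity-check query at the end is a suggestion, not part of the install scripts:

```sql
-- run the two install scripts named above, in order
@setup/install_core.sql
@setup/install_amazon.sql

-- suggested sanity check: the S3 package and its body should be VALID
select object_name, object_type, status
  from user_objects
 where object_name = 'AMAZON_AWS_S3_PKG';
```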
---
ILSabbe (2016-07-15):
Hi Morten, wow, what a great job!

It is like having a superhero, now that I need to migrate around 500,000 BLOBs from Oracle cmsdk to S3 :)

I just want to know whether it is possible to install ONLY the amazon_s3-related packages, and which of the others are needed. I suppose the types script is mandatory; which others?

Thanks a lot, I will let you know how it goes :)

---
Morten Braten (2016-03-30):
@Anonymous: See this issue for more info about buckets in the Frankfurt AWS region: https://github.com/mortenbra/alexandria-plsql-utils/issues/24

- Morten