I’ve updated my Delphi sample for using Amazon S3; this time I’ve put the exe up for download so anyone with an S3 account can try it.
If there is interest I might work this up into a more user-friendly utility, or perhaps someone would like to help with this.
It does many of the essentials: file upload and download, progress reporting with a cancel option, and creating, deleting and listing buckets and items.
http://www.itwriting.com/s3.php
Tim
Your main website http://www.itwriting.com and the sample page http://www.itwriting.com/s3.php do not seem to work (I get an error in Internet Explorer).
I’ve just tried them and they are working OK here – can you retry and let me know if you still have difficulty?
Tim
I can add a little more data to my previous post, in support of Pep’s earlier note: it seems that both the MAIN PAGE of the site and the http://www.itwriting.com/s3.php page are having problems with COMPRESSION.
Attempting to access these pages from any user agent that accepts compressed content (which is just about all of them, including Internet Explorer) fails with a decompression error. Access WITHOUT compression is fine (tested through the Synapse component; a sketch of the test is below)…
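For anyone who wants to reproduce the test, here is a minimal sketch using Synapse; it assumes the default THTTPSend headers, which (in the Synapse versions I know) do not request compressed content:

uses
  httpsend;

var
  http: THTTPSend;
begin
  http := THTTPSend.Create;
  try
    // THTTPSend sends no Accept-Encoding header by default,
    // so the server should reply with uncompressed content
    if http.HTTPMethod('GET', 'http://www.itwriting.com/s3.php') then
      http.Document.SaveToFile('s3page.html');
  finally
    http.Free;
  end;
end;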
Hope this helps…
I still can’t see any content on that page (http://www.itwriting.com/s3.php).
I’m puzzled by this; the site comes up fine here in both IE and Firefox. Still, I appreciate the feedback and I’ll see if I can work out what’s happening.
In the meantime, if you want to grab the code, it’s here:
S3 Delphi sample
Note that you need the Synapse libraries.
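If you are compiling it yourself, the sample pulls in the Synapse HTTP client and its date utilities; a typical uses clause would be something like this (unit names from the standard Synapse distribution):

uses
  httpsend,  // THTTPSend - the HTTP client behind the S3 calls
  synautil;  // helpers such as Rfc822DateTime, used for the Date header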
Tim
Thanks all. I’ve figured it out.
I wrote a small script that gets recent blog posts from WordPress in order to insert links to them on other pages. The script works OK, but I’d also configured WordPress to do gzip compression. A characteristic of PHP gzip compression is that it must be enabled at the start of the script, before any output is sent (unless it is configured in php.ini). Since I was including the WordPress script within other PHP pages, compression was being enabled after some content had already been sent. Result: garbage.
I didn’t see it because (I presume) my proxy server strips the request for gzip compression, so in my case the compressed content was never sent.
I’ve now disabled gzip compression in WordPress, which should fix the problem.
Tim
Thanks, this is what I was looking for.
I’d like to add a contribution for obtaining the ACL of an object.
It can also be useful for checking whether an object exists:
function TS3Storage.GetS3ObjectACL(BucketName: string; ObjectName: string; DestStream: TStream): boolean;
var
  sRequest: string;
  theResponse: TMemoryStream;
  sDate: string;
  sFinalAuth: string;
begin
  InitHttp;
  try
    // Request the ?acl sub-resource for the object
    sRequest := '/' + BucketName + '/' + ObjectName + '?acl';
    sDate := RFC822DateTime(Now);
    mhttp.Headers.Add('Date: ' + sDate);
    // Sign the request (GET, no content-MD5, no content-type)
    sFinalAuth := GetAuthString('GET', '', '', sDate, sRequest);
    mhttp.Headers.Add(sFinalAuth);
    // Note: the region endpoint is hard-coded here (see the note after the function)
    mhttp.HTTPMethod('GET', FHttpPrefix + 's3-eu-west-1.amazonaws.com' + sRequest);
    theResponse := mhttp.Document;
    theResponse.Position := 0;
    Result := UpperCase(mhttp.ResultString) = 'OK';
    if Result then
    begin
      // Success: copy the ACL XML into the caller's stream
      theResponse.SaveToStream(DestStream);
      FError.Clear;
    end
    else
    begin
      // Failure: keep the error body and status line for diagnosis
      FError.LoadFromStream(theResponse);
      FError.Insert(0, 'Http Result: ' + mhttp.ResultString);
    end;
  finally
    FreeAndNil(mhttp);
  end;
end;
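As a usage sketch, here is how the function might be called to check whether an object exists; the bucket and key names are placeholders, and it assumes a TS3Storage instance named Storage (illustrative) that has already been created and configured with credentials:

var
  ACLStream: TMemoryStream;
begin
  ACLStream := TMemoryStream.Create;
  try
    // True means the object exists and ACLStream now holds its ACL XML;
    // False means the request failed (e.g. 404 NoSuchKey) and FError has details
    if Storage.GetS3ObjectACL('mybucket', 'path/to/object.txt', ACLStream) then
      ACLStream.SaveToFile('acl.xml');
  finally
    ACLStream.Free;
  end;
end;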
It would be useful to define a parameter to indicate the S3 region, something like the sketch below.
For example, in my case: s3-eu-west-1 (s3-eu-west-1.amazonaws.com).
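A minimal sketch of how that might look, assuming a FRegion field is added to TS3Storage (the field and method names here are illustrative, not part of the original sample):

function TS3Storage.GetS3Host: string;
begin
  if FRegion = '' then
    Result := 's3.amazonaws.com'                   // default US endpoint
  else
    Result := 's3-' + FRegion + '.amazonaws.com';  // e.g. s3-eu-west-1
end;

The hard-coded host in GetS3ObjectACL (and the other request methods) could then be replaced with a call to GetS3Host.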