
After posting my sample implementation of accessing Amazon Simple Storage Service (S3) via Silverlight, I reflected on it a bit and also chatted with some AWS engineers.

Cross-domain Policy

One thing you should never do is blindly deploy a global clientaccesspolicy.xml file.  Oftentimes in samples, we (I) do exactly that.  To be honest, I need to be better about this guidance, so I’ll start here.  As an example, for the S3 cross-domain policy file, we really should add some additional attributes to make it more secure.  Since we know it is a SOAP service, we can ratchet the requests down a little bit by adding the http-request-headers restriction, like this:

<?xml version="1.0" encoding="utf-8" ?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from http-request-headers="SOAPAction,Content-Type">
        <domain uri="*"/>
      </allow-from>
      <grant-to>
        <resource include-subpaths="true" path="/"/>
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>

Additionally (and ideally) we’d be hosting our application from a known domain.  In this instance, let’s say I was going to host my application at the root of timheuer.com.  I would then restrict the allow-from list to that domain and complete my policy like this:

<?xml version="1.0" encoding="utf-8" ?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from http-request-headers="SOAPAction,Content-Type">
        <domain uri="http://timheuer.com"/>
      </allow-from>
      <grant-to>
        <resource include-subpaths="true" path="/"/>
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>

Of course, if I had a cool application and others wanted to embed it, I could simply list additional domains in that allow list as well (a sketch of that is below).  But restricting it makes sense if you want to provide secure access to your APIs as a service provider, and it also protects you when doing things like this sample.
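For illustration, a policy that allows a couple of additional host domains might look like the sketch below.  The extra domain entries here are purely hypothetical placeholders; you would substitute the actual domains that are allowed to host or embed your application.

<?xml version="1.0" encoding="utf-8" ?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from http-request-headers="SOAPAction,Content-Type">
        <!-- each domain that may host/embed the application gets its own entry -->
        <domain uri="http://timheuer.com"/>
        <!-- hypothetical additional domains, placeholders only -->
        <domain uri="http://example.com"/>
        <domain uri="http://partner.example.org"/>
      </allow-from>
      <grant-to>
        <resource include-subpaths="true" path="/"/>
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>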

More security with SSL

As I mentioned in the initial sample, I changed the binding configuration to use a security mode of “None” instead of “Transport.”  I did this because I use the built-in web server in Visual Studio for most of my development, and it doesn’t support HTTPS connectivity.  To demonstrate my sample against S3 I had to make sure the schemes matched, because in Silverlight 2 right now, to access a secure service the XAP itself has to be served from a secure location.  The contexts must match.

I’ve come to learn that even with a bucket alias (except ones with “.” characters) you can use the SSL cert from Amazon S3 as it is a wildcard certificate.  So your endpoint (assuming a bucket name of timheuer-aws) could be https://timheuer-aws.s3.amazonaws.com/soap and it would work.
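Concretely, that means switching the binding’s security mode back to “Transport” and pointing the endpoint at the HTTPS address.  Here is a minimal sketch of what the relevant portion of ServiceReferences.ClientConfig might look like; the binding, contract, and endpoint names below are placeholders and will vary depending on how your service reference was generated:

<configuration>
  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <!-- "S3SoapBinding" is just a placeholder name -->
        <binding name="S3SoapBinding">
          <!-- "Transport" means HTTPS; the original sample had this set to "None" -->
          <security mode="Transport" />
        </binding>
      </basicHttpBinding>
    </bindings>
    <client>
      <!-- the bucket name is a subdomain here, so the wildcard SSL cert still matches -->
      <endpoint address="https://timheuer-aws.s3.amazonaws.com/soap"
                binding="basicHttpBinding"
                bindingConfiguration="S3SoapBinding"
                contract="AmazonS3.AmazonS3"
                name="AmazonS3Soap" />
    </client>
  </system.serviceModel>
</configuration>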

Using SSL of course means that currently you will have to serve your application from an SSL endpoint as well to avoid cross-scheme violations.

I hope this helps clear some things up and provides you with a more secure, recommended way of accessing Amazon S3 services with Silverlight!


For about a year now I've been using Amazon S3 services.  Mostly I'm using it for image storage for my blog and web site.  I decided to stop using Flickr for screenshot stuff and keep it to 'photographs' when I can.  I signed up for an S3 account and have been using it for screenshot type stuff since then.  If you don't know, S3 is a service that basically enables 'object' storage in the cloud.  An object can be anything really, but I'm treating it like a remote host for images.

The one thing Amazon doesn't provide themselves is a tool to manage your account...it is really an API only.  There are plenty of tools out there that have implemented user interfaces around the S3 service.  My two favorites are the S3 plugin for Firefox and BucketExplorer.  I use the Firefox one more than anything for uploading, just because it was faster for what I needed.

UPDATE: While I still use the Firefox extension and always have it installed, I find myself using CloudBerry Explorer a LOT more.  It is the most full-featured (free) Amazon S3 tool I've seen and I love it.  They keep adding subtle little things that make my process even easier!  Check it out today!

But the problem was that neither of the tools really reflected *why* I was using S3, which was primarily for my blog.  So a year ago I grabbed some of the sample code from the S3 developer site and whipped up a quick-and-dirty plugin for Windows Live Writer that I've been using.  I already had my Flickr4Writer plugin that I used for Flickr, but like I mentioned, I was using S3 for other image hosting now.  I was lazy, though, and only did a read-only version that inserted an image.  I was still relying on the other tools to upload, change permissions, etc. -- my workflow sucked.

Well, as a part of The Code Trip, we set goals to release projects on CodePlex.  I decided to put this project out on CodePlex in whatever state I had it.  I immediately had a partner in Aaron Lerch, who jumped in and within a day basically put in the remaining features that were lacking.

The result is the S3 Browser for Windows Live Writer project:

The initial 0.91 beta release is available on the project now.  Please give it a spin if you are a Live Writer and Amazon S3 user!


so at tech lunch wednesday here in phoenix, after lunch i sat with hamid and scott for a bit and we were brainstorming about a few things.  one of which was storage, then we got on the topic of amazon's s3 solution.  i had started to look at it before, but then never got the time to go back.  essentially amazon provides storage via a web service (there are no tools provided by them, just an api).  i said that i mainly use flickr for the storage and that i'd only use it if i could get a direct url to things, and that i'd have to have a plugin for live writer :-).

well, when i got back i started to mess around a bit and got a work in progress for my "s3 browser" live writer plugin.  you can browse your buckets and link to an item as a link or as an image.  it is very rough right now, has some issues with UI threads, etc., but it works for me so far.

you can take a look at the screenshot below -- and the link is directly to my s3 stored image!