| Comments

I should have known better, honestly.  I’ve had one strike with cloud billing catching me by surprise, and I’m not sure why I’m shocked it happened again.  This time, however, I thought I really did plan it out, pay attention to things, and ask what I thought were the right questions.  Unfortunately I didn’t get the full answers.  This time I was stung by my shiny new SQL Azure service choice.

UPDATE 12-APR-2012: Based on comments I've received, I feel the need to clarify that I'm not bashing Azure or cloud services in general here.  I don't think I indicated anywhere that Azure was a crap product or that I hated it at all.  In fact, I indicated I was completely happy with the service offering.  My frustration was *only* with the fact that the pricing was unclear to me based on how I researched it...that is all and nothing more.  As many have pointed out, cloud services like Azure are extremely important in the marketplace, and the ability to scale in real time with minimal effort is an exceptional feature.  *FOR ME* I currently don't have those needs, so I couldn't justify the charges beyond what I had planned...that is all, nothing more.  My experience with SQL Azure as a product was a positive one: quick setup, familiar tools to manage it, worry-free database management, a great admin interface and a reliable data storage solution.  My architecture, however, just isn't ideal right now with my site not also being in Azure.  When VM roles come out of beta I will be sure to evaluate moving sites there and plan better.

A while back I heard about the change in price for some Windows Azure services, and the one that piqued my interest was SQL Azure.  It hit me right as I needed to move around some of the hosting aspects of my site.  The lure of the $5/month SQL Azure database (as long as it stayed under 100MB) was appealing to me.  The SQL Server aspect of my site has always been a management headache for me, as I don’t want to have to worry about growing logs and the like.

Stung by marketing

I followed the announcements to the http://www.windowsazure.com site and read the descriptions of the services.  I was immediately convinced of the value and, heck, it was a service from my company, so why shouldn’t I give it a try and support it?  When I began to set it up, however, questions were being asked during setup and I started to get concerned.  I asked around about whether this $5 fee was really the only fee.  I didn’t want to get surprised by things like compute time.  Perhaps I wasn’t asking specific enough questions, but all the answers I got were that signs pointed to yes, that would be my only fee.

NOTE: As of this writing, yes, I am a Microsoft employee, but this is my own opinion and I realize that people’s expectations and results vary.  This is only my experience.  I’m not only an employee but also a customer of Microsoft services, and in this instance a full paying customer.  No internal benefits are used in my personal Azure hosting accounts.

Yesterday I learned that wasn’t the case.  I received my first Azure billing statement and it was way more than I expected.  Yes, my $5 database was there as expected, but so were sudden “Data Transfer” charges of $55.

Trying to make sense of billing

I immediately tried to make sense of this billing.  I remembered that I had created a storage account as well for a quick test, and perhaps I had forgotten to disable/delete that service.  I logged into the management portal and saw that my storage account was properly deleted and nowhere to be seen.  But how to make sense of these charges from the past week, then?  Luckily, Azure provides detailed usage data for download, so I grabbed that.  The CSV file I downloaded did indeed provide some detail…perhaps too much, as some of it I couldn’t discern, namely the one piece that I had hoped would help me: Resource ID.  This ID was a GUID that I thought pointed to a service that I used.  It did not, or at least that GUID was nowhere to be seen on my Azure management portal.

I contacted billing support immediately for help.  I was able to talk with a human fairly quickly, which was a plus.  The gentleman explained to me that I had a lot of outgoing data leaving the Azure data centers and that was the source of the costs.  He asked if I knew of anything connecting to my SQL Azure instance externally.  Well, duh, yes: my site!  He went on to explain that this constitutes “Data Transfer” and that I’m billed at a per-GB rate for any data that leaves the Azure data center.

I took a deep breath and asked where this was documented in my SQL Azure sign-up process.  We walked through the site together and he agreed that it wasn’t clear.  After being put on hold for a while, I was assured I would receive a credit for the misunderstanding.  Unfortunately for Azure, the damage was done and they lost a customer.

Where the failure occurred

For me the failure was twofold: me for not fully understanding the terms, and Azure for not fully explaining them in context.  I say “in context” because that was the key piece missing when I registered my account.  Let me explain the flow I took as a customer once I heard the announcement about the SQL Azure pricing changes (I sent this same piece of feedback internally today as well):

  • I received notice of updated SQL Azure pricing
  • I visited the site http://www.windowsazure.com for more information
  • I clicked the top-level “PRICING” link provided, as pricing was my exact concern
  • I was presented with a fancy graphical calculator.  I moved the slider up to 100MB and confirmed the pricing on the side (no asterisks or anything)
  • I noticed a “Learn more about pricing, billing and metering” link underneath the calculator and clicked it to learn more
  • I was presented with a section of 10 different options, all presented at the same level, giving the appearance of unique services
  • I chose the Database one and again read through and confirmed the charge for the 100MB database option
  • I clicked the “More about databases” link to double-verify and was presented with another detailed description of the billing

Not once during that process was context provided.  At none of the steps above (3 different pricing screens) was there any indication that additional fees could also apply to a given service.  “Data transfer,” in fact, doesn’t even describe itself very well.  When I asked folks involved in Azure about my pricing concerns and was reassured, this “Data Transfer” wasn’t brought up at all.  I’m not sure why it is listed alongside services and almost presented as a separate service, when it appears all Azure services are subject to data transfer fees.  This is not made clear during sign-up or in the marketing of each service’s pricing.  SQL Azure should clearly state that the fees are the database *plus* any additional fees resulting from data transfer.  Heck, Amazon does this with S3, which also makes it confusing to anticipate the cost of billing there as well…but at least it is presented so that I know I need to factor it into my calculation.
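To make that *plus* concrete, here’s a quick back-of-envelope sketch in C# of the arithmetic I wish one of those pricing pages had walked me through.  The $0.12/GB egress rate and the 450GB of monthly outbound traffic are my own placeholder assumptions for illustration, not numbers from the Azure rate card:

    using System;

    class AzureCostSketch
    {
        static void Main()
        {
            // Placeholder numbers for illustration only -- check the actual
            // rate card before trusting any of these values.
            decimal databaseFee = 5.00m;      // 100MB SQL Azure database, per month
            decimal egressRatePerGb = 0.12m;  // assumed "Data Transfer" rate, per GB
            decimal outboundGb = 450m;        // assumed data leaving the data center

            // The real bill is the service fee *plus* all outbound transfer.
            decimal total = databaseFee + (egressRatePerGb * outboundGb);
            Console.WriteLine("Estimated monthly bill: {0:C}", total);  // ~$59, not $5
        }
    }

With those assumptions the $5 line item is noise; the transfer charge dominates, which is exactly what my statement showed.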

I’m to blame, so why am I whining?

I said I’m to blame as well for not better understanding what I was getting into.  It is unfortunate because I really did like the service and felt an assurance of more reliability with my database than I had before.  The management portal was great, and uptime and log management were things I didn’t have to think about anymore.

So why, you might ask, am I complaining about a service fee for something that was providing me value? 

NOTE: You may ask why I didn’t just move my site into Azure as well so that no data would be leaving the data centers.  This is a fair question, but unfortunately my site won’t run on any of the Azure hosting services, and additionally I manage a few sites on a single server, so multiple Azure hosting instances are cost prohibitive for me right now.

Well, it is simple.  I’m not made of money.  This blog has no accounting department or annual budget, and as such I have to be smart about even the smallest cost.  I already have sunk costs in the server that hosts this site as well as a few others.  A $5/month database fee was nothing and easily justifiable given the value I was getting for the minor additional cost.  $50 (and growing) just wasn’t justifiable to me.  It was already at the same cost as my dedicated server and no longer made sense for my scenario.  In this instance I’m the “little guy” and need to think like one.  Perhaps cloud services are not for me.

Summary

So what did I learn?  Well, I really need to understand bandwidth and transfer data better for the sites I have.  Unfortunately this isn’t totally predictable for me, and if I can’t predict the cost then it isn’t something I should be using.  If you are considering these types of services, regardless of whether they are from Azure or Amazon (or whomever), you need to really plan out not only the service but how it will be used.  Don’t be lured by those shiny cost calculators that let you move sliders and show you awesome pricing but don’t help you estimate (or alert you to the fact) that some of those sliders should be linked together.

I think Azure (and other similar services) have real customer value…there is no doubt about that.  For me, however, it just isn’t the right time.  The services, based on my configuration needs, just don’t make sense.  Had I had a clearer picture of this when signing up, I wouldn’t be in this frustrating situation.  Choose your services wisely and understand your total usage of them.  For me it currently doesn’t make sense, and I’m moving back to a SQL Express instance on my server.  Yes, I’ll have to manage it a bit more, but my costs will be known and predictable.

Hope this helps.

| Comments

Here’s how it started…

Lisa (my wife) [shouting from office into the kitchen]: Tim, what’s this Amazon charge for $193?
Me [thinking what I may have purchased and not remembered]: Um, don’t know…let me look.

I then logged into my Amazon account to see what order I may have forgotten.  Surely I didn’t order $200 worth of MP3s…that’s ridiculous.  Sure enough, nothing was there.  Immediately I’m thinking fraud.  I start freaking out, getting mad, plotting my revenge scheme on the scammer, etc.

Then it hit me: Amazon Web Services account.

The Culprit

Sure enough, I logged in and my January 2010 bill was $193 and change.  Yikes.  Well, I could let the roughly $30 average charge slide under the family CFO radar for a while…but this $193 charge…the chief auditor herself caught that one.

So I panicked.  I needed to figure out where/what the spike was.  I logged into the Amazon Web Services management console (I only use the S3/CloudFront storage in their services right now) to see what was going on.  I saw ‘Usage Reports’ and clicked.  I was met with essentially a bunch of useless data.  No offense to Amazon, but the usage reports weren’t really helpful at all.  First, they gave me a Resource ID, which I thought would represent the URI I was looking for.  Nope, Resource ID == Bucket.  And they didn’t even put the bucket name in the report!

For some perspective, here’s essentially what I’m used to – here’s my December 2009 billing statement details:

December 2009 S3 CloudFront Billing

Anyhow, after some hunting it was obvious that I wasn’t going to figure out which bucket objects/unique URIs were causing my spike.  This was primarily because I didn’t have logging turned on on my buckets at all.  I had in the past, but really didn’t think I needed it, so I turned it off.

I was wrong – go now and enable logging.
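If you use the AWS SDK for .NET, enabling logging is only a few lines.  Here’s a minimal sketch (the bucket names are placeholders, the exact type/method names have shifted across SDK versions, and the target bucket also needs to grant write access to Amazon’s log delivery group):

    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Model;

    class EnableS3Logging
    {
        static void Main()
        {
            // Credentials resolve from your profile/environment here.
            var client = new AmazonS3Client(RegionEndpoint.USEast1);

            // Deliver access logs for the content bucket into a separate
            // log bucket under a "logs/" prefix (names are placeholders).
            client.PutBucketLogging(new PutBucketLoggingRequest
            {
                BucketName = "my-content-bucket",
                LoggingConfig = new S3BucketLoggingConfig
                {
                    TargetBucketName = "my-log-bucket",
                    TargetPrefix = "logs/"
                }
            });
        }
    }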

While I was searching for a solution to understand my traffic, I was curious where my traffic was going.  Like I said, I’d been averaging (actually *peaking*) at about a $30 charge for the S3 hosting.

NOTE: I use S3 for all my image/screenshot/sample code file hosting.  I’ve invested in S3 for a long time and built my blogging workflow around it, building tools like S3 Browser for Windows Live Writer.

What was interesting was that most of my CloudFront usage was coming from Hong Kong.  Compare the December 2009 billing above to this January 2010 billing:

January 2010 Billing Statement

Yeah, that was my reaction too.  I went from roughly 40GB of transfer bandwidth to over 960GB in one month.  I suspected I knew what happened, but needed to confirm before I changed things. 
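For rough context: assuming a blended CloudFront rate of around $0.20/GB for the Asian edge locations (my own ballpark for illustration, not Amazon’s official rate card), 960GB works out to roughly $192, which is almost exactly the bill I got.  At that same assumed rate, my usual 40GB month would be about $8 of transfer, which is why the charge had never registered before.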

Implementing Logging for Statistics

The problem was that I didn’t have logging enabled, so I was pretty much stuck.  I needed to get some data from the logs before I could be sure.  I quickly found S3Stat, which appears to be the de facto reporting tool for Amazon S3 log files.  I signed up for the free trial and generated a new access key to give them.

NOTE: They have a ‘manual’ option, which means a lot more work.  I simply generated a NEW S3 access key for this specific purpose.  That way I didn’t have to give them the golden key I’ve been using in other places, and I can shut this off at any time without disrupting my other workflows.

24 hours later, I had some reports.  Wicked cool reports.  Here’s a list of what I’m currently looking at:

  • Total hits, total files, total kbytes
  • Hits/files per hour/day
  • Hourly stats
  • Top 30 URIs
  • Top URIs by kbytes used
  • Top referrers (find out who’s using your bits without you knowing)
  • User agents

Here’s a quick snapshot of one:

S3Stat sample report image

Wow…honestly…THIS is what I expect when I see “usage” data reports.  S3Stat is awesome and you should use it now.  Yes, I’m buttering them up…but they have a great tool here for $5/month if you are a heavy Amazon S3/CloudFront user.  Frankly, Amazon should just buy them and integrate this into their management console.  You can see other examples of their report outputs on their site at http://www.s3stat.com.

What I also found out is that the tool I use for my desktop usage of S3/CloudFront (outside of my blogging workflow and S3Browser) has S3Stat integration built in!  I use CloudBerry’s S3 Explorer Pro for managing my S3 content.  It’s awesome and you should look at it.  When I look at the logging features in CloudBerry I see this:

CloudBerry S3Stat dialog

And after enabling the logging, within CloudBerry I can view the log data within the tool:

CloudBerry view logging

Summary

Wow, this is incredibly helpful and insightful data.  I now know who is using my cloud storage data, how, and when, and I can see the data in various ways.  S3Stat showed me incredible value within less than 24 hours of enabling it.  I can now confirm the culprit of the burst of usage and plan accordingly.

Now, to be clear, I’m not complaining about the cost of cloud storage.  That has been clear to me from the beginning.  Nothing is hidden, and I’m not an idiot for not understanding it.  What I did not account for was the popularity of some files…which happened to also be the largest ones.  I personally could not have predicted a 920GB spike in one month of usage…but now I know…and have to alter some plans.

Hopefully this is helpful for those who are just exploring cloud storage solutions/services.  Make sure you have instrumentation and logging capabilities turned on so you can identify and tune your usage.  For me, S3Stat and CloudBerry are winners for my personal needs.  If you are an Amazon S3 customer, I recommend looking at S3Stat and turning on logging immediately!


| Comments

Many have inquired whether Silverlight Live Streaming has a replacement since the announcement of its deprecation.  The SLS team blog pointed to Azure as a possible solution.  Since it doesn’t seem like anyone except James has really tried this, I decided to dust off my Azure account information and give it a try.

First, a note about SLS.  As I’ve said before, I don’t think the offering was named very well from the start.  “Streaming” implies a specific technical connotation to most folks.  In fact, the media (or other files) hosted on SLS were never streamed as you might think they were.  It was always a media file hosted on a content delivery network (CDN) and delivered via basic progressive download…not streamed.  Now that we’ve got that out of the way, let’s move on.

Setting up Azure

I’ll be honest: I’m not sure how you go about getting an Azure token/invite at this point, but you’ll need one.  Those who attended PDC08 already have access to one.  I believe MSDN subscribers will have access at some point, but I don’t see it enabled on my account just yet either.  Bottom line: you’ll need an invitation token.  I cannot help you in this regard, so don’t even ask…I wish I could, but I can’t.

Once you have an Azure account, you’ll want to create a Windows Azure Storage instance and give it a name.  I recommend not using special characters in the name, as some of the tools I describe below had problems with that.  I gave mine a simple name, “timheuerblob1”, and let it provision.

Azure Storage Account creation

Once you have that you’ll see a bunch of configuration screen details:

Azure configuration details

I’d recommend doing the following tasks before you proceed.  At the bottom you’ll see an option to enable CDN.  For this purpose of hosting files/storage, I recommend turning it on.  Additionally, provisioning the CDN can take some time, so do that first.  The next step is to configure your custom domain names (if you’d like) to map to the CDN and/or blob storage endpoints.  This is a simple/fast process but requires you to understand DNS entries and have the ability to modify your domain’s DNS information.  You can find step-by-step instructions at Steve Marx’s blog post here.  Without enabling this CDN feature, you really aren’t going to get the same level of scale as you did with SLS.
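For reference, the custom domain mapping ultimately boils down to a single CNAME record at your DNS host, something like the entry below.  The hostname and the msecnd.net endpoint are made-up example values; your Azure portal shows the real endpoint to target:

    ; example zone file entry: point media.yourdomain.com at the Azure CDN endpoint
    media.yourdomain.com.    IN    CNAME    az1234.vo.msecnd.net.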

The key things you will need for the next part are the Primary Access Key and the Endpoints.  Make note of them.

Tools to upload content

Windows Azure blob storage is exactly what it says: storage.  Right now there aren’t any tools provided by Windows Azure to really manage the contents of the storage blob containers (unless I’m mistaken, and if so, please correct me).  There is an API for getting access to and publishing content, but if you want to use it as an SLS replacement you’re likely hoping for an easy tool to upload/manage the contents.

Enter two tools: Azure Storage Explorer and Cloud Storage Studio.  Both of these are Windows clients (WPF, incidentally) that are installed on your machine.  Cloud Storage Studio also has an online version if you wanted to use that offering.  I did not try it, as I’m okay using a client tool.

Azure Storage Explorer is a CDDL-licensed CodePlex project (which actually has no source code checked in at this point) provided by Neudesic, a Microsoft partner.  It is a simple UI that requires you to input your blob storage name, access key and endpoints in the Storage Settings dialog.  I initially had problems with this, as my name had a special character in it.  I deleted my initial (and old) instance, recreated a new one in Azure and didn’t have problems after that.

Azure Storage Explorer screenshot

It was pretty simple to use and understand.  Create a container, upload stuff into it.  Done. 

Cloud Storage Studio is a commercial offering, and it is unclear whether it will always be free or is only free while in CTP mode.  I couldn’t find information on their site that jumped out at me to let me know.

Cloud Storage Studio screenshot

Cloud Storage Studio wouldn’t let me resize the window, and that was frustrating.  Other than that it operated as expected – same thing: select a blob container and upload.

While both tools provided me basic blob container and upload capabilities, I felt the upload was slow (compared to my Amazon S3 experience in creating a tool for content management).  I’m not sure if that was the app, Azure, or the nature of the API.  Also, neither tool seemed to let me set the content type of the blob contents I was uploading.  I see this being a problem for some.  The default of application/octet-stream isn’t going to work for all scenarios (for example, I cannot use it to host XAPs and embed them, because I would need to be able to set the content-type header).
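To be clear, the limitation appears to be in the tools, not the platform.  The storage API itself lets you set a content type on a blob.  Here’s a minimal sketch using the Microsoft.WindowsAzure.StorageClient library from the SDK – the account key, container and file names are placeholders, and treat the exact member names as a sketch rather than gospel:

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class UploadXap
    {
        static void Main()
        {
            // Placeholder connection string -- substitute your own account name/key.
            CloudStorageAccount account = CloudStorageAccount.Parse(
                "DefaultEndpointsProtocol=https;AccountName=timheuerblob1;AccountKey=YOUR_KEY");

            CloudBlobClient client = account.CreateCloudBlobClient();
            CloudBlobContainer container = client.GetContainerReference("media");
            container.CreateIfNotExist();

            // Set the content type before uploading so consumers get the
            // right header instead of the application/octet-stream default.
            CloudBlob blob = container.GetBlobReference("app.xap");
            blob.Properties.ContentType = "application/x-silverlight-app";
            blob.UploadFile(@"C:\output\app.xap");
        }
    }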

Side note: Be sure to check out James Clarke’s CodePlex work on a Windows PowerShell script that will do encoding via Expression Encoder and then trigger a publishing plugin – such as one that will automatically upload to an Azure blob storage account.

Other than that, they served the need of filling the gap for a UI tool that already worked out the API stuff.

UPDATE: CloudBerry for Azure

I cannot believe I didn’t know about this one, since CloudBerry is what I use for my Amazon S3 storage and is AWESOME.  I was directed to CloudBerry for Azure Blob Storage (oddly, by their competitor above) and downloaded it immediately.

CloudBerry Azure Blob Storage explorer

It met every expectation I had and has been the fastest client.  To me, the other clients must meet this bar of functionality *for blob storage interaction* to be considered for my use.  The one thing it doesn’t have is custom CDN URI generation in its “WebURL” functionality, but I suspect this might be an Azure API limitation (I’m not familiar with the Azure API).

UPDATE 2: Cloud Storage Studio

Be sure to read the comments below from the team at Cloud Storage Studio.  They were kind enough to reach out to me to clarify some things above and listen to my feedback.  They are ACTIVELY looking to improve their product and provide value-add to Azure users/developers.  I sincerely appreciate when people reach out to their customers for feedback.  It’s been a great dialog, helping them understand how I would use Azure and what my tool needs might be.

My Silverlight media test

Using my newfound tools and some sample media files, I uploaded a 720p high-def 30-second clip (30MB) to my blob storage account.  I then put together a page with a simple player and pointed it to the Azure-hosted media.  I first pointed the media player to the basic blob endpoint.  The result is here: Non-CDN Hosted Media Playback.

Users who tested it with me said that the video buffered quite a bit, and in remote areas (Australia) it took about 1.5 minutes to download.  This, again, is because it is still not streaming, just progressive download.

I then changed the media URL to the CDN version (after waiting 24 hours for propagation, etc. just to be sure) and have the CDN Hosted Media Playback.

I have to be honest and say that the CDN version seemed better at consistent playback after the initial buffering (i.e., I saw no further buffering).  I’d be curious what other international people see on the CDN version, as it would give a better view of how well the CDN delivers content globally.

Summary

I’m not sure if this will be a solution for everyone, but if you are looking for a Microsoft offering for cloud/CDN storage, give Azure a try.  NOTE: It is not a free service once it goes into production!  In fact, most CDN/cloud storage solutions you will find are not.  I’m currently using Amazon S3 for mine and have been for a long while.  It, too, is not free.  (Here’s the Amazon S3 CloudFront version of the same media file.)  Again, the links to the three different hosted video tests are above.

Either way, now that SLS is gone as a free CDN-hosted service where many were putting their media, you’ll have to look for another option.  Azure is one of those options.  Does it provide an automatic skin for a media player?  No, you’ll have to do that yourself.  Is it streaming?  No, but neither was SLS.  Is streaming/smooth streaming on the horizon?  I don’t know…but it would be cool if it were!

Hope this helps.

| Comments

From time to time I’ve gotten inquiries as to what platform my blog is on, what tools I use, etc.  After a recent trip to Redmond and visiting with the Live Writer team, I got another inquiry while talking with a customer.  I thought I’d just spit out my thoughts.

First, my platform.  Yes, there are many platforms out there for blogging.  Probably the most popular are Wordpress and Blogspot.  I think those are popular because you can get up and running for free and have it hosted.  My wife and her friends mostly use Blogspot for that reason.  Only a little customization is allowed, but skilled people can get creative.  For us propeller heads, though, we don’t like hosted solutions :-).  I started a long while back on the .TEXT platform.  When Scott Watermasysk started working with Telligent and Community Server, the .TEXT project was at a bit of a stand-still for growth.  A few people picked up the project source code, forked it and created Subtext, which is the platform I now use.  Subtext has a good developer ecosystem around it and, led by Phil Haack, there are constant improvements being discussed on the developer list.  It has served me well since I made the move, and I’ve contributed features/fixes myself to make it better for me to use!  I’ve tinkered with Graffiti CMS, but for now Subtext is my comfort zone and has given me no reason to leave.  There are a few things in the overall engine that are a bit dated, but heck, the team is all volunteers and open source, so I’m not holding that against anyone.

As for tools, I’ve come to love Windows Live Writer.  Honestly, if you are a blogger and don’t use Writer, I have to ask why.  Seriously…even if you are a casual blogger.  I’ve heard my wife’s friends complain about formatting pictures in Blogspot…then they see Writer and love it!  When I first started blogging I would use the web interface of my blog engine.  But quite honestly that limits you to very basic posting and doesn’t make authoring easy.  I initially used BlogJet in the early days.  Honestly, it’s a good tool and I happily paid for it.  Probably my only reason for switching to Live Writer was the programming model.  As a developer I wanted to be able to customize it and take advantage of customizations from others.  I remember getting word that Live Writer was available internally, and I took a look at it.

The moment I downloaded and installed it, I knew we were going to be on to something.  But there were glaring holes.  That being said, I was a responsible beta user and gave feedback…very blunt feedback.  I remember within a week being invited to a phone call with the team to help them understand my feedback.  It was GREAT!  I told them that as-is they shouldn’t release…there were too many holes, and releasing without a few key features would be detrimental to future releases.  Quickly I was introduced to some APIs and we talked through certain scenarios.  A few we agreed couldn’t be core at the time, and I volunteered some time to look at building some plugins with their help.  That was the birth of a few of my tools (Tag4Writer and Flickr4Writer).  Tag4Writer was a stop-gap until the Tags feature could get into the next release (which it since has, and much better).  Flickr4Writer was a great learning experience in client development as well as in working with a great team.  I’ve constantly stayed close to the feedback loop with the team to make it a better product.  To date, there is no better authoring product for me than Live Writer.

That being said, here’s my complete tools for blogging:

  • Subtext – my blog engine
  • Windows Live Writer (WLW) – authoring tool
  • Flickr4Writer – a plugin for WLW that enables browsing and insertion of pictures from Flickr and also has BlogThis support
  • S3Browser – a plugin that enables insertion/upload of bits to Amazon S3 storage.  I wrote this with tremendous help from Aaron Lerch.  It enables me to keep my images/files stored on a reliable network and reduce overall bandwidth usage.
  • Leo Vildosa’s Code Snippet plugin – another plugin for WLW which enables me to insert formatted code snippets.  I know people have their favorites (and there are a lot of them) – this one is mine.
  • Dynamic Templates – a WLW plugin from one of the developers of Writer that enables you to provide your own “macros” within the plugin, helpful in a lot of cases
  • Creative Commons footer – a WLW plugin to append a Creative Commons note to every post without having to think about it
  • Twitter Notify – a WLW plugin to automatically update Twitter after posting
  • Templates – every blog should be as unique as you can make it.  Some of us are more skilled in design than others; I’m not one of them.  I get my inspiration from others.  Wordpress has the best ecosystem of templates…learn from them.  There are a few sites that advertise templates, like wpSnap.com and others.  Also, always look at SmashingMagazine.com; they have some great stuff they find/provide with liberal licensing.

As you can see, my tools completely revolve around Live Writer.  There is only one instance where I can’t use it to do what I need: Enclosures – and I’ve been providing the team feedback around this feature to hopefully get it into their next release. 

For Mac users: no, there isn’t a version of Writer for Mac :-(.  I’ve heard a lot of people on Mac say they keep a Windows virtual image around just for using Live Writer…wow.  There are some other tools (ecto and Mars Edit), but I haven’t used them extensively enough to know if they are good or not.  I know they don’t provide the suite of tools that I use, so I don’t even bother exploring for now.

I’ve made a lot of investment in making Writer+Subtext an easy authoring setup for me and it has paid off in productivity savings.  Hopefully you have a set of tools on your own as well that keep you productive!

| Comments

One of the great things I like about some of our platform products is that they are building in extensibility more and more.  Take Windows Live Writer as an example.  It’s no secret on this blog that I’ve got a geek affair with that tool.  I use it daily and have customized it (via plugins) and my blogging platform (Subtext) to make it the best web authoring experience for me.

Writing plugins for Writer has been a lot of fun and a great way to get the functionality I want/need into a workflow without having to work in a different utility.  Another one of these tools is Expression Encoder 2, which I’ve been using a bit more lately.  Expression Encoder is a tool that enables encoding of audio/video assets into VC-1 and H.264/AAC formats.  It’s a really simple tool to use and also comes with several Silverlight player templates that you can choose as part of your output.  In one click you can have your HD home movie encoded and a rich playback experience generated for you as well.  I’ve written about Encoder, templates, etc. several times before.

With no shortage of information on how to do it, I got home last night and began cranking one out.  I’ve been using Amazon’s S3 web services for a while and have really grown to like it a lot.  One of the Live Writer extensions I spoke of earlier is the S3 plugin that Aaron Lerch helped out with as well!  I thought I should extend Encoder so that I’d have a one-click publishing point to my S3 account instead of having to use S3Fox all the time (which is an awesome tool, btw).

So after getting home from a user group, I started on it, figuring out the nuances and just coding something together.  A few hours later I came up with what I’m calling the 1.0 beta of my plugin.

It’s not a fancy UI, but it doesn’t need to be; it serves a purpose: enabling publishing of Encoder output directly to an Amazon S3 bucket in one click.  That’s it.  Encoding just media?  No problem.  Adding a template?  Not a problem either.  You simply need to enter your Amazon S3 account information and a bucket name.  If the bucket isn’t there, the plugin will attempt to create it.  You can also list your current buckets if you forgot them.
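For the curious, the core flow is roughly what follows.  This is a sketch written against the AWS SDK for .NET for readability – the plugin’s actual internals differ, and the keys, bucket and file names below are all placeholders:

    using System.Linq;
    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Model;

    class PublishEncoderOutput
    {
        static void Main()
        {
            var client = new AmazonS3Client("ACCESS_KEY", "SECRET_KEY", RegionEndpoint.USEast1);

            // Create the target bucket only if it doesn't already exist.
            string bucket = "my-encoder-output";
            if (!client.ListBuckets().Buckets.Any(b => b.BucketName == bucket))
                client.PutBucket(new PutBucketRequest { BucketName = bucket });

            // Push one piece of Encoder output (media or player template file).
            client.PutObject(new PutObjectRequest
            {
                BucketName = bucket,
                Key = "video.wmv",
                FilePath = @"C:\EncoderOutput\video.wmv",
                ContentType = "video/x-ms-wmv"
            });
        }
    }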

There are likely a few problems and some fit-and-finish needed.  I am positive the error handling needs to be refined, and it could probably benefit from more efficient thread handling.  The cool thing about the Encoder publishing plugins is that they are WPF user controls, so it gave me a chance to work with more XAML.

At any rate, even in its current form (which isn’t perfect but seems to be working for the specific need I built it for – the “works on my machine” warranty applies here), I wanted to share it out for others to use and hopefully get feedback and contributions.  It’s available as an Ms-PL-licensed project with source code, and you can get it on CodePlex: Amazon S3 Encoder Publishing Plugin.  I hope you like it and can give feedback.  Please, if you find issues, log them in the Issue Tracker for the project so they are trackable.