
At Build, the Azure team launched a new service called Azure Static Web Apps in preview. This service is tailored for scenarios that work really well when you have a static web site front-end and use things like serverless APIs for your communication to services/data/etc. You can read more about it here: Azure Static Web Apps.

Awesome, so Blazor WebAssembly (Wasm) is a static site right? Can you use this new service to host your Blazor Wasm app? Let’s find out!

NOTE: This is just an experiment for me.  This isn’t any official stance of what may come with the service, but only what we can do now with Blazor apps.  As you see with Azure Static Web Apps there is a big end-to-end there with functions, debug experience, etc.  I just wanted to see if Blazor Wasm (as a static web app) could be pushed to the service.

As of this post the service is tailored toward JavaScript app development and works seamlessly in that setup. However, with a few tweaks (for now) we can get our Blazor app working. First we’ll need to get things set up!

Setting up your repo

The fundamental aspects of the service are deployment from your source in GitHub using GitHub Actions. So first you’ll need to make sure you have a repository on GitHub.com for your code. I’m going to continue to use my Blazor Wasm Hosting Sample repo (which has different options as well to host Wasm apps) for this example. My app is the basic Blazor Wasm template, nothing fancy at all. Okay, we’ve got the repo set up, now let’s get the service set up.

Create the Azure Static Web App resource

You’ll need an Azure account of course, and if you don’t have one, you can create an Azure account for free. Go ahead and do that and then come back here to make it easier to follow along. Once you have the account you’ll log in to the Azure portal and create a new resource using the Static Web App (Preview) resource type. You’ll see a simple form to fill out a few things like your resource group, a name for your app, and the region.

Screenshot of Azure Portal configuration

The last thing there is where you’ll connect to your GitHub repo and make selections for what repo to use. It will prompt you to authorize Azure Static Web Apps to make changes to your repo (for workflow and adding secrets):

Picture of GitHub permission prompt

Once authorized, more options show up for the resource creation; just choose your org/repo/branch:

Picture of GitHub repo choices

Once you complete these selections, click Review+Create and the resource will be created! The process will take a few minutes, but when complete you’ll have a resource with a few key bits of information:

Picture of finished Azure resource config

The URL of your app is auto-generated, probably with a name that will make you chuckle a bit. Hey, it’s random, don’t try to make sense of it, just let names like “icy cliff” inspire you. Additionally you’ll see the “Workflow file” YAML file and link. If you click it (go ahead and do that) it will take you over to your repo and the GitHub Actions workflow file that was created. We’ll take a look at the details next, but for now if you navigate to the Actions tab of your repo, you’ll see a failure. This is expected for us right now in our steps…more on that later.

Picture of workflows in Actions

In addition to the Actions workflow, navigate to the Settings tab of your repo and choose Secrets. You’ll see that a new secret (with that random name) was added to your repo.

Picture of GitHub secrets

This is the API token needed to communicate with the service.

Why can’t you see the token itself and give the secret a different name? Great question. For now just know that you can’t. Maybe this will change, but this is the secret name you’ll have to use. It’s cool though: the only place it is used is in your workflow file. Speaking of that file, let’s take a look at it in more detail now!

Understanding and modifying the Action

So the initial workflow file that was created and added to your repo has all the defaults. Namely, we’re going to focus on the “jobs” node of the workflow, which should start at about line 12. The previous portions of the workflow define the triggers, which you can modify if you’d like, but they are intended to be a part of your overall CI/CD flow with the static site (automatic PR closure, etc.).
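For reference, the trigger portion at the top of the generated file looks something like this (a sketch of the generated defaults; your branch name may differ):

on:
  push:
    branches:
      - master
  pull_request:
    types: [opened, synchronize, reopened, closed]
    branches:
      - master

Now let’s look at the jobs as-is: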

jobs:
  build_and_deploy_job:
    if: github.event_name == 'push' || (github.event_name == 'pull_request' && github.event.action != 'closed')
    runs-on: ubuntu-latest
    name: Build and Deploy Job
    steps:
    - uses: actions/checkout@v1
    - name: Build And Deploy
      id: builddeploy
      uses: Azure/static-web-apps-deploy@v0.0.1-preview
      with:
        azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_API_TOKEN_ICY_CLIFF_XXXXXXXXX }}
        repo_token: ${{ secrets.GITHUB_TOKEN }} # Used for Github integrations (i.e. PR comments)
        action: 'upload'
        ###### Repository/Build Configurations - These values can be configured to match your app requirements. ######
        app_location: '/' # App source code path
        api_location: 'api' # Api source code path - optional
        app_artifact_location: '' # Built app content directory - optional
        ###### End of Repository/Build Configurations ######

Before we make changes, let’s just look. Oh, see that parameter for the API token? It’s using that secret that was added to your repo. GitHub Actions has a built-in ‘secrets’ object that can reference those secrets, and this is where that one gets used. It is required for proper deployment. So there, that is where you can see the relationship between the secret and where it is used!

This is great, but it was also failing for our Blazor Wasm app. Why? Well, because it’s trying to build the app and doesn’t quite know how yet. That’s fine, we can help nudge it along! I’m going to make some changes here. First, change the checkout version to @v2 on Line 18. This is faster.

NOTE: I suspect this will change to be the default soon, but you can change it now to use v2

Now we need to get the .NET SDK set up to build our Blazor app. So after the checkout step, let’s add another step to set up the .NET SDK we want to use. It will look like this, using the setup-dotnet action:

    - uses: actions/checkout@v2
    
    - name: Setup .NET SDK
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.201

Now that we are set up, we need to build the Blazor app. So let’s add another step that explicitly builds the app and publishes it to a specific output location for easy reference in a later step!

    - uses: actions/checkout@v2
    
    - name: Setup .NET SDK
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.201

    - name: Build App
      run: dotnet publish -c Release -o published

There, now we’ve got it building!

NOTE: I’m taking a bit of a shortcut in this tutorial and I’d recommend the actual best practice of Restore, Build, Test, Publish as separate steps. This allows you to more precisely see what is going on in your CI and clearly see what steps may fail, etc.

Our Blazor app is now built and prepared for static deployment in the ‘published’ location referenced in the ‘-o’ parameter during publish. All the files we need now start at the root of that folder. A typical published Blazor Wasm app will have a web.config and a wwwroot folder at the published location.

Picture of Windows explorer folders

Let’s get back to the action defaults. Head back to the YAML file and look for the ‘app_location’ parameter in the action. We now want to change that to our published folder location, but specifically the wwwroot location as the root (as for now the web.config won’t be helpful). So you’d change it to look like this (a snippet of the YAML file):

    - uses: actions/checkout@v2
    
    - name: Setup .NET SDK
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.201

    - name: Build App
      run: dotnet publish -c Release -o published

    - name: Build And Deploy
      id: builddeploy
      uses: Azure/static-web-apps-deploy@v0.0.1-preview
      with:
        azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_API_TOKEN_ICY_CLIFF_XXXXXXXXX }}
        repo_token: ${{ secrets.GITHUB_TOKEN }} # Used for Github integrations (i.e. PR comments)
        action: 'upload'
        ###### Repository/Build Configurations - These values can be configured to match your app requirements. ######
        app_location: 'published/wwwroot' # App source code path
        api_location: '' # Api source code path - optional
        app_artifact_location: 'published/wwwroot' # Built app content directory - optional
        ###### End of Repository/Build Configurations ######

This tells the Static Web App deployment steps to push our files from that location. Go ahead and commit the workflow file back to your repository; the Action will trigger and you will see it complete:

Screenshot of completed workflow steps

We have now successfully deployed our Blazor Wasm app to the Static Web App Preview service! Now you’ll note that there is a lot of output in the Deploy step, including some build warnings. For now this is okay as we are not relying on the service to build our app (yet). You’ll also see the note about Functions not being found (a reminder that we changed our parameter to not have that value). Let’s talk about that.

What about the Functions?

For now the service will automatically build a JavaScript app, including serverless functions written in JavaScript, in this one step. If you are a .NET developer you’ll most likely be building your functions in C# along with your Blazor front-end. Right now the service doesn’t allow you to specify an API location in your project for C# Functions classes and have them built automatically. Hopefully in the future we will see that enabled. Until then you’ll have to deploy your functions app separately. You can do it in the same workflow though if it is a part of your same repo. You’ll just leverage the other Azure Functions GitHub Action to accomplish that, as sketched below. Maybe I should update my sample repo to also include that?
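If your Functions code is in the same repo, an additional job under the same ‘jobs:’ node might look something like this minimal sketch; the project path (./Api), output folder, app name, and publish-profile secret name here are hypothetical placeholders you would swap for your own values:

  deploy_functions_job:
    if: github.event_name == 'push'
    runs-on: ubuntu-latest
    name: Deploy Functions Job
    steps:
    - uses: actions/checkout@v2

    - name: Setup .NET SDK
      uses: actions/setup-dotnet@v1
      with:
        dotnet-version: 3.1.201

    # build the Functions project to a known output folder (path is a placeholder)
    - name: Build Functions
      run: dotnet publish ./Api -c Release -o published-api

    # deploy with the Azure Functions GitHub Action, using a publish profile
    # you download from the portal and store as a repo secret yourself
    - name: Deploy Functions
      uses: Azure/functions-action@v1
      with:
        app-name: my-functions-app # placeholder Function App name
        package: published-api
        publish-profile: ${{ secrets.AZURE_FUNCTIONAPP_PUBLISH_PROFILE }}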

But wait, it is broken!

Well, maybe you found out that the routing of URLs doesn’t always work.  You’re right!  You need to supply a routes.json file located in your app’s wwwroot directory to provide a global rewrite rule so that URLs will always work.  The routes.json file should look like this:

{
  "routes": [
    {
      "route": "/*",
      "serve": "/index.html",
      "statusCode": 200
    }
  ]
}

Put this file in your source project’s wwwroot folder.  It will be picked up by the service and interpreted so routes work!

Considerations and Summary

So you’ve now seen it’s possible, but you should also know the constraints. I’ve already noted that you’ll need to deploy your Functions app separately, and you have to build your Blazor app in a pre-step (which I think is a good thing personally), so you may be wondering why you might use this service. I’ll leave that answer to you, as I think there are scenarios where it will be helpful, and I do believe this is just a point in time for the preview; hopefully more frameworks will be supported. I know those of us on the .NET team are working with the service to better support Blazor Wasm, for example.

Another thing the Blazor build does for you is produce pre-compressed files for Brotli and Gzip compression delivered from the server. When you host Blazor Wasm using ASP.NET Core, we deliver these files to the client automatically (via middleware). When you host using Windows App Service you can supply a web.config with rewrite rules that will solve this for you as well (you can on Linux too). For the preview of the Static Web App service and Blazor Wasm, you won’t automatically get this, so your app size will be the default uncompressed sizes of the assemblies and static assets.

I hope you can give the service a try with your apps, regardless of whether they are Blazor or not. I just wanted to demonstrate how you would get started using the preview and make it work with your Blazor Wasm app. I’ve added this specific workflow to my Blazor Wasm Deployment Samples repository, where you can see other ways to deploy the client app on Azure as well.

I hope this helps see what’s possible in preview today!


Reflecting back on this blog I realized it’s been 16 years of life here.  The content hasn’t always been consistently focused on tech; in the early years it was more of a random outlet for my thoughts (with an apparent lack of concern over punctuation and capitalization).  Some months had more volume, and as my career (and perhaps passions) changed, some had lower volume.

Month comparison of blog post quantity

I’ve enjoyed getting back into posting more recently and finding more time (and again, perhaps the passion) to do so.  I think also, with the various newer outlets of social media, my personal passions are posted elsewhere now, like Instagram (if you want to follow my escapades on the bike, mostly).  This past summer, in 2019, I switched job roles at Microsoft back into program management with the .NET team.  I’m focusing on a few different things, but having spent so much time in UI frameworks on the client side for so long, I missed some waves of changes in ASP.NET and needed to re-learn.  I spent the first month of my new role doing this and exploring the end-to-end experiences.  Instead of building a To-do app, I wanted a real scenario to work with, so I set off to migrate my blog…it was time anyway, as just that week I had received warnings on my server about some errors.  I could avoid this no longer.  It wasn’t an easy path, but here was my journey.

Existing website frameworks

My blog started in 20-Aug-2003 and was built using Community Server (from Telligent) initially.  That quickly forked into a product called .TEXT from Scott Watermasysk and I moved to using that as it was solely content management and not forums or other things I didn’t need.  I stayed on .TEXT for as long as it lived until again another fork happened.  This time the .NET ecosystem around Open Source was improving and this fork was one of those projects.  Phil Haack, along with others, created SubText which was initially a pretty direct fork, but quickly evolved.  I wasn’t much interested in the code at this point, but wanted to follow ‘current’ frameworks so I moved to SubText.  All along this path it was easy because these migrations were similar, using the same frameworks and similar (if not same) data structures.  The SubText site was an ASP.NET 2.0 site using SQL Server as the data store.  Over time this moved to .NET 3.5 but not much more after that (for me at least). 

Over time I made a few adjustments to the SubText environment for me, never really concerning myself about the source, but just patching crap in random binaries that I’d inject into the web.config.  My last one was in 2013 to support Twitter cards and it was a painful reminder at the time that this site was fragile.   By this time as well the SubText project itself was fragile and not really being maintained as ASP.NET had moved to newer things like MVC and such.  The writing was on the wall for me but I ignored it.

Hosting environment

In addition to the platform/framework used, I was using an interesting hosting setup.  Well, not abnormal really, considering that at the time in 2003 there was no ‘cloud’ as we know it today.  I had a dedicated box (1U server) hosted at my own data center (I was managing, among other things, a data center rack at the time).  This was running Windows Server 2000 and whatever goop came with it.  Additionally it ran SQL Server Express [insert some old version here].  I had moved on to another job and after a period of time I needed to move that server.  I was using the server for more than just my site, running about 10 other WordPress blogs for my community, my wife’s business, and various other things.  The WordPress sites were constantly being attacked/hacked due to vulnerabilities in WordPress, leading to my server being filled with massive video porn files without me knowing until my site was down, and then I had to log in remotely and clean crap up.  Loads of fun.  I eventually moved that server to a co-located environment at GoDaddy, still maintaining a dedicated 1U server for me.  It was nice having direct access to the server to do whatever I wanted, but I quickly didn’t need that type of hands-on configuration anymore…yet I was still dealing with the management.

During each of these moves I was just moving folders around.  I had no builds, no original source code reliably, etc.  “Fixing” things was me writing new code and finding interesting ways to redirect some SubText functionality, as I didn’t have the source nor was I interested in digging up tooling to get the source to work.  I never upgraded from Windows 2000 Server and was well beyond support for the things I was doing.  When I wanted to upgrade the OS at GoDaddy, I was faced with a “Sure, we’ll set up a new server for you and you migrate your apps” approach.  So again I was going to have to re-configure everything.  Another nightmare waiting to happen, and I just put it off.

To the Cloud!

My first step was moving data to the cloud.  I wrote about this when I did that task back in 2012.  I quickly learned that with my site not on Azure as well, and with the traffic I was still getting, the egress costs were not going to be attractive for me.  A few years ago I went about moving just my blog app to Azure App Service as well.  Not having anything to build, this was going to be a fun ‘deployment’ where I needed to copy a lot of things manually to my App Service environment.  I felt dirty just FTP-ing in to the environment and continually trying things until it worked.  But it eventually did.  I had my .NET Framework SubText app running on App Service and using Azure SQL.  The cool thing about Azure SQL is the monitoring and diagnostics it provides.  I was immediately met with a few recommendations and configuration changes I should make and/or it automatically made on my behalf.  That was awesome.  I did have one stored procedure from SubText that was causing all kinds of performance havoc, contributing to me hitting capacity with my chosen SQL plan and requiring me to bump up to the next plan at more cost.  Neither of which I wanted to do.  And because of what I mentioned earlier about SubText not being buildable, I couldn’t reliably make a change to the stored proc without really changing the code that called it.  Just another dent in the plan.  I needed a real migration plan.

Migrate to ASP.NET Core

I mentioned that in my new role I needed to spend time re-learning ASP.NET, and this was a perfect opportunity.  I decided to dedicate the time and ‘migrate’ to ASP.NET Core.  Why the air quotes?  Because realistically I couldn’t migrate anything but data.  I did not have reliably building source for SubText, and besides, it was WebForms and I didn’t know what I might be getting myself into.  I needed a new plan, which meant a new framework, and I went looking.  Immediately I was met with recommendations that I should go static, that Jekyll and GitHub Pages are the new hotness and why would I want anything else.  I don’t know; for me, I still wanted some flexibility in the way I worked, and I wasn’t seeing that I’d be able to get what I wanted out of a static site approach.  I wanted to move to ASP.NET Core solutions and found a few frameworks that looked attractive.  Most were in varying states of completeness and others felt just too verbose for my needs.  I landed on a recommendation to look at miniblogcore.  This was the smallest, simplest, most understandable solution to my needs that I found.  No frills, just render posts with some dynamism.

I did not even attempt to migrate any of my existing ASP.NET WebForms code or styling, as modern platforms were using Bootstrap and other things to do the site, so that was where I needed to start.  I spent a good amount of time working on the simple styling structure for a few things and learning MVC in the process to componentize some of the areas.  I added a few pieces of customization on the miniblog source: search routing (using Google site search), a timeline/calendar view thanks to Telerik controls, category browsing (although mine is horrendous due to waaay too many categories used over the years), Disqus for commenting, SyntaxHighlighter for code formatting support, an image provider for my embedded images during authoring, and a few other random things.  I wrote a little MVC controller to do the data migration once from SQL to the XML file-based storage that MiniBlog uses.  Migrating the data to the new structure was a lot simpler than I thought it would be.  Luckily my old blog had ‘slug’ support and this new one had it as well, so the URI mapping worked fine, but now I had to ensure the old routing would work.  I had to play around with some RegEx skills to accomplish this, but in the end I found a pattern that would match and implemented that in my routing, using proper redirect response codes:

// This is for redirecting potential existing URLs from the old Subtext URL formats
// old subtext non-slugged & slugged
// https://timheuer.com/blog/archive/2003/08/19/145.aspx
// https://timheuer.com/blog/archive/2015/04/21/join-windows-engineers-at-free-build-events-around-the-world-xaml.aspx
[Route("/post/{slug}")]
[Route("/blog/archive/{year:regex(\\d{{4}})}/{month:regex(\\d{{2}})}/{day:regex(\\d{{2}})}/{slug}.aspx")]
[HttpGet]
public IActionResult Redirects(string slug)
{
    // if the post was a non-named one we need to append some text to it otherwise it will think it is a page
    // if (slug doesn't contain letters) { redirect to $"post-{slug}" }
    var newSlug = slug;
    var isMatch = Regex.IsMatch(slug, "^[0-9]*$");
    if (isMatch) newSlug = $"post-{slug}";

    return LocalRedirectPermanent($"/blog/{newSlug}");
}

That ended up being remarkably simpler than I thought it would be as well.  Maintaining the URIs that had existed over time had been causing me stress, but using ASP.NET routing with RegEx I got what I needed quickly.

Moving to Azure App Service once this was all done was simple.  When I first moved, ASP.NET Core 3.0 wasn’t yet available, so I had to deploy as a self-contained app.  This isn’t difficult though, and in some cases may be more explicitly what you want to do.  I wrote about how to Deploy .NET apps as self-contained so you can follow the steps.  This basically is a ‘bring the framework with you’ approach for when the runtime might not be there.  Azure App Service now has .NET Core 3.1 available, though, so I no longer have to do that, but it’s good to know I can test future versions of .NET by using this mechanism.

Summary

So what did I learn?  Well, not having source for apps you care about hurts.  I didn’t even get a chance to truly attempt to migrate SubText to ASP.NET Core because I had let my implementation rot for so long.  I have become such a huge believer in DevOps now it’s unreal.  I won’t do even a simple project without it.  The confidence you gain when your projects have continuity through automation is amazing.  My new blog app is fully run on DevOps and deploys using that as well…I just commit changes and they are deployed when I approve them (a sketch of such a pipeline is below).  I learned that even though this was ‘just a blog’ it was a fairly involved app, with separation of user controls and things.  It didn’t need to be so complex, but it was, and I’m glad MiniBlog is not so complex.  The performance of my content site and the costs are much more manageable now, and my stress is reduced knowing that should anything happen I’m in a better place for restoring a good state.

My biggest TODO task, I think, is re-thinking the XML-based data store.  This is actually the one thing causing me some DevOps pain, because the ‘data’ is content within the web app, and when using slot-staging deployment that doesn’t work well.  Azure has a way to use Azure Storage as a mounted point to serve content from in your App Service, though, and I’ve started to try that with some mixed results so far.  Using this approach separates my app from my data and allows for more meaningful deployment flows and data backup.  I’ve also explored using an Azure Storage provider for my data layer, but the method for how the initial cache is built in MiniBlog right now makes this not a great story, due to startup latency when you have 2,000 posts to retrieve from a blob container.  I’m still playing around with ideas here, so if you have some I’d love to hear them (dasBlog users would hit similar concerns).
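For the curious, here is roughly the shape of that DevOps flow as a minimal Azure Pipelines sketch; the trigger branch, service connection, and app name are hypothetical placeholders, not my actual configuration:

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
# publish the site to a staging folder (zipped by default)
- task: DotNetCoreCLI@2
  displayName: 'Publish site'
  inputs:
    command: 'publish'
    publishWebProjects: true
    arguments: '-c Release -o $(Build.ArtifactStagingDirectory)'

# deploy the package to App Service
- task: AzureWebApp@1
  displayName: 'Deploy to App Service'
  inputs:
    azureSubscription: 'my-service-connection' # placeholder service connection
    appName: 'my-blog-app' # placeholder app name
    package: '$(Build.ArtifactStagingDirectory)/**/*.zip'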

I’m happy with where I landed and hope this keeps me on a path for a while.  I’ve got a simple, responsive design, easy-to-maintain source code, all the features I want (for now), no broken links (I think), a setup that works with my editing flow (Open Live Writer), and less stress worrying about a server.  I’ve already updated to ASP.NET Core 3.1, and it was a simple config change to do that now that my setup is so streamlined.

What are your migration stories?


In some of our internal discussion lists there were questions about how to host certain content for an application.  Most of the discussion came up from apps needing a privacy policy (Rule 4.1 from the Windows Store App Certification Requirements).  Some folks had apps they had just developed, but no “site” or service they were using.  But they needed to host a privacy policy.  Lots of thoughts were floated around, and I suggested Azure Free Web Sites as an option.  I originally suggested it as a simple way you could just have a URL to a privacy policy, but…duh, you could easily use it as a very quick marketing site for your app.

Creating a web site in Windows Azure

If you didn’t know, Windows Azure allows you to create a free web site!  It is very quick and simple to set up once you have your Azure account set up.  After doing that, go to the portal and choose to create a new Web Site.  I recommend picking from the gallery and choosing WordPress.  There is such a vast ecosystem around WordPress as a CMS that it is simple to use and set up.

NOTE: Of course you can use others, or even a static web site using the TFS/Git deploy feature.  Do whatever you want.  I just think WordPress is great and allows you to scale your site features without writing code, etc.

Choose to create a site from the gallery:

Azure Find Apps image

This will walk you through a wizard to pick some names and options.  It is pretty self-explanatory.  Once completed you will see your site provisioned in the portal.

Azure web sites image

Just click that link and you will be taken to your site…which for WordPress will be the initial setup page to choose your login.

Choosing a WordPress Theme

Once in the WordPress admin site, choose the Appearance option on the left, then the Themes sub-menu.  Once there, go to the Install Themes tab to pick a theme from the gallery.  Optionally you can grab a premium theme from various sites.  There is one called “Responsive” (search for that term) that is in the built-in gallery, free, and actually serves the basic needs well.  We’ll choose that one:

Responsive theme install image

After you pick it you’ll want to Activate it as the current theme.  Now let’s do some simple configuration.

Configure the WordPress site

In order to serve the goal of making this more of a marketing site and to host our privacy policy we don’t need commenting on pages.  Additionally we want “nice” URLs for our site.  Let’s start with a few simple tweaks.

First (in the WordPress admin site) go to Settings->Permalinks.  Choose the “Post name” option.

WordPress permalink settings image

Next go to Settings->Discussion.  At the top, uncheck the “Allow people to post comments on new articles” option:

WordPress discussion settings image

This will prevent commenting on content.  Next using the Responsive theme we can configure the home page.  Go to the Appearance->Theme Options section.  You’ll see a “Home Page” option for the Responsive theme.  You can set the main text, tagline and two other options.  This gives you the chance to set the URL for your app that is provided from the Windows Store. 

Here are some posts to help you find URLs for your app: Linking your apps on the web and Connect your web site to your store app.

Here is an example:

Responsive theme home page config image

Now go to Appearance->Widgets to modify some information on the home page for the 3 widget areas.  These could be simple things like a quick blurb about what the app is, maybe some top features, or whatever.  In the admin widget area you’ll see Home Page Widget 1,2,3.  Simply add the “Text” widget to these and you can add the title/text for this:

Home page widget image

You can also replace the theme logo with your own image in the Appearance->Header section.  You can additionally specify a custom image for the main “hero” section of the home page instead of the default one.  Once all these quick tweaks are done, your home page is finished.  Visit the site to see the quick changes.

Now let’s add our privacy policy so we have a permanent place for our privacy details.

Add a Privacy Policy page

From the WordPress admin site, go to Pages->Add Page.  Name it “Privacy Policy” and then type in your privacy policy text.  You can modify the text using HTML formatting to fit your needs.  Make sure to use the “Full Width Page” template so that the content spans the entire page.  Notice the URL for the permalink.  I recommend keeping it simple with “privacy” or “privacy-policy” as the name (which should be the default):

Privacy page editing image

Once you Publish the page, your site will now have a link to “Privacy Policy” on the home page and you can use that in your site, for your certification process and other areas you may need it.

You can now provide a link within your app to your privacy policy as well.  Andy has a simple method of adding it to the Settings charm and simply linking to your online site.  You can visit the site to confirm it looks as you wish:

Privacy page image

Done!  You can see this example at http://timscoolapp.azurewebsites.net

Profit!

Now you have a free site on Azure to host marketing for your app.  Of course this helps those getting over the hump of providing a nice place for a simple privacy page, but it also gives you a way to provide other pages for your app.  You could provide more detail on features, have a form to collect feature requests, whatever.  WordPress is very flexible, and the same process you used to create the privacy page can be used for other full-page content.  Or you can explore what else WordPress has to offer you.  Again, there are many different ways you can do this, and even within WordPress there are other themes you could choose.  However I think the Responsive one is a simple one to get started with as a base.

You may not want “azurewebsites.net” as your site URL either.  If you want, you can migrate to a shared instance (not free) and get custom domain name resolution on your Azure site as well.

Hope this helps!


I should have known better honestly.  I’ve had one strike with cloud billing catching me by surprise and I’m not sure why I’m shocked it happened again.  This time, however, I thought I really did plan it out, pay attention to things and asked what I thought were the right questions.  Unfortunately I didn’t get the full answers.  This time I was stung by my shiny new SQL Azure service choice.

UPDATE 12-APR-2012: Based on comments I've received I feel the need to clarify that I'm not bashing Azure or cloud services in general here.  I don't think anywhere I indicated Azure was a crap product or that I hated it at all.  In fact, I indicated I was completely happy with the service offering.  My frustration was *only* with the fact that the pricing was unclear to me based on how I researched it...that is all and nothing more.  As many have pointed out, cloud services like Azure are extremely important in the marketplace and the ability to scale real-time with minimal effort is an exceptional feature.  *FOR ME* I currently don't have those needs so I couldn't justify the charges beyond what I had planned...that is all, nothing more.  My experience with SQL Azure was a positive one as a product.  Quick setup, familiar tools to manage, worry-free database management, great admin interface and a reliable data storage solution.  My architecture, however, just didn't prove ideal currently with my site not being in Azure as well.  When VM roles come out of beta I will be sure to evaluate moving sites there and plan better.

A while back I heard about the change in price for some Windows Azure services, and the one that piqued my interest was SQL Azure.  At the time it hit me right as I needed to move around some of the hosting aspects of my site.  The lure of the $5/month SQL Azure database (as long as it was < 100MB) was appealing to me.  The SQL Server aspect of my site has always been a management headache for me, as I don’t want to have to worry about growing logs, etc.

Stung by marketing

I followed the announcements to the http://www.windowsazure.com site and read the descriptions of the services.  I was immediately convinced of the value and, heck, it was a service from my company, so why shouldn’t I give it a try and support it?  When I began to set it up, however, there were questions being asked during setup and I started to get concerned.  I asked around about whether this $5 fee was really the only fee.  I didn’t want to be surprised by things like compute time.  Perhaps I wasn’t asking specific enough questions, but all the answers I got were that signs pointed to yes, that would be my only fee.

NOTE: As of this writing yes I am a Microsoft employee, but this is my own opinion and I realize that peoples’ expectations and results vary.  This is only my experience.  I’m not only an employee but also a customer of Microsoft services and in this instance a full paying customer.  No internal benefits are used in my personal Azure hosting accounts.

Yesterday I learned that wasn’t the case.  I received my first Azure billing statement and it was way more than I expected.  Yes, my $5 database was there as expected, but there were also suddenly “Data Transfer” charges of $55.

Trying to make sense of billing

I tried to make sense of this billing right away.  I immediately remembered that I had created a storage account as well for a quick test, and perhaps I had forgotten to disable/delete that service.  I logged into the management portal and saw that my storage account was properly deleted and nowhere to be seen.  But how to make sense of these charges from the past week then?  Luckily Azure provides detailed usage data for download, so I grabbed that.  The CSV file I downloaded did indeed provide some detail…perhaps too much, as some of it I couldn’t discern, namely the one piece I had hoped would help me: Resource ID.  This ID was a GUID that I thought pointed to a service that I used.  It did not, or at least that GUID was nowhere to be seen on my Azure management portal.

I contacted the billing support immediately to help.  I was able to talk with a human fairly quickly which was a plus.  The gentleman explained to me that I had a lot of outgoing data leaving the Azure data centers and that was the source of the costs.  He asked if I knew if anything was connecting to my SQL Azure instance externally.  Well, duh, yes it was my site!  He went on to explain that this constitutes “Data Transfer” and I’m billed at a per GB rate for any data that leaves the Azure data center. 

I took a deep breath and asked where this was documented in my SQL Azure sign-up process.  We walked through the site together and he agreed that it wasn’t clear.  After being put on hold for a while, I was assured I would receive a credit for the misunderstanding.  Unfortunately for Azure, the damage was done and they lost a customer.

Where the failure occurred

For me the failure was twofold: me for not fully understanding terms and Azure for not fully explaining them in context.  I say “in context” because that was the key piece that was missing in my registration of my account.  Let me explain the flow I took (as I sent this same piece of internal feedback today as well) as a customer once I heard the announcement about the SQL Azure pricing changes:

  • I received notice of updated SQL Azure pricing
  • I visited the site http://www.windowsazure.com for more information
  • I clicked the top-level “PRICING” link provided, as pricing was my concern
  • I was presented with a fancy graphical calculator.  I moved the slider up to 100MB and confirmed the pricing on the side (no asterisks or anything)
  • I notice a “Learn more about pricing, billing and metering” link underneath the calculator and click it to learn more
  • I’m presented with a section of 10 different options, all presented at the same level, giving the appearance of unique services.
  • I choose the Database one and again read through and confirm the charge for the 100MB database option.
  • I click the “More about databases” link to double-verify and am presented with another detailed description of the billing

Not once during that process was context provided.  At none of the steps above (3 different pricing screens) was there context that additional fees could also apply to any given service.  “Data Transfer”, in fact, doesn’t even describe itself very well.  When I asked folks involved in Azure about my concern on pricing, this “Data Transfer” wasn’t brought up at all.  I’m not sure why it is listed alongside services and almost presented as a separate service, as it appears all Azure services are subject to data transfer fees.  This is not made clear during sign-up nor in the marketing of the pricing for each service.  SQL Azure should clearly state that the fees are the database *plus* any additional fees resulting from data transfer.  Heck, Amazon does this with S3, which also makes it confusing to anticipate the cost of billing there as well…but at least it is presented such that I know I need to factor it into my calculation.

I’m to blame, so why am I whining

I said I’m to blame as well for not better understanding what I was getting into.  It is unfortunate, because I really did like the service and felt an assurance of more reliability with my database than I had before.  The management portal was great, and the uptime and log management were things I didn’t have to think about anymore.

So why, you might ask, am I complaining about a service fee for something that was providing me value? 

NOTE: You may ask why I didn’t just move my site within Azure as well so that no data would be leaving the data centers.  This is a fair question, but unfortunately my site won’t run on any Azure hosting services, and additionally I manage a few sites on a single server, so it is cost-prohibitive for me to have multiple Azure hosting instances right now.

Well, it is simple.  I’m not made of money.  This blog has no accounting department or annual budget and, as such, I have to be smart about even the smallest cost.  I already have sunk costs in the server that hosts this site as well as a few others.  A $5/month database fee was nothing and easily justifiable with the value I was getting for the minor additional cost.  $50 (and growing) just wasn’t justifiable to me.  It was already at the same cost as my dedicated server and just no longer made sense for my scenario here.  In this instance I’m the “little guy” and need to think like one.  Perhaps cloud services are not for me.

Summary

So what did I learn?  Well, I really need to understand bandwidth and transfer data better for the sites I have.  Unfortunately this isn’t totally predictable for me, and as such, if I can’t predict the cost then it isn’t something that I should be using.  If you are considering these types of services, regardless of whether they are from Azure or Amazon (or whomever), you need to really plan out not only the service but how it will be used.  Don’t be lured by those shiny cost calculators that let you use sliders and show you awesome pricing but don’t help you estimate (or alert you to the fact) that some of those sliders should be linked together.

I think Azure (and other similar services) have real customer value…there is no doubt in that.  For me, however, it just isn’t the time right now.  The services, based on my configuration needs, just don’t make sense.  Had I had a clearer picture of this when signing up, I wouldn’t have been in this situation of frustration.  Choose your services wisely and understand your total usage of them.  For me it currently doesn’t make sense and I’m moving back to a SQL Express account on my server.  Yes I’ll have to manage it a bit more, but my costs will be known and predictable.

Hope this helps.


I run my site on Subtext, which has been around for 6+ years in some form (Subtext is a fork of .Text from way back).  As a part of the framework, there were initially built-in capabilities for tracking referral traffic.  On each view of the application, it would tick a referral note, and you could see this in the statistics view of the admin pages.

As the standards (for lack of a better term) for tracking Page Views, Referrals, etc. moved to more proven/consistent reporting like Google Analytics (or other platforms), this type of platform tracking became worthless to me.  I never checked it because, frankly, I didn’t believe it anyway.  The problem is that Subtext is still tracking this information for me and taking up valuable little bytes in my database.

For Subtext specifically, contributors have created scripts and maintenance pages to help manage some of these referrals that may not matter to folks and are just taking up space.  I am one of those people.  In my recent migration to SQL Azure I wanted to take advantage of the 100MB pricing.  Surely my blog was not bigger than that.  To my surprise, my blog was 650MB in size.

What!?

I hadn’t run my database maintenance script in a while, so I decided to run the one that purges the referral tracking.  It got down to 35MB.  Yeah baby.  In fact, this topic has been discussed a few times on the Subtext developer mailing list and is even tracked as an issue for the project.  In the meantime I wanted to solve it myself for my blog.

In Subtext there is a stored proc called subtext_TrackEntry that runs to record some of the entry tracking data.  Within that proc is where it looks to see if the view is a referral and adds that data.  I simply altered the proc on my end to look like this (keeping the old logic commented out just so I know what I did in case I need to revert):

ALTER PROCEDURE [dbo].[subtext_TrackEntry]
@EntryID INT, @BlogId INT, @Url NVARCHAR (255)=NULL, @IsWeb BIT
WITH EXECUTE AS CALLER
AS
-- Removing the referral tracking
-- if(@Url is not NULL AND @IsWeb = 1)
-- BEGIN
--    EXEC [dbo].[subtext_InsertReferral] @EntryID, @BlogId, @Url
-- END
EXEC [dbo].[subtext_InsertEntryViewCount] @EntryID, @BlogId, @IsWeb

Now I’m no longer tracking referrals, because my analytics package is doing that for me already.  My database is now representative of things that matter to me, rather than things I just want to clean up.  If you are a Subtext user and never knew that referral logging was bloating your database (and you are using an analytics package to track that anyway), then I hope this helps!