| Comments

I run my site on Subtext, which has been around in some form for 6+ years (Subtext is a fork of .Text from way back).  As a part of the framework, there were initially built-in capabilities for tracking referral traffic.  On each view of the application, it would record a referral entry and you could see this in the statistics view of the admin pages.

As the standards (for lack of a better term) for tracking page views, referrals, etc. moved to more proven/consistent reporting like Google Analytics (or other platforms), these types of platform tracking became worthless to me.  I never checked them because, frankly, I didn’t believe them anyway.  The problem is that Subtext is still tracking this information for me and taking up valuable little bytes in my database.

For Subtext specifically, contributors have created scripts and maintenance pages to help manage these referrals, which may not matter to folks and are just taking up space.  I am one of those people.  In my recent migration to SQL Azure I wanted to take advantage of the 100MB pricing tier.  Surely my blog was not bigger than that.  To my surprise, my blog was 650MB in size.

What!?

I hadn’t run my database maintenance script, which purges the referral tracking, in a while and decided to run it.  The database got down to 35MB.  Yeah baby.  In fact, this topic has been discussed a few times on the Subtext developer mailing list and is even tracked as an issue for the project.  In the meantime I wanted to solve it myself for my blog.
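If you want to do the purge by hand, the heart of those maintenance scripts is tiny.  Here is a minimal sketch (assuming the default Subtext referral table name; the database name in the shrink is a placeholder):

    -- purge the accumulated referral rows (table name from a default Subtext install)
    DELETE FROM [dbo].[subtext_Referrals];
    -- reclaim the file space on a self-hosted SQL Server
    -- (DBCC commands are not available on SQL Azure)
    DBCC SHRINKDATABASE (N'YourSubtextDb');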

In Subtext there is a stored proc called subtext_TrackEntry that runs to record some of the entry tracking data.  Within that proc is where it checks whether the view is a referral and adds that data.  I simply altered the proc on my end to look like this (keeping the old logic commented out so that I know what I did in case I need to revert):

    ALTER PROCEDURE [dbo].[subtext_TrackEntry]
    @EntryID INT, @BlogId INT, @Url NVARCHAR (255)=NULL, @IsWeb BIT
    WITH EXECUTE AS CALLER
    AS
    -- Removing the referral tracking
    -- if(@Url is not NULL AND @IsWeb = 1)
    -- BEGIN
    --    EXEC [dbo].[subtext_InsertReferral] @EntryID, @BlogId, @Url
    -- END
    EXEC [dbo].[subtext_InsertEntryViewCount] @EntryID, @BlogId, @IsWeb

Now I’m no longer tracking referrals because my analytics package is doing that for me already.  My database now holds the things that matter to me, rather than things I just want to clean up.  If you are a Subtext user and never knew that referral logging was wasting space in your database (and you are using an analytics package to track referrals anyway), then I hope this helps!

| Comments

For the past 6 years I’ve run this blog on the Subtext project (an Open Source ASP.NET blog framework).  It has served me very well, being flexible and allowing me to customize the things I want.  It is based on SQL Server and uses stored procedures and relational database “stuff” to accomplish the goals of the design.

Recently I saw the news from Scott Guthrie about the reduction in pricing on some Windows Azure products, introducing a 100MB pricing option for SQL Azure, which you can read about here and here.  I thought this would be a good time to start looking at moving some of my blog’s infrastructure “to the cloud,” so I started looking at the details…after all, $5/month seemed reasonable for my database.

Now, I’m only talking about the database portion here…not the ASP.NET application.  Right now I’m happy being in complete control of my own server and have no need to move the .NET site at this time.  Perhaps in the future, but given the pricing on app hosting, it just doesn’t make sense for my little site here.  I have, however, had some issues with the database infrastructure over the past year: some latency, outages, and just the age of the server it is hosted on (not the same as my web app).  Because of this I was investigating SQL Azure.

I’m happy to say that this site now runs the data side on SQL Azure…but the process was not without hiccups.  I wanted to share some frustrations I had so that you might be able to avoid them.  My comments below are related to migrating an existing SQL Server database to SQL Azure and not creating a new one from scratch.

My environment

To set the stage, it is fair to note my particular environment for migration.  My database resides on a Windows 2003 server and is a SQL Server 2005 instance/database.  My web application also resides on a Windows 2003 server and is configured in a virtual LAN to have access to the database infrastructure server. 

Yes, I realize these are not the “latest” in server products, but I also don’t think they are too old, relatively speaking.  They meet my needs, and I haven’t needed any of the new features for a long while (I’ve been wanting to move to Windows 2008 and ASP.NET MVC for Subtext, though).

Getting the necessary tools

I chuckled a bit at how many client tools I actually needed to complete this process of moving things to the ‘cloud.’  I’ve had experience in the past moving SQL databases around using SQL Server Management Studio (SSMS) and other script methods.  I thought this would be very similar using the import/export capabilities nicely provided in SSMS.

I was wrong.

In doing this I started on the Windows Azure web site for some guidance.  Now, I clearly didn’t navigate deep enough (more on that in a moment) because I wasn’t finding the “migrating an existing database to Azure” article I was looking for.  So I did what anyone else would do and searched the web for ‘migrate sql to azure’.  The top result was titled Migrating Databases to SQL Azure.  This provided me with a few options, but no really good end-to-end example.  Truth be told, I tried a few of these and was completely frustrated because the details were not complete.

In the end I found what I needed.  Here are the tools that proved essential and that you will need:

  • SQL Server Management Studio (SSMS) 2008 R2 SP1
  • SQL Azure Migration Wizard
  • Direct connectivity to your existing SQL Server database
  • Direct connectivity (firewall access) to your SQL Azure server

The last 2 bullets (direct connectivity) are needed to ensure you can do this from your machine.  The last one – access to your SQL Azure server – isn’t an entirely intuitive thing to set up in advance, but you need to.  Another thing to note is that in order for the SQL Azure Migration Wizard to work, your SSMS installation must be 2008 R2 SP1.  The link above is to SP1, but I could not find a download for just the SSMS tool for R2 SP1, so I downloaded the full SP1 of SQL Express With Tools and just installed the management studio.

Setting up the connections

Once you configure your Windows Azure account, you’ll need to create a SQL Azure server.  This initially confused (and concerned) me because I only wanted my 100MB database account and not anything that would bump up my compute costs.  However, I’ve been assured this just represents the “instance” of your DB and not a compute server.  So you’ll need to configure that first.

To do this you’ll log in to your account at https://windows.azure.com and select the subscription you set up for SQL Azure.  You’ll then want to add a firewall rule to your server.  The “Add” button shows your current IP address so you can add just that if you’d like.

SQL Azure Server Configuration

This takes a few minutes to propagate, so I’d do this first.  Once you have this, you can set up the connection in SSMS.  Your server name is on the right hand side of this screen (blurred for my account) and is something like XXXXXX.database.windows.net.  In SSMS you will connect to this using XXXXXX.database.windows.net,1433 as the server name.

Start by adding connections in SSMS for your current database and the SQL Azure server you just configured.  Do not create any database on SQL Azure yet.

Exporting the current database schema

I tried a few different methods, but by far the easiest was the Data-tier Application method.  To do this, go to your existing database in SSMS, right-click, choose Tasks and then Extract Data-tier Application:

Extract Data-tier Application

This will create a “dacpac” file via the wizard you will be presented with.  Essentially this extracts the schema and objects for you.  Now why this instead of just a normal T-SQL script?  Your mileage may vary, but this was the only method I had real success with in my configuration.

Creating the database from the DACPAC

Once you have the exported .dacpac file, go back to SSMS and on your SQL Azure instance right-click and choose Deploy Data-tier Application:

Deploy Data-tier Application

This will create the database and schema for you, but not any data.  This is another wizard that walks you through this process.  Once complete you should have a new SQL Azure database matching the schema from your original one.

Migrating the data

Once I had the new schema in my SQL Azure database I was ready to move the data.  This is where the SQL Azure Migration Wizard comes into play.  Launch that tool and you will be asked to choose a source and destination target.  For the source, connect to your original database and, after specifying the connection information (I chose just the DB, not master, and was fine), click the Advanced button and change to Data only:

Migrating data only

You will then start the process and notice that it is basically running bcp.exe commands for you to extract the data:

Migrating data
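I didn’t capture the exact commands, but each table is moved with a native-format bcp out and later a bcp in against the destination.  Roughly like this – the server, database, and table names here are illustrative:

    rem export from the source server (native format, trusted connection)
    bcp MyBlogDb.dbo.subtext_Content out subtext_Content.dat -n -S MYSQL2005 -T
    rem import into the SQL Azure database (note the user@server login form)
    bcp MyAzureDb.dbo.subtext_Content in subtext_Content.dat -n -S XXXXXX.database.windows.net -U myuser@XXXXXX -P mypassword -b 1000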

Once this is done you will select the destination – the SQL Azure DB that was just created.  Now, since SQL Azure may not have the same features as your source database, there may be some conflicts.  The migration wizard will stop on errors in the bcp import commands and give you a chance to resolve the conflicts and continue.  As an example, some of my clustered indexes didn’t transfer over in the schema creation (no idea why) and I needed to re-create those before two tables could be imported (a sketch is below).  No big deal, but it was cool that the import was “paused” for me with a Retry function so that I could do this without starting all over.
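For reference, SQL Azure requires a clustered index on a table before it will accept rows, so re-creating one is a one-liner.  Something like this (the table and column names are just examples, not necessarily the ones that failed for me):

    -- SQL Azure will not bcp rows into a table without a clustered index
    CREATE CLUSTERED INDEX [IX_subtext_Referrals_EntryID]
        ON [dbo].[subtext_Referrals] ([EntryID]);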

Migration clean-up

There were a few things that didn’t migrate well for my Subtext experiment here.  First, even though stored procedures in my source database had correctly identified some parameters as output parameters, it seems they didn’t transfer well.  I’m not sure if this is an issue with the Data-tier Application export or something in SQL Azure, but it required me to go back and ALTER those procs with the correct OUTPUT flag.  Luckily I could do this through the Silverlight application for managing my database just fine after the database was configured:

Alter stored procs in portal
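The fix itself was mundane – just re-declaring the parameter with the flag it lost.  A hedged example (the proc and parameter names here are illustrative, not actual Subtext objects):

    -- re-declare the parameter with OUTPUT, which was lost in the transfer
    ALTER PROCEDURE [dbo].[subtext_GetEntryCount]
        @BlogId INT,
        @Total  INT OUTPUT
    AS
    SELECT @Total = COUNT(*) FROM [dbo].[subtext_Content] WHERE BlogId = @BlogId;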

Logins also didn’t transfer, but users did.  My web app doesn’t use an admin user, so I wanted to make sure I had a correct login for it.  In SSMS connected to SQL Azure there is no GUI for doing this; you have to use T-SQL scripts.  SSMS, when connected to a SQL Azure DB, will generate the script template for you and you just put in the right values.
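The generated scripts boil down to a login in the master database plus a user (and role membership) in your application database.  Something like this, with placeholder names and password:

    -- run while connected to the master database
    CREATE LOGIN blogapp WITH PASSWORD = 'SomeStrongPassword1!';

    -- run while connected to your application database
    CREATE USER blogapp FOR LOGIN blogapp;
    EXEC sp_addrolemember 'db_datareader', 'blogapp';
    EXEC sp_addrolemember 'db_datawriter', 'blogapp';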

Subtext uses some system stored procedures in some admin screens that I use, and sp_spaceused is not available.  Luckily others have seen this, and I just needed to modify some areas to use similar scripts.
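The substitute scripts pull size information from the dynamic management views instead.  For example, to approximate the database size that sp_spaceused would report:

    -- approximate database size in MB (a common sp_spaceused substitute on SQL Azure)
    SELECT SUM(reserved_page_count) * 8.0 / 1024 AS [SizeMB]
    FROM sys.dm_db_partition_stats;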

General frustrations

Some of you may be looking at this and wondering why I had so much trouble.  Why didn’t I just read this document (don’t you hate me for putting that last) that walked me through similar steps (minus the migration wizard)?  That would be a good question.  That document didn’t show up in search for me anywhere, and it is under the “Develop” section of the Azure information site.  I didn’t think to look there as I wasn’t developing anything just yet.  I only found out about that link from my friend Peter Laudati.  It would have saved me some time, but not most.  The first link on that site points to the SSMS R2 RTM download…but the migration wizard requires SP1, and without it you’d see an error message about some missing SQL types (SMO).

Why didn’t I just use the migration wizard for schema *and* data…why two steps?  That’s a good question.  Frankly, I don’t know why the migration wizard didn’t work for me for the wholesale schema+data approach.  It could be my SQL2005 version or something.  But for me, it just didn’t work.  The steps above were the only path that worked for me and my SQL2005 database.

Summary

While I was successful in finally migrating my database, discovering the proper steps wasn’t as in-my-face as it should be.  There are pieced-together articles about migrating, but I expected the MSDN article to be a more full-featured end-to-end example.  I remember there was a big push for Access->SQL Server, and Microsoft provided an “upsizing wizard” to move that data.  I wish that Azure had something more one-stop like that.  The migration wizard seems like a first approach, but didn’t work as smoothly as a one-stop solution for me…hopefully it does for others.  It would have also been nice to have this actually integrated into the tools of the Azure portal.  Let me provide a connection to my existing database and just magically create the Azure one for me – that would have been awesome.

Once my migration was complete, everything in SQL Azure worked with my app as expected, and the tools are familiar enough for me to do any maintenance on the data that I need.  I like the web-based Silverlight management interface, where I can get a snapshot of my database at any time; it even displays query performance on the dashboard, which is cool.  I don’t have access to the log files, nor am I able to run DBCC commands anymore, but I’m trusting that SQL Azure is more efficient than my own DBA skills of old and that my database will be managed effectively with regard to these items.

Hope this helps.

| Comments

I wanted to believe, I really did.  It has been over a month since my first impressions of the Amazon Kindle Fire, and over the holidays I processed a return for mine.  When the Fire was announced I was intrigued and excited; I thought that Amazon had the real potential to make a great product and the customer base to capitalize on that potential.  For me, it just didn’t live up to the hype.  I’ll stress that last sentence…this is my opinion based on my experiences/desires.  As with anything in life, your mileage may vary.

So what went wrong?

I used the Fire a lot.  I watched videos on it daily (my evening ritual of getting caught up on TV) via the Netflix and Hulu apps.  I rented about 10 movies via Amazon on the device.  For video, it was great.  For everything else, it was pretty much frustrating for me.  I’ve been able to isolate the problems to a few areas: apps, user experience, and prejudice.

Apps

I downloaded the free daily app from the Amazon Android store every day…and mostly ended up with a device full of sub-standard products.  The Hulu app was really the only 3rd-party one that I felt was designed for the Fire, and it did most things well.  Even then it had quirks, but mostly it was fine.  Netflix’s app is horrible: laggy, confusing, and not enjoyable to use before you even get to playing content.  Most other apps just weren’t doing anything for me.

The lack of a Mail solution *provided by the device* for my mail configuration led to decreased usage of the device for me.  The responsiveness in the games that I acquired was just not there either.  Overall I felt the only “app” I was using was video playback.  Everything else wasn’t cutting it…even the Kindle reading app was just too bright for me for long periods of reading.

User Experience

This was a large area of failure for me.  Here’s my list of areas that lacked polish and just failed:

  • Hardware home button – I’m realizing how important this really is.  My kids couldn’t figure out how to get back to the ‘start’ screen.  On the iPad, they know immediately.
  • Software ‘home bar’ (not sure what to call it) sometimes appeared, sometimes didn’t.
  • Touch responsiveness – I felt like I had to do gestures multiple times to get it to respond.  The first update was said to fix some of this, but it didn’t do anything noticeable for my use.
  • Touch feedback – I know this seems odd, but there were times I couldn’t tell if I had actually completed a touch interaction…visual state changes didn’t happen, etc.
  • Orientation changing – general inconsistency in what was supported or not within the Fire’s own set of experiences.  And the transition from one orientation to another was jarring – a snap rather than a smooth transition.
  • Apps experience – no consistency.  I’m not looking for let’s-make-every-app-the-same consistency, but as a user there was no real reliability in controls usage, visuals, responsiveness, action expectations, etc.  This is the good/bad of the Android platform – ultimate freedom, but sometimes at the price of confusion and quality.
  • Application lifetime – the management of application state was horrible for an end-user.  The rough parts of Android really showed through here.  I would occasionally get “not responding” windows in an application or when trying to start one.  These types of things do not pass the mother-in-law sniff test for me.

These were some of the things that continually frustrated me.  There were other nits, but not always in my face. 

Prejudice

Aside from any technical reasons, the biggest factor in my return is prejudice.  Don’t get me wrong, I love Amazon.  I’m a Prime member and get my purchased digital media only from them (i.e., video rentals and MP3s).  They have great service offerings and a great catalog of goods.  These are all the reasons I thought they could execute well out-of-the-gate with the Fire.

However, I also have an iPad.

Make no mistake about it: if you use an iPad for the same amount of time you use a Kindle Fire, you will likely share my conclusion that the iPad is just an all-around better product currently.  Now, the media (and users like myself) are the ones drawing the comparisons of the Fire to the iPad.  Amazon itself hasn’t done any side-by-side comparisons, or even come remotely close.  They have never marketed (to my knowledge) the Fire as an iPad competitor.  But that doesn’t matter…because consumers rule the world and we have already drawn that conclusion.  Bottom line: if you are making a touch device I can travel with, that has media and a store where I can get applications and content – you’re competing with the iPad.

Since I already am an iPad user, I could not erase the experience I have with my iPad when using the Fire.  All my user experience annoyances around touch exist because it is just better on the iPad.  If I didn’t have an iPad, maybe my perception would be hugely different.  But since I have one, my prejudice is set, and the comparison bar along with it.

Holiday gift taste test

When I arrived at the in-laws’ for the holidays, they mentioned they were getting my wife’s ~80-year-old (*very* active) grandmother a Kindle Fire because that is what she wanted.  I cringed a bit (and probably commented too much) at the idea and told them I didn’t think this was a good idea.  GG (as we call her, since she has 12 great-grandchildren) is not technically savvy and has never had anything remotely considered “new tech” in her life.  I knew that it would fall on me to be the resident Nick Burns and trainer for the holiday week.  And the time did come where I had to do that.  It went something like this *before* we started configuring her Fire…

Me: GG, why do you want a Fire?
GG: I want to get ‘with the times’ and this seems to be a hot item.
Me: Do you have an Amazon account or have ever bought anything on Amazon?
GG: No, never. Can’t I put books on it?
Me: Yes, but where do you plan on getting those books?
GG: Can’t I get them anywhere?
Me: No, you’ll be buying them through Amazon.

NOTE: I didn’t want to explain that technically you could put other publications on there as I knew that would be an action never accomplished.

GG: You mean I can’t get something from Barnes and Noble and put it on my Fire?
Me: No. But why would you?  Amazon has a massive content library.
GG: Well, that seems monopolistic. What about movies?
Me: Yep, you can get movies, but through Amazon.
Me: Most of the time, anything you put on there you will be buying from Amazon.

This point seemed to have been lost on GG when she was desiring this device.  Regardless, we proceeded with the setup.  Since the device was purchased by the mother-in-law, when powered on it was attached to her account, and we had to set up a new account for GG.  This was going to be fun, I thought.

The first step was to create an Amazon account since she didn’t have one.  The first screen on the Fire to do this asks for 4 simple bits of information: email, username, password, password confirmation.  This was GG’s first introduction to a software keyboard, and it did not go well.  The first mistake made was to “press” the keyboard, and I had to explain that click, press, and push are no longer useful terms; tap, swipe, and tap+hold are the new ways she needed to think.  This took some training as she continually hit wrong keys, held a key too long (which produced duplicates), etc.  I am not sure if it was her bifocals or what, but GG was continually ‘off by 1’ on the keyboard and we had to make corrections many times.  The password field was the hardest because it obfuscates each letter shortly after it is typed, providing minimal visual time to see if what was typed was correct.  Now, I timed this exercise myself so I could see how long it really took.  With no exaggeration, the time to complete this screen was about 30 minutes.  The password/re-enter password fields took up most of that time.  The next screen was address information…to which I offered to enter this data for her :-).  After that was credit card data.

GG: Why do they need my credit card?
Me: How do you plan on buying anything, money order?

In seriousness, this pointed to a generational gap around the concept of stored account information for one-click purchasing, which is available on things like Amazon, Apple, anywhere.

We moved on to a review of the Fire, notably my mentioning that the user guide itself was a Kindle book.  This did not please GG, as she was used to a manual.  Since she is a Scrabble lover and other folks in the house were playing Words with Friends, we downloaded that app, set up an account for her, and taught her how to play.  Again, the touch interaction here was painful to watch.

My bottom line for sharing this anecdote is that I don’t think the Fire is an every-generation device.  Contrast that with the iPad, where I think she would have had a much better on-boarding experience.  I left GG alone for the day with her device, and the next day she shared her frustration that things didn’t seem to work and that it was hard to use the touch keyboard and understand what to do.  Now, I can easily (and will) chalk this up to a generational thing and GG being a first-time ‘device’ user.  However, it pointed to a fact for me: the Fire is only for folks who are familiar with computers in a more-than-one-time-usage manner.

Summary

I will stress again that, for me, the Kindle Fire was a bust.  I still faithfully have my own Kindle reader, which I will still hail as the ultimate in reading devices (and what I think GG should exchange her Fire for).  The Fire, in its current form, however, is a bust in my opinion.  I think Amazon *can* get this right if they put some muscle behind it, tighten up the Android edges that show, and concentrate a little more on experience refinement.  I absolutely loved the size of the device (hoping Apple takes note) and think that in a few versions they might get it right.

But for now, the Kindle Fire has been returned…and with a great customer service policy, my money fully refunded, satisfaction guaranteed.

| Comments

Lately I’ve been doing a lot of re-paving of machines, and I never had my favorite tools on them, nor did I want to spend the time re-installing a set of tools that I knew I would blow away each day anyway.  Mostly my daily builds have been for scenario validation, which is quite repetitive.  However, there are times when a stable build combination comes along that I keep for a while to work on customer apps or sample development.  When those times happen I find myself needing my helpful little utilities more frequently.

Recently I’ve been trying to learn Git, a version control system that has been gaining popularity over the past few years and is quite cool and agile.  I am generally a huge fan of GUI tools because I feel they are more in line with how I use other parts of the operating system.  When exploring Git, however, I wasn’t a fan of having to re-install the tools over and over and do configuration, etc.  That’s when I discovered portable Git.  It immediately made me realize how dumb I’ve been all this time across machines, and it renewed my love for portable tools.

What is a portable tool? 

Quite simply, it is a tool, regardless of size, that can run with no dependencies other than those it comes with in its directory or executable.  There is no requirement of “oh, you must have Foo framework 1.0.2.123 installed to use this”; portable tools operate completely on their own.  Now, some are single, small executables.  Others are full-blown programs that bring some serious runtime environments with them.  But neither requires anything to pre-exist, and both can run without installation.

I took one of my 8GB USB drives (Costco was having a sale – 3 8GB sticks for $24!!) and started loading up my favorites.  Here’s what it ended up like:

portable tools directory

Now, I use other tools on a more permanent basis, and when Windows 8 gets into more stable releases I’ll probably lock those to on-the-metal installs.  I also don’t require a huge set of tools all the time, just my favorites.  So what are those directories?

  • DiffMerge – a great, simple-but-visual tool to help do file or directory diff checking and merging if desired.  Some like WinDiff a lot, but I’ve been liking DiffMerge lately.
  • Git – as mentioned above, this is the portable Git tools which provides a bash or command prompt environment.
  • Inkscape – a vector-based graphics tool
  • Notepad2 – my absolute favorite simple text editor
  • Notepad++ (npp in the above image) – another great text editor.  Why two you ask?  The one thing I like about Notepad++ is that I can open up multiple files at a time and do find/replaces across multiple files.  I don’t always do that, but it is a time saver when I need it.
  • Sublime Text – yet another text editor.  When looking for cool portable tools I found this one.  I’m not sure I’ll use it given Notepad2/++ but since it is in the picture, I thought I’d explain it.  Some say it is the TextMate for Windows.
  • “portplat” – more on this in a bit.
  • _ninja – I wish I could tell you…but, well, I can’t.

This USB key is now extremely helpful to me on a regular basis.  After loading up a new machine I have instant access to the tiny little things that make me feel at home and immediately productive.

NOTE: There are a few more tools that I use less frequently but think I’ll add to my USB key, like Reflector-type tools.

In writing this post I wanted more…a lot more, and I started looking around.  Then I found PortableApps.com!  I felt like a complete idiot when I discovered this, because I thought I was so cool seeking out portable tools, and here was a site that had already aggregated them for me and, above that, made them easy to acquire!

What they do is provide a “Portable Apps Platform” that serves as basically a mini-launcher for a set of portable tools.  In addition to my folders above, here’s what my USB key now allows me to have as well:

Portable Apps Platform

Now, I should note that my existing tools (with the exception of DiffMerge and Git) are all available through the Portable Apps Platform setup.  I was already set in my ways, so I figured there was no need to change what I already had.  The Portable Apps Platform has a directory with a ton of apps:

Portable App Directory

Notice the scrollbar in that image above?  There is lots of stuff here.  Most I would never need, but nonetheless it is there…even portable games!

So now I have versions of some browsers on my USB key, plus two of my other favorite tools I didn’t even know were portable: 7-Zip and Console2.  The great thing as well is that any configuration you make for these portable tools travels too…so my customizations for my Console2 environment are on the USB drive and I don’t have to set up my fonts/colors each time!

Within each tool there are likely customizations that you may want.  For example, I like having the “Open With…” settings on my context menu for Notepad2 and Notepad++ for convenience.  On my USB key I keep a setup.bat file with any special configurations for each tool that I want (ones that either aren’t kept in the portable environment or are machine-specific, like the context menu) – a sketch is below.  I quickly run that and am ready to use my tools the way I’m familiar with them.
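To give you an idea (the paths and key names here are just my hypothetical layout, not something the tools ship with), the context-menu portion of that setup.bat is a couple of registry writes:

    @echo off
    rem add "Open with Notepad2" / "Open with Notepad++" to the context menu
    rem for all file types; %~dp0 resolves to the folder on the USB key that
    rem this batch file lives in
    reg add "HKCU\Software\Classes\*\shell\Open with Notepad2\command" /ve /d "\"%~dp0Notepad2\Notepad2.exe\" \"%%1\"" /f
    reg add "HKCU\Software\Classes\*\shell\Open with Notepad++\command" /ve /d "\"%~dp0npp\notepad++.exe\" \"%%1\"" /f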

Some tools also have config files built into their environments, like Git.  I was sick of continually typing git config --global user.name “Tim Heuer” and other config commands each time I set that up.  Luckily, after a few questions on Twitter, Paul Betts was able to point me to the obvious.  So now I have a customized .gitconfig file on my USB key, and whenever I use the bash environment my settings are already there!
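The file itself is just plain text that travels with the key.  A trimmed-down version of mine looks something like this (the email and the aliases are placeholders; season to taste):

    [user]
        name = Tim Heuer
        email = you@example.com
    [alias]
        st = status
        co = checkout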

So what’s missing?

I’m feeling so liberated with these tools lately that now I’m frustrated I can’t get everything in an install-free environment!

Please don’t rant about Mac, Linux, whatever OS environment lets me do that.  I’m on Windows and it doesn’t for everything.  Yes…the registry lives…I deal with it.

There are a few things I wish that were portable that would complete me.

  • Visual Studio – ah, someday maybe, someday.
  • Paint.NET – While Inkscape is awesome, it isn’t as familiar to me as Photoshop-esque tools.  Paint.NET is the closest to that and would be awesome to always have around.
  • Fiddler – nuf said.
  • Silverlight Spy – extremely cool tool

It’s not a long list, but these are some regular tools that I wish were portable. 

So there you have it. If you haven’t discovered a portable set of tools, you should get out a USB key and load some up. Who knows when you’ll find it handy!

| Comments

One of the features introduced with Silverlight 4 was the out-of-browser feature, enabling you to create an application that can be installed, run offline, automatically updated, etc.  As a part of that feature, some of the major code signing certificate vendors (for Authenticode certs) provided our team with test certificates so that we could go through the same process as a developer would to acquire the cert and apply it to an app…and, of course, validate it works.

During that time some of those vendors had promotional codes for the first year for Silverlight developers, providing reduced-rate (but not reduced-quality) code-signing certificates for their apps.  Still, during this time there were a lot of people who questioned why some providers were still expensive and didn’t value “the little guy.”  By that I mean that there are a lot of smaller firms and independent personal developers, and the thought of dropping a few hundred dollars on a cert is sometimes tough.

Last week a representative contacted me about their offerings as a premier partner of one of those providers.  Certs4less.com is now offering Thawte code-signing certificates for individual developers.  They are doing this at a price of $99 per year (less for multi-year). 

NOTE: As a part of this, like before with SL4, Certs4Less graciously offered a promotional cert for me to validate the end-to-end process so that I could speak accurately about it.  I do not use any of the certs provided by these companies toward any production application; they are for testing purposes only.  Besides, I’ve not found the time to write production code for apps lately ;-).  I am not getting paid for this post, nor am I getting another promo code for personal use.  I am simply providing what I think is valuable information and get no compensation from Thawte or Certs4Less.

I went through the process of obtaining this cert from Certs4Less.com, and it produced exactly what you’d expect: a valid Authenticode code-signing certificate I can use for my Silverlight and Windows 8 application packages!  I shared a few points of feedback with my contact there and will enumerate them here for you as well (along with some tips).

Your ‘Common Name’

Think about this one pretty carefully when you buy a cert.  There is a two-fold purpose to why I mention it.  First, it is what your customers will see.  Do you want them to see an app signed by a name that isn’t recognizable or doesn’t make sense?  Of course not.  Additionally, this is the name that will be verified.  So if you claim you work for Fizbin Enterprises, but that doesn’t actually exist…you’ll have issues during verification.

One year, 2 or more

One thing you should know about code-signing certificates is that once they expire, the keys change during renewal.  In some cases this can cause issues for your app (ClickOnce).  For this reason I personally recommend getting the longest term you can afford.  Most likely this will be a wise investment and you’ll have peace of mind.

Apply on the computer you will receive it

One thing we as developers don’t do well is read directions.  One of the instructions you’ll see is to be sure you do the cert request process from the same machine you plan on retrieving the cert from!  Seriously, this is critical if you use the browser process, because of the private key.  If you don’t…you’ll be screwed and out some cash.  Plan ahead, and don’t do this while on vacation on your laptop that you re-pave weekly.

Verification Process

This is the area where I had the most negative feedback.  These verification steps are a bit old.  I understand they have their reasons, but in this digital age the fact that I had to find a notary was…well, just inconvenient.  This Certs4Less/Thawte process required me to do that.  The ‘form’ they emailed me really wasn’t a form…just an email with text broken out with ‘==========================’ before each section.  So when I brought my printed-out Gmail ‘form’ to the notary, he looked at me like I was an idiot.  The verification form was nothing formal looking at all, and I had to have 3 different people look at it before they finally just said ‘okay’ and signed it.

The thing that was most troublesome in this process was that it was a distraction.  I had to actually print stuff out, find a passport, go to a bank, wait in line…you know, real people stuff.  It felt annoying in this modern age.

Some of my other processes with other vendors have been a lot more streamlined, and I think this one can/should improve.

Acquiring the certificate

Most of the time this is a quick process.  Remember when I mentioned that developers don’t read instructions?  Yeah, I’m no different.  The final email I got, indicating my cert was ready, had instructions (which I didn’t read) about making sure I had intermediate certificates installed first.  Without those I got ambiguous errors when trying to retrieve my certificate.  Be sure to read the retrieval instructions in detail for a good experience.

Back up/export your certificate

I don’t know about you, but I’d probably use my cert in automated build processes and keep it on a share (perhaps a Dropbox/Live/Git location) so that I don’t have to use only my one machine to sign an app.  One thing I highly recommend after the key is installed is to use the certmgr.msc tool and export the certificate.  When doing this, be sure to export all the key data as well as the cert chain so that your resulting PFX file is portable.  Then you can use it in your build process for Silverlight as described here in my previous blog post about that feature.
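Once exported, that PFX is what your signing step consumes.  As a rough illustration with signtool.exe (the file names, password, and timestamp URL are placeholders – use whatever your CA recommends):

    rem sign the app package with the exported PFX; /t timestamps the signature
    rem so it remains valid after the certificate itself expires
    signtool sign /f MyCodeSigning.pfx /p MyPfxPassword /t http://timestamp.example.com/authenticode /v MyApp.xap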

Summary

I want to thank Certs4Less for reaching out to the independent developer and providing a valuable product at an ‘independent developer’ price level.  I also appreciate them allowing me to test the process to verify that it is fairly painless and the result is what I expected.

Code-signing certificates are very valuable in many ways and I believe every developer should have one for their personal projects as well as their large ones.

Hope this helps!