
Build 2017 UI Recap

Well that was fun!  It was really exciting to share with the world what our team has been designing and developing over the past few years with regard to Windows UI platform advancements.  Build 2017 was a culmination of a lot of effort across the company in various areas, but for UI it was the introduction of our evolution of design, the Fluent Design System.  This represents a wave of UI innovations over time, with Build 2017 showing the first views of Wave 1.  There was a lot of great buzz about Fluent, but for a great introduction be sure to check out my colleague Paul Gusmorino’s session introducing the design system:

Of course as developers sometimes we wince at the word ‘design’ because we don’t have the skills, maybe don’t understand it, or want to ensure we can achieve it with maximum ROI of our own developer time!  We agree!  In defining the Fluent Design System, we ensured that a lot of these new innovations are ‘default’ in the platform.  Starting now with the Fall Creators Update Insider SDKs you can start seeing some of these appear in the common controls.  When you use the common controls as-is, you will get the best of Fluent incorporated into your app.  James Clarke joined Paul later to explain and demonstrate this in practice, showing how the new (and some existing) common controls take this design system into account and help you get it by default:

In addition to what we are doing *now* we also wanted to share what is on the horizon.  I was able to join Ashish Shetty at Build and talk about what is new in XAML and Composition platform areas for developers.  We shared more of the ‘default’ that is exhibited in the common controls but also explained some of the ‘possible’ in the platform that you can achieve with great improvements to our animation system.  We also shared the vision for the future in this space around semantic animations and vector shape micro-animations.  Check out our session on this area:

We had so much to talk about that I wasn’t able to show the simplicity of enabling the pull-to-refresh pattern in the new controls area.  Not wanting you to feel ripped off, I recorded a quick demo of a few of the things we weren’t able to demo.  Take a look here at my impromptu demo insert for you!

There are a lot of great new things coming in the Windows UI platform area for UWP:

  • NavigationView
  • ParallaxView
  • RefreshContainer
  • SwipeContainer
  • TreeView
  • ColorPicker
  • RatingControl
  • Improved text APIs: CharacterReceived, CharacterCasing, IsTrimmed
  • Improved input APIs like PreviewInput
  • Implicit animations
  • Connected animations improvements for ListViewBase
  • Advanced color and HDR for Image
  • SVG support for Image
  • Keytips support for XAML
  • ContentDialog and MenuFlyout improvements
  • Context menu support everywhere
  • UI analysis and Edit-and-continue in Visual Studio
  • Narrator developer mode
  • and more!

It is so great to be a part of this latest release and continue to deliver value (hopefully) to you, our developer customer.  Please be sure to let us know how you are using these new improvements and the Fluent Design System.  Share your creations with us at @windowsui so we can share with others as well!

We also announced a vision for defining a common dialect for UI everywhere around XAML.  We call this XAML Standard and are drafting a v1 specification now.  We will want your input on this and have established an open process to encourage community collaboration.  Please join the conversation at http://aka.ms/xamlstandard.  This is at very early stages but with your help we will establish the right fundamentals first and evolve over time.  Getting the core right is critically important…you can’t unify on a set of control APIs if the foundation isn’t solid and makes sense.  In addition to this, .NET Standard 2.0 for UWP was announced as well and is a HUGE advancement for .NET developers writing apps for UWP.  Oh no big deal, just about 20K more APIs you have access to now.  Yowza.  Listen to Scott Hunter, Miguel and myself talk about these areas on Channel 9:

I’m excited to see the creativity unleashed by our developer community.  Thanks for letting me be a small part of it!

Implementing a type converter in UWP XAML

Verbose XAML, we all love it right?  What?!  You don’t like writing massive amounts of angle brackets to define certain properties?  I mean, who doesn’t love something like this:

<MapControl>
    <MapControl.Center>
        <Location>
            <Latitude>47.669444</Latitude>
            <Longitude>-122.123889</Longitude>
        </Location>
    </MapControl.Center>
</MapControl>

What’s not to love there?  Oh I suppose you prefer something like this?

<MapControl Center="47.669444,-122.123889" />

In the XAML dialect this is what we refer to as a ‘type converter’ or more affectionately at times ‘string to thing’, as the declarative markup is just a string representation of some structure.  In WPF and Silverlight this was implemented using the System.ComponentModel.TypeConverter model: you would attribute your class with a pointer to an implementation of TypeConverter that overrides the members you need, most of the time the ConvertFrom capability.
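For reference, the classic WPF/Silverlight pattern looked roughly like this.  This is a sketch from memory (not the UWP approach this post is about), using an illustrative Location type with just the two coordinate properties:

```csharp
using System;
using System.ComponentModel;
using System.Globalization;

// Illustrative WPF-style type converter: turns "lat,long" into a Location.
public class LocationConverter : TypeConverter
{
    public override bool CanConvertFrom(ITypeDescriptorContext context, Type sourceType)
        => sourceType == typeof(string) || base.CanConvertFrom(context, sourceType);

    public override object ConvertFrom(ITypeDescriptorContext context, CultureInfo culture, object value)
    {
        if (value is string s)
        {
            var parts = s.Split(',');
            return new Location
            {
                Latitude = double.Parse(parts[0], CultureInfo.InvariantCulture),
                Longitude = double.Parse(parts[1], CultureInfo.InvariantCulture)
            };
        }
        return base.ConvertFrom(context, culture, value);
    }
}

// The type opts in by pointing at the converter implementation.
[TypeConverter(typeof(LocationConverter))]
public class Location
{
    public double Latitude { get; set; }
    public double Longitude { get; set; }
}
```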

In UWP we could not rely on that exact same implementation: System.ComponentModel.TypeConverter is not part of the API surface exposed to UWP apps at this time, and it is a .NET concept that wouldn’t be available to other WinRT developers.  Looking at ways to achieve the same primary scenario, we can now use the Creators Update to deliver the functionality for us.  The markup compiler in the Creators Update now leverages the CreateFromString attribute in WinRT to generate the correct metadata to do the conversion.  The responsibility lies with the owner of the class (looking at you, ISVs, as you update) to add this metadata capability.

NOTE: To enable this capability, the consuming app must currently set its minimum target to the Creators Update.

Let’s use an example following my pseudo map control I used above.  Here is my class definition for my MyMap control:

using Windows.UI.Xaml.Controls;

namespace CustomControlWithType
{
    public class MyMap : Control
    {
        public MyMap()
        {
            this.DefaultStyleKey = typeof(MyMap);
        }

        public string MapTitle { get; set; }
        public Location CenterPoint { get; set; }
    }
}

Notice it has a Location type.  Here’s the definition of that type:

using System;

namespace CustomControlWithType
{
    public class Location
    {
        public double Latitude { get; set; }
        public double Longitude { get; set; }
        public double Altitude { get; set; }
    }
}

Now without a type converter I can’t use the ‘string to thing’ concept in markup…I would have to use verbose markup.  Let’s change that and add an attribute to my Location class, and implement the conversion function:

using System;

namespace CustomControlWithType
{
    [Windows.Foundation.Metadata.CreateFromString(MethodName = "CustomControlWithType.Location.ConvertToLatLong")]
    public class Location
    {
        public double Latitude { get; set; }
        public double Longitude { get; set; }
        public double Altitude { get; set; }

        public static Location ConvertToLatLong(string rawString)
        {
            string[] coords = rawString.Split(',');
            
            var position = new Location();
            position.Latitude = Convert.ToDouble(coords[0]);
            position.Longitude = Convert.ToDouble(coords[1]);

            if (coords.Length > 2)
            {
                position.Altitude = Convert.ToDouble(coords[2]);
            }

            return position;
        }
    }
}

As you can see, I added two things.  First I added an attribute to my class to let the compiler know that I have a CreateFromString method, providing the fully qualified name to that method.  The second obvious thing is to implement that method.  It has to be a public static method, and you can see my simple example here.

Now when using the MyMap control I can specify the simpler markup:
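Assuming an xmlns mapping of `local` to the CustomControlWithType namespace, the markup would look something like this:

```xml
<local:MyMap CenterPoint="47.669444,-122.123889,100" />
```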

And the result would be converted, and my control that binds to those values in its template is able to see them just fine.

Yes, my control is quite lame but just meant to illustrate the point.  The control binds to the CenterPoint.Latitude|Longitude|Altitude properties of the type.

If you are in this scenario of providing APIs that are used in UI markup for UWP apps, try this out and see if it adds delighters for your customers.  I’ve uploaded the full sample of this code to my GitHub in type-converter-sample if you want to see it in full.  Hope this helps! 

Write your Amazon Alexa Skill using C# on AWS Lambda services

After a sick day a few weeks ago and writing my first Alexa Skill, I’ve been pretty engaged with understanding this voice UI world with Amazon Echo, Google Home and others.  It’s fun to use and, as ‘new tech’, fun to play around with.  Almost immediately after my skill was certified, I saw this come across my Twitter stream:

I had spent a few days getting up to speed on Node and the environment (I’ve been working in client technologies for a long while, remember) and using VS Code, which was fun.  But using C# would have been more efficient for me (or so I thought).  AWS Lambda just announced support for C# as an authoring environment for Lambda functions.  As it turns out, the C# Lambda support is pretty general, so there is not yet the same dev experience for creating a C# Lambda backing a skill as there presently is for Node.js development…at least right now.  I thought it would be fun to try and was eventually successful, so hopefully this post finds others trying as well.  Here’s what I’ve learned in the < 4 hours (I time-boxed myself for this exercise) spent trying to get it to work.  If there is something obvious I missed to make this simpler, please comment!

The Tools

You will first need a set of tools.  Here was my list:

With these I was ready to go.  The AWS Toolkit is the key here as it provides a lot of VS integration that will help make this a lot easier.

NOTE: You can do all of this technically with VS Code (even on a Mac) but I think the AWS Toolkit for VS makes this a lot simpler to initially understand the pieces and WAY simpler in the publishing step to the AWS Lambda service itself.  If there is a VS Code plugin model, that would be great, but I didn’t find one that did the same things here.

Armed with these tools, here is what I did…

Creating the Lambda project

First, create a new project in VS, using the AWS Lambda template:

This project name doesn’t need to map to your service/function names but it is one of the parameters you will set for the Lambda configuration, so while it doesn’t entirely matter, maybe naming it something that makes sense would help.  We’re just going to demonstrate a dumb Alexa skill for addition so I’m calling it NumberFunctions.

NOTE: This post isn’t covering the concepts of an Alexa skill, merely the ability to use C# to create your logic for the skill if you choose to use AWS Lambda services.  You can, of course, use your own web server, web service, or whatever hosted on whatever server you’d like and an Alexa skill can use that as well. 

Once we have that created you may see the VS project complain a bit.  Right click on the project and choose to restore NuGet packages and that should clear it up.

Create the function handler

The next step is to write the function handler for your skill.  The namespace and public function name matter as these are also inputs to the configuration, so be smart about them.  For me, I’m just using the default namespace, class and function name that the template provided.  The next step is to gather the input from the Alexa skill request.  Now a Lambda service can be a function for anything…it is NOT limited to serving Alexa responses; it can do a lot more.  But this post is focused on Alexa skills, so that is why I’m referring to this specific input.  Alexa requests will come in the form of a JSON payload with a specific format.  Right now if you accept the default signature of the function handler of string, ILambdaContext it will likely fail due to issues you can read about here on GitHub.  So the best way is to understand that the request will come in with three main JSON properties: request, version, and session.  Having an object that exposes those properties and knows how to automatically map the JSON payload onto itself helps…after all, more strongly-typed development is one of the main benefits of using C#.
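If you wanted to roll your own instead of using a library, a minimal hand-rolled sketch of that shape (hypothetical type names, using JSON.NET attributes; the real payload has many more properties) might look like:

```csharp
using Newtonsoft.Json;

// Hypothetical minimal mapping of the three top-level Alexa properties.
public class MySkillRequest
{
    [JsonProperty("version")]
    public string Version { get; set; }

    [JsonProperty("session")]
    public MySession Session { get; set; }

    [JsonProperty("request")]
    public MyRequest Request { get; set; }
}

public class MySession
{
    [JsonProperty("sessionId")]
    public string SessionId { get; set; }
}

public class MyRequest
{
    [JsonProperty("type")]
    public string Type { get; set; }
}

// Usage: var req = JsonConvert.DeserializeObject<MySkillRequest>(json);
```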

Rather than create my own, I went on the hunt for some options.  There doesn’t yet exist an Alexa Skills SDK for .NET (perhaps that is coming), but there are two options I found.  The first seemed to require a bit more setup/understanding and I haven’t dug deep into it yet, but it might be viable.  For me, I just wanted to basically deserialize/serialize the payload into known Alexa types.  For this I found an Open Source project called Slight.Alexa.  This was built for the full .NET Framework and won’t work with the Lambda service until it is ported to .NET Core, so I forked it, moved code to shared, and created a .NET Core version of the library. 

NOTE: The port of the library was fairly straightforward save for a few project.json things (which will be going away) as well as finding some replacements for things that aren’t in .NET Core, like System.ComponentModel.DataAnnotations.  Luckily there were replacements that made this simple.

With my fork in place I made a quick beta NuGet package of my .NET Core version so I could use it in my Lambda service (.NET Core projects can’t reference DLLs so they need to be in NuGet packages).  You can get my beta package of this library by adding a reference to it via your new Lambda project:

This now gives me a strongly-typed OM against the Alexa request/response payloads.  You’ll also want to add a NuGet reference to the JSON.NET library (isn’t every project using this now…shouldn’t it be the default reference for any .NET project???!!!).  With these both in place you now have what it takes to process the request.  The requests for Alexa primarily come in as Launch, Intent and Session requests (again I’m over-simplifying here, but for our purposes these are the ones we will look at).  The launch request is when someone just launches your skill via the ‘Alexa, open <skill name>’ command.  We’ll handle that and just tell the user what our simple skill does.  To do this, we change the function handler input from string to SkillRequest from our newly-added Slight.Alexa.Core library:

public SkillResponse FunctionHandler(SkillRequest input, ILambdaContext context)

Because SkillRequest is an annotated type the library knows how to map the JSON payload to the object model from the library.  We can now work in C# against the object model rather than worry about any JSON path parsing.

Working with the Alexa request/response

Now that we have the SkillRequest object, we can examine the data to understand how our skill should respond.  We can do this by looking at the request type.  Alexa skills have a few request types that we’ll want to look at.  Specifically for us we want to handle the LaunchRequest and IntentRequest types.  So we can examine the type and let’s first handle the LaunchRequest:

Response response;
IOutputSpeech innerResponse = null;
var log = context.Logger;

if (input.GetRequestType() == typeof(Slight.Alexa.Framework.Models.Requests.RequestTypes.ILaunchRequest))
{
    // default launch request, let's just let them know what you can do
    log.LogLine($"Default LaunchRequest made");

    innerResponse = new PlainTextOutputSpeech();
    (innerResponse as PlainTextOutputSpeech).Text = "Welcome to number functions.  You can ask us to add numbers!";
}

You can see that I’m just looking at the type and if a LaunchRequest, then I’m starting to provide my response, which is going to be a simple plain-text speech response (with Alexa you can use SSML for speech synthesis, but we don’t need that right now).  If the request is an IntentRequest, then I first want to get out my parameters from the slots and then execute my intent function (which in this case is adding the parameters):

else if (input.GetRequestType() == typeof(Slight.Alexa.Framework.Models.Requests.RequestTypes.IIntentRequest))
{
    // intent request, process the intent
    log.LogLine($"Intent Requested {input.Request.Intent.Name}");

    // AddNumbersIntent
    // get the slots
    var n1 = Convert.ToDouble(input.Request.Intent.Slots["firstnum"].Value);
    var n2 = Convert.ToDouble(input.Request.Intent.Slots["secondnum"].Value);

    double result = n1 + n2;

    innerResponse = new PlainTextOutputSpeech();
    (innerResponse as PlainTextOutputSpeech).Text = $"The result is {result.ToString()}.";

}

With these in place I can now create my response object (to provide session management, etc.) and add my actual response payload, using JSON.NET to serialize it into the correct format.  Again, the Slight.Alexa library does this for us via the annotations it has on the object model.  Please note this sample code is not robust, handles zero errors, etc…you know, the standard ‘works on my machine’ warranty applies here:

response = new Response();
response.ShouldEndSession = true;
response.OutputSpeech = innerResponse;
SkillResponse skillResponse = new SkillResponse();
skillResponse.Response = response;
skillResponse.Version = "1.0";

return skillResponse;

I’ve now completed my function, let’s upload it to AWS!

Publishing the Lambda Function

Using the AWS Toolkit for Visual Studio this process is dead simple.  You’ll first have to make sure the toolkit is configured with your AWS account credentials which are explained here in the Specifying Credentials information.  Right click on your project and choose Publish to AWS Lambda:

You’ll then be met with a dialog that you need to choose some options.  Luckily it should be pretty self-explanatory:

You’ll want to make sure you choose a region that has the Alexa Skill trigger enabled.  I don’t know how they determine this, but the US-Oregon one does NOT have it enabled, so I’ve been using US-Virginia and that works just fine for me.  The next screen will ask you to specify the user role (I am using the basic execution role).  If you don’t know what these are, re-review the Alexa skills SDK documentation with Lambda to get started there.  These are basically IAM roles in AWS that you have to choose.  After that you click Upload and you’re done.  The toolkit takes care of bundling all your stuff up into a zip, creating the function (if you didn’t already have one – if you did, you can choose it from the drop-down to update an existing one) and uploading it for you.  You can do all this manually, but the toolkit really, really makes this simple.

Testing the function

After you publish you’ll be presented with the test window:

This allows you to manually test your Lambda.  In the pre-configured request objects you can see a few Alexa request objects specified.  None of them will be the exact one you need, but you can start with one and manually modify it easily to do a quick test.  If you look at my screenshot, I modified it to specify our payload…you can see the payload I’m sending here:

{
  "session": {
    "new": false,
    "sessionId": "session1234",
    "attributes": {},
    "user": {
      "userId": null
    },
    "application": {
      "applicationId": "amzn1.echo-sdk-ams.app.[unique-value-here]"
    }
  },
  "version": "1.0",
  "request": {
    "intent": {
      "slots": {
        "firstnum": {
          "name": "firstnum",
          "value": "3"
        },
        "secondnum": {
          "name": "secondnum",
          "value": "5"
        }
      },
      "name": "AddIntent"
    },
    "type": "IntentRequest",
    "requestId": "request5678"
  }
}

That is sending an IntentRequest with two parameters and you can see the response functioned correctly!  Yay!

Of course the better way is to use Alexa to test it so you’ll need a skill to do that.  Again, this post isn’t about how to do that, but once you have the skill you will have a test console that you can point to your AWS Lambda function.  I’ve got a test skill and will point it to my Lambda instance:

UPDATE: Previously this wasn’t working but thanks to user @jpkbst in the Alexa Slack channel he pointed out my issue.  All code above updated to reflect working version.

Well I had you reading this far at least.  As you can see the port of the Slight.Alexa library doesn’t seem to quite be working with the response object.  I can’t pinpoint why the Alexa test console feels the response is valid as the schema looks correct for the response object.  Can you spot the issue in the code above?  If so, please comment (or better yet, fix it in my sample code).

Summary (thus far)

I set out to spend a minimal amount of time getting the C# Lambda service + Alexa skill working.  I’ve uploaded the full solution to a GitHub repository: timheuer/alexa-csharp-lambda-sample for you to take a look at.  I’m hopeful that this is simple and we can start using C# more for Alexa skills.  I think we’ll likely see some Alexa Skills SDK for .NET popping up elsewhere as well. 

Hope this helps!

Making circular images in XAML

A long while back it seemed like the new cool app thing to do was to represent people/avatars in circles instead of the squares (or squares with rounded corners).  I made a snarky comment about this myself almost exactly 2 years ago when I noticed that some apps I was using at the time switched to this:

Now since this seems to be a popular trend and people are doing it, I figured XAML folks had it figured out.  However I’ve seen enough questions, and some people trying a few things that make it more complex, that I thought I’d drop a quick blog post about it.  I’ve seen people writing profile pic upload algorithms that clip the actual bitmap and save it on disk before displaying it, and people stacking transparent PNG ‘masking’ techniques.  None of this is needed for the simplest display.  Here you go:

<Ellipse Width="250" Height="250">
    <Ellipse.Fill>
        <ImageBrush ImageSource="ms-appx:///highfive.jpg" />
    </Ellipse.Fill>
</Ellipse>

That’s it.  You’ll see that Line 3 shows us using an ImageBrush as the fill for an Ellipse.  Using an Ellipse helps you get the precise circular drawing clip without having pixelated edges or anything like that.  The above would render to this image as the example in my app:

Circular image

Now while this is great, using an ImageBrush doesn’t give you the automatic decode-to-render-size capability that was added in the framework in Windows 8.1.

NOTE: This auto decode-to-render-size feature basically only decodes an Image to the render size even if the image is larger.  So if you had a 2000x2000px image but only displayed it in 100x100px then we would only decode the image to 100x100px size saving a lot of memory.  The standard Image element does this for you.

For most apps that control your image sources, you probably are already saving images that are only at the size you are displaying them so it may be okay.  However for apps like social apps or where you don’t know where the source is coming from or your app is NOT resizing the image on upload, etc. then you will want to ensure you save memory by specifying the decode size for the ImageBrush’s source specifically.  This is easily done in markup using a slightly more verbose image source syntax.  Using the above example it would be modified to be:

<Ellipse Width="250" Height="250">
    <Ellipse.Fill>
        <ImageBrush>
            <ImageBrush.ImageSource>
                <BitmapImage DecodePixelHeight="250" DecodePixelWidth="250" UriSource="ms-appx:///highfive.jpg" />
            </ImageBrush.ImageSource>
        </ImageBrush>
    </Ellipse.Fill>
</Ellipse>

No real change other than telling the framework what the decode size should be in Line 5 using DecodePixelHeight and DecodePixelWidth.  The rendering would be the same in my case.  This tip is very helpful when you are most likely going to be displaying a smaller image than the source and not the other way around. 
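If you happen to be building the brush in code-behind instead of markup, the equivalent would be something like the sketch below (ProfileEllipse is a hypothetical x:Name for the Ellipse in the page’s XAML):

```csharp
using System;
using Windows.UI.Xaml.Media;
using Windows.UI.Xaml.Media.Imaging;

// In the page's code-behind: fill the Ellipse with an image
// decoded at the size it will actually render.
private void SetProfileImage()
{
    var bitmap = new BitmapImage
    {
        UriSource = new Uri("ms-appx:///highfive.jpg"),
        DecodePixelWidth = 250,
        DecodePixelHeight = 250
    };
    ProfileEllipse.Fill = new ImageBrush { ImageSource = bitmap };
}
```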

So there you go.  Go crazy with your circular people representations!  Hope this helps.


This work is licensed under a Creative Commons Attribution By license.

Build 2015 recap for XAML and native apps

Wow, what a week.  I have to say even as employees of Microsoft, we get surprised when we go to our conferences and see some of the bigger announcements.  There are things that are being worked on that are new or just in different divisions that we’re not focused on.  This past week at the Build 2015 conference was an example of that for me.  Lots of good stuff for developers from client to server!

Universal Windows Platform

At Build this year we introduced the Universal Windows Platform v10 with a set of new APIs and unified features for all Windows devices.  Perhaps the best vision of this is the Day 2 Keynote where Kevin Gallo walked through an example of this and a single app running on tablet, phone, Surface Hub, HoloLens, etc. 

Visit the keynote and watch the whole thing or if you want to jump to the start of this portion it starts at about 23 minutes in.  A really well done, compelling demonstration of the Universal Windows Platform.

XAML Session Recap

For the XAML developer on Windows, there was a lot of goodness shown from my team.  We’ve been working hard on a lot of internals and new API exposure for the Universal Windows Platform.  Our team had some representation in some deep-dive sessions from Build and the recordings are all now available…here’s a list for you to queue up:

One of the things I was really happy about was having part of the Office team come and talk about how they build Office on the same platform we ask you to build apps on.  It is good insight into a large application with lots of legacy and goals that might not be typical of smaller apps or smaller ecosystems.  A big focus for XAML this release was performance, given that customers like Office and the Windows shells themselves leverage XAML for their UI.

I hope that if you are a XAML developer you take some time to look at what new features are available in the Universal Windows Platform for you in Windows 10.

Get the goods!

If you want to get started playing around, the best way is to be a part of the Windows Insiders program.  Everything you need to get started you can find here https://dev.windows.com/en-US/windows-10-for-developers.  You’ll want to join the Insiders program, then download the Visual Studio tools and get started creating/migrating apps!  To help get you started after that here are some helpful links:

Give us feedback!

As you play around with the bits, please continue to give us feedback.  The best way is to be involved in the conversation on the forums.  Ask questions there, get help from the community, share learnings, etc.  Secondarily the Windows Insider Feedback tool (an app that is installed on Windows already for you as ‘Windows Feedback’) is available for you to give direct feedback to the teams.  Please choose categories carefully so that the feedback gets directly to the right team quickly. 

Thanks for helping make the Windows Platform better.  I hope these direct links help you jumpstart your learning!



Join me at various Build events across America

I can’t wait to talk XAML at Build 2015 with you all!!!

Hey all!  Been really quiet here on the blog as I’ve been focusing on both new personal and work aspects of my life.  On the work front, the team I work on has been working hard on delivering on our promise of converged Windows app development using the native UI framework for the platform – XAML.  It has been a real journey of change, stress of new customers and some exciting changes to the platform that are just the beginning.

My team (XAML) and the entire Windows Developer Platform team will be joining thousands of you in San Francisco for Build 2015 to share what we’ve been working on for Windows 10.

I’ll be joining members of my team in San Francisco to talk about what’s new in the UI framework, some ideas/tools/new ‘stuff’ to build apps across mobile and desktop, improvements in data binding, all the work we did in the platform for performance, and more!

Aside from San Francisco, I’ve been fortunate enough to be asked to deliver an address at a few events as a part of the Build Tour.  These are a set of events across the globe (25 events running from after the main Build event until near the end of June) that bring the best of Build along with local flair/content and partner showcases – and they are FREE events!

I will be joining the local community of developers in Atlanta (20-May-2015 at the Georgia Aquarium) and Chicago (10-June-2015 at The Field Museum of Natural History) in the United States.  Unfortunately (for me, as I would have loved to meet more of you around the world) some of the international events conflicted with personal obligations, so more of my colleagues will be attending those representing the developer platform.

Please consider joining me and colleagues around the world at these FREE events by registering for your closest Build Tour at http://www.build15.com and encourage your friends and co-workers to register as well!

I look forward to sharing our work with you, hearing your feedback about the Windows developer platform and seeing what kind of apps you are bringing to the ecosystem for our mutual customers!

See you in San Francisco, Atlanta and Chicago!



I lost 55lbs using these two amazing simple steps–you can too!

TL;DR – I got off my butt, started eating better and lost 40lbs in ~100 days.  You can too.

UPDATE: As of NOV-2014 I’m down 55lbs since this timeframe.  Feel amazing, found new hobbies that are active and loving life again.

This year started very depressing personally as each look in the mirror showed another chin growing under the previous.  I weighed the most I’ve ever weighed in my life and it just kept getting away from me.  My move to Redmond brought back a different office life for me from my previous role at Microsoft and I became much more sedentary than before.  Now, this doesn’t mean I was a triathlete before, but I certainly didn’t sit in an office as much as I have in the past 4 years.  I really put myself fully into work and nothing else…and it showed physically.

As my kids got more active (I have two kids; 8 and 12) I participated more in their activities.  I realized how out of shape I was when I couldn’t ride a bike like my kids, couldn’t run around the baseball field to help with Little League practices without being out of breath, etc.  I had to do something.  The problem is I’ve ‘tried’ before a few times.  I hadn’t really dedicated myself but tried dumb diets and various techniques to lose weight, get active and get healthy.  None worked…or I should say none lasted long at all.  Perhaps another catalyst is the fact that I turn 40 this year and my wife keeps bothering me about “project 40” and being the best we can in our 40’s.  These all combined to be a wake-up call for me to really, really try harder to get fit and healthier.

As of this writing I’ve lost 40 pounds!  I’m very happy with my results so far as it is much more than just pounds, but I still have a way to go for full change.  I wanted to share my ‘program’ with others in the tech community as I think others probably suffer from the same sedentary work/life style as I do and probably use the same excuses that I have in the past as well. 

For context, I’m a white male, 39 years of age, 5 feet, 9 inches and started weighing 225lbs at the start of this process.  I started wearing a size 38 pants, wore an XL (and sometimes it was snug), and my dress shirts were 17.5 collar.  I have no idea of my body fat percentage other than it was probably a lot when I looked at my man boobs and belly.  Maybe too much information for a blog post?  Oh well, now you know where I was coming from.

The Secret Formula to Getting Fit and Losing Weight

I hesitated to write this secret in a blog post because I could make MILLIONS sharing it as the health industry is booming right now.  But I’m generous so here’s the single secret formula to losing weight and getting fit: STOP EATING SO MUCH CRAP AND MOVE AROUND MORE!  Seriously, anyone who really tells you any different is just lying.  Yes I’m aware there are medical conditions.  Yes I’m aware there are tons of studies around the plagues of white bread.  Yes I’m aware that there are fads that encourage eating only things that dinosaurs could have eaten.  I don’t care.  Seriously.  Nothing can sustain your fitness more than the realization that you just need to eat smarter and be active.  I don’t think this involves you turning into a gym rat, nor does it involve requiring any significant investment in stuff/supplies/equipment, but it does involve dedication and time…sometimes two of the hardest things to give up.

Eating

No amount of exercise will help if you keep stuffing yourself with crap food.  That was a big part of my problem.  I LOVE food…all kinds – I don’t discriminate at all on food.  Ever.  I would eat whatever I thought tasted good…and lots of it.  There is no way anyone can really lose weight and get fit without changing their habits around eating smarter.  Notice I didn’t say diet, or only eat plants, or whatever.  Eating smarter isn’t always about eliminating everything, but about being aware of what you are eating.

For me, the easiest way was to track calories.  Now, I don’t care if anyone jumps on my method and says ‘counting calories is not the right way, blah blah’ – I don’t want to hear it.  For me it is a simple formula.  More calories in…more weight.  Fewer calories in…less weight.  This method also allowed me to basically continue to eat whatever I wanted.  My eating habits changed naturally because, guess what – an 18oz rib-eye is a lot of calories!  There are other methods like Weight Watchers that are similar (‘points’ based) in basically assigning a value to all the foods you eat.  I think these approaches are the easiest to understand and manage.

The best way to understand what you are eating is to write it down.  Seriously write EVERYTHING down.  And I’m not talking “sandwich” is what you write down, I’m talking “two slices wheat bread, 1.5oz turkey, 1T mayo, 1oz cheese” detail here.  There is no way you can begin to understand how much you are eating unless you understand it at that level.  For basically 4 months I wrote every ingredient down in every meal I ate.  It started out to be incredibly annoying but just became routine and I used tools to make it easier (see below).

My wife helped out a lot here as she’s been trying to eat more “clean” foods.  Whole foods.  Our home doesn’t have a lot of packaged foods anymore and our refrigerator looks like a farmer’s market most of the time.  Do I miss a lot of foods?  HECK YES!  But now that I’m at my goal I can indulge a little bit more and moderate the ‘not good but tastes great’ stuff more easily.

For me, I had to radically change my diet.  I wasn’t eating any breakfast usually (and when I did it was a chocolate-filled croissant), would have whatever the café special was at work (usually high in calories) and would LOVE to go out for dinner at typical restaurants (Mexican, Asian mostly).  I also had a soda for lunch and didn’t really drink much else during the day.  I changed this completely.  I focused on trying to eat more protein, less sodium and less sugars.  I didn’t really focus on other specifics like carbs, etc.  Here was a typical day for me food-wise:

  • Breakfast: Protein Shake.  I exercised first thing in the morning and would have a post-workout protein drink.  I used 100% Whey Protein as my mix.  I know this has more things in it that I couldn’t pronounce, but I allowed myself to break the whole foods rule for my protein substitutes.  The mix that I used was Cytosport 100% Whey Protein.  It was not delicious but was not disgusting either.  Over time I changed this to mix with almond milk, frozen bananas and a berry choice for more of a shake.  This shake approach is now my preferred method.  Freeze a bunch of bananas and they add great flavor but also serve as the ‘ice’ to get it to shake consistency.
  • Lunch: Depending on my schedule I’d have another protein substitute like a protein bar.  I’ve been pretty busy at work lately and not really near any food options easily (lots of meetings and running between buildings) so using protein bars helped provide me some nutrition and tide my hunger over.  My sister-in-law turned me on to these Quest bars and they actually are quite good.  They don’t taste like glue and sawdust.  These have become my choice protein bar for now.  Low in calories as well.  If I was near a café for lunch I would choose a plain grilled chicken breast and add that to some romaine lettuce, roasted red peppers, 1 hard-boiled egg and 2T of balsamic vinaigrette.  I’ve NEVER liked salad, but this was okay and I got used to it.
  • Dinner: Whatever my wife made for me.  Seriously the best thing is to have no choice, especially when your wife is on a healthy food lifestyle.  If I came home and there was a plate of quinoa and kale in front of me, I’d eat it.  NOTE: Kale is a weed…I hate it.  But this was the best thing as it prevented my mind from wondering what’s for dinner and dreaming of different things.  Occasionally we’d go out for sushi or Mexican food, but I would choose very simple options there…and exercise harder the next day.
  • Snacks: none.  I just cut them out.  This was what was causing me a lot of problems throughout the day.  I just stopped cold-turkey.  At home if I wanted a later snack after dinner, I would hunt for a banana or nectarine and that satisfied me.
  • Drinks: I haven’t had a soda in 4 months.  I’ve been drinking only water.  I hate water.  It has no flavor and no enjoyment.  I chose to vary my water with herbal lemon tea (we have an easy setup at work).  I tried to drink as much water as I could whenever I remembered.  I went to the bathroom a LOT more during the day and that’s a good thing.  Seriously – drink more water.

That’s the essence of it.  It was/is a VERY hard change for me and my love of food.  But it was necessary to get me back on track.  I think I’ll be celebrating my goal victory with an amazing meal, but will be smart about my choices more often than not.  This is what worked for me.  Calorie counting doesn’t work for everyone and there are a ton of opinions about it…but for me, it worked and I have the evidence to prove it.

Exercise

I first realized I needed to exercise and move around.  I am not an athlete at all, other than golf, and in the Pacific Northwest it is harder for me to stay active with golf regularly than it was in Arizona.  I’m not a runner.  I’m not a biker.  I had a gym membership for two years that never saw a single swipe of my membership card.  Ever.  I’m horrible at exercise.  I realized, though, that extreme times call for extreme measures.  I’m not one to be real proactive in determining what I should be doing, and I would likely get lazy quickly.  For me, I need someone to guide me, to jumpstart me, etc.  I’ve had friends talk about the P90X3 routines, so I decided I was going to go that route.  My wife was going to do it with me as well, so I’d have two people yelling at me to keep going!

For those who don’t know, P90X3 is a pretty intense exercise routine to be completed in 90 days.  It is a derivative of the other P90X programs, but this latest form is shrunk down to 30 minutes/day of intensity.  I figured that was a time block I could commit to, as an hour seemed like something I’d quickly give up on.  I have a great benefit plan at work that actually paid for the DVDs and some equipment for me.  I chose the P90X3 Basic Kit to start.  And I just started.  That’s it, you just have to start.

This program is intense (for me at least).  It is 30 minutes of non-stop exercising.  I cannot do a pull-up, yet one whole 30-minute session is dedicated to doing pull-ups, then dropping down to push-ups, then back up to pull-ups – repeat for 30 minutes…seriously.  FWIW, I still can’t do a pull-up.  Because I learned this early, I also got a set of resistance bands as the ‘modified’ equipment for these types of things.  For the 90 days I almost always did the ‘modified’ version shown in the DVDs, as I sometimes just couldn’t lift my unfit body into the positions they were asking for!

Time of day was important to me as well.  I chose the morning, before everyone got up.  My house really sucks for this type of exercise, but I did my best to make it work.  Each morning around 6 or 7 I’d get up with my wife and we’d put in the DVD and just start.  I’d grunt, complain and get frustrated…but I did it each day.  30 minutes seemed like an eternity, but it was a manageable chunk of time to be able to get started.  There were 2 weeks that were really tough.  One week I was traveling to Russia and did my best to stay on track with the program, but admittedly it was harder outside my comfort zone.

SIDE NOTE: These people are still shipping DVDs only.  I had to scramble to remember how to burn a DVD, only to also have to find a DVD drive to burn them for my travel.  C’mon people, digital copies!!!

The second week we had a house full of visitors and they occupied my workout area.  This gave me a good ‘excuse’ not to do it, but I felt horrible for missing my commitments that week.

I also chose to get more active in general.  I volunteered to help with my son’s Little League team.  It doesn’t sound like much, but again, when you are out of shape, just try running around a field catching balls constantly hit into the outfield!  I also had so much fun with Little League that my wife and I started a co-ed softball team with the city.  We suck bad, but it keeps me active and I’m having a ton of fun playing softball!  Most recently I invested more into my fitness and acquired a bike.  I have been riding at least twice weekly, to work and on the weekends.  I live in a place where everything is uphill and this is proving to be a challenge, but again, I’m making a commitment to being active, no matter what.  In a short time I’ve become addicted to cycling, and being a gadget freak doesn’t help my cause…so many cool gadgets for cycling!

There are also little things I do, like only take stairs now at work, walk whenever possible between buildings that maybe I used to shuttle between, etc.  These are small things but they add up.

Tools

Outside of the actual exercise equipment (resistance bands, dumbbell, DVDs), I used only three more tools: a scale, a heart rate monitor and the Lose It! app.

At first I only weighed myself once a week.  I recommend this in the beginning as well.  When I started seeing results I got more excited, got more aggressive with my goal plan and was motivated by the immediate feedback.  I started weighing myself daily.  Same place, same scale, same time every day.  For me it was after my morning shower.  I logged my weight each day – the gains and the losses.  It served as a constant reminder of how close I was to my goal and how well my plan was working.  I used a simple scale, not a fancy Bluetooth/Wi-Fi thing that automatically synced with an app (although that would be cool).

I wanted to estimate my calorie burn during exercises, so I acquired a heart rate monitor to assist.  I chose a Timex watch combo that I found a good deal on and love.  I completely acknowledge that the calorie-burn numbers on these things are not an exact science, just estimates.  Don’t flame me about that in the comments.  Again, this is something that worked for me.  I used it for each workout, and to be aggressive I would subtract 75–100 calories from what it said I was burning in the exercise.  I felt this was a really valuable tool for my success.  After doing the P90X3 routine for a while, I knew that the MMX workout, for example, would burn a certain number of calories, so sometimes I didn’t use the monitor if I knew I was doing the workout the same as previously.

Lastly the best tool for me was the Lose It! app.  I know some of the folks that wrote this app and so I’ve had it installed for a long while.  Since then there are others like it, probably the most popular other one being MyFitnessPal.  I started with Lose It and I like it so I haven’t switched.

NOTE: MyFitnessPal is also free and actually has more connection points to services, etc. than Lose It’s free version.  Lose It has a premium tier ($40/yr) where you can do the same things that MyFitnessPal provides for free.  That might be enough for me to switch, but I like Lose It’s approach to goal programs.

Lose It (and similar apps) provides everything I need to achieve my goal.  It provides a method for establishing a program based on current/goal weight and gives you a daily calorie budget to monitor.  You can add exercises (estimating calorie burn) and create your own custom exercises, as I did with the P90X3 routines.  It has a food log that includes a scanner, so if something you are eating has a UPC symbol (or even its ingredients list does), you can scan it and get all the nutritional info, not just calories.  Others have this too.  It has a social component through which you can join challenges and set other micro-goals to achieve.  The food log became the most valuable tool for me; as I mentioned earlier, writing down what you are eating is the most insightful thing for weight/food/portion control.

These simple tools were all I needed to be successful.

Results

So here I am, 109 days after the point I started.  What are the results?  Here are the raw numbers:

  • 40.5lb weight loss (I started at 225 and set my first goal at 185; today I weighed in at 184.5)
    • UPDATE 20-NOV-2014: I’m down 55lbs now!!!
  • I fit EASILY into size 34 pants…may have to go lower
  • My t-shirts are size L now and my wife thinks they are still too big
  • My dress shirt is a size 16 collar
  • I have one less extra chin
  • I had to get rid of most of my clothes and buy new ones…this was expensive, but a happy expense to pay.  As a bonus, I had to get a new suit and my style got updated from 9 years ago!
  • I can fit into some of my favorite sports jerseys and t-shirts that I’ve been saving for so long!

Of course your results may vary.  People’s attitudes, abilities, body shapes and physiological make-ups are all different, and I’m not here to say that anyone would get the same results as me.  Maybe some would do better, some worse.  I’m certainly not going out and buying coach shorts and muscle shirts yet, nor am I ready to put my before/after pictures on display for the world to see, but I’m proud of my results so far and feel a lot better about my health now than I have in quite a long time.

Summary

If you are reading this blog you are probably in the same industry as me (tech) and maybe you face the same challenges of constant sitting for long periods, no exercise, crap food for ‘convenience’ and just perhaps the same laziness that I suffer from.  Well, if you have anyone that cares about you, or that you care deeply about, then you owe it to yourself to take a moment and think about your health and what you can do.  For me, I was just gaining too much and it got to a point of concern.  The single best wake-up call for me was starting to track what I ate.  It was obscene the amount of calories I was stuffing my face with and the types of food I was okay eating on a regular basis.  I don’t think a Snickers bar is bad…but all things must be done in moderation and I just wasn’t moderating.  Take a step even if you feel you’re in good health and write down your food for a week.  It might surprise you as much as it did me and perhaps trigger you to do something.  If you are already eating right, exercising and healthy, then bravo for you!

I am no fitness coach, no health expert, no dietician.  I’m just a normal guy who was fed up with my laziness and decided to do something about it.  The decision and steps were NOT easy for me.  In fact, they continue to be difficult.  But I now have proof of my own ability to get healthy, eat smart and learn to love some types of exercise.  I have found something that works for me.

Hope this helps anyone!


This work is licensed under a Creative Commons Attribution By license.

Updated Flickr4Writer for new Flickr API restrictions

Before Windows Live Writer was even publicly released, I was glad to have been an early beta user/tester of the product.  The team thought early about an extensible model, and it has been my content authoring tool ever since.  This extensibility has allowed me to use *my* preferred content workflow with my cloud providers/formatters/tracking and other such plug-ins.

One of the first plugins available was one of mine, which I called Flickr4Writer.  It was pretty popular (as most ‘firsts’ are) and I got a lot of good feedback that shaped the functionality and user interface.  Is it the best design/code?  Probably not, but it seems to have served the needs of many folks and I’m happy about that.  I put the code into the Open Source world around the same time; it never received much uptake there and only one contribution of actual code (though plenty of feedback).

I depended on an early library called FlickrNet.  I contributed a few small fixes to the cause during my development of Flickr4Writer.  It has been a very popular library, and I think it is even used in some close-to-official Flickr apps for the Windows platform.  It served my purpose fine for a LONG time…until 2 days ago.

Because Flickr4Writer was pretty much complete and ‘bug-free’ for the mainstream cases, it hadn’t been touched in years; there was never any need, and I felt no urge to fiddle with code that didn’t need to be messed with.  Another factor is that Live Writer plugins are pretty locked on .NET 2.0 for loading, so there was no real incentive for me to move to anything else.  Two days ago I started getting emails that Flickr4Writer was not working anymore.  One user sent me a very kind note detailing what he felt the problem was, due to the recent API changes required by Flickr.  On 27-June-2014 the Flickr API went SSL-only and pretty much all my code broke.  Well, to be fair, the version of FlickrNet I was using no longer worked.  It was time for me to update.

I spent a few hours today switching to the latest FlickrNet library (using NuGet now, since it is published that way) and taking the time to switch over all the now-obsolete API usage in my app.  I hit a few speed bumps along the way but got it done.  I sent the bits to a few of the folks who emailed me and they indicated it was working, so I’m feeling good about publishing it.  So here is the update to Flickr4Writer, version 1.5, and the steps to install it:

  1. Close Windows Live Writer completely
  2. Uninstall any previous version of Flickr4Writer from Control Panel on your machine
  3. Run the new installer for Flickr4Writer by downloading it here.
  4. Launch Windows Live Writer again
  5. Go to the Plugin Options screen and select ‘Flickr Image Reference’ and click Options
  6. Step #5 should launch the authentication flow again to get new tokens. 
  7. Pay attention to the permission screen on Flickr web site as you will need the code provided when you authorize
  8. Enter the code and click OK
  9. Resume using Flickr4Writer

This worked for a set of folks and in a few tests I did on my machines.  Performing the re-authentication is key to getting the updated tokens for this plugin’s API usage.  I apologize for making folks uninstall/re-install, but the installer code was one thing that was really old, and I just didn’t want to spend too much time getting it working, so I created a new one.

I’m really glad people find Flickr4Writer useful still and I apologize for not having an update sooner (I actually didn’t get the notice that Flickr indicates was sent out…probably in my spam somewhere) but I appreciate those users who alerted me to the problem quickly!

Hope this helps!



Determining Portable Class Library compatibility

Recently I embarked on porting the TagLib# library to a Portable Class Library (PCL).  In my efforts I noted some frustration with the “convert and compile” flow for finding issues.  Well, it turns out I didn’t have to endure that much pain, as pointed out by Daniel in the comments!  The .NET team has released a tool to help out us developers called the API Portability Analyzer (currently in Alpha).  This tool basically looks at any existing .NET assembly and gives you a report to help you see where the APIs used are supported in the various .NET profiles available.

The tool is a single command-line exe and is as simple as launching:

ApiPort.exe path-to-your-assembly-file.dll

I recommend putting this in your path somewhere so you don’t have to remember the full path to launch it.  The output in the console tells you very little, and only really what the tool is doing:

Microsoft (R) API Portability Analyzer version 1.0 (alpha)
Copyright (C) Microsoft Corporation. All rights reserved.

To learn more about how this tool works, including the data we are collecting, go here - http://go.microsoft.com/fwlink/?LinkId=397652

Identifying assemblies to scan. Done in 0.01s.
Detecting assembly references. Processed 1/1 files.Done in 0.23s.
Sending data to service. Done in 2.88s.
Computing report. Processed 508 items.Done in 0.02s.
Writing report. Done in 0.17s.

Replaced output file "c:\ApiPortAnalysis.xlsx"

You may notice that the tool says ‘sending’ and yes, it is communicating with a public service.  The team notes this in the download:

NOTE: During the process of identifying the .NET APIs used by a binary Microsoft collects the list of .NET APIs used by the user submitted binaries. Microsoft also collects the names of various user created APIs. The tool does not collect the binary code, only names of APIs are collected. Microsoft will also collect assembly information such as assembly references for the binary & the Target Framework Moniker (TFM).

The real value is in the output data, conveniently formatted into a pre-filterable Excel document.  The process was fairly fast for me, but I suspect it might take longer for larger libraries (duh).  An example of the output is the one here, directly showing the TagLib# data from the run above.

If you read my previous post you will see that the areas I had frustrations about are clearly identified in the Unsupported columns for my target platform.  The tool attempts to recommend some alternatives when it can.  I imagine this will get better over time, as the recommendations for TagLib# were only two, whereas it should have also pointed from XmlDocument/XmlElement/etc. to the XLINQ equivalents.

In the end, this is a helpful tool for those looking to convert.  I wish I had known about it in advance, but now that I do, it is in my toolbox and on my PATH!

Hope this helps!



Working with Portable Class Libraries and porting TagLib#

A long while back I wrote a quick sample when Silverlight introduced drag-and-drop into the framework.  Then I decided to show dragging MP3 files into a Silverlight app and reading the metadata and album art.  To accomplish this I had to read the ID3 data from a Silverlight library.  I found a few libraries but settled on TagLib# to do the job.  I had to modify it a bit to get it working in Silverlight, as the .NET profile wasn’t the same.  Recently a surge of people have been emailing me for the code.  I spent time searching, and apparently I didn’t think the TagLib# modifications were that important, because I never saved them anywhere!  A conversation started on Twitter and I decided to devote some “20% time” to re-making these modifications, and to take it a step further and make the library a Portable Class Library (PCL).  Here’s my journey…

Deciding to Fork

My first task was finding the source of truth for TagLib#.  The main link on the Novell developer site was broken and stale.  I found it on GitHub and started looking around.  It really hadn’t been updated in a long while (after all, it really didn’t need to be for the core .NET Framework) and the project is very stale.  There is no open issues list on GitHub, as you have to use Bugzilla, but even that didn’t look like it was getting much attention.  I emailed the maintainer listed in the authors file in the repository and he indicated he’s not really the maintainer.  This felt like a project kind of fizzling down (if not fizzled already).

In looking at the tests and the project structure, and taking into account that I may want to do some things differently, I made the decision to fork rather than clone.  I’m not totally married to the decision, but I don’t think anyone is keeping the lights on to take a pull request either.  I forked the code and started with new projects, using Visual Studio 2013 as my tool of choice.

Using Shared Projects

At first I thought I might produce a few different flavors: portable and full .NET Framework.  With that in mind, I decided to use the new Shared Project system in Visual Studio 2013 and the new Shared Project Reference Manager VS extension, which allows me to add references from any project to shared code easily.  This gives me flexibility for the future and sets my project system up in advance.  You’ll see in the end that I haven’t actually needed to take that step, and perhaps won’t even need the Shared Project anymore, but for now I’m keeping it as it does me no harm.

First Compile

Once I moved the code over and set my target profile for PCL, I hit build.  Whoa.  About 140 compile errors.  Immediately I thought that I didn’t want to spend the time.  I took a look at the issues and quickly realized that the base code had, in fact, changed a bit from when I messed with it in Silverlight.  I started making a list of the things that weren’t compiling, as I was targeting .NET 4.5, Silverlight 5, Windows 8+, Windows Phone 8+, and iOS/Android (Xamarin).  The biggest errors came from the fact that the library was still using XmlDocument and hadn’t moved to XLinq.  Beyond that, there were things like Serializable, ICloneable, ComVisible and file IO that weren’t going to work.  I got really frustrated quickly and almost gave up.

Working at Microsoft, I am fortunate to be able to try more things, and indeed I reached out to some folks for help.  I was able to get some things working while continuing with XmlDocument, but it didn’t feel right, and in thinking about releasing this updated library I realized it just wasn’t going to work.  I remained frustrated.

Helpful Friends

Sometimes when you are frustrated you just want to vent to the world.  We call that Twitter these days.  I was pulling my hair out and posted a comment, which was quickly replied to by a member of the .NET team with a bit of a touché comment.

I chuckled, but I also knew that David and others were going to be the key to helping me find the fastest path here.  I started emailing the PCL team, including David and Daniel, who are incredibly knowledgeable and responsive.  I finally got most of it working, and then my colleague and I started chatting about my frustrations.  He worked on XLinq for a bit and basically told me to suck it up and do the conversions, and that it wasn’t that bad.  We walked through a few of the scenarios, and indeed it really ended up all being isolated to one area that I could quickly scan through.  I could now remove my dependency on XmlDocument and have no other dependencies for this portable library.
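For anyone staring down the same conversion, the mapping really is mostly mechanical.  Here’s a hypothetical before/after of the pattern (illustrative only – this is not actual TagLib# code, and the element names are made up):

```csharp
using System.Xml.Linq;

public static class XlinqSample
{
    // Before, with XmlDocument (not available in the PCL profile):
    //   var doc = new XmlDocument();
    //   doc.LoadXml(xml);
    //   var title = doc.SelectSingleNode("/album/title").InnerText;
    //
    // After, with XLinq:
    public static string GetAlbumTitle(string xml)
    {
        XDocument doc = XDocument.Parse(xml);
        XElement title = doc.Root.Element("title");
        return title != null ? title.Value : null;
    }
}
```

Most XmlDocument usage reduces to Parse/Element/Attribute/Value calls like this, which is why the work ended up being isolated to one scannable area.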

Hooray for helpful people!  Even when you vent, the good peeps will still help out!

Changes to TagLib# for portability

After the full conversion, a few things remain.  Right now I have #ifdef’d out some of the interfaces and attributes that weren’t working.  Once I get to the point of porting all the tests over, I’ll decide if they are even needed.  Perhaps the biggest change for users of this lib will be the removal of the default string-path method of file access.  In discussing with some folks, I could have tried to make a portable storage layer work, but it quickly made less sense to do that in the library rather than leave that simple task to the app developer.  This provides flexibility for the app to do what it wants without the library trying to work around how different platforms do their file IO routines.  What that means is that the default way of reading a file’s tags changes from:

var tagFile = File.Create("ironlionzion.mp3");
var tags = tagFile.GetTag(TagTypes.Id3v2);
string album = tags.Album;

to

// file is a StorageFile
var fileStream = await file.OpenStreamForReadAsync();
var tagFile = File.Create(new StreamFileAbstraction(file.Name, fileStream, fileStream));
var tags = tagFile.GetTag(TagTypes.Id3v2);
string album = tags.Album;

in a simple case.  Yes, you as the app author have to write a bit more code, but it puts you in control of the file location you are reading from.  You can see here that I added my StreamFileAbstraction class to my fork by default; it was the key to the Silverlight port and is actually the key for WinRT as well.  Any app developer can create their own IFileAbstraction implementation, substitute it in the ctor of the Create functions, and be ready to read.  I actually did this in the test project to re-implement a LocalFileAbstraction for test purposes, using the System.IO.File classes (which are available when running VS unit tests).
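For illustration, a minimal local-file abstraction along those lines might look like the following.  This is a sketch, not the exact code from my test project – the members follow TagLib#’s File.IFileAbstraction interface (Name, ReadStream, WriteStream, CloseStream), but verify against the version of the library you build:

```csharp
using System.IO;

// Hypothetical sketch of a local-file abstraction for TagLib#.
// Uses System.IO.File, so it only works where that API is available
// (e.g. desktop .NET / unit tests), not in WinRT sandboxed apps.
public class LocalFileAbstraction : TagLib.File.IFileAbstraction
{
    private readonly string path;

    public LocalFileAbstraction(string path)
    {
        this.path = path;
    }

    // TagLib# uses the name (extension) to pick the right file type parser.
    public string Name
    {
        get { return Path.GetFileName(path); }
    }

    public Stream ReadStream
    {
        get { return File.OpenRead(path); }
    }

    public Stream WriteStream
    {
        get { return File.Open(path, FileMode.Open, FileAccess.ReadWrite); }
    }

    public void CloseStream(Stream stream)
    {
        if (stream != null)
            stream.Dispose();
    }
}
```

With something like that in place, reading tags is just `var tagFile = TagLib.File.Create(new LocalFileAbstraction("ironlionzion.mp3"));` – the same pattern as the StreamFileAbstraction example, with the app (not the library) deciding how the bytes are opened.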

Summary

What started out as a frustrating exercise turned out to help me better understand PCLs and, hopefully, adds value for those who have been asking for this.  As mentioned, this isn’t fully tested and there is still a ways to go, so if you use it please log bugs (and help fix them) to complete the implementation.  I won’t be working on this full time, of course, but I do hope to get the test suite ported over as well.  Here are some relevant links:

Hope this helps!





DISCLAIMER:

The opinions/content expressed on this blog are provided "ASIS" with no warranties and are my own personal opinions/content (unless otherwise noted) and do not represent my employer's view in any way.