Using Flurry Analytics on Windows Phone

I recently added Flurry Analytics support to my application, and in the few weeks it has been available I have learnt a huge amount about my users and their habits, and found and fixed several bugs that no-one had even reported.

[This post is my personal opinion as some guy producing Windows Phone apps, and should not be taken as any kind of Microsoft endorsement for this company or its products]

Flurry is an analytics engine that you add to your app; it then reports as much or as little as you would like it to about your app, with the data appearing on the dashboard within about four hours. It is very easy to use (its documentation consists of a tiny text file), it works very well, and its web site lets you grok the information in an easy-to-use way. Oh, and it is totally free!

Getting Started

First off, sign up on the site, where you can register your application and get a unique key for it. Download the Windows Phone Flurry SDK and add a reference to its assembly in your project. Add a couple of calls to your Application class per their documentation and you already have the basics: you will now know how many new users you have, how long they use your app, what country they are in, which phone and OS version they are using, and lots of other information. Those of us accustomed to the glacial performance of the App Hub system and web site, with its inexplicable reporting delays, will be especially pleased with all aspects of the Flurry system. Apart from the default data collection, the data I find most important is ship-asserts and basic analytics.
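Per their documentation, the "couple of calls" boil down to starting a session when the app launches or is re-activated. A minimal sketch of the Application class hook-up, where FLURRY_KEY is a placeholder for the key you received when registering the app:

```csharp
// App.xaml.cs -- minimal Flurry hook-up sketch.
public partial class App : Application
{
    private const string FLURRY_KEY = "YOUR-KEY-FROM-THE-FLURRY-SITE"; // placeholder

    private void Application_Launching(object sender, LaunchingEventArgs e)
    {
        // Start a Flurry session when the app is cold-launched
        FlurryWP7SDK.Api.StartSession(FLURRY_KEY);
    }

    private void Application_Activated(object sender, ActivatedEventArgs e)
    {
        // Start a session again when returning from tombstoning/fast-app-switch
        FlurryWP7SDK.Api.StartSession(FLURRY_KEY);
    }
}
```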

Basic Analytics

I report when any page in the app is visited, sometimes with an additional argument. I also report which buttons are clicked, to determine which features customers are using and which they are not. I use the Flurry LogEvent API for this, with arguments of “Button”, the page name (“LittleWatson” in the example below) and then the name of the button itself.
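As a sketch of what such a call can look like (this is one reading of the arguments above; the page name "LittleWatson" and button name "Send" are illustrative):

```csharp
// Log a click of the "Send" button on the LittleWatson page.
// The event name is "Button"; the parameter pairs the page name with the button name.
List<FlurryWP7SDK.Models.Parameter> args = new List<FlurryWP7SDK.Models.Parameter>();
args.Add(new FlurryWP7SDK.Models.Parameter("LittleWatson", "Send"));
FlurryWP7SDK.Api.LogEvent("Button", args, false);
```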

Ship Asserts

There are points in my code where I have ship-asserts: they are similar to regular asserts, but actually exist in release builds. Before Flurry they would generate emails, based on an extended version of my LittleWatson code, but I learned when I first added Flurry to the beta version of my app that even beta testers often don’t bother to send those emails (I instrumented LittleWatson itself; see the graph above). So now I report ship-asserts, often with additional useful data, via Flurry. Flurry has a limit of 255 characters per argument, but I often have data larger than that which I want to see, so I use this code so I can still see all of the data:

 internal static void ShipAssert(string name, string item, string value)
 {
     // Flurry allows at most 255 characters per argument; stay safely under that
     const int max = 250;
     List<FlurryWP7SDK.Models.Parameter> items = new List<FlurryWP7SDK.Models.Parameter>();
     if (value == null)
         value = "<null>";
     if (value.Length < max)
     {
         items.Add(new FlurryWP7SDK.Models.Parameter(item, value));
     }
     else
     {
         // chop up long values into item, item_1, item_2, ...
         int chunk = 0;
         for (int i = 0; i < value.Length; i += max)
         {
             int len = Math.Min(value.Length - i, max);
             string key = item;
             if (chunk != 0)
                 key = item + "_" + chunk.ToString();
             items.Add(new FlurryWP7SDK.Models.Parameter(key, value.Substring(i, len)));
             chunk++;
         }
     }
     FlurryWP7SDK.Api.LogEvent(name, items, false);
 }

On the Flurry web site, if you Export as CSV on an Event, it will gather together the _1, _2, etc. chunks for you (though not in order); you cannot see this in the default view. Two of my ship-asserts led me to find bugs that I had no idea existed: no-one had emailed me directly or via LittleWatson, and no-one had reported them in Reviews (which is a terrible way to report bugs, IMHO).
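A call site might look something like this (the event name, parameter name, and the response check are all hypothetical):

```csharp
// Report an unexpected server response, including the full body;
// ShipAssert will chunk it if it exceeds Flurry's per-argument limit.
if (!response.StartsWith("<ok>"))
    ShipAssert("BadServerResponse", "Body", response);
```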


By default Flurry reports the version of the app by reading it from AppManifest.xml. However, in my apps I always leave that at its default value and update the versions in AssemblyInfo.cs, so I set the version explicitly via their API. It is important to do this as soon as possible, else some data will be erroneously reported as coming from the wrong version. I also change the version number reported for Debug, Beta and Trial builds, and Flurry produces handy graphs based on the version:

 // Get the Version, as set in AssemblyInfo.cs
 private static string _version;
 internal static string Version
 {
     get
     {
         if (_version == null)
         {
             string name = System.Reflection.Assembly.GetExecutingAssembly().FullName;
             _version = new System.Reflection.AssemblyName(name).Version.ToString();
         }
         return _version;
     }
 }

 private string FLURRY_KEY; // the application key from the Flurry web site

 // call this from Application_Launching and Application_Activated
 private void InitFlurry()
 {
     string ver;
 #if DEBUG
     ver = "D" + App.Version;
 #elif BETA
     ver = "B" + App.Version;
 #else
     ver = App.Version;
 #endif
     FlurryWP7SDK.Api.SetVersion(ver);
     FlurryWP7SDK.Api.StartSession(FLURRY_KEY);
 }


Flurry also has a mechanism to report crashes, so I added it to my existing LittleWatson code. I will probably disable LittleWatson-generated emails in the next update, except in beta builds, as beta users sometimes add additional information (e.g. the repro steps) when they send them.
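One way to wire this up is to forward unhandled exceptions to both reporters from the Application class. This sketch assumes the SDK exposes an Api.LogError(message, exception) entry point and that LittleWatson has a ReportException helper; check the readme that ships with the SDK for the exact call:

```csharp
// Sketch: send crashes to Flurry alongside the existing LittleWatson report.
private void Application_UnhandledException(object sender, ApplicationUnhandledExceptionEventArgs e)
{
    FlurryWP7SDK.Api.LogError("UnhandledException", e.ExceptionObject); // assumed API, see SDK readme
    LittleWatson.ReportException(e.ExceptionObject, "UnhandledException"); // existing LittleWatson hook
}
```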

Beta testing

I think it is very important to beta test with Flurry enabled, so you can be sure you are reporting the right things in the right way, and that you find the resulting data genuinely useful before you ship your app. I use a different application key for beta versions and release versions, as I have far fewer users of the former than the latter, and it is easier to detect anomalies (e.g. spikes in asserts) in the beta version when the amount of data (and users) is small. It also means that I can easily remove the beta data when it becomes obsolete, or when it was generated by a bug in my reporting code in the beta itself.
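One way to keep the two data sets separate is to pick the application key at compile time, alongside the BETA symbol already used for the version prefix (both key strings here are placeholders):

```csharp
// Separate Flurry application keys for beta and release builds.
#if BETA
private const string FLURRY_KEY = "BETA-KEY-FROM-FLURRY-SITE";
#else
private const string FLURRY_KEY = "RELEASE-KEY-FROM-FLURRY-SITE";
#endif
```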

Opting Out

As with any app that collects data from users, I highly recommend letting the user opt out of this data collection. Although all the data collected is anonymous, it is a simple courtesy to allow customers not to send it. I have a simple checkbox in my app to do this. Sure, you lose data from those who choose to opt out, but users who disable the option can hardly complain if you don’t fix an issue that they never even told you about.
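A minimal sketch of gating Flurry behind such a setting, using IsolatedStorageSettings (the setting name "AnalyticsEnabled" is illustrative; the checkbox in the app would write the same key):

```csharp
using System.IO.IsolatedStorage;

// True unless the user has unchecked the analytics checkbox.
private static bool AnalyticsEnabled
{
    get
    {
        bool enabled;
        if (IsolatedStorageSettings.ApplicationSettings.TryGetValue("AnalyticsEnabled", out enabled))
            return enabled;
        return true; // default: on, until the user opts out
    }
}

private void InitFlurry()
{
    if (!AnalyticsEnabled)
        return; // user opted out: never start a session
    FlurryWP7SDK.Api.StartSession(FLURRY_KEY);
}
```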


The support folks at Flurry were very responsive on the only occasion I had need of them: I had a beta tester in Europe telling me the app crashed on startup after a timezone change. I could not repro this myself, so I took a look at the callstacks from the App Hub (after the requisite two-day delay) and could see that the crash was in Flurry itself, so I reported it to them. Within a few weeks there was an update to the Flurry SDK that fixed the issue.


I highly recommend adding Flurry support to your Windows Phone application if you want to know more about your users and how they use your application, so you can fix and improve it.