Tuesday, 12 March 2013

Working Example - Azure Diagnostics using log4net and table storage


This has had me tearing my hair out (and I don't have much anyway). I have had so many problems with diagnostic logging in Azure that it has quite literally tempted me to ditch Platform as a Service, go back to Infrastructure as a Service and hand-craft all my boxes.
The problem is a cloud problem. How do you use traditional file-based logs on boxes that might be recycled, added or removed at any time? How do you merge logs from multiple instances into a single file? The obvious answer is that you can't really. Microsoft have, however, provided a mechanism in the DiagnosticMonitorTraceListener, which is designed to copy log data from the standard System.Diagnostics trace output into permanent Azure storage, and this works across multiple instances into a single table.
Before we start, I would seriously advise starting with the simplest example possible and building it up once you know it works. I fall foul of this so often when I start with a large example with too many variables.
You will need all of these sections so in no particular order...


This is the entry point for your role and might be called something different, but you will need to modify OnStart to set up your transfer schedule:

public override bool OnStart()
{
    // Allow tracing from within the role entry point itself
    Trace.Listeners.Add(new DiagnosticMonitorTraceListener());
    string wadConnectionString = "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString";
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue(wadConnectionString));
    RoleInstanceDiagnosticManager roleInstanceDiagnosticManager = storageAccount.CreateRoleInstanceDiagnosticManager(RoleEnvironment.DeploymentId, RoleEnvironment.CurrentRoleInstance.Role.Name, RoleEnvironment.CurrentRoleInstance.Id);
    DiagnosticMonitorConfiguration config = DiagnosticMonitor.GetDefaultInitialConfiguration();
    config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1D);
    config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Error;
    config.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(1D);
    // Only copy Application event log entries written by the web application (HostableWebCore)
    config.WindowsEventLog.DataSources.Add("Application!*[System[Provider[@Name='HostableWebCore']]]");
    // Apply the configuration to this role instance - without this the settings above do nothing
    roleInstanceDiagnosticManager.SetCurrentConfiguration(config);
    return base.OnStart();
}

The first line might or might not be necessary, but it does allow you to trace information from within this web role. This is important because the web role runs under a different account from the IIS worker process and will have different permissions.
The rest of the code effectively sets up the connection string for storage (Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString) and configures how often, and at what level, to transfer logs from trace into storage. Most examples use 1 minute and in my case, I only want the Error traces (and above) transferred, not everything.
Notice I am also copying Windows event log entries, but only where the Provider is HostableWebCore, which is what the web application runs as.
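For reference, the diagnostics connection string itself lives in your cloud service configuration (ServiceConfiguration.cscfg). A minimal sketch, with placeholder role, account name and key:

```xml
<ServiceConfiguration serviceName="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="MyWebRole">
    <Instances count="2" />
    <ConfigurationSettings>
      <!-- Must point at a real storage account once deployed;
           UseDevelopmentStorage=true only works in the local emulator -->
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
               value="DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```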


Because of the order I made these changes in, I'm not sure whether adding the trace listener to web.config just duplicates the one added in WebRole.cs or whether it is also required for logging from the web application itself. It doesn't seem to do any harm, so here it is:

    <system.diagnostics>
      <trace autoflush="true">
        <listeners>
          <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics" name="AzureDiagnostics">
            <filter type="" />
          </add>
        </listeners>
      </trace>
    </system.diagnostics>

Nothing magical here; this is just the standard way to specify a trace listener. Because I copy the diagnostics assembly locally, I do NOT add the public key and culture information, which I believe would force web.config to look in the GAC for the assembly, which may or may not be present.


This might be included in your web.config but in my case it is in a separate file to keep web.config cleaner.

    <!-- This goes in the <configSections> element of web.config: -->
    <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler, log4net"/>

    <!-- The log4net configuration itself: -->
    <log4net>
      <appender name="TraceAppender" type="MyAssemblyName.Security.AzureTraceAppender, MyAssemblyName">
        <layout type="log4net.Layout.PatternLayout">
          <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
        </layout>
      </appender>
      <root>
        <level value="ERROR"/>
        <appender-ref ref="TraceAppender"/>
      </root>
    </log4net>

What you will notice is that I am NOT using the log4net TraceAppender directly but a sub-class called AzureTraceAppender. This code comes courtesy of Pete McEvoy, who noticed that the log4net TraceAppender logs everything using Trace.Write, which outputs everything at INFO level and is therefore blocked by any filtering done by the DiagnosticMonitorTraceListener. His suggestion is the following sub-class in your project, which does what the TraceAppender should have done in the first place:


using System.Diagnostics;
using log4net.Appender;
using log4net.Core;

namespace MyAssemblyName.Security
{
    public class AzureTraceAppender : TraceAppender
    {
        protected override void Append(LoggingEvent loggingEvent)
        {
            var level = loggingEvent.Level;
            var message = RenderLoggingEvent(loggingEvent);
            // Map log4net levels onto the level-aware Trace methods so that the
            // DiagnosticMonitorTraceListener's level filter sees the correct level
            if (level >= Level.Error)
                Trace.TraceError(message);
            else if (level >= Level.Warn)
                Trace.TraceWarning(message);
            else if (level >= Level.Info)
                Trace.TraceInformation(message);
            else
                Trace.Write(message);
            if (ImmediateFlush)
                Trace.Flush();
        }
    }
}


In your code, you can then simply use log4net as you normally would: log.Error("My Error");
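For completeness, a typical usage sketch (the controller name here is illustrative, not from the original post). Because the log4net configuration lives in a separate file, you also need to point log4net at it, for example with the assembly-level XmlConfigurator attribute:

```csharp
// e.g. in AssemblyInfo.cs: load log4net settings from the separate config file
[assembly: log4net.Config.XmlConfigurator(ConfigFile = "log4net.config", Watch = true)]

public class AccountController
{
    // One static logger per class is the usual log4net pattern
    private static readonly log4net.ILog log =
        log4net.LogManager.GetLogger(typeof(AccountController));

    public void SomeAction()
    {
        log.Error("My Error"); // ends up in WADLogsTable via the AzureTraceAppender
    }
}
```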


If this all works as expected, your application, once deployed to Azure, should create a table called WADLogsTable in the table storage of the storage account used by your application. To prove this early on, you could add a simple Trace.TraceError("Starting role"); into your web role OnStart method. Windows event log entries go into their own table, and IIS logs are copied into blob storage; I can't remember the exact container name but it is something obvious like wad-iis-logfiles.
If it doesn't work, try using System.Diagnostics.Trace.TraceError() instead of log.Error(); this bypasses log4net and tells you whether the problem is with your log4net config or with the DiagnosticMonitorTraceListener configuration. Also, check what your diagnostics connection string is set to in your cloud role configuration, especially after moving storage accounts!
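As a final sanity check, you can confirm the table exists from code. A sketch, assuming the 2.x Microsoft.WindowsAzure.Storage client library (account name and key are placeholders):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

class DiagnosticsCheck
{
    static void Main()
    {
        // Same connection string value as the Diagnostics plugin setting
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");
        CloudTableClient tableClient = account.CreateCloudTableClient();
        CloudTable logsTable = tableClient.GetTableReference("WADLogsTable");
        // The table only appears after the first scheduled transfer has run
        System.Console.WriteLine(logsTable.Exists()
            ? "WADLogsTable found"
            : "No logs transferred yet");
    }
}
```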