Performance On Demand: Giving Your Ops Team Runtime Flexibility

Imagine you work in operations, maintaining the infrastructure that routes traffic and the servers that serve requests. Wouldn't it be nice if, when traffic suddenly surged or available server hardware dropped (be it expected or unexpected), you could alter the performance characteristics of your web applications on the fly?

This is a problem we've been tackling with our new set of web apps, and we think we've got a pretty good solution in place.

Operations Administration Panel for Runtime Configuration

For starters, we've created an administration web application for our operations folks whose primary purpose is runtime configuration. From it, operations can control various aspects of our systems, including:

  • Logging levels
  • Caching TTLs
  • Database masters/slaves and replication strategies
  • Application settings
  • Network locations for editorial assets
  • Logical service bus participants
  • Etc.

When any of these settings is updated, we send a message on the service bus informing subscribers of changes in the settings they care about. Let's dig into the caching TTLs setting mentioned above, which gives ops a way to dial in performance on demand.

Output Caching

In the administration panel, we've provided a settings page where the output caching TTL and data caching TTL can be set for a given application. When either setting is updated, we publish a message on the service bus, which our front-end rendering ASP.NET MVC application subscribes to.
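
To make this concrete, here's roughly what the publish side in the admin panel could look like. This is only a sketch: IServiceBus, SettingsStore, and the SettingsChanged message are illustrative names made up for this example, not our actual bus API.

    public class CacheSettingsController : Controller
    {
        // Hypothetical bus abstraction; the real publish API may differ.
        private readonly IServiceBus _bus;

        public CacheSettingsController(IServiceBus bus)
        {
            _bus = bus;
        }

        [HttpPost]
        public ActionResult Save(CacheSettingsData settingsData)
        {
            // Persist the new TTLs, then notify every subscriber.
            SettingsStore.Save(settingsData);

            _bus.Publish(new SettingsChanged
            {
                Id = CacheSettingsData.StorageId,
                Data = settingsData
            });

            return RedirectToAction("Index");
        }
    }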

Creating a handler in the rendering application is then pretty easy. We listen for the settings type that corresponds to caching:

    public class CacheSettingsUpdater : SettingsChangedHandler
    {
        protected override bool ShouldHandle(string id)
        {
            return string.Equals(
                id,
                CacheSettingsData.StorageId,
                StringComparison.OrdinalIgnoreCase);
        }

        protected override void Update(CacheSettingsData settingsData)
        {
            CacheSettings.UpdateSettings(settingsData);
        }
    }


As you can see, the handler informs a settings class by calling its UpdateSettings method; that class keeps a reference to the latest data.

    public static class CacheSettings
    {
        // volatile so rendering threads see the new settings as soon as
        // the handler swaps in a fresh reference from another thread
        private static volatile CacheSettingsData Data =
            new CacheSettingsData();

        public static int OutputCacheDurationSeconds
        {
            get
            {
                return
                    Data.CurrentAppCacheParameters
                        .OutputCacheDurationSeconds;
            }
        }

        internal static void UpdateSettings(CacheSettingsData data)
        {
            Data = data;
        }
    }


Leveraging it with OutputCacheAttribute

Now, in ASP.NET MVC, there is an action filter for output caching: OutputCacheAttribute. It can be applied at the controller level or on individual actions. The first time an action runs, the framework caches the result, so subsequent requests are served from cache without reprocessing until the TTL/Duration expires. The effect is that your application doesn't have to process every request and can therefore serve more of them.

The issue with wiring our runtime configuration class from above (CacheSettings) up to OutputCacheAttribute is that a filter's settings can only be specified as compile-time constants, like so:

    [OutputCache(Duration = 10)]
    public ActionResult Index()


So instead we create our own action filter, inheriting from OutputCacheAttribute, so we can control where it gets its values. I've simplified it for brevity to illustrate just the Duration extensibility point.

    public class ConfiguredOutputCacheAttribute : OutputCacheAttribute
    {
        // Hide the base Duration so callers can't hard-code a value;
        // the duration always comes from runtime configuration.
        public new int Duration
        {
            get { return base.Duration; }
            set
            {
                throw new NotSupportedException(
                    "Duration cannot be set directly. " +
                    "Set from runtime config.");
            }
        }

        public ConfiguredOutputCacheAttribute()
        {
            base.Duration = CacheSettings.OutputCacheDurationSeconds;
        }

        public override void OnActionExecuting(
            ActionExecutingContext filterContext)
        {
            // Refresh the duration from runtime config on every request,
            // so a new TTL takes effect without a redeploy.
            base.Duration = CacheSettings.OutputCacheDurationSeconds;
            base.OnActionExecuting(filterContext);
        }
    }


As you can see, when we hit OnActionExecuting, we check the CacheSettings class for the current output cache duration and set it on the base OutputCacheAttribute we inherited from. The effect is that operations can control the cache TTL during day-to-day traffic.

Then we just apply it where we want to cache:

    [ConfiguredOutputCache]
    public ActionResult Index()


Well, how did we do?

Let's see what it looks like when I simulate a light traffic load. The red line indicates request execution time.

Turning on output caching results in a dramatic drop-off in request execution time.

The dramatic drop-off occurred when I went into the operations administration panel and changed the TTL. The spikes every 10 seconds after the drop-off are where the cache duration TTL expired, forcing the page to actually be processed again.

There are a number of things we can do to enhance the flexibility of this system. For example, we could specify groupings in the operations administration panel that correspond to cache policies, and then simply specify on each instance of our attribute which policy we'd like to use:

    [ConfiguredOutputCache(CachePolicy = "FooCachePolicy")]
    public ActionResult Index(string streamSlug)
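
One way the policy lookup could work is sketched below. Note this is hypothetical: GetPolicyDurationSeconds is an assumed lookup keyed by the policy names defined in the admin panel, not an API shown above.

    public class ConfiguredOutputCacheAttribute : OutputCacheAttribute
    {
        public string CachePolicy { get; set; }

        public override void OnActionExecuting(
            ActionExecutingContext filterContext)
        {
            // Use the named policy's duration when one is specified;
            // otherwise fall back to the application-wide duration.
            base.Duration = string.IsNullOrEmpty(CachePolicy)
                ? CacheSettings.OutputCacheDurationSeconds
                : CacheSettings.GetPolicyDurationSeconds(CachePolicy);

            base.OnActionExecuting(filterContext);
        }
    }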


We think this feature will be particularly valuable in situations where we need more performance on demand, and look forward to extending it to have more flexibility as needed.

Happy coding!
