While working for an internet marketing company I was tasked with reviewing an application for performance optimization opportunities. The goal was to reduce the latency of each web request. The application was nothing more than a tracking web site used to track clicks from PPC (pay-per-click) advertisements (e.g., AdWords). When a user clicked on an ad, they would be sent to the web app and then redirected to the final destination URL.

Problem?

It’s the same old story. The system started out fast, and as time went on (read: as more data/traffic came in) the latency of each request increased. Even after adding five additional servers for load balancing, the latency didn’t improve.

The definition of slow was 80-100ms. The target was 20ms. After investigation, I found that the web app was writing the details of each request to a database. Not just any database though, the only database the company had. The same database that every other application relied on.

The concept of scalability was lacking.

Continuing my investigation, I asked the usual questions about the data being logged: how it’s used, how often, and of course, how fresh it has to be. The answers made it very clear how I could meet the goal of 20ms. The data was stored in a table and processed every hour by some other process, but only data from the previous hour would be handled. In summary, the data did not need to be delivered in real time. It just needed to eventually get delivered.

Solution!

Since there was no way I could optimize the database, or the calls to it, I decided to optimize the architecture of the web app instead.

Instead of sending the request details to the database one-by-one, they would now go into a message queue. MSMQ to be exact. Sending messages to a local queue is extremely fast and is very simple to implement.

using System.Messaging;

namespace MSMQExample
{
    class Program
    {
        const string QueuePath = ".\\Private$\\requestdata";

        static void Main(string[] args)
        {
            // Create the local private queue on first use.
            if (!MessageQueue.Exists(QueuePath))
                MessageQueue.Create(QueuePath);

            // MessageQueue holds an unmanaged handle, so dispose it.
            using (var queue = new MessageQueue(QueuePath))
            {
                queue.Send(new Message("Hello!"));
            }
        }
    }
}

To get the data into the database I built a worker application that would dequeue the messages and then perform a bulk insert into the database. A bulk insert can load a very large number of rows in milliseconds, where the equivalent individual inserts would pay the per-statement overhead on every row.
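The worker’s batch step can be sketched roughly like this: drain the local queue into a `DataTable`, then push the whole batch with `SqlBulkCopy`. The queue path matches the example above, but the table and column names here are assumptions for illustration, not the actual schema.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Messaging;

class RequestDataWorker
{
    public static void Flush(string connectionString)
    {
        // Shape of the batch; columns are hypothetical.
        var batch = new DataTable();
        batch.Columns.Add("Url", typeof(string));
        batch.Columns.Add("ReceivedAt", typeof(DateTime));

        using (var queue = new MessageQueue(".\\Private$\\requestdata"))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });

            // Receive with a zero timeout; the IOTimeout exception
            // signals that the queue has been drained.
            while (true)
            {
                try
                {
                    var msg = queue.Receive(TimeSpan.Zero);
                    batch.Rows.Add((string)msg.Body, msg.SentTime);
                }
                catch (MessageQueueException e)
                    when (e.MessageQueueErrorCode == MessageQueueErrorCode.IOTimeout)
                {
                    break;
                }
            }
        }

        if (batch.Rows.Count == 0) return;

        // One round trip for the whole batch instead of one per row.
        using (var conn = new SqlConnection(connectionString))
        using (var bulk = new SqlBulkCopy(conn))
        {
            conn.Open();
            bulk.DestinationTableName = "dbo.RequestData"; // assumed table name
            bulk.WriteToServer(batch);
        }
    }
}
```

Draining the queue first and inserting second keeps the database connection open only for the duration of the single bulk copy.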

Now each request to the web app would enqueue the request details and then redirect the user to their final destination. On a set schedule (every 15 minutes), the worker application would process the queued items and perform the database insert.
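The 15-minute schedule itself needs nothing heavier than a timer in the worker process. A minimal sketch, assuming a `FlushQueueToDatabase` method like the batch step described above (the name is mine, not from the original app):

```csharp
using System;
using System.Threading;

class WorkerSchedule
{
    static void Main()
    {
        // Fire immediately, then every 15 minutes thereafter.
        using (var timer = new Timer(
            _ => FlushQueueToDatabase(),
            null,
            TimeSpan.Zero,
            TimeSpan.FromMinutes(15)))
        {
            Console.WriteLine("Worker running. Press Enter to exit.");
            Console.ReadLine(); // keep the process alive
        }
    }

    static void FlushQueueToDatabase()
    {
        // Hypothetical: dequeue pending messages and bulk insert them.
    }
}
```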

The end result? ~19ms. The web app was no longer hindered by the database and doing bulk inserts every 15 minutes eliminated unnecessary load on the database.
