tsJensen

A quest for software excellence...

Agile and Architecture

One of the most misunderstood and misrepresented documents in the history of software development is the Agile Manifesto. This may be due to many of its readers overlooking the phrase “there is value in the items on the right.” Most seem to focus on the items on the left only. Here’s the text that Cunningham, Fowler, Martin and other giants in the field created:

Manifesto for Agile Software Development

We are uncovering better ways of developing
software by doing it and helping others do it.
Through this work we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right,
we value the items on the left more.

Note that I have emphasized the items on the right. These do indeed have value, yet many advocates of Agile deliberately ignore or even exclude them from their software development process and organization. Some have advocated eliminating architecture and design entirely, leaving them to be discovered gradually through an iterative process driven by use cases, user stories, and backlog tasks.

Recently I have read a number of discussions, blog posts and articles on the question of Agile and architecture, and the conversation around the topic has been interesting. The general theme of these sources is that architecture (and design) are at odds with Agile, perhaps because of the notion that developing software is only about writing the code. This is one of the great fallacies of our time.

Architecture and Design are Software Development Artifacts

Teams and organizations that skip architecture and design will sooner or later find themselves off track and repeating work unnecessarily. Such waste is not entirely preventable, but that is no reason not to try. Teams that incorporate these activities into their iterations, regularly revisiting architectural questions such as non-functional requirements and component design, will find that they are better able to stay on course.

Organizations with multiple teams will find greater stability in moving forward when guided by a centralized architecture team made up of architects or leads who are dedicated to and work within the organization’s development teams. The architecture team works in an Agile fashion, with its own backlog and its own products, including working prototypes, cross-cutting standardized components, documentation of the architecture and designs, and work items to be placed on the backlogs of development teams.

In this way, development teams have dependencies on the architecture team and can request additional guidance, improvements, or extensions to the shared, standardized libraries for which the architecture team is responsible. These requests keep the architecture team’s backlog supplied with work throughout the software development lifecycle.

In addition to formal activities to improve architecture and design across the organization, the architecture team should regularly interact with development teams in activities that include (but are not limited to) the following:

  • practice improvement activities—e.g. SOLID principles
  • technology deep dives—e.g. digging deeper into .NET
  • technical solutions brainstorming sessions—solving the hard problems
  • technical debt evaluation and pay-down planning
  • code reviews and walkthroughs—one on one and as a team
  • presenting and sharing solutions and ideas from other development teams
  • exploring and evaluating new technologies and tools

Like testing and coding, architecture and design are part of the whole of software development. These activities are perfectly suited to Agile development practices, including Scrum. When all of these aspects of delivering quality software are taken into account and incorporated into your Agile process, your chances of success are greatly improved.

----

P.S. And if you add to all this a great DevOps team to support your efforts with automated build and deploy systems, your life will be that much easier and your chances of success will improve that much more.

ServiceMock, a New ServiceWire-Based Project

I know. There are some really great mocking libraries out there. The one I’ve used the most is Moq 4. While I’ve not been a regular user of mocking libraries, I am fascinated by their usefulness, and I’ve recently been thinking about how I might use the ServiceWire dynamic proxy to create a simple and easily extended mock library. After a few hours of work this morning, the first experimental version of ServiceMock has come to life.

This is not a serious attempt to replace Moq or any other mocking library. It is for the most part a way to demonstrate how to use the dynamic proxy of ServiceWire to do something more than interception or remote procedure call (RPC). It is entirely experimental, but you can get it via NuGet as well.

With ServiceMock, you can now do something like this:

// create your interface
public interface IDoSomething
{
   void DoNoReturn(int a, int b);
   string DoSomeReturn(string a, string b);
}

// now mock and use the mock
// note: you don't have an implementation of the interface
class Program
{
   static void Main(string[] args)
   {
      var mock = Mock.Make<IDoSomething>();

      mock.DoNoReturn(4, 5);
      var mockReturnValue = mock.DoSomeReturn("a", "b");
      Console.WriteLine(mockReturnValue);

      Console.WriteLine("Press Enter to quit.");
      Console.ReadLine();
   }
}

To create a library that takes advantage of the ServiceWire dynamic proxy, you need a factory (Mock), a channel (MockChannel) that the dynamic proxy will invoke, a channel constructor parameter class (MockDefinition), and finally a function for the invoke and for exception handling should the invoke throw (MockActions). And of course, you can supply your own customized function and assign it to the MockActions instance.

The heart of the extensibility is the ability to inject your own “invoke” function via the instance of the MockActions class in the MockDefinition constructor parameter.

var mock = Mock.Make<IDoSomething>(new MockDefinition
{
   Id = 1,
   Actions = new MockActions
   {
      Invoke = 
         (id, methodName, returnType, parameters) =>
         {
            // do your thing here
            var retval = new object[parameters.Length + 1];

            // assign your return value to the first object
            // in the return array
            retval[0] = returnType.Name == "String"
               ? returnType.ToString()
               : TypeHelper.GetDefault(returnType);

            //by default, return all parameters as supplied
            for (int i = 0; i < parameters.Length; i++)
            {
               retval[i + 1] = parameters[i];
            }
            return retval;
         },
      InvokeExceptionHandler = 
         (id, methodName, returnType, parameters, exception) =>
         {
            //do your custom exception handler if your invoke throws
            return true; //return true if you want exception thrown 
            //return false if you want the exception buried
         }
   }
});

Here’s the default “invoke” code should you not wish to provide one.

(id, methodName, returnType, parameters) =>
   {
      Console.WriteLine(id + methodName);
      var retval = new object[parameters.Length + 1];
      
      //return params must have returnType 
      //as first element in the return values
      retval[0] = returnType.Name == "String" 
         ? returnType.ToString() 
         : TypeHelper.GetDefault(returnType);

      //by default, return all parameters as supplied
      for (int i = 0; i < parameters.Length; i++)
      {
         retval[i + 1] = parameters[i];
      }
      return retval;
   };

I hope you find this useful, but I hope even more that you will build your own dynamic proxy wrapper for your own cool purposes. Of course, you might want to log the calls, aggregate counts per methodName, or whatever you wish.
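
A per-method call counter, for instance, might look something like this. This is just a sketch: the ConcurrentDictionary is my own addition rather than part of ServiceMock, the Id value is arbitrary, and it assumes methodName arrives as a string, as the default “invoke” suggests.

// count calls per method name while keeping the default return behavior
// requires: using System.Collections.Concurrent;
var callCounts = new ConcurrentDictionary<string, int>();

var countingMock = Mock.Make<IDoSomething>(new MockDefinition
{
   Id = 2,
   Actions = new MockActions
   {
      Invoke = (id, methodName, returnType, parameters) =>
      {
         // aggregate a call count per method name
         callCounts.AddOrUpdate(methodName, 1, (key, count) => count + 1);

         // same default behavior as above: return value first, then the parameters
         var retval = new object[parameters.Length + 1];
         retval[0] = returnType.Name == "String"
            ? returnType.ToString()
            : TypeHelper.GetDefault(returnType);
         for (int i = 0; i < parameters.Length; i++)
         {
            retval[i + 1] = parameters[i];
         }
         return retval;
      }
   }
});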

ServiceWire and ServiceMq Improvements

Over the past several days, I’ve been working on a day-job project that may use ServiceMq, which uses ServiceWire under the covers. I say “may” because it depends on whether the prototype proves to be sufficiently reliable and efficient. The prototype is really more of an extended integration test with the following requirements:

  • Blocking Send method that throws if sending fails.
    • Tries the primary destination first.
    • Alternative destinations are tried successively until the message is successfully sent.
    • Control over send connection timeout failure to enable “fail fast.”
  • Standard caching receive.

This is because the senders are transient and may not be restarted should they fail. The senders also need immediate feedback because the action is part of a transaction involving other operations.

The first order of business was to add the Flash class to ServiceMq. This evolved into the Flasher class, which implements IDisposable in order to take advantage of client connection pooling using the updated PooledDictionary class in ServiceWire (more on this later). The Flasher’s Send method allows you to send a message to a primary destination with zero to many alternate destinations.

[TestMethod]
public void FlashDestDownTcpTest()
{
   var qfrom = new Address(Dns.GetHostName(), 8966);
   var q1Address = new Address(Dns.GetHostName(), 8967);
   var q2Address = new Address(Dns.GetHostName(), 8968);

   // create and use Flasher in using to guarantee Dispose call
   using (var flash = new Flasher(qfrom))
   {
      // create a receiving queue
      using (var q2 = new MessageQueue("qf2", q2Address, @"c:\temp\qf2"))
      {
         // send to primary which does not exist - received on secondary q2
         var id = flash.Send(q1Address, "my test message", q2Address);
         var msg = q2.Receive();
         Assert.IsTrue(msg.Id == id);
      }

      using (var q1 = new MessageQueue("qf1", q1Address, @"c:\temp\qf1"))
      {
         // send to primary - received on primary
         var id = flash.Send(q1Address, "my test message", q2Address);
         var msg = q1.Receive();
         Assert.IsTrue(msg.Id == id);
      }

      // demonstrate Send throws when neither receiver is "up"
      try
      {
         var id = flash.Send(q1Address, "my test message", q2Address);
      }
      catch (Exception e)
      {
         Assert.IsTrue(e is System.Net.WebException);
      }
   }
}

In order to support a more robust client-side connection timeout, a significant improvement was made to ServiceWire. The TcpEndPoint class was introduced, and an overloaded constructor was added to TcpChannel which allows you to specify a connection timeout value when creating an instance of TcpClient<T> (see below). This involved using the Socket class’s ConnectAsync method with a SocketAsyncEventArgs object.

private void Initialize(Type serviceType, 
   IPEndPoint endpoint, int connectTimeoutMs)
{
   _serviceType = serviceType;
   _client = new Socket(AddressFamily.InterNetwork, 
                        SocketType.Stream, ProtocolType.Tcp);
   _client.LingerState.Enabled = false;

   var connected = false;
   var connectEventArgs = new SocketAsyncEventArgs
   {
      // must designate the server you want to connect to
      RemoteEndPoint = endpoint
   };
   connectEventArgs.Completed 
      += new EventHandler<SocketAsyncEventArgs>((sender, e) =>
         {
            connected = true;
         });

   if (_client.ConnectAsync(connectEventArgs))
   {
      //operation pending - (false means completed synchronously)
      while (!connected)
      {
         if (!SpinWait.SpinUntil(() => connected, connectTimeoutMs))
         {
            if (null != _client) _client.Dispose();
            throw new TimeoutException("Unable to connect within " 
                                       + connectTimeoutMs + "ms");
         }
      }
   }
   if (connectEventArgs.SocketError != SocketError.Success)
   {
      if (null != _client) _client.Dispose();
      throw new SocketException((int)connectEventArgs.SocketError);
   }

   if (!_client.Connected) throw new SocketException(); 
   _stream = new BufferedStream(new NetworkStream(_client), 8192);
   _binReader = new BinaryReader(_stream);
   _binWriter = new BinaryWriter(_stream);
   SyncInterface(_serviceType);
}

The final problem to solve was TCP/IP port exhaustion. My original implementation had the sending client being created and torn down with each call to the Send method. The construction overhead is minimal, but the connect time and port exhaustion quickly become a problem when there are potentially many threads sending messages from the same client machine.

To solve the problem, I used a connection pooling strategy that involved making the PooledDictionary<TKey, TValue> implement the IDisposable interface to allow for easy disposal of pooled client objects. The Send method then uses one of two client pools, depending on whether it is a local Named Pipes connection or a TCP connection.

private void SendMessage(OutboundMessage message)
{
   NpClient<IMessageService> npClient = null;
   TcpClient<IMessageService> tcpClient = null;
   IMessageService proxy = null;
   var poolKey = message.To.ToString();
   try
   {
      // determine whether to use NamedPipes or Tcp
      var useNpClient = false;
      if (message.To.Transport == Transport.Both)
      {
         if (message.To.ServerName == message.From.ServerName)
         {
            useNpClient = true;
         }
      }
      else if (message.To.Transport == Transport.Np) useNpClient = true;

      // request a client from the pool, providing a way to create a new one
      if (useNpClient)
      {
         npClient = npClientPool.Request(poolKey, 
            () => new NpClient<IMessageService>(
               new NpEndPoint(message.To.PipeName, _connectTimeOutMs)));
         proxy = npClient.Proxy;
      }
      else
      {
         tcpClient = tcpClientPool.Request(poolKey,
            () => new TcpClient<IMessageService>(new TcpEndPoint(
               new IPEndPoint(IPAddress.Parse(message.To.IpAddress), 
                  message.To.Port), _connectTimeOutMs)));
         proxy = tcpClient.Proxy;
      }

      // send the message via the proxy RPC Enqueue* method
      if (null == message.MessageBytes)
      {
         proxy.EnqueueString(message.Id, message.From.ToString(),
            message.Sent, message.SendAttempts,
            message.MessageTypeName, message.MessageString);
      }
      else
      {
         proxy.EnqueueBytes(message.Id, message.From.ToString(),
            message.Sent, message.SendAttempts,
            message.MessageTypeName, message.MessageBytes);
      }
   }
   finally
   {
      if (null != tcpClient) tcpClientPool.Release(poolKey, tcpClient);
      if (null != npClient) npClientPool.Release(poolKey, npClient);
   }
}
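
The PooledDictionary<TKey, TValue> itself ships with ServiceWire, but if you are curious about the Request/Release pattern used above, a simplified pool along these lines gives the general idea. This is my own illustration, not the actual ServiceWire implementation.

// simplified illustration of a keyed client pool with a Request/Release pattern
// similar to the one used above; not the actual ServiceWire PooledDictionary
using System;
using System.Collections.Concurrent;

public class SimplePooledDictionary<TKey, TValue> : IDisposable
{
   private readonly ConcurrentDictionary<TKey, ConcurrentBag<TValue>> _pools =
      new ConcurrentDictionary<TKey, ConcurrentBag<TValue>>();

   // hand out a pooled instance for the key, or create one with the supplied factory
   public TValue Request(TKey key, Func<TValue> creator)
   {
      var bag = _pools.GetOrAdd(key, k => new ConcurrentBag<TValue>());
      TValue value;
      return bag.TryTake(out value) ? value : creator();
   }

   // return the instance so the next Request for this key can reuse the connection
   public void Release(TKey key, TValue value)
   {
      var bag = _pools.GetOrAdd(key, k => new ConcurrentBag<TValue>());
      bag.Add(value);
   }

   // dispose any pooled items that are disposable (e.g. pooled clients)
   public void Dispose()
   {
      foreach (var bag in _pools.Values)
      {
         TValue value;
         while (bag.TryTake(out value))
         {
            var disposable = value as IDisposable;
            if (null != disposable) disposable.Dispose();
         }
      }
   }
}

The real thing handles more details, but the key idea is the same: reuse connected clients per destination rather than reconnecting on every send.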

It remains to be seen whether these changes will result in a sufficiently robust message passing and queuing system to allow it to be used on my day job project. More testing and prototyping is required. There are alternatives to which I can fall back, but none of them are trivial and all of them are less desirable. Given my personal bias, I must take extra care to scrutinize this possible solution and abandon it should it prove to be insufficient.

Broadcast Added to ServiceMq

I have added a Broadcast method to distribute a single message to multiple destinations in guaranteed order. If one of those destinations is down, message delivery will resume when it becomes available again. If the message cannot be delivered within 24 hours, the message will be logged to the failed log.

The new version of this library also creates send, read and failed log files by minute to ensure the files do not grow too large when a very high number of messages is flowing. It will also remove the sent and read files after 48 hours, a value you can change in the constructor.

Get the NuGet package here. Or check out the code on GitHub. Here’s some test code that demonstrates how these features work. (Update: the 1.2.2 package just published adds CountOutbound and CountInbound properties to allow you to determine whether your queues are getting backed up and whether more receiving or less sending should occur based on the limits you choose.)

[TestMethod] //test of send while dest down
public void DestDownTest()
{
   var q1Address = new Address("qd1pipe");
   var q2Address = new Address("qd2pipe");
   using (var q1 = new MessageQueue("qd1", q1Address, @"c:\temp\qd1"))
   {
      q1.Send(q2Address, "hello world 1");
      Thread.Sleep(200); //destination not available
      q1.Send(q2Address, "hello world 2");
      // now fire up the destination - simulating dest down
      using (var q2 = new MessageQueue("qd2", q2Address, @"c:\temp\qd2"))
      {
         var msg = q2.Receive();
         Assert.IsNotNull(msg);
         Assert.AreEqual(msg.MessageString, "hello world 1");
         msg = q2.Receive();
         Assert.IsNotNull(msg);
         Assert.AreEqual(msg.MessageString, "hello world 2");
      }
   }
}

[TestMethod] //test of broadcast to multiple destinations
public void BroadcastTest()
{
   var q1Address = new Address("qb1pipe");
   var q2Address = new Address("qb2pipe");
   var q3Address = new Address("qb3pipe");
   var q4Address = new Address("qb4pipe");
   using (var q4 = new MessageQueue("qb4", q4Address, @"c:\temp\qb4"))
   using (var q3 = new MessageQueue("qb3", q3Address, @"c:\temp\qb3"))
   using (var q2 = new MessageQueue("qb2", q2Address, @"c:\temp\qb2"))
   using (var q1 = new MessageQueue("qb1", q1Address, @"c:\temp\qb1"))
   {
      q1.Broadcast(new [] 
         { 
            q2Address, 
            q3Address,
            q4Address
         }, "hello\r\nworld");
      var msg2 = q2.Receive();
      Assert.IsNotNull(msg2);
      Assert.AreEqual(msg2.MessageString, "hello\r\nworld");
      var msg3 = q3.Receive();
      Assert.IsNotNull(msg3);
      Assert.AreEqual(msg3.MessageString, "hello\r\nworld");
      var msg4 = q4.Receive();
      Assert.IsNotNull(msg4);
      Assert.AreEqual(msg4.MessageString, "hello\r\nworld");

      // confirm message received is the same
      Assert.AreEqual(msg2.Id, msg3.Id);
      Assert.AreEqual(msg3.Id, msg4.Id);
      Assert.AreEqual(msg2.Sent, msg3.Sent);
      Assert.AreEqual(msg3.Sent, msg4.Sent);
   }
}
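
And here is a rough idea of how the new CountOutbound property mentioned in the update above might be used to apply back pressure on a busy sender. The queue names and the 1,000-message limit are arbitrary examples of my own, not anything prescribed by ServiceMq.

// hypothetical back-pressure check using the CountOutbound property added in 1.2.2;
// the addresses and the 1000 message limit are arbitrary examples
using (var q1 = new MessageQueue("qp1", new Address("qp1pipe"), @"c:\temp\qp1"))
{
   if (q1.CountOutbound > 1000)
   {
      // outbound queue is backing up - pause sending for a moment
      Thread.Sleep(100);
   }
   else
   {
      q1.Send(new Address("qp2pipe"), "hello world");
   }
}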

If you find it helpful, please give me a shout. If you want a change, shout louder.

PriorityQueue<T> in C# Using Heap

Today I had occasion to use in a work project the PriorityQueue<T> and Heap code that I had written and blogged about recently. In doing so, I discovered a couple of bugs, fixed them, and added tests to cover the issues that were uncovered.

Here’s what changed in the PriorityQueue<T>. You can follow the link above to see the change to Heap.

// before
public void Enqueue(T item, T minItem, T maxItem)
{
   if (_order == PriorityOrder.Max)
      Heap.MaxInsert(_data, item, maxItem, _heapSize++);
   else
      Heap.MinInsert(_data, item, minItem, _heapSize++);
}

// after
public void Enqueue(T item, T minItem, T maxItem)
{
   if (_order == PriorityOrder.Max)
      Heap.MaxInsert(_data, item, minItem, _heapSize++);
   else
      Heap.MinInsert(_data, item, maxItem, _heapSize++);
}

And here is the full code for PriorityQueue<T>:

using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace AlgoLib.Structures
{
   public enum PriorityOrder
   {
      Min,
      Max
   }

   public class PriorityQueue<T> : IEnumerable<T>, 
      ICollection, IEnumerable where T : IComparable
   {
      private readonly PriorityOrder _order;
      private readonly IList<T> _data;
      private int _heapSize = 0;

      public PriorityQueue(PriorityOrder order)
      {
         _data = new List<T>();
         _order = order;
      }

      public PriorityQueue(IEnumerable<T> data, PriorityOrder order)
      {
         _data = data as IList<T>;
         if (_data == null) _data = new List<T>(data);
         _order = order;
         _heapSize = _data.Count;
         if (_order == PriorityOrder.Max)
            Heap.BuildMaxHeap(_data);
         else
            Heap.BuildMinHeap(_data);
      }

      public PriorityQueue(int initialCapacity, PriorityOrder order)
      {
         _data = new List<T>(initialCapacity);
         _order = order;
      }

      public void Clear()
      {
         _data.Clear();
      }

      public bool Contains(T item)
      {
         if (_order == PriorityOrder.Max)
            return Heap.MaxContains(_data, item, 0, _heapSize);
         else
            return Heap.MinContains(_data, item, 0, _heapSize);
      }

      public T Dequeue()
      {
         if (_heapSize == 0) throw new InvalidOperationException();
         if (_order == PriorityOrder.Max)
            return Heap.ExtractMax(_data, _heapSize--);
         else
            return Heap.ExtractMin(_data, _heapSize--);
      }

      public void Enqueue(T item, T minItem, T maxItem)
      {
         if (_order == PriorityOrder.Max)
            Heap.MaxInsert(_data, item, minItem, _heapSize++);
         else
            Heap.MinInsert(_data, item, maxItem, _heapSize++);
      }

      public T Peek()
      {
         return _data[0];
      }

      public void TrimExcess()
      {
         // trim: remove items in _data beyond _heapSize
         while (_heapSize < _data.Count)
         {
            _data.RemoveAt(_data.Count - 1);
         }
      }

      public T[] ToArray()
      {
         return _data.ToArray();
      }

      public IEnumerator<T> GetEnumerator()
      {
         return _data.GetEnumerator();
      }

      IEnumerator IEnumerable.GetEnumerator()
      {
         return _data.GetEnumerator();
      }

      public void CopyTo(Array array, int index)
      {
         _data.CopyTo((T[])array, index);
      }

      public int Count
      {
         get { return _heapSize; }
      }

      public int Size
      {
         get { return _data.Count; }
      }

      public bool IsSynchronized
      {
         get { return false; }
      }

      public object SyncRoot
      {
         get { return _data; }
      }
   }
}
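
To see the corrected Enqueue in action, here is a quick sanity check using plain ints. Since int implements IComparable, int.MinValue and int.MaxValue serve as the sentinel values the insert methods require.

// quick sanity check of the corrected Enqueue with a max-ordered queue of ints
var pq = new PriorityQueue<int>(PriorityOrder.Max);
pq.Enqueue(5, int.MinValue, int.MaxValue);
pq.Enqueue(42, int.MinValue, int.MaxValue);
pq.Enqueue(13, int.MinValue, int.MaxValue);
Console.WriteLine(pq.Dequeue()); // 42
Console.WriteLine(pq.Dequeue()); // 13
Console.WriteLine(pq.Dequeue()); // 5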

If you like it, let me know.

How to Rescue Distressed Projects and Teams

If you have worked in the software development world long enough, it has likely been your privilege (tongue firmly in cheek) to work on a project and with a team that has been taken to, or even driven over, the brink of failure. A project like this usually involves an unhappy client, frustrated management, and a very discouraged delivery team. It generally involves an “interrupt-driven” task and workflow prioritization process with a fixed delivery schedule, a once-fixed but changing set of requirements, and estimates and assumptions that failed to consider the full lifecycle of a feature, story, or task.

Often such projects are cancelled and teams dismantled. Sometimes they push through to a bitter end with something that works, but everyone is unhappy. It took too long. It cost too much. It works, but not well. Clients are lost. Teams suffer unnecessary attrition. Blame and resentment prevail. But there is a better way. Teams and projects can be rescued.

To rescue a distressed project and team is not as hard as one might think. Many have written about this. Some of us have even experienced it first hand. One excellent case study was published two years ago by Steve Andrews on InfoQ. There are many other stories like the one he shares and they all have several common aspects that can come to your rescue.

Analyze and Decide Using Facts
Working from facts and data, such as defect counts and other available metrics, can help to eliminate the emotional element and engage the team’s analytical talents.

Drive Quality with Acceptance Tests
Make quality and testing come first. Create acceptance tests for a given feature or story before you begin coding. Acceptance tests should clearly define “done” and support validation.

Eliminate Waste—Control Flow—Decrease Batch Size
These long-established principles of quality manufacturing can be applied to software development. Creating very large and complex requirements documents that will be invalidated shortly after development begins is waste. Managers pushing large sets of tasks and assigning specific work items to specific team members creates waste. “Fix all the bugs” creates a batch that may overwhelm any team. But when team members pull work from a queue (aka backlog) and a team’s total work-in-progress (WIP) is limited, individual and team work flows efficiently.

Allow Teams to Self-Organize
Coach teams in Scrum and Kanban and let them choose which works best for them to control flow and achieve individual and team efficiency. Some teams may choose a combination. In any case, self-organized teams pull work and progress more efficiently than those that wait for management to assign tasks. Management is then free to focus on grooming the backlog.

Manage the Backlog
Change control, and therefore control of the backlog, is critical to the success of a project. A manager with one of any number of titles controls what gets added to the backlog and when. The manager gathers details from stakeholders and delivery team members for each item on the backlog to provide sufficient detail for an estimate to be made. Based on input from stakeholders, the manager prioritizes items. Delivery team members add estimates before items can be taken off the backlog and put into a ready or work-in-progress state. Estimates can be in abstract “points” or “ideal days” or some other common unit of measure to allow tracking of metrics as work proceeds. Once estimates are provided, the manager works with stakeholders to finalize backlog priorities.

Work as a Team
Even if you are not using Kanban, you still need to eliminate bottlenecks and prevent individuals from working too far ahead of the team. If analysts are unable to keep up with writing acceptance tests, re-task other team members to avoid starving or bunching up the team’s work-in-progress. If the delivery team lacks a well-groomed and ready backlog, alter your planning cadence, decoupling it from your delivery cadence.

The primary factor in rescuing a distressed project and team is the motivation of management. If you believe in your people and give them the tools, processes and coaching they need to achieve great things, you can turn around a troubled project and team.

Priority Queue with Heap Algorithms in C#

Continuing with my C# Algorithms series, I’ve just completed a rather lengthy effort to implement and test min and max heap algorithms, including heap sort, along with a priority queue, something not provided by the BCL. While heap sort is not always the most efficient sorting algorithm, the functions required to accomplish it, specifically the heap data structure operations, supply everything needed to make a priority-based queue quite easy to implement.

This work is part of my continuing effort to work through all of the common algorithms found in Introduction to Algorithms, 3rd Edition. I highly recommend this work to anyone who wishes to study classic computer science algorithms. For me, the exercise of implementing them in C# is a great learning experience.

First, let’s look at how it works. Suppose you have a job scheduler that needs to execute jobs in priority order: jobs are enqueued with different priorities and must be dequeued according to that priority. Here’s the Job class. Note the implementation of MinValue and MaxValue, which are needed for the heap-based ordering required after a dequeue.

public class Job : IComparable
{
   public string JobId { get; set; }
   public double Priority { get; set; }

   public int CompareTo(object obj)
   {
      var other = obj as Job;
      if (null == other) return 1; //null is always less
      return this.Priority.CompareTo(other.Priority);
   }

   private static Job _min;
   private static Job _max;

   public static Job MinValue
   {
      get
      {
         if (_min == null)
         {
            _min = new Job { JobId = null, Priority = double.MinValue };
         }
         return _min;
      }
   }

   public static Job MaxValue
   {
      get
      {
         if (_max == null)
         {
            _max = new Job { JobId = null, Priority = double.MaxValue };
         }
         return _max;
      }
   }
}

And now here’s the test code that demonstrates the use of PriorityQueue<T> with the Job class.

[TestMethod]
public void JobTest()
{
   IList<Job> jobs = new List<Job> 
   {
      new Job { JobId = "test1", Priority = 45.0 },
      new Job { JobId = "test2", Priority = 25.0 },
      new Job { JobId = "test3", Priority = 4.0 },
      new Job { JobId = "test4", Priority = 88.0 },
      new Job { JobId = "test5", Priority = 96.0 },
      new Job { JobId = "test6", Priority = 18.0 },
      new Job { JobId = "test7", Priority = 101.0 },
      new Job { JobId = "test8", Priority = 7.0 }
   };
   var jobQueue = new PriorityQueue<Job>(jobs, PriorityOrder.Max);
   jobQueue.Enqueue(new Job 
                   { 
                      JobId = "test8", 
                      Priority = 232.0 
                   },
                   // min and max needed for MaxInsert or MinInsert
                   Job.MinValue, Job.MaxValue);
   Assert.IsTrue(jobQueue.Count == 9);
   var val = jobQueue.Dequeue();
   Assert.IsTrue(val.Priority == 232.0);
   Assert.IsTrue(jobQueue.Count == 8);
   Assert.IsTrue(jobQueue.Size == 9); //heapSize not same
   val = jobQueue.Peek();
   Assert.IsTrue(val.Priority == 101.0);
   jobQueue.TrimExcess();
   Assert.IsTrue(jobQueue.Count == jobQueue.Size);
}

I’m posting the PriorityQueue<T> class here along with the static Heap class that provides the underlying heap structure algorithms. I hope you get some use out of them.

public class PriorityQueue<T> : IEnumerable<T>, 
   ICollection, IEnumerable where T : IComparable
{
   private readonly PriorityOrder _order;
   private readonly IList<T> _data;
   private int _heapSize = 0;

   public PriorityQueue(PriorityOrder order)
   {
      _data = new List<T>();
      _order = order;
   }

   public PriorityQueue(IEnumerable<T> data, PriorityOrder order)
   {
      _data = data as IList<T>;
      if (_data == null) _data = new List<T>(data);
      _order = order;
      _heapSize = _data.Count;
      if (_order == PriorityOrder.Max)
         Heap.BuildMaxHeap(_data);
      else
         Heap.BuildMinHeap(_data);
   }

   public PriorityQueue(int initialCapacity, PriorityOrder order)
   {
      _data = new List<T>(initialCapacity);
      _order = order;
   }

   public void Clear()
   {
      _data.Clear();
   }

   public bool Contains(T item)
   {
      if (_order == PriorityOrder.Max)
         return Heap.MaxContains(_data, item, 0, _heapSize);
      else
         return Heap.MinContains(_data, item, 0, _heapSize);
   }

   public T Dequeue()
   {
      if (_heapSize == 0) throw new InvalidOperationException();
      if (_order == PriorityOrder.Max)
         return Heap.ExtractMax(_data, _heapSize--);
      else
         return Heap.ExtractMin(_data, _heapSize--);
   }

   public void Enqueue(T item, T minItem, T maxItem)
   {
      if (_order == PriorityOrder.Max)
         Heap.MaxInsert(_data, item, maxItem, _heapSize++);
      else
         Heap.MinInsert(_data, item, minItem, _heapSize++);
   }

   public T Peek()
   {
      return _data[0];
   }

   public void TrimExcess()
   {
      // trim: remove items in _data beyond _heapSize
      while (_heapSize < _data.Count)
      {
         _data.RemoveAt(_data.Count - 1);
      }
   }

   public T[] ToArray()
   {
      return _data.ToArray();
   }

   public IEnumerator<T> GetEnumerator()
   {
      return _data.GetEnumerator();
   }

   IEnumerator IEnumerable.GetEnumerator()
   {
      return _data.GetEnumerator();
   }

   public void CopyTo(Array array, int index)
   {
      _data.CopyTo((T[])array, index);
   }

   public int Count
   {
      get { return _heapSize; }
   }

   public int Size
   {
      get { return _data.Count; }
   }

   public bool IsSynchronized
   {
      get { return false; }
   }

   public object SyncRoot
   {
      get { return _data; }
   }
}

And here’s the Heap code. Not light reading. For help in walking through the code, review the unit tests on GitHub.

public static class Heap
{
   /* What a max heap looks like from 45, 25, 4, 88, 96, 18, 101, 7:
    * 
    *                    0
    *                  /   \
    *                 1     2                       
    *                / \   / \
    *               3   4 5   6
    *              /
    *             7
    *                   101
    *            96             45
    *       88        25    18       4
    *    7
    */

   /// <summary>
   /// Convert IList to a max-heap from bottom up such that each node maintains the
   /// max-heap property (data[Parent[index]] >= data[index] where Parent = (index - 1) / 2).
   /// </summary>
   /// <typeparam name="T"></typeparam>
   /// <param name="data"></param>
   public static void BuildMaxHeap<T>(IList<T> data) where T : IComparable
   {
      var heapSize = data.Count;
      for (int index = (heapSize / 2) - 1; index > -1; index--)
      {
         MaxHeapify(data, index, heapSize);
      }
   }

   /* What a min heap looks like from 45, 25, 4, 88, 96, 18, 101, 7:
    * 
    *                 0
    *               /   \
    *              1     2                       
    *             / \   / \
    *            3   4 5   6
    *           /
    *          7
    *             
    *                 4
    *           7          18
    *       25    96    45    101
    *    88       
    */

   /// <summary>
   /// Convert IList to a min-heap from bottom up such that each node maintains the
   /// min-heap property (data[Parent[index]] <= data[index] where Parent = (index - 1) / 2).
   /// </summary>
   /// <typeparam name="T"></typeparam>
   /// <param name="data"></param>
   public static void BuildMinHeap<T>(IList<T> data) where T : IComparable
   {
      var heapSize = data.Count;
      for (int index = (heapSize / 2) - 1; index > -1; index--)
      {
         MinHeapify(data, index, heapSize);
      }
   }

   /// <summary>
   /// Maintain max-heap property for data at index location for specified heap size 
   /// such that data[Parent[index]] >= data[index] where Parent = (index - 1) / 2.
   /// </summary>
   /// <typeparam name="T"></typeparam>
   /// <param name="data"></param>
   /// <param name="index"></param>
   /// <param name="heapSize"></param>
   public static void MaxHeapify<T>(IList<T> data, int index, int heapSize) where T : IComparable
   {
      var largest = index;
      var left = HeapLeft(index);
      var right = HeapRight(index);
      if (left < heapSize
         && (data[left] != null
            && data[left].CompareTo(data[index]) > 0))
      {
         largest = left;
      }
      if (right < heapSize
         && (data[right] != null
            && data[right].CompareTo(data[largest]) > 0))
      {
         largest = right;
      }
      if (largest != index)
      {
         //exchange data[index] with data[largest]
         var tempRef = data[index];
         data[index] = data[largest];
         data[largest] = tempRef;
         //recurse
         MaxHeapify(data, largest, heapSize);
      }
   }

   /// <summary>
   /// Maintain min-heap property for data at index location for specified heap size
   /// such that data[Parent[index]] <= data[index]
   /// </summary>
   /// <typeparam name="T"></typeparam>
   /// <param name="data"></param>
   /// <param name="index"></param>
   /// <param name="heapSize"></param>
   public static void MinHeapify<T>(IList<T> data, int index, int heapSize) where T : IComparable
   {
      var smallest = index;
      var left = HeapLeft(index);
      var right = HeapRight(index);
      if (left < heapSize
         && (data[left] == null
            || data[left].CompareTo(data[index]) < 0))
      {
         smallest = left;
      }
      if (right < heapSize
         && (data[right] == null
            || data[right].CompareTo(data[smallest]) < 0))
      {
         smallest = right;
      }
      if (smallest != index)
      {
         //exchange data[index] with data[smallest]
         var tempRef = data[index];
         data[index] = data[smallest];
         data[smallest] = tempRef;
         //recurse
         MinHeapify(data, smallest, heapSize);
      }
   }

   /// <summary>
   /// Extract max and re-heapify with decremented heapSize.
   /// Caller must remember to decrement local heap size.
   /// </summary>
   /// <typeparam name="T"></typeparam>
   /// <param name="data"></param>
   /// <param name="heapSize"></param>
   /// <returns></returns>
   public static T ExtractMax<T>(IList<T> data, int heapSize) where T : IComparable
   {
      heapSize--;
      if (heapSize < 0) throw new IndexOutOfRangeException();
      T max = data[0];
      data[0] = data[heapSize];
      if (heapSize > 0) MaxHeapify(data, 0, heapSize);
      return max;
   }

   /// <summary>
   /// Extract min and re-heapify with decremented heapSize.
   /// Caller must remember to decrement local heap size.
   /// </summary>
   /// <typeparam name="T"></typeparam>
   /// <param name="data"></param>
   /// <param name="heapSize"></param>
   /// <returns></returns>
   public static T ExtractMin<T>(IList<T> data, int heapSize) where T : IComparable
   {
      heapSize--;
      if (heapSize < 0) throw new IndexOutOfRangeException();
      T min = data[0];
      data[0] = data[heapSize];
      if (heapSize > 0) MinHeapify(data, 0, heapSize);
      return min;
   }

   public static void MaxIncrease<T>(IList<T> data, int index, T item) where T : IComparable
   {
      if (null == item || item.CompareTo(data[index]) < 0)
         throw new ArgumentException("new item is smaller than the current item", "item");

      data[index] = item;
      var parent = HeapParent(index);
      while (index > 0 
         && (data[parent] == null
            || data[parent].CompareTo(data[index]) < 0))
      {
         //exchange data[index] with data[parent]
         var tempRef = data[index];
         data[index] = data[parent];
         data[parent] = tempRef;
         index = parent;
         parent = HeapParent(index);
      }
   }

   public static void MinDecrease<T>(IList<T> data, int index, T item) where T : IComparable
   {
      if (null == item || item.CompareTo(data[index]) > 0)
         throw new ArgumentException("new item is greater than the current item", "item");

      data[index] = item;
      var parent = HeapParent(index);
      while (index > 0
         && (data[index] == null
            || data[index].CompareTo(data[parent]) < 0))
      {
         //exchange data[index] with data[parent]
         var tempRef = data[index];
         data[index] = data[parent];
         data[parent] = tempRef;
         index = parent;
         parent = HeapParent(index);
      }
   }

   /// <summary>
   /// Insert item into max heap. Caller must remember to increment heapSize locally.
   /// </summary>
   /// <typeparam name="T"></typeparam>
   /// <param name="data"></param>
   /// <param name="item"></param>
   /// <param name="minOfT"></param>
   /// <param name="heapSize"></param>
   public static void MaxInsert<T>(IList<T> data, T item, T minOfT, int heapSize) 
      where T : IComparable
   {
      heapSize++;
      if (heapSize < data.Count)
         data[heapSize] = minOfT;
      else
         data.Add(minOfT);
      MaxIncrease(data, heapSize - 1, item);
   }

   /// <summary>
   /// Insert item into min heap. Caller must remember to increment heapSize locally.
   /// </summary>
   /// <typeparam name="T"></typeparam>
   /// <param name="data"></param>
   /// <param name="item"></param>
   /// <param name="maxOfT"></param>
   /// <param name="heapSize"></param>
   public static void MinInsert<T>(IList<T> data, T item, T maxOfT, int heapSize) 
      where T : IComparable
   {
      heapSize++;
      if (heapSize < data.Count)
         data[heapSize] = maxOfT;
      else
         data.Add(maxOfT);
      MinDecrease(data, heapSize - 1, item);
   }

   public static bool MaxContains<T>(IList<T> data, T item, int index, int heapSize) 
      where T : IComparable
   {
      if (index >= heapSize) return false;
      if (index == 0)
      {
         if (data[index] == null)
         {
            if (item == null) return true;
         }
         else
         {
            var rootComp = data[index].CompareTo(item);
            if (rootComp == 0) return true;
            if (rootComp < 0) return false;
         }
      }
      var left = HeapLeft(index);
      var leftComp = 0;
      if (left < heapSize)
      {
         if (data[left] == null)
         {
            if (item == null) return true;
         }
         else
         {
            leftComp = data[left].CompareTo(item);
            if (leftComp == 0) return true;
         }
      }

      var right = HeapRight(index);
      var rightComp = 0;
      if (right < heapSize)
      {
         if (data[right] == null)
         {
            if (item == null) return true;
         }
         else
         {
            rightComp = data[right].CompareTo(item);
            if (rightComp == 0) return true;
         }
      }

      if (leftComp < 0 && rightComp < 0) return false;

      var leftResult = false;
      if (leftComp > 0)
      {
         leftResult = MaxContains(data, item, left, heapSize);
      }
      if (leftResult) return true;

      var rightResult = false;
      if (rightComp > 0)
      {
         rightResult = MaxContains(data, item, right, heapSize);
      }
      return rightResult;
   }


   public static bool MinContains<T>(IList<T> data, T item, int index, int heapSize)
      where T : IComparable
   {
      if (index >= heapSize) return false;
      if (index == 0)
      {
         if (data[index] == null)
         {
            if (item == null) return true;
         }
         else
         {
            var rootComp = data[index].CompareTo(item);
            if (rootComp == 0) return true;
            if (rootComp > 0) return false;
         }
      }
      var left = HeapLeft(index);
      var leftComp = 0;
      if (left < heapSize)
      {
         if (data[left] == null)
         {
            if (item == null) return true;
         }
         else
         {
            leftComp = data[left].CompareTo(item);
            if (leftComp == 0) return true;
         }
      }

      var right = HeapRight(index);
      var rightComp = 0;
      if (right < heapSize)
      {
         if (data[right] == null)
         {
            if (item == null) return true;
         }
         else
         {
            rightComp = data[right].CompareTo(item);
            if (rightComp == 0) return true;
         }
      }

      if (leftComp > 0 && rightComp > 0) return false;

      var leftResult = false;
      if (leftComp < 0)
      {
         leftResult = MinContains(data, item, left, heapSize);
      }
      if (leftResult) return true;

      var rightResult = false;
      if (rightComp < 0)
      {
         rightResult = MinContains(data, item, right, heapSize);
      }
      return rightResult;
   }


   private static int HeapParent(int i)
   {
      return (i - 1) >> 1; // (i - 1) / 2 for zero-based indexing
   }

   private static int HeapLeft(int i)
   {
      return (i << 1) + 1; //i * 2 + 1
   }

   private static int HeapRight(int i)
   {
      return (i << 1) + 2; //i * 2 + 2
   }
}
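
As a footnote, the heap sort mentioned at the top of this post falls out of these functions almost directly. Here is a minimal sketch in terms of BuildMaxHeap and MaxHeapify; the library’s own implementation may differ.

// heap sort sketched with the functions above: build a max heap, then repeatedly
// swap the max into its final position and shrink the heap region
public static void HeapSort<T>(IList<T> data) where T : IComparable
{
   Heap.BuildMaxHeap(data);
   for (int last = data.Count - 1; last > 0; last--)
   {
      // move the current max to the end of the unsorted region
      var temp = data[0];
      data[0] = data[last];
      data[last] = temp;
      // restore the max-heap property over the remaining heap of size 'last'
      Heap.MaxHeapify(data, 0, last);
   }
}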

If you find any flaws, please do let me know. I will be working on improving this library over time, so look for updates and check GitHub for the latest code. Enjoy.