Be a Better Technology Manager

While browsing the deep space of the web’s Alpha Quadrant this evening, I ran into a Forbes article by Victor Lipman entitled “6 Fundamentals That Can Make You A Better Manager In 2014.” I enjoyed the brief article so much that I decided to refactor it into my own words for the manager we either want to be or want to have.

In our agile quest to improve our software and our processes, we may sometimes overlook the nuances of management. In shops with empowered agile teams, it is possible for managers to make management an afterthought, letting the self-organizing teams pick up the slack. In my opinion, this false sense of security and the resulting dip in the quality of management can lead to fundamental, long-term organizational and cultural debt.

To avoid accumulating this management debt, team members should encourage, perhaps even require, the following from their managers. And certainly managers should strive to focus on these fundamentals even while riding on the success of an empowered, agile development team.

  1. Be open to suggestion. Seek out and embrace ideas and opportunities to improve your management practices. Be a good listener. Hold regular one-on-ones with your direct reports. Conduct “town hall” meetings with your direct reports and their teams. Encourage honest and open feedback. Don’t let “the way it’s always been done” overshadow your courage to improve.
  2. Expect excellence and reward it. Set high but attainable expectations. Communicate them clearly. Be gentle but firm, and require regular accountability and reporting. Openly recognize and praise success. Privately discuss under-performance and obtain solid commitments to improve from underperforming members of your team.
  3. Use your time effectively and efficiently. Be generous but measured with your time. Spend small amounts of planned time socializing with your team members. Be careful not to impinge on their time or waste yours. A thirty second conversation can do wonders for interpersonal rapport but a five minute chat session can degrade efficiency. Protect your schedule. Insist on well run, timely meetings. Focus on your priorities while making a top priority of maintaining a personal connection with your team. If they know you care, they will care when you need them.
  4. Communicate feedback in real time. Your team needs regular and immediate feedback. This includes public praise for a job well done. It also includes private, direct feedback on unfulfilled expectations with an immediate call to corrective action, or a resolution to an expectation that has become unreasonable. DO NOT deliver negative feedback by email. Back up any positive emailed feedback in person. Real words from a real person mean a thousand times more than an email.
  5. Embrace conflict. Don’t run away from or duck conflict. Hit it head on, in person, and resolve it. Don’t dwell on blame but fairly examine cause and effect and then focus on actions required to resolve the conflict and move forward. Don’t be afraid to apologize or accept responsibility for creating conflicting expectations or misunderstandings. Invite others to suggest ways to improve. Listen and see step #1.

Now go be a better manager. And if you want a better manager, share this or the Forbes article with her or him.

Insertion Sort and Sorted Merge in C#

Update (9/6/2014) - Updated code on GitHub with performance enhancement to sorted merge algorithm.

As promised a few days ago, here’s the first installment in the algorithm series. A simple insertion sort and the previous sorted merge combine to provide you with a quick way to sort and merge multiple arrays without copying data from one array to another. You can find complete code and tests on GitHub.

Here’s the insertion sort code:

/// <summary>
/// Simple insertion sort of IList or array in place.
/// </summary>
/// <typeparam name="T">Type to be sorted.</typeparam>
/// <param name="data">IList or array to be sorted.</param>
public static void InsertionSort<T>(IList<T> data) 
   where T : IComparable
{
   if (data == null || data.Count < 2) return;
   for (int keyIndex = 1; keyIndex < data.Count; keyIndex++)
   {
      var key = data[keyIndex];
      var priorIndex = keyIndex - 1;
      while (priorIndex > -1 
         && data[priorIndex] != null 
         && data[priorIndex].CompareTo(key) > 0)
      {
         data[priorIndex + 1] = data[priorIndex];
         priorIndex -= 1;
      }
      data[priorIndex + 1] = key;
   }
}
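For quick reference, the sort can be called directly on an array, since arrays implement IList&lt;T&gt;. This short sketch assumes the method is exposed on a static Sorter class, as in the test below:

```csharp
int[] numbers = { 5, 2, 9, 1 };

// Sorts in place; no new array is allocated.
Sorter.InsertionSort(numbers);
// numbers is now { 1, 2, 5, 9 }
```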

And here’s one test example:

[TestMethod]
public void CombinedMergeSortTest()
{
   IList<MergeSortTestData> a = new List<MergeSortTestData> 
   { 
      new MergeSortTestData { Name = "Robert", Age = 43.5 },
      null,
   };
   IList<MergeSortTestData> b = new List<MergeSortTestData> 
   { 
      new MergeSortTestData { Name = "Robert", Age = 23.5 },
      null,
   };

   Sorter.InsertionSort(a);
   Sorter.InsertionSort(b);

   MergeSortTestData prev = null;
   int count = 0;
   foreach (var val in Merger.SortedMerge<MergeSortTestData>(a, b))
   {
      if (null != val) Assert.IsTrue(val.CompareTo(prev) > 0);
      prev = val;
      count++;
   }
   Assert.AreEqual(4, count);
}


public class MergeSortTestData : IComparable
{
   public string Name { get; set; }
   public double Age { get; set; }

   public int CompareTo(object obj)
   {
      var other = obj as MergeSortTestData;
      if (null == other) return 1; //null is always less
      if (this.Name == other.Name)
      {
         return this.Age.CompareTo(other.Age);
      }
      return this.Name.CompareTo(other.Name);
   }
}

Mischief Managed: Aligning Blog with GitHub and NuGet

For about a year and a half I’ve been working on various open source projects published on GitHub and several of them have packages on NuGet. When I first set them up, I used the name “duovia” which was the name of a little S corp I use from time to time for corp-to-corp consulting projects. But the name was just causing confusion for anyone looking me up and checking out my open source projects.

So I set out to simplify things a bit, consolidating everything around the name “tylerjensen.” I hope this will make it easier for people to find my work and less confusing when they do.

Blog Facelift and BlogEngine.Net Upgrade

Two days ago a friend of mine pointed out that some of my posts displayed a related link to one or more pages on my blog that I had not actually authored. I don’t generally use the pages feature of BlogEngine.Net, so you can imagine my surprise to find that my blog had been hacked by someone trying to promote a cause. If any of you were offended by that content, I sincerely apologize.

I quickly removed the rogue pages and found that the most likely point of entrance was a vulnerability in the combination of Disqus and the version of BlogEngine.Net that I had been running. The upgrade was not terribly hard but it was a bit tricky. Several side effects of the upgrade included a number of broken links to older posts that used double escaped characters in their title and links. This required enabling requestFiltering with allowDoubleEscaping="true" in the web.config.
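For reference, that setting lives under the security section of system.webServer in web.config; a minimal sketch (your existing configuration will likely contain more than this):

```xml
<system.webServer>
  <security>
    <requestFiltering allowDoubleEscaping="true" />
  </security>
</system.webServer>
```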

The upgrade also sports a far better theme structure and rather than take the time to migrate my custom theme, I decided to go with the existing standard theme with just one or two modifications. This includes the new blog logo, an homage to the company that made my first computer—Commodore.

And finally, of course, just to be sure it wasn’t a simple case of Javascript injection via a malicious comment, I changed all my passwords. I also deleted older non-Disqus comments and updated my Disqus settings and password. For now, I’ll keep the current theme. It suits me. And with all of that out of the way, I can get back to keeping this blog current with what I hope will be useful material.

ASP.NET vNext Update from Hanselman

I’ve started to watch with eager interest the work being done on ASP.NET vNext. Read this excellent status rollup from Scott Hanselman.

Here are my favorite items:

  • Runs on Windows, Mac, and Linux.
  • Runtime in-memory compilation with Roslyn compiler.
  • Cloud optimized CoreCLR installed locally (optional).
  • New project.json system: takes NuGet to infinity and beyond.

I don’t often parrot other blog posts, but you really need to read Hanselman. If you haven’t been paying attention to what the team is doing with ASP.NET, you can repent now and get on board.

Also check out David Fowler’s blog and the official team blog.

Merge Algorithm for Multiple Sorted IEnumerable<T> Sources

This evening I was asked to write a merge algorithm to efficiently merge multiple iterator sources, yielding a merged iterator that would not require the algorithm to read all of the data into memory should the sources be very large. I’ve never written such an algorithm, nor can I recall seeing one, so I didn’t have a very good answer. Of course, that left a simmering thread of thought on the back burner of my brain.

After letting it rattle around a bit and without resorting to old fashioned Googling, I sat down and banged out the following code. It was fun to write and works but it took me much too long to write from scratch—about 90 minutes. It may be time to refresh and reload, perhaps by writing a series of posts that implement C# versions of selected algorithms found in a book I recently purchased but have since spent no time reading: Introduction to Algorithms 3rd Edition.

Updated Code (9/6/2014): The original code gets a big performance boost with this refactoring:

public static IEnumerable<T> SortedMerge<T>
  (params IEnumerable<T>[] sortedSources)
  where T : IComparable
{
  if (sortedSources == null || sortedSources.Length == 0)
    throw new ArgumentException("sortedSources is null or empty", "sortedSources");

  //1. fetch enumerators for each source
  var enums = (from n in sortedSources
         select n.GetEnumerator()).ToArray();

  //2. create index list indicating what MoveNext returned for each enumerator
  var enumHasValue = new List<bool>(enums.Length);
  // MoveNext on all and initialize enumHasValue
  for (int i = 0; i < enums.Length; i++)
  {
    enumHasValue.Add(enums[i].MoveNext());
  }

  // if all false, nothing to iterate over
  if (enumHasValue.All(x => !x)) yield break;

  //3. loop through
  while (true)
  {
    //find index with lowest value
    var lowIdx = -1;
    T lowVal = default(T);
    for (int i = 0; i < enums.Length; i++)
    {
      if (enumHasValue[i])
      {
        // must get first before doing any compares
        if (lowIdx < 0
            || null == enums[i].Current //null sorts lowest
            || enums[i].Current.CompareTo(lowVal) < 0)
        {
          lowIdx = i;
          lowVal = enums[i].Current;
        }
      }
    }

    //if none found, we're done
    if (lowIdx < 0) break;

    //get next value for enumerator chosen
    enumHasValue[lowIdx] = enums[lowIdx].MoveNext();

    //yield up the lowest value
    yield return lowVal;
  }
}
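A minimal call site for the refactored method looks like this, assuming it is hosted in a static Merger class as in the test code earlier on this blog:

```csharp
int[] a = { 1, 3, 6, 102, 105, 230 };
int[] b = { 101, 103, 112, 155, 231 };

// The merge is lazy: values are yielded one at a time, so the
// sources are never copied into an intermediate collection.
foreach (var val in Merger.SortedMerge<int>(a, b))
{
   Console.WriteLine(val); // 1, 3, 6, 101, 102, 103, ...
}
```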

Here’s the original code. I hope you enjoy it. And if you see ways to improve on it, please let me know.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace Merger
{
  class Program
  {
    static void Main(string[] args)
    {
      int[] a = { 1, 3, 6, 102, 105, 230 };
      int[] b = { 101, 103, 112, 155, 231 };

      var mm = new MergeMania();

      foreach(var val in mm.Merge<int>(a, b))
      {
        Console.WriteLine(val);
      }
      Console.ReadLine();
    }
  }

  public class MergeMania
  {
    public IEnumerable<T> Merge<T>(params IEnumerable<T>[] sortedSources) 
      where T : IComparable
    {
      if (sortedSources == null || sortedSources.Length == 0) 
        throw new ArgumentNullException("sortedSources");
      
      //1. fetch enumerators for each source
      var enums = (from n in sortedSources 
             select n.GetEnumerator()).ToArray();
      
      //2. fetch enumerators that have at least one value
      var enumsWithValues = (from n in enums 
                   where n.MoveNext() 
                   select n).ToArray();
      if (enumsWithValues.Length == 0) yield break; //nothing to iterate over
       
      //3. sort by current value in List<IEnumerator<T>>
      var enumsByCurrent = (from n in enumsWithValues 
                  orderby n.Current 
                  select n).ToList();
      //4. loop through
      while (true)
      {
        //yield up the lowest value
        yield return enumsByCurrent[0].Current;

        //move the pointer on the enumerator with that lowest value
        if (!enumsByCurrent[0].MoveNext())
        {
          //remove the first item in the list
          enumsByCurrent.RemoveAt(0);

          //check for empty
          if (enumsByCurrent.Count == 0) break; //we're done
        }
        enumsByCurrent = enumsByCurrent.OrderBy(x => x.Current).ToList();
      }
    }
  }
}

And if this answers any questions for you, please do drop me a line to let me know.

Distributed Cache Library in C#

Last week I had a conversation with another software engineer who asked me about caching. We had an interesting conversation and it was only later that I remembered that I had experimented with a distributed cache library using the precursor to ServiceWire under the covers along with the v3 open edition of ServiceStack.Text for serialization.

It’s funny that I had completely forgotten about this work. It would have been helpful had I remembered it at the time, but now that I’ve dug it back up, it might be a good idea to dust it off and put some finishing touches on it.

Here’s a small test sample:

[TestMethod]
public void TestClientConfigurationAndConnect()
{
   CacheConfiguration.Current.HostStripedNodePort = 8098;
   CacheConfiguration.Current.ClientStripedClusterEndPoints.Add("cluster",
      new IPEndPoint[]
      {
         new IPEndPoint(IPAddress.Parse("127.0.0.1"), 8098) 
      });

   using (var host = new CacheHost("cluster"))
   {
      host.Open();
      using (ICacheClient client = CacheClient.Connect("b", "cluster"))
      {
         Assert.IsNotNull(client);
         int tval = 89;
         int oval;
         client.Write("k", tval);
         var result = client.TryRead("k", out oval, 0, s => 45);
         Assert.IsNotNull(result);
         Assert.AreEqual(tval, oval);
      }
   }
}

This library was designed to make creating a local in-memory cache or a distributed cache easy. Have a look at the code and you’ll find that hosting the cache service in any application domain is rather simple. If you have multiple service hosts, cache items are distributed across them via hash. Buckets allow you to create more than one named cache, so the same key can be used without conflict, which lets more than one client process utilize the distributed cache.
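The distribution idea can be sketched roughly like this; the helper below is purely illustrative and not the library’s actual hashing code:

```csharp
// Hypothetical sketch of hash-based node selection. Combining
// bucket name and key means identical keys in different buckets
// do not collide.
static int SelectNode(string bucket, string key, int nodeCount)
{
   int hash = (bucket + "|" + key).GetHashCode();
   // mask off the sign bit to avoid a negative modulus
   return (hash & 0x7fffffff) % nodeCount;
}
```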

Of course, this code is not ready for prime time. It still has many rough edges and lacks any monitoring or stats collection. As an experiment it was fun to write. And my only excuse for not blogging about it before is that I honestly got busy with other things and forgot about it.

Have fun with it and, as always, if you get any good use out of it, I’d love to hear from you.

ServiceMq: A Peer-to-Peer Store and Forward Message Queue in C#

It’s a “catch up on blogging” weekend. Some months ago, while learning more about ZeroMq, I wrote and pushed to GitHub and NuGet a little library called ServiceMq, a peer-to-peer “store and forward” message queue inspired by what I learned about ZeroMq and incorporating the ServiceWire library I had previously created.

ServiceMq is an experimental library at this point. I have not spent any time thoroughly testing or improving it since I created it. This is not because it’s not a cool project but only because my time has been limited by demands of the day job and family. One must have priorities. That’s what my wife says anyway.

So now with a brief moment of free time, I’m happy to share with you this little bit of work. Let me explain how it works and then I’ll share some test code here to illustrate. If you are interested in it, I urge you to get the NuGet package or clone the code and try it out and let me know if it has been useful to you.

It is also very important to mention that I pulled the entire ServiceStack.Text v3 code base into the library (renaming its namespaces for neatness) as the serialization library used by ServiceMq. It enables fast and easy JSON serialization across the wire without burdening users of the library with special accommodations in their message DTO classes. You should know that the ServiceStack.Text license changed after v3, so if you plan to use that library on your own, be aware of the change. The version I’ve used is 100% compatible with the Apache 2.0 license, and a derivative notice is included in the code on GitHub.

The test code below contains the only tests I’ve written for the project, and they cover only the primary use cases. The tests run both sender and receiver queues in a single process. In practice you would generally use the library in two processes to enable message passing between them.

The store and forward persistence of messages is important for this library as performance was less important than guaranteed sending and receiving. Scale and memory consumption were not addressed in this initial release.

Here’s the order of events on the sending end:

  • Send method first writes the message to a file.
  • Send method then tries to send to the intended recipient.
  • If the send fails, the message is placed on a “failed-retry” queue.
  • If the sending process fails or is shut down, all persisted messages are read back into memory when the process restarts and creates the message queue again.
  • When the message is successfully sent, the outbound message file is deleted after the message content is appended to a rolling outbound log so that an audit of messages sent is possible.
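In pseudocode form, the send-side steps above amount to something like the following; the helper names here are invented for illustration and are not the library’s actual API:

```csharp
// Illustrative sketch only; not ServiceMq's real internals.
public void Send(Address to, string message)
{
   // 1. persist first so a crash cannot lose the message
   var file = WriteMessageFile(message);

   // 2. then attempt delivery to the intended recipient
   if (TrySend(to, message))
   {
      AppendToOutboundLog(message); // audit trail of sent messages
      File.Delete(file);            // per-message file no longer needed
   }
   else
   {
      // 3. delivery failed; retry later from the failed-retry queue
      failedRetryQueue.Enqueue(message);
   }
}
```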

Now here’s the order of events on the receiving end:

  • The message queue receives a message and writes it to a file.
  • The queue’s Receive method is called and pulls a message off the queue when one becomes available, then calls the Acknowledge method (see more on Acknowledge below).
  • Or the queue’s Accept method is called and pulls a message off the queue when one becomes available, but does NOT call the Acknowledge method. This is used by code that may fail to process the message, in which case the message is not actually removed from the inbound queue.
  • The Acknowledge method is called, either automatically in the Receive method, or manually after the Accept method is used. The Acknowledge method logs by appending the message to the inbound message log and deletes the individual message file.
  • If the receiving process fails before the Acknowledge method is called to log and delete the message file, the incoming queue will read it back into memory ahead of newly arriving messages in order to guarantee delivery order.
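Put together, the Accept/Acknowledge pattern on the receiving side looks roughly like this; the method names come from the description above, but the exact signatures are assumptions:

```csharp
// Accept pulls a message without removing it from the inbound
// queue; Acknowledge logs and deletes it only after processing
// succeeds.
var msg = queue.Accept();
try
{
   ProcessMessage(msg); // your handler (hypothetical)
   queue.Acknowledge(msg);
}
catch
{
   // no Acknowledge: the message remains persisted and will be
   // recovered if the process restarts
}
```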

Now here’s the test code that shows how each end works:

[TestMethod]
public void SimpleTest()
{
    var q1Address = new Address("q1pipe");
    var q2Address = new Address("q2pipe");
    using (var q2 = new MessageQueue("q2", q2Address, @"c:\temp\q2"))
    using (var q1 = new MessageQueue("q1", q1Address, @"c:\temp\q1"))
    {
        q1.Send(q2Address, "hello world");
        var msg = q2.Receive();
        Assert.IsNotNull(msg);
        Assert.AreEqual("hello world", msg.MessageString);
    }
}

[TestMethod]
public void SimpleTcpTest()
{
    var q1Address = new Address(Dns.GetHostName(), 8967);
    var q2Address = new Address(Dns.GetHostName(), 8968);
    using (var q2 = new MessageQueue("q2", q2Address, @"c:\temp\q2"))
    using (var q1 = new MessageQueue("q1", q1Address, @"c:\temp\q1"))
    {
        q1.Send(q2Address, "hello world");
        var msg = q2.Receive();
        Assert.IsNotNull(msg);
        Assert.AreEqual("hello world", msg.MessageString);
    }
}

[TestMethod]
public void SimpleObjectTest()
{
    var q1Address = new Address("q6pipe");
    var q2Address = new Address("q8pipe");
    using (var q2 = new MessageQueue("q8", q2Address, @"c:\temp\q8"))
    using (var q1 = new MessageQueue("q6", q1Address, @"c:\temp\q6"))
    {
        int[] data = new int[] { 4, 8, 9, 24 };
        q1.Send(q2Address, data);
        Message msg = q2.Receive();
        Assert.IsNotNull(msg);
        var data2 = msg.To<int[]>();
        Assert.AreEqual(data[1], data2[1]);
    }
}

[TestMethod]
public void SimpleBinaryTest()
{
    var q1Address = new Address("q3pipe");
    var q2Address = new Address("q4pipe");
    using (var q2 = new MessageQueue("q4", q2Address, @"c:\temp\q4"))
    using (var q1 = new MessageQueue("q3", q1Address, @"c:\temp\q3"))
    {
        byte[] data = new byte[] { 4, 8, 9, 24 };
        q1.SendBytes(q2Address, data, "mybytestest");
        Message msg = null;
        while (true)
        {
            msg = q2.Receive();
            if (msg.MessageBytes != null) break;
        }
        Assert.IsNotNull(msg);
        Assert.AreEqual(4, msg.MessageBytes.Length);
        Assert.AreEqual((byte)9, msg.MessageBytes[2]);
        Assert.AreEqual("mybytestest", msg.MessageTypeName);
    }
}

I’m certain the code base needs work and needs to be tested under load and in limited-memory conditions. Perhaps a caching strategy even needs to be implemented for scenarios where message volume is very high. I look forward to your feedback.