"It’s impossible to move, to live, to operate at any level without leaving traces, bits, seemingly meaningless fragments of personal information." – William Gibson
One of the themes of this site is that the lack of transparency in the development process is a leading cause of mismanagement. This need not be the case.
Nearly every aspect of software development leaves a digital trace. Analyzing those traces can help eliminate the fog surrounding software development. I believe the current state of the process can be made available to decision makers. I also believe, though it’s unproven, that the quality and productivity of teams and individual hackers can be measured by analyzing the traces the process leaves behind.
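To make the idea of "analyzing the traces" concrete, here is a minimal sketch of mining the most obvious trace, commit history. The log text is made-up illustrative data, not a real repository; in practice it would come from something like `git log --pretty='%an|%ad' --name-only`. Counting commits per author is of course a toy metric, not a productivity measure.

```python
# Sketch: mining one development trace (commit history) for per-author activity.
# The sample log below is illustrative, standing in for real `git log` output.
from collections import Counter

sample_log = """\
alice|2024-01-03|src/parser.py
bob|2024-01-03|src/parser.py
alice|2024-01-04|tests/test_parser.py
alice|2024-01-05|src/lexer.py
bob|2024-01-06|docs/readme.md
"""

# Each line is author|date|file; count commits per author.
commits_per_author = Counter(line.split("|")[0]
                             for line in sample_log.splitlines())
print(commits_per_author)
```

The same pattern extends to any trace with an author and a timestamp: bug-tracker updates, code reviews, build results.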
Whether or not that assertion is true, it raises the ethical question implied by the quote from William Gibson: Under what circumstances is it acceptable to put the process under the microscope?
The question is not new, though it is new to software development, where what hackers do is generally thought too complex to measure. In my experience, most programmers believe a great hacker is several times more productive than a marginal hacker, while simultaneously believing that it’s impossible to measure hacker productivity.
There is good reason to suggest that this is not the case. The next time you tell your phone to play a song or have Google translate something, remember that you’re watching statistical natural language processing at work, and that natural languages are far more complex than programming languages. Their complexity hasn’t prevented valuable progress from being made. Look more broadly and examples abound of the successful application of statistics in systems that are more complex than software development. Our metrics are weak not because software is so complex, but because our data sucks.
There is also good reason not to ignore the question of hacker productivity. In the long run, the only way to keep programming jobs in high wage locations is through demonstrably superior productivity.
The question of how to measure performance in an ethical and non-threatening way is old news in industries where statistical process control (SPC) is common. Years ago I had the privilege of studying at NYU with W. Edwards Deming, who was renowned for having taught SPC in Japan after the war. Japan, of course, taught it to the rest of the world by decimating their low-quality competitors. If you can drive to work without wondering if your car will break down, you owe something to Deming.
In Deming’s view the ultimate goal of process improvement was to “provide jobs and more jobs.” He saw this both as a moral imperative and a practical necessity: only “driving out fear” could prevent the sabotage of the metrics needed for SPC. Because of that, he spent relatively less time discussing math and more time teaching managers how NOT to misinterpret data. And he consistently emphasized employee morale and security.
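The core of "not misinterpreting data" in SPC is distinguishing common-cause variation (the system) from special-cause variation (something worth investigating). A minimal sketch, using a standard c-chart over made-up weekly defect counts; the data and the conventional 3-sigma limits are illustrative, not from the text:

```python
# Sketch of an SPC c-chart: flag only counts outside the control limits.
# The weekly defect counts are invented for illustration.
import math

weekly_defects = [4, 6, 5, 3, 7, 5, 4, 15, 6, 5]

c_bar = sum(weekly_defects) / len(weekly_defects)   # process average
ucl = c_bar + 3 * math.sqrt(c_bar)                  # upper control limit
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))        # lower control limit

# Points inside the limits are common-cause variation: blaming an
# individual for them is exactly the misinterpretation Deming warned about.
special_causes = [(week, c) for week, c in enumerate(weekly_defects, 1)
                  if c > ucl or c < lcl]
print(special_causes)  # → [(8, 15)]: only week 8 signals a special cause
```

Note that nine of the ten weeks, however much they differ from each other, carry no signal at all; only the out-of-limits point calls for investigation.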
If we are to make effective use of data in software engineering we need to be equally vigilant. The data must be used only to (1) help lower-performing individuals improve, and (2) help move the team as a whole to a higher average. If it turns out that one of your people just doesn’t have the ability to be a strong hacker, it’s on you to find a way for them to contribute. If this happens often, you need to work on your hiring process. Having good data may help. What you don’t do is fire them. The first time someone uses your data as grounds for termination, you’re lost.
[An aside: Other people are starting to move in the direction of analyzing data produced as a normal part of the development process. Michael Feathers had a recent post on his excellent blog that mentions several different SCM-mining efforts underway, though they’re a bit different from what I have in mind.]
[Well, that, and the infinite stupidity of America’s corporate leadership. He was a pretty cranky guy on that subject.]