02 December 2010

Developing Organization Change Skill

When I was trying to describe what it takes to do a good rollout of something new across a larger software development organization, I came up with a chant that made a lot of sense:
Here's the old
Here's the new
Here's the difference
Here's what you can do
Yesterday, I was faced with frustration that came from feeling incapable of doing the rollout tasks that were my lot. And I wanted other people to be capable of rolling new stuff out across the development organization, too.

I asked myself:
How am I going to get other people to be capable of rolling new stuff out?
That is when the idea came.

So I think that a successful rollout formula is:
  1. presenting all four parts of the chant in sequence, and
  2. making it easy for people who are affected by the rollout to take the next step

04 November 2010

Reality Vacuum

A place where people have chosen to ignore a certain part of reality, and where that choice to ignore is embedded in the culture of that place.

I know I've occupied such a space, and even contributed to the vacuum. However, once I realize what's going on, I always want out.

To some degree, each person in the world has to keep up in one way or another. Some just have a better feel for keeping in touch with current reality, and for what to do when they're not.

Personally, I want to get better and faster at closing that loop.

23 August 2010

Long Term Envisioning

What do people mean when they do:
Long Term Planning
I don't know what that means.

The following are my guesses as to what "Long Term Planning" really means:
  1. An exercise in stating some future desired state with no idea how to get there.
  2. An explicit agreement that you will diverge from reality at some near point in the future.
  3. An exercise in exploring organizational identity.
For conversation types #1 and #2, see the "Planning is guessing" chapter in Rework (p 19). OK, that takes care of 80% of the Long Term Planning situations.

I think that conversation type #3 is a very important conversation that typically gets way too little time. Devoting time to a pragmatic answer to "Why does this organization exist?" can produce a shared vision that makes everyone's work much more coordinated. See the last paragraph on page 58 of "A Simpler Way" (and the subsequent paragraphs) for an inspiring discussion of this topic. As of Aug. 2010, you can search for "58" inside the book on Amazon and read a couple of pages.

Each person's individuality can stop going in n directions and start converging on producing real stuff that satisfies the "Why?" question. If it's worthwhile to answer "Why?" with real stuff delivered to the real world, then it continues to be worthwhile to be associated with an organization.

If it becomes impossible to reasonably answer "Why?" or becomes impossible to actually deliver real answers to that question to real people, that is the departure point at which you have to start looking for some other organizational affiliation.

For me, the answer to "Why?" for FamilySearch software development is contained in Doctrine and Covenants 128:24 and continues to be a rich source of inspiration for interesting and valuable software contribution.

28 July 2010

Playing catch-up

The world is changing all the time, which means constant learning is required to keep current. In other words, you are always behind.

If you accept the fact that you are always going to be behind, in one way or another, the problem then becomes more tractable. How, in a limited amount of time, can you bootstrap yourself into a learning environment where you can catch up enough to get something working?

The core questions are:
  1. What to learn? (because of the limited time, you know you have to be selective)
  2. How to go about it? (because of the limited time, and the constant churn, you have to be a quick learner/applier)

Andy Hunt wrote a book about pragmatic learning. Clayton Christensen wrote several books about disruptive innovation. The Wikipedia contributors wrote an article about the term "learning curve". The ideas in these sources can be instructive.

I have my own opinion about the matter.

My answers to the core questions are:
  1. Look around and get creative about how you can apply about-to-be-stable newer technology to the software problem at hand.
  2. Climb the dynamic learning curve by becoming an "early adopter".

Being an "early adopter" is a productive approach to bootstrapping yourself into a rich learning environment. The key to quality learning is keeping it real & experiential. And trying new technology out and trying to apply it to the task at hand is certainly real & experiential.

The part that makes this whole learning equation possible is that the "innovators" actually need the "early adopters" in order to gain traction and stability. In an open world, that means you can use early adoption to inject yourself into a rich and productive learning environment at any time.

After I play this game for a while and become skilled at it, I'm guessing there will be a point at which I will want to have a talk with Paul Graham about a startup. Or maybe I will care about being an innovator in my family more than being famous in a technical sphere. Who knows.

Counterpoint

Donald Knuth is the classic example of someone whose life mission specifically excludes playing month-to-month catch-up. Oh, and by the way, that page returns the following HTTP header (in 2010):
Last-Modified: Fri, 23 Sep 2005 04:39:22 GMT

Even the innovation-encouraging Paul Graham wrote an article about addiction that cautions against blanket acceptance of technical improvements.

A Theist Balance?

I recently came across Clayton Christensen's article in the Washington Post: A Theist Balance on the Court.

I liked the pun on a-theist.

But, more seriously, I think that the topic of "voluntary obedience to unenforceable laws" is core to the American experiment.

There is a point at which people cannot be governed by law except to their own detriment. I believe that this point is reached when individuals decide to trample each others' rights and attempt to defend themselves, by legal or illegal means, in doing so.

And I very much like Clayton's reframing of the problem: If the "religions of atheism and secularism offer us no institutions whose mission is to inculcate in the next generation of Americans the instinct to obey unenforceable laws," then it is improper to marginalize the contribution of any and all of the institutions that DO inculcate such an instinct.

12 July 2010

How to rewrite a complex test

An Integrated Test is suboptimal for asserting basic functionality and basic sub-component collaboration behavior.

So if you have a massive Integrated Test, how can you rewrite that test into some number of the following kinds of tests?
  • focused unit test
  • focused collaboration test (how one class collaborates with another)
  • systems-level integration test (load balancer behavior, queuing system behavior)
I think it comes down to the following activities:
  • enumerate the different permutations of state
  • enumerate the different permutations of flow
  • for each permutation of state: create one focused unit test
  • for each permutation of flow: decide whether the permutation devolves into 1) a sub-component collaboration test, or 2) a systems-level integration test
  • create the required focused collaboration tests
  • create any required systems-level integration tests (usually very rare)
There is an interesting smell that comes from the activity of creating tests. There may be an existing test that is responsible for asserting the focused behavior, but it isn't in the right place, so it is hard to find out whether it exists. In this case, the act of "create focused test" implies the act of "move focused test into its rightful home" (so others, including yourself, can find it later).
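To make that decomposition concrete, here is a minimal sketch, assuming JUnit 4 and Mockito, and using hypothetical classes (NameFormatter, PersonStore, PedigreeBuilder) invented for illustration: one focused unit test covering a single permutation of state, and one focused collaboration test covering a single permutation of flow.

  import static org.junit.Assert.assertEquals;
  import static org.mockito.Mockito.mock;
  import static org.mockito.Mockito.verify;

  import org.junit.Test;

  public class RewriteSketchTest {

    // Hypothetical value object, defined inline so the sketch is self-contained.
    static class Person {
      final String given, surname;
      Person(String given, String surname) { this.given = given; this.surname = surname; }
    }

    // Holds the state-based logic: the natural home for focused unit tests.
    static class NameFormatter {
      String format(Person p) { return p.surname + ", " + p.given; }
    }

    // The collaborator boundary: the natural home for focused collaboration tests.
    interface PersonStore {
      Person findFatherOf(String personId);
    }

    static class PedigreeBuilder {
      private final PersonStore store;
      PedigreeBuilder(PersonStore store) { this.store = store; }
      Person fatherOf(String personId) { return store.findFatherOf(personId); }
    }

    // Focused unit test: one permutation of state, no collaborators involved.
    @Test
    public void formatsNameAsSurnameFirst() {
      assertEquals("Doe, Jane", new NameFormatter().format(new Person("Jane", "Doe")));
    }

    // Focused collaboration test: one permutation of flow, the collaborator mocked.
    @Test
    public void asksTheStoreForTheFather() {
      PersonStore store = mock(PersonStore.class);
      new PedigreeBuilder(store).fatherOf("p1");
      verify(store).findFatherOf("p1");
    }
  }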

In a meeting in Feb. 2010, I wrote the following about problems I've experienced with a test suite at work:
  • Inability to run a test context-free => high re-run costs and downstream delays
  • Too much custom test infrastructure => high maintenance costs
  • Risk of centralized integration => waiting on central integration before shipping
The approach I suggested was to find the costliest 20% of tests and focus on those.

I suggested measuring "costliest tests" using a combination of the following criteria:
  • How many superfluous assertions in this test?
  • How many superfluous historical failures has this test generated in the last 6 months?
  • How long does it take to run this test?
  • How many "permutations of state" is this test trying to cover?
  • How many "permutations of flow" is this test trying to cover?
  • How far away from the code is this test?
  • Is there a place closer to the code where those "permutations of state and flow" can be adequately tested?
  • Are there ways to ensure all the "permutations of flow" can be covered without having to mix the test with trying to test all the "permutations of state" at the same time?
The whole idea is to simulate expensive parts of our tests in a way that still gives us the confidence that the test is valid and covers the desired integration case.
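As an illustration only, here is one way the criteria above could be combined into a single, sortable cost score. The TestStats fields and the weights are assumptions invented for this sketch, not an existing tool or an agreed-upon formula.

  public class TestCostScorer {

    // One row of (hypothetical) measurements collected for a single test.
    public static class TestStats {
      int superfluousAssertions;
      int superfluousFailuresLastSixMonths;
      double runtimeSeconds;
      int statePermutationsCovered;
      int flowPermutationsCovered;
    }

    // Higher score = costlier test = better candidate for rewriting first.
    // The weights are made up; the point is only that the criteria combine.
    public double score(TestStats s) {
      return 2.0 * s.superfluousAssertions
           + 5.0 * s.superfluousFailuresLastSixMonths
           + 1.0 * s.runtimeSeconds
           + 3.0 * s.statePermutationsCovered
           + 3.0 * s.flowPermutationsCovered;
    }
  }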

Where and how to test what

J.B. Rainsberger wrote in the fall of 2009 about why he thinks typical programmer use of "Integrated Tests" leads to a vicious cycle of denial and suboptimal behavior.

His overall ideas were summarized well by Gabino Roche, Jr.

And there are good uses of integration tests.

I was about to hit the delete button on this post, because I thought all I had to say had already been said. But there was still something to say: How do I personally work in a way that avoids the Vortex of Doom?

The key idea that has helped me personally is to pause and ask the following question:
What is the responsibility of this test?
and then to consider the answer to the related question:
What is the responsibility of the class being tested?

Of course, those are fairly basic OO questions. However, when you're writing tests along with the code, there is a situation that is easy to get stuck in: having so many things in mind at once that you get confused about the purpose of the test, and even of the software you are working to create.

There are at least three things that tend to compete for mind space:
  1. What observable behavior do you want out of your software?
  2. How do you think you might implement that?
  3. How does what you are building interact with the rest of your system?
And, when #2 gets the top spot in my mind, I find myself forgetting about #3, and resorting to copy/paste for #1 (from other similar tests). However, when I focus on #1 and, by extension, #3, I find myself getting new ideas about how to actually implement the new behavior.

In addition, I find that these new ideas are reorienting in nature. The new stuff I'm working on ends up either modifying an existing concept in a novel way, or introducing a new concept that collaborates with the existing system in a certain way. Then the test I thought I was going to write ends up being a straightforward unit test on a new class, or on new methods of an existing class, plus a couple of collaboration tests that make sure the new behavior actually gets used.

In the end, there are a few questions that need to get answered:
  1. Does the new behavior work? (unit tests will tell you this, 80% of tests)
  2. Is the new behavior hooked up? (collaboration tests will tell you this, 15% of tests)
  3. Does the whole thing hold together? (automated deploy to a production-style test site with system-level smoke tests will tell you this, 5% of tests)
And the system-level smoke tests are only responsible for making sure that certain itemized flows work, not all permutations of state that go through those flows.
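As a sketch of that third kind, here is what I mean by a system-level smoke test, assuming JUnit 4 and plain java.net; the test-site URL and the flow it exercises are hypothetical. It checks that one itemized flow responds end to end on a production-style test site, and nothing more.

  import static org.junit.Assert.assertEquals;

  import java.net.HttpURLConnection;
  import java.net.URL;

  import org.junit.Test;

  public class PedigreeSmokeTest {

    // One itemized flow, end to end, against a deployed production-style test site.
    @Test
    public void pedigreePageRespondsOnTheTestSite() throws Exception {
      URL url = new URL("https://test.example.org/pedigree?personId=p1");
      HttpURLConnection connection = (HttpURLConnection) url.openConnection();
      connection.setRequestMethod("GET");
      assertEquals(200, connection.getResponseCode());
    }
  }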

Hopefully this is a useful addition to the already-posted conversations started in 2009.

03 June 2010

Annotated pedigree on a timeline

Here is an idea for a new kind of pedigree display:
Annotated splay tree on a timeline
instead of the conventional:
Split tree with generational columns

I want to show you how this plays out for my family. There are enough steps that I'd like to show them one at a time, introducing one new display element at each step.

First, here is my family in traditional split tree, generational columns form:


Next, here is my family in splay tree form, still in generational columns:


Now, here is my family in traditional split tree form, but on a timeline:


Now, combine the two -- here is my family in splay tree form *and* on a timeline:


While it is interesting to see the generations grouped together, it would also be useful to see generational hints on the timelines themselves.

One annotation I found useful is something I want to call "generational progression". OK, the name is a rough cut; here is the legend:
  • "<<<": time that this person lived before having the child that appears on this pedigree
  • "===": time that this person lived *after* having the child that appears on this pedigree, but *before* having the grandchild (if any) that appears on this pedigree
  • "---": time that this person lived *after* having any grandchild on this pedigree


Here is my family annotated in such a way in split tree form:


And annotated in splay tree form:


What would be nice is to have this graphically displayed, instead of my hokey text attempt. Also, it would be nice to have this be data-driven. But despite the hokey text format, this let me see things about my family that I had not been able to see before. Like how my Grandma was 6 when her mom died. And that I wanted to ask my dad about his memories of his grandpa who was still alive when my dad was 8 or something.
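In the data-driven spirit, here is a minimal sketch of how the annotation could be generated from plain birth and death years. The class name, the one-character-per-year scale, and the sample dates are all hypothetical; only the "<<<" / "===" / "---" legend comes from above.

  public class GenerationalProgression {

    // born/died are this person's years; childBorn is the birth year of the child
    // who appears on this pedigree; grandchildBorn is null if no grandchild appears.
    static String annotate(int born, int died, int childBorn, Integer grandchildBorn) {
      int grandchild = (grandchildBorn == null) ? died : grandchildBorn;
      StringBuilder row = new StringBuilder();
      for (int year = born; year < died; year++) {
        if (year < childBorn) {
          row.append('<');        // before having the child on this pedigree
        } else if (year < grandchild) {
          row.append('=');        // after the child, before the grandchild
        } else {
          row.append('-');        // after any grandchild on this pedigree
        }
      }
      return row.toString();      // one character per year of this person's life
    }

    public static void main(String[] args) {
      // Hypothetical sample data: born, died, child's birth year, grandchild's birth year.
      System.out.println("Grandpa " + annotate(1900, 1975, 1925, 1950));
      System.out.println("Dad     " + annotate(1925, 1995, 1950, null));
    }
  }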

I met Janet Hovorka at the BYU Computerized Genealogy Conference. I saw her chart work, remembered this idea, showed this to her, and promised a writeup. This is my shot at keeping that promise, even if it's a little late. :)

The idea to put stuff on timelines and to annotate things graphically comes straight from listening to and reading the works of Edward Tufte.

29 April 2010

Reality Quotient

There is a fairly subjective measure I've only recently been able to give a name to:
Reality Quotient = ability to keep context while working toward a specific goal

This has to do with how deep you allow your stack to get. If you allow it to get too deep, it hurts your net throughput on Cockburn's "unvalidated decisions" or DeMarco's "Total Useful Mental Discriminations" (TUMD). With too deep a stack, you end up wasting a lot of time making useless decisions about things that have ceased to have anything to do with the REAL task at hand.

The tendency to accept decisions that pin you into a corner is closely related to lowering the Reality Quotient. There is a whole book about the attitude of Getting Real, and I equate that attitude with a high Reality Quotient.

I did a search to see whether anyone else had published a writeup under the "Reality Quotient" heading. Although I found a lot of stuff on the web, none of it really matched what I wanted to say, so this post is NOT about any of that.

The measurement I wanted to talk about is how capable you are of focusing on the problem you set out to solve until 1) it is truly solved and published to the world, or until 2) you have redefined the problem into another solvable one and published that transformation to the world.

In short, a high Reality Quotient requires a short stack, with tight feedback loops, focused on publishing real stuff to real people.

28 April 2010

Expert genealogy answers

The focus on "expert answers" really helped me to better understand the shift in focus that was recently announced for StackExchange.

The obvious application is to the genealogy domain. There is a huge "long tail" to genealogy. FamilySearch's wiki is opening that space up -- but there may be some room for a Q&A kind of experience.

IntelliJ patches don't seem to install

When I started IntelliJ (9.0.1, Community Edition) today, it popped up a dialog with an option to download a "patch" to install the 9.0.2 release.

I've clicked "Download Patch" before, but nothing ever happened, and I ended up resorting to a full install.

But today, I came across a comment that described how to apply the patch:

http://blogs.jetbrains.com/idea/2010/01/intellij-idea-901-released/#comment-128607

And it worked great for me.

Others on my team said their patches installed just fine. To be clear, I'm using the Community Edition, and it may only be an issue with that edition.

BYU Conference on Computerized Family History & Genealogy

I attended BYU's Conference on Computerized Family History & Genealogy Monday and Tuesday this week.

It was really informative, and disorienting in a good way. I've been working on New FamilySearch for a long while without fully realizing what kind of target audience I've been developing for. When I was one of the only 30-somethings in the room, and we were talking about a more-than-a-decade technology generation gap, it finally sank in that the world was different than I had thought.

The main topics I came away with were:
  • The technology generation gap is NOT ok, and needs to be bridged (by both the younger and the older generations).
  • Among serious genealogists, there is a very real drive to publish, similar to the scientific community's reputation-based drive, but motivated also by a desire to preserve research that would otherwise be obliterated by death or lack of interest on the part of direct descendants.
  • There is also a gaping technological hole in the genealogical community: Lack of Internet-style content-driven collaboration on genealogical research. Even serious genealogists seem to be content with this state of things.
  • The Next Generation (TNG) looks really nice. I want to use it for the Sumsion family tree on sumsion.org/genealogy.
The idea that I can make a difference here is VERY motivating.