Cost:Benefit vs Risk:Reward
At Pythia Cyber we understand that just about every organization must take in more money than it puts out. Even non-profits have to obey this rule somehow.
This fact makes the Cost:Benefit analysis something just about every member of every senior management team understands, respects and keeps in mind.
The Risk:Reward analysis gets less attention. For many people at the big table, the risks are either hard to quantify or easy to ignore. All too often the entire subject can be side-stepped by "doing what everybody else does."
Pity the technology leader, who ignores the Risk:Reward analysis at their peril. The risks can quickly turn into very tangible problems which, in hindsight, could have been avoided.
But you cannot simply beat the "that is risky!" drum all the time. The enterprise has to transact business. These days that means that a very high proportion of enterprises have to expose themselves to cybersecurity risks. And there is often real pressure from management peers: "my bank does it" and "all our competitors do it" and "if we don't do it, we will be left behind." We have to balance the cost against the benefit, and both against the risk.
Technology leaders often do themselves a disservice by not being clear with their peers. Sometimes we need to say "no" (although "not yet" and "not unless" are better answers). We need to be better about distinguishing these common scenarios:
Houston, We Have A Problem
A mature, trusted technology base can be a good thing. Stable, trusted, good enough and paid for are all desirable attributes. Then along comes a new way of doing things and suddenly your mature, trusted technology base is a liability. In all likelihood you have not been saving that "no need to invest more" dividend all the time that your technology base just worked. This creates an issue: either we say no to joining the stampede toward that new way of doing things, or we (truthfully) say that providing that new service will require a massive amount of time and money. The problem is that saying "I can't do that thing everyone else is doing without a ton of money and time" sounds like a gross overstatement or an outright lie. We have to bring our peers along with us in talking about core systems and how, sometimes, you cannot bolt a rocket onto your car and turn it into an airplane. Sometimes, if you suddenly need to be able to fly, you have to junk your car and buy an airplane. Even if your neighbor, who bought a flying car as their first car, was able to fly immediately and at no extra cost.
Keeping Up With The Joneses
Sometimes we are asked to provide new technological capability simply because other organizations are doing it. In this case, it is tempting to "just do it" and ignore one analysis, or even both. This blows up in our faces two ways: first, in whatever cost overrun or risk we incur, and second, in loss of credibility: we agreed to do the impossible and we (inevitably) failed. But what our peers remember is that we failed. Political capital is not vanity; it is a tool you sometimes need to get over the hump. It is very rare to have too much political capital; most of us need all we can get. We certainly cannot afford to waste it because we were uncomfortable pointing out that the Cost:Benefit doesn't work or that the Risk:Reward doesn't work.
Not All Children Are Above Average
Sometimes the truthful answer to "why can't you do what THEY did" is "I don't have that capability in my team." There is not--or should not be--any shame in not being able to do something. That is what learning and recruitment are for. But learning and recruitment take time and money, and sometimes we are pressured into trying to fudge it. Surely some bright spark on our team can figure this out? Often the bright spark can figure it out: it just takes too long and you end up with someone's self-taught first attempt.
Programmer Optimism
Programmers are most often associated with this phenomenon, but many technologists have a very rosy view of technology deployment. We can see so clearly how it will all work. In the tech development world, project managers have an even more cynical way of expressing this tendency. They have a (mostly) tongue-in-cheek rule to handle programmer optimism: "Get a time estimate from a programmer. Double the amount and go up one unit of measurement. Thus 1 day becomes 2 weeks, 2 weeks becomes 4 months, etc." When I was a programmer, this was acutely annoying. When I entered management, it was still annoying but had a grain of truth to it. When I entered senior management, it seemed to be a useful tool. Hold yourself accountable internally and externally. It is better to be explicitly slower with accurate deadlines than apparently faster with untrustworthy deadlines. Guard that political capital.
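For the mechanically minded, the project managers' rule above can be sketched in a few lines of Python. This is a minimal, tongue-in-cheek sketch: the unit ladder and the function name are my own assumptions, since the rule itself only names days, weeks and months.

```python
# Tongue-in-cheek estimate adjuster: double the amount, bump the unit.
# The full unit ladder is an assumption; the rule only mentions
# days -> weeks -> months.
UNITS = ["hours", "days", "weeks", "months", "quarters"]

def pad_estimate(amount, unit):
    """Double the amount and move one step up the unit ladder."""
    i = UNITS.index(unit)
    bigger = UNITS[min(i + 1, len(UNITS) - 1)]  # cap at the top rung
    return 2 * amount, bigger

print(pad_estimate(1, "days"))   # 1 day   -> (2, 'weeks')
print(pad_estimate(2, "weeks"))  # 2 weeks -> (4, 'months')
```

Of course, the point of the rule is not the arithmetic; it is that optimistic estimates need both more quantity and a coarser grain.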
But be aware of the Cost:Benefit for long deadlines; the benefits accrue slowly, over time. The cost is immediate in that your peers may be frustrated or disappointed right from the start of the project. The risk is that someone will decide that inaccurate short deadlines are better than accurate long deadlines. The reward is that your peers will know that they can trust you--if you get the chance.
The Scotty Effect
A complicating factor when fighting Programmer Optimism is the Scotty Effect. This is when a technologist gives a wildly exaggerated deadline so that they can beat that deadline and be a hero. The name comes from the original Star Trek series, in which Chief Engineer Montgomery Scott constantly claimed that a project would take days or weeks and then, when given a grossly shorter time frame by his Captain, managed the job in a fraction of the time. This is a real phenomenon, and it is the mirror image of Programmer Optimism: Programmer Pessimism. If programmers have too sunny an outlook when they think they can see exactly how this is all going to play out, they have too dark an outlook when faced with something unfamiliar. If you manage technology people long enough, you will live to see teams cycle through these states: the task is easy! It is impossible. It will be quick! It will take forever. We can get to the moon! How about we settle for low earth orbit? Part of your job as the technology leader is to deal with this internally: we know that this is relatively normal, but our peers do not. They will be deeply unsettled by the shifting sands under their feet.
Conclusion
Leading a technology unit is challenging. Weigh the pros and cons in every case. Give the best deadlines you can, have meaningful milestones and be brutally honest about those milestones. Keep the Cost:Benefit in mind but never lose sight of the Risk:Reward. Your job is to help the organization leverage technology to do whatever it is that they do, but whatever the money folks tell you, they expect a minimum amount of safety.