Patterns in static

Transparency vs effectiveness

12 June 06.

Democracy and talent are fundamentally at odds. You want to be able to rotate people every few years if The People should decide that they want to shift left or right, but your bureaucracy is most effective when its members have time to gain years or decades of continuous experience. The bureaumetricians have observed this problem forever; I'm pretty sure Niskanen wrote about this in the early 70s.

[Don't ever say your field is boring until you've spent time in bureaumetrics--the statistical study of bureaucracy. I know only one guy who works on this full time, whose actual real-life name is Dr. Dull. Not to kill the joke, but he's a really fun guy and his work is kind of interesting.]

The number of people who can be successful managers in a technical discipline is small, because you're looking at the intersection of people in the technical field and people who have the skills to be managers. If you cut that list in half by looking only at people on the right or the left, then you have a fifty-fifty chance of losing the most qualified individual. The person who has been on the job for a decade is pretty likely to be the most qualified: they were about right for the job a decade ago, and now they have ten years' experience. So if we switch bureaucrats every time the regime changes, we're paying a heavy cost in administrative effectiveness for the sake of democratic whim.
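The fifty-fifty claim is easy to check with a quick simulation. This is a sketch with made-up random candidates, assuming skill and party affiliation are independent and parties are a fair coin flip; nothing here comes from real hiring data:

```python
import random

def top_candidate_survives(pool_size=100, trials=10_000):
    """Estimate the chance that the single most qualified candidate
    is still eligible after the pool is filtered by party."""
    survived = 0
    for _ in range(trials):
        # each candidate: (skill, party); both drawn at random
        pool = [(random.random(), random.choice("LR")) for _ in range(pool_size)]
        best_skill, best_party = max(pool)  # most qualified candidate
        party_in_power = random.choice("LR")
        if best_party == party_in_power:
            survived += 1
    return survived / trials

print(top_candidate_survives())  # hovers around 0.5
```

Since the best candidate's party is independent of which side is hiring, the estimate settles near one half regardless of pool size.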

Resolving technical issues democratically
Worse is the case of the exceptionally technical fields. The People do not know a thing about welfare shifts from derivative securities regulation, or battlefield logistics, or the multiplier effect of the interest rate at the open market window.

The optimists tell us that even the most technical discussion can be summarized for readers with a ninth-grade education, so that they can collectively aggregate their opinions into the wisest choice. The majority of The People will ferret out the truth through the authors' flowery writing style and argumentation tricks and select the right option; the elected politicians will then listen to the majority opinion and successfully communicate it to the technical bureaucrats who were having the original debate.

Majority rule is great because it decentralizes power better than almost any other system short of just drawing straws, but there's nothing about it that makes it an effective means of detangling technical issues. The Iowa presidential futures market does a great job of aggregating a semi-majority opinion of how the majority will vote. The standard stock markets do a great job of aggregating information about how people will value a stock into a stock value. But the majority of voters are homophobes who would rather get a cut in their property taxes than fund their kids' schools, and the majority in the marketplace just wants more porn and SUVs.

The majority (by vote or dollar) has a lot going for it, but the claim that it makes wise choices is based more on faith than on fact. Some pro-market types define away the problem by assuming that if the market wants something, it must be wise; this is 100% equivalent to saying that we don't understand everything the Good Lord does, but it must always be good. It also makes a certain assumption about Benthamic additive utilities that nobody actually believes.
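The trouble with that additive-utility assumption can be shown with a toy example (made-up numbers, purely a sketch): summing utilities across people only gives a stable answer if everyone's utility is measured on the same scale, and there is no such scale.

```python
# Two voters, two policy options, with invented utility numbers.
alice = {"A": 3, "B": 1}   # Alice prefers A
bob   = {"A": 1, "B": 2}   # Bob prefers B

total = {o: alice[o] + bob[o] for o in ("A", "B")}
best = max(total, key=total.get)           # A wins, 4 to 3

# Rescale Bob's utilities by 10: his preference ordering is unchanged,
# but the Benthamite sum now picks the other option.
bob_rescaled = {o: 10 * bob[o] for o in ("A", "B")}
total2 = {o: alice[o] + bob_rescaled[o] for o in ("A", "B")}
best2 = max(total2, key=total2.get)        # B wins, 21 to 13
```

Rescaling one person's utilities changes nothing about that person's own preferences, yet it flips the collective choice, which is why nobody actually believes the additive story.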

Academia is incredibly political, but it's not a Democracy. Debates transpire among the handful of people who have devoted their lives to minutiæ like supermodularity in production functions, and everybody else gets to hang on the sidelines and watch (while they have their own debates about other issues). You won't find many academics who think we should switch to a system where technical debates are resolved by a majority vote among the vaguely-informed.

Producing accountability
OK, so the technicians should be left to do math and administer bureaucracy as best they know how, without the uninformed masses telling them that their models need to take into account the immorality of the homosexual lifestyle. On the other hand, none of us want public policy handed down from a black box in the sky, because we know that the people in that black box have their own biases. Sometimes, the eggheads have ideas that don't sit so well with the rest of us on an ethical level that we lay-idiots can readily evaluate. So how do we get the good parts of accountability without the negative effects of political meddling?

There's no easy formula for balancing the need for accountability with the need for apolitical and clear-headed analysis, but some organizations are clearly doing a better job of it than others.

The Fed
The Federal Reserve has a whole lot of really smart people who write models that you will never see, but which affect your life. The Fed desperately needs to retain those people, and as far as I understand it, it does.

Mr. DF of Bozeman, MT, pointed out to me that the Fed does so well because it insulates its eggheads from politics. He reports that there is a whole layer of management whose job is to keep the politicians happy without revealing anything about what the modelers are doing.

Now, the Fed has a few features that allow it to maintain its secrecy. It is semiautonomous by design, and when it's short on cash it just prints bonds. The system is thus built so that a politically incorrect model won't cause budget cuts. Further, there's the simple justification that if the models were public, people could plan accordingly, thus frustrating some of the Fed's attempts at control.

Joel the Guru opines that the sole role of management in a tech firm is to insulate the talent from the mess of politics and money issues and customer complaints, while still keeping those customers happy. The strategy of the Fed's management is very similar: mollify the politicians while keeping their influence on the models minimal.

The Department of Defense
Like the Fed, the DoD does a lot of things about which The Enemy must not be informed, and which are too technical for the average person to really understand. It is basically the only department of the federal government that actually does anything besides printing forms and allocating funds, and it sure ain't run like a Democracy.

In this case, the big questions are presented to The People: e.g., should we invade Iraq, and given that we ignored the majority rule and invaded anyway, should we stay or go. The detailed questions about tactics and logistics and strategy are basically nondemocratic and in most cases classified.

I'm reserving judgment about whether this is an effective or correct means of doing things, but there's the DoD strategy: let the broad policy strokes be public and insulate all details of implementation.

Now let's have a look at last week's DHS funding reallocation. Undersecretary Foresman assures us that "Political considerations play no part in the allocation process - none whatsoever. And I'm unequivocal on that." It sounds like there was a reasonably academic process involved, even including 2×2 grids, but I don't know what political haggling went into the design of the nice, scientific allocation process. There were set criteria written down before the meeting, but did those criteria favor some places over others? E.g., perhaps less focus was placed on the iconic targets that are all over NYC and DC but not so common in Nebraska. There are a few frustrated economists shaking their heads at all of it (I've met them), bothered that their perfectly good research doesn't affect anything because others who were better at politicking got there first.

In short, the DHS has no insulation, and politically unpopular modeling will get you defunded. The DHS management that I've met is not entirely happy with this, and would rather be working on increasing security than increasing the perception of security, but the design of the system, rooted directly under the President, means that it will be an uphill battle to create that insulation. The fact that certain agencies keep doing unethical things in secret doesn't help.

The World Bank
My last entry went into detail about the World Bank, where a bunch of managers oversee research assistants who gather semirelevant research done elsewhere. The Bank's operations gain little respect from academics, and The People don't much like them either, so they're a worst-case situation.

In the context here, the modelers are in no way insulated from the politicians. Every Bank report gets handed to a politician somewhere, and the politician then decides whether his or her country will continue funding the Bank. An analyst can get fired for writing a politically incorrect model. The Bank's managers seem fine with this setup; in my opinion, many even embrace it. But the embrace of politically driven analysis leads to the sort of worst-case issues I've discussed, where neither The People nor the academics trust the output. It is, however, well funded.

The UN, by the way, would probably fall into this class too, and nobody likes them either.

To the extent that I have any conclusion, it is this: the Bank and DHS and UN don't have to be this way. If the Bank's operations departments were able to retain good economists trained in modern methods (and right now they can't), then the role of management would not be to oversee RAs and generate half-baked research, but to insulate the real producers from the politics and ensure that they keep getting funded. In such a world, there would be less management than before, and the balance of power would shift in favor of the analysts and away from the politicians (because it can't shift any more toward the politicians).
