| Input | Output |
|---|---|
| Link | YouTube |
| Published | 2022/03/31 |
| Theme | |
| Status | article incomplete |
Beau says:
Beau explains how intelligence failures stem from groupthink, outdated information, assumptions about the opposition, and political pressure, leading to self-inflicted errors despite available data.
Analysts, strategists, policymakers
Importance of continuous learning and adaptation in intelligence analysis.
#Intelligence #Analysis #Groupthink #PoliticalPressure #Security
Well, howdy there, internet people.
It's Beau again.
So today, we are going to talk about how intelligence
failures occur, why there are times when the intelligence
community comes out and says, this
is what's going to happen with a high degree of certainty.
This is our assessment.
And they are wrong.
And they're way wrong.
And they had the information to know they were wrong.
But for some reason, it wasn't applied.
So we're going to talk about how that occurs.
There are four main factors to this.
The first is something you heard a lot after Iraq, groupthink.
Now, groupthink really shows up in two different ways.
There are two really prevalent ways this occurs
that lead to an intelligence assessment failure.
The first is the one that most people know about.
It's where all of the people making assessments,
they all talk to each other.
And they share information.
And they share their assessments rather than just publishing it.
So what tends to happen when that occurs
is people change their estimates based on information provided
by the people that they talk to.
The problem with that is the consensus doesn't always
settle on the most accurate assessment.
It settles on whatever the most persuasive person
in that group is saying.
And they may not be right.
So then you get a whole bunch of estimates
all saying the same thing.
And they're wrong.
It's because they all talk to each other.
And basically, they all copied from the one person
who was most persuasive about their assessment, their estimate.
You see this a lot.
And the only way to guard against this is to mute people.
I have a video because somebody after Afghanistan
asked about Malcolm Nance.
And I'm like, I don't know what he said.
I don't listen to him.
And it's because he's actually pretty good when it comes
to forecasting what's going to occur.
He actually has a pretty good record when it comes to that.
So I don't listen to him because I
don't want him to influence what I'm going to say.
It's worth noting that prior to hostility starting in Ukraine,
I was like, yeah, I think people are underestimating
the Ukrainian military.
He was like, they're underestimating them.
And it's going to be bad for the Russians from the start.
I have it on very good authority that he does the same thing.
He blocks people out.
He doesn't listen to them.
So that's one method of groupthink.
And that's the way to correct it.
You don't share it.
You don't talk to other people.
You use your information.
Another way this happens is using other people's assessments
and estimates to make yours.
In this case, if you look, most estimates
had a number for functional infantry fighting vehicles,
armored cars, stuff like that, right?
But that estimate, that was 13 years old.
A lot of them had degraded since then.
But that wasn't included because rather than looking
for the newest, freshest information,
people went with the most cited, which also happened
to be the oldest.
So standing on the shoulders of people
who created estimates before and using that to make your own
isn't the best route, because things change.
A decade in military development, that's huge.
That is huge.
The second way intelligence failures occur
is assuming the opposition won't make a mistake,
that they're aware of what the wrong move is.
This is something I do all the time,
assuming that they know better than to do something
that should obviously be the wrong move.
It's really based on the assumption
that they have the same information you do.
When we were talking about the airdrops,
I said I didn't think they were going to do that,
because it's a horrible move.
But they did do it.
They just didn't know it was going to be a horrible move.
That's another one that can lead to bad consequences,
especially if you're not a commentator,
but you're actually producing these kind of estimates
for defense.
That could lead to an airfield being undefended
when it really should be defended.
The third one is not understanding the opposition's
applied doctrine, not their public doctrine.
You had a whole lot of people during this talking
about Russia's new doctrine, what they say that it is.
The problem is we've never actually seen them apply that.
It's not their real doctrine.
Comparatively, that's like a paper
published at the War College.
Sure, it's in their library, but it's not actually
what they use.
They have no track record of using that,
so you can't base your estimate on that.
This was the source of the idea that Russia
was holding stuff in reserve, because they
have a doctrine that says that they might do something
like that, that they may use this method to try to keep
people off balance in that way.
The problem is we've never actually seen them do it,
so you can't count on that.
Be aware of it, but you can't count on it.
And then the fourth one is political pressure.
There's a lot of pressure to come up with the right answer,
and this is for commentators and people doing it
in government service.
Say you come up with a good estimate:
hey, the United States can go toe-to-toe with Russia
and China at the same time and win.
That sounds like something that would be pleasing to the higher
ups.
It's not, because when that information goes public, well,
then why are we spending all this money?
So there's an incentive to come up
with an estimate that makes it a little bit more evenly matched.
In the commentators' world, there's
the concern with having a bad take,
the concern with putting information out there
that their audience rejects and says, no, that's
not what's going to happen.
Luckily, this isn't something I really have to contend with,
but I know there's a lot of people
who do. There's a lot of people who have built audiences
that view the world in a binary, good, bad,
and they want certain estimates to reinforce that.
But that's not reality.
Back on the military side of things,
if you have an estimate that says, hey,
this highly publicized thing the administration wants
to do is going to blow up in their face,
that estimate may be ignored because the administration's
already gone public with it.
That's how intelligence failures occur.
Groupthink, assuming the opposition won't make a mistake,
not using their applied doctrine,
not the one they say they have, and political pressure.
Most times after events like this,
and it will certainly happen this time,
when they do the hot wash afterward
and ask how this happened, how they were so wrong,
the answers are always the same.
They had the information, they just didn't use it.
They used old estimates to formulate their new estimates.
They talked to each other.
They blended stuff together that shouldn't have been blended.
They didn't come up with independent analysis.
All of this stuff leads to intelligence failures,
and it's something that the US has been battling
since the fall of the Berlin Wall.
That habit of just making these mistakes over and over
and over again, it's an American tradition at this point.
Everybody is looking to Ukraine right now, but understand,
the same thing happened in 2003.
The same thing happened in Iraq the first time.
They had people terrified of the Republican Guard,
a special unit, but the reality is that's all paper doctrine.
In real life, they weren't that good,
and there were estimates that showed that,
but they had people terrified of them.
And it goes back even further than that.
This is a long-running thing.
The people who can produce accurate estimates,
they try not to make these mistakes, these four mistakes.
Those are the big ones.
Smaller ones include formulating an estimate without having the information.
In today's age, the information is out there.
You just have to look for it.
Or being fed bad information, having estimates
that were intentionally seeded with false information
from the other side.
Those things happen, but they're rare.
Most of the real errors are self-inflicted.
They're these things right here.
So anyway, it's just a thought.
Y'all have a good day.
{{Shirt}}
{{EasterEgg}}