US national security faces rising challenges from insider threats and organizational rigidity, says political scientist Amy Zegart.
Zegart writes that in the past five years, seemingly trustworthy US military and intelligence insiders have been responsible for a number of national security incidents, including the WikiLeaks publications and the 2009 attack at Fort Hood in Texas that killed 13 and injured more than 30.
She defines “insider threats” as people who use their authorized access to harm the security of the United States. They range from the mentally ill to “coldly calculating officials” who betray critical national security secrets.
In her research, which relies upon declassified investigations by the US military, FBI, and Congress, Zegart analyzes the Fort Hood attack and one facet of the insider threat universe—Islamist terrorists.
In this case, a self-radicalized Army psychiatrist named Nidal Hasan walked into a Fort Hood facility in 2009 and fired 200 rounds, killing 13 people and wounding dozens of others. The shooting spree remains the worst terrorist attack on American soil since 9/11 and the worst mass murder at a military site in American history, she adds.
Zegart, co-director of the Center for International Security and Cooperation at Stanford University and a senior fellow at the Hoover Institution, reports the findings in the US Army War College Quarterly Parameters.
Three stumbling blocks
Zegart’s study of insider and surprise attacks as well as academic research into the theory of organizations led her to three key insights about why the Army failed to prevent Hasan’s attack when clues were clear:
1. Routines can create hidden hazards. People in bureaucracies tend to keep doing things the same old way, even when they should not, Zegart says, and not just in America. During the Cuban missile crisis of 1962, for example, US spy planes were able to spot Soviet missile installations in Cuba because the Soviets had built them exactly as they always had in the Soviet Union: without camouflage.
In the Fort Hood case, she says, bureaucratic procedures kept red flags about Hasan in different places, making them harder to detect.
2. Career incentives and organizational cultures often backfire. As Zegart writes, several researchers found that “misaligned incentives and cultures” played major roles in undermining safety before the Challenger space shuttle disaster.
Zegart’s earlier research on 9/11 found the same dynamic played a role in the FBI’s pursuit of two 9/11 hijackers just 19 days before their attack. Because the FBI’s culture prized convicting criminals after the fact rather than preventing disasters beforehand, the search for two would-be terrorists received the lowest priority and was handled by one of the least experienced agents in the New York office.
3. Organizations matter more than most people think. Robust structures, processes, and cultures that were effective in earlier periods for other tasks proved maladaptive after 9/11.
In the case of the Fort Hood attack, the evidence suggests that government investigations, which focused on individual errors and on political correctness (a reluctance to discipline or investigate a Muslim American in the military), identified only some of the root causes and missed key organizational failures.
Hasan slipped through the cracks not only because people made mistakes or were prone to political correctness, but also because defense organizations “worked in their usual ways,” according to Zegart.
Threats from the inside
In terms of organizational weaknesses, Hasan’s Fort Hood attack signaled a new challenge for the US military: rethinking what “force protection” truly means, Zegart says. Before 9/11, force protection meant physically protecting or hardening potential targets against outside attack. Now, she adds, it has evolved to recognize that threats can also come from within the Defense Department and from Americans themselves.
“For half a century, the department’s structure, systems, policies, and culture had been oriented to think about protecting forces from the outside, not the inside,” Zegart writes.
In the case of Hasan, the Defense Department failed in three different ways to identify him as a threat: through the disciplinary system, the performance evaluation system, and the counter-terrorism investigatory system run jointly with the FBI through Joint Terrorism Task Forces.
“Organizational factors played a significant role in explaining why the Pentagon could not stop Nidal Hasan in time. Despite 9/11 and a rising number of homegrown Jihadi terrorist attacks, the Defense Department struggled to adapt to insider terrorist threats,” Zegart writes.
Another problem was that the Pentagon faced substantial personnel shortages in the medical corps, especially among psychiatrists. Responding to those pressures, the Defense Department promoted Hasan despite his increasingly poor performance and erratic behavior.
In addition, Zegart found that the Defense Department official who investigated Hasan before the attack saw nothing amiss because he was the wrong person for the job: trained to ferret out waste, fraud, and abuse rather than counterterrorism, he did not know how to look for signs of radicalization or counterintelligence risk.
“In sum, the Pentagon’s force protection, discipline, promotion, and counter-terrorism investigatory systems all missed this insider threat because they were designed for other purposes in earlier times, and deep-seated organizational incentives and cultures made it difficult for officials to change what they normally did,” she writes.
Zegart acknowledges the difficulties of learning lessons from tragedies like 9/11, the NASA space shuttle accidents, and the 2009 Fort Hood shooting.
“People and organizations often remember what they should forget and forget what they should remember,” she says, adding that policymakers tend to attribute failure to people and policies. While seemingly hidden at times, the organizational roots of disaster are much more important than many think, she adds.
Source: Stanford University