The Leaderless Revolution — Chapter 1: The Wave and the Suicide Bomber

By Carne Ross


Chapter 1

1. The Wave and the Suicide Bomber

When American troops entered Iraq in 2003, they were briefed to expect a conventional army consisting, as such armies do, of tanks, artillery and infantry. Saddam Hussein’s army had once contained more Main Battle Tanks, a primary unit of the conventional army, than all the armies of Western Europe put together.[162]

The lead elements of the American and British armies, then, were surprised to find that most of the opposition they faced comprised not tanks and howitzers, but men in pickup trucks, bearing rocket-propelled grenades (RPGs) and machine guns. These bands would attempt to ambush the advancing allied columns, launch the RPGs, then flee. They were not often successful. Indeed, so desperate and dangerous to their participants were these attacks that they resembled nothing so much as the Japanese kamikaze suicide attacks familiar from the Pacific theater of the Second World War. These fedayeen fighters, as they came to be known, did not appear to belong to particular Iraqi army units, or if they did, their members had abandoned their uniforms and badges that denoted their unit allegiance.

The allied invasion proceeded with remarkably little substantial opposition. The American tanks at the head of the advance reached Baghdad almost as fast as they could drive. The capital quickly fell, the statues of the hated dictator were toppled and the allies took possession of the main government buildings abandoned by Saddam’s cohorts and thus, they believed, control of the country.

It was only in the days that followed that the real military opposition to the invasion began to assert itself. The first suicide attack had taken place during the march on Baghdad. The attacker—an Iraqi army officer dressed in civilian clothes—drove a taxi to a checkpoint near the central city of Najaf and, as American soldiers approached, detonated the vehicle. Four soldiers were killed.[24] Iraq’s then vice president, Taha Ramadan, warned that there would be many more such “martyrdom missions”: He was right, though the attacks that followed were not under government orders; his government would soon disappear. A few days later, two women suicide bombers killed three coalition soldiers north of Baghdad.

Over the days and months that followed, the number of suicide bombings rose dramatically. In one month in 2004, there were several attacks every day. As Dexter Filkins reported in The New York Times, “in the first five years, more than nine hundred people detonated themselves in Iraq, sometimes several in a single day. That was before you counted the car bombs when the driver got out before it exploded. There were thousands of those.”[25]

Suicide bombers used cars, trucks and motorbikes; often they came on foot, sometimes on bicycles. During the “surge” of American troops in 2008, attackers launched fusillades of massive “lob bombs”—explosive-filled gas cylinders propelled by crude rocket engines—from flatbed trucks parked alongside U.S. bases. The operators, their intent clearly suicidal, were inevitably annihilated, but only after unleashing hours of bombardment. Such were the ferocity and effectiveness of the attacks, and the allies’ inability effectively to stop them, that they began to undermine the will of the U.S. to remain. Even before the 2008 election of Barack Obama, who had promised during the campaign to withdraw U.S. troops from Iraq, the administration of George W. Bush had declared a date by which the soldiers would leave.

In Afghanistan, allied war planners preparing the 2001 invasion had expected a more irregular resistance. The Taliban who ran the country were more militia than a conventionally organized army, more AK-47 than Main Battle Tank (though they did have a few tanks, at least before the allied airstrikes began). Their tactics had been honed in decades of fighting against other Afghan militias and conventional military forces during the Soviet occupation in the 1980s.

Adept at ambush and hit-and-run attacks, the Taliban fighters were extremely hardy and able to endure long periods without logistical support. After trekking over Afghanistan’s harsh terrain, they would launch an attack with RPGs and machine guns, and occasionally a heavier weapon like a mortar or small artillery piece, then melt away into the unforgiving countryside. The Taliban were not known, however, to use suicide attacks. During the Soviet occupation of Afghanistan, there were no known instances of such tactics, except very occasionally by foreign mujahideen fighters, some of whom later became infamous as Al Qaeda.

When I was posted to Afghanistan as a diplomat shortly after the allied invasion, the defenses of our embassy reflected this assessment of the Taliban’s military capabilities. The embassy was located in a compound surrounded by high, thick walls. Atop the walls was another high fence of sturdy netting, designed to prevent the flight of RPGs into the compound.

In the early days, during and after the allied invasion, this military assessment proved correct. But after a while, as in Iraq, things began to change. The harbinger had taken place on September 9, 2001, an event overshadowed by the attacks shortly afterwards in Washington and New York. Suicide bombers posing as a television film crew assassinated the anti-Taliban mujahideen leader, Ahmed Shah Massoud. Setting up the camera to film him, the “cameraman” blew himself up, and fatally wounded Massoud, who died a few hours later. Indicative of the changing and multinational nature of that conflict, the bombers were Tunisian; the camera had been stolen in Grenoble, France.

There were other antecedents. The Tamil Tigers used suicide attacks extensively against the Sri Lankan army (and, sometimes, civilians) in their fight for a separate Tamil homeland in northern Sri Lanka. Hezbollah cadres used bomb-laden cars and explosive-bearing individuals to attack Israeli army patrols and convoys during Israel’s occupation of southern Lebanon. To those watching elsewhere, the technique appeared crucial in dislodging an enemy which otherwise enjoyed a massive military advantage: Israel’s conventional strength in tanks and aircraft was far superior to Hezbollah’s. Israel withdrew from Lebanon in 2000, demoralized by the suicide attacks it could not effectively prevent.

But it was the suicide attacks in Iraq that seemed to have the most influence. As such attacks mounted in Iraq and increased the discomfort of the allies, suicide attacks became more frequent in Afghanistan, where before they had barely featured. Allied troops, and even trucks carrying humanitarian supplies, were forced to form convoys, protected by tanks and armored vehicles. Not a single major road was safe to travel.

In both Iraq and Afghanistan, the use of suicide bombings produced its own consequences. American and allied forces were forced to adopt aggressive defensive tactics to prevent attacks, including challenging, shooting at and destroying people or vehicles that approached allied patrols too closely and ignored (or failed to hear or understand) the warnings given to them.

The consequences of these tactics can be imagined and were realized in civilian deaths and growing antipathy to “the occupiers.” Eleven members of the same Iraqi family were shot dead in their car approaching coalition troops, just days after the first suicide attack in March 2003.[26] The effects on the troops obliged to adopt these tactics can also be imagined. As in Iraq, debates grew about the wisdom of a long-term allied military presence in Afghanistan. One reason for the spread of the technique of suicide attacks was all too clear: It worked.

This was a new phenomenon. Normally, the deployment of particular military techniques—aerial bombing, mass armored assault—was a function of hardware: the availability of tanks or aircraft, and carefully constructed strategy. These factors themselves are functions of others: economic development and the degree of organization within both the military and society as a whole. The spread of suicide bombings was different. They were spreading like a virus. If their appearance was correlated with anything, it was not the degree of economic development or military organization, but their opposites.

Some analysts suggest that common to suicide attackers is their strategic objective to remove occupiers from desired territory;[27] some that religious ideology, and in particular Salafi jihadism, is the driving force.[28] Whatever the debate about motives, there is agreement that the incidence of suicide attacks has dramatically increased everywhere over the last two decades, and particularly the last few years. Suicide attacks were not confined to religiously motivated terrorist groups like Hezbollah, the Taliban or Al Qaeda; in Turkey’s Kurdish regions and Sri Lanka, the technique was used by groups driven primarily by secular, and indeed nationalist, ideology.[29] Whatever the motivation, the empirical results—of casualties caused, and political effects in consequence—were demonstrable.

This recent trend had earlier precedents. Japan employed kamikaze attacks only during the last stages of the Pacific war, when all chance of strategic victory had evaporated. The Japanese leadership did, however, encourage the attacks, after initial experiment, for the very same reason: They worked. During battles such as that in Leyte Gulf, the U.S. Navy lost scores of vessels to kamikaze attacks. A later survey showed that kamikaze missions were four to five times more likely than conventional missions to damage or sink their targets.[30]

Just as today’s suicide attackers are often characterized as fanatical and therefore irrational, the kamikazes have been similarly dismissed as the product of death-loving samurai cultlike thinking that gripped the Japanese military elites. But for them, too, there was a logic: The higher the price exacted upon U.S. forces approaching the Japanese homeland, the more, they hoped, America would hesitate to attack the home islands, and would instead sue for a peace more favorable to Japan. Just as in Iraq, Lebanon and now Afghanistan, suicide attacks were permitted by a culture that celebrated death in combat, but also, and above all, because they had a palpable and successful political effect.

By 2005, the use of suicide bombings had spread to Bali and Britain, which suffered major suicide attacks on the London Underground and buses that year. The U.S. of course had already seen such attacks on September 11, 2001. In Mumbai in 2008, suicide attackers killed nearly two hundred people and wounded more than three hundred in a three-day rampage of shooting and murder. Suicide attacks are now commonplace across North Africa and the Middle East, Pakistan and the Horn of Africa, and have spread elsewhere, including sub-Saharan Africa, Turkey, the Caucasus, Indonesia and the Indian subcontinent and even Iran, where in 2010, suicide bombers killed thirty-nine.


Despicable as some may find it, suicide bombing has been perhaps the most influential political-military technique of the late twentieth and early twenty-first centuries: In conflicts that are about different ideologies, territories and religions, fighters have adopted the technique without prejudice. In its horror, suicide bombing offers up an insight into something important, something about how change happens, and how we as people work, and thus how things might be changed for the better—but without killing people. Curiously, that lesson is apparent too in sports stadiums.

At many a baseball game, it takes only one, or a small group, to stand, raise their arms in an attempt to start a wave (they may whoop or cheer at that point). Sometimes the attempt is ignored, but at other times it might initiate a coordinated yet spontaneous motion of tens of thousands of people around the stadium. It’s frivolous, fun, but also oddly moving: “We’re in this together.”

In his book Herd, marketing guru Mark Earls explains why people buy what they do, or rather how they are influenced by the person sitting—or whooping—next to them. Earls cites the sales phenomenon of the Apple iPod. He describes how the color of the headphone cable was a crucial factor in the device’s dramatic sales success. The unusual white color of the cords attracted people’s attention—and enabled them for once to see the brand choices of their peers even though the product itself remained hidden: The cables made the otherwise private choice visible. The innovative features of the product were of course a vital factor in the ultimate decision to buy the iPod, but it was the white cords that triggered the chain of events that led to the purchase.

Earls suggests that most of our lives are “quotations from the lives of others,” as Oscar Wilde put it, a phenomenon evident in the spread of agricultural mechanization across America’s Midwest in the late nineteenth and early twentieth centuries, when farmers bought tractors once they saw that their neighbors had them, and equally in the names we give our children and the music we listen to. All of these trends, Earls asserts, are shaped by social influence first and foremost, and not by our own independent decisions or the inherent appeal of the thing being chosen.

Hitherto, economic theory has suggested that rational choice—a weighing up of the costs and benefits—is the primary basis for decision making, and particularly purchase decisions. But it turns out that nothing more complicated than mimicry may be a better explanation of why people buy what they do. As one correspondent to the New Scientist put it, man should not be named Homo sapiens, “wise man,” but Homo mimicus, “copying man.”

Conventional economic theory claims that humans calculate by numbers, assessing rationally the profit and loss of any transaction. But it appears that even in deciding our finances, like taking on or abandoning a mortgage, the behavior of others is influential. The herdlike popularity of subprime mortgages is already well documented. More recently, the practice of abandoning properties whose mortgages cost more than the value of the house has spread “like a contagion,” according to a recent study, as both its economic rationale and, crucially, its social acceptability have grown: “It’s okay to walk away.”[31] Researchers found that borrowers were 23 percent more likely to default on their mortgage once their neighbors had done the same.

This mechanism is evident elsewhere. The British government commissioned research to find out how to persuade people to adopt more pro-environment behavior, for example to limit their carbon emissions. The research found that government was itself an ineffective device to encourage behavioral change: People did not trust government and believed it was using climate arguments as an excuse simply to raise taxes.[32] (Indeed, this distrust is one reason why government may be ineffective in promoting the change necessary to protect the environment.) Instead, the research found, the government would need to recruit more influential agents to persuade people to act. These were not scientists, officials or experts, all of whom were nevertheless more trusted than government. Those with the most potential to influence others’ behavior were, the researchers concluded, our next-door neighbors. Indeed, it appears from another study that people take more notice of each other’s actions than they do of formal rules.

Researchers at the University of Groningen in the Netherlands tried to see whether the well-known “broken-windows theory” of policing actually worked: the concept that if police aggressively target minor crime, such as littering and vandalism, they will reduce overall lawlessness, including major crime, like assault and mugging.[33] The researchers ran various experiments to find out how context—the environment people encounter—affects behavior, including law-breaking; they were trying to understand how disorderly behavior spreads.

In one experiment, they tested whether people took more notice of a clear legal prohibition—a police sign telling people not to lock their bicycles at a particular spot—or of whether other people were violating the rule by locking their own bicycles there. To test this, they ran scenarios with and without the sign, and with and without other rule-violators present: people illegally locking their bikes.

The study’s results were clear. People were more inclined to violate the rule and lock their bicycles illegally if they saw others doing the same thing, regardless of what the police sign said. The study’s authors suggest that their evidence therefore confirms the broken-windows theory. As such, the study could be taken as affirmation of an assertive policing model where police act quickly and robustly to deal with minor violations, and thus deter more serious crime. But the study also implies a more subversive message. The Groningen experiments show that norms are more important than rules: It is the actions of other people that have the most influence on what we do.

Earls offers the wave as a metaphor for this model of change—it is also in its way an example. It takes no instruction or authority to initiate the rolling wave of spectators standing up and lifting their arms at a sports stadium. One or two people might try to start a wave. If others around them follow, the wave can quickly ripple around the stadium, involving tens of thousands of people in an utterly spontaneous yet coordinated act. The point is a clear one: The person most important in influencing change may be the person standing right next to you.

Suicide bombing and the wave thus offer strikingly similar lessons in how to affect others. Intriguingly, both suggest that it is action in the microcosmos, our own little universe, that matters most: what we do. This is not the only parallel.

First, neither suicide bomber nor waver looks to anyone else, let alone their government, to produce the desired effect. Simply, if you want to start a wave, you do not wait for someone else to stand up. More starkly, the suicide bomber is prepared to sacrifice their own body and existence to attack their enemy. Horrible though it may be, it is truly a politics of personal and direct action.

Second, the action is directly linked to the desired effect—in fact, the action is that effect. Standing up in your stadium seat, though a small action in a crowd of thousands, constitutes the start of a wave. In contrast, voting for a wave to be started most emphatically does not constitute the start of a wave. Detonating a bomb that kills your attackers, as well as yourself as the necessary adjunct, may be viewed by many of us as unconscionable but it does constitute resistance in a very material and—often—effective manner. Action and consequence are connected without intermediation.

Third, both suicide bombing and waves can plausibly be replicated by others; indeed, in the case of the wave, that is the very point. One reason why suicide bombing has proven so effective is that it requires very little training to undertake and is relatively cheap compared to other military techniques: Others can easily imitate it. An uneducated peasant can suicide-bomb as effectively as an experienced infantryman. Indeed, it would be a waste of a trained soldier to expend him so.

Fourth, the action offers the possibility of real and immediate change. The wave, if initiated at all, is initiated immediately. This must be very satisfying to the person who stands up to start it (I have never done this). The suicide bomber, if successful, will destroy the enemy vehicle or the people he or she is targeting. Though they will die in the process, the effect they seek is as immediately forthcoming as their own death.

And in one crucial respect, of course, suicide bombers and wavers are very different. Unless coerced, which they sometimes are, suicide bombers are motivated by a belief (some would call it fanaticism) so great that they are willing to sacrifice their lives. This too helps explain the uniquely persuasive power of suicide bombing. Along with the bomb belts portending the deaths of themselves and their victims, suicide bombers carry something else, undeniably: conviction.

And here is where we must abandon the example of the wave as too superficial, for however much fun it is, few would be much impressed by the conviction of those participating in a wave. And it is conviction that convinces.

Suicide bombers illustrate this truth with horrific violence, but others—Gandhi, American civil rights protesters—have shown the uniquely persuasive force of nonviolence. In either case, it was conviction that propelled the action; it was the action that recruited others to the cause. Thus, an essential first step to produce any lasting influence and change is the discovery of conviction.

This discovery is sometimes a personal realization; sometimes it is conducted with others. For Gandhi, it began in South Africa when as a “colored” he was thrown off a whites-only train. In Alabama in 1955, fifteen-year-old Claudette Colvin was riding the bus home from school when the driver demanded that she give up her seat for a middle-aged white woman, even though three other seats in the row were empty. Claudette Colvin refused to budge. As she put it, “If she sat down in the same row as me, it meant I was as good as her.”[34]

Colvin was arrested. Two police officers, one of them kicking her, dragged her off the bus and handcuffed her. On the way to the police station, they took turns trying to guess her bra size. Colvin’s action took place nine months before Rosa Parks’s more celebrated refusal and arrest, but together their actions triggered a bus boycott. The court case occasioned by the boycott, at which Claudette Colvin testified, effectively ended bus segregation. As David Garrow, a biographer of Martin Luther King, Jr., commented, “It’s an important reminder that crucial change is often ignited by very plain, unremarkable people who then disappear.”[35]

Interestingly, network researchers have found similar effects. Contrary to some recent popular books, such as The Tipping Point, it is not necessarily a few key influencers who create viral trends; it can be anyone.[163] In fact, Duncan Watts has found that predicting who is influential in starting or shaping any particular trend is more or less impossible. This may be bad news for advertisers trying to save money by targeting their campaigns at a few key influencers, but in terms of political change, it is very exciting. Anyone can initiate a profound social change.

Whatever the insights of network theory or marketing gurus, political change is rather different from buying iPods or downloading the latest Lady Gaga single. Our beliefs about right and wrong are powerfully held; to shift the convictions of others requires profound experience or equal if not more powerful conviction, something rather more substantial than clicking “like” on a Facebook page. In a word, action.

These forces are rather harder to measure, though somehow we can tell when such experience strikes or when we are moved by the actions of others: You know it when you see it. Conviction can be found in myriad different ways, but it can rarely be told: As in all good theater, it is better shown.

To find true political conviction, beliefs that move us and others must be tested, lived, embodied, just as suicide bombers, horribly, embody theirs. And for this to happen, it’s necessary first to confront a painful reality.

It is comforting to believe that governments can provide for us, and protect us. Governments want us to believe it, and we want to believe them. Unfortunately, it is ever more evident that this comfortable pact between us rests upon weak foundations indeed.
