
2

The Post–Cold War Context

Shifting Paradigms and Misconceptions

THIS CHAPTER PROVIDES historical context for foreign intervention in Africa after the Cold War, performing three important tasks. First, it describes how the political and economic crises of the 1970s and 1980s, which were rooted in colonial and Cold War policies, ushered in a new wave of external involvement in the 1990s. Second, it shows how the outside powers that responded to this instability had additional tools at their disposal. Post–World War II institutions and legal frameworks threw into question longstanding views concerning state sovereignty and international law. Postwar conventions and interpretations advanced new rationales for foreign intrusion into the affairs of nation-states that threatened regional stability and civilians’ lives. The paradigms of response to instability/responsibility to protect and the war on terror—put to use after the Cold War—emerged from this intellectual ferment. Third, the chapter investigates Western misconceptions about Islam that underpinned the war on terror and had devastating effects on millions of Muslims worldwide.

Africa after the Cold War

The roots of many problems afflicting Africa today lie in its colonial and Cold War past. Distinctions in power and privilege and conflicts over natural resources have long been a part of human history; in Africa, these phenomena predated the colonial period. However, the plundering of riches through unequal exchange was embedded in colonial economic practices, and colonial-era ethnic and regional hierarchies—sometimes built on preexisting distinctions—often assumed new potency after independence. Internal corruption, economic mismanagement, and pyramids of privilege resulted in unstable societies marked by huge disparities in wealth and power. Money and weapons distributed by Cold War patrons entrenched power differentials and rendered local conflicts deadlier than those of previous eras. The end of the Cold War introduced a new set of problems with roots in this troubled past.

The Cold War drew to a close in the late 1980s and early 1990s, when the Soviet Union collapsed economically and politically. African conflict zones that were once Cold War battlegrounds were increasingly ignored, and dictators who were no longer useful to their Cold War patrons were rapidly abandoned. Across the continent, nations suffered the consequences of depleted resources, enormous debts, dysfunctional states, and regional wars over the spoils. Weapons left over from the Cold War poured into volatile regions and fueled new competition for riches and power. Countries already weakened by economic and political crises descended into violent conflicts that often transcended international borders. In some cases, popular movements or armed insurrections ousted dictators who had lost the support of outside powers. However, because war and repression had stymied organized political opposition in many countries, warlords and other opportunists often moved into the power vacuums. Unscrupulous leaders manipulated ethnicity to strengthen their drive for power and privilege, sometimes unleashing ethnically based terror.

During the first post–Cold War decade, foreign intervention assumed a new character. Many Western nations that had been implicated in African conflicts during the Cold War turned their attention elsewhere. The United States, as the self-proclaimed Cold War victor, showed little interest in direct military intervention and severely reduced its economic assistance as well. However, in keeping with its call for African solutions for African problems, Washington initiated new programs to bolster African military capabilities and others that focused on free market economic development and trade. Recognizing that Africa’s enormous external debts, often incurred by Cold War clients, and the HIV/AIDS pandemic contributed to political and economic instability, the United States also introduced programs to address these problems. The policy shift meant that most military interventions during the 1990s were conducted by African countries—sometimes to reestablish regional peace and security, but in other cases to support proxy forces that granted access to their neighbors’ resources.

Although extracontinental powers were less likely to intervene unilaterally during the 1990s, multilateral intervention by both African and non-African powers intensified and took shape under new auspices. The UN, the Organization of African Unity (OAU), and various subregional bodies intervened in response to instability—to broker, monitor, and enforce peace accords and to facilitate humanitarian relief operations. Peacekeeping and humanitarian interventions were viewed positively by many African constituencies, although disparities in power meant that African agents had little authority over external forces once implanted on African soil. In a striking deviation from Cold War trends, critics castigated the international community for not acting quickly or boldly enough—as in the case of the Rwandan genocide in 1994, the Liberian civil war that ended in 2003, and the Darfur conflict in Sudan that began in 2003. The UN Security Council, in particular, was criticized for its refusal to thwart the Rwandan genocide and to act more forcefully in Darfur. Under pressure from human rights and humanitarian lobbies and from African civil societies, the UN General Assembly passed a resolution in 2005 that held countries responsible for protecting their citizens from “genocide, war crimes, ethnic cleansing and crimes against humanity.” Sometimes called the R2P resolution, the General Assembly action granted the international community the right to intervene through UN Security Council–sanctioned operations if governments failed to fulfill their “responsibility to protect” (R2P).1

Appeals for humanitarian intervention in African affairs increased during the first decade of the twenty-first century; military intervention for other ends also intensified. The ongoing struggle to secure energy and other strategic resources and the onset of the war on terror brought renewed attention to the continent. Heightened foreign military presence, external support for repressive regimes, and disreputable alliances purportedly intended to root out terror resulted in new forms of foreign intervention in Africa. The continent, its people, and its resources again became the object of internal and external struggles in which local concerns were frequently subordinated to foreign interests.

Paradigm 1: Response to Instability and the Responsibility to Protect

The political, economic, and social upheavals that characterized the late Cold War and early post–Cold War periods resulted in severe instability in numerous African states and regions. Foreign powers and multilateral institutions took note when domestic turmoil was perceived to jeopardize international peace and security. In most instances, their involvement entailed brokering, monitoring, and enforcing peace agreements. Diplomatic and military interventions were often justified on the grounds that outside actors had both the right and the responsibility to guarantee international peace and security if individual states failed to do so. In such cases, intervention was authorized under Chapters VI, VII, or VIII of the United Nations Charter, adopted in 1945.2 In instances where large civilian populations were at risk and refugee flows heightened regional tensions, the response to instability was bolstered by newer claims that the international community had a responsibility to protect civilian lives. In such cases, intervention was justified by the 2005 UN General Assembly resolution, mentioned above, that bestowed on the international community the responsibility to protect civilians when their governments were unable or unwilling to do so.

Post–Cold War intervention in African affairs saw increased involvement by multinational bodies that drew on changing notions concerning the right to intervene. Since the mid-1990s, when the international community largely ignored appeals to thwart the Rwandan genocide, growing constituencies in Africa and the West have called for humanitarian interventions to end human rights abuses and protect civilians, with or without the consent of the states in question. Such interventions might include military force, sanctions, or the forcible delivery of humanitarian aid. Although the notion of humanitarian intervention has gained support, it remains controversial. External interference in a state’s domestic affairs challenges a premise of international law, national sovereignty, that has held sway for more than three and a half centuries.

The contemporary system of international law emerged from the 1648 Peace of Westphalia, a series of treaties that concluded the Thirty Years’ War in Europe and laid the foundations for the modern nation-state. Enshrined in the treaties is the principle of national sovereignty, which granted monarchs control over feudal princes and inhabitants of their territories, as well as absolute power to maintain order within their realms and to protect the state from external forces. Deemed above the law, sovereigns were exempt from moral scrutiny. From 1648 until the end of World War II, the sovereignty of the nation-state was defined in such a way that internal conflicts and their consequences were considered domestic matters outside the purview of the international community. However, another seventeenth-century principle of international law eventually established a framework for a more expansive understanding of national sovereignty. The notion that a state and its citizenry are bound by a social contract that carries reciprocal rights and responsibilities gradually superseded the view that sovereigns are beyond moral scrutiny. If the social contract requires citizens to relinquish some of their liberties in exchange for state protection, then the state bears a responsibility to ensure its citizens’ welfare by protecting their rights and liberties and maintaining peace and security within state borders.

The mass exterminations of European Jews and other populations during World War II challenged the principles of international law that had allowed such crimes to occur, and the impunity of national leaders was called into question. The Nuremberg trials (1945–49), which held key individuals in Nazi Germany’s political, economic, and military establishment accountable for war crimes and crimes against humanity, led to increased scrutiny of national leaders. The postwar order witnessed an expansion of democratic values and institutions. Universal principles of human rights were enshrined in the International Bill of Human Rights, comprising the Universal Declaration of Human Rights (1948), the International Covenant on Civil and Political Rights (1966), and the International Covenant on Economic, Social and Cultural Rights (1966). In 1948, the UN General Assembly adopted the Convention on the Prevention and Punishment of the Crime of Genocide (the Genocide Convention), which required member nations “to prevent and to punish” genocide wherever and whenever it is found.3 Emergent human rights and humanitarian movements gave primacy to individual rights over those of states and emphasized the protection of minorities and other vulnerable members of society. National laws were no longer off limits to international investigation. Subject peoples in Europe’s African and Asian empires embraced universal human rights claims and demanded equal treatment under the law and national self-determination. In the 1950s and 1960s, their efforts culminated in widespread decolonization.

The establishment of the United Nations in 1945 further undermined the seventeenth-century notion that state sovereignty is absolute. Like the post–World War I League of Nations, the UN was founded to promote international peace and security. However, the UN’s mission, which was uniquely premised on respect for universal human rights and freedoms, led to a supplementary mandate. The UN was also charged with promoting “the economic and social advancement of all peoples.”4 Aware that conflicts were frequently rooted in material deprivation and in unequal distribution of power and resources, political and human rights leaders argued that the maintenance of international peace and security required governments to use their capacities to benefit all their citizens and that states should be held accountable for the protection of basic human rights within their borders.

The end of the Cold War brought additional challenges to the state sovereignty principle. The Soviet Union had disintegrated, and the United States and other Western powers no longer felt the same need for strongmen to protect their interests. Newly critical of their clients’ corrupt practices and human rights abuses, they withdrew their support from longstanding dictators and called for accountability in governance. These momentous political shifts provided opportunities for new ways of thinking, and a cadre of public intellectuals in the Global North and South began to argue for a fundamental reconceptualization of the premises of state sovereignty, one that harkened back to the social contract that sometimes had confounded sovereigns’ ability to wield their power with impunity. These thinkers charged that to legitimately claim sovereignty, a state must provide basic conditions for the well-being of its citizenry, including not only peace, security, and order, but also adequate food, clean water, clothing, shelter, health care, education, and employment. In some polities, dominant groups target populations who differ in race, ethnicity, or religion from those in power. In some cases, the state not only fails to protect vulnerable populations from gross human rights violations, ethnic cleansing, or genocide, but is also complicit in perpetrating those crimes. According to the new paradigm, a state that is unable or unwilling to fulfill its foundational responsibilities forfeits the right to sovereignty over its territory and people—and its exemption from outside interference.

It was in this new context that the UN moved toward a broader definition of international responsibility for the protection of human rights. In June 1993, governmental and nongovernmental representatives from 171 nations met in Vienna at the UN-sponsored World Conference on Human Rights, where they endorsed the claim that “All human rights are universal, indivisible and interdependent and interrelated. . . . While the significance of national and regional particularities and various historical, cultural and religious backgrounds must be borne in mind, it is the duty of States, regardless of their political, economic and cultural systems, to promote and protect all human rights and fundamental freedoms.”5 In theory, a state’s failure to protect its citizens could warrant UN intervention.

After the Cold War, the disintegration of the Soviet Union, the splintering of states in Eastern Europe and Central Asia, and challenges to other states elsewhere produced millions of refugees and spawned untold numbers of armed insurgents who crossed borders and fomented instability. Because the UN’s purpose is to “maintain international peace and security,” and because massive human rights violations have ripple effects that affect entire regions, rectifying such wrongs increasingly was understood to be within the UN’s purview.6 However, UN actions did not keep pace with the expanded understanding of the organization’s jurisdiction. Prioritizing their own domestic and foreign policy agendas, permanent members of the Security Council opposed measures that might have thwarted the genocide in Rwanda in 1994 and ethnic cleansing in Sudan’s Darfur region in 2003–4. Continued pressure from nongovernmental organizations and human rights activists pushed the UN General Assembly to pass the 2005 R2P resolution, which allowed the international community to intervene if governments did not protect their citizens from gross human rights violations.7 Supported by 150 countries, the R2P resolution upended an understanding of state sovereignty that had been one of the fundamental tenets of international law since the seventeenth century. In theory, deference to “state sovereignty” no longer could be used as an expedient to allow ethnic cleansing, genocide, or other crimes against humanity to proceed unhindered.

Once again, the reality was far more complicated. New principles of international intervention had been endorsed, but enforcement remained problematic. Governments were reluctant to set precedents that might be used against them in the future, and powerful members of the Security Council rarely committed the resources or personnel necessary to implement the R2P resolution. If a culpable state opposed external involvement, outside powers ordinarily persisted only if their own interests were at stake. Action was likely solely in the case of weak states or those without powerful allies on the Security Council—that is, in states that could not effectively challenge foreign intervention.

As calls for multilateral diplomacy evolved into appeals for military intervention under the mantle of responsibility to protect, there was sharp disagreement over the motives of those intervening, the means they employed, and the nature of the outcomes, that is, whether intervention provided protection for civilians or only increased their insecurity. Some governments reacted to international scrutiny by invoking the old principle of national sovereignty. Others charged that international human rights laws were based on Western capitalist norms that give primacy to the rights of individuals over those of society and thus were not applicable to their cultures or conditions. They argued that Western claims regarding the universality of their human rights definitions were yet another example of cultural imperialism and neocolonialism. Still others claimed that humanitarian intervention was simply a guise for Western powers’ pursuit of their own economic or strategic objectives, and they warned that Western countries were attempting to recolonize the Global South. In countries and regions affected by conflict, governments and citizens were divided on the merits of outside intervention, whether by international organizations, neighboring states, or extracontinental powers. Many remained skeptical of outsiders’ motives and their capacity to bring peace, even when their actions were part of an approved multilateral initiative.

Similar problems have plagued the International Criminal Court (ICC), which was established in 2002 to investigate and prosecute individuals believed to have engaged in war crimes, crimes against humanity, or genocide. Just as the UN Security Council may not intervene without a host country’s consent unless the government has failed to protect its citizens from gross human rights violations, the ICC is authorized to act against alleged human rights abusers only if their national governments and courts are unable or unwilling to do so. However, the ICC’s jurisdiction is far from universal. The international court may investigate alleged crimes in countries that have ratified the ICC treaty, in cases referred to it by the UN Security Council, or when the ICC prosecutor opens a case of his or her own volition. Although 123 UN member states had ratified the ICC treaty by 2017, 70 others had not. Among the holdouts were three permanent members of the UN Security Council that have veto-wielding powers: the United States, China, and Russia. These countries refused to recognize the ICC’s jurisdiction over their own citizens, and they also shielded their allies from the court’s authority. ICC member states have also undermined ICC operations. Although they are technically obliged to comply with the court’s decisions, the ICC has no police or military to enforce summonses or arrest warrants. As a result, alleged perpetrators with powerful allies avoid prosecution, while those without connections are more likely to be held accountable.

Like advocates of R2P, the ICC has been accused of bias against African countries and norms. The court is authorized to investigate human rights abuses worldwide, but nine of the ten investigations it conducted between 2002 and 2017 and all of its indictments, prosecutions, and convictions involved African political and military figures. As a result, some critics have charged that the ICC is simply another neocolonial institution. Criticism from the African Union has been especially sharp, with some African leaders urging AU member states to withdraw from the international court—a step that Burundi took in 2017. However, other African leaders and many civil society organizations have voiced strong support for the court and urged it to expand its protection of African civilians rather than to reduce it. The degree to which the ICC can promote equal justice in an unequal international order remains an open question.

Paradigm 2: The War on Terror

If the roots of the first paradigm can be traced to post–World War II understandings of the need for peace, justice, and human rights to ensure a stable international order, the seeds of the second paradigm can be found in the Cold War struggle between capitalism and communism. From the outset, the United States recognized the power of religion as a weapon against its atheistic opponents, and it mobilized conservative religious groups to fight the communist menace. In Europe, it supported Christian parties and organizations that opposed the Italian, Greek, and French communist parties that had gained strength during World War II and its aftermath. In the Middle East, it backed conservative Muslim organizations and regimes that sought to suppress both communism and radical nationalism. When the pro-Western Shah of Iran was overthrown in January 1979 and replaced by militants who embraced the Shi’a branch of Islam, Washington rallied extremists in the rival Sunni branch to counter Iran’s growing prominence.8 Saudi Arabia, a staunch US ally, promoter of fundamentalist Sunni teachings, and competitor with Iran for regional dominance, joined the United States in its patronage of Sunni militants.

Most relevant for this study is the CIA-led multinational coalition that recruited, trained, armed, and financed Sunni militants from all corners of the globe to challenge the decade-long Soviet occupation of Afghanistan (1979–89). After ousting the Soviets from Afghanistan, the fighters dispersed to their home countries, where they founded new organizations and spearheaded insurgencies, primarily against Muslim states they deemed impious. These Soviet-Afghan War veterans played prominent roles in most of the extremist groups that emerged in Africa and the Middle East in the decades that followed. A brief summary of that history provides the context for the war on terror.

In 1978, a military coup in Afghanistan installed a communist government that was sympathetic to Moscow. It was also brutal, internally divided, and challenged by popular opposition, including an Islamist-backed Sunni insurgency. Faced with instability on its borders, the Soviet Union had two fundamental concerns: first, that the Afghan government would fall and that a new regime would ally with US interests; and second, that the Islamist-backed insurgency in Afghanistan might stimulate similar uprisings in the Soviet republics of Central Asia, which included large Muslim populations. To bolster the Kabul regime, Moscow invaded Afghanistan in December 1979, beginning an occupation that would result in a decade-long war. Determined to secure US dominance over Indian Ocean communication lines and the oil-rich countries of the Persian Gulf, the United States mobilized an international coalition to challenge the Soviet Union in Afghanistan and undermine its authority in adjacent Soviet republics.

For the duration of the ten-year war, the United States and its allies recruited tens of thousands of Muslim fighters from Africa, Asia, Europe, and North America to combat the Soviet occupation. The anti-Soviet recruits, many of whom were inspired by Saudi Arabia’s fundamentalist teachings, referred to themselves as mujahideen—those who struggle to defend the Islamic faith. Spearheaded by the CIA, the endeavor was largely funded by the United States and Saudi Arabia. The CIA provided the militants with sophisticated weapons, including shoulder-fired, heat-seeking Stinger antiaircraft missiles that easily circumvented Soviet decoy flares.9 The CIA and the US Army, Navy, and Air Force Special Operations Forces, along with the UK’s Special Air Service, trained and instructed Pakistani officers and mujahideen leaders in guerrilla and terrorist tactics. Pakistan’s intelligence services trained the bulk of the mujahideen forces on the ground and provided critical logistical, intelligence, and military support, while France, Israel, Egypt, and Morocco also helped train and arm the anti-Soviet forces. Iran played a significant but independent role, training both Shi’ite and Sunni militias.

The CIA and Pakistani intelligence countered Iran’s support for Shi’ite militants in Afghanistan by bolstering Sunni organizations such as that of Osama bin Laden, a wealthy Saudi of Yemeni descent whose family had close ties to the ruling Saudi dynasty and had made its fortune in business and finance. Bin Laden’s organization raised funds, recruited, and provided services for the mujahideen, including a hostel for Algerian, Egyptian, Saudi, and other fighters in Pakistan and a camp in Afghanistan. After the Soviet withdrawal from Afghanistan in 1989, some Afghan militants—primarily religious students and mujahideen fighters—reconstituted themselves as the Taliban (Seekers of the Truth) and fought regional warlords and other mujahideen factions for political control. By 1996, the Taliban had seized most of the country, imposing law and order in areas rife with corruption, banditry, and the drug trade. Turning to opium and heroin to finance their operations, the Taliban employed brutal methods to impose their own interpretation of Islamic law.

After the Soviet departure, the foreign fighters carried their terror tactics and sophisticated weapons to new battlegrounds around the globe. Soviet-Afghan War veterans were at the forefront of guerrilla insurgencies in Algeria, Bosnia and Herzegovina, Egypt, Gaza, Kashmir, the Philippines, the West Bank, and Yemen. They engaged in terrorist activities in Kenya, Sudan, Tanzania, France, and the United States. CIA-backed drug lords and allies, including Osama bin Laden, funded the new networks, joined by Muslim banks and charities.

One of the most significant terrorist networks was al-Qaeda (The Base), which was established from the core of fighters and other volunteers who had passed through Osama bin Laden’s camps. Founded in 1989 with bin Laden as its primary organizer and patron, al-Qaeda advocated jihad against apostate Muslim regimes and their supporters worldwide.10 Although bin Laden considered Saddam Hussein’s secular Arab nationalist regime in Iraq to be apostate, he opposed military intervention by the US-led coalition during the First Gulf War (1990–91); he also denounced the Saudi government’s decision to allow hundreds of thousands of US and allied troops to be stationed in Saudi Arabia, which was home to the holy cities of Mecca and Medina. The Saudi government responded by expelling bin Laden from the country and, eventually, revoking his citizenship. When the Gulf War ended, the United States retained its military bases and thousands of troops on the Arabian Peninsula. The removal of US military forces from the holy land was one of al-Qaeda’s primary objectives. As a result, the United States—bin Laden’s onetime ally—would become an important al-Qaeda target.

The First Gulf War also precipitated the 1991 transfer of al-Qaeda’s headquarters and training camps to Sudan. From there the organization launched a network of cells and allied organizations that radiated into the Greater Horn of Africa, a geographic region that included Burundi, Djibouti, Eritrea, Ethiopia, Kenya, Rwanda, Somalia, Sudan, Tanzania, and Uganda. In May 1996, under pressure from the United States, Saudi Arabia, and the UN Security Council, the Sudanese government asked bin Laden to leave. He moved al-Qaeda’s headquarters back to Afghanistan, where the organization allied with the Taliban. Blaming the United States for his ejection from Sudan, bin Laden focused new attention on this distant enemy. In August 1996 he issued a declaration of jihad against US military forces in Saudi Arabia and called on all Muslims to expel Americans and Israelis from Muslim lands.

Al-Qaeda’s September 2001 attacks on the World Trade Center in New York City and the Pentagon in Washington, DC, were preceded by a number of other assaults against US citizens and infrastructure. These included the 1993 World Trade Center bombing as well as thwarted attacks on New York City bridges and tunnels, the UN headquarters, and the local office of the Federal Bureau of Investigation (FBI); the 1998 bombings of US embassies in Kenya and Tanzania; a failed attempt in 1999 to blow up Los Angeles International Airport; and in 2000, a successful attack on the US Navy destroyer USS Cole, which was docked in Yemen. Although al-Qaeda’s September 2001 attacks opened a new chapter in the war on terror, the United States had been fighting the terrorist organizations it had helped to create since the mid-1990s.

Misconceptions about Islam

If the role of the United States and its allies in fomenting extremist violence is frequently overlooked, the role of Islam in abetting terrorism is often misunderstood. The US-led war on terror has inspired or reinforced many misconceptions about Islam, a religion that originated on the Arabian Peninsula in the seventh century and has spread around the world since then. The emergence of modern political movements operating under Islam’s banner has led to considerable debate over appropriate ways to distinguish these movements and the terminology used to describe them. The lack of authoritative consensus has resulted in much confusion. Islamism, a twentieth-century ideology and movement pertaining to social, political, and religious life, has been confounded with Islamic fundamentalism, which pertains to religious doctrine. Similarly, political Islam—one aspect of Islamism—is often conflated with political terrorism, actions that are embraced by only a small minority of Muslims and whose legitimacy is widely challenged in the world Muslim community. Finally, the Arabic word jihad is frequently translated as “holy war” and associated with death by the sword. In Islam, however, there are three meanings of jihad, two of them nonviolent. Although experts continue to debate the precise meaning of these terms, this study has adopted the following definitions as the most appropriate.11

Islam is the name of a world religion, derived from the Arabic word salema, which means peace, purity, submission, and obedience. The name implies submission to Allah’s will and obedience to his law. The two main branches of Islam, Sunni and Shi’a, agree on its five pillars: (1) faith in a monotheistic deity, Allah, whose messenger is Muhammad; (2) engaging in prayers five times daily; (3) giving alms to the poor; (4) fasting during the holy month of Ramadan; and (5) making a pilgrimage to Mecca at least once, if physically and financially able.

Islamic fundamentalism refers to Islamic beliefs that reject religious innovation or adaptation in response to new circumstances. Practitioners of fundamentalism, more generally, advocate a return to basic religious principles and the strict application of religious law. Fundamentalism often emerges as a reaction to liberalizing trends within a religion or to secularization in the broader society. It represents a struggle between tendencies within a given religion, rather than a clash between religions. The descriptor “religious fundamentalism” was first associated with late nineteenth-century Protestant Christians in the United States who embraced a literal interpretation of the Bible. Like their Christian counterparts, Islamic fundamentalists promote strict observance of their religion’s basic tenets and laws. Their movements have gained strength in the face of the religious innovation, Westernization, and secularization that followed the establishment of European colonialism in the twentieth century and globalization in the twenty-first. The vast majority of Islamic fundamentalists are law-abiding and oppose violent jihad, focusing instead on the ethical, moral, and personal aspects of jihad (see below). They believe that an Islamic state will emerge from a Muslim community that has been purified from within through preaching and proselytizing and that such a state cannot be established through political or armed struggle.

Islamism refers to a social, political, and religious ideology and movement that emerged in response to European colonialism and the social instability wrought by encounters with the West. Its adherents hold that Islamic principles should serve as the basis of the social, political, and legal order and guide the personal lives of individual Muslims. Often led by intellectuals rather than clergy members, Islamist movements focus on social and political change rather than on religious doctrine. Moderate Islamists work within established institutions and political processes to pursue social and political reforms that, they hope, will result in states that are premised on Islamic law and built from the bottom up. Radical Islamists strive to monopolize political power so that they can construct Islamic states from the top down. Islamists do not reject all aspects of Western culture, and they may even embrace Western education and technology as useful tools for the construction of Islamic states. Islamists, in contrast to jihadis (defined below), reject the use of violence to achieve their objectives.

Political Islam is sometimes used synonymously with Islamism, even though it constitutes only one aspect of the social, political, and religious ideology and movement. Although political Islam employs the language of religion, it represents a political rather than a religious response to Westernization. Its adherents do not reject modernity, but they repudiate a particular brand of modernity. They refute the claim that the Western definition of modernity is a universal one and embrace an Islamist variant in its place.

Jihad means effort or struggle. A person who engages in jihad is a mujahid (plural, mujahideen). Jihad has three interrelated meanings: first, the inner spiritual struggle to live righteously, as a good Muslim; second, the struggle to build and purify the Muslim community; and third, the struggle to defend the Islamic faith from outsiders, with force if necessary. The first meaning, which refers to a personal spiritual struggle, constitutes the greater jihad. The second and third meanings, which focus on the outside world, comprise the lesser jihad. Historically, jihad has been understood first and foremost as an inner struggle that begins with the self and extends outward to the broader society. Those who undertake such struggle believe that social and political reforms are best achieved through preaching, proselytizing, and mobilizing the masses to effect change from the bottom up. Engaging in the lesser jihad is held to be a collective duty of the Muslim community, as determined situationally by religious and legal authorities, rather than a permanent personal duty as determined by individuals or self-appointed preachers.

Since the onset of the war on terror, Western observers have frequently collapsed all forms of jihad into one, erroneously defined as a “holy war” against nonbelievers. The concept of holy war originated among Christians in medieval Europe to justify crusades against Muslims; it has no direct counterpart in mainstream Islamic thought. Jihad is not one of the five pillars of Islam and thus is not a practice that is essential to Muslim identity.

Jihadism refers to a minority insurgent movement that broke from Islamism and employs violence in the name of religion. Jihadism emerged in the context of severe social, political, and economic inequalities, and in many cases, political persecution. The movement has primarily attracted young men who feel alienated from mainstream society. Its adherents reject the traditional interpretation of the lesser jihad as a collective struggle of the Muslim community, determined by officially recognized religious and legal authorities, and define it instead as a personal one, to be determined by each individual as he or she sees fit or by self-described clerics. From the early 1970s until the mid-1990s, jihadis generally targeted local secular and Muslim regimes that they deemed impure (the “near enemy”), with the goal of overthrowing them and Islamizing state and society from the top down. However, from the mid-1990s, a small minority began to focus on distant impious or non-Muslim regimes (the “far enemy”), heralding the emergence of global jihad.

Western commentators often overlook these distinctions, failing to differentiate between jihadist factions and frequently merging Islamism and jihadism under the misleading rubric of “Islamic terrorism.” Some erroneously deem both movements a threat to Western societies and argue that both must be opposed in an open-ended war on terror and an effort to restructure the Muslim world. Policies based on this misunderstanding have tended to result in increased hostility and an even greater threat to the West.

A jihadi is a militant Muslim activist who opposes the secular sociopolitical order at home, and Westernization and globalization more broadly, and who engages in armed struggle to establish an Islamic state. The term is not synonymous with mujahid, which refers to a person engaged in any of the three forms of jihad. The term jihadi (jihadist, adjective) was coined in the early twenty-first century by militants who self-identified as such. Jihadis who focus on local struggles against purportedly impious Muslim or secular regimes constitute the majority of this minority faction, while those who focus on distant or non-Muslim regimes—the so-called global jihadis—are a tiny minority of the minority movement.

Islamic terrorism is a commonly used but misleading term that associates religious doctrine with terrorist activity. Islamic fundamentalism, radical Islamism, and political Islam are not equivalent to Islamic terrorism. Muslims who engage in terrorism and claim religious justification for these activities constitute a minuscule minority of Muslims worldwide, and their actions are strongly condemned by the majority. Although these violent extremists deploy the language and symbols of religion to justify their actions, their turn to terrorism was often inspired by social, political, and economic grievances rather than by religious beliefs. This study rejects the use of the term Islamic terrorism as both inaccurate and dangerous. Violence that targets civilians for political reasons is described as “violent extremism” or simply “terrorism.” In some instances, “Muslim extremist” is used to distinguish violent actors who claim to be operating on behalf of their Islamic faith from other violent actors.

Conclusion

Political, economic, and social instability in the late twentieth and early twenty-first centuries brought renewed attention to the African continent. Employing new justifications for their actions, foreign powers and multilateral institutions challenged the centuries-old principle of national sovereignty and claimed the right to intervene to restore stability, protect civilian lives, and combat terrorism. Although some of these interventions reestablished law and order and saved civilian lives, others left conflicts unresolved and laid the groundwork for future strife. Misinterpretations and distortions of Islam, which influenced external actions in the war on terror, often had devastating consequences for civilians. Chapter 3 introduces the major foreign actors involved in African conflicts after the Cold War, including extracontinental powers, neighboring states, multilateral state-based organizations, and nonstate actors associated with international terrorist networks.

Suggested Reading

Suggested readings relevant to specific countries follow chapters 4–11. The works listed below provide general overviews or are pertinent to multiple African countries.

African economic crises that began in the 1970s sparked many of the continent’s political crises. The following works provide contrasting views of the origins of these crises and their solutions. For an insider’s critique of the role of the IMF, the World Bank, and the World Trade Organization in promoting global inequality, see Joseph E. Stiglitz, Globalization and Its Discontents (New York: W. W. Norton, 2002). Nicolas van de Walle, African Economies and the Politics of Permanent Crisis, 1979–1999 (New York: Cambridge University Press, 2001), argues that the internal dynamics of neopatrimonial African states rather than external impositions were primarily responsible for the postcolonial economic crises. David Sahn and colleagues contend that the policies mandated by international financial institutions did not harm the African poor, but neither were they sufficient to reduce poverty. See David E. Sahn, ed., Economic Reform and the Poor in Africa (New York: Oxford University Press, 1996); and David E. Sahn, Paul A. Dorosh, and Stephen D. Younger, Structural Adjustment Reconsidered: Economic Policy and Poverty in Africa (New York: Cambridge University Press, 1997). Léonce Ndikumana and James K. Boyce, Africa’s Odious Debts: How Foreign Loans and Capital Flight Bled a Continent (London: Zed, 2011), focuses on capital flight from Africa and the role of foreign debt in the current crises.

Post–Cold War political crises in African states are considered from diverse perspectives. Books on the failure of state institutions written from Western political science perspectives include I. William Zartman, ed., Collapsed States: The Disintegration and Restoration of Legitimate Authority (Boulder, CO: Lynne Rienner, 1995); Robert I. Rotberg, When States Fail: Causes and Consequences (Princeton, NJ: Princeton University Press, 2004); and Robert H. Bates, When Things Fell Apart: State Failure in Late-Century Africa (New York: Cambridge University Press, 2008). A critique of Western theories of weak, fragile, troubled, failed, and collapsed African states and the ways in which Western powers have responded can be found in Charles T. Call, “The Fallacy of the ‘Failed State,’” Third World Quarterly 29, no. 8 (2008): 1491–1507. Diverse views are offered in the collection edited by Leonardo A. Villalón and Phillip A. Huxtable, The African State at a Critical Juncture: Between Disintegration and Reconfiguration (Boulder, CO: Lynne Rienner, 1998).

Jean-François Bayart, Stephen Ellis, and Béatrice Hibou, The Criminalization of the State in Africa (Bloomington: Indiana University Press, 1999), examines the role of the state in the plunder of resources, privatization of armies and state institutions, and involvement in global criminal networks. Patrick Chabal and Jean-Pascal Daloz, Africa Works: Disorder as Political Instrument (Bloomington: Indiana University Press, 1999), shows how African political actors have manipulated ethnic and regional tensions and used the ensuing disorder to obtain and maintain power. William Reno, Warlord Politics and African States (Boulder, CO: Lynne Rienner, 1998), considers the destruction of bureaucratic state structures of revenue collection, policing, and provision of social services in post–Cold War Africa and their replacement by warlords whose goal is to plunder economic resources rather than to mobilize citizens. Pierre Englebert, Africa: Unity, Sovereignty, and Sorrow (Boulder, CO: Lynne Rienner, 2009), argues that states have failed to protect their citizens yet continue to endure because they offer benefits to regional and national elites.

A number of works provide a deeper understanding of post–Cold War conflicts in Africa. Mary Kaldor, New and Old Wars: Organized Violence in a Global Era (Stanford, CA: Stanford University Press, 1999), explores the causes of increased ethnic violence in the 1990s and the reasons the international community failed to stop it. William Reno, Warfare in Independent Africa (New York: Cambridge University Press, 2011), focuses on African internal conflicts, including anticolonial movements, reformist rebellions, and warlord-led insurgencies. David Kilcullen, The Accidental Guerrilla: Fighting Small Wars in the Midst of a Big One (New York: Oxford University Press, 2009), offers an overview of the interactions of local insurgencies, international movements, and the global war on terror. Several edited collections examine diverse insurgencies and civil wars. Paul D. Williams, War and Conflict in Africa, 2nd ed. (Malden, MA: Polity, 2016), assesses the causes and consequences of more than 600 armed conflicts in Africa from 1990 to 2015, including the impact of outside intervention. See also Christopher Clapham, ed., African Guerrillas (Bloomington: Indiana University Press, 1998); Morten Bøås and Kevin C. Dunn, eds., African Guerrillas: Raging against the Machine (Boulder, CO: Lynne Rienner, 2007); Morten Bøås and Kevin C. Dunn, eds., Africa’s Insurgents: Navigating an Evolving Landscape (Boulder, CO: Lynne Rienner, 2017); Paul Richards, ed., No Peace, No War: An Anthropology of Contemporary Armed Conflicts (Athens: Ohio University Press, 2005); and Preben Kaarsholm, ed., Violence, Political Culture and Development in Africa (Athens: Ohio University Press, 2006).

Several works examine African conflicts and peace agreements. Two companion volumes edited by Alfred Nhema and Paul Tiyambe Zeleza examine the causes of and possible solutions to African conflicts from African perspectives: The Roots of African Conflicts and The Resolution of African Conflicts (Athens: Ohio University Press, 2008). Adebayo Oyebade and Abiodun Alao, eds., Africa after the Cold War: The Changing Perspectives on Security (Trenton, NJ: Africa World Press, 1998), assesses civil conflicts, economic crises, and environmental degradation as the primary threats to post–Cold War African security. Grace Maina and Erik Melander, eds., Peace Agreements and Durable Peace in Africa (Scottsville, South Africa: University of KwaZulu-Natal Press, 2016), offers a framework for evaluating prospects for a successful accord. Case studies for Côte d’Ivoire, the DRC, Somalia, and Sudan are especially relevant. Séverine Autesserre, Peaceland: Conflict Resolution and the Everyday Politics of International Intervention (New York: Cambridge University Press, 2014), explains why international peace interventions often fail, scrutinizing the modes of thought and action that prevent foreign interveners from thinking outside the box. A sharp assessment of past failures and future prospects for democracy can be found in Nic Cheeseman, Democracy in Africa: Successes, Failures, and the Struggle for Political Reform (New York: Cambridge University Press, 2015).

The post–World War II emphasis on human rights and humanitarian intervention is the focus of several works. Samantha Power, A Problem from Hell: America and the Age of Genocide (New York: Basic Books, 2002), analyzes six twentieth-century genocides and the US government’s failure to stop them. This study has been pivotal to recent debates on international law and human rights policies and had an important political impact on the Obama administration. Samuel Moyn, The Last Utopia: Human Rights in History (Cambridge, MA: Harvard University Press, 2010), contends that post-1960s discontent with regimes established on the basis of utopian and anticolonial ideologies paved the way for human rights as a justification for international actions that challenged state sovereignty. Timothy Nunan, Humanitarian Invasion: Global Development in Cold War Afghanistan (New York: Cambridge University Press, 2016), argues that foreign intervention in Afghanistan during the Cold War and its aftermath became the model for future humanitarian interventions that destabilized societies and undermined national sovereignty in the Global South. Alex de Waal, “Writing Human Rights and Getting It Wrong,” Boston Review, June 6, 2016, casts a critical eye on humanitarian intervention lobbies, particularly those that focused on Somalia, Sudan, and Rwanda. He argues that their judgments were often ill-informed and reduced complex situations to straightforward narratives of heroes and villains; as a result, the military interventions they promoted sometimes did more harm than good. Carrie Booth Walling and Susan Waltz’s website, Human Rights Advocacy and the History of International Human Rights Standards (http://humanrightshistory.umich.edu/), is especially useful for teachers, students, researchers, and advocates.

A number of works examine the reshaping of international legal principles and the struggle for global accountability. Two are central to discussions of the responsibility to protect: Francis M. Deng, Sadikiel Kimaro, Terrence Lyons, Donald Rothchild, and I. William Zartman, Sovereignty as Responsibility: Conflict Management in Africa (Washington, DC: Brookings Institution, 1996); and Francis M. Deng, “From ‘Sovereignty as Responsibility’ to the Responsibility to Protect,” Global Responsibility to Protect 2, no. 4 (2010): 353–70. Elizabeth Borgwardt, A New Deal for the World: America’s Vision for Human Rights (Cambridge, MA: Harvard University Press, 2005), examines the role of New Deal visionaries in constructing the postwar international order that eroded the primacy of national sovereignty and strengthened the position of human rights.

Other works critique the new human rights/R2P discourse and international actions based on its principles. Robert Meister, After Evil: A Politics of Human Rights (New York: Columbia University Press, 2011), argues that the democratic capitalist world has monopolized the concept of “human rights,” producing a version that does not challenge the structural inequalities that underlie poverty and oppression, and has used the responsibility to protect paradigm to justify militaristic ventures. Alex J. Bellamy and Paul D. Williams, “The New Politics of Protection? Côte d’Ivoire, Libya and the Responsibility to Protect,” International Affairs 87, no. 4 (July 2011): 825–50, explores the role of external powers and stakeholders in determining which civilians are to be protected. A critical assessment of the International Criminal Court and its uneven record in advancing global accountability can be found in David Bosco, Rough Justice: The International Criminal Court in a World of Power Politics (New York: Oxford University Press, 2014).

Two important works focus on the UN’s role in humanitarian intervention: Norrie MacQueen, Humanitarian Intervention and the United Nations (Edinburgh: Edinburgh University Press, 2011), provides an overview of UN interventions in various world regions, including sub-Saharan Africa, and assesses their impact and moral implications. Carrie Booth Walling, All Necessary Measures: The United Nations and Humanitarian Intervention (Philadelphia: University of Pennsylvania Press, 2013), investigates the ways in which human rights concerns have altered Security Council attitudes toward state sovereignty and explains the variation in UN response to violations.

The Cold War roots of international terrorist movements associated with Islam are explored in several texts. Three works investigate the CIA’s role in recruiting, training, and financing Muslim fighters to wage war against Soviet forces in Afghanistan; they also explore how Soviet-Afghan War veterans subsequently established worldwide terrorist networks, including al-Qaeda and its spinoff, the Islamic State. See John K. Cooley, Unholy Wars: Afghanistan, America and International Terrorism, 3rd ed. (Sterling, VA: Pluto Press, 2002); Steve Coll, Ghost Wars: The Secret History of the CIA, Afghanistan, and Bin Laden, from the Soviet Invasion to September 10, 2001 (New York: Penguin, 2004); and Mahmood Mamdani, Good Muslim, Bad Muslim: America, the Cold War and the Roots of Terror (New York: Pantheon, 2004). Jean-Pierre Filiu, From Deep State to Islamic State: The Arab Counter-Revolution and Its Jihadi Legacy (New York: Oxford University Press, 2015), exposes the ways in which Arab autocracies quashed the Arab Spring uprisings by unleashing internal security, intelligence, and military forces, as well as street gangs and violent extremists. He argues that these actions opened the door to the Islamic State. The origins of the Islamic State are also examined in Joby Warrick, Black Flags: The Rise of ISIS (New York: Doubleday, 2015), which contends that the policies of the George W. Bush and Barack Obama administrations aided in the organization’s emergence and expansion.

Conceptions and misconceptions about Islamic fundamentalism, Islamism, and jihad are examined in a number of works. They include International Crisis Group, Understanding Islamism, Middle East/North Africa Report 37 (Cairo/Brussels: International Crisis Group, 2005); Mamdani, Good Muslim, Bad Muslim (mentioned previously); and Martin Kramer, “Coming to Terms: Fundamentalists or Islamists?” Middle East Quarterly 10, no. 2 (Spring 2003): 65–77. Richard C. Martin and Abbas Barzegar, eds., Islamism: Contested Perspectives on Political Islam (Stanford, CA: Stanford University Press, 2009), presents diverse interpretations of Islamism by Muslim and non-Muslim intellectuals. Juan Cole, Engaging the Muslim World (New York: Palgrave Macmillan, 2009), dispels misconceptions about various movements within Islam, distinguishing between extremists and Islamic fundamentalists who reject violence. John L. Esposito, Unholy War: Terror in the Name of Islam (New York: Oxford University Press, 2002), contrasts the teachings of the Qur’an with their manipulation by a violent minority and examines the political roots of anti-Americanism in the Muslim world. Contributors to Roel Meijer’s edited collection, Global Salafism: Islam’s New Religious Movement (London: Hurst, 2009), explore commonalities and differences among various strands of Salafism and examine tensions between local and global goals. Fawaz A. Gerges, The Far Enemy: Why Jihad Went Global, 2nd ed. (New York: Cambridge University Press, 2009), argues that the majority of jihadis strive to transform or overthrow local regimes in the Muslim world and that only a small minority target the West. He also examines the reasons that global jihadism emerged in the late 1990s and analyzes the split in the jihadist movement that ensued. The United Nations Development Programme, Journey to Extremism in Africa: Drivers, Incentives and the Tipping Point for Recruitment (New York: UNDP, 2017), considers economic marginalization, low levels of education, absence of good governance, and security sector abuse as factors driving extremism, with religious knowledge often serving as a deterrent.

Two French scholars, Gilles Kepel and Olivier Roy, have engaged in a heated public debate about the origins of the violent extremism associated with contemporary jihadist movements. Gilles Kepel, Jihad: The Trail of Political Islam (Cambridge, MA: Harvard University Press, 2002), provides an overview of Islamist movements in the twentieth century, focusing especially on Iran, Saudi Arabia, Algeria, Egypt, and Afghanistan. Kepel argues that in the late 1990s, Islamist movements split into a majority faction that favored Muslim democracy and a small minority that engaged in terrorist attacks to promote their goals. Gilles Kepel, The War for Muslim Minds: Islam and the West (Cambridge, MA: Harvard University Press, 2004), tracks the origins of global jihad to the Soviet-Afghan War and argues that al-Qaeda’s ideology emerged both from Islam’s strict Salafist and Wahhabi traditions, which advocate abstention from worldly affairs, and from the more political Muslim Brotherhood, whose goal is to establish an Islamic state. Gilles Kepel, with Antoine Jardin, Terror in France: The Rise of Jihad in the West (Princeton, NJ: Princeton University Press, 2017), examines Muslim youth who were radicalized in the West and targeted Western populations. Olivier Roy, Globalized Islam: The Search for a New Ummah (New York: Columbia University Press, 2004), disputes the significance of conservative Islamic traditions and instead explains violent jihad as a response to social, political, and economic changes, one that is politically rather than religiously inspired. Roy argues that Islam has not been radicalized, but rather that radicalism has been Islamized. Alienated youth who had not previously been religious turned to a distorted variant of Islam for meaning, identity, and respect, just as earlier generations had embraced other radical ideologies; the result is the nihilistic rejection of a society that has rejected them. In the West, these youths have been radicalized not by established religious scholars and mosques, but in prisons—where they often serve time for petty crime—and by self-proclaimed authorities on the internet. Roy’s widely quoted challenge to Kepel’s thesis appears in Olivier Roy, “Le djihadisme est une révolte générationnelle et nihiliste,” Le Monde, November 24, 2015.
