The 2011 Ivory Coast Conflict

Perhaps the most internationally pressing and politically transformative global issue of the last year has been the Arab Spring.  A revolutionary wave of demonstrations and protests across the Middle East has challenged state rulers, producing revolutions in Tunisia and Egypt and a civil war in Libya that effectively ended the previous regime.  Civil uprisings have also occurred in Bahrain, Syria, and Yemen, threatening the political order of each country.  Because the Arab Spring dominates so much of the future of international relations, it has been easy to overlook other civil uprisings around the globe.  One such uprising was the Second Ivorian Civil War, which broke out in March 2011.  Not unlike the Arab Spring, the Ivorian crisis was centered largely on religion, as well as on historical grievances.

Arab traders first introduced Islam into the western Sudan from North Africa, and the religion spread rapidly after the conversion of local rulers.  By the eleventh century, most of the rulers of the Sudanic empires had welcomed and adopted Islam.  The southern part of contemporary Ivory Coast, however, remained unaffected by Islamic movements. Powerful Muslim states such as the Mali Empire held territorial sovereignty over the northern part of what is now Ivory Coast.  Songhai, once a vassal state of the Mali Empire, succeeded it and ruled the area between the fourteenth and sixteenth centuries.  Internal disputes drove numerous migrations that brought new peoples into the southern part of Ivory Coast, an area noted for its forested landscapes. The dense rain forest covering the southern half of the country created barriers to the large-scale political organization seen farther north; inhabitants lived in villages or clusters of villages and practiced local traditions.  In the seventeenth century, small kingdoms of Akan people, who had split from the powerful Asante Confederation in what is now Ghana, controlled much of the area.

One such Akan group, the Baoulé, laid claim to eastern Ivory Coast and established a kingdom that was not guided by Islam. The Baoulé played an exceptionally important role in Ivory Coast’s history, as they resisted French subjugation more than other peoples. The descendants of the rulers of the Agni kingdoms tried to retain their separate identity long after Côte d’Ivoire’s independence; as late as 1969, the ruler of the Baoulé people attempted to break away from Ivory Coast and form an independent state.

In the mid-1800s, the French presence in the region increased as trading posts were established on the coast and in the southern parts of the country.  Conversion to Christianity ensued over much of the south but failed to reach large areas of the interior; Islam prevailed in the north. In 1886, to support its claims of effective occupation, France reassumed direct control of its West African coastal trading posts and embarked on an accelerated program of exploration in the interior. Throughout the early years of French rule, military contingents were sent inland to establish new posts, and many northern Ivorians offered resistance.  Most notable was Samori Ture, who in the 1880s and 1890s sought to establish an Islamic multinational state stretching from Guinea to the northeastern part of the Ivory Coast. French colonial policy was based on the concepts of assimilation and association, and French culture was held above all others.  Southern Ivorians found it economically and socially more beneficial to adopt French culture, and they came to dominate the commercial life of the colonial state.

When the country gained full independence in 1960, it was, as expected, divided religiously and culturally, though this caused few problems early on.  The leadership of Félix Houphouët-Boigny, who governed from independence until his death in 1993, allowed the country to develop without major religious tensions.  A period of coups and instability followed, ending in a 2000 election won by Laurent Gbagbo, the main perpetrator of the 2011 conflict. In 2002 a rebellion in the north and west effectively divided the country into three parts: a rebel-held north, a government-run south, and a buffer zone between them.  Government forces carried out mass killings of protesters.  The conflict stemmed from a series of laws enacted under Gbagbo that curtailed the rights of northern Muslim Ivorians.  The concept of “ivoirité,” a discriminatory doctrine aimed mainly at denying political and economic rights to northerners, was a major characteristic of Gbagbo’s rule and alienated recently settled Ivorians.  Immigrants from Burkina Faso found it especially difficult to survive in the country, so great was the government’s hostility toward them.  In 2004, the fighting subsided with the rebels holding the north and Gbagbo the south.

The most recent Ivorian crisis began in March 2011, when forces loyal to Gbagbo clashed with those of the internationally recognized president-elect and popular northern Ivorian, Alassane Ouattara.  Gbagbo’s security forces and allied militias engaged in killings, expulsions, rape, and torture in a campaign against supporters of Ouattara.  Ouattara’s forces reciprocated the treatment, entrenching the society ever deeper in violence.  Many Ivorians have been displaced by the fighting, fleeing to neighboring countries and destabilizing the region.  Unsurprisingly, the U.S. decided to avoid participating in the struggle.  Events in sub-Saharan Africa are rarely of great importance to the U.S., and the interventions that have been carried out have not ended well (Somalia).  The Ivory Coast does not have oil, and its cocoa crop, once the largest in the world, has declined. In the end, however, the U.S. decision not to intervene paid off.  France played a policing role, forcing Gbagbo to step down, and the U.S. was able to steer the public’s attention in other directions to suit its own political agenda.  In this case, non-intervention was apparently the right decision.  So the questions are these: Will the U.S. ever intervene in sub-Saharan countries? Would it have been better to create two different countries in the Ivory Coast?

 


Cameroon’s Anglophones: Linguistic Suicide or Bilingual Disadvantage?


 

Figure A: Cameroon, situated at the pivot of Africa’s west coast, with a population of 19 million.

A Bit of History to Get Us Started:

Often deemed the breadbasket of Central Africa, Cameroon suffers from internal linguistic afflictions that have largely flown under the global radar. It is the only sub-Saharan African state that sustains two dissimilar colonial legacies, awkwardly managing a dichotomy of cultural, linguistic, and political differences. In 1916, after the German colonial regime was expelled, the French and British became co-colonizers of Cameroon. Upon gaining independence in 1960, the government, burdened by an already violent path to sovereignty, decided that the country should become officially bilingual, hoping to avoid further friction over national identity. While the country’s two official languages are of European origin, approximately 250 indigenous languages are spoken in Cameroon, further intensifying the complicated pattern of linguistic anxiety.

Cameroon is composed of ten administrative regions, two of which are officially English-speaking and eight of which are formally francophone (see Figure C). The English-speaking region consists of the Southwest and Northwest provinces, where a form of Pidgin English known as Wes Cos serves informally as the lingua franca. The remaining area of the country (in blue) takes French as its lingua franca, although it is home to multiple local tongues such as Ewondo, Bulu, Duala, Bamileke, and Fulfulde (see Figure B). Most Cameroonians speak one local language and one official language, though indigenous languages are not used officially in the education system and are rapidly fading in practice as younger generations are encouraged to learn English or French at an earlier age. The linguistic situation in Cameroon, in other words, does not revolve solely around the fissure between Francophones and Anglophones.

Although the creation of a bilingual nation was seen as a way to avoid internal conflict at the dawn of independence, bilingualism has turned into a national handicap, creating a stark rift in communication throughout the country. Cameroon’s lack of linguistic unity has led to a sickness that is seeping into other aspects of its national well-being. The disparity in population distribution contributes to an inequality in the usage of French and English, skewing the representation of Anglophones and Francophones in the workforce, governmental positions, and higher-education institutions. Furthermore, many fear a loss of ethnic heritage as indigenous tongues lose utility in an increasingly globalized world. Although bilingualism was originally a plan, albeit a lazy one, to enforce political and social integration, it has now turned into a force of cultural and political disintegration.

The Case of Bilingualism: Can Cameroon Practice What It Preaches?

The country’s policy of bilingualism, adopted upon independence, created two very disparate education systems. Although Cameroon as a whole is deemed bilingual, realistically it is not. In the French-speaking region (80% of the country), the school system is based on the French model, and instruction is in French. In the Anglophone territories, instruction is based on the British model, and courses are taught in English. The education system in Cameroon is thus not truly bilingual, since students usually master only the language of their respective colonial system. These disparate school systems also represent broader cultural differences. The British school system is defined by administrative decentralization and relatively open social norms; the French system is defined by a strong executive and centralized bureaucracy, with less tolerance for open debate and dissent. As Cameroonians themselves often say, “The educational systems have produced very different people regardless of whether they originated from the same father.”  Take, for example, an excerpt from the political blog of an Anglophone Cameroonian:

“A francophone can hardly understand why an Anglophone will fight and die for principles. An Anglophone parent cannot understand why his francophone counterpart will cooperate with his child to influence the latter’s teacher. A francophone pupil is most likely to take it physically on his teacher than his Anglophone counterpart but an Anglophone pupil is more likely to challenge his teacher on ideas than a francophone pupil will do. An average francophone believes that cabinet ministers are more knowledgeable than him while an average Anglophone believes that he too is qualified to be minister… We can go on and on but we must know that it is the educational system and not the individuals that are responsible for the differences we have.” (Tijah)

Thus, the nationally “bilingual” education system has split the country into two arms, producing and supporting not only a linguistically divided Cameroon, but a culturally and ideologically separated one as well.

Figure B: Cameroon’s indigenous linguistic landscape comprises countless tribal languages whose borders are blurred.

Although the problem is evident in elementary and secondary education, its effects are clearest within Cameroon’s higher-education system. The first experiment in bilingualism took place in 1962, at the Cameroonian Federal University. From its inception, inequalities between Anglophones and Francophones emerged, as English speakers were forced to sit high-level French examinations while French speakers were not required to do the equivalent. Anglophones complained of discrimination and forced assimilation into the French system. Today, four of the six national universities in Cameroon are officially bilingual (the University of Douala, University of Dschang, University of Yaoundé I, and University of Yaoundé II). The practice, or non-practice, of bilingualism continues to discredit Cameroon’s tertiary education system. Many professors lecture in the language with which they are most comfortable, and students often blame poor academic performance on lack of proficiency in a second language, or on a professor’s perceived inability to understand or mark correctly work submitted in a second language. The overabundance of francophone professors within the education system (approximately 80%) leaves Anglophone students at a disadvantage, since most “bilingual” lectures are taught in French.

Figure C: The split between Cameroon’s English-speaking and French-speaking provinces is disproportionate. Red indicates Anglophone districts, blue indicates Francophone districts, and light yellow indicates Equatorial Guinea’s partially Spanish-speaking population.

Twisted Tensions?

Most Cameroonian Anglophones assert that they still suffer discrimination today. Owing to their majority status, Francophones have continued to occupy top-ranking positions in government and the civil service, and no effective language policy guarantees the rights of minorities. This predicament creates a sense of identity for Cameroonian Anglophones, a fissure around which they gather in their shared experience of prejudice. Their use of English becomes a symbol of group solidarity within an environment they perceive as linguistically and socio-politically threatening. What do all of these tensions mean for the country’s education system? English speakers insist that they are cheated and marginalized, and they may have a point: as stated earlier, the majority of university professors are francophone, leaving English-speaking students at a linguistic disadvantage in their studies. Francophone schooling also places a higher importance on foreign-language acquisition, better preparing students for the challenge of a bilingual education, especially at the university level. Because of these perceived inequities, English speakers often refuse to send their children to predominantly French-instruction schools, partly out of linguistic pride and partly out of cultural spite. Although the decision to marginalize themselves may exhibit solidarity and protest, English speakers could be worsening their position through linguistic exclusiveness.

The Rise of the Well-Rounded Francophone

While Cameroonian Anglophones today often decline to take part in bilingualism, Francophones are positioning themselves to do the opposite. Since the 1970s, increasing numbers of children of francophone parentage have been attending Anglophone schools. The trend has no systematic organization, deriving instead from private incentives within the francophone community. French-speaking Cameroonians believe that learning English will guarantee them a better position in the global sphere. Parents admire the Anglophone school system, which opens up opportunities to study in English-speaking institutions (potentially in the US) and seems to offer larger social benefits in an increasingly globalized world. In towns like Douala and Yaoundé, over fifty percent of the children in some Anglophone government primary schools come from francophone homes. Francophone city dwellers recognize the socioeconomic advantages of raising children who can gain access to a global market commanded by predominantly English-speaking organizations.

A number of optimists insist that Francophones are attempting to integrate the two language communities and bridge the country’s deep cultural gap within the next generation. Most Cameroonians, however, are more realistic: “the francophone views English not necessarily through the patriotic eye of a Cameroonian who wants to be a better citizen by learning the other language, but in terms of individual interests regarding the educational and professional opportunities it offers—especially abroad” (Echu). Most likely, Francophones realize that becoming truly bilingual may in fact be the key to entering a globalized workforce and economy. Could Cameroon’s zeitgeist of bilingualism be giving birth to a better-rounded Francophone?

Moving Forward…

Cameroon, due in large part to the evolution of its education system, can no longer treat English as a secondary language. Now viewed as a high-status global tongue, English is slowly becoming the country’s de facto lingua franca. It might seem that this shift would have Anglophones leaping with joy in an instance of underdog pride and recognition, yet the situation is strangely reversed. English speakers are losing their competitive edge as their linguistic skill becomes simply a trait that all Cameroonians acquire by the tertiary level.  Anglophones have long felt discriminated against in the schools, and many now refuse to attend francophone institutions or become truly bilingual. The Anglophone rejection of bilingualism, meant to avoid contamination by la francophonie, puts them at a disadvantage: they no longer hold a specialization or comparative advantage over their francophone neighbors. Moreover, an inability to speak French puts them further behind the rising, truly bilingual francophone generation. First, Cameroon was adversely affected by its failure to unify, that is, to truly implement bilingualism in its education system. Now, as it comes closer to reaching this goal, will the position of the Anglophone Cameroonian minority become even more disadvantaged?

Like a Good Neighbor… Kenya Invades?

Kenya’s recent invasion of Somalia does not fit the mold of a typical border struggle. Rather than entering neighboring Somalia to capture territory, Kenya’s military marched across the border on October 16th in hopes of quelling a terrorist network that has menaced both countries: Harakat al-Shabaab al-Mujahideen, known more commonly as al-Shabaab. A group of Islamic extremists, al-Shabaab has wreaked havoc across most of Somalia and beyond its borders through kidnappings, bombings, and other acts of violence since 2006, all with the ultimate goal of a total takeover of the country.

Questions arise, however, as to whether al-Shabaab is the sole reason for Kenya’s military entry. The 1,600 Kenyan troops making their way through southern Somalia represent the largest Kenyan military operation since independence in 1963, and many believe Kenya is trying to finally do something about Somalia’s chronic instability.  Indeed, the invasion was greeted rather coldly by Somali president Sharif Sheikh Ahmed who, despite signing a joint agreement calling for action against al-Shabaab, questioned Kenya’s motives while insisting that the Somali government retain territorial sovereignty.  Tension remains apparent despite the written agreement, some of it explained by the two countries’ starkly different histories.

Since its independence, Kenya has boasted a relatively strong and dependable economy that has spared it many of the political woes its neighbors have faced. Its fertile highland plateau allows the country to thrive agriculturally, with agriculture contributing 22% of annual GDP, and Kenya has become a hotspot for trade in East Africa.  Despite troubles with corruption in the past decade, Kenya is widely regarded as a bastion of stability in a region often fraught with turmoil. As such, it finds itself constantly inundated with refugees from its less prosperous neighbors; Dadaab, a trio of Kenyan refugee camps, sees roughly 1,400 new Somali refugees arrive every day, bringing the grand total to well over 400,000.

But overwhelming numbers of refugees are not the only problem that Somali instability has caused for Kenya. Kenya’s tourism industry, the country’s second-largest source of foreign exchange revenue, has been threatened by the increasing problem of Somali piracy.  Recent abductions and killings of travelers from some of Kenya’s most important resorts have caught the attention of the media, dealing significant blows to the $800 million tourism industry.  Hopes of achieving a record year for tourism revenue have been all but derailed.

But piracy is nothing new to Somalia, a country that has struggled through a tumultuous history even by East African standards. Despite having the resource potential for a relatively healthy economy, the country has been engulfed by decades of civil unrest that have rendered it one of the poorest and most violent states in the world.  Well situated on the African coast, Somalia has been unable to convert its favorable location into economic benefit; ironically, piracy has turned the country’s biggest advantage into one of its biggest problems.

Somalia’s gravest problem, however, is recurring famine, a problem that has spanned decades. Food shortages this year have affected an estimated 12 million people around the Horn of Africa, including roughly half of the entire Somali population, exacerbated by the region’s worst drought in 60 years.  The problem is most dire in the southern regions of the country, where al-Shabaab has driven out the United Nations and other aid groups and has even deprived local farmers of water resources.

Al-Shabaab makes no secret of its ultimate goal in Somalia: the overthrow of the current government, the Transitional Federal Government, and the spread of its very strict brand of Sharia law throughout the region.  Established in the mid-2000s as an offshoot of two other Somali Islamist groups, al-Shabaab has developed significant ties to Al-Qaeda, with its chief military strategist at one point expressing allegiance to Osama bin Laden. Like the pirates in the area, al-Shabaab has recently been involved in many acts of violence, actions that have lent a sense of urgency to Kenya’s willingness to enter Somalia with force.

But a successful Kenyan operation in Somalia is not without risk; indeed, many observers criticize the military move as too aggressive and too dangerous. To begin with, recent rains in the area will significantly hamper the Kenyan force’s mobility by rendering terrain impassable. The al-Shabaab fighters, roughly 15,000 strong, have already begun to conduct hit-and-run raids, and their guerrilla tactics are made particularly effective by their superior knowledge of the territory.  Furthermore, the invasion itself may already have convinced jihadists to plot acts of vengeance in the form of suicide bombings in Kenya or elsewhere. Only one week after Kenyan troops entered Somalia, two deadly grenade attacks were carried out in Kenya’s capital of Nairobi. With such attacks appearing to be direct responses to the invasion, the war effort will likely see significant erosion in support from the Kenyan citizenry.

Despite the obvious risks, Kenya and Somalia’s efforts to stop al-Shabaab have rallied powerful allies. Kenya’s military is backed by the United States, giving its forces an edge over the terrorist network. The effort has received regional support as well, with South Africa and Rwanda both firmly in line with the goal of thwarting al-Shabaab. Finally, global organizations such as the European Union and the Commonwealth Heads of State have also given their support, although their concerns rest primarily with the piracy that threatens trade in the region.

Although all recognize that the military movement is fraught with risk, ultimately the consensus among Kenya’s allies is that ridding the region of al-Shabaab is well-worth the danger. Only time will tell if the Kenyans, Somalis, and their supporters have bitten off more than they can chew.

Polio in Nigeria: The Problem Persists

by Katie Nelson

Polio has not plagued the western world in decades. For most Americans, it seems a vestige of a bygone era. But for those working to eradicate it in the rest of the world, polio is a frustrating reminder of the way in which cultural frameworks can have dangerous health outcomes.

Polio is endemic in only a few countries. In India, repeated vaccination campaigns in the northern states have not generated adequate immunity. In Afghanistan and Pakistan, inadequate infrastructure and persistent conflict have impeded vaccination efforts. And then there is Nigeria. The most populous country in Africa, home to one in every seven Africans, Nigeria has recently overcome a spate of dictators and military coups and is trying to establish its rightful economic place as an oil-rich country.  Its healthcare infrastructure is generally poor, and its infant mortality rate is among the highest in the world, at 97.1 deaths per 1,000 live births.

Such problems are not unique to Nigeria; they resemble those of other post-colonial African countries. But in 2003, mounting religious and cultural tension in Nigeria turned vaccination into a political rallying point. Several Muslim clerics in the north of the country began to spread rumors that polio vaccines had been designed by Western powers to sterilize Nigerians.  These rumors gained a degree of credibility when the Supreme Council for Shariah in Nigeria, then headed by a physician, began to perpetuate them.  Subsequently, the governors of the states of Kano, Kaduna, and Zamfara suspended immunization programs until the vaccines could be tested and certified as safe.

Given the particular features of polio, that was all it took. Polio spreads through the fecal-oral route, and contamination by one person in an unsanitary environment can infect many other susceptible people.[i]  Polio cases in Nigeria soon resurged dramatically. Where previous increases in incidence had resulted from infrastructural inadequacies, this further jump was the product of religious conflict.

The events of 2003 need to be understood in the context of Nigeria’s cultural and historical intricacies. Until British expansion in the 19th century, the region that currently comprises Nigeria was divided among several different kingdoms. In the north, the Muslim influence of the fallen Mali Empire permeated the Hausa kingdoms in the 15th century.  These northern trading states were slowly infiltrated by Fulani people from the northeast, who created the powerful Hausa-Fulani caliphate in the early 1800s. The southeastern region of the country became increasingly vulnerable to the coastal slave trade, which spread deeper into the interior. The Ibibio people of Akwa Akpa, who controlled part of the trade network, primarily sold the Igbo people as slaves; the Igbo themselves lived in a system of republican communities in Igboland in the southeast of the country.  The Yoruba in the southwest are ethnically related to the people of Benin, and experienced their own cultural golden age under the Yoruba and Oyo Empires from the 12th through the 18th centuries.  These ethnic groups remain in roughly the same geographic positions they occupied before colonization.[ii]

In 1807, the British government banned its citizens from engaging in the slave trade and blockaded the Nigerian coast to prevent illegal trading. The exigencies of maintaining the blockade led the British to establish a military presence on the coast, which gradually expanded throughout the 19th century into a sphere of influence in the south. Christian missionaries were especially active among the Yoruba and Igbo peoples.  In 1900, the southern palm-oil-producing states were officially named the Southern Nigeria Protectorate. The Northern Nigeria Protectorate was formed the same year, largely to prevent incursions by other European powers, but the northern emirates maintained their Muslim identity and a large portion of their autonomy.  British officials forbade Christian missionaries from entering the region and made no attempt to institute Western education that might interfere with the Islamic school system.  In 1914, the two British Nigerian colonies were merged despite their enormous cultural and ethnic differences. This was done largely for financial reasons, so that the relatively prosperous south could subsidize the struggling north.

Despite these stark regional differences, an anti-British sense of Nigerian nationalism began to emerge in the 1920s. This nationalism was characterized more by pan-African identification than by any specifically Nigerian one, and the political parties that emerged were largely based on ethnic divisions. Over the 1940s and 1950s, the British government began to devolve power to the Legislative Council of Nigeria, but tensions between the more educated, westernized south and the repressive north continued to mount up until independence in 1960. The Yoruba in the west developed their own political identity distinct from that of the Igbo in the southeast, and as a result the First Nigerian Republic was a fragmented coalition. A 1966 coup, during which Igbo military officers assassinated northern and western politicians, set off a chain of events that culminated in the southeast’s declaration of independence as Biafra and a civil war that further splintered the country. The next thirty years of Nigerian history were marked by a series of coups, assassinations, military rule, and a descent into infrastructural chaos.

While the country’s politics have become more stable since the reestablishment of civilian government in 1999, interreligious violence has increasingly plagued Nigeria, and ethnic divisions run deep. Citizens’ faith in the government has been eroded by years of corruption, and most people rely more on the governance of their regional leaders than on that of the Nigerian state.

Polio in Nigeria can be thought of as a product of these problems.  The north, whose leaders have long feared that modernization would threaten Islamic authority, has resisted many infrastructural and social programs and remains underdeveloped compared with the south. Already resistant to Western health programs, northern Nigerian leaders were further alienated by the expansion of worldwide polio vaccination programs in the 1980s.  Matters came to a head in 2003, when Olusegun Obasanjo, a southern Christian, was re-elected president over a northern Muslim challenger.

Fears about the vaccine began as rumors in village councils but spread rapidly through the northern region. An easy scapegoat for the specter of Western imperialism, the vaccine was condemned as a plot by the West to limit the Muslim population of Africa. Most damning of all, the rallying cry was taken up by Dr. Datti Ahmed, a physician and leader of the Supreme Council for Shariah in Nigeria. Ahmed claimed that “There were strong reasons to believe that the polio immunization vaccines was contaminated with anti fertility drugs, contaminated with certain viruses that cause HIV / AIDS, contaminated with Simian virus that are likely to cause cancer.”[iii]  In the wake of such statements, the governors of three of the most populous northern states felt compelled to suspend vaccination efforts.

The reaction of the public health community compounded the problem.  Not believing that mere rumors could defeat the vaccination effort, they initially made no comment and waited for the furor to die down.  When it did not, they were forced to announce testing of vaccines by independent parties to ensure safety.

The World Health Organization’s top-down approach to vaccination programs was equally problematic. While the Nigerian federal government was included in the program-planning process, religious and community leaders were not, a decision rife with consequences for the fragmented Nigerian state. If local leaders were not invested or even involved in the campaign, they could not vouch for the vaccine’s benefits to their communities. After nearly a year of independent testing by laboratories in Muslim countries, the governors lifted their bans. But it was too late; the disease had escaped Nigeria.  Polio can spread extremely quickly in an unprotected community, and it takes only one sick person shedding the virus to perpetuate the disease. Within months of the vaccination ban, Nigerian states that had previously been polio-free reported cases, and the disease spread into Niger, Burkina Faso, Togo, Ghana, and Chad. To date, Nigeria has spread wild polio back into 25 countries.

from polioeradication.org

Despite the slow diminution of rumors and the reinstatement of the anti-polio program in all states, Nigerian vaccination rates remain low.  Six states in the north have missed-vaccination rates (which include refusals) of greater than 10%, and many of the refusals stem from inadequate information about the benefits of the vaccine. Infrastructural problems are the main cause, but refusals remain higher than in most African countries, owing to the religious and ethnic issues addressed above. The election of President Goodluck Jonathan, a southern Christian, may be problematic, since he lacks a northern ethnic or religious base of support.[iv]

from MMWR

However, some positive signs can be discerned. Refusal rates, while still high, have diminished by 12 percent since 2007. Much of the credit goes to the work of community health engagement programs, which seek on-the-ground support for vaccination programs. While Nigeria works hard to overcome the setbacks of 2003, it serves as an important case study for the global medical community about the importance of trust and communication for polio eradication efforts.


[i] Poliovirus has an R0 of 5-7, meaning that in a fully susceptible population each infected person will, on average, infect 5-7 others.  This geometric increase (each of those 5-7 infects another 5-7, and so on) means that the higher the R0, the harder the disease is to control. Given that eradication programs have now been running for decades, polio has clearly been extremely difficult to stamp out.
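
To make the footnote’s arithmetic concrete, here is a quick illustration of our own, using the standard epidemiological formulas rather than anything from the original post. Unchecked, a single case yields roughly R0^n cases after n generations of transmission, and interrupting transmission requires immunizing more than the herd-immunity fraction 1 − 1/R0:

\[
R_0 = 5:\quad 5^4 = 625 \text{ cases after four generations}, \qquad 1 - \tfrac{1}{5} = 80\%
\]
\[
R_0 = 7:\quad 7^4 = 2401 \text{ cases after four generations}, \qquad 1 - \tfrac{1}{7} \approx 86\%
\]

In other words, at an R0 of 5-7, vaccination coverage must exceed roughly 80-86% of the population just to halt transmission, which is why even modest refusal rates in northern Nigeria carry such weight.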

[ii] Nigeria encompasses more than 250 ethnic groups and 400 languages, by some counts; the geographic and cultural distinctions mentioned in this paper are an unnuanced, general look at the tensions between the three most important and populous groups.

[iii] Another frightening step towards religious tension occurred in 2002-3, when Sharia law was officially extended to cover criminal as well as civil matters in twelve northern states.

[iv] In 2007, in an election widely considered unfair, Umaru Musa Yar’Adua, a former governor of Katsina in the north and a member of a Fulani royal family, was elected president, with Goodluck Jonathan, a southerner, as his vice-presidential running mate. In 2009, Yar’Adua disappeared from public view, apparently to seek medical treatment, and was never seen in public again.  The dangerous power vacuum created by his disappearance could only ever be partially filled by a Christian southerner like Jonathan.

Seeking El Dorado: The Impact of Colombia’s 21st Century Gold Rush

Gold is the newest killer in Colombia.  Successful government efforts to rein in illegal drug trafficking within the world’s largest cocaine-producing country have forced rebels to seek alternate sources of revenue, and guerrilla forces have turned to the precious metal to finance their rebellions and terrorist activities.  Thanks to the inflated price of gold, which has risen nearly sixfold in a decade, from under $300/ounce in 2002 to over $1,700/ounce in 2011, and the ease with which gold can be sold for profit, the rebels have located an efficient and plentiful source of income.  Reports suggest that the illegal gold market is now more dangerous than the notorious drug trade that has imperiled the country for decades.
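
For the record, the arithmetic behind that price claim, computed here from the two endpoint prices quoted above:

\[
\frac{\$1700}{\$300} \approx 5.7\times, \qquad \frac{\$1700 - \$300}{\$300} \approx 4.7 \approx 470\%\ \text{increase}.
\]

That is, the price multiplied nearly six times over the decade, an increase of roughly 470 percent.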

Colombian Topography

Increased mining throughout Colombia presents severe environmental and social problems beyond sustaining the rebels in the mountains.  Colombian gold deposits, many still undiscovered, are riddled throughout the Andes mountain range that runs through the western region of the country.  Much of this gold lies within a unique Andean highland ecosystem called the páramo.  This environment, located between the forest and snow lines, is characterized by high humidity, rapid temperature fluctuations, and diverse vegetation.  The neo-tropical biome possesses over five thousand plant species, most of them giant rosettes, shrubs, and grasses.  Historically, the Colombian government has protected this environment from mining, including the largest páramo in the world, Sumapaz, located just 30 km south of the capital, Bogotá.  Despite these protections, illegal gold mining by rebels and rural Colombians, many of whom are forced to work for the guerrilla forces to earn a living, is spiraling out of control in areas of the páramo.  Sources claim that as many as 3,000 unlicensed mines exist throughout the country.  These small-industry mines rely heavily on open-pit mining, a dirty technique that requires workers to strip away the ground and soil, destroying the natural ecosystem and polluting the air and water.  These destructive practices threaten to transform large areas of the fragile páramo ecosystem into dead zones.

Páramo Locations in Colombia

Colombians are not merely concerned with preserving the natural beauty of the páramo region.  The Andean regions contain the majority of Colombia’s population, including the capital city, Bogotá, and millions of Colombians who live in highland cities and towns depend on the water generated within the páramo for their livelihoods.  This rare habitat is home to a spongy plant called the espeletia.  The plant, referred to as the frailejón by Colombians, captures and stores water from the surrounding air within its velvety leaves.  During the dry season, the frailejón leaks the stored water into the soil, and this runoff feeds rivers that provide water to urban centers all over the region.

Pronounced human danger stems from the contaminating processes involved in extracting gold from rock.  In the mines, massive heaps of rock are treated with a cyanide solution to separate out the gold.  This process creates vast amounts of waste, including cyanide-laced water that can easily contaminate water supplies.  Sources indicate that many of the illegal mines throughout Colombia also employ mercury treatments to extract the gold, and mercury is even more toxic than cyanide.  The millions of Colombians who live in páramo river basins are at high risk from polluted water.

Exacerbating the problem has been the Colombian government’s response to the situation.  While the government has pledged to shut down miners operating without permits, only twelve government inspectors monitor mines throughout the country.  The government has also encouraged the growth of the large-scale mining industry.  Many government officials, including President Juan Manuel Santos, have made mineral extraction, gold in particular, the centerpiece of their economic revival plan for the country.  They have loosened restrictions on foreign mining and are welcoming international companies and investors.  The lure of potential rewards from gold mining is too strong to turn down.

Perhaps no mining conglomerate has been as active as Greystar Resources, a Canadian-based corporation that began operating in Colombia within the last few years.  Traditionally an exploration company, Greystar sees tremendous potential in Colombian mining.  Representatives believe that the watery tunnels of the Angostura mine in northeastern Colombia contain as much as thirty million ounces of gold, which would translate to nearly forty billion dollars at today’s prices.  Greystar hails the economic benefits and prosperity its mines could bring to otherwise impoverished regions of Colombia.  The company’s president denies that its mining procedures would create uncontrolled toxic waste, claiming that contaminated water created in the mine will be recycled and that high-tech leach pads will prevent leakage.  But the harsh truth is that mining is a risky business, and accidents do happen.

Grassroots movements have developed in response to the increased visibility of these mining corporations and the dangerous mines they hope to construct.  In April 2011, the efforts of an impassioned coalition of students, environmentalists, and politicians forced Greystar Resources to withdraw its permit application for an open-pit mine in the Santurbán páramo in northeastern Colombia.  The proposed mine threatened the water supply of the nation’s fifth-largest city, Bucaramanga, and nearly 30,000 of its citizens united on February 24th to march against the mine’s construction.

Still, such efforts seem unlikely to counter the momentum that the mining industry is gaining in Colombia.  As of March 2011, exploratory mining permits covering over 10% of Colombia’s páramo regions had been granted to companies.  What were once considered protected natural treasures are on the verge of being destroyed by speculators and gold miners.  And those figures make no mention of the thousands of illegal mines operated by guerrilla troops throughout the country.  No one can deny the importance of Colombia building a strong economy.  But are the environmental and human costs associated with mining really an acceptable price to pay?

Political Water Scarcity in Gaza and Nongovernmental Solutions

Post by Nadia Arid
Stanford University ’12
International Relations

Nations in the Middle East and North Africa currently lack access to freshwater resources.

Much of the Middle East is in a water crisis. The region’s population continues to grow rapidly while its water resources diminish. Since the 1970s, the Middle East and North Africa have experienced dramatic increases in population, with the total number of inhabitants rising from 127 million in 1970 to 305 million in 2005. At the same time, most global climate-change projections for the region indicate a drop in precipitation of 20 to 40 percent, accompanied by rising temperatures.
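
To put those figures in perspective, a back-of-the-envelope calculation of our own (not from the original post) gives the implied average annual growth rate over those 35 years:

\[
\left(\frac{305}{127}\right)^{1/35} - 1 \approx 0.025,
\]

or about 2.5 percent per year, a rate at which a population doubles roughly every 28 years.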

Countries that lack clean and plentiful water are classified into two categories of water scarcity: physical and economic. Physical water scarcity occurs when there is not enough water in an area to meet the demands of its population, while economic scarcity occurs when the lack of water results from an inability to invest in water resources or a lack of human capacity to provide clean water. The entire Middle East, including the once-Fertile Crescent, is warming, and its water resources are being stressed. Conventional water-scarcity maps show the region overall as devoting 70 percent of river flow to agriculture, industry, or domestic purposes, indicating physical water scarcity.
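
The distinction can be summarized as a toy decision rule. The sketch below is purely illustrative, not an established hydrological model: the 70 percent withdrawal threshold follows the conventional maps just mentioned, while the economic-scarcity criteria are simplified assumptions of ours.

```python
def classify_water_scarcity(withdrawal_share: float,
                            can_invest: bool,
                            has_capacity: bool) -> str:
    """Label a region using the two scarcity categories described above.

    withdrawal_share: fraction of river flow devoted to agriculture,
        industry, or domestic use (0.0 to 1.0).
    can_invest: whether the region can invest in water infrastructure.
    has_capacity: whether it has the human capacity to deliver clean water.
    """
    if withdrawal_share >= 0.70:           # conventional physical-scarcity threshold
        return "physical water scarcity"
    if not (can_invest and has_capacity):  # investment or capacity gap
        return "economic water scarcity"
    return "no acute water scarcity"


# The Middle East overall, per the conventional maps: ~70% of river flow used.
print(classify_water_scarcity(0.70, True, True))    # physical water scarcity
# Gaza, as argued below, fits the economic branch better despite the regional label.
print(classify_water_scarcity(0.60, False, False))  # economic water scarcity
```

As the next paragraphs argue, Gaza does not fit neatly into either branch, which is what motivates the “political water scarcity” label.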

A more detailed examination of the situation in the Gaza Strip, however, casts doubt on the placement of this region under the category of “physical water scarcity.” As it turns out, the situation there is much more complicated. Economic issues, including an inability to invest in water-treatment technology and a lack of human capital in the water sector, plague the region and have contributed significantly to the water crisis. The line between physical and economic water scarcity has been complicated further by the political tensions between Israeli and Palestinian officials arising from the ongoing Arab-Israeli conflict, creating a new category of water insecurity: political water scarcity.

Water Scarcity in the Gaza Strip

According to the International Committee of the Red Cross, hundreds of gallons of wastewater are dumped into local waterways in the Gaza Strip every day. Researchers from the University of Heidelberg and the Center for Environmental Research found that 90 percent of the available drinking water in Gaza contained two to eight times the nitrate concentration recommended as safe by the World Health Organization. The coastal aquifer, which has been heavily exploited and polluted with nitrate, is the sole source of freshwater for Gaza, and only about 5 to 10 percent of the water extracted from it is of high enough quality to drink safely.
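
In concrete terms, taking the WHO drinking-water guideline for nitrate of roughly 50 mg/L (our added reference point, not a figure from the original post), “two to eight times” the recommended level would mean concentrations on the order of

\[
2 \times 50 = 100\ \text{mg/L} \quad\text{to}\quad 8 \times 50 = 400\ \text{mg/L}.
\]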

Israeli laws have also created barriers to building water-management plants within the Gaza Strip. The Interim Agreement on the West Bank and the Gaza Strip of 1995, more commonly referred to as Oslo II, is the last written agreement pertaining to water rights in the region. Unfortunately, the resolution agreed upon in Oslo II did not offer a viable water regime for the Palestinian people: it proposed a water-buying deal between Israel and the Palestinians that Palestinian officials deemed too costly, and it did not grant the Palestinian people the property rights necessary to build waste-management facilities. The Palestinian Water Authority (PWA) was created in 1995 to deal with issues of access to water in the Gaza Strip. Yet the PWA has been largely ineffective in achieving its aim to “ensure the equitable utilization and sustainable management and development of Palestinian water resources.” The group has faced significant challenges in investing in programs to develop the water sector and in creating the coordination necessary to carry out its proposed projects.

In addition to being unable to own the land on which to build water-management facilities, Palestinians are barred from importing construction materials into the region. Even running water cannot be taken for granted in the Gaza Strip: in 2008, 40 percent of homes had their water supplies cut off for more than a day due to a lack of electricity to run the pumps. Oxfam International reported that Israel’s restriction of industrial diesel supplies to the Gaza Strip could also result in a shortage of drinking water, the collapse of crucial sewage systems, and the halting of some hospital functions. While poverty has affected the region’s ability to access clean water on a regular basis, Israeli policies toward Palestinians have played a much greater role in limiting water availability in the Gaza Strip.

The Role of NGOs in the Water Crisis

Due to the highly politicized nature of this water crisis, non-governmental organizations may offer the most promising solutions, at least in the short term. In 2010, the United Nations Human Rights Committee deemed the Israeli government’s denial of access to water and sanitation in Gaza a violation of the International Covenant on Civil and Political Rights. This was the first time the body had considered the denial of access to water a violation of international law, which provides an avenue of recourse for any community whose water resources are withheld. More importantly, it created an impetus for NGOs to get involved in addressing global issues of water scarcity.

The International Red Cross has funded a water-treatment plant at the southern end of the Gaza Strip to help bring freshwater to the Palestinian people. Built earlier this year, the plant provides freshwater for 17,000 residents of the nearby area and is one of the largest treatment plants in the Palestinian territories. Such treatment plants are vital to providing sustainable access to water because they take existing water resources and turn them into water that can be used for drinking or for agriculture.

The denial of water to Palestinian children has also been a considerable source of alarm to human-rights organizations. The Middle East Children’s Alliance launched a program in 2009 called the Maia Project (“maia” is the Arabic word for water) to provide Palestinian children with clean and regular access to water. MECA has provided the funds to build water purification and desalination units in schools throughout the Gaza Strip: 14 large purification units in United Nations schools in refugee camps and 13 small purification units in kindergartens throughout the region. Large purification units provide enough drinking water for 1,500 to 2,000 individuals, while small purification units serve 150 to 450. By working through the UN, the organization has been able to avoid the Israeli government’s legal restrictions on building water plants.
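
Taken together, our own arithmetic from the unit counts and per-unit figures above puts the program’s reach at roughly

\[
14 \times (1500\text{–}2000) + 13 \times (150\text{–}450) \approx 23{,}000 \text{ to } 34{,}000\ \text{people}.
\]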

USAID has also gotten involved, funding a large-scale seawater desalination plant and a water carrier to transport quality water to different areas of the Gaza Strip. The proposed desalination plant is hoped to reach a capacity of 60,000 mgal of clean drinking water a day. The project will work through the Palestinian Authority in hopes of setting up a sustainable water sector in the region. The NGO projects in the region are relatively new and have not yet stood the test of time, but because of the apolitical nature of some of the organizations involved, they may provide the most salient solution to the problem of political water scarcity in the Gaza Strip.