Fair Use Notice

FAIR USE NOTICE

OCCUPY THE COMMONS


This site may contain copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in an effort to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a ‘fair use’ of any such copyrighted material as provided for in section 107 of the US Copyright Law.

In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. For more information go to: http://www.law.cornell.edu/uscode/17/107.shtml

If you wish to use copyrighted material from this site for purposes of your own that go beyond ‘fair use’, you must obtain permission from the copyright owner.

FAIR USE NOTICE: This page may contain copyrighted material the use of which has not been specifically authorized by the copyright owner. This website distributes this material without profit to those who have expressed a prior interest in receiving the included information for scientific, research and educational purposes. We believe this constitutes a fair use of any such copyrighted material as provided for in 17 U.S.C. § 107.

Read more at: http://www.etupdates.com/fair-use-notice/#.UpzWQRL3l5M | ET. Updates

All Blogs licensed under Creative Commons Attribution 3.0


Sunday, November 24, 2013

How the Magna Carta became a Minor Carta, part 1 & 2





The Magna Carta was a milestone in civil and human rights. Can we stop its principles being shredded before our eyes?



A rare copy of the Magna Carta at Sotheby's in New York, auctioned for $19m in 2007. Photograph: Michael Nagle/Getty Images
 
 
 
Down the road only a few generations, the millennium of Magna Carta, one of the great events in the establishment of civil and human rights, will arrive. Whether it will be celebrated, mourned, or ignored is not at all clear.

That should be a matter of serious immediate concern. What we do right now, or fail to do, will determine what kind of world will greet that event. It is not an attractive prospect if present tendencies persist – not least, because the Great Charter is being shredded before our eyes.

The first scholarly edition of Magna Carta was published by the eminent jurist William Blackstone. It was not an easy task. There was no good text available. As he wrote, "the body of the charter has been unfortunately gnawn by rats" – a comment that carries grim symbolism today, as we take up the task the rats left unfinished.

Blackstone's edition actually includes two charters. It was entitled The Great Charter and the Charter of the Forest. The first, the Charter of Liberties, is widely recognised to be the foundation of the fundamental rights of the English-speaking peoples – or as Winston Churchill put it more expansively, "the charter of every self-respecting man at any time in any land." Churchill was referring specifically to the reaffirmation of the Charter by Parliament in the Petition of Right, imploring King Charles to recognise that the law is sovereign, not the King. Charles agreed briefly, but soon violated his pledge, setting the stage for the murderous civil war.

After a bitter conflict between King and Parliament, the power of royalty in the person of Charles II was restored. In defeat, Magna Carta was not forgotten. One of the leaders of Parliament, Henry Vane, was beheaded. On the scaffold, he tried to read a speech denouncing the sentence as a violation of Magna Carta, but was drowned out by trumpets to ensure that such scandalous words would not be heard by the cheering crowds. His major crime had been to draft a petition calling the people "the original of all just power" in civil society – not the King, not even God. That was the position that had been strongly advocated by Roger Williams, the founder of the first free society in what is now the state of Rhode Island. His heretical views influenced Milton and Locke, though Williams went much farther, founding the modern doctrine of separation of church and state, still much contested even in the liberal democracies.

As often is the case, apparent defeat nevertheless carried the struggle for freedom and rights forward. Shortly after Vane's execution, King Charles granted a Royal Charter to the Rhode Island plantations, declaring that "the form of government is Democratical", and furthermore that the government could affirm freedom of conscience for Papists, atheists, Jews, Turks – even Quakers, one of the most feared and brutalised of the many sects that were appearing in those turbulent days. All of this was astonishing in the climate of the times.

A few years later, the Charter of Liberties was enriched by the Habeas Corpus Act of 1679, formally entitled "an Act for the better securing the liberty of the subject, and for prevention of imprisonment beyond the seas". The US constitution, borrowing from English common law, affirms that "the writ of habeas corpus shall not be suspended" except in case of rebellion or invasion. In a unanimous decision, the US supreme court held that the rights guaranteed by this Act were "[c]onsidered by the Founders [of the American Republic] as the highest safeguard of liberty". All of these words should resonate today.

The Second Charter and the Commons


The significance of the companion charter, the Charter of the Forest, is no less profound and perhaps even more pertinent today – as explored in depth by Peter Linebaugh in his richly documented and stimulating history of Magna Carta and its later trajectory. The Charter of the Forest demanded protection of the commons from external power. The commons were the source of sustenance for the general population: their fuel, their food, their construction materials, whatever was essential for life. The forest was no primitive wilderness. It had been carefully developed over generations, maintained in common, its riches available to all, and preserved for future generations – practices found today primarily in traditional societies that are under threat throughout the world.

The Charter of the Forest imposed limits to privatisation. The Robin Hood myths capture the essence of its concerns (and it is not too surprising that the popular TV series of the 1950s, "The Adventures of Robin Hood," was written anonymously by Hollywood screenwriters blacklisted for leftist convictions). By the 17th century, however, this Charter had fallen victim to the rise of the commodity economy and capitalist practice and morality.

With the commons no longer protected for co-operative nurturing and use, the rights of the common people were restricted to what could not be privatised, a category that continues to shrink to virtual invisibility. In Bolivia, the attempt to privatise water was, in the end, beaten back by an uprising that brought the indigenous majority to power for the first time in history. The World Bank has just ruled that the mining multinational Pacific Rim can proceed with a case against El Salvador for trying to preserve lands and communities from highly destructive gold mining. Environmental constraints threaten to deprive the company of future profits, a crime that can be punished under the rules of the investor-rights regime mislabeled as "free trade." And this is only a tiny sample of struggles underway over much of the world, some involving extreme violence, as in the Eastern Congo, where millions have been killed in recent years to ensure an ample supply of minerals for cell phones and other uses, and of course ample profits.

The rise of capitalist practice and morality brought with it a radical revision of how the commons are treated, and also of how they are conceived. The prevailing view today is captured by Garrett Hardin's influential argument that "freedom in a commons brings ruin to us all," the famous "tragedy of the commons": what is not owned will be destroyed by individual avarice.

An international counterpart was the concept of terra nullius, employed to justify the expulsion of indigenous populations in the settler-colonial societies of the Anglosphere, or their "extermination," as the founding fathers of the American republic described what they were doing, sometimes with remorse, after the fact. According to this useful doctrine, the Indians had no property rights since they were just wanderers in an untamed wilderness. And the hard-working colonists could create value where there was none by turning that same wilderness to commercial use.

In reality, the colonists knew better and there were elaborate procedures of purchase and ratification by crown and parliament, later annulled by force when the evil creatures resisted extermination. The doctrine is often attributed to John Locke, but that is dubious. As a colonial administrator, he understood what was happening, and there is no basis for the attribution in his writings, as contemporary scholarship has shown convincingly, notably the work of the Australian scholar Paul Corcoran. (It was in Australia, in fact, that the doctrine has been most brutally employed.)

The grim forecasts of the tragedy of the commons are not without challenge. The late Elinor Ostrom won the Nobel Prize in economics in 2009 for her work showing the superiority of user-managed fish stocks, pastures, woods, lakes, and groundwater basins. But the conventional doctrine has force if we accept its unstated premise: that humans are blindly driven by what American workers, at the dawn of the industrial revolution, bitterly called "the New Spirit of the Age, Gain Wealth forgetting all but Self."

Like peasants and workers in England before them, American workers denounced this New Spirit, which was being imposed upon them, regarding it as demeaning and destructive, an assault on the very nature of free men and women. And I stress women; among those most active and vocal in condemning the destruction of the rights and dignity of free people by the capitalist industrial system were the "factory girls," young women from the farms. They, too, were driven into the regime of supervised and controlled wage labour, which was regarded at the time as different from chattel slavery only in that it was temporary. That stand was considered so natural that it became a slogan of the Republican party, and a banner under which northern workers carried arms during the American civil war.

Controlling the Desire for Democracy


That was 150 years ago – in England earlier. Huge efforts have been devoted since to inculcating the New Spirit of the Age. Major industries are devoted to the task: public relations, advertising, marketing generally, all of which add up to a very large component of the Gross Domestic Product. They are dedicated to what the great political economist Thorstein Veblen called "fabricating wants." In the words of business leaders themselves, the task is to direct people to "the superficial things" of life, like "fashionable consumption." That way people can be atomised, separated from one another, seeking personal gain alone, diverted from dangerous efforts to think for themselves and challenge authority.

The process of shaping opinion, attitudes, and perceptions was termed the "engineering of consent" by one of the founders of the modern public relations industry, Edward Bernays. He was a respected Wilson-Roosevelt-Kennedy progressive, much like his contemporary, journalist Walter Lippmann, the most prominent public intellectual of 20th-century America, who praised "the manufacture of consent" as a "new art" in the practice of democracy.

Both recognised that the public must be "put in its place," marginalised and controlled – for their own interests of course. They were too "stupid and ignorant" to be allowed to run their own affairs. That task was to be left to the "intelligent minority," who must be protected from "the trampling and the roar of [the] bewildered herd," the "ignorant and meddlesome outsiders" – the "rascal multitude" as they were termed by their 17th century predecessors. The role of the general population was to be "spectators," not "participants in action," in a properly functioning democratic society.

And the spectators must not be allowed to see too much. President Obama has set new standards in safeguarding this principle. He has, in fact, punished more whistleblowers than all previous presidents combined, a real achievement for an administration that came to office promising transparency. WikiLeaks is only the most famous case, with British cooperation.

Among the many topics that are not the business of the bewildered herd is foreign affairs. Anyone who has studied declassified secret documents will have discovered that, to a large extent, their classification was meant to protect public officials from public scrutiny. Domestically, the rabble should not hear the advice given by the courts to major corporations: that they should devote some highly visible efforts to good works, so that an "aroused public" will not discover the enormous benefits provided to them by the nanny state. More generally the US public should not learn that "state policies are overwhelmingly regressive, thus reinforcing and expanding social inequality," though designed in ways that lead "people to think that the government helps only the undeserving poor, allowing politicians to mobilise and exploit anti-government rhetoric and values even as they continue to funnel support to their better-off constituents" – I'm quoting from the main establishment journal, Foreign Affairs, not from some radical rag.

Over time, as societies became freer and the resort to state violence more constrained, the urge to devise sophisticated methods of control of attitudes and opinion has only grown. It is natural that the immense PR industry should have been created in the most free of societies, the United States and Great Britain. The first modern propaganda agency was the British Ministry of Information a century ago, which secretly defined its task as "to direct the thought of most of the world" -- primarily progressive American intellectuals, who had to be mobilized to come to the aid of Britain during the first world war.

Its US counterpart, the Committee on Public Information, was formed by Woodrow Wilson to drive a pacifist population to violent hatred of all things German – with remarkable success. American commercial advertising deeply impressed others. Goebbels admired it and adapted it to Nazi propaganda, all too successfully. The Bolshevik leaders tried as well, but their efforts were clumsy and ineffective.

A primary domestic task has always been "to keep [the public] from our throats," as essayist Ralph Waldo Emerson described the concerns of political leaders when the threat of democracy was becoming harder to suppress in the mid-19th century. More recently, the activism of the 1960s elicited elite concerns about "excessive democracy," and calls for measures to impose "more moderation" in democracy.

One particular concern was to introduce better controls over the institutions "responsible for the indoctrination of the young": the schools, the universities, the churches, which were seen as failing that essential task. I'm quoting reactions from the left-liberal end of the mainstream spectrum, the liberal internationalists who later staffed the Carter administration, and their counterparts in other industrial societies. The right wing was much harsher. One of many manifestations of this urge has been the sharp rise in college tuition, not on economic grounds, as is easily shown. The device does, however, trap and control young people by debt, often for the rest of their lives, thus contributing to more effective indoctrination.

The Three-Fifths People


Pursuing these important topics further, we see that the destruction of the Charter of the Forest, and its obliteration from memory, relates rather closely to the continuing efforts to constrain the promise of the Charter of Liberties. The "New Spirit of the Age" cannot tolerate the pre-capitalist conception of the Forest as the shared endowment of the community at large, cared for communally for its own use and for future generations, protected from privatisation, from transfer to the hands of private power for service to wealth, not needs. Inculcating the New Spirit is an essential prerequisite for achieving this end, and for preventing the Charter of Liberties from being misused to enable free citizens to determine their own fate.

Popular struggles to bring about a freer and more just society have been resisted by violence and repression, and massive efforts to control opinion and attitudes. Over time, however, they have met with considerable success, even though there is a long way to go and there is often regression. Right now, in fact.

The most famous part of the Charter of Liberties is Article 39, which declares that "no free man" shall be punished in any way, "nor will We proceed against or prosecute him, except by the lawful judgment of his peers and by the law of the land."

Through many years of struggle, the principle has come to hold more broadly. The US Constitution provides that no "person [shall] be deprived of life, liberty, or property, without due process of law [and] a speedy and public trial" by peers. The basic principle is "presumption of innocence" – what legal historians describe as "the seed of contemporary Anglo-American freedom," referring to Article 39; and with the Nuremberg Tribunal in mind, a "particularly American brand of legalism: punishment only for those who could be proved to be guilty through a fair trial with a panoply of procedural protections" -- even if their guilt for some of the worst crimes in history is not in doubt.

The founders of course did not intend the term "person" to apply to all persons. Native Americans were not persons. Their rights were virtually nil. Women were scarcely persons. Wives were understood to be "covered" under the civil identity of their husbands in much the same way as children were subject to their parents. Blackstone's principles held that "the very being or legal existence of the woman is suspended during the marriage, or at least is incorporated and consolidated into that of the husband: under whose wing, protection, and cover, she performs every thing." Women were thus the property of their fathers or husbands. These principles remained in force until very recent years. Until a supreme court decision of 1975, women did not even have a legal right to serve on juries. They were not peers. Just two weeks ago, Republican opposition blocked the Paycheck Fairness Act guaranteeing women equal pay for equal work. And it goes far beyond that.

Slaves, of course, were not persons. They were in fact three-fifths human under the Constitution, so as to grant their owners greater voting power. Protection of slavery was no slight concern to the founders: it was one factor leading to the American revolution. In the 1772 Somerset case, Lord Mansfield determined that slavery is so "odious" that it cannot be tolerated in England, though it continued in British possessions for many years. American slave-owners could see the handwriting on the wall if the colonies remained under British rule. And it should be recalled that the slave states, including Virginia, had the greatest power and influence in the colonies. One can easily appreciate Dr Johnson's famous quip that "we hear the loudest yelps for liberty among the drivers of negroes".

Post-civil war amendments extended the concept person to African-Americans, ending slavery. In theory, at least. After about a decade of relative freedom, a condition akin to slavery was reintroduced by a North-South compact permitting the effective criminalisation of black life. A black male standing on a street corner could be arrested for vagrancy, or for attempted rape if accused of looking at a white woman the wrong way. And once imprisoned he had few chances of ever escaping the system of "slavery by another name," the term used by then-Wall Street Journal bureau chief Douglas Blackmon in an arresting study.

This new version of the "peculiar institution" provided much of the basis for the American industrial revolution, with a perfect workforce for the steel industry and mining, along with agricultural production in the famous chain gangs: docile, obedient, no strikes, and no need for employers even to sustain their workers, an improvement over slavery. The system lasted in large measure until World War II, when free labour was needed for war production.

The postwar boom offered employment. A black man could get a job in a unionised auto plant, earn a decent salary, buy a house, and maybe send his children to college. That lasted for about 20 years, until the 1970s, when the economy was radically redesigned on newly dominant neoliberal principles, with rapid growth of financialisation and the offshoring of production. The black population, now largely superfluous, has been recriminalised.

Until Ronald Reagan's presidency, incarceration in the US was within the spectrum of industrial societies. By now it is far beyond others. It targets primarily black males, increasingly also black women and Hispanics, largely guilty of victimless crimes under the fraudulent "drug wars". Meanwhile, the wealth of African-American families has been virtually obliterated by the latest financial crisis, in no small measure thanks to criminal behaviour of financial institutions, with impunity for the perpetrators, now richer than ever.

Looking over the history of African-Americans from the first arrival of slaves almost 500 years ago to the present, they have enjoyed the status of authentic persons for only a few decades. There is a long way to go to realise the promise of Magna Carta.




How the Magna Carta became a Minor Carta, part 2

The Obama administration has perpetuated an assault on the foundations of traditional liberties

 

Four of the earliest surviving copies of the Magna Carta, the 1297 charter issued by King Edward I, on display at the Bodleian library, Oxford. Photograph: Martin Argles for the Guardian
 

Sacred persons and undone process


The post-civil war 14th amendment granted the rights of persons to former slaves, though mostly in theory. At the same time, it created a new category of persons with rights: corporations. In fact, almost all the cases brought to the courts under the 14th amendment had to do with corporate rights, and by a century ago the courts had determined that these collectivist legal fictions, established and sustained by state power, had the full rights of persons of flesh and blood; in fact, far greater rights, thanks to their scale, immortality, and protections of limited liability. Their rights by now far transcend those of mere humans. Under the "free trade agreements", Pacific Rim can, for example, sue El Salvador for seeking to protect the environment; individuals cannot do the same. General Motors can claim national rights in Mexico. There is no need to dwell on what would happen if a Mexican demanded national rights in the United States.

Domestically, recent supreme court rulings greatly enhance the already enormous political power of corporations and the super-rich, striking further blows against the tottering relics of functioning political democracy.

Meanwhile Magna Carta is under more direct assault. Recall the Habeas Corpus Act of 1679, which barred "imprisonment beyond the seas", and certainly the far more vicious procedure of imprisonment abroad for the purpose of torture – what is now more politely called "rendition", as when Tony Blair rendered Libyan dissident Abdel Hakim Belhaj, now a leader of the rebellion, to the mercies of Colonel Gaddafi; or when US authorities deported Canadian citizen Maher Arar to his native Syria, for imprisonment and torture, only later conceding that there was never any case against him. And many others, often through Shannon airport, leading to courageous protests in Ireland.

The concept of due process has been extended under the Barack Obama administration's international assassination campaign in a way that renders this core element of the Charter of Liberties (and the Constitution) null and void. The Justice Department explained that the constitutional guarantee of due process, tracing to Magna Carta, is now satisfied by internal deliberations in the executive branch alone. The constitutional lawyer in the White House agreed. King John might have nodded with satisfaction.

The issue arose after the presidentially ordered assassination-by-drone of Anwar al-Awlaki, accused of inciting jihad in speech, writing, and unspecified actions. A headline in the New York Times captured the general elite reaction when he was murdered in a drone attack, along with the usual collateral damage. It read: "The west celebrates a cleric's death." Some eyebrows were lifted, however, because he was an American citizen, which raised questions about due process – considered irrelevant when non-citizens are murdered at the whim of the chief executive. And irrelevant for citizens, too, under Obama administration due-process legal innovations.

Presumption of innocence has also been given a new and useful interpretation. As the New York Times reported: "Mr Obama embraced a disputed method for counting civilian casualties that did little to box him in. It in effect counts all military-age males in a strike zone as combatants, according to several administration officials, unless there is explicit intelligence posthumously proving them innocent." So post-assassination determination of innocence maintains the sacred principle of presumption of innocence.

It would be ungracious to recall the Geneva conventions, the foundation of modern humanitarian law: they bar "the carrying out of executions without previous judgment pronounced by a regularly constituted court, affording all the judicial guarantees which are recognised as indispensable by civilised peoples".

The most famous recent case of executive assassination was Osama bin Laden, murdered after he was apprehended by 79 Navy SEALs, defenceless, accompanied only by his wife, his body reportedly dumped at sea without autopsy. Whatever one thinks of him, he was a suspect and nothing more than that. Even the FBI agreed.

Celebration in this case was overwhelming, but there were a few questions raised about the bland rejection of the principle of presumption of innocence, particularly when trial was hardly impossible. These were met with harsh condemnations. The most interesting was by a respected left-liberal political commentator, Matthew Yglesias, who explained that "one of the main functions of the international institutional order is precisely to legitimate the use of deadly military force by western powers", so it is "amazingly naïve" to suggest that the US should obey international law or other conditions that we righteously demand of the weak.

Only tactical objections can be raised to aggression, assassination, cyberwar, or other actions that the Holy State undertakes in the service of mankind. If the traditional victims see matters somewhat differently, that merely reveals their moral and intellectual backwardness. And the occasional western critic who fails to comprehend these fundamental truths can be dismissed as "silly", Yglesias explains – incidentally, referring specifically to me, and I cheerfully confess my guilt.

Executive terrorist lists


Perhaps the most striking assault on the foundations of traditional liberties is a little-known case brought to the supreme court by the Obama administration, Holder v Humanitarian Law Project. The project was condemned for providing "material assistance" to the PKK, a guerrilla organisation that has fought for Kurdish rights in Turkey for many years and is listed as a terrorist group by the state executive. The "material assistance" was legal advice. The wording of the ruling would appear to apply quite broadly, for example, to discussions and research inquiry, even advice to the PKK to keep to nonviolent means. Again, there was a marginal fringe of criticism, but even these critics accepted the legitimacy of the state terrorist list – arbitrary decisions by the executive, with no recourse.

The record of the terrorist list is of some interest. For example, in 1988 the Reagan administration declared Nelson Mandela's African National Congress to be one of the world's "more notorious terrorist groups", so that Reagan could continue his support for the apartheid regime and its murderous depredations in South Africa and in neighbouring countries, as part of his "war on terror". Twenty years later Mandela was finally removed from the terrorist list, and can now travel to the US without a special waiver.

Another interesting case is Saddam Hussein, removed from the terrorist list in 1982 so that the Reagan administration could provide him with support for his invasion of Iran. The support continued well after the war ended. In 1989, President Bush even invited Iraqi nuclear engineers to the US for advanced training in weapons production – more information that must be kept from the eyes of the "ignorant and meddlesome outsiders."

One of the ugliest examples of the use of the terrorist list has to do with the tortured people of Somalia. Immediately after 11 September, the US closed down the Somali charitable network Al-Barakaat on grounds that it was financing terror. This achievement was hailed as one of the great successes of the "war on terror". In contrast, Washington's withdrawal of the charges as without merit a year later aroused little notice.

Al-Barakaat was responsible for about half the $500m in remittances to Somalia, "more than it earns from any other economic sector and 10 times the amount of foreign aid [Somalia] receives", a UN review determined. The charity also ran major businesses in Somalia, all destroyed. The leading academic scholar of Bush's "financial war on terror", Ibrahim Warde, concludes that apart from devastating the economy, this frivolous attack on a very fragile society "may have played a role in the rise ... of Islamic fundamentalists" – another familiar consequence of the "war on terror".

The very idea that the state should have the authority to make such judgments is a serious offense against the Charter of Liberties, as is the fact that it is considered uncontentious. If the charter's fall from grace continues on the path of the past few years, the future of rights and liberties looks dim.

Who will have the last laugh?


A few final words on the fate of the Charter of the Forest. Its goal was to protect the source of sustenance for the population, the commons, from external power – in the early days, royalty; over the years, enclosure and other forms of privatisation by predatory corporations and the state authorities who co-operate with them. These have only accelerated, and are properly rewarded. The damage is very broad.

If we listen to voices from the south today we can learn that "the conversion of public goods into private property through the privatisation of our otherwise commonly held natural environment is one way neoliberal institutions remove the fragile threads that hold African nations together. Politics today has been reduced to a lucrative venture where one looks out mainly for returns on investment rather than on what one can contribute to rebuild highly degraded environments, communities, and a nation. This is one of the benefits that structural adjustment programmes inflicted on the continent – the enthronement of corruption." I'm quoting Nigerian poet and activist Nnimmo Bassey, chair of Friends of the Earth International, in his searing expose of the ravaging of Africa's wealth, To Cook a Continent, the latest phase of the western torture of Africa.

Torture that has always been planned at the highest level, it should be recognised. At the end of the second world war, the US held a position of unprecedented global power. Not surprisingly, careful and sophisticated plans were developed about how to organise the world. Each region was assigned its "function" by state department planners, headed by the distinguished diplomat George Kennan. He determined that the US had no special interest in Africa, so it should be handed over to Europe to "exploit" – his word – for its reconstruction. In the light of history, one might have imagined a different relation between Europe and Africa, but there is no indication that that was ever considered.

More recently, the US has recognised that it, too, must join the game of exploiting Africa, along with new entries like China, which is busily at work compiling one of the worst records in destruction of the environment and oppression of the hapless victims.

It should be unnecessary to dwell on the extreme dangers posed by one central element of the predatory obsessions that are producing calamities all over the world: the reliance on fossil fuels, which courts global disaster, perhaps in the not-too-distant future. Details may be debated, but there is little serious doubt that the problems are serious, if not awesome, and that the longer we delay in addressing them, the more awful will be the legacy left to generations to come. There are some efforts to face reality, but they are far too minimal. The recent Rio+20 Conference opened with meagre aspirations and derisory outcomes.

Meanwhile, power concentrations are charging in the opposite direction, led by the richest and most powerful country in world history. Congressional Republicans are dismantling the limited environmental protections initiated by Richard Nixon, who would be something of a dangerous radical in today's political scene. The major business lobbies openly announce their propaganda campaigns to convince the public that there is no need for undue concern – with some effect, as polls show.

The media co-operates by not even reporting the increasingly dire forecasts of international agencies and even the US Department of Energy. The standard presentation is a debate between alarmists and sceptics: on one side virtually all qualified scientists, on the other a few holdouts. Not part of the debate are a very large number of experts, including the climate change programme at Massachusetts Institute of Technology among others, who criticise the scientific consensus because it is too conservative and cautious, arguing that the truth when it comes to climate change is far more dire. Not surprisingly, the public is confused.

In his State of the Union speech in January, Obama hailed the bright prospects of a century of energy self-sufficiency, thanks to new technologies that permit extraction of hydrocarbons from Canadian tar sands, shale and other previously inaccessible sources. Others agree. The Financial Times forecasts a century of energy independence for the US. The report does mention the destructive local impact of the new methods. Unasked in these optimistic forecasts is the question: what kind of a world will survive the rapacious onslaught?

In the lead in confronting the crisis throughout the world are indigenous communities, those who have always upheld the Charter of the Forest. The strongest stand has been taken by the one country they govern, Bolivia, the poorest country in South America and for centuries a victim of western destruction of the rich resources of what was, before Columbus, one of the most advanced societies in the hemisphere.

After the ignominious collapse of the Copenhagen global climate change summit in 2009, Bolivia organised a People's Summit with 35,000 participants from 140 countries – not just representatives of governments, but also civil society and activists. It produced a People's Agreement, which called for very sharp reduction in emissions, and a Universal Declaration on the Rights of Mother Earth. That is a key demand of indigenous communities all over the world. It is ridiculed by sophisticated westerners, but unless we can acquire some of their sensibility, they are likely to have the last laugh – a laugh of grim despair.

Sunday, September 8, 2013

The South killed the safety net




Europeans came to this country for a better life. It hasn't always meant helping the less fortunate





 


 
The South’s aversion both to taxes and to mandated government safety net structures had a long, and somewhat surprising, pedigree. In the late eighteenth century, popular radical writers such as Condorcet in France and Tom Paine in England had called for the creation of comprehensive social insurance systems based around universal pensions, child allowances, and education for all. Neither, however, managed to successfully alter prevailing political and moral doctrines. In France, after the frenzy of the revolutionary years the counterrevolution of the post-Napoleonic period put a halt to radical social experiments for decades. And in the United Kingdom, at least partially in response to the violence unleashed by revolutionaries in France, the early nineteenth century saw a tide of conservative reaction. Give money to the poor, the theory went, and you were encouraging indolence, dependency, and ultimately societal chaos. In 1834, after the publication of the Poor Law Report, “outdoor relief”—the giving of state moneys to the able-bodied poor in a non-workhouse context—was banned. For most of the rest of Queen Victoria’s near-seventy-year reign, “the great unwashed” were either left to find their own ways through terrain of hunger, homelessness, and disease, or were corralled into the sorts of ghastly workhouse settings made infamous by the writings of Charles Dickens.

In America, the South in particular took the Victorian lesson to heart, though to a lesser degree so too did the rest of the country. As did most of Europe. After all, Great Britain was the dominant power of the age, its economic prescriptions as hard to avoid as, say, the Washington consensus’s emphasis on opening up markets to international trade, privatizing public services, and deregulation a century and a half later. Coercive poor law politics, shaped around workhouses, poor houses, and other near-prison-like conditions for confining and attending to the subsistence needs of the poor was, as a consequence, the dominant response to poverty on both sides of the Atlantic throughout the middle decades of the nineteenth century.


A couple generations later, however, as the rise of industrial societies in Europe created huge economic dislocations and massive political unrest, Europe revisited the issue. Between 1883 and 1889, Otto von Bismarck’s Germany created a slew of social insurance programs. In England, at about the same time, social reformers such as Arnold Toynbee began calling for the creation of a government-funded safety net. And in 1908, Parliament passed the Old Age Pensions Act—part of a two-year spasm of social reform that culminated in the fabled People’s Budget of 1909. French reformers preached voluntary mutual assistance schemes and increasingly urged that the government fund universal assistance programs out of a general tax base. After decades of agitation, the French Parliament enacted a state pension system in 1910.

In the United States, though, support for such reforms remained more tenuous.

True, an array of progressive political groups supported workers’ compensation laws by the early twentieth century. And by 1917, with the Supreme Court having upheld the constitutionality of these laws, thirty-seven states had systems in place, most of them compulsory. In fact, as a region, only the Deep South had completely neglected to implement compensation schemes for at least some categories of injured workers. But in contrast to this, enthusiasm for social insurance systems didn’t take off prior to World War I. True, several states in America created their own very limited pension plans during these years, especially for widows and for teachers—who at the time were mainly women—and several also seeded their own unemployment insurance systems. Yet not until the New Deal did the idea of a federal system gain traction. Before then, even the American Federation of Labor and the left-wing Nation magazine opposed mandatory Social Security. Hence the paradoxical fact that when, in 1912, Teddy Roosevelt’s Progressives came out in support of social insurance, including a form of compulsory medical insurance, an alliance of conservatives, socialists, trade unionists, and federalists combined to defeat it. Opponents argued that the imposition of mandates on working Americans, forcing them to pay into a system to support the elderly and to provide medical coverage for the sick, was foreign to the country’s founding principles. What was happening in Europe was, they argued, too paternalistic, too coercive. Moreover, in a land of great social mobility and endless opportunity such systems were unnecessary. Keep them for the ossified Old World—keep them for places where one’s station in life was determined by one’s parentage.

Thus, while America’s peer nations across the Atlantic were experimenting with forms of universal old age pensions and healthcare coverage—to tackle the extremes of poverty that had driven so many Europeans to migrate to the United States in search of higher living standards—when it came to the creation of nationwide safety net protections America stalled. Hobbled by the South’s antipathy to any form of welfare, and by a broader national reluctance to corral citizens into insurance programs against their will, advocates for the sorts of reforms occurring in Europe ran up against a brick wall. It would take the Great Depression, and the collapse of both the working and the middle classes’ sense of stability and burgeoning economic possibility, to shift public opinion behind the establishment of Social Security and government aid in the arena of housing and employment. In fact, it wasn’t until 1935, six years after Wall Street’s catastrophic collapse, that Congress legislated into being Social Security, disability and unemployment insurance, and Aid to Families with Dependent Children. And it was not until 1937 that Congress would take the lead on funding large-scale public housing.

As for healthcare reform, long a holy grail of social reformers, attempts by Franklin Roosevelt before World War II and Harry Truman at the end of the war to create universal healthcare foundered on the rocks of opposition from the American Medical Association, as well as more general hostility from the same political wellspring that had opposed Social Security’s creation. Truman’s proposal for a 4 percent payroll tax to cover a national health insurance system, which he proposed in a special message to Congress on November 19, 1945, was denounced as being an attempt to “socialize medicine.” It was a critique that would crop up repeatedly over the decades, when Presidents Truman, Kennedy, Johnson, Nixon, Carter, Clinton, and finally Obama proposed significant overhauls to the country’s dysfunctional and inequitable healthcare systems. Ultimately, it would take the upheavals of the 1960s to partially get around this critique and pave the way for Medicaid and Medicare—though in the case of Medicaid, Congress gave the states considerable leeway as to whom they covered and what services they provided. And it would take the 2008 financial collapse to create just about enough momentum for President Obama to get Congress to pass a watered-down version of universal healthcare. Even then, the backlash was massive, the acrimonious debate creating a climate in which the conservative Tea Party movement could flourish.

So, too, it would take the calamity of 1930s-era deflation, and the threat of wholesale bankruptcy for America’s millions of farmers for the federal government, at the urging of Secretary of Agriculture Henry Wallace, to start providing food aid to the country’s hungry. It would take the work of Michael Harrington and others a generation later to prod Washington to set up a national food stamps system and then to expand vital nutritional programs such as school breakfasts and lunches and WIC, the Women, Infants, and Children program run by the U.S. Department of Agriculture.

At first, this food assistance to the poor was, essentially, a way of propping up agricultural prices by having local counties buy up surplus crops and thus preserve market buoyancy. “We must adjust downward our surplus supplies until domestic and foreign markets can be restored,” Wallace, who had grown up on a farm in rural Iowa, declared in his first radio address on March 10, 1933. The strategy worked for the farmers, and to a degree it alleviated hunger for a portion of the indigent population. It was, however, massively incomplete. Too few people received the aid; too many regions and categories of poor were left unhelped.

In 1961, a generation later, newly inaugurated president John F. Kennedy signed an executive order creating a pilot food stamp program, funded by the federal government, in eighteen states. After Kennedy’s death, President Johnson pushed Congress to pass a Food Stamp Act that allowed counties across the country to choose to opt into the food stamp system, with poor families paying a percentage of their income to access food stamps worth considerably more—the stamps, in other words, were not free but were heavily subsidized. In the late 1960s, the voluntary nature of the system began to be phased out; over several years leading up to 1974, all counties were required to join the system. By the mid-1970s, food assistance paid for by the feds, to the tune of billions of dollars a year, had become the country’s single most effective intervention against poverty; by the end of the decade, food stamps alone had expanded to the point that the program was costing the government more than $20 billion (in 2012 dollar values) annually.

In fact, even while President Richard Nixon rhetorically tilted rightward—talking of a Silent Majority enraged by lax criminal justice codes, a mollycoddling welfare state, and the presence of a seemingly permanent underclass—in reality he presided over significant expansions of the welfare state, especially when it came to antihunger programs. He also sought to create a universal healthcare system that in many of its particulars looked strikingly like the one ultimately implemented under President Obama nearly four decades later.

Three years after Nixon left office, Congress eliminated the requirement that poor families had to buy into the food stamp program; thenceforth, the cost of the food stamps was fully borne by the federal government. “The most important accomplishment of this period was the elimination of purchase prices as a barrier to participation,” wrote the social scientist Dennis Roth in a history of the food stamp program commissioned by the Economic Research Service of the U.S. Department of Agriculture.

[The achievements of the War on Poverty, however, rapidly came under fire.] From the 1970s on, as misery and hardship stubbornly refused to vanish from the national landscape, America’s commitment both to reducing income inequality and to mitigating the effects of that inequality began to wane. Both rhetorically and in terms of practical policies, America’s leadership class began a long march away from redistributive liberalism. Tax policy became more regressive. And the tax code came overwhelmingly to benefit the wealthiest Americans, with a falloff in the progressive nature of the tax bands and the near-complete emasculation of the estate tax system. Huge companies such as GE found ways to avoid paying any corporate taxes, and billionaires such as Warren Buffett, who made most of his money from capital gains, ended up paying a smaller percentage to the government in taxes than did their secretaries.

Cumulatively, programs and wage protections developed over the better part of a century came under tremendous pressure as the great unraveling of public infrastructure picked up pace.


Excerpted from “The American Way of Poverty: How the Other Half Still Lives” by Sasha Abramsky. Reprinted with permission from Nation Books.

Sasha Abramsky is a Nation magazine contributing writer, and a senior fellow at Demos. He is the author, most recently, of "Inside Obama’s Brain." He lives in Sacramento, California.

Tuesday, August 20, 2013

NSA Transparent? Not!




      

What NSA "Transparency" Looks Like

Take a look at what information about rule violations the agency itself released.

 
 
 

Last week, the Washington Post  published an internal audit finding the NSA had violated privacy rules thousands of times in recent years.

In response, the spy agency held a rare conference call for the press maintaining that the violations are “not willful” and “not malicious.”
 
 
It’s difficult to fully evaluate the NSA’s track record, since the agency has been so tight-lipped on the topic.
What information about rule violations has the agency itself released?  Take a look:


NOTE: CLICK HERE FOR THE REPORT. 

That is the publicly released version of a semiannual report from the administration to Congress describing NSA violations of rules surrounding the  FISA Amendments Act. The act is one of the key laws governing NSA surveillance, including now-famous programs like Prism.

As an oversight measure,  the law requires the attorney general to submit semiannual reports to the congressional intelligence and judiciary committees.

The section with the redactions above is titled “Statistical Data Relating to Compliance Incidents.”

One of the only unredacted portions reads, “The value of statistical information in assessing compliance in situations such as this is unclear. A single incident, for example, may have broad ramifications. Multiple incidents may increase the incident count, but may be deemed of very limited significance.”

The document, dated May 2010, was released after the ACLU filed a  freedom of information lawsuit.  

As the Post  noted, members of Congress can read the unredacted version of the semiannual reports, but only in a special secure room. They cannot take notes or publicly discuss what they read.


For more on the NSA, see our story on how the agency says it can't search its own emails, and what we know about the agency's tapping of Internet cables.

Monday, August 19, 2013

Chomsky: The U.S. behaves nothing like a democracy

SALON





Saturday, Aug 17, 2013 08:30 AM EDT
                       


The MIT professor lays out how the majority of U.S. policies are opposed to what wide swaths of the public want







Topics: Democracy, United States, Obama, President Obama, Capitalism, Terrorism, Media, Editor's Picks
                      
 
The following is a transcript of a recent speech delivered by Noam Chomsky at the DW Global Media Forum in Bonn, Germany. It was previously published at Alternet.
 
                                   
I’d like to comment on topics that I think should regularly be on the front pages but are not — and in many crucial cases are scarcely mentioned at all or are presented in ways that seem to me deceptive because they’re framed almost reflexively in terms of doctrines of the powerful.
 
In these comments I’ll focus primarily on the United States for several reasons: One, it’s the most important country in terms of its power and influence. Second, it’s the most advanced – not in its inherent character, but in the sense that because of its power, other societies tend to move in that direction. The third reason is just that I know it better. But I think what I say generalizes much more widely – at least to my knowledge, obviously there are some variations. So I’ll be concerned then with tendencies in American society and what they portend for the world, given American power.

American power is diminishing, as it has been in fact since its peak in 1945, but it’s still incomparable. And it’s dangerous. Obama’s remarkable global terror campaign and the limited, pathetic reaction to it in the West is one shocking example. And it is a campaign of international terrorism – by far the most extreme in the world. Those who harbor any doubts on that should read the report issued by Stanford University and New York University, and actually I’ll return to even more serious examples than international terrorism.
 
According to received doctrine, we live in capitalist democracies, which are the best possible system, despite some flaws. There’s been an interesting debate over the years about the relation between capitalism and democracy, for example, are they even compatible? I won’t be pursuing this because I’d like to discuss a different system – what we could call the “really existing capitalist democracy”, RECD for short, pronounced “wrecked” by accident. To begin with, how does RECD compare with democracy? Well that depends on what we mean by “democracy”. There are several versions of this. One, there is a kind of received version. It’s soaring rhetoric of the Obama variety, patriotic speeches, what children are taught in school, and so on. In the U.S. version, it’s government “of, by and for the people”. And it’s quite easy to compare that with RECD.
 
In the United States, one of the main topics of academic political science is the study of attitudes and policy and their correlation. The study of attitudes is reasonably easy in the United States: heavily-polled society, pretty serious and accurate polls, and policy you can see, and you can compare them. And the results are interesting. In the work that’s essentially the gold standard in the field, it’s concluded that for roughly 70% of the population – the lower 70% on the wealth/income scale – they have no influence on policy whatsoever. They’re effectively disenfranchised. As you move up the wealth/income ladder, you get a little bit more influence on policy. When you get to the top, which is maybe a tenth of one percent, people essentially get what they want, i.e. they determine the policy. So the proper term for that is not democracy; it’s plutocracy.

Inquiries of this kind turn out to be dangerous stuff because they can tell people too much about the nature of the society in which they live. So fortunately, Congress has banned funding for them, so we won’t have to worry about them in the future.

These characteristics of RECD show up all the time. So the major domestic issue in the United States for the public is jobs. Polls show that very clearly. For the very wealthy and the financial institutions, the major issue is the deficit. Well, what about policy? There’s now a sequester in the United States, a sharp cutback in funds. Is that because of jobs or is it because of the deficit? Well, the deficit.

Europe, incidentally, is much worse – so outlandish that even The Wall Street Journal has been appalled by the disappearance of democracy in Europe. …[I]t had an article [this year] which concluded that “the French, the Spanish, the Irish, the Dutch, Portuguese, Greeks, Slovenians, Slovakians and Cypriots have to varying degrees voted against the currency bloc’s economic model since the crisis began three years ago. Yet economic policies have changed little in response to one electoral defeat after another. The left has replaced the right; the right has ousted the left. Even the center-right trounced Communists (in Cyprus) – but the economic policies have essentially remained the same: governments will continue to cut spending and raise taxes.” It doesn’t matter what people think and “national governments must follow macro-economic directives set by the European Commission”. Elections are close to meaningless, very much as in Third World countries that are ruled by the international financial institutions. That’s what Europe has chosen to become. It doesn’t have to.

Returning to the United States, where the situation is not quite that bad, there’s the same disparity between public opinion and policy on a very wide range of issues. Take for example the minimum wage. One view is that the minimum wage ought to be indexed to the cost of living and high enough to prevent falling below the poverty line. Eighty percent of the public support that, and forty percent of the wealthy. What’s the minimum wage? Going down, way below these levels. It’s the same with laws that facilitate union activity: strongly supported by the public; opposed by the very wealthy – disappearing. The same is true of national healthcare. The U.S., as you may know, has a health system that is an international scandal: it has twice the per capita costs of other OECD countries and relatively poor outcomes, and it is the only privatized, pretty much unregulated system. The public doesn’t like it. They’ve been calling for national healthcare, public options, for years, but the financial institutions think it’s fine, so it stays: stasis. In fact, if the United States had a healthcare system like comparable countries there wouldn’t be any deficit. The famous deficit would be erased – which doesn’t matter that much anyway.

One of the most interesting cases has to do with taxes. For 35 years there have been polls asking what people think taxes ought to be. Large majorities have held that corporations and the wealthy should pay higher taxes. Those taxes have steadily been going down throughout this period.

On and on, the policy throughout is almost the opposite of public opinion, which is a typical property of RECD.
In the past, the United States has sometimes, kind of sardonically, been described as a one-party state: the business party, with two factions called Democrats and Republicans. That’s no longer true. It’s still a one-party state – the business party – but it only has one faction. The faction is moderate Republicans, who are now called Democrats. There are virtually no moderate Republicans in what’s called the Republican Party and virtually no liberal Democrats in what’s called the Democratic Party. It’s basically a party of what would once have been moderate Republicans. By today’s standards, Richard Nixon would be way at the left of the political spectrum, and Eisenhower would be in outer space.

There is still something called the Republican Party, but it long ago abandoned any pretence of being a normal parliamentary party. It’s in lock-step service to the very rich and the corporate sector and has a catechism that everyone has to chant in unison, kind of like the old Communist Party. The distinguished conservative commentator, one of the most respected – Norman Ornstein – describes today’s Republican Party as, in his words, “a radical insurgency – ideologically extreme, scornful of facts and compromise, dismissive of its political opposition” – a serious danger to the society, as he points out.

In short, Really Existing Capitalist Democracy is very remote from the soaring rhetoric about democracy. But there is another version of democracy. Actually it’s the standard doctrine of progressive, contemporary democratic theory. So I’ll give some illustrative quotes from leading figures – incidentally not figures on the right. These are all good Woodrow Wilson-FDR-Kennedy liberals, mainstream ones in fact. So according to this version of democracy, “the public are ignorant and meddlesome outsiders. They have to be put in their place. Decisions must be in the hands of an intelligent minority of responsible men, who have to be protected from the trampling and roar of the bewildered herd”. The herd has a function, as it’s called. They’re supposed to lend their weight every few years, to a choice among the responsible men. But apart from that, their function is to be “spectators, not participants in action” – and it’s for their own good. Because as the founder of liberal political science pointed out, we should not succumb to “democratic dogmatisms about people being the best judges of their own interest”. They’re not. We’re the best judges, so it would be irresponsible to let them make choices just as it would be irresponsible to let a three-year-old run into the street. Attitudes and opinions therefore have to be controlled for the benefit of those you are controlling. It’s necessary to “regiment their minds”. It’s necessary also to discipline the institutions responsible for the “indoctrination of the young.” All quotes, incidentally. And if we can do this, we might be able to get back to the good old days when “Truman had been able to govern the country with the cooperation of a relatively small number of Wall Street lawyers and bankers.” This is all from icons of the liberal establishment, the leading progressive democratic theorists. Some of you may recognize some of the quotes.

The roots of these attitudes go back quite far. They go back to the first stirrings of modern democracy. The first were in England in the 17th century; later, as you know, in the United States. And they persist in fundamental ways. The first democratic revolution was England in the 1640s. There was a civil war between king and parliament. But the gentry, the people who called themselves “the men of best quality”, were appalled by the rising popular forces that were beginning to appear in the public arena. They didn’t want to support either king or parliament. To quote their pamphlets, they didn’t want to be ruled by “knights and gentlemen, who do but oppress us”; rather, “we want to be governed by countrymen like ourselves, who know the people’s sores”. That was a pretty terrifying sight, and the rabble has been a pretty terrifying sight ever since – actually, long before as well. It remained so a century after the British democratic revolution. The founders of the American republic had pretty much the same view of the rabble. So they determined that “power must be in the hands of the wealth of the nation, the more responsible set of men. Those who have sympathy for property owners and their rights” – and of course for slave owners at the time. In general, men who understand that a fundamental task of government is “to protect the minority of the opulent from the majority”. Those are quotes from James Madison, the main framer – this was in the Constitutional Convention, which is much more revealing than the Federalist Papers that people read. The Federalist Papers were basically a propaganda effort to try to get the public to go along with the system; the debates in the Constitutional Convention are much more revealing. And in fact the constitutional system was created on that basis. I don’t have time to go through it, but it basically adhered to the principle enunciated simply by John Jay, the president of the Continental Congress and then first Chief Justice of the Supreme Court: “those who own the country ought to govern it”. That’s the primary doctrine of RECD to the present.

There’ve been many popular struggles since – and they’ve won many victories. The masters, however, do not relent. The more freedom is won, the more intense are the efforts to redirect the society to a proper course. And the 20th-century progressive democratic theory that I’ve just sampled is not very different from the RECD that has been achieved, apart from the question of which responsible men should rule. Should it be bankers or intellectual elites? Or, for that matter, the Central Committee in a different version of similar doctrines?

Well, another important feature of RECD is that the public must be kept in the dark about what is happening to them. The “herd” must remain “bewildered”. The reasons were explained lucidly by the professor of the science of government at Harvard – that’s the official name – another respected liberal figure, Samuel Huntington. As he pointed out, “power remains strong when it remains in the dark. Exposed to sunlight, it begins to evaporate”. Bradley Manning is facing life in prison for failure to comprehend this scientific principle, and now Edward Snowden as well. And it works pretty well. A look at the polls reveals how well. For example, recent polls pretty consistently reveal that Republicans are preferred to Democrats on most issues – crucially, on issues where the public opposes the policies of the Republicans and favors the policies of the Democrats. One striking example is that majorities say they favor the Republicans on tax policy, while the same majorities oppose those policies. This runs across the board. It’s even true of the far right, the Tea Party types. It goes along with an astonishing level of contempt for government: favorable opinions about Congress are literally in the single digits, and the rest of the government fares little better. It’s all declining sharply.

Results such as these, which are pretty consistent, illustrate a demoralization of the public of a kind that’s unusual, although there are examples – the late Weimar Republic comes to mind. The task of ensuring that the rabble keep to their function as bewildered spectators takes many forms. The simplest is simply to restrict entry into the political system. Iran just had an election, as you know. It was rightly criticized on the grounds that even to participate, you had to be vetted by the guardian council of clerics. In the United States, you don’t have to be vetted by clerics; rather, you have to be vetted by concentrations of private capital. Unless you pass their filter, you don’t enter the political system – with very rare exceptions.

There are many mechanisms, too familiar to review, but that’s not safe enough either. There are major institutions that are specifically dedicated to undermining authentic democracy. One of them is called the public relations industry. A huge industry, it was in fact developed on the principle that it’s necessary to regiment the minds of men, much as an army regiments its soldiers – I was actually quoting from one of its leading figures before.

The role of the PR industry in elections is explicitly to undermine the school-child version of democracy. What you learn in school is that democracies are based on informed voters making rational decisions. All you have to do is look at an electoral campaign run by the PR industry to see that the purpose is to create uninformed voters who will make irrational decisions. For the PR industry that’s a very easy transition from its primary function, which is commercial advertising. Commercial advertising is designed to undermine markets. If you took an economics course, you learned that markets are based on informed consumers making rational choices. If you turn on the TV set, you see that ads are designed to create uninformed consumers who will make irrational choices. The whole purpose is to undermine markets in the technical sense.

They’re well aware of it, incidentally. For example, a couple of months after Obama’s election in 2008, the advertising industry held its annual conference. Every year they award a prize for the best marketing campaign of the year. That year they awarded it to Obama. He beat out Apple Computer – did an even better job of deluding the public, or his PR agents did. If you want to hear some of it, turn on the television today and listen to the soaring rhetoric at the G-8 Summit in Belfast. It’s standard.

There was interesting commentary on this in the business press, primarily the London Financial Times, which had a long article interviewing executives about what they thought of the election. They were quite euphoric about it. They said it gives them a new model for how to delude the public: the Obama model could replace the Reagan model, which worked pretty well for a while.

Turning to the economy: the core of the economy today is financial institutions. They’ve vastly expanded since the 1970s, along with a parallel development – the accelerated shift of production abroad. There have also been critical changes in the character of financial institutions.

If you go back to the 1960s, banks were banks. If you had some money, you put it in the bank, and the bank lent it to somebody to buy a house or start a business, or whatever. That’s now a very marginal aspect of financial institutions. They’re mostly devoted to intricate, exotic manipulations of markets. And they’re huge. In the United States, financial institutions – big banks mostly – took 40% of corporate profit in 2007. That was on the eve of the financial crisis, for which they were largely responsible. After the crisis, a number of professional economists – Nobel laureate Robert Solow, Harvard’s Benjamin Friedman – wrote articles in which they pointed out that economists haven’t done much study of the impact of the financial institutions on the economy, which is kind of remarkable considering their scale. But after the crisis they took a look, and they both concluded that the impact of the financial institutions on the economy is probably negative. Actually there are some who are much more outspoken than that. The most respected financial correspondent in the English-speaking world is Martin Wolf of the Financial Times. He writes that the “out-of-control financial sector is eating out the modern market economy from the inside, just as the larva of the spider wasp eats out the host in which it has been laid”. By “the market economy” he means the productive economy.

A recent issue of the main business weekly, Bloomberg Businessweek, reported a study by the IMF which found that the largest banks make no profit. What they earn, according to the IMF analysis, traces to the government insurance policy – the so-called too-big-to-fail policy. There is the widely publicized bailout, but that’s the least of it. There’s a whole series of other devices by which the government insurance policy aids the big banks: cheap credit and many other things. And according to the IMF at least, that’s the totality of their profit. The editors of the journal say this is crucial to understanding why the big banks present such a threat to the global economy – and to the people of the country, of course.

After the crash, there was the first serious attention by professional economists to what’s called systemic risk. They knew it existed, but it wasn’t much of a topic of investigation. ‘Systemic risk’ means the risk that if a transaction fails, the whole system may collapse. That’s what’s called an externality in economic theory – a footnote. And externalities are one of the fundamental, well-known, inherent flaws of market systems: every transaction has impacts on others which just aren’t taken into account in the market transaction. Systemic risk is a big one. And there are much more serious illustrations than that. I’ll come back to it.

What about the productive economy under RECD? Here there’s a mantra too: entrepreneurial initiative and consumer choice in a free market. And there are agreements, called free-trade agreements, which are based on the mantra. That’s all mythology.

The reality is that there is massive state intervention in the productive economy, and the free-trade agreements are anything but free-trade agreements. That should be obvious. Just to take one example: the information technology (IT) revolution, which is driving the economy, was based on decades of work in what was effectively the state sector – hard, costly, creative work, substantially in the state sector, with no consumer choice at all. There was entrepreneurial initiative, but it was largely limited to getting government grants or bailouts or procurement. Procurement is underestimated, except by some economists, but it’s a very significant factor in corporate profit. If you can’t sell something, hand it over to the government. They’ll buy it.

After a long period – decades in fact – of hard, creative work on the primary research and development, the results are handed over to private enterprise for commercialization and profit. That’s Steve Jobs and Bill Gates and so on. It’s not quite that simple of course, but that’s a core part of the picture. The system goes way back to the origins of industrial economies, and it has been dramatically true since WWII. This ought to be the core of the study of the productive economy.

Another central aspect of RECD is concentration of capital. In just the past 20 years in the United States, the share of profits of the two hundred largest enterprises has risen very sharply, probably reflecting the impact of the Internet. These tendencies towards oligopoly also undermine the mantra, of course. Interesting topics, but I won’t pursue them any further.

Instead, I’d like to turn to another question: what are the prospects for the future under RECD? There’s an answer: they’re pretty grim. It’s no secret that there are a number of dark shadows hovering over every topic we discuss, and two are particularly ominous, so I’ll keep to those, though there are others. One is environmental catastrophe. The other is nuclear war. Both of course threaten the prospects for decent survival, and not in the remote future.

I won’t say very much about the first, environmental catastrophe. That should be obvious. Certainly the scale of the danger should be obvious to anyone with eyes open, anyone who is literate, particularly those who read scientific journals. Virtually every issue of a technical journal has more dire warnings than the last.

There are various reactions to this around the world. Some seek to act decisively to prevent possible catastrophe. At the other extreme, major efforts are underway to accelerate the danger. Leading the effort to intensify the likely disaster is the richest and most powerful country in world history, with incomparable advantages – and the most prominent example of RECD, the one that others are striving towards.

Leading the efforts to preserve conditions in which our immediate descendants might have a decent life are the so-called “primitive” societies: First Nations in Canada, Aboriginal societies in Australia, tribal societies and others like them. The countries with large and influential indigenous populations are well in the lead in the effort to “defend the Earth” – that’s their phrase. The countries that have driven their indigenous populations to extinction or extreme marginalization are racing forward enthusiastically towards destruction. This is one of the major features of contemporary history, one of those things that ought to be on front pages. Take Ecuador, which has a large indigenous population. It’s seeking aid from the rich countries to allow it to keep its substantial hydrocarbon reserves underground, which is where they ought to be. Meanwhile, the U.S. and Canada are enthusiastically seeking to burn every drop of fossil fuel, including the most dangerous kind – Canadian tar sands – and to do so as quickly and fully as possible, without a side glance at what the world might look like after this extravagant commitment to self-destruction. Actually, every issue of the daily papers suffices to illustrate this lunacy. And lunacy is the right word for it. It’s exactly the opposite of what rationality would demand – unless it’s the skewed rationality of RECD.

Well, there have been massive corporate campaigns to implant and safeguard the lunacy. But despite them, there’s still a real problem in American society: the public is still too committed to scientific rationality. One of the many divergences between policy and opinion is that the American public is close to the global norm in its concern about the environment and in calling for action to prevent the catastrophe – and that’s a pretty high level of concern. Meanwhile, bipartisan policy is dedicated to ‘bringing it on’, in a phrase that George W. Bush made famous in the case of Iraq. Fortunately, the corporate sector is riding to the rescue to deal with this problem. There is a corporate-funded organization, the American Legislative Exchange Council (ALEC), which designs legislation for states. No need to comment on what kind of legislation. They’ve got a lot of clout and money behind them, so the programs tend to get instituted. Right now they’re instituting a new program to try to overcome the excessive rationality of the public: a program of instruction for K-12 (kindergarten to 12th grade in schools). Its publicity says that the idea is to improve critical faculties – I’d certainly be in favor of that – by ‘balanced teaching’. ‘Balanced teaching’ means that if a sixth-grade class learns something about what’s happening to the climate, it has to be presented with material on climate change denial, so that the pupils have balanced teaching and can develop their critical faculties. Maybe that will help overcome the failure of massive corporate propaganda campaigns to make the population ignorant and irrational enough to safeguard short-term profit for the rich. That is precisely the goal, and several states have already adopted it.

Well, it’s worth remembering, without pursuing it, that these are deep-seated institutional properties of RECD. They’re not easy to uproot. All of this is apart from the institutional necessity to maximize short-term profit while ignoring an externality that’s vastly more serious even than systemic risk. In the case of systemic risk, the culprits in the market failure can run, cap in hand, to the powerful nanny state that they foster, and they’ll be bailed out, as we’ve just observed again and will observe in the future. In the case of the destruction of the environment – the conditions for decent existence – there’s no guardian angel around, nobody to run to with cap in hand. For that reason alone, the prospects for decent survival under RECD are quite dim.

Let’s turn to the other shadow: nuclear war. It’s a threat that’s been with us for 70 years. It still is, and in some ways it’s growing. One of the reasons is that under RECD, the rights and needs of the general population are a minor matter. That extends to security. There is another prevailing mantra, particularly in the academic professions, claiming that governments seek to protect national security. Anyone who has studied international relations theory has heard that. It’s mostly mythology. Governments seek to extend power and domination and to benefit their primary domestic constituencies – in the U.S., primarily the corporate sector. The consequence is that security does not have a high priority. We see that all the time – right now, in fact. Take, say, Obama’s operation to murder Osama Bin Laden, the prime suspect for the 9/11 attack. Obama made an important speech on national security last May 23rd. It was widely covered, but there was one crucial paragraph in the speech that was ignored in the coverage. Obama hailed the operation, took pride in it – an operation which incidentally is another step in dismantling the foundations of Anglo-American law, back to the Magna Carta, namely the presumption of innocence. But that’s by now so familiar, it’s not even necessary to talk about it. There’s more to it. Obama did hail the operation, but he added that it “cannot be the norm”. The reason is that “the risks were immense”. The Navy SEALs who carried out the murder might have been embroiled in an extended firefight, and even though by luck that didn’t happen, “the cost to our relationship with Pakistan – and the backlash of the Pakistani public over the encroachment on their territory”, the aggression in other words, “was so severe that we’re just now beginning to rebuild this important partnership”.

It’s more than that. Let’s add a couple of details. The SEALs were under orders to fight their way out if they were apprehended. They would not have been left to their fate if they had, in Obama’s words, been “embroiled in an extended firefight”. The full force of the U.S. military would have been employed to extricate them. Pakistan has a powerful military. It’s well-trained and highly protective of state sovereignty. And of course it has nuclear weapons. Leading Pakistani specialists on nuclear policy are quite concerned about the exposure of the nuclear weapons system to jihadi elements. It could have escalated to a nuclear war, and in fact it came pretty close. While the SEALs were still inside the Bin Laden compound, the Pakistani chief of staff, General Kayani, was informed of the invasion and ordered his staff, in his words, to “confront any unidentified aircraft”. He assumed it was probably coming from India. Meanwhile in Kabul, General David Petraeus, head of the Central Command, ordered “U.S. warplanes to respond if Pakistanis scrambled their fighter jets”. It was that close. Going back to Obama: “by luck” it didn’t happen. But the risk was faced without noticeable concern – without even reporting, in fact.

There’s a lot more to say about that operation and its immense cost to Pakistan, but instead let’s look more closely at the concern for security more generally – beginning with security from terror, and then turning to the much more important question of security from instant destruction by nuclear weapons.

As I mentioned, Obama’s now conducting the world’s greatest international terrorist campaign – the drones and special forces campaign. It’s also a terror-generating campaign. The common understanding at the highest level [is] that these actions generate potential terrorists. I’ll quote General Stanley McChrystal, Petraeus’ predecessor. He says that “for every innocent person you kill”, and there are plenty of them, “you create ten new enemies”.

Take the marathon bombing in Boston a couple of months ago, which you all read about. You probably didn’t read that two days after the marathon bombing there was a drone bombing in Yemen. Usually we don’t happen to hear much about drone bombings. They just go on – straight terror operations which the media aren’t interested in, because we don’t care about international terrorism as long as the victims are somebody else. But this one we happened to know about by accident: a young man from the village that was attacked was in the United States, and he happened to testify before Congress. He said that for several years the jihadi elements in Yemen had been trying to turn the village against Americans, to get them to hate Americans. But the villagers didn’t accept it, because the only thing they knew about the United States was what he told them, and he liked the United States, so he was telling them it was a great place. So the jihadi efforts didn’t work. Then, he said, one drone attack turned the entire village into people who hate America and want to destroy it. The attack killed a man whom everybody knew and whom they could easily have apprehended if they’d wanted to. But in our international terror campaigns we don’t worry about that, and we don’t worry about security.

One of the striking examples was the invasion of Iraq. U.S. and British intelligence agencies informed their governments that the invasion of Iraq was likely to lead to an increase in terrorism. They didn’t care. In fact, it did: terrorism increased by a factor of seven in the first year after the invasion, according to government statistics. Right now the government is defending the massive surveillance operation – that’s on the front pages – on the grounds that we have to do it to apprehend terrorists.

If there were a free press – an authentic free press – the headlines would be ridiculing this claim on the grounds that policy is designed in such a way that it amplifies the terrorist risk. But you can’t find that, which is one of innumerable indications of how far we are from anything that might be called a free press.

Let’s turn to the more serious problem: instant destruction by nuclear weapons. That’s never been a high concern for state authorities. There are many striking examples. Actually, we know a lot about it, because the United States is an unusually free and open society and plenty of internal documents have been released. So we can find out about it if we like.

Let’s go back to 1950. In 1950, U.S. security was just overwhelming. There’d never been anything like it in human history. There was one potential danger: ICBMs with hydrogen bomb warheads. They didn’t exist yet, but they were going to exist sooner or later. The Russians knew that they were way behind in military technology. They offered the U.S. a treaty to ban the development of ICBMs with hydrogen bomb warheads. That would have been a terrific contribution to U.S. security. There is one major history of nuclear weapons policy, written by McGeorge Bundy, National Security Advisor for Kennedy and Johnson. In his study he has a couple of casual sentences on this. He said that he was unable to find even a staff paper discussing it. Here was a possibility to save the country from total disaster, and there wasn’t even a paper discussing it. No one cared. Forget it, we’ll go on to the important things.

A couple of years later, in 1952, Stalin made a public offer, which was pretty remarkable, to permit unification of Germany with internationally supervised free elections, in which the Communists would certainly lose, on one condition – that Germany be demilitarized. That’s hardly a minor issue for the Russians. Germany alone had practically destroyed them several times in the century. Germany militarized and part of a hostile Western alliance is a major threat. That was the offer.

The offer was public. It would of course also have ended the official reason for NATO. It was dismissed with ridicule – couldn’t be true. A few people took it seriously – James Warburg, a respected international commentator – but they too were just ridiculed. Today, scholars are looking back at it, especially with the Russian archives opening up, and they’re discovering that it was apparently serious. But nobody could pay attention to it at the time, because it didn’t accord with policy imperatives – the vast production of the threat of war.

Let’s go on a couple of years to the late ’50s, when Khrushchev took over. He realized that Russia was way behind economically and that it could not compete with the United States in military technology and still hope to carry out economic development, which is what he was hoping to do. So he offered a sharp mutual cutback in offensive weapons. The Eisenhower administration kind of dismissed it. The Kennedy administration listened; they considered the possibility and rejected it. Khrushchev went on to introduce a sharp unilateral reduction of offensive weapons. The Kennedy administration observed that and decided to expand offensive military capacity – not just to reject the offer, but to expand – even though it was already way ahead.

That was one reason why Khrushchev placed missiles in Cuba in 1962 to try to redress the balance slightly. That led to what historian Arthur Schlesinger – Kennedy’s advisor – called “the most dangerous moment in world history” – the Cuban missile crisis. Actually there was another reason for it: the Kennedy administration was carrying out a major terrorist operation against Cuba. Massive terrorism. It’s the kind of terrorism that the West doesn’t care about because somebody else is the victim. So it didn’t get reported, but it was large-scale. Furthermore, the terror operation – it was called Operation Mongoose – had a plan. It was to culminate in an American invasion in October 1962. The Russians and the Cubans may not have known all the details, but it’s likely that they knew this much. That was another reason for placing defensive missiles in Cuba.

Then came very tense weeks as you know. They culminated on October 26th. At that time, B-52s armed with nuclear weapons were ready to attack Moscow. The military instructions permitted crews to launch nuclear war without central control. It was decentralized command. Kennedy himself was leaning towards military action to eliminate the missiles from Cuba. His own, subjective estimate of the probability of nuclear war was between a third and a half. That would essentially have wiped out the Northern Hemisphere, according to President Eisenhower.

At that point, on October 26th, the letter came from Khrushchev to Kennedy offering to end the crisis. How? By withdrawal of Russian missiles from Cuba in return for withdrawal of U.S. missiles from Turkey. Kennedy in fact didn’t even know there were missiles in Turkey. But he was informed of that by his advisors. One of the reasons he didn’t know is that they were obsolete and they were being withdrawn anyway. They were being replaced with far more lethal invulnerable Polaris submarines. So that was the offer: the Russians withdraw missiles from Cuba; the U.S. publicly withdraw obsolete missiles that it’s already withdrawing from Turkey, which of course are a much greater threat to Russia than the missiles were in Cuba.

Kennedy refused. That’s probably the most horrendous decision in human history, in my opinion. He was taking a huge risk of destroying the world in order to establish a principle: the principle that we have the right to threaten anyone with destruction any way we like – but it’s a unilateral right. No one may threaten us, even to try to deter a planned invasion. Much worse than this is the lesson that has been taken away: Kennedy is praised for his cool courage under pressure. That’s the standard version today.

The threats continued. Ten years later, Henry Kissinger called a nuclear alert. 1973. The purpose was to warn the Russians not to intervene in the Israel-Arab conflict. What had happened was that Russia and the United States had agreed to institute a ceasefire. But Kissinger had privately informed Israel that they didn’t have to pay any attention to it; they could keep going. Kissinger didn’t want the Russians to interfere so he called a nuclear alert.

Going on another ten years, Ronald Reagan is in office. His administration decided to probe Russian defenses by simulating air and naval attacks – air attacks into Russia and naval attacks on its border. Naturally this caused considerable alarm in Russia, which, unlike the United States, is quite vulnerable and had repeatedly been invaded and virtually destroyed. That led to a major war scare in 1983. Newly released archives tell us how dangerous it was – much more dangerous than historians had assumed. A recent CIA study, entitled “The War Scare Was for Real”, concludes that it was close to nuclear war: U.S. intelligence underestimated the danger of a Russian preventative nuclear strike, launched out of fear that the U.S. was about to attack. The most recent issue of the Journal of Strategic Studies – one of the main journals – writes that this almost became a prelude to a preventative nuclear strike. And it continues. I won’t go through the details, but the Bin Laden assassination is a recent case.

There are now three new threats on the front pages right now: North Korea, Iran, China. I’ll try to be brief, but they’re worth looking at.

North Korea has been issuing wild, dangerous threats. That’s attributed to the lunacy of its leaders. It could be argued that it’s the most dangerous, craziest government in the world, and the worst government. It’s probably true. But if we want to reduce the threats instead of marching blindly in unison, there are a few things to consider. One of them is that the current crisis began with U.S.-South Korean war games, which included, for the first time ever, a simulation of a preemptive attack in an all-out war scenario against North Korea. Part of these exercises was the simulated nuclear bombing of targets on the borders of North Korea.

That brings up some memories for the North Korean leadership. For example, they can remember that 60 years ago there was a superpower that virtually leveled the entire country, and when there was nothing left to bomb, the United States turned to bombing dams. Some of you may recall that you could get the death penalty for that at Nuremberg – it’s a war crime. Even if Western intellectuals and the media choose to ignore the documents, the North Korean leadership can read the public documents, the official Air Force reports of the time, which are worth reading; I encourage you to read them. They exulted over the glorious sight of massive floods “that scooped clear 27 miles of valley below”, devastated 75% of the controlled water supply for North Korea’s rice production, sent the commissars scurrying to the press and radio centers to blare to the world the most severe, hate-filled harangues to come from the Communist propaganda mill in the three years of warfare. To the communists, the smashing of the dams meant primarily the destruction of their chief sustenance: rice. Westerners can little conceive the awesome meaning which the loss of this staple food commodity has for Asians – starvation and slow death. Hence the show of rage, the flare of violent tempers and the threats of reprisals when bombs fell on five irrigation dams. Mostly quotes.

Like other potential targets, the crazed North Korean leaders can also read high-level documents, public and declassified, which outline U.S. strategic doctrine. One of the most important is a study by Clinton’s strategic command, STRATCOM, on the role of nuclear weapons in the post-Cold War era. Its central conclusions: the U.S. must retain the right of first strike, even against non-nuclear states; furthermore, nuclear weapons must always be available, at the ready, because they “cast a shadow over any crisis or conflict”. They frighten adversaries. So they’re constantly being used – just as you’re using a gun if you go into a store and point it at the owner. You don’t fire it, but you’re using the gun. STRATCOM goes on to say that planners should not be too rational in determining what the opponent values most; all of it has to be targeted. “It hurts to portray ourselves as too fully rational and cool-headed. That the United States may become irrational and vindictive if its vital interests are attacked should be part of the national persona that we project.” It’s beneficial for our strategic posture “if some elements appear to be potentially out-of-control”. That’s not Richard Nixon or George W. Bush; it’s Bill Clinton.

Again, Western intellectuals and media choose not to look, but potential targets don’t have that luxury. There’s also a recent history that the North Korean leaders know quite well. I’m not going to review it because of lack of time. But it’s very revealing. I’ll just quote mainstream U.S. scholarship. North Korea has been playing tit for tat – reciprocating whenever Washington cooperates, retaliating whenever Washington reneges. Undoubtedly it’s a horrible place. But the record does suggest directions that would reduce the threat of war if that were the intention, certainly not military maneuvers and simulated nuclear bombing.

Let me turn to the “gravest threat to world peace” – those are Obama’s words, dutifully repeated in the press: Iran’s nuclear program. It raises a couple of questions: Who thinks it’s the gravest threat? What is the threat? How can you deal with it, whatever it is?

‘Who thinks it’s a threat?’ is easy to answer. It’s a Western obsession. The U.S. and its allies say it’s the gravest threat and not the rest of the world, not the non-aligned countries, not the Arab states. The Arab populations don’t like Iran but they don’t regard it as much of a threat. They regard the U.S. as the threat. In Iraq and Egypt, for example, the U.S. is regarded as the major threat they face. It’s not hard to understand why.

What is the threat? We know the answer from the highest level: U.S. intelligence and the Pentagon provide estimates to Congress every year. You can read them. In their global security analysis they of course review this, and they say that the main threat of an Iranian nuclear program – they don’t know whether Iran is developing weapons, but if it is – is that the weapons would be part of Iran’s deterrent strategy. The U.S. can’t accept that. A state that claims the right to use force and violence anywhere and whenever it wants cannot accept a deterrent. So Iran is a threat. That’s the threat.

So how do you deal with the threat, whatever it is? Actually, there are ways. I’m short of time so I won’t go through the details, but there’s one very striking one: we just passed up an opportunity last December. There was to be an international conference, under the auspices of the Non-Proliferation Treaty and the UN, in Helsinki, to deal with moves to establish a nuclear-weapons-free zone in the Middle East. That has overwhelming international support – the non-aligned countries, and it’s been led by the Arab states, Egypt particularly, for decades. Overwhelming support. If it could be carried forward it would certainly mitigate the threat; it might eliminate it. Everyone was waiting to see whether Iran would agree to attend.

In early November, Iran agreed to attend. A couple of days later, Obama canceled the conference. No conference. The European Parliament passed a resolution calling for it to continue, and the Arab states said they were going to proceed anyway, but it can’t be done. So we have to live with the gravest threat to world peace, and possibly march on to war – which in fact is being predicted.

The population could do something about it if they knew anything about it. But here, the free press enters. In the United States there has literally not been a single word about this anywhere near the mainstream. You can tell me about Europe.
The last potential confrontation is China. It’s an interesting one, but time is short so I won’t go on.
The last comment I’d like to make goes in a somewhat different direction. I mentioned the Magna Carta, the foundation of modern law. We will soon be commemorating its 800th anniversary – not celebrating it, but more likely interring what little is left of its bones after the flesh has been picked off by Bush and Obama and their colleagues in Europe. And Europe is clearly involved.

But there is another part of the Magna Carta which has been forgotten. It had two components. One is the Charter of Liberties, which is being dismantled. The other was called the Charter of the Forest, which called for protection of the commons from the depredations of authority. This is England, of course. The commons were the traditional source of sustenance – of food and fuel, and of welfare as well. They were nurtured and sustained collectively for centuries by traditional societies. They have been steadily dismantled under the capitalist principle that everything has to be privately owned, which brought with it the perverse doctrine of what is called “the tragedy of the commons” – a doctrine holding that collective possessions will be despoiled, so everything has to be privately owned. The merest glance at the world shows that the opposite is true: it’s privatization that is destroying the commons.

That’s why the indigenous populations of the world are in the lead in trying to save the Magna Carta from final destruction by its inheritors. And they’re joined by others. Take, say, the demonstrators in Gezi Park trying to block the bulldozers in Taksim Square. They’re trying to save the last part of the commons in Istanbul from the wrecking ball of commercial destruction. It’s a kind of microcosm of the general defense of the commons, one part of a global uprising against the violent neo-liberal assault on the population of the world. Europe is suffering severely from it right now. The uprisings have registered some major successes. The most dramatic is Latin America, which in this millennium has largely freed itself from the lethal grip of Western domination for the first time in 500 years. Other things are happening too. The general picture is pretty grim, I think, but there are shafts of light. As always through history, there are two trajectories: one leads towards oppression and destruction; the other leads towards freedom and justice. And as always – to adapt Martin Luther King’s famous phrase – there are ways to bend the arc of the moral universe towards justice and freedom, and by now even towards survival.
Noam Chomsky is Institute Professor (retired) at MIT. He is the author of many books and articles on international affairs and social-political issues, and a long-time participant in activist movements.