1 The voodoo revolution

Why might not whole communities and public bodies be seized with fits of insanity, as well as individuals? Nothing but this principle, that they are liable to insanity, equally at least with private persons, can account for the major part of those transactions of which we read in history.

BISHOP JOSEPH BUTLER (1692–1752)

Although 1979 may not have the same historical resonance as 1789, 1848 or 1917, it too marks a moment when the world was jolted by a violent reaction to the complacency of the existing order. Two events from that year can both now be recognised as harbingers of a new era: the return of the Ayatollah Khomeini to Iran and the election of Margaret Thatcher’s Tories in Britain. The Imam and the grocer’s daughter represented two powerful messianic creeds whose ‘conflict’ – though often more apparent than real – found its most gruesome expression some twenty-two years later, when the twin towers of the World Trade Centre in New York were reduced to rubble by a small kamikaze squad of Islamist martyrs.

What seemed to be a straightforward battle between modernity and medievalism was in truth a more complex affair, ripe with ironies: the most ardent apostles of Thatcherite neo-liberalism were themselves engaged in a struggle against the world as it had evolved during the twentieth century (welfare states, regulated economies, interventionist governments, sexual permissiveness), while the pre-modern Islamic fundamentalists – commonly portrayed as bearded loons in an Old Testament landscape of caves and deserts – had a high-tech savvy that continually amazed and infuriated their enemies. Osama Bin Laden knew how to exploit the power of satellite TV and twenty-four-hour news channels; his lieutenants were Westernised enough to pass without notice in Europe and the United States. And it was a Boeing jet which carried the Ayatollah Khomeini back to Tehran on 1 February 1979.

‘A nation trampled by despotism, degraded, forced into the role of an object, seeks shelter,’ the Polish journalist Ryszard Kapuscinski wrote of the Iranian revolution. ‘But a whole nation cannot emigrate, so it undertakes a migration in time rather than in space. In the face of circling afflictions and of reality, it goes back to a past that seems a lost paradise. The old acquires a new sense, a new and provocative meaning.’ Although millions of Iranians celebrated the Ayatollah’s arrival, by no means all were fundamentalist zealots yearning for jihad: Iran was a secular state by the standards of the region. What made his installation possible was that he was the only alternative on offer. Why? Because the increasingly corrupt and brutal Shah Mohammad Reza Pahlavi had suppressed the voices of democratic dissent. And who was responsible for this counter-productive folly? The United States, among others: the CIA had helped organise the coup which toppled Mohammed Mossadegh’s left-liberal government and reinstalled the Shah on the Peacock Throne. Hence the seething resentment, felt even by some Westernised Iranians, against the ‘great Satan’ of America. It was President Carter’s subsequent decision to let the Shah enter the US for medical treatment that provoked the storming of the American embassy and the ‘hostage crisis’.

Ironically enough, Jimmy Carter was the only president who had dared to defy the conventional wisdom that guided American foreign policy for more than three decades after the Second World War: that in order to ‘contain’ the spread of Communism it was essential to support anti-Marxist dictators in Africa, Asia and South America, and to look the other way when they were torturing or murdering their luckless subjects. Although the founding fathers said in the declaration of independence that ‘governments are instituted among men, deriving their just powers from the consent of the governed’, and promulgated the American constitution to ‘establish justice … and secure the blessings of liberty’, their successors in the second half of the twentieth century were reluctant to bestow these blessings beyond their own borders. Under Carter, however, even strategically important countries on America’s doorstep – Nicaragua, El Salvador, Guatemala – were warned that further US aid was dependent on an improvement in their human-rights record. In an address at Notre Dame University on 22 May 1977, Carter deplored the ‘inordinate fear of Communism which once led us to embrace any dictator who joined us in that fear’, and called for a new foreign policy ‘based on constant decency in its values and an optimism in its historical vision’ – echoing Abraham Lincoln’s description of liberty as ‘the heritage of all men, in all lands everywhere’.

His conservative critics warned that by forcing right-wing despots to civilise themselves he was effectively hastening their downfall, to be followed by the installation of revolutionary dictatorships instead. The argument was summarised most bluntly by an obscure academic, Jeane Kirkpatrick, in her 1979 article ‘Dictators and Double Standards’, published in the neo-conservative magazine Commentary. ‘Only intellectual fashion and the tyranny of Right/Left thinking’, she wrote, ‘prevent intelligent men of good will from perceiving the fact that traditional authoritarian governments are less repressive than revolutionary autocracies, that they are more susceptible of liberalisation, and that they are more compatible with US interests.’

Although Kirkpatrick was in fact a Democrat, her article found an admiring audience among gung-ho Republicans as they prepared for the 1980 presidential campaign. ‘I’m going to borrow some of her elegant phraseology,’ Reagan told a friend after reading Commentary. ‘Who is she?’ He found out soon enough: by 1981 he had appointed Jeane Kirkpatrick as his ambassador to the United Nations, and was using her distinction between jackbooted ‘authoritarians’ and Stalinist ‘totalitarians’ to justify sending arms to the bloodstained regime in El Salvador. Even when three American nuns and a lay worker were murdered by the Salvadorean junta, Kirkpatrick expressed no sympathy at all for the victims but continued to recite her glib theory of autocracy. ‘It bothered no one in the administration that she had never been to El Salvador,’ the Washington Post observed, ‘and that one of the authorities she cited for her view of the strife there was Thomas Hobbes, an Englishman who had been dead for three centuries.’

Kirkpatrick shamelessly applied double standards of her own. Whereas right-wing tyrannies might take ‘decades, if not centuries’ to mature into democracies, she said, there was no example ever of a left-wing dictatorship making such a transformation. Hardly surprising, given that the world’s first Marxist state was only sixty-three years old at the time; had she waited another decade or so, examples galore would have refuted the argument. Nor did the Iranian revolution bear out her thesis that it was better for the United States to prop up tottering autocrats than to back reformers. As Professor Stanley Hoffmann pointed out in the New York Times, postponement of democratic reform ‘prepares the excesses, sometimes the horrors, of the successor regimes’.

It has been said that opposition parties do not win elections: governments lose them. The rule applies in autocracies, too: hatred of the Shah, rather than universal Iranian longing for medieval theocracy, prompted the national rejoicing at the Ayatollah’s coup. Three months later, in Britain, Margaret Thatcher won the votes of millions of electors who probably had little enthusiasm for (or indeed understanding of) monetarism and the other arcane creeds to which she subscribed. All they wanted was the removal of an etiolated, exhausted government which had no raison d’être beyond the retention of office. Jim Callaghan’s administration had been limping heavily since 1976, when it was forced to beg for alms from the International Monetary Fund, and later that year he had formally repudiated the Keynesian theories of demand management that were accepted by all post-war governments, both Labour and Tory. In 1956 the Labour politician Anthony Crosland confidently declared that ‘the voters, now convinced that full employment, generous welfare services and social stability can quite well be preserved, will certainly not relinquish them. Any government which tampered with the basic structure of the full-employment Welfare State would meet with a sharp reverse at the polls.’ Twenty years later, following the onset of stagflation and the end of the long post-war boom, Callaghan informed the Labour Party conference that the game was up:

What is the cause of high unemployment? Quite simply and unequivocally it is caused by paying ourselves more than the value of what we produce. There are no scapegoats. That is as true in a mixed economy under a Labour government as it is under capitalism or communism. It is an absolute fact of life which no government, be it left or right, can alter … We used to think that you could spend your way out of a recession and increase employment by cutting taxes and boosting government spending. But I tell you in all candour that that option no longer exists, and that insofar as it ever did exist, it only worked on each occasion since the war by injecting a bigger dose of inflation into the economy, followed by a higher level of unemployment as the next step. Higher inflation followed by higher unemployment. We have just escaped from the highest rate of inflation this country has known; we have not yet escaped from the consequences: high unemployment. That is the history of the last twenty years.

Callaghan’s regretful message soon became Thatcher’s triumphant catchphrase, and was later adopted as the mantra of American evangelists for untrammelled global capitalism: there is no alternative.

At first, the new Tory prime minister proceeded with caution. There were plenty of old-style Tory gents in her Cabinet, and few people guessed what she would do to sabotage the post-war consensus – not least Thatcher herself. It was often remarked that, even when she had taken up residence in 10 Downing Street, the new prime minister continued to sound like a politician from the opposition benches, or even an impotent street-corner orator. When she censured her own employment secretary on the BBC’s Panorama programme, the Economist complained that ‘it is doing no good to the cause of party morale for the Cabinet’s most strident critic to seem to be the prime minister, especially on the highly public platform of a television interview’. Help was at hand, however, as neo-liberal soulmates cheered her on from across the Atlantic: in the same editorial, the Economist reported ‘the arrival of the ideological cavalry’ from the United States to rally the troops and stiffen the sinews.

‘The importance of Margaret Thatcher stems not from the fact that she is a woman and one who is both an attorney and the first-ever British Prime Minister with a science degree,’ Kenneth Watkins wrote in Policy Review, journal of the right-wing Heritage Foundation.

Her importance stems from the fact that she has a profound conviction, based on her birth, family upbringing and experience, that a successful free enterprise economy is the only secure basis for individual freedom for even the humblest citizen … If Margaret Thatcher fails, the door in Britain will be open for the headlong plunge to disaster in the form of the irreversible socialist state. If she wins, and win she can, she will have made a major contribution to the restoration of Britain’s fortunes and, in so doing, will inscribe her name in the history books as one who will have led the way not only for her own country but for the entire Western world.

Another conservative Washington think-tank, the American Enterprise Institute, despatched Professor Herbert Stein, who had chaired the Council of Economic Advisers under Presidents Nixon and Ford, to spend three weeks in Britain during the summer of 1979. He returned in high spirits. ‘The regime is dedicated to restoring the work ethic, initiative, personal responsibility, and freedom,’ he wrote in Fortune magazine.

It stresses these values not only as spurs to GNP growth but also as ends in themselves – quite simply the right way to live … The government wants to correct what it regards as the intellectual errors that have dominated British thinking for the past forty years. It finds the Socialist and Keynesian doctrines by which Britain has been governed since World War II to be intellectually uncongenial and economically self-defeating. To replace these obnoxious doctrines, it is resolved to preach what it holds to be economic truth and sense.

Even more gratifyingly, she won the approval of the two economists she most revered, both of them Nobel laureates. Milton Friedman, founder of the ‘Chicago school’ of monetarism and free-market theory, wrote an ecstatic column for Newsweek (‘Hooray for Margaret Thatcher’) urging American politicians to heed the British example. ‘What happens in Britain is of great importance to us. Ever since the founding of the colonies in the New World, Britain has been a major source of our economic and political thought. In the past few decades, we have been moving in the same direction as Britain and many other countries, though at a slower pace. If Britain’s change of direction succeeds, it will surely reinforce the pressures in the United States to cut our own government down to size.’ Three months after Friedman’s rousing hurrah, Forbes magazine sought a verdict from the other Thatcherite icon, Friedrich von Hayek, whose influential anti-Keynesian polemic The Road to Serfdom had been written in the final months of the Second World War. ‘I admire her greatly,’ Hayek confirmed. ‘Her policies are the right ones, but whether she’ll be able to get done what she knows must be done is another question.’ Quoting John Stuart Mill’s description of the Tories as ‘the stupid party’, he expressed his suspicion that Thatcher, like himself, was more of a nineteenth-century liberal than a conservative – an opponent, in other words, of any interference with the marketplace, whether from social democrats bent on social engineering or captains of industry who wished to keep out cheap imports.

Milton Friedman returned to Britain in February 1980 to launch an ideological Blitzkrieg – meeting Thatcher at Downing Street, promoting his new book Free to Choose and presenting a series of televised lectures in which he advocated ‘the elimination of all government interference in free enterprise, from minimum wage to social welfare programmes’. He cited the economies of Japan, South Korea and Malaysia to prove that prosperity depended on allowing the ‘invisible guiding hand’ of the free market to hold the tiller. ‘What happens here in Britain will have a very important influence in the US,’ he told the Washington Post’s London correspondent. ‘If Thatcher succeeds, it will be very encouraging. It is a fascinating experiment, and a good deal depends on it … Britain, and much of the world, is at a turning point after a fifty to sixty-year run of Fabian socialism.’ Thatcher’s election, he believed, ‘could mark the turning away from the welfare state back to the free-market economies of the nineteenth century’.

By then, Thatcher’s application of Friedmanite principles – restricting the money supply, cutting public spending – was indeed producing results. During her first year inflation surged from 9 per cent to more than 20 per cent; interest rates and unemployment both rose sharply; and Britain’s manufacturing industry, the legacy of that energetic nineteenth-century entrepreneurialism which Friedman and Thatcher so admired, was battered by recession.

This news escaped the attention of her transatlantic disciples, perhaps because they were distracted by the emergence of a hero on their own side of the pond – the old Hollywood actor Ronald Reagan, who entered the presidential primaries of 1980 reciting the incredible but irresistible promise that he would cut taxes, increase defence spending and still balance the budget by 1983. The wondrous alchemical formula had been devised by Arthur Laffer, a colleague of Milton Friedman at the University of Chicago, whose ‘Laffer Curve’ seemed to demonstrate that a government could actually increase its revenue by reducing tax rates. The rich would no longer feel impelled to seek out ingenious tax-dodging ruses, and the lower rate would stimulate economic growth, thus expanding the national revenue anyway.

Although the ‘supply-side economics’ espoused by the Reaganites had a veneer of scientific method, not least in the elegant parabola of Laffer’s curve, it was indistinguishable from the old, discredited superstition known as ‘trickle-down theory’: the notion that if the rich were encouraged and enabled to make themselves as wealthy as possible – through low taxes, huge salaries, stock options, bonuses and perks – the benefits of this bonanza would somehow, magically, reach the pockets of the humblest hop-picker or crossing-sweeper. In his Political Dictionary, William Safire attributes the theory (though not the title) to the presidential candidate William Jennings Bryan, who in 1896 referred to the belief ‘that if you will only legislate to make the well-to-do prosperous, their prosperity will leak through to those below’. The actual phrase ‘trickle-down theory’ first appeared in the 1932 presidential campaign, when Democrats mocked Herbert Hoover’s plan to engineer an economic recovery by making the rich richer. ‘It’s kind of hard to sell “trickle down”,’ Reagan’s budget director, David Stockman, admitted in an incautious interview with the Atlantic Monthly soon after the 1980 election. ‘So the supply-side formula was the only way to get a tax policy that was really “trickle down”. Supply-side is “trickle-down theory”.’

Even on the right, Reaganomics was not universally popular. One of Britain’s most fervent monetarists, Professor Patrick Minford, advised Margaret Thatcher that the Laffer Curve was nonsense; Reagan’s Republican rival George Bush mocked supply-side theories as ‘voodoo economics’. Only a few months later, however, Bush accepted the role of candidate Reagan’s running mate, and by 1981 the new president and vice-president were working their voodoo magic. Reagan’s first budget included a modest reduction in the basic tax rate, but his indiscreet colleague David Stockman revealed that this was merely a ‘Trojan horse’ for the far more drastic slashing of the top rate from 70 to 50 per cent – and, later, to 28 per cent. Tax-cuts for the rich were central to the supply-side superstition.

According to the Laffer Curve, the public coffers should then have swelled with extra revenue, so much so that the budget could be balanced within a year or two. The reality could hardly be further from the theory: during Reagan’s eight years in the White House the total federal debt ballooned from about $900 billion to more than $3 trillion. While his tax policies certainly precipitated an orgy of speculation in stocks and real estate, they did nothing to induce genuine economic progress: as Americans stopped saving and started spending, throughout the 1980s there was a continuous decline in the long-term capital investment on which growth and jobs depended. At the start of 1981 the new administration was assuring the nation that there would be no recession, but by the autumn it had already arrived, as the Federal Reserve raised interest rates to dampen the inflationary effects of the tax cuts. A year later unemployment in the US rose above 10 per cent for the first time since the 1930s.

Ronald Reagan was an incorrigible fantasist. (He once told the Israeli prime minister that he had been present at the liberation of Nazi death camps in Europe; in fact, his wartime duties in the army film unit never took him further afield than California.) Like many sentimental old hams, he could not always distinguish between his own life and the roles he acted. More surprisingly, others believed his fantasies: even today, tough conservative journalists come over all lyrical and moist-eyed when writing about the years of Reaganomics, recalled as a Gilded Age of prosperity and contentment.

It was indeed reminiscent of that previous Gilded Age a century earlier, notably in the widening gulf between a wealthy elite and the rest. As the political analyst Kevin Phillips recorded in his influential book The Politics of Rich and Poor (1990), ‘no parallel upsurge of riches had been seen since the late nineteenth century, the era of the Vanderbilts, Morgans and Rockefellers’. Income tax was abolished in the United States in 1872, not to be reimposed until the First World War, and it was during this period that the great dynasties built their fortunes – and flaunted them. An ostentatious 1980s mogul such as Donald Trump, who erected the Trump Tower as a vainglorious monument, was merely following the example of those earlier nouveaux riches who built outrageously gaudy palazzos and châteaux on Fifth Avenue. The conspicuous extravagance of late-Victorian millionaires – exemplified by Mrs Stuyvesant Fish’s famous dinner in honour of her dog, which arrived wearing a $15,000 diamond-studded collar – was more than matched by the glitzy parties chronicled and celebrated every month in Vanity Fair, relaunched under the editorship of Tina Brown in 1983 as a parish magazine for the new plutocracy.

As in the first Gilded Age, scarcely any of the new abundance trickled down to the middle or working classes. Under Ronald Reagan, it was not until 1987 that the average family’s real income returned to the levels enjoyed in the 1970s, and even this was a misleading comparison since they were now working far harder for it: whereas in 1973 average Americans had 26.2 hours of ‘leisure time’ every week, by 1987 the figure had fallen to 16.6 hours. They were less secure, too, as short-term or temporary contracts demolished the tradition of full-time, well-paid and often unionised employment. The earnings of male blue-collar workers in manufacturing industry fell throughout the 1980s as their employers threatened to close the factory or move production overseas if American labour ‘priced itself out of a job’. There was also a revival of Herbert Spencer’s social Darwinism, which had last been in vogue at the turn of the previous century, as right-wing triumphalists argued that government should not interfere with the ‘natural selection’ of commercial markets.

Curiously, however, they seemed quite willing to let the government clear up any ensuing mess. In 1982 members of Congress were bribed to ‘liberalise’ the Savings & Loan industry, effectively promising that the public purse would cover any losses from bad investments made with savers’ money but also undertaking not to oversee or regulate these investments. The consequences, predictably enough, were rampant fraud, the collapse of more than 650 S&L companies – and a bill of $1.4 trillion, to be met by the taxpayer. In 1988 a report from the General Accounting Office, Sweatshops in the US, noted that another feature of the Gilded Age had returned, partly because of the official mania for deregulation: reasons cited for the reappearance of sweatshops included ‘enforcement-related factors, such as insufficient inspection staff, inadequate penalties for violations [and] weak labour laws’. But since the victims were penniless and often voteless workers, rather than middle-class mortgage-owners, the Reaganites blithely left them to the market’s tender mercies. Nor did they complain when the deregulatory zeal of Reagan’s Federal Communications Commission enabled a tiny and ever-shrinking group of large corporations to control most of the nation’s media enterprises – even though this concentration of power thwarted their professed desire for greater competition and choice.

The trouble with the Conservatives, Evelyn Waugh once said, was that they never put the clock back, even by five minutes. He could not have made the same complaint about Ronald Reagan or Margaret Thatcher, both of whom had a single-minded mission to free the capitalist beast from the harnesses and bridles imposed upon it during the previous half-century. In January 1983, when the television interviewer Brian Walden suggested that Thatcher seemed to yearn for ‘what I would call Victorian values’, she replied: ‘Oh exactly. Very much so. Those were the values when our country became great.’ Delighted by the cries of horror her remarks elicited from the liberal intelligentsia, she returned to the theme in subsequent speeches and interviews. As she explained:

I was brought up by a Victorian grandmother. We were taught to work jolly hard. We were taught to prove yourself; we were taught self-reliance; we were taught to live within our income. You were taught that cleanliness is next to Godliness. You were taught self-respect. You were taught always to give a hand to your neighbour. You were taught tremendous pride in your country. All of these things are Victorian values. They are also perennial values. You don’t hear so much about these things these days, but they were good values and they led to tremendous improvements in the standard of living.

Margaret Thatcher had a hostility to organised labour that would have won the respect of any grim-visaged Victorian mill-owner or coalmaster – as did Ronald Reagan, even though (or perhaps because) he himself was a former president of the Screen Actors’ Guild. ‘I pledge to you that my administration will work very closely with you to bring about a spirit of cooperation between the President and the air-traffic controllers,’ Reagan promised PATCO, the air-traffic controllers’ union, shortly before polling day in the autumn of 1980. But there was little evidence of this spirit when its members went on strike the following August: the new president announced that they would all be sacked unless they returned to work within forty-eight hours. More than 11,000 duly received their pink slips, their leaders went to jail and fines of $1 million a day were levied on the union.

Margaret Thatcher waited slightly longer for her own showdown. A thirteen-week strike by steel-workers in 1980, which ended with no obvious victor, convinced her that she must remove unions’ legal immunities and outlaw secondary picketing before turning the full armoury of state power against militant labour. Besides, other preparations had to be made. The union she most dearly wished to destroy was that of the mineworkers, who had brought down the previous Tory government in 1974 and were now led by the Marxist Arthur Scargill, but a lengthy pit strike could be resisted only if coal stockpiles were high enough to keep the home fires burning for the duration. So, as her biographer Hugo Young reported, from 1981 onwards the National Coal Board was ‘given every financial and other encouragement to produce more coal than anyone could consume, and the Central Electricity Generating Board given similar inducements to pile up the stocks at power stations’. At the same time the police were equipped with new vehicles, communications equipment, weaponry and body armour. When the National Union of Mineworkers went on strike in 1984, a year after Thatcher’s re-election, the government was ready for a long and bloody war.

With a belligerence that unnerved even some of her Cabinet colleagues, she described the miners as ‘a scar across the face of the country’ and likened them to the Argentine forces whom she had routed in the Falkland Islands two years earlier. ‘We had to fight an enemy without in the Falklands,’ she declared, in her best Churchillian style. ‘We always have to be aware of the enemy within, which is more difficult to fight and more dangerous to liberty … There is no week, nor day, nor hour when tyranny may not enter upon this country, if the people lose their supreme confidence in themselves, and lose their roughness and spirit of defiance.’ That autumn, when the IRA bombed a Brighton hotel where she was staying, she used the atrocity as further rhetorical ammunition: murderous terrorists and striking coal-miners were both conspiring ‘to break, defy and subvert the laws’. For Margaret Thatcher, the miners’ eventual and inevitable defeat represented nothing less than a victory of good over evil.

The prime minister could not claim the credit which she undoubtedly felt was her due, however, since throughout the dispute she had insisted that the war against ‘the enemy within’ was being prosecuted by the National Coal Board rather than Downing Street. The pretence fooled nobody – least of all the chairman of the NCB, who after one meeting at No. 10 complained to a reporter that ‘I have weals all over my back, which I would be happy to show you’ – but she felt obliged to maintain it, having often expressed her vehement dislike for government intervention in industry, or indeed in anything else. Even those branches of the state that enjoyed almost universal acceptance, such as public education and the National Health Service, appeared to Thatcher as quasi-Soviet abominations. ‘As people prospered themselves so they gave great voluntary things,’ she said in one of her many nostalgic eulogies to Victorian England. ‘So many of the schools we replace now were voluntary schools, so many of the hospitals we replace were hospitals given by this great benefaction feeling that we have in Britain, even some of the prisons, the Town Halls. As our people prospered, so they used their independence and initiative to prosper others, not compulsion by the State.’

This was Margaret Thatcher’s own version of trickle-down economics. Despite her notorious comment that ‘there is no such thing as society; there are individual men and women, and there are families’, she had limitless faith in the social conscience of the rich and might even have endorsed the mystical credo issued by an American coal-owner, George Baer, during the 1902 miners’ strike: ‘The rights and interests of the labouring man will be protected and cared for – not by the labour agitators, but by the Christian men to whom God in his infinite wisdom has given the control of the property interests in this country.’ Since God, in his infinite wisdom, presumably had similar influence over those who control the White House, he must have changed his mind during the middle decades of the twentieth century: from Franklin Roosevelt’s New Deal of 1933, which laid the foundations of a rudimentary welfare state, through Harry Truman’s ‘Fair Deal’ to Lyndon Johnson’s ‘Great Society’, the consensus was that even a prosperous capitalist nation should protect its weaker citizens – and its natural resources – against the depredations of the rich. To Thatcher this may have seemed tantamount to Communism, but it was also accepted by many conservatives. As the American author William Greider points out:

The ideas and programmes that formed the modern welfare state originated from the values of the right as well as the left, from the conservative religious impulse to defend the domain of family, community and church against the raw, atomising effects of market economics as well as from the egalitarianism of anti-capitalist socialism. The welfare state was, in fact, an attempt to devise a fundamental compromise between society and free-market capitalism.

It was a Republican president, Richard Nixon, who created the Occupational Safety and Health Administration and the Environmental Protection Agency. ‘We are all Keynesians now,’ he explained.

By the 1980s, however, God had apparently become a champion of laissez-faire again. Whereas Margaret Thatcher’s twentieth-century predecessors mostly kept their Christianity to themselves, her own ‘crusade’ – as she often called it – was thoroughly religious in both content and style. Her father had been a Methodist lay preacher, and in her memoirs she proudly acknowledged the influence of a stern Christian upbringing: ‘I believe in “Judaeo-Christian” values: indeed my whole political philosophy is based on them.’ In 1951, as the prospective parliamentary candidate Miss Margaret Roberts, she told the Dartford Free Church Council that ‘the future of the world depended on the few men and women who were Christians and who were willing to practise and propagate that faith abroad’.

From the moment when ‘Thatcherism’ was first articulated as a distinctive brand of Conservatism, soon after she and her intellectual mentor Sir Keith Joseph established the Centre for Policy Studies in 1974, its disciples emphasised that this was not mere materialism but an entrepreneurial theology. Sir Keith’s famous Edgbaston speech of October 1974 caused a furore by proposing that low-income women should be discouraged from breeding, but its peroration (scarcely noticed at the time) was no less astonishing, and probably more significant. ‘Are we to move towards moral decline reflected and intensified by economic decline, by the corrosive effects of inflation?’ he asked, his face characteristically furrowed in anguish. ‘Or can we remoralise our national life, of which the economy is an integral part?’

Modern British politicians hadn’t previously used such language, but over the following two decades the evangelical message was heard again and again. ‘I am in politics because of the conflict between good and evil,’ Thatcher said, ‘and I believe that in the end good will triumph.’ Speaking to the Zurich Economic Society in 1977, she warned that:

we must not focus our attention exclusively on the material, because, though important, it is not the main issue. The main issues are moral … The economic success of the Western world is a product of its moral philosophy and practice. The economic results are better because the moral philosophy is superior. It is superior because it starts with the individual, with his uniqueness, his responsibility, and his capacity to choose … Choice is the essence of ethics: if there were no choice, there would be no ethics, no good, no evil; good and evil have meaning only insofar as man is free to choose.

She explicitly associated her belief in economic freedom of choice with the Christian doctrine of the same name, as a means of salvation. Self-reliance and property ownership were ‘part of the spiritual ballast which maintains responsible citizenship’. (Many Christians, of course, remained unpersuaded that the prime minister was doing God’s work. Anglican bishops such as David Jenkins and David Sheppard protested against the mass unemployment which she had created, and the 1985 report Faith in the City, commissioned by the Archbishop of Canterbury, blamed the Tories’ social Darwinism for the squalor, decay and alienation found in Britain’s inner cities. Thatcher reacted furiously to the criticisms, arguing that men of the cloth had no business commenting on her industrial and economic policies – apparently oblivious to the fact that her own metaphysical and religious justifications for the new ‘enterprise culture’ had legitimised these clerical ripostes.)

As the daughter of a man who had been both a preacher and a local councillor, Thatcher knew how usefully scriptural texts could be deployed to suggest divine sanction for political prejudice. Speaking to the general assembly of the Church of Scotland in 1988, she justified her blitz on benefit-claimants by quoting St Paul’s epistle to the Thessalonians: ‘If a man will not work he shall not eat.’ In another startling biblical exegesis, she said that ‘no one would remember the Good Samaritan if he’d only had good intentions. He had money as well.’ Like Reagan, Thatcher often implied that most public welfare provision was unnecessary: if the nation had enough millionaires, their natural benevolence and wealth-spreading talents would suffice. (Alas for the theory, charitable giving by Americans with annual salaries above $500,000 actually fell by 65 per cent between 1980 and 1988; the real Good Samaritans, who raised their donations by 62 per cent, turned out to be humbler souls earning between $25,000 and $30,000. Even more remarkably, the poorest in the land – those on $10,000 a year or less – gave 5.5 per cent of their income to charity, a higher share than anyone else.)

While affecting to admire Victorian philanthropy, Thatcher displayed a visceral contempt for the noblesse oblige of the wealthy paternalists in her own ranks – ‘the wets’, as she called them, who ‘drivel and drool that they care’. Her real heroes were buccaneering entrepreneurs, and it sometimes seemed that no other way of life, including her own, was truly virtuous. (The political writer Hugo Young recalled one social encounter with Thatcher which ‘consisted of a harrying inquiry as to why I didn’t abandon journalism and start doing something really useful, like setting up a small business’.) ‘These people are wonderful,’ she raved. ‘We all rely upon them to create the industries of tomorrow, so you have to have incentives.’ Every obstacle to money-making – corporation taxes, anti-monopoly rules, trade unions’ bargaining rights, laws protecting workers’ health and safety – had to be minimised or swept away altogether. No wonder the stock market on both sides of the Atlantic went wild in the 1980s. For all Thatcher’s pious injunctions about living within one’s income, this was a decade of borrowing and spending. Between 1895 and 1980 the United States had shown a trade surplus every year, but during the presidency of Ronald Reagan it was transformed from the world’s biggest creditor nation into the biggest debtor – and tripled its national debt for good measure.

Most people would regard it as suicidally irrational to embark on a credit-card splurge without giving a thought to how the bills can ever be paid. Yet when the denizens of Wall Street did just that in the 1980s, they were lionised. Gossip columnists and business reporters alike goggled in awe at the new financial titans – men such as Ivan Boesky, ‘the great white shark of Wall Street’, who was said to be worth $200 million by 1985, and Michael Milken, ‘the junk-bond king’, who earned $296 million in 1986 and $550 million in 1987. Boesky liked to describe himself as an ‘arbitrageur’ but the title of his 1985 book, Merger Mania, summarised the source of his wealth in plainer language. He had an enviable talent for buying stock in companies which, by happy coincidence, were targeted for takeover shortly afterwards, thus enabling him to sell at a profit. As it transpired, this owed less to the mysterious arts of ‘arbitrage’ than to old-fashioned insider-trading: Dennis Levine, a broker at the Wall Street firm Drexel Burnham Lambert, was tipping him off about imminent mergers or acquisitions in return for a percentage of Boesky’s spoils. Meanwhile Levine’s colleague Michael Milken was pioneering the use of high-risk, high-yield ‘junk bonds’ – essentially a means of converting equity into debt – to finance the merger mania. In May 1986 Boesky gave the commencement address at Milken’s alma mater, the Berkeley business school at the University of California, and won loud applause when he said: ‘Greed is all right, by the way. I want you to know that. I think greed is healthy. You can be greedy and still feel good about yourself.’ Six months later he was indicted for illegal stock manipulation and insider dealing, charges that eventually landed him in Southern California’s Lompoc federal prison. By 1991 Milken was also in a Californian jail, having incurred a ten-year sentence and a $600 million fine for fraud and racketeering.

As with ‘junk bonds’, almost all the macho financial neologisms of the 1980s were euphemisms for debt in one form or another. A ‘leveraged buyout’, for instance, involved purchasing a company with borrowed funds. More often than not, the security for these loans would be the target company itself, which would thus have to repay the debt from its own profits – or from the sale of assets – once the deal had gone through. Then there was ‘greenmail’, a technique pioneered by corporate raiders such as T. Boone Pickens and Sir James Goldsmith who would acquire a menacingly large stake in a company and then terrify the firm’s owners into buying it back at a premium in order to avoid a hostile takeover. This reaped huge rewards for the predators but left their victims fatally indebted or dismembered: when the Goodyear Tire and Rubber Company fell prey to Goldsmith and his associates, it had to spend $2.6 billion paying off the greenmailers.
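A toy calculation, with invented figures, shows why this arithmetic was so seductive: borrowing most of the purchase price multiplies the raider’s return on his own stake, so long as the target’s profits cover the interest bill.

```python
# A toy leveraged buyout - every figure here is invented for illustration.
purchase_price = 100.0              # $m paid for the target company
equity = 10.0                       # the raider's own money
debt = purchase_price - equity      # borrowed, secured on the target itself

annual_profit = 12.0                # the target's operating profit, $m
interest_rate = 0.10                # assumed cost of the junk-bond debt

interest = debt * interest_rate     # 9.0, paid out of the target's profits
residual = annual_profit - interest # 3.0 left for the new owner
print(f"return on the raider's stake: {residual / equity:.0%}")  # 30%

# Unlevered, the same profit would return 12% on the full $100m. Borrowing
# nine-tenths of the price multiplies the buyer's return - and, should
# profits dip below the interest bill, multiplies the loss just as fast.
```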

Where once businesses made products, now they made deals. As more and more money was borrowed from abroad to cover the difference between what Americans produced and what they consumed, a few voices began to question how and when the IOUs for this bogus prosperity would be honoured. ‘This debt is essentially the cost of living beyond our means,’ the economist Lester C. Thurow warned in the late summer of 1987, when the US trade-account deficit reached $340 billion. ‘If the money we were borrowing from abroad all went into factories and robots, we wouldn’t have to worry’ because the debt would be self-liquidating. ‘It’s the fact that we are using it entirely for consumption that makes it a serious problem.’

As any three-card-trick hustler knows, legerdemain depends for its success on fooling all the audience all the time: any members of the crowd who point out that the entire operation is a con must be silenced at once, or else punters will be markedly more reluctant to hand over their ten-dollar bills. So it was with the stock market in 1987, after five continuous years of giddy ascent. John Kenneth Galbraith, the grand old man of American Keynesianism, can probably claim the credit for being the first observer to state what should have been obvious: that Wall Street prices no longer had any relation to actual economic conditions. Writing in the January 1987 issue of the Atlantic Monthly, he argued that the market was now driven solely by ‘a speculative dynamic – of people and institutions drawn by the market rise to the thought that it would go up more, that they could ride up and get out in time’. It had happened before, in the months preceding the Great Crash of 1929, and as the historian of that disaster, Galbraith was struck by several other parallels – most notably the faith in seemingly imaginative, currently lucrative but ultimately disastrous innovations in financial structures. ‘In the months and years prior to the 1929 crash there was a wondrous proliferation of holding companies and investment trusts. The common feature of both the holding companies and the trusts was that they conducted no practical operations; they existed to hold stock in other companies, and these companies frequently existed to hold stock in yet other companies.’ The beauty of this exaggerated leverage was that any increase in the earnings of the ultimate company would flow back with geometric force to the originating company, because along the way the debt and preferred stock in the intermediate companies held by the public extracted only their fixed contractual share. The problem, however, was that any fall in earnings and values would work just as powerfully in reverse, as it duly did in October 1929. Nearly sixty years on, Galbraith wrote, leverage had been rediscovered and was again working its magic in a wave of corporate mergers and acquisitions, and in the bank loans and bond issues arranged to finance these operations.
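The ‘geometric force’ Galbraith described can be reproduced in a few lines of arithmetic. The sketch below uses invented figures: each holding company in the pyramid is assumed to own the common stock of the one beneath it and to be financed half by fixed-interest bonds sized against a normal year’s flow.

```python
# Galbraith's 1929 mechanism in miniature - invented figures, real geometry.
base, bumped = 100.0, 110.0  # operating dividend: a normal year, then +10%

for layer in (1, 2, 3):
    charge = base / 2        # the bondholders' fixed contractual share
    base -= charge           # what normally reaches this layer's common stock
    bumped -= charge         # the same residual after the 10% rise below
    gain = (bumped - base) / base
    print(f"layer {layer}: common-stock earnings up {gain:.0%}")

# Prints 20%, 40%, 80%: a 10% rise at the bottom doubles with every layer
# on the way up - and a 10% fall compounds downward just as remorselessly,
# which is what October 1929 demonstrated.
```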

The Atlantic magazine was the perfect pulpit from which to deliver such a sermon. Three years after the 1929 débâcle it had published the following mea culpa, written by an anonymous denizen of Wall Street, which now reads like a pretty accurate history of the 1980s as well:

In these latter days, since the downfall, I know that there will be much talk of corruption and dishonesty. But I can testify that our trouble was not that. Rather, we were undone by our own extravagant folly, and our delusions of grandeur. The gods were waiting to destroy us, and first they infected us with a peculiar and virulent sort of madness.

Already, as I try to recall those times, I cannot quite shake off the feeling that they were pages torn from the Arabian Nights. But they were not. The tinseled scenes through which I moved were real. The madcap events actually happened – not once, but every day. And at the moment nobody thought them in the least extraordinary. For that was the New Era. In it we felt ourselves the gods and the demigods. The old laws of economics were for mortals, but not for us. With us, anything was possible. The sky was the limit.

It is a familiar delusion, the conviction that one has repealed the laws of financial gravity. (Even Isaac Newton, the man who discovered physical gravity, succumbed. ‘I can calculate the motions of the heavenly bodies, but not the madness of people,’ he said, selling his South Sea Company stock for a handsome profit in April 1720 before the bubble burst; but a few months later he re-entered the market at the top and lost £20,000.) Writing of a Wall Street boom at the very beginning of the twentieth century, Alexander Dana Noyes recalled that the market ‘based its ideas and conduct on the assumption that we were living in a New Era; that old rules and principles and precedents of finance were obsolete; that things could safely be done today which had been dangerous or impossible in the past’. Days before the crash of October 1929, the Yale economist Irving Fisher (himself an active share-buyer) pronounced that ‘stock prices have reached what looks like a permanently high plateau’.

All the familiar portents of disaster – swaggering hubris, speculative dementia, insupportable debt – were evident by 1987. ‘At some point something – no one can ever know when or quite what – will trigger a decision by some to get out,’ Galbraith predicted. ‘The initial fall will persuade others that the time has come, and then yet others, and then the greater fall will come. Once the purely speculative element has been built into the structure, the eventual result is, to repeat, inevitable.’ He added, however, that there was a compelling vested interest in prolonging financial insanity, and anyone who questioned its rationale could expect rough treatment from the spivs organising the three-card trick, just as the eminent banker Paul Warburg had been accused of ‘sandbagging American prosperity’ when he suggested in March 1929 that the orgy of ‘unrestrained speculation’ would soon end in tears. Galbraith’s own article in the Atlantic had originally been commissioned by the New York Times but was spiked because the editors found it ‘too alarming’.

Its eventual publication did nothing to puncture Wall Street’s exuberance, at least for a while. (‘Galbraith doesn’t like to see people making money’ was a typical reaction.) On 8 January 1987, a few days after the professor’s gloomy New Year message, traders on the floor of the stock exchange were cheering and hurling confetti in the air as the Dow Jones industrial average broke through the 2,000 level for the first time. ‘Why is the market so high when the economy continues to be so lacklustre?’ Time magazine wondered. ‘Considering such questions mere quibbles, many optimistic analysts are convinced that the crashing of the 2000 barrier is the start of another major market upsurge that might last anywhere from two to five years.’ By the end of August the Dow had climbed to 2,722.42, the fifty-fifth record high achieved that year. Employing a system known as Elliott Wave Theory, the Wall Street guru Robert Prechter calculated that it would gain another thousand points in the next twelve months. Others put their trust in the so-called Super Bowl Theory, which held that the stock market always rose when a team from the original National Football League won the championship. And why not? The theory had been vindicated in eighteen of the previous twenty years, a more impressive success rate than conventional forecasting methods.
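That success rate is less impressive than it sounds. A back-of-envelope check – purely illustrative, crudely assuming that any single pre-chosen indicator matches the market like a fair coin – shows why: eighteen hits in twenty would indeed be freakish for one indicator named in advance, but analysts were sifting countless candidate patterns after the fact, and some ‘uncanny’ record will always surface by chance.

```python
# How surprising is 18 hits in 20 years? Illustrative assumption: any one
# pre-chosen indicator matches the market no better than a fair coin.
from math import comb

n, k = 20, 18
p = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
print(f"chance of {k} or more hits in {n}: {p:.4%}")  # about 0.02%

# Vanishingly unlikely for a single indicator chosen in advance - but search
# thousands of patterns in hindsight and a few such records appear by luck.
# (The coin-flip assumption also flatters the theory: markets rise in most
# years and original-NFL teams won most Super Bowls, so matches come cheap.)
```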

Against the madness of crowds, Friedrich von Schiller once wrote, the very gods themselves contend in vain. What hope was there for mere mortals wishing to understand the logic of a bull market that seemed unaffected by sluggish economic growth and a decline in business earnings? To quote Galbraith again:

Ever since the Compagnie d’Occident of John Law (which was formed to search for the highly exiguous gold deposits of Louisiana); since the wonderful exfoliation of enterprises of the South Sea Bubble; since the outbreak of investment enthusiasm in Britain in the 1820s (a company ‘to drain the Red Sea with a view to recovering the treasure abandoned by the Egyptians after the crossing of the Jews’); and on down to the 1929 investment trusts, the offshore funds and Bernard Cornfeld, and yet on to Penn Square and the Latin American loans – nothing has been more remarkable than the susceptibility of the investing public to financial illusion and the like-mindedness of the most reputable of bankers, investment bankers, brokers, and free-lance financial geniuses. Nor is the reason far to seek. Nothing so gives the illusion of intelligence as personal association with large sums of money.

It is, alas, an illusion.

During the South Sea Bubble of 1720 investors hurled their money into any new venture, however weird its prospectus: ‘For extracting of Silver from Lead’; ‘For trading in Human Hair’; ‘For a Wheel of Perpetual Motion’; and, most gloriously, ‘a Company for carrying on an Undertaking of Great Advantage, but Nobody to know what it is’. Similarly, some of Wall Street’s best-performing stocks in 1987 were enterprises that had neither profits nor products – obscure drug firms which were rumoured to have a cure for AIDS, or AT&E Corp, which claimed to be developing a wristwatch-based paging system. ‘The thing could trade anywhere – up to 30 times earnings,’ a leading analyst, Evelyn Geller, said of AT&E. ‘So you’re talking about $1,000 a share. You can’t put a price on this – you can’t. You don’t know where it is going to go. You are buying a dream, a dream that is being realised.’ AT&E soon went out of business, its dream still unrealised, but Geller’s rapturous illusion shows how the market was kept afloat at a time when any rational passenger should have been racing for the lifeboats.

By the second week in October, a few dents were appearing in the hitherto impregnable dreadnought. Some blamed a spate of investigations by the Securities and Exchange Commission into Wall Street’s biggest names – Drexel Burnham Lambert, Goldman Sachs, Kidder Peabody – following the arrest of Ivan Boesky. Others complained that the fall in the dollar (caused by a widening foreign trade deficit) was weighing on the market. An additional factor was ‘portfolio insurance’, the high-tech innovation which set off waves of computerised selling as soon as the market fell below a certain level, prompting a downward stampede and foiling any attempts to recover equilibrium. Meanwhile, the risk-free yield on thirty-year government bonds had risen to an unprecedented 10.22 per cent, only slightly below the risk-heavy 10.6 per cent return from the stock market. Some investors wondered why they bothered to buy shares at all.
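The feedback at work is easy to caricature. The sketch below uses invented parameters and shows only the shape of the mechanism: a fall triggers programmed selling, and the selling deepens the very fall that triggered it.

```python
# 'Portfolio insurance' as a feedback loop - parameters are invented.
peak = 100.0
price = 96.0    # an outside shock: the market opens 4% below its peak
trigger = 0.03  # programmes sell once the fall from the peak exceeds 3%
impact = 0.5    # assumed price impact per unit of forced selling

for wave in range(1, 6):
    drawdown = (peak - price) / peak
    if drawdown < trigger:
        break                           # no trigger, no forced selling
    price *= 1 - impact * drawdown      # selling pushes the price lower...
    print(f"wave {wave}: price {price:.2f}")
# ...which enlarges the drawdown that summons the next wave of selling -
# the 'downward stampede' that foiled any attempt to recover equilibrium.
```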

The economic tsunami of 19 October 1987 – ‘Black Monday’ – began with panic selling on the Tokyo stock exchange and then surged through Asia and Europe, following the sun, before engulfing Wall Street. The Dow Jones plummeted by 508.32 points, losing 22.6 per cent of its total value – almost twice the 12.9 per cent plunge during the crash of October 1929 which precipitated the Great Depression. ‘Of all the mysteries of the Stock Exchange,’ J. K. Galbraith had written in his history of the 1929 disaster, ‘there is none so impenetrable as why there should be a buyer for everyone who wants to sell. 24 October 1929 showed that what is mysterious is not inevitable. Often there were no buyers.’ Sure enough, on the morning of 20 October 1987 (‘Terrible Tuesday’), with no one willing to purchase stocks at any price, there was a full hour in which trading ceased altogether: it appeared that the world’s dominant financial system had simply curled up and died. What saved it from extinction was not the ‘invisible hand’ but the new chairman of the Federal Reserve Board, Alan Greenspan, who flooded the market with cheap credit shortly after midday and strong-armed the big banks to do the same, thus preventing Wall Street from dragging the whole US economy into recession. Meanwhile, the regulators of the New York stock exchange also intervened to ‘preserve the integrity of the system’.

Did chastened right-wing triumphalists notice that capitalism had been rescued only by swift action from the federal government and the regulators, precisely the kind of ‘interference’ they would usually deplore? Apparently not. Ronald Reagan signalled a return to business as usual by dismissing Black Monday as ‘some kind of correction’, and magazines such as Success continued to glamorise the casino culture. Neo-liberals applauded the ‘creative destruction’ of manufacturing industry, old work practices, public institutions and anything else that stood in their path. In Washington and London, right-wing institutes and foundations proliferated like bindweed, fertilised by the enthusiasm with which Thatcher and Reagan greeted their crackpot schemes. The bow-tied young men in these think-tanks prided themselves on ‘thinking the unthinkable’, coming up with ideas such as privatisation – which would later become an unchallengeable gospel, spread everywhere from Russia to Mexico. ‘We propose things which people regard as on the edge of lunacy,’ Dr Madsen Pirie of the Adam Smith Institute boasted in 1987. ‘The next thing you know, they’re on the edge of policy.’ By a blissful irony, it was one such ‘unthinkable’ scheme – the poll-tax, dreamed up by the Adam Smith Institute and recklessly adopted by Thatcher against the advice of her colleagues – which helped to bring her down in November 1990.

By then, however, her task was effectively accomplished anyway. The demise of the Marxist states in Eastern Europe seemed to vindicate all that she and Ronald Reagan had done: both socialism and Keynesianism had been pronounced dead, and unrestrained turbo-capitalism installed as the new orthodoxy. ‘What I want to see above all,’ Reagan said, ‘is that this remains a country where someone can always get rich.’ And there was no shortage of hucksters willing to explain, for a fee, just how this could be achieved.
