
From the Surplus to the Deficit Normative State


The long-run trend in the ratio of debt to GDP through most of U.S. history shows a pattern of rising debt during wartime and economic contractions followed by stabilization in debt levels during peacetime expansions. Yet today the federal debt has assumed an unstoppable upward trajectory unprecedented in the annals of American public finance. In its extended baseline projections, the Congressional Budget Office (CBO) predicts that the ratio of debt to GDP will top 144% by 2049.[1] The federal budget now invariably produces deficits, and those deficits drive the debt. The government last generated a surplus in FY 2001; the budget thereafter returned to deficit amid war and financial crisis. Yet even the four surpluses achieved in the late 1990s and very early 2000s offered only a reprieve from what had already become deficit normality.

Rising debt levels over the long run, driven by near-permanent deficits, must be recognized as harmful for both economic and institutional reasons, even if deficits are not harmful per se, as during downturns. A standard justification for balancing the budget in the twentieth century centered on the threat that unrestrained deficits would produce inflation, and indeed the “Great Inflation” of the 1960s and 1970s arose at a time of rising government spending, including during the protracted war in Vietnam. This anti-inflationary justification may not be as germane today, with the Federal Reserve struggling to hit its 2% inflation target, but the danger of rampant future inflation cannot be automatically discounted. What is more, the danger of rising government debt crowding out private investment and reducing output bears greater relevance in today’s economy now that interest income from Treasurys increasingly flows to foreign holders.

The institutional justifications for reining in deficits are equally powerful. Faced with constantly rising debt levels, the federal government will be forced to allocate more and more of its revenue to net interest payments (adjusted for interest income earned from government trust funds), hampering effective governance. In the past, net interest as a percentage of total outlays spiked whenever government spending grew, but it receded again once spending declined. No such stabilization will occur if debt levels rise continually. The CBO projects net interest payments to rise from 9% of total outlays in 2019 to 20% by 2049, by which point servicing net interest will cost more than total discretionary spending.[2]

The long-run trends in the growth of debt should not obscure the role played by deficits in more immediate terms. Debt accumulates to the extent of the deficits generated in a given fiscal year, and multiple factors, including events abroad and political dynamics at home, affect the generation of such deficits at specific points in time. Whatever the factors behind it, debt continues to rise as the government sustains deficits year after fiscal year, an invariability that lies at the heart of the deficit normative state.
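As a first approximation, the relationship between deficits and debt can be written as a simple accounting identity (a sketch that sets aside other means of financing, which also move the debt in any given year):

$$D_t = D_{t-1} + \text{deficit}_t, \qquad \text{so that} \qquad D_T = D_0 + \sum_{t=1}^{T} \text{deficit}_t.$$

So long as the deficit term is positive in virtually every fiscal year, the debt can only ratchet upward, which is the arithmetic core of the deficit normative state.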

A situational perspective focuses on deficits as they are incurred in each fiscal year and on the factors compelling them. The budget and appropriations process inevitably occurs within a specific context shaped both by national security concerns and by domestic political imperatives. Under a process first established with the Budget and Accounting Act of 1921, the president must submit a budget request by the first Monday in February estimating spending levels for the following fiscal year. The fiscal year ran from July 1 to June 30 until it was shifted to an October 1–September 30 time frame starting with FY 1977. The legislation mandating this change also created the House and Senate Budget Committees, which designate spending levels for the Appropriations Committee in each chamber. The Appropriations Committees then determine spending levels for their subcommittees. Congress ideally passes a series of twelve appropriations bills for the president to sign, though in recent years it has relied heavily on expedients like continuing resolutions, which keep spending at the previous fiscal year’s level. This process does not occur in a vacuum. Real-world factors constantly impinge upon the budget. The transition to the deficit normative state occurred because, for various reasons, the president and Congress made tax and spending decisions that led to deficits, and those deficits became ever more routine.

A surplus normative state existed in the 1800s because the budget registered more surpluses than deficits on an annual basis, and these surpluses tended to coincide with economic expansions. That is not to say the government never sustained a deficit or that the national debt did not grow. Deficits were unavoidable in wartime or in periods of financial and economic distress. The Jackson Administration briefly succeeded in paying off the national debt in 1835, an accomplishment facilitated by sales of abundant publicly owned land. When the Administration then required payment in specie for purchases of public land, it inadvertently helped trigger a financial panic and a downturn that sent the budget back into the red. The antebellum deficits paled in comparison to Civil War debt, which soared to 30% of GDP, its highest level since the 1790s, as the federal government organized a whole new scheme of taxation, currency, and banking to sustain the war effort. Many of these wartime policy changes remained in place after the fighting ended, including higher tariff walls, which helped turn the budget quickly back to surplus.

Indeed from 1866 until 1894 the budget turned in an uninterrupted string of surpluses, even with bouts of depression and deflation. The size of government was much smaller then, and economic growth was robust in most years, despite deficiencies in the country’s monetary system. The government did not restore convertibility to gold until 1879 after having issued paper greenbacks during the war, while railroad booms turned to bust and caused panic in the major financial centers. Yet the railroad booms also fueled investment across the country. The transcontinental railroad, partly subsidized by the federal government, was completed in May 1869 with the ceremonial golden spike driven at Promontory, Utah. Capital and wealth became far more concentrated as businesses merged at an accelerating rate at the turn of the century. Over 1,000 firms vanished in the year 1899 alone.[3] As behemoths like U.S. Steel and Standard Oil wielded ever greater monopoly power in the business sphere, the political class responded with initiatives that marked the dawn of the regulatory state. Congress established the Interstate Commerce Commission in 1887, the first federal regulatory agency, and charged it with overseeing the railroads. It was followed three years later by the Sherman Anti-Trust Act, which outlawed combinations and conspiracies that unduly restrained competition. The corruption and other maladies of the Gilded Age aroused a full-scale progressive movement that led to additional reforms like the Pure Food and Drug Act of 1906 and the creation of the Federal Trade Commission in 1914 to pursue anti-trust violators more aggressively. The regulatory state did not cause or lead to the deficit normative state per se, but it did increase discretionary spending through the larger administrative expenses of running the new agencies.

The entitlement state as we know it today did not exist, but a forerunner of modern entitlements, pensions for Civil War veterans, did alter the trajectory of spending and affected budget balancing by the 1890s. The original pension program for veterans, enacted in 1862, extended eligibility only to Union soldiers who suffered a disability inflicted during the war, along with their widows and other dependents. The program remained fiscally limited through the 1870s until Republicans and Northern Democrats, pressured by lobbying groups like the Grand Army of the Republic veterans’ association and lawyers handling disability cases, began dramatically expanding benefits. Congress first passed the Arrears Act in 1879, which retroactively paid benefits to veterans whose determination of disability due to wartime service occurred well after the war itself had ended. By the late 1880s, with calls for increased benefits and even universal pensions growing louder, Congress tried to pass legislation extending eligibility to veterans with disabilities suffered after the war. President Grover Cleveland, a Northern Democrat who had incidentally avoided service in the Union Army, regularly resorted to his veto pen to stifle this legislation, but he was defeated in the next election by Benjamin Harrison, a former brevet Union general who fully pledged his support to expanding the program. In 1890 Congress passed the Dependent and Disability Pension Act, extending eligibility to veterans with postwar disabilities that impaired their ability to perform manual labor. The legislation vastly increased the government’s annual spending, with pensions now constituting 40% of total outlays.[4]

The Civil War pension program has often been likened to modern-day entitlements because it applied uniform rules of eligibility to a vast swathe of the population. Like entitlements, it led to increased spending that affected the government’s ability to regularly generate surplus revenue. The budget turned in a deficit by 1894, only four years after passage of the Dependent and Disability Pension Act but just one year after a financial panic that crippled the money market and led to economic depression. The budget had still managed to balance just after the previous panics of 1873 and 1884, but now a devastating crisis, with the U.S. economy again integrated into the international gold standard, combined with the doling out of some $140 million in pensions, produced a series of deficits for the rest of the 1890s, culminating in spending to fund the Spanish-American War. The budget at this point could still not be described as deficit normative. The Civil War pension program was by definition transitory and subject to natural attrition. A stronger economy in the early 1900s, fueled by increased gold production and higher rates of inflation, also eased the budget back into surplus during expansions.

Going to war with Spain and intervening on behalf of Cuban independence signaled another factor affecting the continuity of the surplus normative state. The U.S. emerged as a rising power in the late 1800s, and this new role compelled more military spending, which inevitably affected the budget. The effect was not at first dramatic. For a while in the early 1890s, pensions exceeded the combined expenditures of the Army and Navy. But the U.S.’s entry into the global military arena was bound to drive up costs. Benjamin Harrison proposed a shipbuilding program in 1889 to rectify the Navy’s dearth of battleships. In his annual report for 1889, Navy Secretary Benjamin Tracy laid out the case for an Atlantic and a Pacific fleet, each composed of battleships, to secure the nation’s coastlines. Instead of describing a modern navy as an instrument of empire, he recast seapower in largely defensive terms, seeing it as the only viable way for the U.S. to defend itself against belligerent European powers and a modernizing Japan. In armored warships the U.S. Navy severely lagged other nations and did not even rank among the world’s top maritime forces. To remedy the dangerous imbalance in power, he called for a Navy of “20 battle-ships, 20 coast-defense ships, and 60 cruisers, or 100 vessels in all, which is believed to be a moderate estimate of the proper strength of the fleet.”[5]

Tracy’s recommendations reflected the thesis of Alfred Thayer Mahan, whose seminal, soon-to-be-published tome on seapower argued that large, blue-water navies shape the course of world history. Some naval strategists would even go so far as to call the U.S. an “island nation” because of its long coastlines on two different oceans. A modern navy with battleships was also bound to cost money. Yet Harrison’s shipbuilding did not send the budget into deficit. The program was much smaller than the pension system, and Harrison in fact presided over four straight years of surpluses. To put matters in perspective, by 1891 naval expenditures, including the costs of shipbuilding, totaled $26.1 million, compared to $48.7 million in Army-related spending, $124.4 million in pensions, and a budget surplus of $37.3 million.[6] Naval expenditures that year, in other words, were smaller than the entire federal surplus. But revenue fluctuated greatly depending on the state of the economy, and the next year, just prior to a terrible financial crisis, naval spending rose marginally to $29.2 million while the surplus dropped to $9.9 million.[7]

Shipbuilding still put added pressure on the budget, regardless of the economy’s gyrations. From a strategic vantage, it ensured that the Navy had a modern fleet in being at the outset of the Spanish-American War. That war marked the high point of jingoism, an aggressive strand of patriotism contrary to the admonitions of defensive naval strategists, with the U.S. winning colonies and a new presence in the Pacific. These developments bore consequences stretching into the 1940s. The country was now obligated to defend the Philippines, Guam, and Wake, not to mention the recently annexed territory of Hawaii, and it needed the Navy more than ever to fulfill this function. Theodore Roosevelt launched the Great White Fleet in late 1907 to demonstrate American naval might, only two years after the Japanese navy’s destruction of the Russian fleet at the Tsushima Strait. The U.S. was indeed hardly alone in the Pacific. The British, French, and German Empires all controlled islands throughout the region, with Japan seizing German Pacific territory during World War I. The Navy considered Japan its chief Pacific rival and formulated War Plan Orange around a desperate attempt to relieve the garrison holding Manila Bay after defeating Japanese naval forces in a decisive surface action somewhere in the western Pacific. The plan never came to fruition after Japan’s surprise attack on the Pacific Fleet’s battleships at Pearl Harbor, while the previously underappreciated aircraft carriers fortuitously remained at sea.

The U.S.’s entry into the Pacific and the Caribbean at the end of the 1800s provides greater context for understanding isolationism. As a twentieth-century phenomenon, isolationism was magnified by the bloodbath in Europe starting in 1914 and then by the fear of another European war in the 1930s. Such an aversion to the horrendous slaughter in Europe, or to the prospect of another, was unsurprising, but it did not necessarily signify a rejection of the rest of the world, especially regions outside Europe such as Latin America. Isolationism should be thought of mainly in relation to Europe and the World Wars. The U.S. emerged as a more engaged global power by the dawn of the twentieth century, and this new role did at times carry a budgetary impact, even if not always an enormous one. For instance, the federal budget ran deficits in 1904 and 1905, during and shortly after an economic slowdown, but Panama Canal spending also greatly contributed to these transitory deficits.[8] The budget then turned in a series of deficits starting in 1908 that resulted from the Panic of 1907 and its severe aftershocks. The surplus normative state survived for a while into the twentieth century, even with the beginnings of larger-scale regulation of the economy and greater military power. The government in fact turned in another uninterrupted series of surpluses from 1920 through 1930 during the short-lived era of interwar prosperity at home. The Great Depression stopped these surpluses in their tracks by 1931, and a surplus would not be achieved again until 1947.

As for taxation, the surplus normative state of the 1800s relied in large part on the tariff as a means of raising revenue. The tariff also served protectionist purposes, especially when Republican Congresses refrained from lowering the high Civil War–era duties in the late 1860s and 1870s. The federal government otherwise imposed excise taxes, with public land sales and other miscellaneous sources bringing in small amounts of additional revenue. To take the year 1890, for instance, when support for the tariff was still widespread, customs duties brought in $229.7 million in revenue as opposed to only $142.6 million from internal sources.[9]

Given their stewardship over industrial interests in the Midwest and Northeast, the Republicans, like their Whig forebears, championed the tariff, while the Democrats had all along fought for free trade. A Democratic president, Grover Cleveland, and a Democratic Congress managed to enact the Wilson-Gorman Tariff during their one interregnum of joint power in the 1893–1895 session, but they failed to reduce overall tariff levels, and Cleveland refused to affix his signature to the law. They also revived the Civil War–era income tax in 1894, but the Supreme Court struck it down as unconstitutional the next year, concluding that it was a direct tax that had to be levied in proportion to state populations. Even partial tariff reduction, meanwhile, did not stop Republicans from raising tariffs again in the late 1890s, or yet again in the early 1920s, when they undid the Underwood Tariff signed into law by Woodrow Wilson. The Fordney-McCumber Tariff of 1922 became synonymous with Republican protectionism of that era, while the more reviled Smoot-Hawley Tariff of 1930 did much to dampen international trade without necessarily worsening the Great Depression. The push for Smoot-Hawley started as politics as usual for the Republicans after they again won the presidency and a huge majority in Congress in 1928. But by the time President Hoover signed the legislation the economy was entering a deep contraction, and the bill had already encountered stiff opposition from the academic and business communities.[10]

The Sixteenth Amendment to the Constitution, ratified in 1913, authorized an income tax and allowed the federal government to shift the tax burden onto personal and corporate incomes. But the personal exemption, written into the code at $3,000, or $78,000 in 2019 dollars, shielded the majority of earners from liability. Taxes instead disproportionately hit the wealthy, and even under Republican administrations in the 1920s income taxes brought in between 40% and 60% of revenue, with customs duties well under 20%.[11] The Revenue Acts of 1924 and 1926 reduced liability by expanding the personal exemption and cutting the normal and surtax rates. Treasury Secretary Andrew Mellon held singular views on tax cutting, pointing to the need for lower rates and base broadening and even penning the famous tract Taxation: The People’s Business. In his annual report for 1924 he assessed the Revenue Act of that year, which he, along with President Coolidge, deemed inadequate, opining that “if we attempt to levy taxes inherently too high, those whom we seek to tax will find some of the many ways of avoiding the realization of an income which can be reached by taxation, and the source of the revenue will decline.”[12] Mellon wrote of tax cuts and base broadening in anticipation of the supply-side movement some fifty years in the future. But in the next few decades economic depression and total war took tax rates in the opposite direction. Herbert Hoover by 1932 prioritized a balanced budget over lower taxes, and New Deal tax legislation led to even higher rates on the wealthy. World War II helped turn the tax code for the first time into a truly national system. In addition to hiking the top marginal rate to over 90%, Congress reduced the personal exemption, which had previously protected most middle-income taxpayers from liability. Now they were brought into the system, causing income tax filings to rise by nearly 40 million during the war years.[13]

Meanwhile, multilateral trade agreements ratified after World War II further detached the tariff from any lingering revenue-raising purpose. Barriers were dramatically lowered again as a result of trade rounds in the 1960s and 1970s. By that point trade liberalization had gained greater support in both parties, especially among Republicans, though as domestic industries faced new competition from overseas, resistance to further trade rounds grew. But this populist resistance, which found its home in certain quarters of the Democratic Party in a reversal of that party’s historical role, did not stop the long-term trend already underway: the multilateral trade rounds helped unify the free world in an epic new struggle against communism, a struggle that was geopolitical, economic, military, and ideological. With tariff rates declining, the government relied ever more for its revenue on income taxes, mostly personal and to a lesser extent corporate. Perhaps unsurprisingly, the deficit normative state took hold once income taxes generated the overwhelming share of revenue. By 1970, personal income taxes brought in $90.4 billion in revenue, along with $32.8 billion from corporate income taxes, compared to $15.7 billion from excise taxes and only $2.4 billion from customs duties.[14]

The surplus normative state could not survive the Great Depression and World War II, but not until the period from 1945 to 1991, coinciding with the Cold War and momentous political changes at home, did the budget decisively turn deficit normative. This transition was not at first obvious, especially because the first two presidents of the period, Harry Truman and Dwight Eisenhower, both adamantly insisted upon balanced budgets. Their fiscal legacies improbably aligned: each succeeded in balancing the budget for part of his time in office, a shared commitment even though they belonged to opposing political parties and had become personally antagonistic toward one another by 1952. Yet they only partly succeeded in balancing the budget, and their successors were even less successful. Truman secured surpluses in 1947, 1948, 1949, and 1951, while Eisenhower did so in 1956, 1957, and 1960. After Truman and Eisenhower, the budget turned in a surplus only once more during the Cold War, in FY 1969, remaining otherwise in deficit until the late 1990s, when international tensions had considerably subsided.

No single factor caused the transition to deficit normality. The emergence of the entitlement state out of the economic wreckage of the 1930s did not single-handedly produce deficits, particularly since the most significant program enacted during the New Deal, Social Security, was funded through dedicated revenue sources. Postwar military commitments in Europe in the late 1940s, a dramatic break from interwar isolationism, offered the first potential challenge to surplus normality, even as the Truman Administration initially kept defense spending in check. This remained the state of affairs until the shocking outbreak of war in Korea, which utterly overturned the Administration’s fiscal and national security posture. The U.S. military emerged from the Korean War larger and more powerful, so much so that Eisenhower strove to use all his influence and credibility in these matters to keep a lid on defense spending for the rest of the 1950s and to prevent the country from descending into a militarized “garrison state.”

The domestic political situation changed by the 1960s, with John Kennedy and Lyndon Johnson both calling for more activist steps to promote economic growth and a higher standard of living, first with deep tax cuts and then with increased social welfare spending. By the mid-1960s, entitlements had started contributing to deficits year in and year out as programs like Medicare and Medicaid relied at least partly on general revenue for funding instead of exclusively on payroll taxes. Spending for the Vietnam War also contributed to deficits, and the federal government could not achieve a surplus after FY 1969, even with the withdrawal from Vietnam. In the final phase of this era, Ronald Reagan promoted another round of deep tax cuts and dramatically increased defense spending that sent the ratio of debt to GDP soaring under conditions of peace and prosperity, a legacy that his successor, George H.W. Bush, handled as best he could while working with a Democratic Congress.

During the Cold War era, defense alone could not take the blame for persistent deficits. Entitlements exerted an increasingly heavy pull on deficits after 1965. The tax code also shared responsibility for rising and continuous deficits through a series of exclusions, deductions, and credits. An exclusion is the most advantageous of these breaks for the taxpayer, with some benefits, such as employer-sponsored health insurance, going entirely untaxed. Deductions are expenses subtracted from income, whether from gross income (“above the line” on tax forms) or from adjusted gross income (“below the line”), through which taxpayers reduce taxable income and thus liability. Credits are an even more aggressive way of reducing liability, since they are applied directly against taxes owed rather than against taxable income. The tax code is thus Janus-faced, looking forward in one direction to the collection of revenue and looking back in the other to taxpayer refunds that protect a multitude of diverse interests.
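To make the distinction between a deduction and a credit concrete, consider a hypothetical taxpayer facing a 25% marginal rate; the figures are purely illustrative and not drawn from the text:

$$\text{tax saved by a } \$1{,}000 \text{ deduction} = 0.25 \times \$1{,}000 = \$250, \qquad \text{tax saved by a } \$1{,}000 \text{ credit} = \$1{,}000.$$

The deduction’s value thus scales with the taxpayer’s marginal rate, while the credit offsets taxes owed dollar for dollar regardless of the rate.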

The oft-cited dilemma of guns and butter, pitting national security commitments against domestic priorities, should be recast more thoroughly as the paradox of arms, revenue, and entitlements. The competing demands of higher defense spending, lower taxes, and bigger entitlement programs increasingly offloaded onto general revenue cumulatively drove deficits over the span of decades, producing today’s deficit normality. Non-defense discretionary spending cannot be wholly excluded from a consideration of the factors contributing to deficit normality. This category of spending rose sharply as a result of the Great Society programs, with the federal government funneling dollars into housing, energy projects, education, and revenue sharing. As a percentage of GDP, it rose higher in recessions, first to 4.4% in 1975 and then to 5.1% in 1980. Yet mandatory spending programs, which mainly take the form of entitlements and other benefits (net interest does not count as programmatic spending), rose to 10% of GDP in the same period, twice as high as non-defense discretionary spending.[15] Without ignoring non-defense discretionary spending, this narrative focuses on the major drivers of deficits in the second half of the twentieth century: first defense spending, then changes in the tax code, and finally entitlements funded from general revenue.

If the government is ever to return to surplus normality, in which the budget generates surpluses at least in times of solid economic expansion, then the first step in that direction must be correspondingly comprehensive: addressing in equal measure defense spending set at a sustainable level, a tax code that provides maximum revenue, and entitlements that do not unduly burden the general revenue base. Governance is a messy practice, and no single answer lends itself to the task of better controlling debt and perhaps returning to budget surpluses at some point down the road. But seeing exactly how we reached this point provides greater clarity when conceiving of and implementing future prescriptions.

A situational perspective serves as an effective way to depict the transition to deficit normality. The factors affecting how government spends money and raises revenue occur in real time, as the effect of events at home and abroad. Policymakers respond in specific contexts, and often they cannot look beyond short-term horizons. This narrative relies mostly on nominal data from the period, not adjusted for inflation, because that is what policymakers were working with at the time. From this approach we can observe how and under what circumstances deficits became normal as each presidential administration contended with new national and international challenges. The focus on presidential rather than congressional leadership is unsurprising considering the accretion of presidential power over the twentieth century. Members of Congress hold great sway during budgetary disputes. Congress must, after all, pass a budget for the fiscal year before the president signs it into law, and members of Congress may robustly oppose a president with a more relaxed attitude toward deficits. Yet by law the president submits a new budget request early in the year, and the administration sets the terms of debate over which spending items should be prioritized. In the time frame of 1945 to 1991 the possibility of successful budget balancing corresponded, with some exceptions, to the inclinations of the administration in office, while members of Congress, even if at times willful and outspoken, tended to assume a more reactive role.

The main narrative ends in 1991 because by that point, at the Cold War’s end, deficits had become normal, even with Congress and the president having worked feverishly to enact more aggressive deficit control legislation. The 1990s came to be regarded as a kind of hiatus from supreme national security threats and great power rivalry, and in the new century the same divergent demands of arms, revenue, and entitlements once again played out but now in a much more heated and vitriolic atmosphere.

A single momentous turning point in history can rarely be discerned, certainly at the time it is supposed to have happened and even decades later. The same holds true for the shift from the surplus to the deficit normative state. But by looking to the factors impacting the budget in the second half of the twentieth century, new military commitments abroad and regional wars, changes in the tax code, and expansion in entitlements, we can gain a better understanding of how this shift came to occur and perhaps how fiscal balance might one day be restored.
