Most Americans live and work with the expectation that, sometime between their 65th and 70th birthdays, they will be able to retire from their workaday activities and spend their golden years enjoying grandchildren, traveling or pursuing other forms of recreation, or doing volunteer service such as missionary work that the demands of a career and growing family once precluded. Although a comparatively recent by-product of modern prosperity, retirement — like medical care and education — has come to be widely viewed as an essential component of the lifestyle of comfort and diversion to which several generations of modern Americans have become accustomed.
Yet the rite of retirement, for millions of Americans, is less reality than pipe dream. Every day, younger Americans shopping for groceries, eating out at family restaurants, or taking their children to school interact with elderly men and women running cash registers, waiting tables, driving commercial trucks and buses, teaching classes, and doing myriad other types of full-time work. To those of us who remember our grandparents as the people who always had time and means to entertain their grandchildren, because full-time careers and other concerns of youth and middle age were finished, this can seem jarring. And while some working senior citizens may do so out of a desire to keep busy and be productive, a majority continue working because they are financially unable to retire.
Although America continues to progress technologically and economically, the decrease in the percentage of senior citizens able to retire is a troubling indicator of a larger trend: today’s rising generation can expect to be less well-off than their parents and grandparents, and many elements of the traditional American dream, such as a well-earned retirement, are becoming relics of the past.
The expectation of retirement, an outgrowth of the development of large-scale financial independence among the middle and working classes, is not much more than a century old. Government-sponsored retirement, via pensions for the elderly, was first implemented in 19th-century Germany, by Otto von Bismarck. The idea of government (i.e., taxpayer-funded) pensions for the elderly began catching on elsewhere in Europe, and reached the United States during the Great Depression, when the Social Security system was created as part of FDR’s New Deal. It should be noted that government pensions as a means of providing retirement were (and remain) very much a part of the socialist program; prior to the advent of modern socialism, government pensions in some countries (including the ancient Roman Empire) were confined to those who had served in the military.
But Social Security has never been more than a partial retirement program; even in its early years, a Social Security check often was not enough to enable its recipient to retire fully. Nowadays, with Social Security benefits dwindling relative to rising costs of living, the anticipated Social Security check will not be enough to pay for all the necessities of even the most spartan lifestyle. For this reason, Americans regard retirement as something primarily funded by private income and savings. Most large employers still offer retirement savings programs for their employees, while tax-advantaged accounts such as IRAs, created under federal law, encourage earners to set aside money for their retirement, using tax deductibility as an inducement.
Besides company-based pension plans, Americans traditionally relied on investment portfolios, accumulated over several decades of working and saving, to provide retirement security in old age. It was once axiomatic that money invested in stocks and bonds — especially in “blue-chip” corporations that returned reliable gains year after year — was an essential part of a retirement nest egg. In this respect, the life of this author’s grandfather was a typical embodiment of the “American dream” as it was understood during much of the 20th century. After graduating from college in the 1930s with an engineering degree, my grandfather began work at a large oil company. Despite purchasing a nice home in an upscale suburban New Jersey neighborhood and raising four children, he set aside a portion of his earnings every month to buy stocks. Because his peak career years coincided with the great post-World War II economic expansion, his portfolio grew by leaps and bounds, allowing him to retire in the mid-1970s, in his early 60s, move to a bucolic Pennsylvania country home, and spend much of the rest of his long life helping his children and grandchildren. When he passed away, he left a considerable inheritance to all four of his children and indelible memories of a hardworking American who had risen from a humble upbringing in Pennsylvania coal country to live life to the fullest.
Such biographies seem outlandish in the “new economy” of the 21st century, a time when even a modest suburban home typically costs many times the annual salary of even higher-paid professionals such as doctors and engineers, when a family-sized SUV costs more than many small homes, and when a college degree runs well into six figures. Most Americans today are so deep in debt — from mortgages, car payments, and student loans — that becoming debt-free anytime prior to their 60th birthday is not a realistic prospect.
This article appears in the July 23, 2018, issue of The New American.
And what is the ultimate source of this economy of debt? Five generations of inflation, courtesy of the Federal Reserve System and its creation of fiat (unbacked) money via the issuance of more and more debt. The operation of modern central banking, of which the American Federal Reserve (actually the central bank of the United States, as the Bank of England is to the United Kingdom or the European Central Bank is to the European Union) is a typical study, is often simplified as a process whereby governments print unlimited amounts of “fiat” money, thereby debasing the value of the currency and causing a general rise in prices. This is how inflation works in principle, but it is a bit of an oversimplification. If central banks merely cranked up the printing presses (or the computer keyboards, in the modern world), manufactured money at whim, and passed it out like candy at a Halloween parade, people would quickly cotton on to what was taking place. They would recognize the source of all the new money, and a catastrophic loss of confidence in the monetary system would ensue. This occurred during the American Revolutionary War, when the Continental Congress was so artless as to simply print lots of paper “Continental dollars” and ship them off as payment. Hyperinflation ensued, and the expression “worthless as a Continental” summed up the public’s esteem for America’s first fiat money.
The architects of modern central banking, most notably Benjamin Strong, the first head of the Federal Reserve Bank of New York, who is usually credited with designing most of the operational features of the Federal Reserve, realized that they would have to find more subtle mechanisms of introducing new money into the money supply, to keep the general public in the dark. And the mechanism they settled on was debt, both public and private.
For a large portion of its manipulation of the money supply, the Fed relies on “open market operations” (supposedly devised by Strong, and now used by central banks all over the world), whereby the Fed buys and sells government-issued debt (“treasuries”) at regular sessions. To ensure that these operations are not too inflationary, the Fed normally buys Treasury debt on the secondary market — that is, debt already sold by the U.S. Treasury to previous buyers, rather than purchased from the Treasury directly. This creates greater and greater demand for government debt, but only indirectly. Nevertheless, the overall effect is that the government issues a certain amount of new debt, which will ultimately be paid for by the creation, ex nihilo, of new money by the Federal Reserve. Open market operations are thus a very strong incentive for the government to issue more and more debt, leading to growth in the money supply and a general rise in prices as the dollar’s value is eroded over time.
The Fed also creates powerful incentives for private and commercial debt by lowering interest rates. Technically, the Fed can lower (or raise, if deflation is the goal) the interest rate that it charges member banks to borrow money. It knows that by doing this, it will also allow banks to lower the interest rates on commercial loans, car loans, mortgages, student loans, and other debt that they issue. And lower interest rates — usually significantly lower than free market rates — are a strong inducement to the private sector to borrow more money than it otherwise would. Easier credit incentivizes increased borrowing, which in turn drives up prices.
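The link between interest rates and the volume of borrowing can be seen in the standard loan-amortization formula: the same monthly payment services a much larger loan when rates fall. The payment amount, rates, and term below are illustrative assumptions, not figures from this article.

```python
# Sketch: how lower interest rates let borrowers take on larger loans.
# All dollar figures and rates here are illustrative assumptions.

def affordable_principal(monthly_payment, annual_rate, years):
    """Loan principal a fixed monthly payment can service
    (standard amortization formula)."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    return monthly_payment * (1 - (1 + r) ** -n) / r

# The same $1,500/month payment supports very different loan sizes:
high = affordable_principal(1500, 0.08, 30)   # at an 8% rate
low = affordable_principal(1500, 0.04, 30)    # at a 4% rate
print(f"At 8%: ${high:,.0f}")
print(f"At 4%: ${low:,.0f}")
```

Halving the rate from eight to four percent lets the same household bid roughly 50 percent more for the same house, which is one concrete way cheap credit pushes prices up.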
Put simply, the mechanism by which the Fed and all modern central banks operate is the ability to convert debt into money. Because this operation depends for its effectiveness on a lack of understanding by the general public, these procedures are seldom discussed outside of advanced finance courses at elite universities, and in the privileged councils of the highest echelons of banking and finance. Yet from time to time, there are cracks in the façade. In September 1941, then-Chairman of the Federal Reserve Marriner Eccles was testifying before a befuddled House Committee on Banking and Currency (now the Committee on Financial Services), when the following revealing exchange took place between Eccles and Congressman Wright Patman, the committee chairman, in response to Patman’s inquiry as to where the Fed had obtained $2 billion to purchase government bonds in 1933:
Eccles: We created it.
Patman: Out of what?
Eccles: Out of the right to issue credit money.
Patman: And there is nothing behind it, is there, except our government’s credit?
Eccles: That is what our money system is. If there were no debts in our money system, there wouldn’t be any money. [Emphasis added.]
In other words, if Americans suddenly began refusing to borrow money, our inflationary money system would collapse. Debt is the broken reed upon which the modern fiat money systems lean.
Inflation, properly defined as an increase in the money supply, has a devastating effect on the long-term economic well-being of ordinary people in two ways. First, the infusion of new money into the economy drives up prices of everything, including big-ticket items such as houses, cars, and college educations, requiring the borrowing of more and more money to afford them. Mortgages that used to take 10 or 20 years to pay off are now extended for three decades or more, meaning that the average individual can expect to spend his entire productive life paying off his mortgage. Student loans for ordinary four-year undergraduate degrees now routinely exceed $50,000, and also require decades to pay off — along with mountains of interest accrued along the way. Under such conditions, Americans have become accustomed to purchasing even ordinary items, such as household appliances, on credit.

The second major consequence of inflation is that it disincentivizes old-fashioned savings. Even if they do not understand its mechanisms, most people grasp that, over time, money put into a savings account, a CD, or a cash stash will — thanks to steady inflation — lose value. They therefore reason that buying on easy credit or putting money into riskier investments dependent on the vicissitudes of the stock and money markets makes more sense than a simple savings plan. Tragically, under the modern regime of inflation, it is simply no longer true that a penny saved is a penny earned.
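Both consequences are two sides of the same compounding arithmetic: the price of goods rises by the inflation rate each year, while idle cash shrinks by the same factor. A minimal sketch, assuming a steady three percent annual rate purely for illustration:

```python
# Sketch of the two effects of inflation described above, assuming a
# steady 3% annual rate (an illustrative figure, not an official one).

def real_value(nominal, rate, years):
    """Purchasing power of idle cash after `years` of inflation."""
    return nominal / (1 + rate) ** years

def future_price(price, rate, years):
    """Sticker price of a good whose cost simply tracks inflation."""
    return price * (1 + rate) ** years

savings = real_value(10_000, 0.03, 24)    # cash under the mattress
house = future_price(100_000, 0.03, 24)   # the same house, 24 years on
print(f"$10,000 saved is worth about ${savings:,.0f} in today's dollars")
print(f"A $100,000 house comes to cost about ${house:,.0f}")
```

At three percent, prices double and idle savings halve in roughly 24 years (the familiar rule of 72), which is exactly why a simple savings plan no longer works the way it did for earlier generations.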
The long-term effect of all of this, of course, is that few Americans can realistically contemplate retirement, since they will spend their productive years paying off debt rather than saving and investing for their golden years. As those of us nearing retirement age can attest, youth passes quickly, and with it, opportunities to set aside money for a time in life when we will no longer be able to bounce out of bed at 5 or 6 a.m. and work five or six days a week, eight to 12 hours (or more) a day. Yet many Americans are less worried than they should be about the steady corrosion of their and their children’s standard of living.
One reason some of us do not get overly concerned about inflation is that it seems to be too insignificant to make much of a difference. Why get exercised over something that, we are told, has largely been “tamed,” that runs at an insignificant rate of one or two percent a year? Inflation is the bugbear of backward economies in corrupt Third World countries such as Argentina and Zimbabwe, we reassure ourselves, but is not a factor in “advanced” economies such as our own.
The whole truth is rather more stark. It is certainly true that countries such as Argentina have higher, more obvious rates of inflation. In such places, currency has virtually no worth at all, so earnings are immediately converted into foreign currencies such as U.S. dollars, or into precious metal, real estate, and other assets guaranteed not to depreciate. Such countries, whose currencies are not readily negotiable outside their own borders, do not have the luxury, as the United States does, of exporting some of the effects of inflation by selling their government debt to foreign entities.
But with our own “manageable” inflation rates, the full story is artfully concealed from the public by the “smooth-tongued wizards” at the Bureau of Labor Statistics and other government and quasi-government entities. Their job is to massage statistics in order to divert public attention from the mess we’re in. The way in which inflation rates are calculated has been modified repeatedly in recent decades, such that our current official inflation rate of around 2.5 percent would be over six percent if calculated by methods used in 1990, and nearly 11 percent if reckoned according to the standards of 1980. What’s more, the official inflation rate of just over five percent during the depths of the Great Recession in late 2008 would have come out at about nine percent in 1990 terms and 14 percent in 1980 terms. These are numbers quite comparable to the infamous “stagflation” of the 1970s and 1980s, yet they are no longer news because the truth has been buried in a landfill of lies and misleading statistics.

Moreover, these “inflation rates” are based on the consumer price index (CPI), which evaluates a basket of prices of goods and services deemed “consumables” by the government economists who calculate it. The way the CPI is calculated has often drawn criticism for not covering all aspects of pricing, and thereby providing an incomplete snapshot of inflation. Housing prices, for example, are evaluated in terms of rental costs, not the actual cost of new houses. The cost of houses purchased is represented in terms of how much they would be worth as rental units, the rationale being that a house purchase is a one-time event, not a “consumable” product such as groceries or gasoline.
Because the CPI does not accurately take into account the dizzying rise in the price of certain big-ticket items such as new houses, it tends to significantly underreport real inflation rates, and encourages the delusion that inflation is only about rising costs in the produce aisle and at the gas pump.
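The reason the methodology matters so much is that small differences in the reported annual rate compound into enormous differences over time. A quick sketch, using the three rates cited above and an illustrative 20-year horizon (the horizon is an assumption added here):

```python
# Sketch: why the choice of inflation methodology matters over time.
# The 2.5% / 6% / 11% rates are those cited in the article; the
# 20-year horizon is an illustrative assumption.

def cumulative_price_rise(annual_rate, years):
    """Total price-level increase from compounding a constant rate."""
    return (1 + annual_rate) ** years - 1

for label, rate in [("official (~2.5%)", 0.025),
                    ("1990 method (~6%)", 0.06),
                    ("1980 method (~11%)", 0.11)]:
    rise = cumulative_price_rise(rate, 20)
    print(f"{label}: prices up {rise:.0%} over 20 years")
```

Over two decades, 2.5 percent compounds to a roughly 64 percent rise in prices, six percent to more than a tripling, and 11 percent to roughly an eightfold increase; the gap between the official figure and the older methods is anything but academic.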
Caught in the inflationary vise between rising prices and vanishing savings, many Americans have adapted in the only way they can: by borrowing and spending money rather than saving it, and by assuming that somehow, the Powers That Be (the administrators of the Social Security program and of their corporate pension plans) will have the wisdom to ensure that they still have enough assets to retire in their golden years, as their parents and grandparents were able to do.
But the economic reality is far otherwise. As almost everyone is at least vaguely aware, the Social Security system is broke, dependent on ongoing government deficit spending to continue meeting its obligations to Social Security recipients. No amount of gradual reductions in the promised benefits can ultimately stave off insolvency for Social Security. This is an unpleasant truth that too few Americans are willing to accept. No one really knows how deeply in debt the Social Security program is, since its balance sheet is arbitrarily deemed “off-budget.” It has been estimated that, because of Social Security’s dire straits, the federal debt may be tens of trillions of dollars more than the official figure of around $21 trillion.
Whatever the case, the value of Social Security payouts in real purchasing power has declined dramatically over the decades, and will continue to do so. And the inflationary regime has destroyed nearly 100 percent of the value of the dollar since the founding of the Federal Reserve in 1913, and will continue to wreak havoc on our national currency as long as the fiat money system is allowed to endure.
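The near-total erosion of the dollar since 1913 can be checked with simple arithmetic. The CPI-U index levels below, roughly 9.9 in 1913 and roughly 251 in 2018, are approximate figures supplied here for illustration rather than taken from the article:

```python
# Rough check of the dollar-erosion claim, using approximate CPI-U
# index levels (about 9.9 in 1913 vs. about 251 in 2018). These index
# values are an assumption added for illustration.

cpi_1913 = 9.9
cpi_2018 = 251.0

remaining = cpi_1913 / cpi_2018   # fraction of purchasing power left
lost = 1 - remaining
print(f"A 1913 dollar buys about {remaining:.1%} of what it once did")
print(f"Roughly {lost:.0%} of its purchasing power is gone")
```

On these figures a 1913 dollar retains only about four cents of its original purchasing power, which is what “nearly 100 percent” amounts to in practice.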
Nor is inflation the only reason for our slowly declining standard of living. In addition to participating in the inflationary con game, the federal government also spends lavishly on all sorts of welfarist conceits that violate both constitutional limits on federal power and the dictates of economic common sense. For example, student loans are not merely facilitated by a monetary system that rewards greater and greater indebtedness; they are also directly subsidized by the federal government, a factor that further contributes to unnaturally cheap and easy credit for education loans and the consequent run-up in the cost of a college education. To appreciate just how great this run-up has been, consider the findings of a recent article on Quartz.com, which showed that an average student in 1979 needed to work 182 hours at minimum wage to pay for a year’s college tuition — money that could easily be raised with a part-time summer job. In 2013, the average student would have to work 991 hours (a full-time, half-year job) to accomplish the same. As a more specific example, in 1979 a single credit-hour at Michigan State University cost 8.44 hours of minimum-wage work, meaning that an entire semester’s tuition could be paid for with a couple of weeks of such work. But by 2014, a single credit-hour at MSU cost a whopping 60 hours of minimum-wage work. And these figures pertain only to tuition; the cost of housing, textbooks, and overall fees have mushroomed alongside tuition. Quartz.com concludes that, by 1993, “working one’s way through college,” an age-old component of the American dream, had become impossible.
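The hours-of-work comparison above can be made concrete with a short calculation. The hours-per-credit figures come from the Quartz article cited; the federal minimum-wage levels ($2.90 in 1979 and $7.25 in 2014) are assumptions added here for the arithmetic:

```python
# Worked version of the hours-of-work tuition comparison. The
# hours-per-credit figures are from the Quartz article cited in the
# text; the minimum-wage levels are assumptions added here.

def implied_cost(hours_of_work, hourly_wage):
    """Dollar cost implied by 'N hours at minimum wage'."""
    return hours_of_work * hourly_wage

msu_1979 = implied_cost(8.44, 2.90)   # per credit-hour, 1979
msu_2014 = implied_cost(60, 7.25)     # per credit-hour, 2014
print(f"1979: about ${msu_1979:.2f} per credit-hour")
print(f"2014: about ${msu_2014:.2f} per credit-hour")
print(f"Nominal increase: {msu_2014 / msu_1979:.1f}x")
```

On these assumptions, a credit-hour that once cost about $25 in nominal terms came to cost about $435, a roughly eighteenfold nominal increase over a period in which the minimum wage itself merely rose two-and-a-half-fold.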
It is well documented that universities now spend enormous amounts on the bloated salaries of legions of administrators, many of them hired to enforce every trivial politically correct mandate and quota imaginable. Where student tuitions once paid the salaries of a university president, provost, and handful of essential administrative staff, in addition to the professors, they now pay to support the bloated bureaucratic minions made both necessary and possible by modern progressive Big Government. All of these conceits would vanish in a hurry were colleges and universities once again forced to operate without generous federal subsidies and student-loan guarantees. Otherwise put, if students had to pay their own way through college, up front or via loans with no deferred payments or interest schedules, they would demand that colleges and universities again provide top-notch, no-frills, cost-effective education. This would lead to, among other things, a swift return to college degrees that could be paid for within a few years of graduation, like those my parents received, and an end to the lifetime of bondage that is modern student-loan debt.
As another example, imagine what would happen to home prices if federal subsidies of home loans were eliminated. Under the current regime of inflation and subsidies, home prices have skyrocketed. In 1970, the average cost of a house in 2000 dollars was $65,600. By 1980, it had risen to $93,400; by 1990, to $101,100; and by 2000, to $119,600. Today the median home price is about $200,000, despite the housing crash of the last decade.
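Since those figures are already stated in constant 2000 dollars, they imply a steady real growth rate in home prices that a compound-growth calculation can recover:

```python
# Sketch: annualized real growth implied by the constant-dollar home
# price figures above ($65,600 in 1970 to $119,600 in 2000).

def cagr(start, end, years):
    """Compound annual growth rate between two price levels."""
    return (end / start) ** (1 / years) - 1

real_growth = cagr(65_600, 119_600, 30)   # 1970 -> 2000, 2000 dollars
print(f"Real home prices grew about {real_growth:.1%} per year, 1970-2000")
```

About two percent real growth per year may sound modest, but compounded over a working lifetime it nearly doubles the real price of a house, before any nominal inflation is added on top.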
If the government subsidies, loan guarantees, and inflationary policies that have been driving the housing bubble were to go away, people would be far less willing to borrow massive amounts of money, and sellers would be forced to lower their prices. The traditional middle-class midlife ritual of burning the mortgage upon final payment would return, and most couples would be living in their own home by the time they reached their early 50s.
If such changes came to pass, Americans would rediscover the joy of paying for homes, cars, and college educations in reasonable time intervals, and once again contemplate being debt-free by middle age and retired in their senior years. We would once again be a nation of solvent, self-reliant people able to help our children rise even further than we could, and not dependent for our entire lives on the illusory largess of bankers and Big Government.
But that is not what America’s leftist social planners want. They envision not a nation of rugged individualists but a cowed populace of dependent paupers. Inflation and government subsidies (including so-called entitlements) work together to produce such a result, and the disappearance of retirement — a fringe benefit of modern civilization only possible in a free market society that produces surplus wealth — is but one of the byproducts of the leftist program. Whereas traditional American culture valued — and could realistically contemplate — self-reliance, modern America is enthralled with government dependency. Self-reliance, after all, can only be realized by paying off debt and accumulating savings, both of which are increasingly difficult in inflationary times.
It has become fashionable to deride “millennials” for their supposed attitude of entitlement. But in fairness to the rising generation, how can they be expected to be as self-reliant as their parents and grandparents if the erosion of our economic well-being denies them the opportunities their parents had? The entitlement mind-set, which has been around in certain social strata for generations, tends to create its own demand. Today’s housing and education subsidies were put in place in response to the demands of earlier generations, especially during the ’60s and ’70s. The damage they have inflicted on housing and education costs has triggered cries for more entitlements, which have in turn exacerbated the problems.
What is needed in our day, if we are to return to — and perhaps even exceed — the standard of living of recent generations, including this writer’s, is a complete break with the erroneous ways of thinking urged on us by the Left. Instead of denigrating the free market system our ancestors enjoyed, we should celebrate and re-institute it. We should do away with the Byzantine regulations, subsidies, and other forms of federal interference in our economy that have created this crisis. Above all, the century-old regime of inflation that has quietly and steadily robbed middle-class America of its wealth should be done away with, the Federal Reserve shuttered, and fiat money replaced with the old-fashioned precious-metal standard that allowed generations of Americans to prosper.
Calls to do these things have been sounding for years, but so far, the “Swamp” has successfully resisted serious rollbacks of its leftist program. For things to change, more Americans need to understand exactly how their wealth and dreams are being stolen. That a candidate such as Donald Trump could be elected president shows that more and more Americans are waking up to the reality that something is seriously wrong with our country. The fact that, so far, Trump has found few supporters in Congress for many of his proposals, and that taxes and inflation remain high, indicates that too few Americans yet understand why and how we have reached such a pass.