The minimum wage — a government-mandated minimum amount that employers must pay employees — has been part of the American political landscape for roughly a century. In 1912, Massachusetts became the first state to establish a minimum wage, and within a decade, many more states had followed suit.
It was not until the Great Depression that the first attempt was made to establish a federal minimum wage. In 1933, the Roosevelt administration established minimum wages under the National Industrial Recovery Act, the centerpiece legislation of the New Deal. Only two years later, however, the Supreme Court struck the act down as unconstitutional. Undeterred, Roosevelt instituted a federal minimum wage of $0.25 per hour in 1938 as part of the Fair Labor Standards Act, and this time, a more pliant Supreme Court discovered constitutional authority for a federal minimum wage in the oft-abused interstate commerce clause of the U.S. Constitution. Accordingly, the minimum wage became the law of the land, and it has remained so ever since.
Quite aside from the question of whether a federal minimum wage is constitutional (it is not), it is worth asking whether any government-mandated minimum wage is good policy. Most American voters now assume that government ought to set some type of minimum wage in order to prevent employers from exploiting their workers; they quibble only over what that level should be. In 2006 alone, for example, voters in six states approved state-level minimum wage hikes, and today 29 states mandate minimum wages higher than the federal one. But despite this broad popular consensus, is the minimum wage in fact good policy?
The bedrock assumption behind the minimum wage is that if employers are left to their own devices, some of them will pay their workers less than a fair living wage. Government oversight is therefore required to protect the rights of employees, or so it is maintained. But a moment’s reflection will reveal that such arguments are no different in substance from those advocating government price controls. Indeed, wage and price controls are often lumped together by policymakers — because they are in fact the same thing. A wage or salary, after all, is nothing more than the price paid for labor, a market good like any commodity, piece of furniture, or piece of equipment. The rules of supply and demand that cause prices to rise and fall in a free market apply to the labor market just as they do to any other economic good. If a manufacturer or retailer tries to charge too much for a given article, or produces or sells an item of inferior quality relative to competitors, buyers will take their business elsewhere. In this way, prices in a free market tend toward the lowest level consistent with the best quality.
It is likewise in the labor market. If an employer pays too little for labor, those who supply it — the labor force — will seek employment elsewhere. By the same token, if one laborer does shoddy work for a given wage while another is willing to work harder and more efficiently for the same pay, the latter will consistently earn more, enjoy greater job security, and so forth. There is nothing inherently fair or unfair about the free market, including the labor market. But experience shows that the free market is the most efficient mechanism yet discovered for producing the best products at the best prices. It is the free labor market, for example, that once guaranteed that American laborers were not only the most productive but also the best-paid in the world.
On the other hand, when government chooses to intervene in the workings of the free labor market by mandating a minimum-wage level (which is invariably higher than what the free market would dictate), it disrupts and distorts the workings of the free market to the detriment of all. Just as price ceilings lead to shortages of goods, so wage floors lead to a surplus of labor that goes unsold — in other words, unemployment.
Minimum wages price the lowest-tier workers out of the labor market: employers are unwilling to pay $7.00 or $9.00 per hour — or whatever the mandated minimum of the moment happens to be — for workers whose competence and experience would warrant, say, a $5.00 per hour wage.
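The price-floor mechanism described above can be sketched with a toy linear supply-and-demand model. Every curve and number below is invented purely for illustration — none is an empirical estimate:

```python
# Stylized linear labor market (all parameters hypothetical, for illustration only).

def labor_demand(w):
    """Hours of labor firms want to hire at hourly wage w (assumed curve)."""
    return max(0.0, 100 - 10 * w)

def labor_supply(w):
    """Hours of labor workers offer at hourly wage w (assumed curve)."""
    return max(0.0, 10 * w)

# Market-clearing wage: 100 - 10w = 10w  =>  w* = 5
w_star = 100 / 20
employed_at_eq = labor_demand(w_star)   # 50 hours hired; supply equals demand

# A binding $9 floor, above w*: firms hire less while more labor is offered
w_floor = 9.0
hired = labor_demand(w_floor)           # 10 hours hired
offered = labor_supply(w_floor)         # 90 hours offered
unemployed = offered - hired            # the surplus is unemployment

print(w_star, employed_at_eq, hired, unemployed)
```

In this stylized market the floor raises the wage for the few hours still hired, but the gap between labor offered and labor hired — zero at the market-clearing wage — becomes the unemployment the essay describes.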
But, say supporters of the minimum wage, it is unjust to allow employers to pay employees less than a living wage. The response to this is twofold. First, who determines what constitutes a “living wage”? It is ludicrous to believe that government bureaucrats and policymakers can make sounder economic judgments than the countless decisions of mostly rational actors in the free market. Second, not all labor need provide a “living wage.” Labor performed by teenagers still living at home (who typically have few marketable assets other than a strong back) need not command one; neither does labor performed for a second or supplementary income. That the teenage or part-time worker does not earn as much as an adult or full-time worker is not unjust, but a reflection of socioeconomic realities. The miracle is that a free market economy is flexible enough to provide work for such as these.
When government takes away such options, unemployment of the young and less-qualified is the inevitable result. Not only that, but the young and less-qualified are denied the opportunity to gain the on-the-job experience that would help them acquire the skills they need to move up the economic ladder. This, and not general indolence, is the reason that teenage and college summer jobs are becoming quaint relics of yesteryear, and that the work ethic and job skills of America’s youth have declined so drastically.
The minimum wage is bad policy at any level of government. By denying the young, the inexperienced, and the less-qualified a chance to work, it limits opportunity and encourages sloth. It has no place in a free market economy or in a free society, and ought to be abolished.