Here at the Chamber, we like to spend our days delving into theory (and by "we," I mean people who possess a greater cognitive capacity than I do). And this concept of human capital and student loans struck some of us as intriguing.
What if students repaid loans with a percentage of their future earnings? The National Center for Policy Analysis tackled the subject. Check out their analysis, which links to the original article in the Dallas Morning News by Rebecca Tuhus-Dubrow:
Originally the brainchild of Milton Friedman, human capital contracts are seen as a way to remove the risk of overwhelming debt for students and mitigate the social costs of trying to repay it. By gearing repayment to income, the contracts reduce those burdens sharply — a student who earns less money is obligated to pay less back.
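To make the income-geared repayment concrete, here is a minimal sketch. The percentage, the contract length, and the income figures are all hypothetical illustrations, not terms from any actual proposal:

```python
def total_repaid(annual_incomes, share=0.05):
    """Total repaid under a hypothetical human capital contract:
    the graduate owes a fixed share of each year's income, so
    lower earnings automatically mean lower payments."""
    return sum(income * share for income in annual_incomes)

# Two graduates under the same hypothetical 5%-of-income, 3-year contract:
high_earner = total_repaid([80_000, 90_000, 100_000])  # 13,500.0
low_earner  = total_repaid([30_000, 32_000, 34_000])   # 4,800.0
```

Unlike a fixed loan payment, the obligation scales down automatically with earnings, which is the "risk removal" the contracts are meant to deliver.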
The potentially lower payments explain why human capital contracts would draw students, but there’s an attraction for investors, as well, says Tuhus-Dubrow:
An education fund offers investors a steady flow of returns, protection against inflation and a more targeted hedge for large employers.
Investors could be motivated by philanthropic goals: wealthy alumni might see this as a way to help students attend their high-priced alma maters.
Foundations and schools could require students to sign contracts stating that nothing is owed below a certain income level, but high-earning graduates would repay a percentage of their income, allowing the foundation to recycle that money into later classes.
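The foundation-run variant above can be sketched the same way. The threshold and percentage below are hypothetical numbers chosen for illustration only:

```python
def payment_owed(income, threshold=40_000, share=0.07):
    """Hypothetical foundation-run contract: graduates earning at or
    below the income threshold owe nothing for that year; higher
    earners pay a share of income above the threshold, which the
    foundation can then recycle into funding later classes."""
    return max(0, income - threshold) * share

payment_owed(35_000)   # 0 -- below the threshold, nothing owed
payment_owed(100_000)  # 4,200.0 -- 7% of the 60,000 above the threshold
```

The threshold is what shields low earners, while the percentage on income above it is what replenishes the fund.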
However, for all the benefits, the contracts pose multiple challenges in practice, adds Tuhus-Dubrow:
They create an incentive for graduates to hide their income, and make it easier for them to forgo work, since no fixed payment is required.
Adverse selection and discrimination against low-income students could cause problems.
Further, it’s not clear how the contracts would be enforced, how the IRS would treat them and what would happen in the case of bankruptcy.
Boston Globe writer Jeff Jacoby recently wrote an interesting column stating the case for the separation of employment and health care. While most of us have accepted employer-based coverage as an inevitable reality during our lifetimes, he says it simply stems from World War II wage controls that are no longer relevant:
With more than 90 percent of private healthcare plans in the United States obtained through employers, it might seem unnatural to get health insurance any other way. But what’s unnatural is the link between healthcare and employment. After all, we don’t rely on employers for auto, homeowners, or life insurance. Those policies we buy in an open market, where numerous insurers and agents compete for our business. Health insurance is different only because of an idiosyncrasy in the tax code dating back 60 years – a good example, to quote Milton Friedman, of how one bad government policy leads to another…
Unconstrained by consumer cost-consciousness, healthcare spending has soared, even as overall inflation has remained fairly low. Nevertheless, Americans know almost nothing about the costs of their medical care. (Quick quiz: What does your local hospital charge for an MRI scan? To deliver a baby? To set a broken arm?) When patients think someone else is paying most of their healthcare costs, they feel little pressure to learn what those costs actually are – and providers feel little pressure to compete on price. So prices keep rising, which makes insurance more expensive, which makes Americans ever-more worried about losing their insurance – and ever-more dependent on the benefits provided by their employer.

We thus ended up with a healthcare system in which the vast majority of bills are covered by a third party. With someone else picking up the tab, Americans got used to consuming medical care without regard to price or value. After all, if it was covered by insurance, why not go to the emergency room for a simple sore throat? Why not get the name-brand drug instead of a generic?