By Joe Kennedy

What good is a college degree, anyway? We all know that high-quality education is a foundation of economic opportunity and the American Dream. And we hear all the time that in an innovation-driven economy, companies' competitive success hinges on their ability to hire highly educated and skilled workers. But what does that college degree really tell us about the student who earned it?

The answer is that it's hard to say, because diplomas are opaque. They represent institutional brands, not objective or quantifiable measures of academic achievement. This basic flaw stems from the fact that colleges and universities hold a unique franchise: They have both the responsibility to educate students and the power to bestow academic credentials. If we want to improve the quality and reduce the costs of higher education, we should break the link between teaching and credentialing by encouraging qualified third parties to step in and offer independent evaluations of students' knowledge, skills, and abilities.

As it is now, the system is rife with misaligned incentives. For starters, even though we live in an era in which information technology and the Internet offer powerful new ways to conduct research, learn, and impart knowledge, schools have little incentive to encourage students to pursue learning outside the classroom through means such as massive open online courses (MOOCs), because doing so would cut into tuition revenue. Likewise, they have little incentive to raise educational standards, because doing so risks driving away customers (i.e., students). Why force students to work hard or give them poor grades if doing so would invite poor teaching evaluations and disgruntled "customers"?

For their part, students have little incentive to push themselves harder than is necessary to gain their degrees - because, for the most part, a degree is just a degree in the eyes of a potential employer. Why work hard to learn more if you know you are likely to receive a good grade, or at least a passing grade, and in any case receive a diploma that serves as official validation? At the end of the process, employers are left to rely on a combination of a college's reputation and a student's grade point average to measure quality, but both are imprecise measures, the latter particularly so after decades of rampant grade inflation. How are employers to know whether one applicant worked harder or learned more in school than another?

It is against this backdrop of misaligned incentives that the costs of higher education have been going up while quality has been falling. In the last 20 years, the inflation-adjusted total price of a four-year college degree rose by 85 percent at public schools and 61 percent at private schools. That is significantly more than health-care costs rose over the same period. Much of this money has been spent on new classrooms, modern dorms, and even indoor rafting courses - amenities that may make college more pleasant but do not improve its quality.

At the same time, many colleges are doing a poor job of teaching a significant portion of their students. A 2003 study found that full-time students studied for only 27 hours a week, down from 40 hours in 1961. Also in 2003, the National Assessment of Adult Literacy found that only 34 percent to 40 percent of college seniors were proficient in various forms of literacy. A review of students taking the Collegiate Learning Assessment found that 36 percent showed minimal gains over their four years in college. Meanwhile, 34 percent of business leaders said college graduates lack the skills and competencies their firms need.

Solving these problems will require going beyond popular policy prescriptions. Much of the current debate about higher education centers on making college more affordable to students by increasing public subsidies or lowering interest rates on student loans. But that would only shift the cost problem to taxpayers; it wouldn't actually solve it. And by giving students even less reason to care about the substance of what they are learning, free or more heavily subsidized college may actually make the quality problem worse. Instead, we need to pressure schools to constantly improve on both cost and quality.

The first step is to create a process for measuring actual educational attainment that can be used to either supplement or replace the traditional degree awarded by an institution of higher education.

A new report from the Information Technology and Innovation Foundation proposes a series of policy initiatives to do exactly that.

First, Congress should encourage more skill-based testing by establishing a process to accredit organizations that provide certifications. It should also prime the market by encouraging federal agencies to accept alternative certifications in lieu of traditional degrees when they are hiring and by leaning on the private sector to do the same. Congress should allow students to use federal aid for alternative learning options, such as MOOCs, and it should press graduate programs to consider applicants with alternative certifications. Finally, the administration should conduct a regular survey of employer needs and build the findings into the certification tests so that students can demonstrate their readiness for the job market.

The catalyst for reform must be to fundamentally realign incentives so that they encourage more learning at lower cost, thereby ending the current trend of pricier degrees with unclear value. In the short run, alternative testing and accreditation will give students a way to demonstrate their employability. In the longer run, it should not only offer an alternative to four years at a traditional institution but also create real incentives for universities and students to do better.

Joe Kennedy is a senior fellow at the Information Technology and Innovation Foundation. @JV_Kennedy