Assessment for Readiness: How Principled Measurement Can Advance Career Preparedness

REL Mid-Atlantic Recognizes Career and Technical Education Month
Feb 07, 2019

RELevant: Viewpoints and Findings from the REL Mid-Atlantic 

High school completion rates have risen steadily over the past 40 years. Currently, about 93 percent of 18- to 24-year-olds hold a high school diploma or an alternative credential such as a GED, and completion gaps between White, Black, and Hispanic students have narrowed dramatically. This is good news for career readiness because, quite simply, jobs demand credentials.

Yet the notion of high school completion as an achievement indicator faces some skepticism. Yes, more students are completing high school, but too few are ready for the next step. To wit, research conducted by Thomas Bailey and the Community College Research Center at Teachers College, Columbia University has found that 60 percent of students who complete high school and enter community colleges require remediation before they’re ready for credit-bearing coursework. Colleges aren’t alone in their concerns; 46 percent of U.S. businesses report difficulty finding suitable candidates.

None of this is news to those who follow education trends. Unfortunately, acknowledging the problem isn’t the same as identifying and pulling the levers that can address it. One necessary lever is rigorous and relevant assessment.


Of course, we already assess K–12 students in reading and mathematics, early and often. We should continue to do so, because reading and mathematics competencies are strong predictors of postsecondary readiness. But we do not consistently measure other predictors that matter—the soft skills or interpersonal and intrapersonal competencies that go beyond basic academic knowledge and offer a holistic picture of what students need to be ready to succeed after high school. Some skills, such as conscientiousness, are broadly applicable and have been shown in study after study to influence success in a range of postsecondary endeavors. Other constructs, such as sense of belonging, address more specific deficits and goals. Why would assessing any of these soft skills act as a lever toward improving career readiness? There are two reasons:

First, measurement sends a message. When we assess achievement in reading and mathematics rigorously and regularly, we signal that reading and mathematics achievement is valuable. If we begin assessing soft skills with the same discipline and consistency, we signal those skills’ importance.

Second, soft skills tend to be actionable competencies rather than fixed aptitudes, so targeted intervention becomes the next logical step. In fact, numerous programs already exist to develop a wide range of soft skills such as growth mindset, time management, and study skills. Measuring soft skills consistently will help educators identify which skills are weakest among their students, so that interventions can be targeted where they will maximize a program's impact.

Of course, the two arguments above raise two legitimate concerns.

First, can measuring something really change it? No. Weighing a pig does not make it heavier. But principled measurement does provide reliable information, which can be leveraged to evaluate the impact of educational programs. A measure is not an intervention, but without measurement, it’s difficult to separate the strategies that are effective from those that are not.

Second, the soft skills literature is broad, and the terminology is diverse. So, which soft skills should we be measuring? In many cases, a comprehensive assessment battery capturing an array of soft skills will do the trick. For example, few educational frameworks have broader practitioner input and buy-in than the Four Cs (critical thinking, communication, collaboration, and creativity), and I have argued elsewhere that few psychological frameworks have a stronger research base than the Big Five (conscientiousness, agreeableness, emotional stability, openness, extraversion). For districts concerned about over-assessing their students, very brief measures are available for capturing key soft skills without overburdening students, teachers, and schools.

Of course, single-minded pursuit of “the one best framework” may undervalue the specific priorities of a given educational program or system. Measurement sends a message, so just as academic achievement tests must align with state learning standards, soft skills assessments should align explicitly with local aims and needs. Different educational agencies have different student populations and may have somewhat different goals. Therefore, picking the right measures to assess progress requires not just grounding in the research, but also accounting for local input. Fortunately, numerous federal programs, local foundations, and nonprofits can help. For example, the Regional Educational Laboratory Mid-Atlantic’s readiness for career entry and success alliance is currently working with a group of school districts in southwestern Pennsylvania to help teachers and administrators advance three goals:

  1. Identify key soft skills related to readiness for high-demand and high-growth occupations in southwestern Pennsylvania (for example, health care occupations such as occupational and physical therapists and jobs in the energy sector)
  2. Pick appropriate assessments (or develop their own in consultation with measurement experts)
  3. Monitor their students’ progress over time

To be sure, these assessments won’t eliminate readiness deficits on their own. Interventions—not measures—affect educational outcomes. But measures like those under construction in southwestern Pennsylvania can signal the importance of career success and empower the educators who develop them. If we start thinking of measurement not as an auditing device, but as a tightly aligned, indispensable element of the readiness toolbox, we stand a better chance of helping all students graduate from high school with the knowledge and skills they’ll need for the next step.

Cross-posted from the REL Mid-Atlantic website.


The opinions expressed are those of the author(s) and do not represent those of Mathematica.