Last week, job-search site Glassdoor published a list of companies no longer requiring college degrees -- with Google and Apple notably among them. In this Facebook Live interview, hear from CompTIA CEO Todd Thibodeaux on what this means for the tech industry and the ongoing skills gap. And below, read Thibodeaux’s column from the new issue of CompTIAWorld magazine on the true value and impact of a college education today.
When people ask me what I think about the value of a college education in today’s job market, my usual, cheeky response is to say students are spending $150,000 to show an employer they can socialize effectively. That’s what it’s really come down to. For many employers, and many jobs, the hiring manager couldn’t care less where a student went to school and, oftentimes, even less what their degree is in. And after a college grad has their first job, the school and degree are almost completely irrelevant. What have you done for me lately?
Over the last 20 years in particular, college has become more or less the single biggest barrier to entry into the job market. Employers dogmatically post job after job requiring a college degree – at a 92 percent rate in the tech industry. But when you blow even the slightest amount of dust off the top of the resume stack, it’s pretty clear the majority of those jobs (55 to 60 percent) didn’t really require a degree at all, let alone four or more years of study in subjects that have absolutely nothing to do with the job role’s objectives. How many entry-level marketing jobs, for instance, couldn’t be more than adequately staffed by people with six to nine months of training?
Proponents of the university experience talk about the maturation process college helps provide. Students come into college as dewy-eyed lumps of unformed clay, dependent on others, and they graduate as well-formed self-sufficient adults with dreams and a vision for the future.
But is that really the case? And is college the only place you could gain that maturation experience? And the bigger question: does that process really need to take four years? From a pure learning perspective, a typical college grad will absorb more about how to do a job in their first six to nine months on that job than during their entire college tenure. They will learn more about teamwork, communication, meeting expectations and managing up – all in a specifically relevant context.
The last point in particular makes the case for internships. A college student without an internship or two under their belt is woefully unprepared for their first days in the workplace. But as much as universities try, the overwhelming majority of students never participate in an internship, and many times those who do find the experience underwhelming. It’s the rare organization that takes the internship process seriously enough to provide mentorship, training, goal setting and actual relevant work assignments that require interns to integrate as part of a multi-departmental team. More often, interns end up poorly supervised and poorly instructed, saddled with tasks no one else wanted to do or, worse, assignments with no relevance, the output of which is thrown in the trash the minute the intern turns in their security card.
What needs to change? We’ve had some disruption in the education market over the last decade or so, but a lot of it has just perpetuated the same stale model. Online universities made classes more accessible for some audiences, like adults transitioning to new careers, but the course loads look a lot like the standard fare.
We’ve had the laughably ineffective massive open online course (MOOC) model. The MOOC model is a lot like school choice: your university doesn’t offer the quality courses you need, so you take them from one of the online exchanges like edX. The dropout and participation rates in these courses are disappointing, and they have done nothing to build a better mousetrap.
True disruption would start from scratch. For any particular job role, what are the key sets of skills and knowledge someone truly needs to do the job? What are the ongoing learning requirements of the job to further professional development? And what can we learn from the way professional certifications are developed to help us construct a model?
Organizations that develop professional credentials start first and foremost with a job task analysis (JTA). The JTA takes subject matter experts working in the field through the process of breaking down a job role into a series of learning objectives. The objectives identify in specific terms the knowledge and skills required. These may be highly technical skills, problem-solving skills or soft skills. Those objectives are then turned into training plans, most of which could be covered in less than a year of classroom study and practical application. If you want to extend the model even further, put the learning plan on top of an apprenticeship and we would have a more efficient, cost-effective, two-year model that could prepare almost any individual for any entry-level job in any industry. The apprenticeship model has all the advantages of a university degree program without all the useless overhead.
If the answers are right in front of our faces, why aren’t they adopted? The answer is inertia. This is a big ship with a huge number of entrenched interests. If the goal of college is to get kids ready for the job market, and employers say the model is not working, yet they perpetuate the model by requiring college degrees for most jobs, are they sending the wrong signals?
True disruption, leading to a better model, starts and ends with employers. They are the ones who have to begin to accept students from alternate, non-traditional pathways. If they do, one- and two-year models like the one I described, some of which are happening at community colleges, will begin to flourish. If they don’t, we’ll be stuck in the same quagmire we’re in, where increasingly large numbers of kids will spend six figures to get something of decreasing value.