karl taylor


I’m noticing a startling trend.

Over the past few years, the number of job functions related to online marketing has grown dramatically. Titles haven’t always kept up, and some shops always seem to have a slightly different way of doing things.

I first noticed this trouble when we began working through rounds of contract and part-time talent. Breaking apart who was responsible for what on a team is a chore. Some people play minor parts and are quick to take credit for all the work. Others have a very specific task that may not have been the right task for the situation. Divorcing the experience from the lesson it taught is part of the challenge of hiring. It isn’t new and it isn’t going anywhere.

But in many fields, job titles are fairly standardized. If someone tells you they worked as a third grade teacher, for example, you know exactly what they did. They taught third grade.

But when someone tells you that they’ve worked with Facebook Ads for three years, you really don’t have a great way of picking apart what that experience meant.

Many small businesses publish a daily post on two social channels; Facebook and Instagram are a popular pairing. Working out the math, you might be making as many as 10 original pieces of content a week that way. In reality, you may make slightly fewer and use a mix of OPC (other people’s content) and recycled assets. You may also spend a few minutes hitting “boost post” and an incalculable amount of time monitoring, engaging and responding to a (hopefully) growing community.

The trouble is, the ads platform is a lot more powerful than many people realize. Posts don’t just show up in your newsfeed (or the feeds of your audience) via magic.

If you’ve read through Facebook’s Marketing API documentation before, you’ll know what I’m talking about. You’ll be familiar with each of the different types of edges, fields and nodes in the Facebook graph. You’ll know how to split out Clicks and Link Clicks, and you’ll probably have a series of custom views you use to highlight the reporting metrics you care about.
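For a sense of what that looks like in practice, here’s a minimal sketch (in Python, using the requests library) of pulling campaign-level insights from the Marketing API and splitting Clicks from Link Clicks. The API version, account ID and token are placeholders, and field names drift between versions, so treat it as an illustration rather than a recipe.

```python
import requests

GRAPH = "https://graph.facebook.com/v19.0"   # version is an assumption; use whatever is current
AD_ACCOUNT_ID = "act_1234567890"             # placeholder ad account ID
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"           # placeholder token

# Pull the last 30 days of campaign-level insights.
resp = requests.get(
    f"{GRAPH}/{AD_ACCOUNT_ID}/insights",
    params={
        "level": "campaign",
        "date_preset": "last_30d",
        # 'clicks' counts every click on the ad (reactions, profile visits, etc.),
        # while 'inline_link_clicks' is much closer to what reports call "Link Clicks".
        "fields": "campaign_name,impressions,clicks,inline_link_clicks,spend",
        "access_token": ACCESS_TOKEN,
    },
    timeout=30,
)
resp.raise_for_status()

for row in resp.json().get("data", []):
    clicks = int(row.get("clicks", 0))
    link_clicks = int(row.get("inline_link_clicks", 0))
    share = link_clicks / clicks if clicks else 0.0
    print(f"{row['campaign_name']}: {link_clicks} link clicks / {clicks} clicks ({share:.0%})")
```

A candidate who has genuinely lived in this data will notice right away when those two numbers diverge, and will have an opinion about why.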

If this is your experience, and you’ve got three years of it, you’ve proven that you can do pretty much anything you want in the world of ad targeting. You may need to learn the quirks of a new platform, or the nuances of the differences between auction styles, but you’re well on your way to developing a valuable hard skill.

The trouble is that at the end of the day, your job title won’t look that much different from your colleagues who are walking a less arduous path.

I haven’t found a great way to reliably differentiate between these two types of candidates, but we do have a workaround I’ve grown rather fond of.

We’ll create an ad account that conforms to conditions you might find at a typical business. We’ll include some historic campaigns and spin up some post engagement ads. We’ll position everything, from the audience on down, to recreate a situation with an obvious problem and a handful of potential solutions.

Then we see what they find and what they do about it.
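For illustration only, seeding one of those paused post engagement campaigns through the Marketing API might look something like the sketch below. The account ID, token, API version and objective name are placeholders (objective values in particular change between API versions), and this isn’t meant to describe our exact setup.

```python
import requests

GRAPH = "https://graph.facebook.com/v19.0"   # version is an assumption
AD_ACCOUNT_ID = "act_1234567890"             # placeholder test ad account
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"           # placeholder token

# Create a paused engagement campaign to serve as part of the test fixture.
resp = requests.post(
    f"{GRAPH}/{AD_ACCOUNT_ID}/campaigns",
    data={
        "name": "Assessment - Post Engagement",
        "objective": "OUTCOME_ENGAGEMENT",    # objective name varies by API version
        "status": "PAUSED",                   # it is a fixture, so it never spends
        "special_ad_categories": "[]",        # required field; empty for most businesses
        "access_token": ACCESS_TOKEN,
    },
    timeout=30,
)
resp.raise_for_status()
print("Created campaign:", resp.json().get("id"))
```

Repeat that for a handful of campaigns with deliberately uneven settings and you have an environment with an obvious problem baked in.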

Ike Ellis suggests a similar approach in “I Will Not Do Your Tech Interview”: when the problem is verifying the skills of a key hire, the easiest way to do that is to create an environment where you can actually find that out. Despite the great variety of commonly used aptitude test types, there really just isn’t a perfect replacement for documented, verifiable success.

I think a lot about Roger Nesbitt’s questions in “Designing a Great Technical Test Experience,” so I’ve copied them here for convenience.

A technical test provides a framework for the following information to leak out:

How do they handle feedback, both positive and negative?

How fast do they pick up on new concepts?

How much have they been exposed to the concept of elegant code?

What do they do when they don’t know something?

What are they fast at? What are they slow at? What are they sloppy at?

Where are the gaps in their knowledge? How far does their knowledge extend? How aware are they of this?

If given an opportunity to, do they cheat by going outside of the rules of the test? Do they admit it when challenged?

How do they justify the decisions they made in their code? How defensive are they?

What are their values when it comes to development? How flexible are they with those values?

If you’re doing it right, it should be possible for your candidate to completely “flunk” the test, but for you still to hire them because you see the value they’ll bring over the next year.

I think there’s something to this.

If you (or your prospective team member) can accurately gauge strengths and weaknesses, there’s a good chance that individual will also be effective at implementing plans to improve on those strengths and mitigate the impact of any weaknesses. That’s a recipe for success.

So a project-based approach doesn’t fix the problem of uneven candidate credentialing, and it doesn’t make it any easier for a well-qualified-but-poorly-packaged candidate to stand out. What it can do is highlight the way your prospective new player will fit into the team, and it should give you a clear picture of which learning projects you need to make sure your new employee undertakes. If you’re looking at this from the job seeker’s side, I suspect this is the reason folks doing hiring generally pay more attention to applications that are the product of a novel approach to problem solving.

While you may not be ready to change the way you do everything today, over the next few years we’re likely to see the popularity of these sorts of assessments grow. It might be a good idea to start thinking about them now.
