(This article originally appeared on Fast Company.)
(This article originally appeared on Forbes, through the Forbes Human Resources Council, where HR executives across all industries offer leadership advice and insights.)
Today Criteria is proud to announce the launch of JobFlare, a new iOS app designed to help job seekers get connected with employers based on their abilities. The app features six fast-paced brain games that measure qualities employers look for in their employees: attention to detail, verbal ability, math skills, and memory.
One US company received significant publicity recently when it introduced a controversial new “test” designed to weed out what it calls whiny, entitled, millennial job candidates. The “snowflake test,” as it’s called, features a series of cherry-picked questions designed to determine whether a job applicant shares the same political and cultural viewpoints as everyone else at the company. Some of the questions include:
For some time, just about every industry was raving about the potential of Big Data – the process of analyzing enormous data sets to discover patterns and trends that can then be used to guide business decisions. In the world of HR, the discourse on Big Data became so prevalent that the term started to be used as a catch-all description for any type of predictive analytics in the hiring process. But long before the concept of “Big Data” took off, companies that favored data-driven, evidence-based hiring methodologies were using pre-employment tests to gather information on prospective employees. And while pre-employment testing may be an older, more established way of gathering data on job candidates, it differs in several critical ways (both ethical and practical) from Big Data.
As a pre-employment testing provider, we offer general aptitude and personality tests as well as micro-skills tests such as typing tests and computer skills assessments. We’ve written before about some of the differences between general tests and more specific tests, yet we’ve found that many people still have misconceptions about the profound differences between them, both in terms of the science behind each type of test and the kinds of results companies should expect from them.
Most hiring tools are designed to accomplish two primary tasks: to more accurately identify quality candidates, and to make the hiring process move more quickly and efficiently for both employers and job seekers.
Cognitive aptitude is one of the best predictors of job performance because it measures so many key drivers of work success – the ability to solve problems, think critically, and learn new skills. But does cognitive aptitude vary from state to state?
Amidst all the buzz over the advent of “big data,” HR departments are increasingly focused on using data to improve their talent acquisition strategies. In our particular business—developing pre-employment assessments used by businesses to help inform their hiring decisions—we are seeing an increasing willingness on the part of employers to adopt evidence-based hiring tools. The goal of all this is simple: better hiring results, or in other words, improvements in quality of hire (QoH).
We’ve previously written about the use of the Wonderlic aptitude test on NFL draft prospects, pointing out that the popular press and NFL fans as a whole have often unfairly dismissed aptitude tests as irrelevant to future gridiron success. This dismissal seems to be based on jock stereotypes about the sport and on a misunderstanding of how tests, and predictive tools in general, work. Virtually every article about the Wonderlic test at the NFL draft mentions Dan Marino, who bombed the Wonderlic and went on to a Hall of Fame career, as evidence that the tests aren’t predictive of success in football. However, this type of anecdotal evidence carries no weight when statistically determining whether a test works.
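To see why a single counterexample can’t refute a predictive relationship, consider a minimal sketch using entirely made-up numbers (not real Wonderlic scores or NFL data): a set of score/performance pairs with a clear positive trend, plus one Marino-style exception with a low score and elite performance. The overall correlation weakens but remains clearly positive, because prediction is about the trend across many candidates, not any one individual.

```python
# Illustrative sketch with hypothetical data: one outlier does not
# overturn a statistical relationship between test scores and performance.

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical test scores (x) and performance ratings (y) with a broad trend.
scores = [10, 14, 18, 22, 26, 30, 34, 38, 42, 46]
perf   = [ 3,  4,  4,  5,  6,  6,  7,  8,  8,  9]

r_before = pearson_r(scores, perf)

# Add one dramatic exception: a very low score paired with elite performance.
scores_with_outlier = scores + [12]
perf_with_outlier = perf + [10]
r_after = pearson_r(scores_with_outlier, perf_with_outlier)

print(f"r without outlier: {r_before:.2f}")
print(f"r with outlier:    {r_after:.2f}")  # weaker, but still clearly positive
```

The point of the sketch is simply that validity is a property of the aggregate pattern: a tool can be a strong predictor on average while still getting individual cases “wrong.”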