We talked with Jeromy Anglim, whose article "Comparing Job Applicants to Non-Applicants Using an Item-Level Bifactor Model on the HEXACO Personality Inventory" will be published in a forthcoming issue of EJP.
Read on to find out more about Jeromy and his research, and his thoughts on open science practices.
Don't forget to check out a preprint of his article here.
Q: Can you tell us a little about yourself and your research interests?
I am a lecturer in the School of Psychology at Deakin University in Melbourne, Australia. In general, my research involves applying advanced quantitative methods to the analysis of individual differences and human performance. In 2016, I spent a few months of my sabbatical visiting Belgium and the Netherlands, where I connected with several personality researchers in the region, including Peter Kuppens, Filip Lievens, Filip de Fruyt, Joeri Hofmans, Bart Wille, and Reinout de Vries. This EJP paper is the first of several papers to flow from the collaborations developed during this visit.
Q: What is your study about?
At an applied level, the study looked at differences between job applicants and non-applicants on the 200-item HEXACO Personality Inventory (HEXACO PI-R). The HEXACO conceptualizes personality as consisting of six dimensions: Honesty-Humility, Emotionality, eXtraversion, Agreeableness (versus Anger), Conscientiousness, and Openness to Experience. While previous research has estimated differences between applicants and non-applicants on other personality inventories, and a few studies have looked at responses to the HEXACO PI-R of undergraduate students role-playing as job applicants, to our knowledge, none had looked at real-world applicant differences. Given the growing popularity of the HEXACO model of personality as an alternative to the Big 5, such estimates were important to have. This is particularly relevant to organisations wanting to use the HEXACO PI-R in employee selection settings. We found that applicants responded in more socially desirable ways, scoring substantially higher on Honesty-Humility, eXtraversion, Agreeableness, and Conscientiousness. Applicants also exhibited less variability in their responses, suggesting that applicant responses may compress towards an applicant ideal. This highlights the importance of having job applicant norms (i.e., sample means and standard deviations used to understand participant responses) that are distinct from non-applicant norms. It also provides an assessment of the extent to which applicant response distortion is an issue.
However, I think that the more interesting contribution relates to how the study contributes to an understanding of the structure of trait personality. The study used an item-level bifactor model to represent the structure of personality, which suggests that responding to personality items is influenced by a combination of traditional personality traits (e.g., Big 5, HEXACO 6) and the extent to which participants wish to answer in socially desirable ways. This partitioning of variance is somewhat different from the way that higher-order factor models are currently conceptualised.
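As a rough illustration of this kind of model, an item-level bifactor structure can be sketched in lavaan-style R syntax. The item and factor names below are hypothetical placeholders, not the paper's actual variables, and this is a minimal two-trait sketch rather than the full HEXACO specification:

```r
library(lavaan)

# Hypothetical sketch: three items per trait for two HEXACO-style factors,
# plus a general factor loading on every item.
model <- '
  # substantive trait factors
  honesty      =~ hh1 + hh2 + hh3
  extraversion =~ ex1 + ex2 + ex3

  # general factor capturing socially desirable responding across all items
  general =~ hh1 + hh2 + hh3 + ex1 + ex2 + ex3
'

# orthogonal = TRUE keeps the general factor uncorrelated with the trait
# factors, so each item's variance is partitioned between a trait component
# and a general evaluative component
fit <- cfa(model, data = dat, orthogonal = TRUE, std.lv = TRUE)
summary(fit, fit.measures = TRUE)
```

The key design feature is the orthogonality constraint: because the general factor is uncorrelated with the trait factors, socially desirable responding is modelled as a separate source of item variance rather than being absorbed into the traits themselves.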
Q: What made you decide to submit your manuscript to EJP?
EJP has a strong reputation. EJP is also willing to publish longer and more substantial manuscripts. Thus, it felt like a good fit for our paper. I also respect the value that EJP places on open science (e.g., sharing data, analysis code, and materials). The review process was hard work, but constructive, and worthwhile.
Q: What research interests are you currently exploring, or planning to explore?
I have a general interest in the hierarchical structure of personality traits. I'm doing a few studies comparing the predictive validity of broad and narrow traits. I'm also planning to do further research on the bifactor model of personality. I think there are lots of interesting topics around self-other correlations, quadratic concepts of social desirability, longitudinal stability, modelling considerations, and so on. These interests also manifest in an applied interest in the consequences of the applicant context for personality measurement.
A second stream of research that I am interested in is concerned with the longitudinal dynamics of workplace performance. So, I'm doing various studies where workplace performance has been tracked over time in which we also get experience sampling data. This links with a general interest in mapping individual differences and workplace performance dynamics.
Q: We noticed that you have your data, reproducible analysis scripts, and materials on the Open Science Framework linked to your publication. Can you tell us about your experience with engaging in open science practices?
I've been passionate about this topic for a long time. In the early 2000s, whilst doing my PhD, I was a statistics consultant to students doing their theses. This exposed me to many of the issues with data analysis in psychology: small samples, low statistical power, poorly documented analyses, and motivated reasoning that we'd now label "p-hacking". My thinking at the time culminated in a talk to my fellow PhD students where I discussed issues such as the importance of staying open minded, focusing on effect sizes and confidence intervals, and interpreting p-values. I also advocated data sharing and using more reproducible data analytic tools like R instead of SPSS. Ten years ago, many of these ideas seemed radical and idealistic.
I have watched with great pleasure as various changes have come about. First, there is a growing awareness and documentation of issues associated with flexible data analytic practices and underpowered studies. Second, the tools and support for doing open science have improved dramatically. R has become more accessible with the emergence of RStudio, StackOverflow, and packages like lavaan and psych. In particular, the work of Brian Nosek and the OSF has been fantastic in giving people tools and examples for how to conduct open science. Finally, the incentive structures are ever-so-slowly changing to encourage open scientific practices. In particular, it helps greatly when prestigious journals, such as EJP, show that they value submissions that embody open scientific practices. Sharing data, materials, and data analysis scripts takes work that benefits the scientific endeavour. So, it's great when the rewards to the individual align with what is good for science as a whole.
With regards to the social movement that is the open science transformation, I've tried to focus more on implementation aspects. For example, providing case studies of open scientific practices, blogging about reproducible research, and so on. Anyone interested can check out the data and R scripts used for our EJP paper. I'm also a moderator for a scientific Question and Answer Site on Psychology.
Q: Do you have any tips or advice for young researchers?
Honing skills in research takes a long time. I don't think there is any way around that. But here are a few thoughts:
It's important to learn to write well. At an early PhD level, this might involve reviewing ideas around sentence and paragraph construction, and the general principles of writing and editing. Then reading a lot and deconstructing good example articles becomes an important strategy for learning all the hidden principles that underlie discipline-specific writing.
Overcoming publishing hurdles is also important. In particular, I found Brian McGill's summary of William Shockley's list of publishing hurdles to be illuminating. Shockley lists the following hurdles:
- ability to think of a good problem
- ability to work on it
- ability to recognize a worthwhile result
- ability to make a decision as to when to stop and write up the results
- ability to write adequately
- ability to profit constructively from criticism
- determination to submit the paper to a journal
- persistence in making changes (if necessary as a result of journal action)