Interview: Christopher Brydges

This is a continuation of a series of blog posts interviewing different authors who have posted to PsyArXiv. The goal of these posts is to spotlight different uses of PsyArXiv and to help the community get to know some of our authors a little better. In this post, Hannah Moshontz, a graduate student at Duke University and a member of the PsyArXiv Steering Committee, interviews Christopher Brydges (@ChrisBrydgesPhD), a Postdoctoral Research Associate at Colorado State University, who responded to a tweet asking people about unusual uses of PsyArXiv.

Hannah: Can you tell me a bit about your manuscript? What did you find? How does this project relate to your other work?

Chris: The original study was an undergraduate's research project that I supervised, examining whether working memory and/or processing speed mediated the association between fluid intelligence and ADHD symptomology in a nonclinical undergraduate student sample. (I'd really like to test this in a clinical sample, but given the time and sample size required, that wouldn't have been feasible for an undergraduate project.) We found that working memory, but not processing speed, fully mediated deficits in fluid intelligence associated with ADHD symptoms in this sample. The study was conducted in 2014 and published in the Journal of Neuropsychology in 2015. The paper that was put on PsyArXiv as a preprint was a reanalysis of these data—rather than examining average reaction time, we looked at the variability of reaction time and found that this (and working memory) fully mediated the ADHD–fluid intelligence associations.

My research has always focused on higher-order cognition, particularly its developmental aspects (typical childhood development, typical aging, and development in children born very preterm). This was the first time I had supervised a student's research project, so it seemed like a good way to conduct a study that was still somewhat in my area of expertise but was also of interest and relevance to the student—she's now working as a clinical psychologist.

Hannah: You mentioned this preprint to me after I tweeted asking people about unusual uses of PsyArXiv, and you said that you uploaded it to make your CV more competitive for a job application. Tell me more about why you decided to post it, and what advantages you expected it to give you on the market. Do you have any evidence that it helped you in the way you thought it would?

Chris: In late 2017, I was coming towards the end of my contract for my postdoc position at the University of Western Australia—my PI at the time very kindly extended my contract for about 10 weeks, but didn’t have funding for any longer—and I was looking for a new job. I came across the advert for the postdoc position I’m now working in, and after reading the job description, finding the Google Scholar profile of my current PI, the lab website, etc., I had a pretty clear idea of where I could improve my application—in my case, learning about intra-individual variability in response time, what it’s thought to be a measure of, and how to calculate it. I was lucky to have this previously collected data that I could use to try out this new technique. From there, I spent a weekend going through the analyses step by step and writing a draft of the paper before sending it to the two coauthors from the original paper. Once they were happy with it, we put it on PsyArXiv and submitted it for publication.
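For readers curious about how intra-individual variability in response time is typically calculated, here is a minimal sketch in R. This is not the code from the preprint; the data frame rt_data, its columns id and rt, and the trial cutoffs are hypothetical and only illustrative. The idea is simply to compute, for each participant, the standard deviation of their trial-level reaction times (and, optionally, the coefficient of variation as a mean-adjusted index).

# Hypothetical sketch, not the preprint's actual analysis code.
# Assumes a long-format data frame rt_data with columns id (participant)
# and rt (trial-level reaction time in milliseconds).
library(dplyr)

iiv <- rt_data %>%
  filter(rt > 150, rt < 2000) %>%     # drop implausible trials (illustrative cutoffs)
  group_by(id) %>%
  summarise(
    mean_rt = mean(rt),               # average reaction time
    isd_rt  = sd(rt),                 # intra-individual standard deviation
    cv_rt   = sd(rt) / mean(rt)       # coefficient of variation (mean-adjusted)
  )

Each row of iiv would then hold one participant's variability estimates, which could feed into a mediation model in place of (or alongside) mean reaction time.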

I wanted it to go onto PsyArXiv because it's my belief that if I'm going to include absolutely anything on my CV, I need to be able to provide verifiable proof that I have experience in that area/can conduct that particular analysis/can program in that particular language, etc. I don't think it would've been worth nearly as much for me to say "I saw the job description and spent last week learning about intra-individual variability"! I think my application had to be quite strong to begin with, but doing this hopefully showed that I was serious about the job, that I'm a fast learner, and that I'm a productive researcher. My PI has expressed similar sentiments—my application was strong, but the preprint definitely added a couple of bonus points. Even though I had my own data to do this with, I believe that in the future this will be easier for motivated applicants to do themselves, given the number of open datasets being published now.

Hannah: How did you hear about PsyArXiv? Any general reflections on the value of using it? Any costs or fears related to using PsyArXiv (realized or not)?

Chris: I have quite a few projects on PsyArXiv and the Open Science Framework now—a combination of preprints of papers submitted to peer-reviewed journals, and null results that have been written up to show that we attempted this research. The School of Psychological Science at the University of Western Australia (where my previous postdoc position was) is very progressive in terms of open science, and I heard about PsyArXiv and preprints during a meeting there. My biggest fear has been that I would get an email from someone saying that I'd analysed my data completely incorrectly, that I'm incompetent, and that I should never set foot on university grounds again—fortunately, this hasn't happened yet! Posting openly has actually improved my research practices: after analysing the data and writing everything up, the last thing I do before uploading is put all the data in a folder and run the scripts/code one more time, just to double-check the results. This last check hasn't caught anything yet, which means that everything I've uploaded so far should work and be consistent with the numbers in the papers. No one's contacted me or anything, though I think I've gained a few more likes, retweets, and followers on Twitter because I occasionally retweet PsyArXiv-bot when I upload something there.

Thank you, Chris!

Interview: Cory Costello

We are starting a series of blog posts interviewing different authors who have posted to PsyArXiv. The goal of these posts is to spotlight different uses and experiences of PsyArXiv and to help the community get to know some of our authors a little better. In our first post, Ben Brown, Associate Professor of Psychology at Georgia Gwinnett College and Chair of the PsyArXiv Steering Committee, interviews Cory Costello, a graduate student at the University of Oregon.

Ben: First of all, congratulations on your first first-authored paper and the attention that the preprint has received! As I mentioned previously, PsyArXiv is starting a series of blog posts interviewing different authors about their uses and experiences of PsyArXiv. To start, would you mind introducing yourself to our readers?

Cory: Thanks—both for the kind words and this opportunity! My name is Cory Costello; I'm a doctoral candidate in the Psychology Department at the University of Oregon, working primarily with Sanjay Srivastava in the Personality and Social Dynamics Lab. Before getting to Oregon, I was an MA student in the Department of Psychology at Wake Forest University, where I worked primarily with Dustin Wood (now at the University of Alabama). I did my undergrad at a small, public liberal arts college called New College of Florida. My research interests revolve around personality assessment/measurement, personality development, interpersonal perception, reputation, and how personality is expressed and perceived online (in social networking environments such as Twitter). More broadly, I'm interested in methodology, data analysis, and how to make psychological science better, more open, and more reproducible.

Ben:  Can you tell me a little about this new method for assessing personality?

Cory: The basic idea of the method was to take the concept of revealed preferences and apply it to personality assessment. Revealed preference techniques are used in some areas of psychology (e.g., close relationships/attraction) and seem to be even more common in behavioral economics. The basic idea of revealed preferences is that rather than asking people about their preferences for different features (e.g., do you find confident people attractive?), you can instead infer (or reveal) their preferences by correlating some criterion (a rating, a choice behavior, etc.) with the presence or absence of those features (e.g., do they rate more confident people as being more attractive?).

We attempted to take a similar approach to personality assessment, with the assumption that personality traits primarily describe patterns of trait-relevant behavior (e.g., someone's level of a trait like dependability means they tend to behave more often in ways that would be described as dependable). So we administered a bunch of short action scenarios to participants that set up a situation (e.g., you come home and your living space is messy from your roommate cooking) and asked them how likely they would be to respond in a particular way (e.g., how likely would you be to clean up the mess?). Then, we had another group of participants rate each of these actions (e.g., cleaning up after your messy roommate) on 23 bipolar personality adjectives (e.g., dependable/reliable vs. undependable/unreliable). The correlation between the likelihood ratings and the independently rated trait-ness (we call this action characterization in the paper) of each action is what we're calling a revealed trait estimate. So it's sort of like one of those old choose-your-own-adventure novels, but where your choices are being rated and seen as reflecting your personality.
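As a rough illustration of that last correlation step (a hypothetical sketch, not the paper's actual code or variable names): suppose likelihood is a participants-by-actions matrix of self-rated likelihoods, and trait_ness is a vector giving the judges' averaged characterization of each action on one trait (say, dependability). A participant's revealed trait estimate is then the correlation, across actions, between their likelihood ratings and that characterization vector.

# Hypothetical sketch in R of computing revealed trait estimates.
# likelihood: n_participants x n_actions matrix of likelihood ratings
# trait_ness: length-n_actions vector of averaged judge ratings for one trait
revealed_trait <- apply(likelihood, 1, function(p) {
  cor(p, trait_ness, use = "pairwise.complete.obs")
})

Higher values would indicate that a participant reports being more likely to perform actions that independent judges characterize as more trait-typical.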

One kind of neat thing about this approach is that it's pretty flexible. In addition to having the action scenarios rated for their trait-ness, we also had them coded for expected effects (e.g., the likelihood of being rejected), which we were interested in because of their relevance to personality trait judgments (which we had been working on in a previous, related project: Wood, Tov, & Costello, 2015). But in theory, you could code actions for other variables of interest.

I guess the last thing to mention is that we think this technique might be especially useful in contexts where more typical personality assessment may have problems. In our case, we were interested in applying it cross-culturally, as there have been some surprising results in cross-cultural comparisons of personality that may stem from measurement issues. Of course, some of the measurement issues may still affect revealed traits, but some might be mitigated. In particular, having a standard set of raters characterize the actions (rate their trait-ness) means that we can rule out that any cultural differences stem from using a trait term differently. For example, we can rule out that differences in revealed outgoingness between the two groups we looked at (which we found in both studies) are due to different understandings of what it means for an action to be outgoing, because that judgment is being made by a standard group. A similar technique (situational judgment tests) has been developed in the I/O psychology literature to deal with similar challenges in workplace assessment.

Ben: Sounds like a cool, novel approach! What do you think the benefits are of sharing such a novel approach on PsyArXiv?

Cory: My main motivation for sharing this particular work on PsyArXiv was to provide easier access to it. I first started working on this manuscript in the summer of 2014. I had just finished my Master's degree at Wake Forest and hadn't yet started the PhD program at the University of Oregon. There was this three-month gap in my access to all of the tools/resources provided by a university (library subscriptions, an SPSS site license, etc.), but I was still trying to get work done on a few projects, including this one. Whenever I was searching for literature, I was always so grateful to see a link to a PDF on Google Scholar (usually posted on a personal or lab website, since this was before PsyArXiv). I ended up with this list of articles I had to look up when I got access to a library subscription again. I mean, it ended up working out alright in the end, but I can't help but feel like a stronger preprint culture in psychology would have saved me a bit of a headache.

Incidentally, that summer is also when I first started learning R, which was prompted by not having access to SPSS. Necessity is a great motivator.

Had PsyArXiv existed when I first submitted this paper, I would have loved to post it earlier in the pipeline (either pre-submission, or at the time of initial submission) to elicit feedback. Sanjay Srivastava, Gerard Saucier, and I did that with a different paper, and received some interesting and helpful feedback on Twitter.

Ben: It sounds like that’s a pretty stable (and awesome!) motivating factor. How likely are you to share work on PsyArXiv in the future? Why?

Cory: I hesitate to say 100% likely—I have a pretty unshakable sense of uncertainty about just about everything. With that said, I'd say I'm extremely likely (effectively 100% likely) to share my work on PsyArXiv in the future. I'd say my primary motivation is still making the manuscript easier to access. As I alluded to in my last email, I'd like to get more in the habit of sharing pre-submission/working papers on PsyArXiv to elicit feedback, and so I suspect my motivation to share work on PsyArXiv will be increasingly driven by seeking feedback. I'm working on a few different manuscripts right now, and I think we (Sanjay and I) will post preprints at the time of initial submission (maybe before that; he and I haven't explicitly discussed that yet).

One day, I'd like to post them before submission to potentially get feedback that I can incorporate into the manuscript before the initial submission, but I just don't know if that extended timeline makes sense at my current career stage. In my perfect world, hiring committee expectations would be calibrated to a more extended timeline like that (e.g., preprints on a CV would be seen more positively; grad students would be expected to have fewer pubs and more preprints; etc.), but I don't know if the field is there yet. So, for the time being, my goal is to post a preprint on PsyArXiv concurrently with submitting it to a journal, but hopefully I'll be able to move posting the preprint to an earlier point in that pipeline eventually.

Ben: Thanks!  What has been your experience of sharing your work on PsyArXiv? Anything surprising?

Cory: It’s been a positive experience. I’ll admit, it was nerve-racking when we posted the first preprint I was involved in (Sanjay posted the preprint), especially because we posted that at the time of initial submission. I had this sort of slight feeling of dread: ‘what if there was some obvious mistake in the manuscript or data/code we posted?! At least if these things come up in peer review it’s private!’ But, in the end, it was fine, and led to some really great feedback on Twitter.

I think the most surprising thing occurred with that same preprint I just mentioned. I was telling someone about the project recently, and they responded with, "Oh, this is the preprint you guys posted on PsyArXiv, right?" I guess I still sort of have in my mind that if a paper hasn't been published in a journal, no one would know or care about it. But that preprint has been downloaded quite a bit (314 times as of today). We're still figuring out what to do with that manuscript (it was rejected on our initial submission), but I feel really grateful that the work is available to others in the meantime.

Ben: That’s encouraging to hear that PsyArXiv has helped to make your work more visible.  Are there any new features you would like to see from PsyArXiv in the future?

Cory: I'd like to see features that sort of perform the curating role that journals often perform. It might be helpful if PsyArXiv recommended preprints similar to the one you're viewing. One thing that I think would be cool, and maybe the blog is aimed at doing this, would be to have some sort of monthly list of preprints (similar to subscribing to a journal). I'm thinking of some sort of most-viewed/downloaded set of preprints within the topic area(s) the person has indicated interest in.

Ben: Thanks for those ideas, Cory! I’ll pass them along and see what we can do. One last question: What would you say to a friend who is thinking about posting?

Cory: I would say go for it! It’s a great way to get your work out there – both so that people learn what you found, and so they can tell you how the work can be improved. Plus, it’ll make it easier for folks to find your work if they’re in between universities and still trying to get their research done.