June 2019 news from Clinical Psychology, Quantitative Psychology, Meta Psychology, and Open Science. For prior news, see the rubric Psychology News on this blog.
- Ruscio with a new review paper “Normal Versus Pathological Mood: Implications for Diagnosis” published in the Annual Review of Clinical Psychology. [Personal note: Check out this paper for the argument that the ‘same’ disorder might be dimensional for some people and more categorical for others]
- Kessler et al. on “Machine learning methods for developing precision treatment rules with observational data” because clinical trials often have too little data to work with. Money quote: “In this paper, we propose to address the sample size problem by: working with large observational electronic medical record databases rather than controlled clinical trials […] and using ensemble machine learning methods rather than individual algorithms to carry out statistical analyses to develop the precision treatment rules”.
- Kevin Mitchell with a new blog post on missing heritability. As usual, the blog post also contains a down-to-earth summary of the literature, and an accessible explanation of many genetic topics.
- Markus Jokela et al. with a great follow-up paper on work I did in grad school: depression symptoms are differentially related to important outcomes and risk factors.
- Jennifer Tackett et al. (who was promoted to Full Professor two days back—congratulations Jennifer!!!) with “Psychology’s Replication Crisis and Clinical Psychological Science”
- Border et al. with a fantastic multiverse paper on depression candidate genes. Money quote: “the results suggest that early hypotheses about depression candidate genes were incorrect and that the large number of associations reported in the depression candidate gene literature are likely to be false positives.” Slate Star Codex wrote a fantastic blog post on the topic, and I wrote a blog post as well.
- New preprint by Aaron Fisher on “Generating Accurate Personalized Predictions of Future Behavior: A Smoking Exemplar”. Includes time-series data for 52 participants (!) and syntax.
- Our formalized computational model of Panic Disorder is online, the result of a 2-year collaboration among clinical psychologists, statisticians, ecologists, & complexity researchers. Preprint here, summary of the project either in the recent talk by Don Robinaugh (project leader) or on Twitter.
- Blog post by Marilyn Piccirillo summarizing her 2 recent papers: “Foundations of Idiographic Methods in Psychology and Applications for Psychotherapy” and “A Clinician’s Primer for Idiographic Research: Considerations and Recommendations”. Includes neat reading lists for digging into these topics!
- Feczko et al. discuss the heterogeneity problem in an insightful review in Trends in Cognitive Sciences, and review approaches to identify psychiatric subtypes.
- Calamia on “Practical considerations for evaluating reliability in ambulatory assessment studies”. Money quote: “Given the time and effort involved in conducting ambulatory assessment studies, proper consideration of reliability is needed so that researchers have sufficient power to take advantage of the large amount of data often available within a single ambulatory assessment study”. I couldn’t agree more: measurement is highly under-researched for time-series psychological data.
- Matti Heino with a brief blog post illustrating the information we miss when averaging across people.
- Emorie Beck & Joshua Jackson published a new book chapter as a preprint on within-person variability (history, unanswered questions, research agenda).
- Two new papers on causal inference in cross-lagged panel models & panel data analysis. A YouTube talk accompanies the papers (1, 2).
- Super mega utmost important news: Game of Thrones color palette in R ;)
- The set.seed() function in R might no longer produce reproducible results across R versions. Problem and solution described here.
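A minimal sketch of the issue, assuming it refers to R 3.6.0 changing the default algorithm that sample() uses (so the same seed can yield different draws than under R < 3.6.0):

```r
# Since R 3.6.0, sample() uses a new "Rejection" algorithm by default,
# so set.seed() alone no longer guarantees the same draws across R versions.
set.seed(123)
new_draw <- sample(1:10, 3)  # uses the new default, sample.kind = "Rejection"

# To reproduce results generated under R < 3.6.0, request the old algorithm
# (R warns that "Rounding" is slightly biased):
set.seed(123, sample.kind = "Rounding")
old_draw <- sample(1:10, 3)

# Or pin the entire RNG behavior to an older R version:
RNGversion("3.5.0")
set.seed(123)
pinned_draw <- sample(1:10, 3)
```

Either fix works for rerunning old analyses; for new work, recording sessionInfo() alongside the seed is the safer habit.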
- Egon Dejonckheere, Merijn Mestdagh et al. on “Complex affect dynamics add limited information to the prediction of psychological well-being”. Great twitter summary of the paper here.
- There was a new paper calling for abandoning statistical significance, and a few commentaries and blog posts were posted in response. I summarize them here.
- Riet van Bork’s dissertation is online, on interpreting psychometric models. Highly recommended.
- Pfaff et al. published “Tinbergen’s challenge for the neuroscience”, in which they discuss clear criteria for when a neuroscience problem is solved: understanding a behavior’s development, its mechanisms, its function, and its evolution. These complex issues call for large-scale collaborations and a rethinking of funding infrastructure.
- Lonsdorf et al. show in their new paper “Fear Extinction Retention: Is It What We Think It Is?” that there are at least 16 different operationalizations of fear extinction, and that the correlations among these operationalizations vary considerably. The authors replicate these differential relations in 4 independent datasets.
- Sara Weston et al. with a new preprint on “Recommendations for increasing the transparency of analysis of pre-existing datasets”. Covers the important topic of how to implement open science practices when reanalyzing data.
- Preregistration workshop by Cassie Brandes! (given in her department)
- Preregistration workshop by Anna van ‘t Veer, Dan Simons, David Mellor, Simine Vazire, Katie Corker, and Stephen Lindsay! (given at APS 2019)
- A study examined 38,000 reviews from 13,000 grant applications across all disciplines. When applicants choose their own reviewers, there is considerable bias. This is also concerning for peer review of scientific publications: many journals *require* authors to suggest reviewers.
- Brian Nosek provides a preliminary report on “The Rise of Open Science in Psychology”. TL;DR: 🇦🇺🇳🇱🇬🇧 are doing well, 🇩🇪🇺🇸🇨🇦 not so much; social psychology is doing well, clinical, education, & health psychology are not; young folks are better adopters than old folks; and 35% (!) of the roughly 2,000 coded faculty have OSF accounts, which is a lot more than I had anticipated. Caveat: this is a bit of an artificial selection at present, as pointed out on Twitter.
- Sweet editorial by Ruggeri et al. on “Advancing Methods for Psychological Assessment Across Borders”, with a bullet-point style of recommendations for advancing psychological science such as preregistration, replication before exploration, publication of null-findings, and — most importantly — using the Oxford Comma ;).
- Claesen et al. published “Preregistration: Comparing Dream to Reality”, in which they compared 27 preregistered papers published in Psychological Science 2015-2017 with their preregistration plans; all papers deviated from their original plans to some degree, & only one disclosed all deviations.
- Allen & Mehler on “Open science challenges, benefits and tips in early career and beyond”. Money quote: “We review 3 benefits and 3 challenges and provide suggestions from the perspective of ECRs for moving towards open science practices, which we believe scientists and institutions at all levels would do well to consider.”
- Richardson with a preregistration guide for dealing with outliers.
- Together with Kelci Harris, Emorie Beck, Jessica Flake, Lorne Campbell, and Melissa Kline, we organized the very first APS hackathon, on “Best Research Practices Made Easy”. You can find the products of the hackathon on the OSF, and our amateurish selfie here ;).
- Cool #MeasurementSchmeasurement paper by Enkavi et al. showing that self-report measures of self-regulation have much higher retest reliability than behavioral (“objective”) measures. Great to see PNAS publish critical methodological studies.
Favorite tweet of the month
A collection of fake Epidemiology methods such as Regression to the Collider, Minimum likelihood estimation, Monte Cristo Simulation, Propensity Cluster Censoring, and Fox-Hound M Estimation for Misclassification.
Can you come up with a fake epi methods word or phrase that sounds real? — Matthew Fox (@ProfMattFox) May 23, 2019
PS: I gave a keynote recently entitled “Depression is a problematic phenotype” at King’s College London as part of the “Dissecting the Heterogeneity of Major Depression” symposium. Audio recording and slides are online, and Jonathan Coleman provides a fantastic summary here.
Cannot find the full text of a paper I’m linking to? I describe a way around the problem here; keyword: ‘sci-hub’.