
Another victim in the "replication crisis" bloodbath for psychology.

Personally, I'm probably going to wait 5 years before I believe any claim from psychology research. Especially if it confirms something I already believe :)



Yes, this has been a disaster that will be written about in the history books. I hope they take down all the TED talks about these papers that have been debunked.

In hindsight, it is kind of funny in a dark-humor way.


What is the rate of failure for "harder" sciences, does anyone know?

That said, I think there will be a self-correcting mechanism in the more physically grounded sciences. If something can be replicated, it can be built upon and new discoveries made. If something can't be replicated, it will end up being forgotten.

Also, I do think science needs a way of reporting failed studies and failed replications. At the moment all the glory and funding go to studies that produce positive results. Producing a negative result is still valid science and should be documented.


In all my time in academia, in my discipline (cross between engineering and physics), I never heard anyone try to replicate a study. No advisor would allow students to "waste" time on it.

They were also pretty open in vocalizing that they didn't believe the result in paper X in journal Y. They knew the problem existed.


There has been some attempt at this, with the SURE journal for unsurprising economics results being one example.

Unfortunately, this approach seems to be at odds with the incentive structure in academia.


Pretty sure dead and unconscious things are way easier to do objective science on.

Psychology must be one of the hardest disciplines to design experiments for.


It's a kind of HN received wisdom that all psychology studies are false or invalid.

Personally, I see this blanket dismissal as one of the blind spots of our otherwise intelligent community.

Perhaps because so many of us are engineers, and that it strokes our egos to see the softer sciences fail?


> It's a kind of HN received wisdom that all psychology studies are false or invalid.

Not ALL, but the replication crisis has demonstrated that at least 2/3 of such studies can't be replicated.

If you had a communication channel where 2/3 of your packets failed to deliver, I'd say a statement like "this channel isn't reliable" is very justified.
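To put a rough number on the analogy (a toy simulation with made-up parameters, not data from any study): with each "packet" independently failing two thirds of the time, only about a third of what you send gets through.

```python
import random

random.seed(0)

LOSS_RATE = 2 / 3  # analogous to ~2/3 of studies failing to replicate

def send(packets, loss_rate=LOSS_RATE):
    """Naive transmission over a lossy channel: each packet
    independently fails with probability `loss_rate`."""
    return sum(1 for _ in range(packets) if random.random() >= loss_rate)

delivered = send(10_000)
print(f"delivered {delivered}/10000 (~{delivered / 10_000:.0%})")
```

Nobody would call that channel reliable without retransmission and acknowledgement, which is roughly what replication is for science.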


That's 2/3 of medical research papers, where the standards are the highest. I'd be shocked if the rate was a mere 66% everywhere else. You'd have to be a fool to take on faith anything published today that wasn't replicated, hopefully more than once.


According to [1], replication rates in psychology varied from 23% to 38%. Social psychology was around 25%, while the more rigorous cognitive psychology was around 50%.

[1] https://en.wikipedia.org/wiki/Replication_crisis#Psychology_...


Psychology is like trying to debug a program with nearly zero tools, not even a reliable ‘printf(..)’. I don’t think it’s ego-stroking so much as it is just frustration at the seemingly intractable complexity of human behaviour.


> trying to debug a program with nearly zero tools, not even a reliable ‘printf(..)’.

That's called kernel development (from scratch) :P


Dismissal ("that's not true") is different from skepticism ("I can't trust this to be true"). Your "ego" quip is not very charitable to this community--a more likely and more charitable interpretation is that people don't trust fields that fail to reproduce, especially those that appear more interested in advancing an agenda than in research (e.g., all the drama around the implicit association test and its alleged ability to predict racism, etc.).


> Your "ego" quip is not very charitable to this community

And the blanket dismissal often posted here as a knee-jerk reaction isn't very charitable to a whole body of research that is arguably harder to operationalize than the hard sciences.

I'd prefer a more substantial discussion of the findings or paper at hand rather than to just bring up the replication crisis anytime any psychological finding is mentioned on HN.

I think many people on HN, myself included, are kind of addicted to appearing clever.

This tends to produce responses that earn us clever points with the least effort possible.

We've heard about the replication crisis a thousand times, and we've heard about Gell-Mann amnesia. Neither alone constitutes a substantive argument against a particular scientific paper or piece of journalism.

I'm waiting for someone to write a bot that posts Gell-Mann and replication-crisis replies to every newspaper article or psychology paper. Maybe someone already has...


> And the blanket dismissal often posted here as a knee-jerk reaction isn't very charitable to a whole body of research that is arguably harder to operationalize than the hard sciences.

Credibility doesn’t depend on the relative difficulty of proving out truth claims. We don’t owe Psychology or Sociology blind faith because their claims are (arguably) more difficult to confirm than harder sciences.

You may well be bored of hearing about the replication crisis—you’re entitled to your opinion, but the low quality of the field of psychology is pretty relevant to the topic.

> I think many people on HN, myself included, are kind of addicted to appearing clever.

Apparently so are psychologists :) (kidding, I have lots of respect for psychologists and other researchers even if the fields themselves are prone to drawing bad conclusions)


You can call it whatever you want, but if they can't reproduce the findings, then they are dangerous at best. Especially with a study as sexist as the original here.


> Perhaps because so many of us are engineers, and that it strokes our egos to see the softer sciences fail?

Just the "lol math is hard and statistics is annoying" crowd. Honestly, the thing that pisses me off most about psychology students is how only a tiny fraction of them understand the actual importance of rigorous statistics to their field.

But instead, it's almost accepted to trash the statistics courses as boring hurdles to get past. And it's still cool to suck at math.


There are surely many valid ones.

Problem is, right now you can't tell which they are.


Then, wait another 15+ years before that claim stops being widely held and repeated as a justification for policy, business decisions, etc.


Wait, did you or didn't you believe this paper's psychology claim? Why or why not? If we apply your criteria to other findings, will we find the good science?


Testosterone increases blood flow, so it could be that the participants taking it became more alert, meaning better scores, while testosterone also worsens results at similar alertness levels.


That's just not what the study studied. You can speculate but there is no evidence for that hypothesis in this paper.


I didn't say that this study proved it, I just meant that effects like that could cause the non-linear results reported and mess with studies.
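As a toy illustration of that kind of confound (entirely made-up functional forms, nothing from the actual study): if a dose produces a saturating benefit through one pathway and a linear cost through another, the net dose-response curve rises and then falls, i.e. it is non-monotonic.

```python
# Toy model (made-up numbers): score = alertness benefit - direct impairment.
# The benefit saturates with dose; the impairment grows linearly,
# so the net curve rises then falls -- a non-monotonic dose-response.

def score(dose):
    benefit = 10 * (1 - 2 ** (-dose))  # saturating alertness gain
    impairment = 3 * dose              # linear direct cost
    return benefit - impairment

for d in [0, 1, 2, 3]:
    print(d, round(score(d), 2))
```

A study that only samples a couple of dose levels from a curve like this could easily report effects that look contradictory between experiments.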



