Do you really understand the “peer-reviewed research” you’re linking to?
I often see links to “research proves” or “science says” in articles submitted to Better Humans. (Who is this Science person who’s saying so much, anyhow?)
Those phrases are often linked to content mills that put the most simplistic spin on a story about some finding in behavior research—or worse, to some apocryphal claim with no findable origin.
But even when the claim is linked to a real peer-reviewed paper, I have my doubts. I get the impression that writers are writing the story first and then back-filling with whatever research they eventually find to support it. Somehow people have gotten the idea that that’s what we want.
What we prefer to see is evidence that the writer is sharing advice that they’ve learned through lived experience.
I actually prefer an article based solely on what a writer has learned through experience—mistakes and all. I’m finding that a writer’s willingness to share mistakes is a real hallmark of authenticity. Research is wonderful, but honestly—you should only be citing research if you understand what it says.
Quit parroting advice you’ve heard and back-filling it with “proof” after the fact. Dig deeper into your lived experience, and tell us about the failures you’ve had along the way.
That’s the only way you’re going to inspire us to make our own.