Wondering if Educational Research Is Legit? 4 Ways To Tell



Remember that mandatory English comp class we all had to slog through in our first year of college? For many of us, our instruction on finding sources for research began and ended with hitting the “peer-reviewed” checkbox when searching online.

Now, finding myself on the path of a budding researcher, I’m learning that evaluating research is far more nuanced than a checkbox. Just because an article is peer-reviewed doesn’t necessarily make it rock-solid.

Luckily, you don’t have to be a researcher to evaluate whether a study is sound. Here are four questions to help you separate legit research from studies that might show skewed or misleading results.

1. Is it peer-reviewed?

Peer-reviewed refers to journal articles that have been evaluated by “peers” of the group conducting the research—often journal editors, experts, and scholars.

Sometimes, administrators don’t even get to the peer-reviewed checkpoint before adopting a new school-wide practice.

But evaluating research goes way beyond looking for the peer-reviewed label. In the classes I teach, students will often defend their citations with, “Well, it popped up under the peer-reviewed database, so I can use it, right?”

GULP!

In a time when people are eager to prove their viewpoints with research, it’s not surprising that most settle on the quickest peer-reviewed finds online. While a significant amount of misinformation can be weeded out through peer-reviewed articles, it’s important to know that not everything nestled in an online repository is concrete research. Sometimes, what you stumble upon might be a theoretical argument that has undergone peer review and earned its spot in a journal.


So, you’ve confirmed it’s peer-reviewed, but does the study truly hold water? Here’s the next question to ponder:

2. What’s the sample size?

It’s astounding how many educators are swayed by “research” that has no sample size at all. If no real people or interventions were involved, how useful is it for schools full of children? Such articles might still have scientific value, but they shouldn’t be the linchpin of our evidence-based arguments.

Within the world of research, we ask ourselves a few things while reading manuscripts:

  • How big is the sample size? Small sample sizes can still be valid, but around 100 participants typically offers sufficient statistical power (see the sketch after this list). For qualitative research, 10 participants can provide insights, though not causal conclusions.
  • Is the sample diverse? A study’s validity also depends on sample diversity; if the sample doesn’t reflect your classroom’s demographics, the results may not transfer. For instance, my study of a uniform student group might not yield the same results elsewhere.
  • Does the sample include both a treatment group and a comparison group? Anyone can showcase growth in a single group over time. Juxtaposing it with a group that didn’t receive the intervention reveals whether the growth was meaningfully greater than what would have happened anyway.
  • Is the sample randomized, or is it one of convenience? When testing a program, say a tutoring program, you want results from a random sample of all students, not just those already inclined toward tutoring. Many survey results lean on samples of convenience, which often reflect inherent differences between the people who take the survey and those who don’t.
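Curious where a number like “around 100 participants” comes from? Here is a minimal power-calculation sketch in Python, assuming a two-group comparison, a medium effect size (Cohen’s d = 0.5), and the conventional 5% significance level. The exact count shifts with the effect size you expect, so treat it as illustrative rather than a hard rule.

```python
# Rough power calculation: how many participants per group does a two-group
# study need to detect a medium effect (Cohen's d = 0.5) 80% of the time?
# The assumptions here are hypothetical; adjust effect_size to your own context.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"Participants needed per group: {n_per_group:.0f}")
# Roughly 64 per group, i.e. about 130 students total under these assumptions.
```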

A quick recap: Peer-reviewed? Check. Sample size? Check. But what about the source?

3. Is it from a trustworthy source?

Our educator prep programs do a pretty good job of ensuring we know the popular names and big shots in the education world, but we need to ruminate a bit more on a few questions:

  • Who conducted the research? Even eminent figures might lean on theoretical arguments without any empirical backing.
  • Where was it published? You can filter your search to top education research journals. Review of Educational Research, Educational Researcher, and Educational Research Review are just a few of the big hitters!
  • How recent is it? Research older than a decade probably needs newer studies to reinforce its findings.

And I wish I didn’t have to end on this last one, but it’s essential, so here goes:

4. Is the study methodologically rigorous?

(Say “methodologically” three times fast.)

Beyond statistical significance, it’s important to look at the methods. How was the testing conducted? Are the methods rigorous? Do the authors acknowledge limitations? One super-important point to be aware of:

Correlation is not causation!

It’s human nature to want to find causes for things. Sometimes the cause-effect relationship is valid, but sometimes it’s not. One of the most common ways we see data misinterpreted is applying a cause-effect relationship to mere patterns or trends.

For example, I might notice that my toddler is crankiest around 6 p.m. I might also notice that my neighbor waters his plants around 6 p.m. Just because I have two patterns that share a similar time factor doesn’t mean that my toddler’s meltdowns are a result of my neighbor’s gardening (or that my neighbor gardens in a strange and ineffective effort to pacify my son).


Similarly, in educational research, a significant correlation doesn’t mean one factor directly causes another. Let’s say a school district lengthens the school day by 30 minutes and sees a 3.5% increase in test scores that year. It’s misleading to attribute the increase to the extra time without rigorous methods: controlling for other factors that affect test scores, setting up a control group, and so on.

When authors claim causation, scrutinize their methodology. Causal claims call for a strong design with randomized treatment and control groups.
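To make that concrete, here is a small simulation (the numbers are hypothetical, not drawn from any real district). Scores in both districts drift up by about 3.5 points for reasons unrelated to the schedule change, and the extra 30 minutes contributes nothing. A simple before-and-after comparison looks impressive; a comparison against an untouched district reveals the truth.

```python
# Hypothetical illustration: a district-wide upward trend masquerades as an
# intervention effect unless a comparison group is used.
import numpy as np

rng = np.random.default_rng(0)
n = 500
secular_trend = 3.5              # everyone's scores rise year over year
true_effect_of_longer_day = 0.0  # the extra 30 minutes does nothing here

pre_treated = rng.normal(70, 10, n)
post_treated = pre_treated + secular_trend + true_effect_of_longer_day + rng.normal(0, 2, n)

pre_control = rng.normal(70, 10, n)
post_control = pre_control + secular_trend + rng.normal(0, 2, n)

naive_gain = (post_treated - pre_treated).mean()
gain_vs_control = naive_gain - (post_control - pre_control).mean()
print(f"Naive pre/post gain in treated district: {naive_gain:+.2f}")      # about +3.5
print(f"Gain relative to comparison district:    {gain_vs_control:+.2f}") # about 0
```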

Roll call: Check peers, sample size, sources, and methods

The next time your very own Malarkey Odometer buzzes when hearing “Well, research says …” during a professional development session, let these questions give you the confidence to dig deeper. Who made the claim? Where was it stated? Was it tested on real students? How sound were their methods?

Surprisingly (or maybe not to some of you), much of the “evidence-based” research promoted to teachers may not be as credible as presumed.



