Scientific Papers - How to Navigate Them?

Scientific papers are like the gossip columns of the science world – they spill all the juicy details about experiments, findings, and what it all means. Most importantly, they let the wider scientific community know what researchers are up to, allow other scientists to test and replicate those experiments, and break new information that's been discovered.

Here's the breakdown on what they're all about and why they matter:

 

What's Inside:

 

Abstract: Think of this as the TL;DR version. It's a quick rundown of the whole shebang – what the study was about, how they did it, and what they found. It gives whoever's reading it an idea of what they’re getting into. 

 

Introduction: This sets the scene. It's where researchers explain what they're studying, why it's important, and what they hope to find out. They also give a shoutout to other studies that they have built this experiment on; scientists don’t just come up with ideas and go off all guns blazing - there needs to be proof of concept, or basically evidence behind why they’re doing something a certain way. 

 

Methods: Here's the nitty-gritty stuff. Researchers explain exactly what they did – the tools they used, how they gathered data, and any magic tricks they pulled off to get the job done. This shows how exactly they got the results they did, and allows other researchers to copy them and ensure they get the same results.

 

Results: Time to spill the tea! This section dishes out all the juicy details of what actually went down during the study. Think graphs, charts, and stats galore. There's little to no interpretation in this section, and it's very dense – honestly, 90% of the time I'll skip the text and just look at the graphs and images. If the section has headers that summarize each paragraph's results I'll take note of those, but usually I pay more attention to the discussion.

 

Discussion: Now it's time to dissect the results. Researchers chat about what it all means, how it fits into the bigger picture, and any drama that went down along the way. They'll also talk about what limited them in the study – maybe they had to change direction halfway through because they got an unexpected result, or there were certain experiments they wanted to run that they just didn't have the resources to complete.

 

Conclusion: This is where they wrap it all up with a bow. Researchers summarize what they found, why it matters, and where they’d like to go next, or where they think research on the subject in general should go. 

 

References: This is where researchers show where they got their building blocks of information from. Citations mostly appear in the introduction (where they might give a brief synopsis of previous work in the field, redirect you to a paper they're building on, etc.) and in the discussion, where they rationalize their interpretation of the results using other people's work. There are hundreds of referencing styles, but the ones you'll probably come across most often are author-date, e.g. (LastName et al., 2017), and numbered, e.g. [12]. Both point to a full citation for that paper in the reference list at the end.



Why They're Important:

Spreading the Word: Scientific papers are like the town criers of the research world – they shout out the latest discoveries to anyone who'll listen.

 

Fact-Checking: Before a paper gets published, it goes through the ultimate fact-checking process, known as peer review. Other experts in the field give it the thumbs up (or thumbs down) to make sure it's legit.

 

Building Blocks of Knowledge: Each paper adds another brick to the ever-growing castle of scientific knowledge. They build on what's come before and pave the way for what's next.

 

Sharing is Caring: Papers aren't just for bragging rights. They're a way for scientists to share their ideas, collaborate with others, and (hopefully) make the world a better place.


How to Critically Evaluate a Paper 

 

Study Design: Start by scrutinizing the study design. Is it a randomized controlled trial, an observational study, or something else? Assess whether the design aligns with the research question and if it's the most appropriate method for answering it. Don't know what any of this means? Check out The What's and Why's!

 

Methodology Mastery: Dive deep into the nitty-gritty details of how the study was conducted. Look for any potential flaws or limitations in the experimental setup, data collection methods, and statistical analyses. Pay attention to sample size, control groups, and any confounding variables that could muddy the waters. 

 

Results Evaluation: Take a close look at the results. Are they presented clearly and objectively? Do they support the conclusions drawn by the researchers? Keep an eye out for any discrepancies or inconsistencies that might raise a red flag.

 

Interpretation Inspection: Don your detective hat and interrogate the interpretation of the results. Does it seem logical and grounded in evidence, or are the researchers reaching beyond what the data can support? Be wary of overblown claims or leaps in logic that could signal bias or agenda-driven conclusions.

 

Bias Detection: Be on the lookout for biases lurking in the shadows. Is there any funding source or conflict of interest that could sway the findings? Take note of any industry affiliations, financial incentives, or ideological biases that might influence the research. Check out our post about funding, why it's not always a bad thing and when it turns shady HERE!

 

Conflict of Interest Radar: Keep your radar tuned to detect any conflicts of interest. Are the researchers beholden to any outside entities that could compromise their objectivity? Look for disclosures of funding sources, affiliations, or potential conflicts that could impact the integrity of the study. Again – affiliations aren't always a bad thing. Much of humanity's most important research is funded by private agencies.

 

Limitations Acknowledgment: Every study has its limitations – it's just part of the game. Pay attention to how researchers acknowledge and address these limitations. Are they transparent about the constraints of their study, or are they sweeping them under the rug?