timr 7 hours ago

You found a paper saying that contamination is possible. That doesn’t mean that most of these plastic studies are doing the necessary controls, let alone the (almost impossible) task of preventing the contamination in a laboratory setting where nanomolar detection levels are used to make broad claims.

dahart 6 hours ago | parent | next [-]

Are more “controls” what is necessary here? The problem wasn’t plastic contamination, it was the presence of stearates. Distinguishing between stearates and microplastics sounds like a classification problem, not a control problem.

There is practically universal recognition among microplastics researchers that contamination is possible and that strong quality controls are needed, and to be transparent and reproducible, they have a habit of documenting their methodology. Many papers and discussions suggest avoiding all plastics as part of the methodology, e.g. “Do’s and don’ts of microplastic research: a comprehensive guide” https://www.oaepublish.com/articles/wecn.2023.61

Another thing to consider is that papers generally compare against baseline/control samples, and overestimating microplastics in baseline samples may lead to a lower ratio of reported microplastics in the test samples, not higher.

timr 5 hours ago | parent | next [-]

Many papers in this field are missing obvious controls, but you’re correct that controls alone are insufficient to solve this problem.

When you are taking measurements at the detection limit of any molecule that is widespread in the environment, you are going to have a difficult time distinguishing signal from background. That requires sampling, replication, and rigorous application of statistical inference.

> Another thing to consider is that papers generally compare against baseline/control samples,

Right, that’s what a control is.

> and overestimating microplastics in baseline samples may lead to a lower ratio of reported microplastics in the test samples, not higher.

There’s no such thing as “overestimating in baseline samples”, unless you’re just doing a different measurement entirely.

What you’re trying to say is that if there’s a chemical everywhere, the prevalence makes it harder to claim that small measurement differences in the “treatment” arm are significant. This is a feature, not a bug.

dahart 4 hours ago | parent [-]

You’re still bringing up different issues than this article we are commenting on.

> There’s no such thing as “overestimating in baseline samples”

What do you mean? Contamination and mis-measurement of control samples is a thing that actually happens all the time, and invalidates experiments when discovered.

> What you’re trying to say is that if there’s a chemical everywhere, the prevalence makes it harder to claim that small measurement differences in the “treatment” arm are significant.

No. What I was trying to say is that if the control is either mis-measured (for example, by accidentally counting stearates as microplastics) or contaminated, then the summary outcome may understate the prevalence of microplastics in the test sample, even though the measurement itself over-estimated it.

njarboe 4 hours ago | parent | prev [-]

Any scientific paper that does not document how things were done (methodologies) is basically worthless in the search for truth.

dahart 4 hours ago | parent [-]

I agree completely. My point is that documenting methodology is standard practice, as is strict quality control, in the microplastics literature. I don’t know what controls are missing according to GP, and we don’t yet have references here to back up that claim. By and large I think researchers are aware of the difficulties measuring this stuff, and doing everything they can to ensure valid science.

johnbarron 3 hours ago | parent | prev | next [-]

>> That doesn’t mean that most of these plastic studies are doing the necessary controls

That was never my argument. Read it again.

refulgentis 6 hours ago | parent | prev | next [-]

Not OP, but:

> "You found a paper"

johnbarron didn't find it. The authors cited it as foundational to their own work. It's ref. 38 in the paper under discussion. From the paper: "this finding had not been reported in the MP literature until 2020, when Witzig et al. reported that laboratory gloves submerged in water leached residues that were misidentified as polyethylene."[1]

> "most of these plastic studies are [not] doing the necessary controls"

Which studies? The paper they linked surveys 26 QA/QC review articles[1]. The issue seems well understood.

> "a laboratory setting where nanomolar detection levels are used to make broad claims"

This is like saying "miles per gallon" when discussing weight. "nanomolar detection levels"...microplastics are individual particles identified by spectroscopy, reported as particles per mm^2. "Nanomolar" is a dissolved-species concentration unit. It has nothing to do with particle counting. (I, and other laymen, understand what you mean but you go on later in the thread to justify your unsourced and unjustified claims here via your subject-matter expertise.)

> "(almost impossible) task of preventing the contamination"

The paper provides open-access spectral libraries and conformal prediction workflows to identify and subtract stearate false positives from existing datasets[1]. Prevention isn't the strategy. Correction is. That's the entire point of the paper they linked and the follow-up in [2]

[1] https://pubs.rsc.org/en/content/articlehtml/2026/ay/d5ay0180...

[2] https://news.umich.edu/nitrile-and-latex-gloves-may-cause-ov...

timr 5 hours ago | parent [-]

> This is like saying "miles per gallon" when discussing weight. "nanomolar detection levels"...microplastics are individual particles identified by spectroscopy, reported as particles per mm^2. "Nanomolar" is a dissolved-species concentration unit. It has nothing to do with particle counting. (I, and other laymen, understand what you mean but you go on later in the thread to justify your unsourced and unjustified claims here via your subject-matter expertise.)

This paper used “light-based spectroscopy” [1]. Many others use methods that depend on gas chromatography or NMR. A relatively infamous recent example used pyrolysis GCMS to make low-concentration measurements (hence: nanomolar), which they credulously scaled up by some huge factor, and then made idiotic claims about plastic spoons in brains.

Relatively little quantitative science in this area depends on counting plastic particles in microscopic images, but it’s what gets headlines, because laypeople understand pictures.

[1] As an aside, the choice of terminology here is noteworthy. A simple visible-light absorption spectrum is also "light-based spectroscopy", but it measures the aggregate response of a heterogeneous mixture, and is conventionally converted to molar equivalents via some sort of calibration curve (otherwise you can't conclude anything). But there could be other approaches that are closer to microscopy, which they also discuss. "Particles per square millimeter" is also a unit of concentration (albeit a shitty one, unless your particles are of uniform mass).

Anyway, the point is that these kinds of quantitative analyses are all trying to do measurements that are fundamentally about concentration, which is why I chose the words that I did.
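To make the "uniform mass" caveat concrete, here's a rough sketch of what converting a particle count into a mass actually requires. The particle diameters and polyethylene density below are illustrative assumptions, not values from any paper in this thread:

```python
import math

# Sketch: a particle count only maps to a mass concentration if you
# assume a particle size (and shape) distribution. Here: uniform spheres.
PE_DENSITY = 0.95  # g/cm^3, a typical figure for polyethylene

def count_to_mass_ug(n_particles: int, diameter_um: float) -> float:
    """Mass in micrograms of n uniform spherical PE particles."""
    radius_cm = diameter_um * 1e-4 / 2           # um -> cm
    volume_cm3 = (4 / 3) * math.pi * radius_cm**3
    return n_particles * volume_cm3 * PE_DENSITY * 1e6  # g -> ug

# The same count of 1000 particles differs in mass by a factor of 1000
# depending on whether you assume 1 um or 10 um diameters.
small = count_to_mass_ug(1000, 1.0)
large = count_to_mass_ug(1000, 10.0)
print(small, large)
```

Since mass scales with the cube of diameter, a 10x spread in assumed particle size is a 1000x spread in inferred mass, which is why a bare particles-per-area number is such a lossy concentration unit.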

refulgentis 3 hours ago | parent [-]

> ...

"1 nanomole of polyethylene" requires you to pick an arbitrary average molecular weight.

This changes the answer by orders of magnitude depending on what you pick.

Which is why nobody does it.
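The arbitrariness is easy to sketch numerically. Assuming a hypothetical sample of 1 microgram of polyethylene per litre (the mass and the molecular-weight range below are illustrative, not from either paper), the "nanomolar" value swings by orders of magnitude with the assumed average chain length:

```python
# Sketch: converting a polymer mass concentration to molarity requires
# picking an average chain molecular weight, which polymers don't have.
def mass_to_nanomolar(mass_g_per_L: float, avg_mw_g_per_mol: float) -> float:
    """Molarity in nmol/L for an assumed average molecular weight."""
    return mass_g_per_L / avg_mw_g_per_mol * 1e9  # mol/L -> nmol/L

mass = 1e-6  # assumed: 1 microgram of polyethylene per litre
short_chains = mass_to_nanomolar(mass, 1e4)  # ~10 kDa chains -> 0.1 nM
long_chains = mass_to_nanomolar(mass, 1e6)   # ~1 MDa chains  -> 0.001 nM
print(short_chains, long_chains)  # same sample, 100x apart
```

Same physical sample, a 100x difference in the reported "nanomolar" figure, purely from the MW you plug in.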

> Relatively little quantitative science in this area depends on counting plastic particles in microscopic images...Many others use methods that depend on gas chromatography or NMR.

So we're dismissive of some subset of papers, because they get false positives using toy methods.

Real science would use gas chromatography.

But...the paper we're dismissing tested gas chromatography. And found the same false positive. [1, in abstract]

> A relatively infamous recent example used pyrolysis GCMS to make low-concentration measurements (hence: nanomolar)

The brain study I'm guessing you are referring to, [2], measured low concentrations, yes.

But it reported them in ug/g.

Because polymers don't have a defined molecular weight.

> made idiotic claims about plastic spoons in brains

The brain study I'm guessing you are referring to, [2], does not mention spoons, or come close.

Are we sure there's a paper that did that?

[1] Witzig et al, https://pubs.acs.org/doi/10.1021/acs.est.0c03742, "Therefore, u-Raman, u-FTIR, and pyr-GC/MS were further tested for their capability to distinguish among PE, sodium dodecyl sulfate, and stearates. It became clear that stearates and sodium dodecyl sulfates can cause substantial overestimation of PE."

[2] Campen et al, https://pubmed.ncbi.nlm.nih.gov/38765967/, "Bioaccumulation of Microplastics in Decedent Human Brains"

userbinator 3 minutes ago | parent [-]

Doesn't take an expert to see that fatty acids and hydrocarbon chains from the degradation of polyethylene look nearly the same.

idiotsecant 7 hours ago | parent | prev [-]

Luckily HN software developers, the foremost authority on literally every subject imaginable, are here to bless the world with their insights.

bonoboTP 6 hours ago | parent | next [-]

I think there's an important distinction between two kinds of smug better-knowing:

"I have unique insight as a non-expert that all experts miss and the entire field is blind to" -> usually nonsense

"I think in this specific instance academically qualified people are missing something that's obvious to me" -> often true.

timr 6 hours ago | parent [-]

There’s also the possibility that some of us actually, you know…have subject-matter expertise.

refulgentis 6 hours ago | parent [-]

Doubtful, in your case, no?

"Nanomolar" is a dissolved-species concentration unit. It doesn't apply to spectroscopic particle counting.

timr 5 hours ago | parent [-]

Uh, yeah. I know what the word means. See my response to the other comment where you say the same thing.

refulgentis 6 hours ago | parent | prev | next [-]

Spiritual equivalent of a life sciences forum discovering memory safety, one person who wrote code for a bit saying they wrote a memory bug in C once, then someone clutching pearls about why all programmers irresponsibly write memory unsafe code given it has a global impact.

Been here 16 years, it's always an adventure seeing whether stuff like this falls into:

A) Polite interest that doesn't turn into self-keyword-association

B) Science journalism bad

C) Can you believe no one else knows what they're doing.

(A) almost never happens, has to avoid being top 10 on front page and/or be early morning/late night for North America and Europe. (i.e. most of the audience)

(B) is reserved for physics and math.

(C) is default leftover.

Weekends are horrible because you'll get a "harshin' the vibe" penalty if you push back at all. People will pick at your link but not the main one and treat you like you're argumentative. (i.e. 'you're taking things too seriously' but a thoughtful person's version)

david-gpu 6 hours ago | parent [-]

> Spiritual equivalent of a life sciences forum discovering memory safety, one person who wrote code for a bit saying they wrote a memory bug in C once, then someone clutching pearls about why programmers irresponsibly write memory unsafe code given it has a global impact.

I used to be a code monkey who wrote systems software at megacorps, and I still can't understand why so many programmers irresponsibly write memory unsafe code given it has a global impact.

So Poe's law applies here.

refulgentis 6 hours ago | parent [-]

That's the analogy working as intended: the answer to "why do programmers still write memory-unsafe code" is the same shape as "why do microplastics researchers still wear gloves." The real answer is boring and full of tradeoffs. The HN thread version skips to indignation: "they never thought of contamination so ipso facto all the research is suspect"

(to go a bit further, in case it's confusing: both you and I agree on "why do people opt-in to memunsafe code in 2026? There’s no reason to" - yet, we also understand why Linux/Android/Windows/macOS/ffmpeg/ls aren't 100% $INSERT_MEM_SAFE_LANGUAGE yet, and in fact, most new code written for them is memunsafe)

peyton 2 hours ago | parent [-]

You’re ignoring the article to grind your axe.

Der_Einzige 6 hours ago | parent | prev [-]

You joke, but given that SWE/AI researchers literally invented AI that does everything else for them and is often super-human at intelligence across most things, I would unironically prefer the opinion of the creator of such a system over most others for most things.

hnlmorg 5 hours ago | parent [-]

I cooked a steak yesterday therefore I am an expert in biology.

Creating a user interface for the world’s knowledge doesn’t make the developer an expert on the knowledge that the interface holds in its database. Regardless of how sophisticated that interface might be.

kelseyfrog 5 hours ago | parent [-]

'I disagree, therefore I am an expert in skepticism.' The sword cuts both ways.

hnlmorg 43 minutes ago | parent [-]

No it doesn’t. What you’re describing is an oxymoron.