Feedback Loopy: Tracing the Echo That Sounds Like a Signal That Sounds Like Science

I recently read this Phys.org piece—which summarizes a recent Northwestern study on scientific fraud—with what feels like both a kind of statistical rigor and something very close to heartbreak at the way fraud in science has become infrastructural rather than incidental, which is to say: something large and networked and internally stable, not unlike a sewer system or an archipelago of ghost cities that still appear on maps long after the people and animals have left—and I found myself unable to move my attention away from a particular sentence, or really a claim masquerading as a sentence, about how fake research is now growing faster than legitimate research, which in some ways is the kind of sentence you read once and then reread with the hope that you had misread it, or perhaps hallucinated it, which, under current conditions of information exposure and fatigue, feels increasingly plausible, though in this case the sentence remained, unchanged, verifiable, and profoundly unwelcome.
And by unwelcome I do not just mean disheartening or inconvenient in the way most data points about collapse are disheartening and inconvenient but something more like the kind of unwelcome that implicates you by proximity, which is what happens when the thing you’re reading about isn’t just external or academic or over-there but something you are actively entangled with, in ways you can neither precisely describe nor easily exit, which for me includes writing, reading, citing, referencing, hyperlinking, trusting, or trying to trust, the outputs of scientific literature in an era in which literature and output have become functionally indistinct, and the distinction between what has been written and what has been produced requires more interpretive effort than many people—especially the people designing recommendation engines or training large language models—are structurally allowed to spend.
So yes, the study reveals a network—though even that word feels understated, because “network” still suggests something flat, digital, maybe even sexy in a late-nineties sort of way, whereas what they’re describing resembles more a syndicate or a chain-of-custody operation or a black-market logistics system for laundering scientific credibility through fake journals and automated submissions and paid citations and pre-written manuscripts that one can apparently buy slots in, like seats on an airplane, with higher prices for first authorship and lower prices for the kind of author who shows up last and adds no known value except maybe volume or affiliation or institutional heft, which has become its own form of currency.
And once that content—though even the word “content” carries the residue of platform ideology, as if the thing itself matters only insofar as it can be indexed or optimized or presented—once that material enters the stream, which in this case includes not just publication venues but indexing systems, academic databases, search engines, open-access mirrors, citation crawlers, recommendation feeds, and whatever hybrid crawlers now feed training data into models that will, in turn, use this data to generate new content indistinguishable from the original except for maybe a statistical fingerprint or a meta-tag buried six levels down in a JSON file nobody reads, the question becomes recursive rather than ethical, or maybe the recursion is the ethics, and nobody has figured out how to describe it yet without sounding either hysterical or vague.
Because what we’re facing is a system that learns from itself, where the input is shaped by the last output, and the criteria for legitimacy—citations, metrics, frequency, reproducibility (in the search sense rather than the lab sense)—are entirely internal to the system and its feedback loops, which is how you end up with AI tools trained on fake science generating new science that reads plausibly enough to get published, which then gets indexed and cited and used as input for future models that are trained to generate outputs that look like what has already been published, meaning that the difference between noise and knowledge begins to erode from the inside, quietly, without announcement or visible rupture.
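The loop sketched above has a well-studied toy analogue: fit a model to samples, then sample from the fit, then fit the samples again, generation after generation. What follows is my own minimal illustration, not anything from the study or the article—the Gaussian, the sample size, and the generation count are all arbitrary assumptions—but it shows, in miniature, how variety drains out of a system that learns only from its own last output:

```python
import random
import statistics

def collapse_demo(generations=1000, sample_size=100, seed=0):
    """Toy sketch of a self-referential training loop.

    Each 'generation' draws samples from the current Gaussian fit,
    then refits the Gaussian to those samples and discards the world
    outside the loop. The estimated spread (a crude stand-in for the
    diversity of what the system can say) tends to drift toward zero.
    Returns the history of the fitted standard deviation.
    """
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    mu, sigma = 0.0, 1.0       # the "real" signal we start from
    history = [sigma]
    for _ in range(generations):
        samples = [rng.gauss(mu, sigma) for _ in range(sample_size)]
        mu = statistics.fmean(samples)
        # population stdev underestimates the true spread, so each
        # generation's fit is slightly narrower than the last, on average
        sigma = statistics.pstdev(samples)
        history.append(sigma)
    return history

h = collapse_demo()
```

Nothing here models fraud itself; the point is only that no act of sabotage is required—re-estimation from your own outputs is enough for the signal to narrow on its own.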
What the article gestures toward, and what I find myself unable to stop thinking about, is a kind of slow epistemic heat-death in which the systems designed to produce and preserve knowledge begin, through no single act of sabotage but through a kind of accumulated optimization, to produce simulacra of knowledge that circulate with the same speed and authority as the real thing, which itself becomes harder and harder to define, especially once the same systems are used to train AI models that are then used to review literature or write introductions or generate citations for work that might, by this point, be citing itself.
The authors of the study seem to grasp this, at least tacitly, and their tone—though restrained and peer-reviewed and fully academic—carries something like alarm, or maybe resignation, or maybe that quiet mode of panic that shows up in researchers when the data begin to describe something irreparable.
What remains unclear, at least to me, and maybe this is the part that feels closest to despair, is whether any amount of forensic effort—re-indexing, retraction, whistleblowing, audits, new incentive structures, radical transparency, improved training data, AI that detects other AI—is able to reverse what happens once the archive becomes unstable, or perhaps the word is saturated, or perhaps the better metaphor is inverted, meaning: once the mechanisms of knowledge start feeding back on themselves in a way that makes simulation indistinguishable from inquiry.
And if that has already happened—or if it is happening now, in slow motion, masked by the volume and speed of circulation—then the question becomes not how to restore the integrity of science, which implies a stable past and a recoverable baseline, but how to live inside a knowledge system that has begun to mistake its own outputs for the world.