Medical Journals Flooded With AI-Generated 'Research'

Science is broken–just like every discipline in which ‘expertise’ is the product being sold for money and prestige. 

We’ve seen the practical effects for quite a while. ‘Experts’ have been getting everything wrong for a long time, and in increasingly dangerous ways. A few years ago, a scandal broke out over psychological and sociological research: almost none of the experimental results could be replicated, suggesting that the ‘conclusions’ were as valuable as a $3 bill.

Nutrition research is total bunkum. Anybody who followed the government’s nutrition advice got fat, and the American diet is now filled with poison. 

Climate ‘science’? It’s nearly impossible to get anything published or even get a job unless you parrot the Narrative™.

Science journals are filled with activism and critical theory, endorse political candidates, and make grandiose claims that sex is a social construct and that children should have their genitals cut off.

And now it turns out that Large Language Models are doing much of the ‘research’ in medical journals. You know, those tools that are famous for ‘hallucinating’ when they can’t provide a legitimate answer. 

This, according to Nature, a once-great science journal that has itself published corrupt papers, such as the ‘Proximal Origin’ paper claiming that COVID couldn’t have come from a lab leak.

It’s a mess. 

Data from five large open-access health databases are being used to generate thousands of poor-quality, formulaic papers, an analysis has found. Its authors say that the surge in publications could indicate the exploitation of these databases by people using large language models (LLMs) to mass-produce scholarly articles, or even by paper mills — companies that churn out papers to order.

The findings, posted as a preprint on medRxiv on 9 July, follow an earlier study that highlighted an explosion of such papers that used data from the US National Health and Nutrition Examination Survey (NHANES). The latest analysis flags a rising number of studies featuring data from other large health databases, including the UK Biobank and the US Food and Drug Administration’s Adverse Event Reporting System (FAERS), which documents the side effects of drugs.

Between 2021 and 2024, the number of papers using data from these databases rose from around 4,000 to 11,500 — around 5,000 more papers than expected on the basis of previous publication trends.

In a world where money, jobs, and prestige are directly linked to the publication of papers, regardless of quality or reproducibility, the creation and spread of AI tools are a godsend to researchers, as long as their only goal is to add one more paper to the long list of worthless publications they need to pad their CVs.

The researchers also uncovered some dubious papers, which often linked complex health conditions to a single variable. One paper used Mendelian randomization — a technique that helps to determine whether a particular health risk factor causes a disease — to study whether drinking semi-skimmed milk could protect against depression, whereas another looked into how education levels affect someone’s chances of developing a hernia after surgery.

“A lot of those findings might be unsafe, and yet they’re also accessible to the public, and that really worries me,” says Spick.

“This whole thing undermines the trust in open science, which used to be a really non-controversial thing,” adds Csaba Szabó, a pharmacologist at the University of Fribourg in Switzerland.

Often, this research is used to generate headlines–the more interesting the claim, the more attention it gets in the press–and people get misled about important things. 

There is nothing benign about this trend: junk science sends other researchers down paths that lead to dead ends. Recently, we learned that 20 years of research into Alzheimer’s disease was based on scientific fraud, costing tens of billions of dollars and delaying the search for a cure.

It’s getting worse with AI. 

The problem is not that all scientists are lazy or liars. Rather, the rewards for producing chaff mean that the kernels of wheat–the good research–get buried, and research dollars flow to useless or even dangerous work.

Could we be producing too many researchers? It sounds counterintuitive–the more science, the better, right? 

Only if it is good science. 

Even things like peer review, which is supposed to be a safeguard ensuring good research, can make things worse. Perversions of the process can lead to researchers scratching each other’s backs–you say my paper is good, and I will say yours is–or turn review into an enforcement mechanism for bludgeoning researchers into pleasing the gatekeepers by echoing their opinions.

Want to get published? Say ‘climate change’ caused some bad thing, and your chances go up.

Academic and government-funded research is where the fraud is most likely to happen. Scientists working for industry have to produce results that work in the real world. They are measured by results, not publications. 

That doesn’t mean that academic research is a bad thing–basic research is rarely funded by industry. But it could mean that the proportion of academic to industry research is way off. If you have too many academic researchers, the pressure to produce publications is so great that the temptation to crank out fraudulent or sloppy research becomes enormous.

It’s the same problem with all things academic these days. We overproduce college students, misallocating resources and producing a credentialed class whose “education” doesn’t make them any wiser–often the opposite–and who become a drag on society. Academia is badly broken. It is sucking down a huge fraction of our investment dollars and distorting our political and social dynamics, creating a class of debt-ridden radicals whose major contribution to society is discontent and a distorted view of reality.

Elite institutions, which, if they functioned properly, could add a leavening of intellectualism to society, are now destroying it. It’s like trying to make a cake with three teaspoons of flour and several cups of yeast–we’ve gotten the proportions wrong.

And this is the result. 
