
Polish women face jail for abortion pills, anti-vax scientists rake it in on Substack, AI chatbot’s papers fool academics
We all know stories of scientists or respected doctors who “turn” into anti-vaccine conspiracy theorists. There’s Andrew Wakefield, who went from a respected doctor at London’s prestigious Royal Free Hospital to a disgraced anti-vaxxer who claimed vaccines cause autism. Over the years, he’s doubled down and become a rich man, living in a gated enclave in Miami and running an enormous annual anti-vaccine conference called Autism One. Over the years I’ve interviewed a number of former anti-vaxxers with autistic children, who told me the most painful part of Wakefield’s propaganda was that it meant they carried the blame for their children’s autism diagnoses. Then there’s Dr. Oz, the American TV presenter and former Columbia professor of cardiothoracic surgery, who unsuccessfully ran for Senate in Pennsylvania, backed by Donald Trump. Oz has long been known for pushing quack pseudoscience cures and weight loss aids with no scientific backing. And beyond those two examples, thousands more formerly respected, accredited academics have found themselves deep within the anti-vaccine conspiracy community.
How does it happen? “It’s a decidedly simple, but dangerous and malicious process,” explained Alistair McAlpine in a useful thread this week, where he broke down how people become “red pilled.” He described how pushing certain popular but unproven treatments, such as ivermectin, generates vast amounts of online revenue and attention on platforms like YouTube. Then come the speaking fees for talks at anti-vax events like Wakefield’s. “From obscurity, you’re now a name. You’re also making very good money,” McAlpine said. At a certain point, it’s too late to turn back: “The science community thinks you’re gone and won’t easily accept you back.” And YouTube is not the only way anti-vaccine influencers make money. Alongside selling supplements and telehealth services, there’s a new revenue stream for anti-vaxxers, according to author Derek Beres, who’s writing a book on new-age anti-vaccine trends. He revealed (ironically enough, in a Substack post) that four of the top 15 monetized political newsletters on Substack peddle anti-vaccine content, some of them raking in hundreds of thousands of dollars a month.
It was Lunar New Year over the weekend, and China’s censorship bosses released a special set of instructions to try to dispel “gloomy thoughts” during the Spring Festival. There was plenty to be nervous about for people traveling home to their families for the first time since the zero-Covid restrictions were lifted, as the virus runs rampant in China. “Skepticism about official COVID death toll figures, China’s historic population decline, a disappointing economy, shortages of COVID medications, and increased medical debt for those who do contract the virus and require treatment” are all national concerns right now, writes Alexander Boyd for China Digital Times. Boyd translated the instructions issued to censors, which call for “a festive and harmonious atmosphere” during the holiday. Censors were told to “crack down” on pandemic-related online rumors, fabricated reports about new pandemic control policies, fake miracle cures, and spurious personal accounts of infection during the Spring Festival period.
The notorious AI chatbot ChatGPT can generate text good enough to pass a business school master’s exam. It can also write convincing abstracts for scientific papers that fool keen-eyed scientists. Northwestern University researchers tested the program by asking it to write fake abstracts modeled on ten real ones from medical journals. When scientists reviewed the results, trying to identify which had been written by the program, they missed about a third of the AI-generated abstracts. It’s a worrying result, considering the fake-research-paper industry is already booming, particularly in China; this program could make it all the easier for scientists to produce fraudulent research.