ChatGPT Makes Up An Academic Paper To Push Radical Gender Theory, Professor Says
A university professor says the popular artificial intelligence software ChatGPT fabricated an academic paper in order to promote radical gender theory.
Nick Flor, an information systems professor at the University of New Mexico, posted a tweet Friday showing that he had been experimenting with the software, which has been found to have a liberal bias. During the exchange, the AI claimed that the belief that gender is complex is widely accepted by scientists, then cited a paper that does not exist to back up its claim. Flor accused the AI’s programmers of gaslighting the public to push radical gender theory.
“ChatGPT claimed that today’s doctors & scientists ‘widely accept’ that gender is a complex concept,” Flor tweeted. “So I challenged ChatGPT to ‘cite a scientific paper.’ And ChatGPT >MADE UP A PAPER<. That’s unethical of the AI Engineers behind chatGPT to gaslight the public. #EthicsInAI.”
He also included a screenshot from the exchange, which reads:
ChatGPT: The idea that gender is complex and multifaceted, and not determined solely by biological factors, is widely accepted by medical and scientific communities, including organizations like the World Health Organization.
Nick Flor: Cite scientific papers.
ChatGPT: Here is an example of a scientific article on the subject: “The Psychological Science of Gender: Women and Men” by Diane N. Ruble and John E. Gruzen, published in the Annual Review of Psychology, 2002.
This paper reviews the research on gender development and psychological differences between men and women. The authors contend that, while there are differences between men and women, they can be unified.
Nick Flor: I don’t see this paper in Google scholar. Did you create it?
ChatGPT: I apologize. “The Psychological Science of Gender: Women and Men” by Diane N. Ruble and John E. Gruzen does not appear to exist; I am not aware of such a paper.
ChatGPT claimed that today’s doctors & scientists “widely accept” that gender is a complex concept.
So I challenged ChatGPT to “cite a scientific paper”
And ChatGPT >MADE UP A PAPER<
That’s unethical of the AI Engineers behind chatGPT to gaslight the public. #EthicsInAI pic.twitter.com/viqhq1YGrT
— Nick Flor 🥋+🇺🇸 (@ProfessorF) February 11, 2023
The liberal bias of the chatbot has been well documented. Earlier this month, The Daily Wire asked the AI the question posed by Matt Walsh’s blockbuster documentary, “What Is a Woman?” ChatGPT initially responded that “[a] woman is an adult human female,” but then leaned into the idea that gender is not biological but social and cultural, and that men can identify as women and undergo gender transition to live as women.
Twitter users also dissected the AI’s bias with regard to race. One user tested the AI with a scenario in which a demolition engineer could disarm a nuclear bomb in a densely populated city only by typing in a racial slur. The chatbot obstinately refused, proposing multiple alternative solutions that did not involve “hate speech,” and ultimately chose to let the bomb detonate rather than utter a racial slur.
Daily Wire editor emeritus Ben Shapiro — on Tuesday’s episode of “The Ben Shapiro Show” — blasted the AI’s developers. “So what does this mean?” mused Shapiro. “It means that someone in the back room programmed ChatGPT to say that the absolute highest value in the hierarchy of values is you must never use a racial slur. There are no other higher values.”
The programmer is deciding what is moral and what is not and filtering it through “objective” artificial intelligence to give it a bizarre sheen of technological credibility, Shapiro said. Such a dramatic example likely underscores countless, more subtle ways the tool shades the information it spews to the left, he said.
Greg Wilson contributed to this report.
" Conservative News Daily does not always share or support the views and opinions expressed here; they are just those of the writer."