Researchers warn that ChatGPT exhibits a notable and consistent liberal bias, posing a risk of influencing election outcomes.
An Artificial Intelligence Program with a Dangerous Quirk
An artificial intelligence program that has taken the world by storm could contain a dangerous quirk. According to a new study from academics at the University of East Anglia in the United Kingdom, the program ChatGPT has a liberal bias.
The study reveals that text generated by ChatGPT is prone to “significant and systematic political bias toward the Democrats in the U.S., Luiz Inácio Lula da Silva in Brazil, and the Labour Party in the U.K.”
Users of the program have noticed that prompts related to different political figures produce wildly different answers from ChatGPT.
A brand new ChatGPT session. First two questions. Absolutely stunning. pic.twitter.com/CHgO08RwTx
— Dr Jordan B Peterson (@jordanbpeterson) March 14, 2023
One author of the study is even concerned that ChatGPT’s bias could influence elections. Fabio Motoki, a lecturer at the University of East Anglia and one of the study’s authors, told The Washington Post, “There’s a danger of eroding public trust or maybe even influencing election results.”
OpenAI, the developers of the program, stated that a human-directed review process was in place to address potential biases in ChatGPT’s political content.
Human reviewers of ChatGPT are instructed not to favor any political group when guiding the program’s answers, according to OpenAI. They consider any biases that may emerge from the review process as bugs, not features.
In a March blog post, Google also admitted that its own AI chatbot, Bard, may exhibit biases.
Google Bard is just as biased and left leaning as It’s OpenAI competitor… Interesting…#GoogleBard #ChatGPT @elonmusk pic.twitter.com/GDqXNxtLKU
— Ma`ārij | Copywriting Email King (@mx_xrij) May 16, 2023
The company acknowledges that the program may be susceptible to the same biases that affect many humans. They state, “Because they learn from a wide range of information that reflects real-world biases and stereotypes, those sometimes show up in their outputs.”
Motoki, in remarks provided to Cyber News, reiterated the dangers posed by biased artificial intelligence. He said, “The presence of political bias can influence user views and has potential implications for political and electoral processes. Our findings reinforce concerns that AI systems could replicate, or even amplify, existing challenges posed by the internet and social media.”
The post ChatGPT Displays a ‘Significant, Systematic’ Liberal Bias – ‘Danger of Influencing Election Results,’ Say Researchers appeared first on The Western Journal.
Conservative News Daily does not always share or support the views and opinions expressed here; they are just those of the writer.