Attorneys General Call for Action Against AI-Generated Child Sexual Abuse Material
Virginia Attorney General Jason Miyares has joined a coalition of attorneys general from 54 U.S. states and territories to address the growing use of artificial intelligence (AI) to create child sexual abuse material (CSAM). The coalition is urging Congress to study the issue and pass legislation that would make these crimes easier to prosecute.
In a statement, Mr. Miyares emphasized the urgency of the issue: "We are in a race against the clock to establish strong legal boundaries and protections that encompass artificial intelligence technologies and, more importantly, protect the safety and innocence of our children."
The National Association of Attorneys General sent a letter to congressional leaders about the issue.
Mr. Miyares and the other attorneys general say they are deeply concerned for the safety of children in their respective states and territories and feel a responsibility to hold accountable those who seek to harm them. They warn, however, that AI is opening a new frontier for abuse, one that makes prosecution more difficult.
In June, Mr. Miyares led 23 state attorneys general in urging the National Telecommunications and Information Administration to advance artificial intelligence governance policies, recognizing the need to keep up with the rapid advancements in AI-generated technology.
While some argue that AI-generated CSAM does not directly victimize real children, the attorneys general and other critics contend that it still causes harm. Research has shown a link between viewing child pornography and sexually abusing children in real life; studies have found that 50 to 60 percent of viewers of child pornography admit to abusing children themselves.
Types of AI-Generated CSAM
In their letter, the attorneys general highlight three types of AI-generated CSAM:
- One type uses the likeness of a real child who has not been sexually abused but whose image is digitally altered to show depictions of abuse.
- A second type uses images of a child who was sexually abused, digitally recreating them to show other forms of abuse.
- A third type is entirely generated by AI and does not depict any real child.
The attorneys general are calling for all forms of AI-generated CSAM to be made illegal and readily prosecutable. They urge Congress to enact laws that deter and address child exploitation and explicitly cover AI-generated CSAM.
" Conservative News Daily does not always share or support the views and opinions expressed here; they are just those of the writer."
Now loading...