Generative AI CSAM is CSAM

Xuka
02:14 11/09/2025
[Image: Real prompts used to create GAI CSAM.]

As if the creation of this imagery wasn’t terrible enough, NCMEC has also received reports of bad actors attempting to use this illegal GAI content to extort a child or their family for financial gain. Furthermore, people who use the technology to create this material have defended themselves with arguments like, “At least I didn’t hurt a real child” and “It’s not actually a child...”

GAI CSAM is CSAM. The creation and circulation of GAI CSAM is harmful and illegal. Even the images that do not depict a real child put a strain on law enforcement resources and impede identification of real child victims. For the children seen in deepfakes and their families, it is devastating. We must continue providing support to children who are victims of explicit imagery online regardless of how the imagery was created, ensuring that laws adequately protect child victims, and implementing regulations that require GAI platforms to incorporate child safety by design concepts as they create these tools. Protecting children from the harm of GAI CSAM also requires education and guidance from trusted adults. We have an opportunity with this relatively new technology to guide youth, so they learn to use GAI safely and understand the dangerous implications of misusing GAI to create sexually explicit or nude images of other minors.

It is essential that federal and state laws be updated to clarify that GAI CSAM is illegal and that children victimized by sexually exploitative and nude images created with GAI technology have civil remedies to protect themselves from further harm. Additionally, legislation and regulation are needed to ensure that GAI technology is not trained on child sexual exploitation content and is taught not to create such content, and that GAI platforms are required to detect, report, and remove attempts to create child sexual exploitation content and are held responsible for the creation of this content using their tools.

The ethical and legal conversations around the governance of GAI technology and its ability to generate CSAM are just beginning. We call on GAI technology creators, legislators, and child-serving professionals to come together and find a way to prioritize child safety while this innovative technology continues to evolve.

On March 12, John Shehan, NCMEC's Senior Vice President, Exploited Children Division & International Engagement, testified before the United States House Committee on Oversight and Accountability Subcommittee on Cybersecurity, Information Technology, and Government Innovation to discuss the trends that NCMEC is seeing. Read his full testimony, “Addressing Real Harm Done by Deepfakes,” here.
