The U.S. Senate has unveiled another artificial intelligence bill, the latest in a series of similar moves, this time aimed at protecting the work of artists and other creatives.
The new legislation, known as the COPIED Act, would require more rigorous authentication of digital content and make it illegal to remove or tamper with watermarks, in line with new artificial intelligence standards developed by the National Institute of Standards and Technology (NIST).
The bill specifically requires generative AI developers to add content provenance information (identifying data embedded in digital content, such as a watermark) to their output, or allow individuals to attach such information themselves. More standardized access to this information could help detect artificial intelligence-generated synthetic content (such as deepfakes) and curb the unauthorized use of data and other intellectual property. It would also empower the Federal Trade Commission (FTC) and state attorneys general to enforce the new rules.
Regulatory pathways like this could help artists, musicians, and even journalists keep their original works out of the datasets used to train artificial intelligence models, a growing issue that recent partnerships between AI giants such as OpenAI and media companies have only exacerbated. Organizations including the artists' union SAG-AFTRA, the Recording Industry Association of America, the News/Media Alliance, and the Artist Rights Alliance have all come out in support of the legislation.
“We need a fully transparent and accountable supply chain for generative artificial intelligence and the content it creates in order to protect everyone's basic right to control the use of their face, voice, and persona,” said Duncan Crabtree-Ireland, national executive director of SAG-AFTRA.
If passed, the bill would make it easier for these creatives and media owners to set terms for how their content is used, and would give them legal recourse when their work is used without consent or attribution.