A bipartisan group of United States legislators has proposed a new law to protect content creators, and media industry organizations have lined up behind it.

Sen. Maria Cantwell (D-Wash.) led the group of senators who introduced the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), joined by Marsha Blackburn (R-Tenn.) and Martin Heinrich (D-N.M.), the latter a member of the Senate’s AI Working Group.

The COPIED Act requires the National Institute of Standards and Technology (NIST) to develop new guidelines for the artificial intelligence (AI) sector, covering watermarking, the detection of AI-generated content and content provenance.

NIST’s standards would allow Americans to easily identify content generated by AI. As the technology advances, AI-generated content has become harder to detect, and that has serious implications.

These include impersonations and the use of deepfake audio and video to manipulate elections; in May, one study found that easily cloned Trump and Biden voices could dupe millions of voters in this year’s U.S. elections. Microsoft and others have also warned that foreign nations could use AI to sway U.S. elections.

NIST’s standards will also give content creators tools to digitally prove ownership of their work. That capability underpins the second provision of the COPIED Act: a requirement that AI firms seek consent before using any content carrying provenance information to train their AI.
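
The bill leaves the technical details to NIST, but content provenance schemes generally work by binding a small metadata record to a digital signature over the content. The Python sketch below is purely illustrative; the manifest fields and the choice of Ed25519 signatures are assumptions made for the example, not anything the COPIED Act or NIST specifies.

```python
# Illustrative sketch: attach signed provenance metadata to a piece of content.
# The manifest fields and the Ed25519 choice are assumptions for this example.
import json
import hashlib
from datetime import datetime, timezone

from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)
from cryptography.exceptions import InvalidSignature


def make_manifest(content: bytes, creator: str) -> dict:
    """Build a minimal provenance record: who made it, when, and a content hash."""
    return {
        "creator": creator,
        "created": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(content).hexdigest(),
    }


def sign_manifest(manifest: dict, key: Ed25519PrivateKey) -> bytes:
    """Sign the canonical JSON encoding of the manifest."""
    payload = json.dumps(manifest, sort_keys=True).encode()
    return key.sign(payload)


def verify_manifest(manifest: dict, signature: bytes, public_key: Ed25519PublicKey) -> bool:
    """Verify that the manifest was signed by the holder of the private key."""
    payload = json.dumps(manifest, sort_keys=True).encode()
    try:
        public_key.verify(signature, payload)
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    article = b"Example article text"
    key = Ed25519PrivateKey.generate()

    manifest = make_manifest(article, creator="Example Newspaper")
    signature = sign_manifest(manifest, key)

    # An AI firm (or anyone else) can check the claim before using the content.
    print(verify_manifest(manifest, signature, key.public_key()))  # True
```

A scheme like this would also need a trusted way to distribute the creator’s public key, for example by tying it to a verifiable identity; that is the kind of detail any standards work would have to settle.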

“These measures give content owners—journalists, newspapers, artists, songwriters, and others—the ability to protect their work and set the terms of use for their content, including compensation,” the bill’s summary notes.

This provision would prevent OpenAI, Meta (NASDAQ: META), Google (NASDAQ: GOOGL) and other tech giants from using content to train AI without consulting or compensating its creators. That question has been a major sticking point in the rise of AI, with the New York Times and Game of Thrones author George R. R. Martin among those suing ChatGPT maker OpenAI over the use of their work.

In their defense, AI developers have argued that such training is covered by the fair use doctrine, which permits unlicensed use of copyrighted works in certain circumstances to promote freedom of expression. Several lawsuits currently before the courts turn on whether fair use extends to AI training.

But even if the courts side with AI companies, the COPIED Act would ensure an extra layer of protection for content creators as these firms would still have to seek consent for any content with provenance information.

Media industry backs COPIED Act

The COPIED Act joins dozens of others at the state and federal level seeking to police the fast-rising AI industry, including the NO FAKES Act.

It has received the support of several media industry organizations, from SAG-AFTRA and the Recording Academy to America’s Newspapers and the National Association of Broadcasters (NAB).

“As AI-generated music continues to disrupt the legitimate market, it is essential that listeners know where their music is coming from. Artists and songwriters deserve protection against unauthorized imitations and this legislation is an important step towards that goal,” commented David Israelite, the CEO of the National Music Publishers’ Association.

The COPIED Act stands a great chance of sailing through the Senate, as it is being pushed by the influential Sen. Cantwell, who chairs the Senate Commerce Committee. Commenting on the bill, she said it would “provide much-needed transparency around AI-generated content.”

“The COPIED Act will also put creators, including local journalists, artists and musicians, back in control of their content with a provenance and watermark process that I think is very much needed,” she added.

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, keeping data safe while also guaranteeing its immutability. Check out CoinGeek’s coverage of this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.
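
As a rough illustration of what immutability buys here, the toy Python sketch below (no particular blockchain or product assumed) hash-chains training-data records so that altering an earlier record invalidates every hash recorded after it:

```python
# Illustrative only: a toy hash chain showing why an append-only record makes
# tampering with earlier training-data entries detectable. No specific
# blockchain or product is implied.
import hashlib
import json


def record_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the hash of the previous entry."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()


def build_chain(records: list[dict]) -> list[str]:
    """Return the running chain of hashes for a list of records."""
    chain, prev = [], "0" * 64  # genesis placeholder
    for rec in records:
        prev = record_hash(rec, prev)
        chain.append(prev)
    return chain


if __name__ == "__main__":
    data = [
        {"source": "example.com/article-1", "license": "consented"},
        {"source": "example.com/article-2", "license": "consented"},
    ]
    original = build_chain(data)

    # Tampering with the first record changes every subsequent hash,
    # so the altered history no longer matches the recorded chain.
    data[0]["license"] = "scraped"
    assert build_chain(data) != original
```

A real deployment would record those hashes on a public ledger rather than in memory; the point of the sketch is only that an append-only record makes silent edits detectable.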

Watch: Digital currency regulation and the role of BSV blockchain
