
TAKE IT DOWN Act Shows That Noble Intentions Can Make Bad Tech Policy

David Inserra

No one wants revenge porn and other forms of non-consensual intimate imagery (NCII) to spread online or offline. And with AI being used to create even more of this kind of content, shouldn’t we be able to find common ground to remove such content quickly? 

That’s the approach taken by the TAKE IT DOWN Act cosponsored by Sen. Ted Cruz (R‑TX) and Sen. Amy Klobuchar (D‑MN), which passed the Senate and is now being considered in the House of Representatives. While the bill has noble intentions, it creates a takedown regime that will be abused to silence legitimate speech and threaten encryption. 

NCII Should Be — and Already Is — Illegal 

The first portion of the TAKE IT DOWN Act makes it a crime to publish or share intimate imagery of individuals, whether real or digitally created, without their consent. These prohibitions carry stricter rules and greater penalties when minors are involved. The provisions also include exceptions for content that is a matter of public concern, was shared with law enforcement, is part of a legal filing, serves a medical or educational purpose, or is otherwise shared as part of reporting on or trying to help the victim. 

This section closely mirrors similar legislation such as the SHIELD and DEFIANCE acts, which also criminalize or create civil liability for those who produce or try to distribute such content. Indeed, NCII deserves to be illegal and, as other groups have pointed out, there are already an array of criminal and civil laws that apply to NCII. To the extent that these laws need to be clarified and strengthened to apply to modern AI-generated NCII, Congress can take carefully tailored action to ensure real legal punishments can be applied to those disseminating such abusive content. 

TAKE IT DOWN’s Unintended Consequences 

Unfortunately, TAKE IT DOWN goes further and creates a takedown requirement for all websites and online applications that provide a forum for user-generated content except for internet service providers and email services. This means all social media platforms, as well as internet storage, online marketplaces, private messaging applications, and more, will be required to remove any content that is reported to them as NCII within 48 hours and to also remove any known identical copies of the content. 

But unlike the criminal provisions in the first portion of the bill, which provide exceptions (e.g., when content is a matter of public concern), the takedown regime has no such carveouts; it simply demands that platforms remove any intimate visual image that a person claims was published without their consent. The bill also lacks any provision to deter frivolous or false claims. And it punishes companies that fail to act by subjecting them to FTC investigations and sanctions for engaging in unfair and deceptive trade practices. 

There are multiple problems with this takedown regime. Facing a short timeline, no defense for trying to sort through false claims, no consideration for content that may be a matter of public concern, and the threat of painful FTC punishments, companies will err drastically on the side of over-removing content. And this isn’t just a hypothetical prediction. We already have another notice-and-takedown regime in the Digital Millennium Copyright Act (DMCA), which creates liability for platforms that fail to act on copyright takedown notices. 

The DMCA is a known problem, as it is continuously abused to take down massive quantities of non-copyright-protected speech. This abuse is rampant even though the DMCA includes a provision designed to punish fraudulent claims. The TAKE IT DOWN Act does not even try to prevent such bad claims, meaning its takedown regime will likely be abused even more. And since companies must also identify and remove copies of reported images elsewhere on their platforms, a task far easier for larger companies than for smaller companies or websites, abuse of this takedown regime will have far-reaching impacts. Yes, there will be many legitimate takedown requests, but the scale of likely abuse targeting legitimate speech should be troubling. 

To give a high-profile example of potential abuse, politicians are frequently mired in sexual scandals. Criticism of such leaders can often include sexual claims and depictions, whether they be suggestive and embarrassing pictures or more explicit art. A politician could easily make a claim that such sexual depictions, even if legal and in the public interest, are NCII and must be removed by the platforms. The companies, erring on the side of over-removing content, will inevitably remove large amounts of political criticism. 

Donald Trump, in his address to Congress, explicitly stated that he is “going to use that bill for myself too, if you don’t mind because nobody gets treated worse than I do online, nobody.” And countless other political actors, businesses, activists, and trolls will similarly try to abuse this law to silence their opponents, their competitors, and important conversations happening in society. Other consensual sexual content online could also be subject to targeting under this takedown system. 

The bill also poses a potential threat to encryption. It covers a wide range of platforms, including encrypted communications services that cannot remove reported content because the provider cannot access its users’ data. If such a service receives a takedown order, how can it comply? Will it be forced to build new products in which encryption can be broken? Or will the threat of future punishment constantly hang over its head even if no immediate sanctions occur? Either way, this threat to encryption undermines an important safeguard that allows everyone to keep their information and communications private and secure. 

Combining Good Intentions with Good Policy 

Policymakers, activists, and users of online platforms are right to want to defeat the spread of non-consensual intimate imagery online. While the takedown regime created by the TAKE IT DOWN Act will combat NCII, it will also harm free speech online and threaten smaller companies and encrypted services. A more tailored measure, one that clarifies that both real and AI-generated NCII are illegal and improves enforcement of existing statutes, would meaningfully combat such abusive content without the harmful side effects. 
