Who could possibly oppose legislation to get tough on AI-generated revenge porn? For one, Kentucky Republican Rep. Thomas Massie, one of the two nays in Monday's House vote on the TAKE IT DOWN Act. For another, a whole bunch of civil liberties advocates, including folks with groups like the American Civil Liberties Union, the Electronic Frontier Foundation, and The Future of Free Speech.
That's because no matter how worthy the intentions behind the TAKE IT DOWN Act may be, the way it's written poses major threats to protected expression and online privacy. It would also give politicians another tool with which to pressure technology companies into doing their bidding.
None of the measure's critics are defending "revenge porn," or what the bill calls "nonconsensual intimate visual depictions." Rather, they worry that the measure would be "ripe for abuse, with unintended consequences," as Massie put it.
Alas, the TAKE IT DOWN Act (S.146), sponsored by Sen. Ted Cruz (R–Texas), has now passed the Senate and the House. Next stop: President Donald Trump, who has been supportive of the bill.
You're reading Sex & Tech, from Elizabeth Nolan Brown. Get more of Elizabeth's sex, tech, bodily autonomy, law, and online culture coverage.
What the TAKE IT DOWN Act Says
The measure would make it a federal crime to publish "any intimate visual depiction of an identifiable individual" online if the image was computer- or AI-generated and was "indistinguishable from an authentic visual depiction of that individual," unless the depicted individual consented to its publication or "voluntarily exposed" such an image in a "public or commercial setting" themselves.
So, no Photoshopping a celebrity's head onto someone else's racy photo and posting it to some online forum. No asking Grok to imagine your ex in a compromising situation with J.D. Vance or a pizza delivery guy or a Smurf, and then messaging that image to friends. And so on.
The measure would also ban publishing "an intimate visual depiction of an identifiable individual" online unless the depicted individual "voluntarily exposed" the image "in a public or commercial setting" or otherwise had no expectation of privacy. In this case, the crime is sharing real images of someone who didn't want them shared.
Notably, the bill contains an exception for real or AI-generated images shared by law enforcement agencies or other government actors doing so as part of "investigative, protective, or intelligence activity." (Wouldn't want to jeopardize any of those catfishing sex stings, would we?)
For everyone else, violating the terms of the TAKE IT DOWN Act could mean up to two years in prison if the depicted individual was an adult and up to three years in prison if the depicted individual was a minor.
Threatening Free Speech and Encryption
Already, there's some danger here of roping in people who share parodies and other protected speech.
But perhaps a bigger problem is the way the new measure would be enforced against tech platforms.
The bill would require online platforms to establish a notice-and-removal regime similar to the one used for copyright infringement (a notoriously easy-to-abuse system). Platforms would be required to remove reported images within 48 hours of receiving a request and "make reasonable efforts to remove any known identical copies of such depiction." The quick turnaround required, and the liability imposed if a platform fails to comply, would incentivize companies to simply take down any reported images, even when those weren't breaking the law. That makes it ripe for use by people who want legal images removed.
"Services will rely on automated filters, which are infamously blunt tools," warned Electronic Frontier Foundation Activism Director Jason Kelley. "They frequently flag legal content, from fair-use commentary to news reporting."
The law would also incentivize greater monitoring of speech, "including speech that is currently encrypted," noted Kelley. "The law thus presents a huge threat to security and privacy online."
And the agency tasked with ensuring tech-company compliance would be the Federal Trade Commission (FTC), a body of political appointees that can be highly influenced by the whims of whoever is in power. That makes the measure ripe for use against politically disfavored tech companies and easily wielded as a jawboning tool to get tech platforms to do an administration's bidding.
That also makes it easily susceptible to corrupt uses, such as removing images embarrassing to politicians. ("I'm going to use that bill for myself, too, if you don't mind," Trump told Congress in March. "Because nobody gets treated worse than I do online.")
TAKE IT DOWN’s Many Critics
The bill has bipartisan support in Congress, as bills aimed at giving the government more control over online spaces are wont to (see: FOSTA). But it has been roundly criticized by groups concerned with free speech and other civil liberties.
"The TAKE IT DOWN Act responds to real harms, but in the hands of a government increasingly willing to regulate speech, its broad provisions provide a powerful new tool for censoring lawful online expression, monitoring private communications, and undermining due process," said Ashkhen Kazaryan, senior legal fellow at The Future of Free Speech.
The TAKE IT DOWN Act "creates unacceptable risks to users' fundamental privacy rights and cybersecurity by undermining encryption," a coalition of civil liberties and cybersecurity groups and experts wrote in a letter earlier this month. "Although the Act appropriately excludes some online services — including '[providers] of broadband internet access service' and '[electronic] mail' — from the definition of 'covered platform,' the Act does not exclude private messaging services, private electronic storage services, or other services that use encryption to secure users' data," states the letter, signed by the American Civil Liberties Union, the Internet Society, and New America's Open Technology Institute, among many others.
The notice-and-takedown scheme "would result in the removal of not just nonconsensual intimate imagery but also speech that is neither illegal nor actually [nonconsensual distribution of intimate imagery]," a group of civil liberties organizations, including the Center for Democracy & Technology, Fight for the Future, the Freedom of the Press Foundation, TechFreedom, and the Woodhull Freedom Foundation, wrote to senators in February. "This mechanism is likely unconstitutional and will undoubtedly have a censorious impact on users' free expression. While the criminal provisions of the bill include appropriate exceptions for consensual commercial pornography and matters of public concern, these exceptions are not included in the bill's takedown system."
"The bill is so bad that even the Cyber Civil Rights Initiative, whose entire existence is based on representing the interests of victims of [non-consensual intimate imagery] and passing bills like the Take It Down Act, has come out with a statement saying that, while it supports laws to address such imagery, it cannot support this bill due to its many, many inherent problems," notes Mike Masnick at Techdirt. "The bill's vague standards combined with harsh criminal penalties create a perfect storm for censorship and abuse."
"While the bill is meant to address a serious problem, good intentions alone aren't enough to make good policy," said Kelley. "Lawmakers should be strengthening and enforcing existing legal protections for victims, rather than inventing new takedown regimes that are ripe for abuse."
Follow-Up: Cambridge Sex Workers Weren't Locked Inside
A few weeks ago, this newsletter covered a case against a Cambridge, Massachusetts, sex business. Though not as steeped in human trafficking fantasies as many sex work busts are, a Homeland Security agent did claim that a manager locking the door from the outside "employed this tactic so that the commercial sex providers felt that they had to stay in the unit to perform sex acts for money on behalf of the prostitution network." That claim was subsequently used by some media to fuel claims that workers were coerced, and the bit about locking the door from the outside would later be repeated by federal prosecutors.
But "an employee of the Atmark, the building where the door-locking took place, said Thursday that all its apartment doors can be unlocked from the inside, and that renters aren't allowed to replace locks—currently high-tech devices controlled by smartphone—with their own fixtures," reports Cambridge Day.
Cambridge Day "has confirmed what we suspected—the apartment in the Cambridge brothel case could be unlocked from the inside, debunking the government affidavit's claim that the women were locked inside," the Boston Sex Workers and Allies Collective (BSWAC) posted on BlueSky.
More Sex & Tech News
• Statistics professor Aaron Brown dismantles an influential study linking legalized prostitution to increases in human trafficking. "The study, published in 2013 in the journal World Development, has been used to stop legalization initiatives around the world and to justify harsh new laws that turn customers of voluntary sex work into criminals, often in the name of stopping human trafficking," Brown points out. "Unfortunately, the authors of the study used a flawed economic model and abysmal data to reach their conclusion. When critical information was missing, they guessed and filled it in. Then, when the analysis didn't yield what appeared to be the authors' desired finding, they threw out the data. There is no evidence that legalizing prostitution increases human trafficking."
• Asian massage parlor panic will not die. Time and again, media outlets are eager to lap up groups warning that immigrant massage workers are sex slaves, even though nearly every "human trafficking" bust at a massage parlor winds up with the workers themselves getting charged with prostitution or unlicensed massage and little else.
• Trump is repeating Joe Biden's AI mistakes.
• The Wall Street Journal provoked Meta's AI chatbots into some sexy talk and then freaked out about it. "The use-case of this product in the way described is so manufactured that it's not just fringe, it's hypothetical," a Meta spokesman told the Journal in response. "Nevertheless, we have now taken additional measures to help ensure other individuals who want to spend hours manipulating our products into extreme use cases will have an even more difficult time of it."
Today's Image

