Is encryption the biggest impediment to law enforcement’s ability to stop child sexual predators? For the advocates of the EARN IT Act, which would loosen the rules protecting Internet services’ use of encryption, it most certainly is.
The Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act would create an online child sexual exploitation prevention commission to develop “best practices” which Internet services would have to adhere to in order to retain the protections of Section 230 of the Communications Decency Act of 1996. Section 230 protects Internet services such as Facebook and Twitter from lawsuits over content published on their sites by their users.
Following the addition of a manager’s amendment by cosponsor Senator Lindsey Graham (R-SC) which weakens some parts of the bill and leaves the impact of other sections unclear, the Senate Judiciary Committee voted to approve the EARN IT Act today (Thursday, July 2). The next step for the bill is to be voted on by the full Senate.
The EARN IT Act, originally introduced by Senators Richard Blumenthal (D-CT) and Graham, doesn’t actually use the word “encryption,” and both senators denied on Thursday that the bill is intended to interfere with how tech companies use encryption to protect their users’ data and communications. The bill creates a 19-member commission to determine what the “best practices” should be, with three mandatory commission members: the U.S. Attorney General, the Secretary of Homeland Security, and the Chair of the Federal Trade Commission. Any one of these three would be empowered by the guidelines set out in the EARN IT Act to veto recommended “best practices.”
Given that many US government leaders have a decades-long history of opposing the private use of digital encryption, cybersecurity and privacy advocates fear that the EARN IT Act commission is a wolf in sheep’s clothing. Their concern is that EARN IT is an attempt to hide an attack on the use of encryption among the legitimate concerns over the proliferation of child sexual abuse material (CSAM) online, says Riana Pfefferkorn, associate director of surveillance and cybersecurity at the Stanford Center for Internet and Society.
“There is a sense that tech companies are too big for their britches, and someone should stick it to them,” Pfefferkorn says, thanks to the spread of hate speech, misinformation, and disinformation online. “EARN IT will hurt all of us, but it won’t financially hurt the companies, and it won’t help catch the bad guys. It’s the wrong tool to indulge that understandable impulse in the year of our lord 2020.”
A second bill: the LAED Act
The EARN IT bill is not the only attempt by lawmakers to restrict the use of encryption, which has become increasingly commonplace in the aftermath of the disclosures by whistleblower Edward Snowden, especially in messaging apps such as iMessage, Signal, and WhatsApp, which use it to prevent malicious hackers and government snoops from spying on message content. A second bill, the Lawful Access to Encrypted Data (LAED) Act, would force tech companies with more than 1 million users to build government-accessible backdoors into the encryption they’ve deployed, to aid search warrants targeting devices used by government targets.
Where EARN IT is ostensibly focused on child abuse, the intent of LAED, co-sponsored by Republican senators Graham, Tom Cotton (R-AR), and Marsha Blackburn (R-TN), is much clearer, says Pfefferkorn.
Introduced on June 23, the LAED Act, she says, is uncomplicated in its requirements and would affect most tech providers today, including operating-system makers and device manufacturers such as Apple, Google, Microsoft, Amazon, and Samsung, along with the entire Android ecosystem; devices as diverse as the Xbox, voting machines, and the panoply of the Internet of Things; messaging services including Apple’s iMessage, Facebook’s WhatsApp, and Signal; and services that offer encrypted storage, such as Box and Dropbox.
“It’s a backdoor mandate. If you are one of the larger entities out there, you have to redesign everything” to pass muster.
However, advocates who specialize in stopping CSAM say that Big Tech is simply looking for a way out of a legitimate government interest in using its power to stop child sexual abuse and communication between those who create and distribute child pornography. Benjamin Bull, general counsel for the National Center on Sexual Exploitation, says that his organization supports the EARN IT Act and LAED Act because law enforcement has thus far lacked the tools to stop CSAM.
“Predators and purveyors of CSAM are completely unregulated on the Dark Web and encrypted web,” he says, and calls concerns over privacy rights “a red herring” because “law enforcement would still have to prove probable cause” in order to obtain a warrant to search encrypted digital communications and files.
“Today’s Internet is basically a law enforcement-free zone where child predators can swim around like sharks. The only privacy interests at stake are people engaging in criminal activity. People who are not violating the law don’t have anything to worry about,” Bull says.
Invest in Child Safety Act
The EARN IT and LAED bills are not the only ones attempting to address the issue of CSAM and encryption. The Invest in Child Safety Act, a sweeping bill proposed by Sen. Ron Wyden (D-OR) in the Senate and Rep. Anna Eshoo (D-CA) in the House of Representatives, would provide $5 billion over a decade to better fund existing anti-CSAM measures at the FBI, the National Center for Missing and Exploited Children, and Internet Crimes Against Children task forces.
Supporters of law enforcement have been making the case that ending encryption would stop the problem, proponents of the Invest in Child Safety Act say, but upsetting the entire intermediary liability regime we have is not the way to do it. That’s not to absolve what the companies have done, they add; there’s more work to be done there.
Impacts of laws could go far beyond privacy, security
Jeffrey Westling, a technology and innovation policy fellow at the nonprofit, nonpartisan R Street Institute, says that EARN IT and LAED could gut the Section 230 protections as they’ve been used to facilitate global communications since the rise of the commercial Internet in the mid-1990s.
“People misunderstand why we need intermediary liability generally. It’s not a special carve-out for any one company or any big company. All it says is that a company is not responsible for individual speech on its platform. It’s going to have a terrible impact on our ability to communicate,” he says. “These communications are vital.”
Seth is editor-in-chief and founder of The Parallax, an online cybersecurity and privacy news magazine. He has worked in online journalism since 1999, including eight years at CNET News, where he led coverage of security, privacy, and Google. He is based in San Francisco.