In late January, a Senate hearing took social media platforms and other Big Tech companies to task for harms to children. Many states have already passed laws regulating social media, online privacy practices, and other aspects of minors' interactions online. Now there is speculation about whether Congress can or will pass any national measures addressing children's experiences online. Much of the coverage of the hearing focused on the sexual exploitation and bullying of children online. It remains to be seen whether any follow-up action will result, and whether it will be effective.
Among the proposals were the STOP CSAM Act (Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act), which would give victims new avenues to report child sexual abuse material to internet companies, and the REPORT Act (Revising Existing Procedures on Reporting via Technology), which would expand the types of potential crimes that online platforms are required to report to the National Center for Missing and Exploited Children.
Other proposals would make it a crime to distribute an intimate image of someone without that person’s consent and would push law enforcement to coordinate investigations into crimes against children.
A separate proposal passed last year by the Senate Commerce Committee, the Kids Online Safety Act, would create a legal duty for certain online platforms to protect children.