Congress weighs new rules for tech: what is being considered

WASHINGTON (AP) – Should TikTok be banned? Should younger children be barred from engaging with social media? Can the government ensure private information is safe? What about brand new artificial intelligence interfaces? Or should users regulate themselves and leave the government out?

Tech regulation has gained momentum on Capitol Hill amid concerns over TikTok’s Chinese ownership and parents’ growing worry about social media’s role in a post-pandemic youth mental health crisis. Noting that many young people are struggling, President Joe Biden said in his State of the Union address in February that it was “time” to pass bipartisan legislation imposing stricter limits on the collection of personal data and banning advertising targeted at children.

“We need to finally hold social media companies accountable for the experiment they’re doing with our kids for profit,” Biden said.

Lawmakers have introduced a series of bipartisan bills to regulate the technology, and it’s one of the few major policy issues on which Republicans and Democrats generally agree, fueling hopes of compromise in a divided Congress.

Still, any attempt to take on the mammoth industry would meet with major obstacles. Tech companies have aggressively opposed any federal interference, and they’ve operated without strict federal oversight for decades, making new rules or guidelines that much more complicated.

A look at some of the areas of potential regulation:

PROTECTING KIDS ONLINE

Several House and Senate bills seek to make social media, and the internet more broadly, safer for children who will inevitably be online. Lawmakers cite numerous examples of teens who have died by suicide after being cyberbullied or who have died as a result of dangerous behavior encouraged on social media.

At least two competing bills in the Senate focus on children’s online safety. Legislation by Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn., approved by the Senate Commerce Committee last year, would require social media companies to be more transparent about their practices and to enable child safety settings by default. Minors would have the option to disable addictive product features and the algorithms that push specific content.

The idea, according to the senators, is that platforms should be “safe by design.” The legislation, which Blumenthal and Blackburn reintroduced last week, would also require social media companies to prevent certain harms to minors – including the promotion of suicide, eating disorders, substance abuse, sexual exploitation and other illegal behavior.

A second bill, introduced last month by four senators — Democratic Sens. Brian Schatz of Hawaii and Chris Murphy of Connecticut, and Republican Sens. Tom Cotton of Arkansas and Katie Britt of Alabama — would take a more aggressive approach, prohibiting children under age 13 from using social media platforms and requiring parental consent for teens. It would also bar companies from algorithmically recommending content to users under the age of 18.

Senate Majority Leader Chuck Schumer, D-N.Y., has not commented on specific legislation but told reporters Tuesday, “I think we need some kind of child protection” on the internet.

Critics of the bills, including some civil rights groups and advocacy groups linked to tech companies, say the proposals could threaten teens’ online privacy and cut them off from content that might help them, such as information about sexuality and gender identity.

“Legislators should focus on educating families and giving them the power to take control of their online experience,” said Carl Szabo of NetChoice, a group that works with Meta, TikTok, Google and Amazon, among others.

DATA PRIVACY

Biden’s State of the Union remarks appeared to be a nod to legislation by Sens. Ed Markey, D-Mass., and Bill Cassidy, R-La., that would expand children’s online privacy protections, bar companies from collecting personal information from younger teens and ban advertising targeted at children and teenagers. The bill, also reintroduced last week, would create what is known as an “eraser button,” allowing parents and children to delete personal information where possible.

A broader House effort would seek to give adults and children more control over their data through what lawmakers call a “national privacy standard.” Legislation passed by the House Energy and Commerce Committee last year with broad bipartisan support would minimize the data companies can collect, make it illegal to target ads to children and preempt state laws that have sought to impose their own privacy restrictions. But the bill, which would also have given consumers more rights to sue over data breaches, never reached the House floor.

The prospects for House legislation are unclear now that Republicans have the majority. House Energy and Commerce Chair Cathy McMorris Rodgers, R-Wash., has made the issue a priority and held several privacy hearings. But the committee has yet to move forward with a new bill.

TIKTOK

Lawmakers have introduced a series of bills that would either ban TikTok outright or make it easier to do so, following a combative March House hearing in which lawmakers from both parties pressed TikTok CEO Shou Zi Chew on his company’s ties to China’s Communist government, its data security practices and harmful content on the app.

Chew tried to reassure lawmakers that the hugely popular video-sharing app prioritizes user safety and shouldn’t be banned because of its Chinese connections. But the testimony gave new impetus to the effort.

Shortly after the hearing, Republican Senator Josh Hawley from Missouri attempted to force a Senate vote on legislation that would ban TikTok from operating in the United States. But he was blocked by another Republican, Sen. Rand Paul of Kentucky, who said a ban would be unconstitutional and anger the millions of voters who use the app.

Another bill sponsored by Republican Sen. Marco Rubio of Florida would, like Hawley’s bill, ban U.S. commercial transactions with TikTok, but it would also create a new framework for the executive branch to block any foreign apps deemed hostile. His bill is co-sponsored in the House by Reps. Raja Krishnamoorthi, D-Ill., and Mike Gallagher, R-Wis.

There is broad Senate support for bipartisan legislation sponsored by Senate Intelligence Committee Chairman Mark Warner, D-Va., and Sen. John Thune of South Dakota, the No. 2 Senate Republican, which does not single out TikTok but would give the Commerce Department authority to review and potentially restrict foreign threats to tech platforms.

The White House has signaled it would support the bill, but it’s unclear if it will be brought up in the Senate or if it could find support among House Republicans.

TikTok has launched an extensive lobbying campaign for its survival, including by getting influencers and young voters to argue that the app is not malicious.

ARTIFICIAL INTELLIGENCE

A more recent question for Congress is whether lawmakers should regulate artificial intelligence, as rapidly evolving and potentially disruptive products such as the AI chatbot ChatGPT emerge with the ability to mimic human behavior in many ways.

Schumer has made the emerging technology a priority, arguing that the United States needs to get ahead of China and other countries eyeing regulations for AI products. He has worked with AI experts and published a general framework for what regulation could look like, including increased disclosure of the people and data involved in developing the technology, more transparency, and explanations of how bots arrive at their answers.

Schumer said any regulation should “prevent potentially catastrophic damage to our country while ensuring that the US advances and leads in this transformative technology.”

The White House has also focused on the issue, recently announcing a $140 million investment to create seven new AI research institutes. Vice President Kamala Harris met with leaders from Google, Microsoft and other companies developing AI products on Thursday.