AdTech & Digital Media Newsletter – Social Media

Welcome to the November/December 2022 newsletter produced by Arnold & Porter’s AdTech and Digital Media group. This newsletter covers legislative, regulatory, and case law developments relevant to the AdTech and digital media industries, the status of M&A activity in the AdTech and MarTech industries, and updates on Arnold & Porter’s activities in this area.

Legislation

Senate report on domestic terrorism criticizes social media business models. A report by the Senate Homeland Security and Governmental Affairs Committee focused on the rising threat of domestic terrorism and examined the impact social media has had on the dissemination of extremist content. The report looked at four major social media companies and concluded: “This report also finds that social media companies have failed to meaningfully address the growing presence of extremism on their platforms. The business models of these companies are based on maximizing user engagement, growth and profit, which incentivizes more and more extreme content – and without new incentives or regulations, extremist content will continue to proliferate and corporate moderation practices will continue to be insufficient to stop its proliferation.”

Failure of the Journalism Competition and Preservation Act (“JCPA” or the “Act”). Momentum for passage of the JCPA faltered in early December when it was left out of a bicameral agreement on defense spending legislation. The JCPA would have allowed approved news media to negotiate the use of their content with covered online platforms such as Facebook. The bill, sponsored by Senator Amy Klobuchar and co-sponsored by senators from both parties, would have created an antitrust exception allowing eligible publishers and broadcasters to negotiate with covered online platforms over the prices, terms, and conditions on which those platforms could access their content. Proponents of the bill argued that social media platforms like Facebook were to blame for the decline in advertising revenue for many news publications and media outlets. The bill was opposed by trade groups NetChoice and the Computer & Communications Industry Association, as well as a number of other organizations, which claimed it was flawed for several reasons. These included that it would force platforms to negotiate with and carry content from digital journalism providers no matter how extreme their content, that it would allow large media conglomerates to dominate negotiations at the expense of small media companies, and that it would favor large broadcasters over other forms of journalism.


Legal challenge to California’s age-appropriate design law (AB 2273). As detailed in our September newsletter, the California Age-Appropriate Design Code Act (CAADCA) was signed into law on September 15, 2022 to promote online privacy for minors. The provisions of the law generally come into force on July 1, 2024. Technology trade group NetChoice has filed a lawsuit seeking to enjoin enforcement of the law on various grounds. According to the complaint, the law suffers from a number of defects, including the following:

  • It violates the First Amendment by improperly telling websites how to handle constitutionally protected speech and improperly “deputiz[ing] online service providers” to act as roving internet censors at the behest of the state;

  • It is void for vagueness under the First Amendment and Due Process Clause;

  • It violates the Commerce Clause because it “attempts to impose an undue and unreasonable burden on interstate commerce”;

  • It is preempted by the Children’s Online Privacy Protection Act (COPPA) and Section 230 of the Communications Decency Act;

  • It violates the Fourth Amendment by requiring companies to provide commercially sensitive information to the Attorney General “upon request without an opportunity for prior review by a neutral decision-maker”; and

  • It violates various provisions of the California Constitution.

FTC developments

Epic Games pays $520M in penalties for alleged COPPA violations and in refunds to users for allegedly unwanted charges. On December 19, 2022, the FTC announced that it had reached an agreement with Epic Games, maker of the popular Fortnite video game, that included a $275 million penalty for alleged COPPA violations and an obligation to refund $245 million to users for charges they were allegedly tricked into incurring through Epic Games’ use of dark patterns. (See our September Advisory for a discussion of the FTC’s focus on digital dark patterns.) The penalty stemmed from a lawsuit filed in federal court in which the FTC alleged that Epic Games collected personal information from children under the age of 13 without obtaining parental consent, and that Epic Games engaged in unfair practices by using default-on text and voice communications that resulted in children and teens being subjected to bullying, threats, and harassment from strangers with whom they were matched. The court order bars Epic Games from providing voice and text communications to children without parental consent, or to teens without consent from the teens or their parents. The refund obligation arose from a separate administrative action in which the FTC alleged that Epic Games used various dark patterns, such as a confusing button configuration, to trick users into making unwanted in-game purchases. Prior to 2018, Epic Games also allowed children to purchase online currency without parental or cardholder consent. The FTC also alleged that Epic Games engaged in other problematic billing practices, such as locking the accounts of users who disputed charges, resulting in those users losing access to online content they had purchased.

Case law developments

More on Gonzalez v. Google (CDA Section 230). As described in our October newsletter, the Supreme Court has agreed to hear a case over whether Section 230 of the Communications Decency Act of 1996 shields interactive computer services that use algorithms to target and recommend third-party content to users. The Chamber of Progress, a US-based trade group representing technology companies, recently wrote a letter to Attorney General Merrick Garland arguing that the decision has important implications for online access to reproductive health information and asking the United States Department of Justice to file a brief in support of the defendants. The letter, dated November 21, 2022, said that eroding Section 230 protections would open the floodgates to liability for online service providers and websites that allow reproductive health information on their platforms. The Chamber of Progress wrote that this would create a “devastating reality for women seeking reproductive resources in states where they are unavailable.”

EU

EU data protection authorities threaten Meta’s targeted ad model. In early December, a board representing EU data protection authorities found that Meta breached Europe’s fundamental data protection law, the GDPR, specifically by processing personal user data to deliver personalized advertising. Users can opt out of personalized ads shown to them based on their activity on third-party websites, but have no option to opt out of ads based on their activity on Meta’s own platforms. Meta does not rely on user consent for this type of personalized advertising; instead, it claims that personalized ads are necessary for the performance of its contracts with users. The board rejected “performance of contract” as a valid legal basis under the GDPR for the delivery of personalized advertising and directed the Irish Data Protection Commission to issue binding orders against Meta and impose fines. The Irish Commission must follow the board’s decision, which has not yet been published, although appeals can still be lodged.

Other

Report advocates use of expert panels for content moderation. A paper published by the Stanford Institute for Human-Centered Artificial Intelligence summarized a study of the perceived legitimacy of social media content moderation processes, which concluded that respondents viewed expert panels as the most legitimate of the four types of content moderation processes considered. The other three types were paid contractors (i.e., individuals hired and trained by the social media company), automated systems (i.e., algorithms), and digital juries (i.e., ad hoc groups of users). The study used a metric called “descriptive legitimacy” (also referred to as “perceived legitimacy”), which consists of five parts: satisfaction with the way the organization handled the decision, trustworthiness of the organization, fairness and impartiality of the organization, commitment to continue using the organization, and belief that the organization should maintain its level of decision-making authority. Respondents considered algorithms to be the most impartial moderation process, subject to caveats about how the algorithm was developed and the availability of human review; however, algorithms lagged behind expert panels in overall legitimacy. The paper recommended that social media companies use expert panels as part of their content moderation processes.

Markets

Based on M&A transaction data provided by PitchBook, change-of-control transactions in AdTech and MarTech in (a) the US (see Figure 1 below) and (b) the US, EU, and Asia combined (see Figure 2 below), measured both by dollar volume and by number of deals, decreased significantly in CY2022 versus CY2021.

(Figure 1: US only)

(Figure 2: USA, Europe, Asia)


Arnold & Porter updates

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
