Prof. Eric Goldman flags the Ninth Circuit's appellate decision of June 20, 2023, No. 21-16499, Vargas et al. v. Facebook, a case of alleged discrimination in the serving of commercial offers on Facebook's marketplace.
The claim: <<The operative complaint alleges that Facebook’s “targeting methods provide tools to exclude women of color, single parents, persons with disabilities and other protected attributes,” so that Plaintiffs were “prevented from having the same opportunity to view ads for housing” that Facebook users who are not in a protected class received>>.
In short, the safe harbor does not apply because Facebook is not a mere bystander but a co-author of the unlawful conduct, being the creator of the algorithm used in the discriminatory practice:
<<2. The district court also erred by holding that Facebook is immune from liability pursuant to 47 U.S.C. § 230(c)(1). “Immunity from liability exists for ‘(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a [federal or] state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.’” Dyroff v. Ultimate Software Grp., 934 F.3d 1093, 1097 (9th Cir. 2019) (quoting Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1100 (9th Cir. 2009)). We agree with Plaintiffs that, taking the allegations in the complaint as true, Plaintiffs’ claims challenge Facebook’s conduct as a co-developer of content and not merely as a publisher of information provided by another information content provider.
Facebook created an Ad Platform that advertisers could use to target advertisements to categories of users. Facebook selected the categories, such as sex, number of children, and location. Facebook then determined which categories applied to each user. For example, Facebook knew that Plaintiff Vargas fell within the categories of single parent, disabled, female, and of Hispanic descent. For some attributes, such as age and gender, Facebook requires users to supply the information. For other attributes, Facebook applies its own algorithms to its vast store of data to determine which categories apply to a particular user.
The Ad Platform allowed advertisers to target specific audiences, both by including categories of persons and by excluding categories of persons, through the use of drop-down menus and toggle buttons. For example, an advertiser could choose to exclude women or persons with children, and an advertiser could draw a boundary around a geographic location and exclude persons falling within that location. Facebook permitted all paid advertisers, including housing advertisers, to use those tools. Housing advertisers allegedly used the tools to exclude protected categories of persons from seeing some advertisements.
As the website’s actions did in Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (en banc), Facebook’s own actions “contribute[d] materially to the alleged illegality of the conduct.” Id. at 1168. Facebook created the categories, used its own methodologies to assign users to the categories, and provided simple drop-down menus and toggle buttons to allow housing advertisers to exclude protected categories of persons. Facebook points to three primary aspects of this case that arguably differ from the facts in Roommates.com, but none affects our conclusion that Plaintiffs’ claims challenge Facebook’s own actions>>.
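To make the court's co-developer reasoning concrete, here is a minimal, hypothetical sketch of the include/exclude audience mechanism the opinion describes. Everything in it (the User and AdAudience names, the category labels, the matches method) is an illustrative assumption, not Facebook's actual Ad Platform API; the point is only that once the platform itself assigns users to categories and exposes exclusion toggles, screening out a protected class takes a single advertiser selection.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    # Categories the platform has assigned to the user, whether
    # self-reported (age, gender) or inferred by its own algorithms.
    categories: set[str] = field(default_factory=set)

@dataclass
class AdAudience:
    include: set[str] = field(default_factory=set)  # drop-down selections
    exclude: set[str] = field(default_factory=set)  # toggle-button exclusions

    def matches(self, user: User) -> bool:
        # The user sees the ad only if they carry every included
        # category and none of the excluded ones.
        if self.include and not self.include <= user.categories:
            return False
        return not (self.exclude & user.categories)

# Hypothetical data mirroring the complaint: the platform has tagged
# a user with the categories the opinion lists for Plaintiff Vargas.
vargas = User(categories={"female", "single parent", "disabled", "hispanic"})

# One toggle is enough to hide a housing ad from every single parent.
housing_ad = AdAudience(exclude={"single parent"})
print(housing_ad.matches(vargas))  # False: the ad is never shown to her
```

Nothing in the sketch forces an advertiser to discriminate, which is precisely Facebook's second objection below; the court's answer, as in Roommates.com, is that the platform both builds the category assignments and hands advertisers the switch that makes exclusion direct and easy.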
Here are Facebook's three objections, together with the court's reasons for rejecting each:
<<First, in Roommates.com, the website required users who created profiles to self-identify in several protected categories, such as sex and sexual orientation. Id. at 1161. The facts here are identical with respect to two protected categories because Facebook requires users to specify their gender and age. With respect to other categories, it is true that Facebook does not require users to select directly from a list of options, such as whether they have children. But Facebook uses its own algorithms to categorize the user. Whether by the user’s direct selection or by sophisticated inference, Facebook determines the user’s membership in a wide range of categories, and Facebook permits housing advertisers to exclude persons in those categories. We see little meaningful difference between this case and Roommates.com in this regard. Facebook was “much more than a passive transmitter of information provided by others; it [was] the developer, at least in part, of that information.” Id. at 1166. Indeed, Facebook is more of a developer than the website in Roommates.com in one respect because, even if a user did not intend to reveal a particular characteristic, Facebook’s algorithms nevertheless ascertained that information from the user’s online activities and allowed advertisers to target ads depending on the characteristic.
Second, Facebook emphasizes that its tools do not require an advertiser to discriminate with respect to a protected ground. An advertiser may opt to exclude only unprotected categories of persons or may opt not to exclude any categories of persons. This distinction is, at most, a weak one. The website in Roommates.com likewise did not require advertisers to discriminate, because users could select the option that corresponded to all persons of a particular category, such as “straight or gay.” See, e.g., id. at 1165 (“Subscribers who are seeking housing must make a selection from a drop-down menu, again provided by Roommate[s.com], to indicate whether they are willing to live with ‘Straight or gay’ males, only with ‘Straight’ males, only with ‘Gay’ males or with ‘No males.’”). The manner of discrimination offered by Facebook may be less direct in some respects, but as in Roommates.com, Facebook identified persons in protected categories and offered tools that directly and easily allowed advertisers to exclude all persons of a protected category (or several protected categories).
Finally, Facebook urges us to conclude that the tools at issue here are “neutral” because they are offered to all advertisers, not just housing advertisers, and the use of the tools in some contexts is legal. We agree that the broad availability of the tools distinguishes this case to some extent from the website in Roommates.com, which pertained solely to housing. But we are unpersuaded that the distinction leads to a different ultimate result here. According to the complaint, Facebook promotes the effectiveness of its advertising tools specifically to housing advertisers. “For example, Facebook promotes its Ad Platform with ‘success stories,’ including stories from a housing developer, a real estate agency, a mortgage lender, a real estate-focused marketing agency, and a search tool for rental housing.” A patently discriminatory tool offered specifically and knowingly to housing advertisers does not become “neutral” within the meaning of this doctrine simply because the tool is also offered to others>>.