from the getting-it-backwards dept
In early December 2022, a former Israeli Minister of Defense and Chief of Staff of the Israel Defense Forces, three other retired Israeli generals, a former Commissioner of the Israeli Police, and a former head of the Mossad’s Intelligence Directorate filed an amicus brief before the U.S. Supreme Court in Gonzalez v. Google arguing that Internet platforms should be civilly liable for third-party content that encourages terrorist activity. In their filing, they claimed that the wave of terror in Israel in 2015–2016 “became known as the ‘Facebook intifada’ and the #stab! campaign due to the essential role social media played in inciting the perpetrators to attack civilians.” The Anti-Defamation League also filed a brief in the case, similarly arguing that Internet platforms should have legal accountability for violence against Jewish Americans and other vulnerable communities encouraged by these platforms’ recommendation engines. So, too, the Zionist Organization of America asserted that Internet platforms should not be immune from liability “when they target specific users and recommend and direct them to new content that helps fan the flames of hatred and violence against the Jewish community.”
There is no doubt that Internet platforms are used to disseminate antisemitic content. But these briefs fail to recognize that these same platforms greatly foster Jewish community and religious activity in the United States and throughout the Diaspora; and that the legal interpretations these briefs advocate could drastically diminish this activity.
In Gonzalez, the U.S. Supreme Court will consider for the first time the scope of the safe harbor provided by Section 230 of the Communications Decency Act, which limits the liability of an Internet platform for content uploaded by third parties. Since Section 230’s enactment in 1996, the lower courts have interpreted it broadly. Some politicians and interest groups argue that these interpretations are overly broad and have disincentivized Internet companies from eliminating hate speech and disinformation from their sites. Free speech advocates, on the other hand, contend that the courts have correctly applied Section 230 in a manner that enables platforms to allow any speaker to reach a global audience at no, or minimal, cost, without prior vetting or filtering. In Gonzalez, the Supreme Court could uphold the broad interpretation upon which Internet companies have relied for the past quarter century, or it instead could narrow the scope of the safe harbor and disrupt the existing business models of the Open Internet.
Increasing platforms’ liability for third-party content would force them to act as gatekeepers; to reduce their exposure to ruinous damages, many platforms would permit dissemination only of paid or pre-approved content. This could adversely affect a wide range of online activity, including the rich Jewish life facilitated by the Internet.
Worship. Even before the Covid-19 pandemic, Jewish congregations had begun to experiment with the live streaming of religious services. Live streaming and video conferencing of services increased dramatically with the onset of the pandemic, and many congregations now use hybrid models. While Orthodox congregations will not stream their services on the Sabbath and the holidays, they will stream the daily morning, afternoon, and evening services. Less strict denominations not concerned about the use of electricity will also stream services on the Sabbath and holidays. (For example, this past Rosh HaShanah, I witnessed the blowing of the shofar at Congregation B’nai Jeshurun in New York from my hotel room in Geneva, Switzerland.) Indeed, the use of video conferencing and streaming technologies has led to extensive rabbinic debate over whether people participating in a service via Zoom count toward the “minyan,” or quorum of ten participants. The Rabbinical Assembly of the Conservative Movement issued a 50-page legal opinion on the subject.
The availability of remote attendance has led to growing participation in daily services and the strengthening of ties to Judaism. One now can routinely join shiva minyans at the houses of mourners via Zoom or other video-conferencing platforms, enabling mourners to be joined by family and friends from around the world. The same is true of other life-cycle events, such as brises (ritual circumcisions) and weddings.
Education. Internet platforms provide myriad channels for formal and informal Jewish education. During the pandemic, Jewish institutions of learning at all levels migrated online using platforms such as Zoom or Webex. Adult education programs on Jewish topics by synagogues, universities, and other organizations are now offered online. Lectures are live streamed and archived on YouTube. Hundreds of rabbis from around the world offer “daf yomi” or the daily study of a page of Talmud via Internet platforms.
Culture and Community Engagement. Social media platforms such as YouTube host vast quantities of Jewish cultural material, including videos of performances of songs and dances. Around holidays, groups such as Six13 and the Maccabeats release their latest holiday-themed recordings. Synagogues and other Jewish organizations use platforms like Facebook and Zoom for cultural events, book groups, and professional discussion forums for rabbis, cantors, and teachers. Hadassah Magazine, in an article entitled “A (Facebook) Group for Every Jewish Interest,” reviewed some of the over 1,000 Facebook groups with “Jewish” or “Jews” in their names.
In short, Internet platforms allow Jews in the Diaspora to practice their faith and strengthen their identity. Moreover, U.S.-based platforms heavily support all aspects of political, economic, cultural, and personal life in Israel. Indeed, Israelis are among the world leaders in social media use, with 77 percent of adults using social platforms such as Facebook, Instagram, and WhatsApp. Imposing greater liability on U.S. platforms for third-party content could endanger these positive uses by increasing their cost and reducing their spontaneity.
Proponents of narrowing the Section 230 safe harbor, including the generals and organizations mentioned above, may contend that they do not seek wholesale changes to the application of Section 230, but merely increased liability for the use of algorithms that recommend content. But virtually all social media sites use recommendation algorithms; the amount of content available on the Internet is so enormous that all search engines and sites hosting content must use algorithms to determine what content to offer users.
The Solicitor General of the United States, in its brief in the Gonzalez case, tried to draw a distinction between the use of algorithms by search engines to select content in response to a user’s query and the use of algorithms by a social media platform to supply a user with content through an automatically generated feed. This is a distinction without a difference. The search engine algorithm considers searches the user has previously conducted in determining what results to present in response to the particular query she is now making; the platform considers the user’s prior activity in determining what content to present in her feed. In both cases, the user’s prior activity influences the algorithm.
Furthermore, even if there were a difference between search engine results and feeds, feeds are extremely beneficial to the user and society at large in most cases. The feed provides the user with more of the content she wants to see, and usually that content is not problematic in any way. To be sure, a social media platform might feed additional antisemitic content to a user who spends some time on the platform viewing antisemitic content. By the same token, the platform would feed Jewish educational material to a user who spends time on the platform viewing Jewish educational material. Platforms should not be forced to abandon feeds, with all the resulting user benefits, because on occasion the feeds may have harmful impacts.
Finally, three quick responses to the suggestion that platforms could easily remove access to antisemitic content without changing their business models in a manner that ultimately restricts access to legitimate content. First, display of antisemitic symbols and content might be necessary for educational purposes, such as to teach about the Holocaust — but Internet companies have difficulty accurately making such content moderation distinctions, particularly at scale. Second, there is profound disagreement about when criticism of Israeli government policies towards the West Bank and Gaza Strip constitutes antisemitism. Here, too, Internet companies have trouble getting the nuance right. Third, even if the social media platforms could draw appropriate lines with respect to antisemitic material, changes to Section 230 would still lead to liability for other problematic content, and the platforms would still need to change their business models, to the detriment of Jewish activity online.
A decision in the Gonzalez case is expected by the end of June 2023.
Jonathan Band is a copyright and internet lawyer based in Maryland. This article was reposted with permission.
Filed Under: algorithms, gonzalez v. google, judaism, safe harbors, section 230, social media