from the this-is-not-a-good-lawsuit dept

Everyone wants to blame internet companies for everything. A couple of weeks back, a woman sued Meta over the death of her brother, claiming Facebook was to blame. It's the latest in a ridiculously long line of failed lawsuits seeking to hold Facebook liable for people's deaths simply because the killers, or people connected to them, communicated on social media. It's like suing AT&T because two people plotting a crime spoke on the phone. These are nonsense lawsuits, pure nuisance suits, and this one is no different.

The underlying story here is tragic: two men who were part of the “boogaloo bois” (one of the many extremist groups that believe a new civil war is coming, and that they need to help it along) killed a Federal Protective Services officer, Dave Patrick Underwood. They literally believed this was part of the process of starting that civil war. It’s quite understandable why Underwood’s family would be furious about this, and the two murderers are going to be in prison for a long, long time.

But trying to demand money from Facebook?

The entire complaint argues that because the two murderers talked on Facebook, Facebook is somehow responsible. It tries to get around Section 230 by arguing that the murderers found each other via Facebook’s recommendation algorithms, as if that somehow changes the analysis (it doesn’t).

Facebook, as originally conceived, may have functioned like an enormous virtual bulletin board, where content was published by authors. But Facebook has evolved over time with the addition of numerous features and products designed by Meta to engage users. The earliest of these – the search function and the “like” button – were user-controlled features. In more recent years, however, Meta has taken an active role in shaping the user-experience on the platform with more complex features and products that are not triggered by user requests. The most visible of these are curated recommendations, which are pushed to each user in a steady stream as the user navigates the website and in notifications sent to the user’s smartphone and email addresses when the user is off-platform. These proprietary Facebook products include News Feed (a newsfeed of stories and posts published on the platform, some of which are posted by your Facebook friends or members of groups you have joined, and others that are suggested for you by Facebook), People You May Know (introductions to persons with common connections or background), and Suggested for You, Groups You Should Join, and Discover (recommendations for Facebook groups to join).

These curated and bundled recommendations are developed through sophisticated algorithms. As distinguished from the earliest search functions that were used to navigate websites during the Internet’s infancy, Meta’s algorithms are not based exclusively on user requests or even user inputs. Meta’s algorithms combine the user’s profile (e.g., the information posted by the user on the platform) and the user’s dossier (the data collected and synthesized by Meta to which Meta assigns categorical designations), make assumptions about that user’s interests and preferences, make predictions about what else might appeal to the user, and then make very specific recommendations of posts and pages to view and groups to visit and join based on rankings that will optimize Meta’s key performance indicators.

This is a variation on the “algorithmic recommendations don’t deserve 230 protections” argument. There are efforts in Congress to make that explicit, but it’s nonsensical — in large part because recommendations are protected by the 1st Amendment. So even if you got past the Section 230 issue, you’re still killed by the 1st Amendment.

And the lawsuit here more or less admits that by saying that these are all opinions:

Meta’s algorithms are carefully protected intellectual property. While they are often characterized as automated and impersonal, they are, in actuality, dynamic and subject to frequent refinement. They also reflect the inferences, judgments, priorities, and decision-making of human programmers, managers, and executives at Meta.

Specifically, they’re arguing “negligent design,” an argument that has been tried repeatedly in Section 230 cases and consistently fails.

In 2017, Meta changed its mission to giving “people the power to build community and bring the world closer together.”

To accomplish this new mission, Meta redesigned its social media platform and its recommendation algorithms to promote and emphasize user engagement in hobby clubs, civil society organizations, and other community groups.

Meta built a superstructure to support groups, built algorithms to recruit members for those groups, and built algorithms that created an insular world view for members of those groups.

In an interview with CNN, CEO Zuckerberg recognized the real world implications of achieving its goal to expand group membership on its platform: “Once people are coming together in these smaller groups, that actually grows and it ends up with much bigger changes in the world.”

Meta knew or should have known that these changes in the world could very well be negative—even dangerous and harmful to the public—as demonstrated by its own internal research.

I mean, this claim alone actually does a good job of demonstrating why Section 230 is so important, in two different ways. First, the whole reason Facebook shifted its focus to community in the first place was in response to people freaking out about the 2016 election and misinformation. Facebook took those concerns and tried to respond by de-emphasizing news, in the hopes that this would lessen things like its impact on elections, and emphasizing community instead. And yet here it’s being blamed for that attempt (weak as it may have been) to improve its service.

Basically, this shows how any decision about how to run your platform will be second-guessed, and (if you’re big enough) someone will sue you over it. The very nature of Section 230 is that it allows for this kind of change and experimentation to figure out what works best, recognizing that running a platform requires constant adjusting and adapting.

The second reason this highlights the importance of Section 230 is that 230 is useful for getting these cases kicked out of court quickly, as they should be. Because even without Section 230, this case is still a complete loser.

The entire basis of the lawsuit is that because these men met on Facebook, and later decided to kill someone, Facebook is somehow to blame for the murder. But under that theory, if two people met at a bowling alley, a bar, or a restaurant, and later committed a crime, could you hold the meeting place liable? That’s ridiculous and clearly cannot fly. Section 230 is helpful in getting such obviously bogus lawsuits kicked out early. Without 230, Facebook still wins this case; it’s just a lot more expensive.

This is a nonsense lawsuit and the lawyers who filed it should be embarrassed. The underlying story is sad, and I feel terrible for the family, but the decision to sue here is a bad one that will not end well.


Filed Under: boogaloo bois, dave underwood, intermediary liability, section 230
Companies: facebook, meta

