Dispute System Design for Facebook

The New York Times published an interesting article worth reading, which riffs on Mark Zuckerberg’s statement that Facebook would develop an independent body to make decisions about acceptability of posts on its platform.  He mused that the body might be like a supreme court to make final decisions reflecting global social norms.

The article was written by St. John’s Law School Professor Kate Klonick and Thomas Kadri, a resident fellow at Yale Law School’s Information Society Project.

The authors discussed dispute system design criteria for the kind of institution that Mr. Zuckerberg suggested.

They argued that our existing courts have at least three theoretical virtues:  due process, representation, and independence – though they note that our court system sometimes does not fulfill these ideals.  They describe potential benefits and problems for a Facebook “court” to live up to these three virtues.

It will be an especially daunting task to develop a dispute system governing content on Facebook that generally will be perceived as legitimate and fair.  This would be a hard enough problem given “normal” political differences – and the level of polarization has increased dramatically in recent decades, especially in recent years.  Since Facebook truly is a global network with billions of users who can distribute mass messages around the world in an instant, it is particularly vulnerable to authoritarian regimes that maintain power through deception, propaganda, incitement of fear and hatred of out-groups, and escalation of partisan and nationalist identification.  Presumably, for the Facebook system to gain widespread support, authoritarian regimes would need to accept the system.  Given these regimes’ interest in maintaining control, it may be particularly difficult to get their agreement to a system that would significantly limit that control.

Mr. Zuckerberg wrote that Facebook is initiating a consultation process to develop the system.  It will be interesting to see the extent, if any, to which Facebook taps the expertise of the leaders of our online dispute resolution community.

5 thoughts on “Dispute System Design for Facebook”

  1. The New York Times article was an interesting read. Understandably, there will be many hoops that Mr. Zuckerberg will have to jump through to get this “Supreme Court” panel off the ground. I found this article interesting because it discusses each concern with a lens focused on our country’s courts of law. However, I would like to address a concern with this issue that is not mentioned in the article: the law of private association (LPA).

    In our very own Marquette Law Review, an article written by Julie Moegenburg discussed the Supreme Court’s “crystallization” of two facets of LPA: freedom of intimate expression and freedom of expressive association. Moegenburg continues that an individual’s interests are “paramount in American society . . . Freedom of association is, therefore, a valuable instrument used to give greater depth and scope to an individual’s [interests].” The freedom of expressive association is the right to associate with others “in pursuit of a wide variety of political, social, economic, educational, religious and cultural ends.” Could Facebook’s panel infringe on this right by censoring and deleting posts by individuals on their site?

    In my opinion, the Facebook panel is most likely going to be in charge of monitoring hate speech, false news, and explicit posts, which is something I hope all users would support. I think offering an arbitration option when a post is deleted would be an ideal way to supplement the panel. If someone has their post deleted and wants to dispute the decision, a quick online arbitration decision would help curb an infringement of rights.

    Overall, this was a very interesting read and I look forward to monitoring this story in the future.

  2. Well, this is an interesting approach, Facebook. On the one hand, it saddens me that our society has gotten to a place where an ‘enforcement panel’ is even necessary to protect individuals online. On the other, I feel Facebook has a social responsibility to protect its users. Facebook is easily the largest social media outlet, with 2 billion active users worldwide, and it seems like Facebook is held to a higher standard than other outlets or applications.

    While I commend Mr. Zuckerberg for this unique approach to cyber-safety, I definitely share some of the concerns in the New York Times article. I worry that it could have the opposite effect, and people will feel suppressed by the world’s biggest social media platform. As the Times article suggested, maybe instead of a ‘Community Standards’ policy, Facebook should adopt a more rigid constitution. In order to reflect Facebook’s diverse cultural values, the panel members need to be chosen wisely and must be representative of Facebook’s users.

    Though I don’t think there is anything Facebook could do to lose users, the regulations have to be appropriately implemented and updated in order to maintain its place on top of the social media world. I think Mr. Zuckerberg made a responsible decision in an increasingly unstable world. I just hope the process is used correctly. I will definitely pay attention to this story and look forward to seeing its effects.

  3. Well, this is an interesting approach, Facebook. On the one hand, it saddens me that any type of ‘enforcement panel’ is necessary to regulate and monitor posts on social media. On the other, I feel that Facebook has a social responsibility to protect its users. Since Facebook is the largest social media platform, with over 2 billion active users worldwide, it also seems like Facebook is held to a higher standard than other social media outlets.

    While I commend Mr. Zuckerberg for this unique approach, I share some of the concerns articulated in the Times article. I do worry that the process will have the opposite effect on its users who might feel suppressed by the media and feel like they aren’t welcome to share their opinions. The biggest issue I have with the ‘panel’ is figuring out who should be on it. The panel needs to be reflective of Facebook’s diversity and numerous cultural values, and represent a number of countries to ensure it works. It also needs to be independent of Facebook, which might be a bit tangled, as the panel looks like it will be paid for by Facebook.
    As the Times article suggested, perhaps instead of creating a ‘Community Standards’ policy, Facebook should adopt a more rigid constitution.

    Though I don’t think there is anything Facebook could do to lose many users, the process needs to be monitored regularly to maintain Facebook’s place on top of the social media world.
    I think Mr. Zuckerberg made a responsible decision in an increasingly unstable world. I just hope the process is used appropriately. I look forward to seeing its effects on the world’s biggest social media outlet.

  4. This is a very interesting take from Facebook as to how to handle disputes on its own network. Creating a “Supreme Court” to review posts that do not follow the platform’s standards seems like the right thing to do, but will it be executed well enough?

    Comparing the Facebook “Supreme Court” to the United States Supreme Court was also interesting. I like how the article walked through the variables, jobs, and roles of the court system. It will be worth a follow to see if there is a Facebook “Constitution,” so that their “Supreme Court” could follow its guidelines much like the US Supreme Court follows ours.

    Another interesting question to keep an eye on is whether other social media platforms, like Twitter, would soon follow suit. If Facebook implements this overarching body that makes decisions on the types of posts people see, will other platforms do the same? There always seems to be a “Keeping up with the Joneses” dynamic among these social media sites. From the last election until now, Facebook has been under a microscope for the types of posts that are on its site. Twitter could look at this and decide it does not want to be in the same spot Facebook is, and may want a dispute system like Facebook’s for determining which posts are not suited for its site.

    Another pressing question is how this “Supreme Court’s” decisions would hold up in a court of law. Would they be treated like arbitration awards, where the FAA is the main player? Or is there room for interpretation by the US court system?

  5. I was really interested to read the New York Times article, and especially the comparison between the new Facebook appeals process and the United States Supreme Court.

    In reading Mark Zuckerberg’s post “A Blueprint for Content Governance”, I didn’t personally connect what he was describing with the Supreme Court, but rather with arbitration. From the way the process was described, it seems as though a person makes a questionable post, someone complains to Facebook, Facebook removes the post, and then the poster appeals that decision. As a result of the appeal, a new third party looks at the post to decide if it fits within Facebook’s guidelines.

    Personally, whether you view the new Facebook appeal process as an arbitration or as a court review, I think the question will always be finding unbiased judges for these matters. Mark Zuckerberg himself mentioned in his post that judging content the second time was just as difficult as the first time, and since his process doesn’t mention who is deciding the appeals, I’m assuming someone within Facebook is making the decision both the first time and the second time. An outside decision maker, similar to an arbitrator or a judge, seems necessary, but where do you find a person who can make decisions about Facebook posts totally without bias? Especially when people are so divided on so many issues these days, it’s hard to imagine any decision maker could really rule on Facebook content in an unbiased manner.

    I also struggle with the idea that such a process is even necessary for Facebook content. Outside of clearly inappropriate posts, like those containing pornography or violent threats, the point of Facebook has always been allowing people to post what they think freely. I think adding this appeals process will only complicate the already complicated landscape of free speech on Facebook.

Comments are closed.