
Just Court ADR

The blog of Resolution Systems Institute

Facebook’s User Conflict Resolution System: An Illustrated Walkthrough

Just Court ADR, August 27th, 2014
A Facebook user who objects to this photo of a blooming cherry tree over a river can try conflict resolution online.

This post illustrates my recent discussion of the template-based online conflict resolution system Facebook has implemented for user disputes. The system asks users a series of questions and suggests possible resolutions based on their answers. In some cases, Facebook may offer a user a pre-written, template-generated message to use as a starting point for addressing a conflict.

Here are some examples of the conflict resolution templates currently available to users who object to other users’ photos on Facebook. (Click any image to enlarge.) To get these screencaps, I used RSI’s Facebook account (“RSI”) to visit my personal Facebook (“Mary”). In this simulation, RSI is upset by Mary’s photo of a cherry tree in bloom and decides to report it. While looking at the photo, RSI clicks “Options” and then “Report Photo,” and a discussion box pops up. Facebook auto-fills the name “Mary” into its messages as it talks to RSI about the problematic photo. The goal of the messages is to help RSI articulate its feelings about Mary’s post, decide what action to take, and craft a message that Mary may respond to positively.

Example 1: Mild Annoyance; No Company Intervention

The process begins by asking why the user doesn’t want to see the photo. The answer sends the user down one of three branches of the system.

Fig. 1 The first message RSI gets upon reporting Mary’s photo. RSI chooses “annoying.”

The results are quite different depending on the user’s problem. “It’s annoying” is mainly for mild complaints; “I’m in the photo and I don’t like it” leads to a more personal series of questions; and “I think it shouldn’t be on Facebook” leads to a list of issues that may violate Facebook’s policies.
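For readers curious about the mechanics, the reporting flow behaves like a simple decision tree: each answer routes the user to its own follow-up question and set of suggested actions. Below is a minimal sketch of that idea in Python, purely illustrative and not Facebook’s actual code; the branch labels come from the screenshots, while the follow-up wording and the actions shown for the “I’m in the photo” branch are my assumptions.

```python
# Purely illustrative sketch of a branching report flow; not Facebook's code.
# Branch labels come from the screenshots; follow-up wording is assumed.

REPORT_FLOW = {
    "question": "Why don't you want to see this photo?",
    "branches": {
        "It's annoying": {
            "follow_up": "What's annoying about this photo?",   # assumed wording
            "actions": ["Hide all posts from Mary", "Message Mary"],
        },
        "I'm in the photo and I don't like it": {
            "follow_up": "What don't you like about it?",       # assumed wording
            "actions": ["Message Mary", "Block Mary"],          # assumed actions
        },
        "I think it shouldn't be on Facebook": {
            "follow_up": "What's wrong with this photo?",       # assumed wording
            "actions": ["Message Mary", "Block Mary", "Submit to Facebook for review"],
        },
    },
}


def next_step(flow, answer):
    """Return the follow-up question and suggested actions for a chosen answer."""
    branch = flow["branches"][answer]
    return branch["follow_up"], branch["actions"]


question, actions = next_step(REPORT_FLOW, "It's annoying")
print(question)  # -> What's annoying about this photo?
print(actions)   # -> ['Hide all posts from Mary', 'Message Mary']
```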

Fig. 2 The system explores RSI’s annoyance. RSI finds the photo silly.

In this example, RSI’s annoyance is treated as a mild complaint, and Facebook will not intervene directly. Instead, RSI is offered two specific choices: “Hide all posts from Mary” or “Message Mary.” RSI chooses to send a message.

Fig. 3 For mild annoyance, RSI gets a blank message with a framing prompt from Facebook.

 

Example 2: Conflict Resolution Guided by Template

Here is an example of a message that is filled out by template. This example also begins with the “Why don’t you want to see this photo?” prompt in Fig. 1, but this time RSI selects “I think it shouldn’t be on Facebook.” The options under this choice cover more serious issues than in the first example, and Facebook provides RSI with examples of each to clarify what it means. The complaints cover a broad range: the offending photo could be pornographic, annoying, insulting, or show the user in an unpleasant way.

Fig. 4 RSI is prompted with more serious issues.

When RSI selects “This photo is of me or my family,” it gets different options than it did for “this photo is annoying or not interesting.” Once again, Facebook does not offer to intervene directly with Mary. However, RSI’s choices are more serious: “Message Mary” or “Block Mary,” the latter a stronger step than “Hide Mary” since it cuts off all communication.

In this case, the message box provides a template for communication.

Fig. 5 Facebook offers RSI a pre-written message to address user-to-user conflict.

Facebook pre-fills the message box with “Hey Mary, this photo is personal and I would prefer to keep it private. Would you please take it down?”
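Conceptually, this step is a fill-in-the-blank: the system drops the poster’s name into a canned message that the reporter can still edit before sending. Here is a minimal sketch of that idea; the template text mirrors Fig. 5, but the function and field names are assumptions of mine, not Facebook’s actual code.

```python
# Illustrative sketch of template pre-fill; the wording mirrors Fig. 5,
# while the function and field names are assumptions.

PERSONAL_PHOTO_TEMPLATE = (
    "Hey {poster}, this photo is personal and I would prefer to keep it "
    "private. Would you please take it down?"
)


def prefill_message(template, poster):
    """Slot the poster's name into a canned message the reporter can still edit."""
    return template.format(poster=poster)


print(prefill_message(PERSONAL_PHOTO_TEMPLATE, "Mary"))
# Hey Mary, this photo is personal and I would prefer to keep it private.
# Would you please take it down?
```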

Example 3: Helping Users Address Very Serious Issues

Many of the message templates use the same text as Fig. 5, but some are highly tailored to specific situations, including potentially life-threatening problems. If RSI clicks “Something else” in Fig. 1, an expanded list of issues appears, each with further examples. One striking option is “This displays someone harming themselves or planning to harm themselves,” with examples such as a person holding a gun to their head or promoting an eating disorder.

Fig. 6 RSI chooses “showing harm” from a list of very serious issues.

If RSI selects the self-harm issue, Facebook presents several choices that acknowledge the gravity and sensitivity of the problem. First, Facebook offers advice on what to do if a friend is in danger. Second, the wording of every option RSI can take is softened, with a tone that assumes RSI wants to help a friend, not start a conflict. Rather than saying “Message Mary,” Facebook suggests that RSI “offer help or support.” Facebook also suggests that RSI may want to “reach out to a friend” other than Mary to talk through the problem and decide what to do. Lastly, since Facebook does have rules about this type of image, RSI is invited to “submit to Facebook for review,” which could lead Facebook to take action of its own.

Fig. 7 Facebook’s options to respond to an image showing self-harm.

The messages themselves are also tailored to RSI’s choices. The “Offer Support” template suggests a message of concern that would go directly to Mary. The message begins, “Hey Mary, this post makes me feel worried about you. Are you OK?” and concludes with a helpline number.

Fig. 8 A message showing concern to a friend, with a helpline.

Facebook also guides RSI’s actions by helping RSI discuss the issue with a third friend rather than go to Mary directly. The suggested message reads, “Hey, this post makes me feel worried about Mary. Do you have any idea why Mary would have written this? Do you think there’s something we can do to help?”

Fig. 9 A message guiding RSI to approach a third party for help with a friend.
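One way to picture this tailoring is as a lookup keyed by both the reported issue and the chosen recipient. The sketch below is, again, purely illustrative: the message texts mirror Figs. 8 and 9, while the structure, names, and helpline placeholder are my assumptions.

```python
# Illustrative sketch of tailored templates keyed by (issue, recipient);
# message texts mirror Figs. 8 and 9, everything else is assumed.

TEMPLATES = {
    ("self_harm", "poster"): (
        "Hey {poster}, this post makes me feel worried about you. Are you OK? "
        "If you want to talk to someone, here is a helpline: {helpline}"   # helpline wording assumed
    ),
    ("self_harm", "third_party"): (
        "Hey, this post makes me feel worried about {poster}. Do you have any "
        "idea why {poster} would have written this? Do you think there's "
        "something we can do to help?"
    ),
}


def draft_message(issue, recipient, poster, helpline=""):
    """Pick the template for this issue/recipient pair and fill in the details."""
    return TEMPLATES[(issue, recipient)].format(poster=poster, helpline=helpline)


print(draft_message("self_harm", "poster", "Mary", helpline="[helpline number]"))
print(draft_message("self_harm", "third_party", "Mary"))
```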

Though it looks simple at first glance, Facebook’s online conflict resolution system reveals great complexity as one follows each branch of the tree of possible decisions. The tool doesn’t just help users articulate their feelings; it also guides them through the process of choosing an action to take. This is necessary for online dispute resolution, but could it have a place in court ADR as well? In British Columbia, an online system is currently being tested for small claims cases. Would it benefit self-represented parties to have a tool to help them articulate their feelings, or is the court system simply too complex for this? Let us know what you think in the comments.

 

 


One Response to “Facebook’s User Conflict Resolution System: An Illustrated Walkthrough”

  1. Kent Lawrence says:

    Why not? The Administrative Hearings Department of Cook County, IL – I believe – has a system of “Ordinance” violations [tickets], electronically sent to the “offender,” who can pay OR go to a hearing before an ALJ (Administrative Law Judge) OR waive the hearing, send in “proof” documents (pictures, etc.) and write reasons why they are not guilty – again electronically – and upon a similar response by the “prosecutor,” the ALJ would rule “on the papers.” The ALJ then “files” the ruling – again electronically – and the “offender” is done, if found not guilty, or may pay the fine imposed via credit card. Especially for relatively “simple” disputes and matters not involving a lot of resources [damages, fines, etc.], it can be a very cost-effective way to resolve disputes with reasonable due process, fairness, and a “hearing.”

