Just Court ADR

The blog of Resolution Systems Institute

After Successful Pilot, RSI Seeks Mediator Partners for Next Phase of Trust Project

Jennifer Shack, July 17th, 2024

Last year, RSI began the pilot phase of a research project to examine how mediator behaviors might affect parties’ trust during mediation. During this exploratory phase of the Trust Project, our research team observed small claims and eviction mediations and recorded mediators’ communication behaviors, a process referred to as coding. We also gathered pre- and post-mediation surveys from the parties and interviewed the mediators involved.

From left, Rackham Foundation’s Ava Abramowitz, RSI Director of Research Jennifer Shack and Behavior Analysis Trainer Kenneth Webb gave a presentation on the early findings of RSI’s Trust Project at the American Bar Association Section of Dispute Resolution 2024 Spring Conference in April 2024.

After coding 22 mediations and thoroughly reviewing our piloted data collection instruments, we have successfully completed the pilot phase. We are excited to share that we will soon be expanding the project, and we are looking for mediation organizations and individual mediators who would like to partner with us.

Method Adapted for Mediation

The Trust Project is based on behavior analysis (BA), a research method that codes for particular communication behaviors and connects them to desired outcomes. This method has been used successfully in negotiations and sales. BA examines the particular behaviors used as well as the sequences of behaviors that occur, to determine their effects on specific desired outcomes. In this instance, RSI is interested in changes in trust between the parties and changes in trust in the mediator. We are also interested in mediation results and participant perceptions of the mediation and the other party.
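
For readers who think in code, here is a minimal sketch of the kind of data that behavior analysis coding produces: each speaker turn is tagged with a behavior code, and both individual behaviors and two-step sequences can be tallied. The behavior labels and structures below are hypothetical illustrations, not RSI’s actual coding scheme.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical behavior codes for illustration only; RSI's actual coding
# scheme (adapted for mediation by Abramowitz and Webb) is not reproduced here.
@dataclass
class Turn:
    speaker: str   # e.g., "mediator", "party_A", "party_B"
    behavior: str  # e.g., "open_question", "summarize", "counterproposal"

def behavior_counts(turns: list[Turn], speaker: str) -> Counter:
    """Tally how often a given speaker used each coded behavior."""
    return Counter(t.behavior for t in turns if t.speaker == speaker)

def behavior_pairs(turns: list[Turn]) -> Counter:
    """Tally two-step sequences (which behavior tends to follow which),
    the kind of pattern behavior analysis relates to outcomes such as trust."""
    return Counter((a.behavior, b.behavior) for a, b in zip(turns, turns[1:]))

# A toy transcript of coded turns
session = [
    Turn("mediator", "open_question"),
    Turn("party_A", "disclose_interest"),
    Turn("mediator", "summarize"),
    Turn("party_B", "counterproposal"),
]
print(behavior_counts(session, "mediator"))  # how often the mediator used each behavior
print(behavior_pairs(session))               # which behaviors followed which
```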

Over the course of five years, Ava Abramowitz and Ken Webb worked to modify communication behaviors used in the contexts of negotiations and sales for use in mediation — with a lot of input from mediators and researchers. Ava is a former assistant U.S. attorney, longtime mediator and secretary of the Rackham Foundation. Ken is an expert in behavior analysis, coding and training negotiators to improve their practice. He trained RSI’s researchers in behavior analysis. Thanks to generous support from the Rackham Foundation, RSI has the opportunity to conduct this innovative research into the effects of mediator behaviors on party trust.

Watch Michael Lang’s 2021 In Their Voices interview with Ava Abramowitz and Ken Webb for more insight into the idea of applying behavior analysis to mediation, the concept behind the Trust Project!

Mediator Partners Sought

For the next phase of the Trust Project, RSI will observe mediations of small claims, family and larger civil cases, both in person and online. We are looking for partners in this endeavor. Interested organizations and mediators would work with RSI to determine how to effectively recruit parties. Mediators will be asked to complete an initial survey about their background and approach to mediation, to facilitate observations of their mediations, and to complete a survey after each observed mediation. We will preserve confidentiality of the mediations, the mediators and the parties by removing any identifying information from the data.

If you are interested in participating in this impactful research, please contact RSI Director of Research Jennifer Shack at jshack@aboutrsi.org.

Courts Can Take Steps to Design Text-Based ODR Programs that Better Serve Parties

Jennifer Shack, June 22nd, 2023

While conducting two of the first independent evaluations of text-based online dispute resolution (ODR) programs in U.S. state courts, Donna Shestowsky and I found those programs promoted access to justice in some ways, but inhibited it in others. To help other courts, we wrote an article about how they might reduce potential barriers when developing and implementing their text-based ODR programs. The following is a summary of our advice from the article, “Access to Justice: Lessons for Designing Text-based Court-Connected ODR Programs,” which was recently published in Dispute Resolution Magazine, a publication of the American Bar Association.

Court adoption of text-based ODR allows parties to communicate asynchronously, at their convenience, from anywhere. This suggests that ODR has the potential to increase access to justice, particularly for self-represented litigants,[i] and could lead to increased efficiency and reduced costs for parties and courts alike.[ii] However, for parties who lack digital literacy or access to technology, mandated ODR could instead benefit already advantaged parties and leave others behind. Furthermore, in some instances, mandating ODR could reduce access to justice by overriding consent and party self-determination.[iii]

The Texas and Michigan Programs

The programs we evaluated differed in the issues involved and the platforms used. In Collin County, Texas, we assessed a debt and small claims pilot program in a busy Justice of the Peace Court (JP3-1) that used the Modria platform. In Ottawa County, Michigan, we examined a program for post-judgment family matters brought to the Friend of the Court (FOC), an agency under the aegis of the Chief Judge of the 20th Circuit Court. The FOC used the Matterhorn platform. Both programs, however, were intended to be mandatory once a case was referred to them. And both required that the parties register and communicate via text on the ODR platforms.

Although the programs we evaluated used different ODR platform vendors, the platforms worked similarly and had comparable limitations. The platforms provided a chat space and permitted third-party facilitation or mediation. Neither was accessible to those with significant visual impairments or limited English proficiency. Both allowed only one individual per side to participate. This limitation meant that in Texas, if a party had a lawyer, the lawyer participated alone. In Michigan, only parties could participate, and those who had lawyers were not referred to ODR.

Possible Reasons for Not Using ODR

Although ODR was ostensibly mandatory in both programs, the majority of parties in each court did not use ODR. In Texas, both parties to a case used the platform in only 81 of 341 cases (24%) referred to ODR. In Michigan, ODR use was twice as high: of the 102 matters in which caseworkers determined ODR was appropriate, 48% used ODR. In 26 of the 53 Michigan matters in which the parties opted not to use ODR, at least one party never registered on the platform.
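
For readers following the numbers, the percentages above are simple arithmetic on the reported counts. Here is a quick sketch; the counts are those reported in our evaluations, and the Michigan breakdown is approximate because it is derived from the 48% rate.

```python
# Quick check of the usage rates reported above, using the counts as given.
texas_used, texas_referred = 81, 341
print(f"Texas: {texas_used / texas_referred:.0%} of referred cases had both parties use ODR")  # ~24%

michigan_appropriate = 102
michigan_use_rate = 0.48
michigan_used = round(michigan_appropriate * michigan_use_rate)   # ~49 matters
michigan_not_used = michigan_appropriate - michigan_used          # ~53 matters
print(f"Michigan: roughly {michigan_used} matters used ODR and {michigan_not_used} did not")
```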

Survey and interview data suggest a few reasons parties did not use ODR. In both programs, staff indicated they did not send parties who lacked digital literacy to ODR, and litigant survey responses suggested that many parties were unaware of the ODR program or did not understand its main features. In the Texas program, of those who did not use ODR, only one survey respondent (out of ten) indicated having received information about the program. When asked what would make them more likely to use ODR for a similar case in the future, half said more information.

In survey responses for the Michigan program, parties appeared to lack a basic understanding of how ODR worked. Half of the 50 parties surveyed near the start of their matter did not know ODR was offered free of charge.

According to Texas court staff, litigants received information about the ODR program via the notice the court sent to them (or their lawyers) about their court date, and through an email or text from the platform when the court uploaded their case to it — if the court had their email address or cellphone number. Both the notice and the email lacked information about how ODR worked. Similarly, the Michigan program’s automated email and text, platform, and FOC website missed opportunities to educate the parties.

Implications for Courts

Despite their accessibility issues, both the Texas and Michigan programs had similar access to justice benefits. Our evaluations suggest that for those parties who use ODR, the process is convenient. We found that 72% of ODR use in Texas and 52% in Michigan occurred outside of court and office hours, i.e., at times not available to parties in traditional dispute resolution processes. However, in both programs, many parties simply did not register to use ODR. In addition, 50% of ODR users who responded to our survey noted that they liked that ODR was easy to use. These findings indicate that ODR can increase convenience for those who do use it.
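
As an illustration of how a convenience measure like this might be computed, the sketch below classifies platform activity timestamps as falling inside or outside business hours. The 8 a.m. to 5 p.m. weekday window and the sample timestamps are assumptions made for the example, not the courts’ actual schedules or data.

```python
from datetime import datetime

# Classify platform activity as inside or outside business hours.
# The 8 a.m.-5 p.m. weekday window is an assumption for illustration.
def outside_business_hours(ts: datetime) -> bool:
    return ts.weekday() >= 5 or not (8 <= ts.hour < 17)  # weekend, or outside 8-17

activity = [
    datetime(2021, 3, 1, 21, 15),  # Monday evening   -> outside
    datetime(2021, 3, 2, 10, 30),  # Tuesday morning  -> inside
    datetime(2021, 3, 6, 14, 0),   # Saturday         -> outside
]
share = sum(outside_business_hours(t) for t in activity) / len(activity)
print(f"{share:.0%} of activity occurred outside business hours")  # 67%
```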

Nonetheless, our finding that some parties lacked information or had nontrivial misconceptions about ODR also suggests parties did not always make informed decisions about whether to participate. To enhance access to justice and self-determination, courts should incorporate a communications plan. The plan should:

  • Specify how parties can learn about the program and detail what information court personnel should relay about ODR
  • Indicate what information about ODR to include on the court’s websites and the ODR platform to educate parties about how to use ODR and its potential risks and benefits
  • Outline outreach efforts to urge social services or other agencies to inform their clients about the ODR program

Additionally, courts should present information about ODR in a way that is comprehensible to individuals with low literacy. They should also explain the privacy and confidentiality implications of using ODR, especially regarding whether and how communications shared on the platform might be used in subsequent legal proceedings.

Further, ODR offerings should be accessible to all eligible parties. Courts should urge ODR providers to facilitate use by parties with visual impairments and limited English proficiency. Additionally, courts should direct parties who do not have reliable internet access to computers in the courthouse or other community locations — though as a result of limited business hours and privacy concerns, this solution is far from ideal.

Courts should also ensure that text-based platforms are user-friendly for smartphone users. In the Michigan program, 71% of participants exclusively used a smartphone for ODR. (We did not have information on the devices Texas ODR participants used.) Yet our findings indicate that text-based ODR may be difficult for smartphone users. Courts should urge ODR providers to include in-app voice control to facilitate ODR use on smartphones generally, a change that might be especially important for individuals with disabilities that restrict their ability to type. Parties should also be able to participate in ODR with their attorneys.

Finally, courts should explore ways to maximize access to their platforms for those who lack digital literacy. Usability testing, similar to that conducted for Utah’s ODR pilot program, can help identify challenges and potential solutions for given platforms. Courts might also consider providing parties with links to web-based resources or trainings that could increase their comfort with technology.

Given ODR’s current technological limitations and the percentage of the population that continues to lack reliable internet access or digital literacy, ODR is not a panacea for the continued access to justice problem in the U.S. Additionally, our evaluations suggest that parties have different preferences for how to resolve their disputes. To enhance access to justice, and to advance party self-determination, ODR might best serve parties as part of a constellation of alternative dispute resolution (ADR) options rather than being the only form of court-connected ADR.


[i] Amy J. Schmitz, Measuring “Access to Justice” in the Rush to Digitize, 88 Fordham L. Rev. 2381 (2020).

[ii] Amy J. Schmitz, Measuring “Access to Justice” in the Rush to Digitize, 88 Fordham L. Rev. 2381 (2020).

[iii] Amy J. Schmitz & Leah Wing, Beneficial and Ethical ODR for Family Issues, 59 Fam. Ct. Rev. 250 (2021).

Ten Tips for Evaluating Your ODR Program

Jennifer Shack, May 18th, 2023

After Donna Shestowsky and I completed two of the nation’s first neutral evaluations of state court online dispute resolution (ODR) programs, we had some thoughts to share with courts about how to ensure that evaluations of their ODR programs are useful and of high quality. We put those thoughts into an article, “Ten Tips for Getting the Most Out of an Evaluation of Your ODR Program,” which was recently published in Court Review, the journal of the American Judges Association. The following is a summary of what we wrote. For fuller guidance, see the complete article.

Tip 1: Negotiate data access when contracting with an ODR provider

RSI Director of Research Jennifer Shack, second from left, and University of California Davis Professor Donna Shestowsky, second from right, are the authors of “Ten Tips for Getting the Most Out of an Evaluation of Your ODR Program.” They joined Nick White, Research & Evaluation Director of the Maryland Judiciary’s Mediation and Conflict Resolution Office, and Dr. Deborah Goldfarb, Assistant Professor at Florida International University, on a panel at the American Bar Association Section of Dispute Resolution Spring Conference.

The best time to ensure you will have the data needed for a future evaluation, and to monitor program activity in general, is when you negotiate your contract with an ODR provider. As you screen providers, learn how each ensures data security and confidentiality. This is also the ideal time to negotiate data access for your evaluation. To keep your evaluation options open, you will want to secure terms that obligate the provider to share data not only with you, but with external evaluators you might hire later. And whether you hire a provider or develop an ODR platform in-house, be mindful of any data sharing or confidentiality rules or policies in your jurisdiction for the types of cases you plan to include in your evaluation.

Tip 2: Determine when to evaluate

Ideally, you would plan the evaluation of your ODR program as you design your program. But evaluation planning can happen at any time, as can the evaluation itself.

If you plan your evaluation before launching your program, you can increase the probability that the evaluation will accurately reflect your program’s use and effectiveness if you begin your evaluation after 1) the provider has addressed technology glitches that may emerge during early testing of your platform and 2) you have conducted outreach to ensure parties know about your program. If your program is already up and running, you should avoid scheduling an evaluation for a time frame when major changes are planned for your court or the ODR program itself. Significant changes while data are being collected may introduce noise into the data.

Tip 3: Find a neutral evaluator

Selecting a neutral evaluator is important for enhancing the quality, usefulness and credibility of your evaluation. In choosing your evaluator, consider whether they have experience evaluating court alternative dispute resolution (ADR) and/or ODR programs. If you hire someone who is not knowledgeable about ADR, be prepared to spend a lot of time explaining how ADR works, the theory behind it, and the specific issues involved.

Tip 4: Ensure that key personnel are involved in the evaluation planning process

Include court personnel who have knowledge that can assist with evaluation planning. These individuals are, at minimum, judges hearing the cases served by the ODR program and court staff who understand the processes involved and the underlying technology. They can help you determine the questions to answer, identify what data are needed, or work out how to access relevant existing data.

You will also want to decide who should serve as the point of contact for your evaluators. This person will answer the evaluator’s questions and help to obtain data. Staff members should be clear on their role in the evaluation.

Tip 5: Prepare to use data from a variety of sources

To best understand your ODR program, you should obtain information from multiple sources, such as your case management system, the ODR platform and participant surveys. To collect systematic feedback from parties (or other stakeholders, such as lawyers), your evaluator will need your help to facilitate distribution of surveys. They will work with you to determine which parties to survey and the best method for contacting the parties. They should also ask you to review the survey questions.
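
As a rough illustration of what combining multiple sources can look like in practice, the sketch below joins hypothetical case management, ODR platform and survey extracts on a shared case identifier. The file names and columns are invented for the example; an actual evaluation would depend on what your court and your provider can export.

```python
import pandas as pd

# Hypothetical extracts; real column names depend on the court's case
# management system, the ODR provider's logs and the survey tool used.
cases = pd.read_csv("case_management_export.csv")  # e.g., case_id, case_type, outcome
platform = pd.read_csv("odr_platform_log.csv")     # e.g., case_id, registered, messages_sent
surveys = pd.read_csv("party_surveys.csv")         # e.g., case_id, satisfaction, heard_of_odr

merged = (
    cases
    .merge(platform, on="case_id", how="left")  # keep cases even if no one registered
    .merge(surveys, on="case_id", how="left")   # and cases with no survey response
)

# Cases absent from the platform log never registered.
merged["registered"] = merged["registered"].fillna(0)
print(merged.groupby("case_type")["registered"].mean())  # registration rate by case type
```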

Tip 6: Expect to spend time with the evaluator

To conduct an effective evaluation, your evaluator will need to thoroughly understand your ODR program, how it fits with your overall process for handling cases, and how the platform interfaces with your case management system. Your evaluator will want to spend time with you to discuss your program processes and get answers to any questions throughout the evaluation process.

Tip 7: Facilitate the participation of court personnel and other program partners in the evaluation

Give court personnel and other program partners (e.g., mediators) who were not a part of the evaluation planning process a heads up about the evaluation and ask for their cooperation. Introduce your evaluator to relevant personnel and partners. These efforts should pave the way for your evaluator to reach out to them to get their perspectives on the ODR program and its impact on their work. When asking court personnel and program partners for their cooperation, reassure them that the evaluation’s objective is to improve the program, not to find fault with it or with them.

Tip 8: Help your evaluator to pilot test their survey materials

Before the evaluation period, your evaluator should obtain feedback on their surveys from individuals similar to those who will be surveyed for the evaluation—typically, parties to similar cases. Your evaluator will need your help to gain access to those individuals.

In addition, every court has unique terminology that should be reflected in how questions are worded. Your evaluator should work with your staff to ensure that the correct terminology is used so that it is more likely to be understood by those who will be asked to complete the survey.

Tip 9: Be flexible about the length of time set aside for data collection

The time allocated for data collection needs to be long enough to get the data necessary for analysis, but short enough to provide timely results. Your evaluator will work with you during the evaluation planning phase to determine the time frame. But be prepared to be flexible. Your evaluator may recommend extending the data collection period if the level of program use and/or survey participation is lower than expected and they need more time to collect data to deliver a useful evaluation.

Tip 10: Survey those who use ODR as well as those who do not

Our evaluations have shown that the motto “If you build it, they will come” does not always apply to ODR. Surveying eligible parties who did not use ODR could help you identify issues that might be driving lower-than-expected usage. Surveys can point to marketing or party education problems, or in the case of voluntary ODR programs, uncover program attributes that parties find unattractive. You can ask parties whether they knew about your program, how and when they learned about it, and whether they knew they were eligible.

In the end, when a court invests resources to establish an ODR program, a major goal is to have it be used. It is imperative to commit resources to effectively market the program, which should include efforts to educate parties and ensure they know they are eligible or required to use it.

Conclusion

Courts that have their ODR programs objectively evaluated should be applauded for their efforts. Evaluations can facilitate program design that is data-driven and evidence-based, rather than guided by anecdotes or hunches. This grounding in data is especially important when making decisions geared toward satisfying the interests of litigants, since understanding their unique perspectives requires collecting data directly from them. Ideally, ODR evaluations will be conducted by neutral third parties who have no stake in the results and meet high research standards. Neutral evaluations are uniquely situated to offer an outside perspective on what works well about a program and to suggest how it might be improved. In addition, constituents, including lawyers, are more likely to accept the findings of a neutral, outside evaluation that concludes that a program delivers beneficial outcomes.

Could RSI’s Latest Research Project ‘OPEN’ Door to ODR for Parties with Low Literacy?

Jennifer Shack, April 13th, 2023

Text-based online dispute resolution (ODR) programs are often touted as a way to increase access to justice. They are seen as more convenient, less costly to parties, and less intimidating, and thus as having the potential to reduce the default rate, particularly for debt cases. Yet early evaluations of ODR programs have found that they suffer from low participation. An information gap, worsened by the prevalence of low literacy, contributes to this low participation.

Through a generous grant from the AAA-ICDR Foundation, RSI’s ODR Party Engagement (OPEN) Project hopes to address this problem by gaining insights from impacted populations and using those insights to develop guidance on communication materials for small claims courts that use ODR.

The Information Gap

RSI’s ODR evaluations found that parties were often unaware of their court’s ODR program or did not understand what ODR was and how it worked. We identified deficiencies in the language the courts used to inform and educate parties, and in how the information was provided. In Utah, a usability study found that parties did not always understand the information provided and wanted more information than was offered.

These evaluations point to a need for better information to apprise parties that an ODR program exists and educate them about the program. Then they could knowledgeably decide whether the program might benefit them, understand what the risks may be, and learn how to use the ODR platform.

Need for Digital Hand-holding

Informing parties properly has become more important with the increase in self-represented litigants. According to the Program for the International Assessment of Adult Competencies, 48% of U.S. adults struggle to perform tasks with text-based information, such as reading directions, and 19% are capable of performing only short tasks.

Some courts have changed their approach to helping parties, with varying success. But even those that recognize the need to serve their constituents better may not realize they have a communication problem. Recently, the Colorado Supreme Court conducted a listening tour throughout the state to find out how it might better serve the state courts’ constituents. The main takeaway was that people with low literacy could not understand the courts’ communications to them.

Some courts have instituted alternative dispute resolution (ADR) programs, such as RSI’s virtual eviction mediation programs, that involve access to a program administrator to help parties navigate the program. Small claims ODR programs are different. These programs require parties to use ODR before their first hearing, and they often do not have a designated staff person to help those who have the wherewithal to reach out to the court on their own. Without a person to “hold their hand” through the process, parties need digital hand-holding.

RSI’s Project Goals

To engage and educate parties, courts should offer ODR participants materials that are easy to understand and to access via multiple methods (e.g., mailed notices, videos, text guides). A recent readability study of court forms found that simplifying the text used in the forms increased participants’ understanding of the purpose of a subpoena from 29% to 70%.

Courts generally do not have the knowledge or capacity to develop materials that can be readily understood by people with low literacy. For the OPEN Project, RSI will conduct a series of focus groups and apply their findings, along with best practices developed from prior research, to develop guidance on communication materials for small claims courts using ODR. Case types such as debt, landlord-tenant, eviction and consumer-merchant disputes are likely to benefit.

OPEN aims to make access to justice more equitable for self-represented, diverse populations who are either required or offered the opportunity to use text-based court ODR for debt and small claims cases.

Watch this space for updates on our findings.
