Just Court ADR

The blog of Resolution Systems Institute

Archive for the ‘Program Evaluation’ Category

Jennifer Shack Talks about Inspirations, Dream Projects and the Future of ADR

Just Court ADR, July 19th, 2023

RSI Director of Research Jennifer Shack often uses this space to tell us about a new research project or share findings from her latest ADR program evaluation. Today, we asked her to take a step back and answer a few questions about what drives her work, as well as share her thoughts on a few “big questions” in our field.

What drew you to studying alternative dispute resolution (ADR) as a career?

When I was a Peace Corps volunteer in Benin, West Africa, I observed how the village chiefs resolved conflicts through what I was to discover was mediation. I thought it would be great to have something similar here in the States – a way to resolve conflicts without court intervention and in a way that both parties felt was fair. I was surprised to learn, when I returned home, that mediation existed here, and excited when I saw an ad for a job opening that started with the words “Interested in mediation?” I applied, and 24 years later I’m still enjoying my work at RSI.

What is your favorite part of your work?

So much! I really enjoy designing evaluations and research projects. I love interviewing program participants and conducting focus groups because I get to learn on a much deeper level how mediation programs affect the participants – and because I get to meet so many interesting people. I also have a lot of fun digging into data to find out what story they tell about a program or an issue and then writing that story.

Do you have a long-term wish list in terms of aspects of court-based ADR that you’d like to study?

I have a lot of items on my wish list, but I’ll just talk about my top three. As you know, Donna Shestowsky and I evaluated two text-based ODR programs. I have also evaluated programs that involved in-person and video mediation. I would love to delve further into how these three different processes affect participant experience, particularly what parties communicate, how they communicate with each other and the mediator, and whether agreement terms differ. The more we know about how parties experience these processes, the better we can become at determining which method best fits different case types and situations, and the more we can improve the participant experience.

I would also love to do longitudinal research on child protection mediation. Having conducted a couple of evaluations on child protection mediation programs and interviewed parents after they participated in mediation, I think this is one of the best uses of mediation. But I’d like to know more about its long-term impact on families.

The third item on my wish list is already starting to come true. For decades, I and so many others have wanted to look inside the black box of mediation and find out what works and what doesn’t. We’re starting to do this with the Mediator Trust Project, but that’s only the first step. There are many aspects that can be examined. For example, in family mediation we can examine mediation’s effect on co-parenting and family dynamics. Another possibility is researching whether there are certain things mediators do that increase the probability of impasse.

RSI’s research team has recently expanded to include two additional full-time employees. How has this affected your day-to-day work or RSI’s project work?

RSI’s Research and Evaluation team recently expanded to include Rachel Feinstein and Jasmine Henry.

Having Rachel and Jasmine join us has been wonderful. It’s really helpful to be able to talk through ideas and issues with other research-minded colleagues. I am also happy to have Jas do research on an idea that I otherwise wouldn’t have time to explore. But most of all, having Rachel take leadership on our OPEN Project has allowed me to focus on our Mediator Trust Project while Jasmine continues to monitor and report on the participant surveys from the eviction mediation program RSI administers.

What trends do you see in court-based ADR that you think are likely to persist?

I think remote dispute resolution is here to stay, whether it’s video mediation or text-based ODR. Video mediation will continue to be prevalent, and I’m seeing signs that text-based ODR is going to become much more common in the near future. Artificial intelligence (AI) will make inroads in dispute resolution, particularly in helping parties to negotiate and write agreements. AI may one day mediate between parties as well.

Outside of technology, I believe courts will continue to implement ADR to address crises, as we have seen with foreclosure and eviction. My optimistic side leads me to think that more courts will treat such cases holistically, attempting to resolve not just the dispute but the problems that led to the dispute in the first place – for example, providing housing and financial counseling to parties at risk of homelessness.

What is your least favorite part of your work?

Probably not having the time or money to pursue all the projects I’d like to do.

What do you see as keys to making court-based ADR more accessible?

The main thing is to break down barriers to participation. This means making the ADR process easier to navigate and use. It also means communicating with parties using multiple methods and keeping in mind best practices for individuals with low literacy. Courts need to ensure that parties know their ADR options exist. Donna Shestowsky’s research on civil court ADR and our evaluations of court ODR programs have shown that too many parties don’t know that ADR programs exist. Courts should also educate parties about the benefits and risks of the options available to them, so they can make informed decisions.

Ten Tips for Evaluating Your ODR Program

Jennifer Shack, May 18th, 2023

After Donna Shestowsky and I completed two of the nation’s first neutral evaluations of state court online dispute resolution (ODR) programs, we had some thoughts to share with courts about how they could best ensure that evaluations of their ODR programs are useful and of high quality. So we wrote an article, “Ten Tips for Getting the Most Out of an Evaluation of Your ODR Program,” which was recently published in Court Review, the journal of the American Judges Association. The following is a summary of what we wrote. For fuller guidance, see the complete article.

RSI Director of Research Jennifer Shack and University of California Davis Professor Donna Shestowsky are the authors of “Ten Tips for Getting the Most Out of an Evaluation of Your ODR Program.” They joined Nick White, Research & Evaluation Director of the Maryland Judiciary’s Mediation and Conflict Resolution Office, and Dr. Deborah Goldfarb, Assistant Professor at Florida International University, on a panel at the American Bar Association Section of Dispute Resolution Spring Conference.

Tip 1: Negotiate data access when contracting with an ODR provider

The best time to ensure you will have the data needed for a future evaluation, and to monitor program activity in general, is when you negotiate your contract with an ODR provider. As you screen providers, learn how each ensures data security and confidentiality. This is also the ideal time to negotiate data access for your evaluation. To keep your evaluation options open, you will want to secure terms that obligate the provider to share data not only with you, but with external evaluators you might hire later. And whether you hire a provider or develop an ODR platform in-house, be mindful of any data sharing or confidentiality rules or policies in your jurisdiction for the types of cases you plan to include in your evaluation.

Tip 2: Determine when to evaluate

Ideally, you would plan the evaluation of your ODR program as you design your program. But evaluation planning can happen at any time, as can the evaluation itself.

If you plan your evaluation before launching your program, you can increase the probability that the evaluation will accurately reflect your program’s use and effectiveness by beginning the evaluation after 1) the provider has addressed technology glitches that emerge during early testing of your platform and 2) you have conducted outreach to ensure parties know about your program. If your program is already up and running, avoid scheduling an evaluation for a time frame when major changes are planned for your court or the ODR program itself. Significant changes while data are being collected may introduce noise into the data.

Tip 3: Find a neutral evaluator

Selecting a neutral evaluator is important for enhancing the quality, usefulness and credibility of your evaluation. In choosing your evaluator, consider whether they have experience evaluating court alternative dispute resolution (ADR) and/or ODR programs. If you hire someone who is not knowledgeable about ADR, be prepared to spend a lot of time explaining how ADR works, the theory behind it, and the specific issues involved.

Tip 4: Ensure that key personnel are involved in the evaluation planning process

Include court personnel who have knowledge that can assist with evaluation planning. These individuals are, at minimum, judges hearing the cases served by the ODR program and court staff who understand the processes involved and the underlying technology. They can help you determine the questions to answer, identify what data are needed, or work out how to access relevant existing data.

You will also want to decide who should serve as the point of contact for your evaluators. This person will answer the evaluator’s questions and help to obtain data. Staff members should be clear on their role in the evaluation.

Tip 5: Prepare to use data from a variety of sources

To best understand your ODR program, you should obtain information from multiple sources, such as your case management system, the ODR platform and participant surveys. To collect systematic feedback from parties (or other stakeholders, such as lawyers), your evaluator will need your help to facilitate distribution of surveys. They will work with you to determine which parties to survey and the best method for contacting the parties. They should also ask you to review the survey questions.

Tip 6: Expect to spend time with the evaluator

To conduct an effective evaluation, your evaluator will need to thoroughly understand your ODR program, how it fits with your overall process for handling cases, and how the platform interfaces with your case management system. Your evaluator will want to spend time with you to discuss your program processes and get answers to any questions throughout the evaluation process.

Tip 7: Facilitate the participation of court personnel and other program partners in the evaluation

Give court personnel and other program partners (e.g., mediators) who were not part of the evaluation planning process a heads-up about the evaluation and ask for their cooperation. Introduce your evaluator to relevant personnel and partners. These efforts should pave the way for your evaluator to reach out to them to get their perspectives on the ODR program and its impact on their work. When asking court personnel and program partners for their cooperation, reassure them that the evaluation’s objective is to improve the program, not to find fault with it or with them.

Tip 8: Help your evaluator to pilot test their survey materials

Before the evaluation period, your evaluator should obtain feedback on their surveys from individuals similar to those who will be surveyed for the evaluation—typically, parties to similar cases. Your evaluator will need your help to gain access to those individuals.

In addition, every court has unique terminology that should be reflected in how questions are worded. Your evaluator should work with your staff to ensure that the correct terminology is used, so that the survey is more likely to be understood by those who will be asked to complete it.

Tip 9: Be flexible about the length of time set aside for data collection

The time allocated for data collection needs to be long enough to gather the data necessary for analysis, but short enough to provide timely results. Your evaluator will work with you during the evaluation planning phase to determine the time frame. But be prepared to be flexible. Your evaluator may recommend extending the data collection period if the level of program use and/or survey participation is lower than expected and they need more time to collect enough data to deliver a useful evaluation.

Tip 10: Survey those who use ODR as well as those who do not

Our evaluations have shown that the motto “If you build it, they will come” does not always apply to ODR. Surveying eligible parties who did not use ODR could help you identify issues that might be driving lower-than-expected usage. Surveys can point to marketing or party education problems, or in the case of voluntary ODR programs, uncover program attributes that parties find unattractive. You can ask parties whether they knew about your program, how and when they learned about it, and whether they knew they were eligible.

In the end, when a court invests resources to establish an ODR program, a major goal is to have it be used. It is imperative to commit resources to effectively market the program, which should include efforts to educate parties and ensure they know they are eligible or required to use it.

Conclusion

Courts that have their ODR programs objectively evaluated should be applauded for their efforts. Evaluations can facilitate program design that is data-driven and evidence-based, rather than guided by anecdotes or hunches. This grounding in data is especially important when making decisions geared toward satisfying the interests of litigants, since understanding their unique perspectives requires collecting data directly from them. Ideally, ODR evaluations will be conducted by neutral third parties who have no stake in the results and meet high research standards. Neutral evaluations are uniquely situated to offer an outside perspective on what works well about a program and to suggest how it might be improved. In addition, constituents, including lawyers, are more likely to accept the findings of a neutral, outside evaluation that concludes that a program delivers beneficial outcomes.

Does ADR + Tech = Better Access to Justice? RSI Spent Much of 2022 Trying to Find Out

Sandy Wiegand, May 2nd, 2023

RSI spends a lot of time and energy studying the conditions under which court-based alternative dispute resolution (ADR) can best improve access to justice. In recent years, that has often meant using new technologies and/or assessing their impact.

As is often the case with innovations, ADR options that employ new technology are sometimes hailed as the solution to longstanding challenges. For example, online dispute resolution (ODR) is celebrated for its potential to increase access to justice by allowing parties to engage on their own schedules, in their own spaces. Unfortunately, however, technological innovations can also bring challenges and create their own barriers to justice.

RSI’s 2022 annual report asks the question: Does ADR + Tech = Better Access to Justice? Our staff spent much of last year examining that question. We published two landmark evaluations of court programs that used ODR-specific platforms; completed an in-depth report on the potential for ODR to serve thinly resourced parents, courts and communities; and used video mediation to serve hundreds of clients in northern Illinois. We also evaluated how those programs were operating and how participants viewed them.

Our annual report outlines these efforts and summarizes some of our findings. Not surprisingly, we found both promising signs and causes for concern when it came to technology’s impact on access to justice. We also discovered a lot more questions that need to be answered and problems that need to be addressed.

We hope you will take the time to read the Resolution Systems Institute 2022 Annual Report and review what we have learned so far. The role of technology is, of course, just one of many aspects of court-based ADR that RSI is examining. Please join us as we continue exploring what technology can and can’t solve, as well as other keys to providing cost-effective, timely and fair conflict resolution.

Grant-Funded Research Adds to Evidence on How to Make Eviction Mediation Effective

Eric Slepak Cherney, November 21st, 2022

Last month, RSI reached the end of an 18-month grant from the American Arbitration Association-International Centre for Dispute Resolution (AAA-ICDR) Foundation. A primary goal of the grant was to provide guidance to courts nationwide about addressing the eviction crisis arising from the COVID-19 pandemic. As that project has come to a close, we at RSI would like to look back at what we accomplished and what we learned from the experience.

Our Eviction Mediation Program and Special Topic

The first focus of the grant was to help us establish a local court mediation program serving Kane County, Illinois. While it may seem counterintuitive that a project focused on national guidance would invest in a local program, our approach at RSI is to use our mediation programs as “laboratories” for the research and evaluation that is core to our mission. We have a long history of designing and administering programs, and as part of that work, we implement established best practices, set up robust monitoring and evaluation systems, and carefully and thoughtfully test out different approaches to help us achieve the goals we set for our programs. The Kane County Eviction Mediation program is no exception (see related article above), and it served as the basis for many exciting accomplishments of the project, detailed further below.

Our next big milestone was developing the Eviction Mediation Special Topic. Special Topics are collections of resources RSI curates around court alternative dispute resolution (ADR) as it relates to different subject matter (e.g., child protection mediation and restorative justice) or interested parties (e.g., judges and lawyers). For eviction, we sought to develop a Special Topic collection that both responded to the present crisis and highlighted the best research, guidance and tools for those invested in the development and administration of effective eviction diversion programs.

Blogs and Evaluation Projects

Throughout the 18 months of the grant, the RSI team blogged regularly about our experiences developing and administering our programs, and about what we were learning from others across the country. A few highlights from our blogging include glimpses into innovative program models in Hawaii and Philadelphia, and program design considerations such as working with rental assistance programs and cultivating buy-in from landlords. Additionally, a pair of Q&As with our Programs Manager Chris Riehlmann and our Kane County Program Coordinator Christina Wright provide a great look into what it really takes to make these programs work day in and day out.

Finally, and most significantly, the grant supported several evaluative projects we embarked on over this past year and a half. We analyzed the results of our post-mediation surveys to assess whether our programs were providing procedural justice to participants. After reflecting on the steps we’d taken to develop programs and conducting interviews with key program personnel and partners, RSI published program implementation guides to give others nationwide a manual of sorts for building and tweaking their own programs. The project culminated in an evaluation of the Kane County program’s first 13 months (summarized in the article above by RSI Director of Research Jennifer Shack), assessing program use, services provided, mediation outcomes and participant experience.

A Few Key Findings

We have learned, and done our best to share, a staggering amount of information over the course of this project. While any summation is sure to be incomplete, we’d like to leave you with a few key findings:

  1. Integrated and holistic service delivery approaches truly made for better outcomes. Programs that took a comprehensive and progressive approach to combating eviction saw more agreements and fewer evictions. Similarly, programs that brought more partners to the table, including social service agencies, advocacy groups, state and municipal representatives, and others, saw greater success. While eviction cases are ultimately resolved by courts, the underlying issues are economic and social in nature, and collaboration with entities that address those causes is highly valuable.
     
  2. Good eviction mediations take time. Prior to the pandemic, mediation in housing disputes in many jurisdictions was typically an event that took place on the day of the first court appearance and lasted no more than half an hour. Unsurprisingly, agreement rates in this context were generally low. A number of programs we worked with noted that using a model in which mediation took place outside of court (and outside the time constraints that setting usually entails) resulted in more agreements. Allotting more time for the session gave greater opportunity to work through impasse, and scheduling mediation for a later date gave parties time to better prepare, including taking stock of finances, asking for support, applying for rental assistance and consulting attorneys.
     
  3. Remote mediation, which remains the norm for RSI’s programs and many others, continues to offer mixed blessings for participants. The flexibility it affords meant many more parties could participate without taking a day off work, which was critical for parties trying desperately to pay back past-due rent. On the other hand, our data showed that about 1 in 6 participants needed to borrow a device or leave home to participate virtually, and 1 in 5 experienced some sort of technical difficulty. Making sure in-person accommodations can be offered to those who cannot, or would prefer not to, participate virtually ought to be a priority for ensuring access, and RSI did so with our Kane County program.

We are tremendously grateful to the AAA-ICDR Foundation for its support of this project.