After Donna Shestowsky and I completed two of the nation’s first neutral evaluations of state court online dispute resolution (ODR) programs, we had some thoughts to share with courts about how they could best ensure that evaluations of their ODR programs were useful and of high quality. We shared those thoughts in an article, “Ten Tips for Getting the Most Out of an Evaluation of Your ODR Program,” recently published in Court Review, the journal of the American Judges Association. The following is a summary of what we wrote. For fuller guidance, see the complete article.
Tip 1: Negotiate data access when contracting with an ODR provider
The best time to ensure you will have the data needed for a future evaluation, and to monitor program activity in general, is when you negotiate your contract with an ODR provider. As you screen providers, learn how each ensures data security and confidentiality. This is also the ideal time to negotiate data access for your evaluation. To keep your evaluation options open, you will want to secure terms that obligate the provider to share data not only with you, but with external evaluators you might hire later. And whether you hire a provider or develop an ODR platform in-house, be mindful of any data sharing or confidentiality rules or policies in your jurisdiction for the types of cases you plan to include in your evaluation.
Tip 2: Determine when to evaluate
Ideally, you would plan the evaluation of your ODR program as you design your program. But evaluation planning can happen at any time, as can the evaluation itself.
If you plan your evaluation before launching your program, you can increase the probability that the evaluation will accurately reflect your program’s use and effectiveness by beginning it after 1) the provider has addressed technology glitches that may emerge during early testing of your platform and 2) you have conducted outreach to ensure parties know about your program. If your program is already up and running, avoid scheduling an evaluation for a time frame when major changes are planned for your court or the ODR program itself. Significant changes while data are being collected may introduce noise into the data.
Tip 3: Find a neutral evaluator
Selecting a neutral evaluator is important for enhancing the quality, usefulness and credibility of your evaluation. In choosing your evaluator, consider whether they have experience evaluating court alternative dispute resolution (ADR) and/or ODR programs. If you hire someone who is not knowledgeable about ADR, be prepared to spend a lot of time explaining how ADR works, the theory behind it, and the specific issues involved.
Tip 4: Ensure that key personnel are involved in the evaluation planning process
Include court personnel whose knowledge can assist with evaluation planning. These individuals are, at minimum, judges hearing the cases served by the ODR program and court staff who understand the processes involved and the underlying technology. They can help you determine the questions to answer, identify what data are needed, and work out how to access relevant existing data.
You will also want to decide who should serve as the point of contact for your evaluators. This person will answer the evaluator’s questions and help to obtain data. Staff members should be clear on their role in the evaluation.
Tip 5: Prepare to use data from a variety of sources
To best understand your ODR program, you should obtain information from multiple sources, such as your case management system, the ODR platform and participant surveys. To collect systematic feedback from parties (or other stakeholders, such as lawyers), your evaluator will need your help to facilitate distribution of surveys. They will work with you to determine which parties to survey and the best method for contacting the parties. They should also ask you to review the survey questions.
Tip 6: Expect to spend time with the evaluator
To conduct an effective evaluation, your evaluator will need to thoroughly understand your ODR program, how it fits with your overall process for handling cases, and how the platform interfaces with your case management system. Your evaluator will want to spend time with you to discuss your program processes and get answers to any questions throughout the evaluation process.
Tip 7: Facilitate the participation of court personnel and other program partners in the evaluation
Give court personnel and other program partners (e.g., mediators) who were not a part of the evaluation planning process a heads up about the evaluation and ask for their cooperation. Introduce your evaluator to relevant personnel and partners. These efforts should pave the way for your evaluator to reach out to them to get their perspectives on the ODR program and its impact on their work. When asking court personnel and program partners for their cooperation, reassure them that the evaluation’s objective is to improve the program, not to find fault with it or with them.
Tip 8: Help your evaluator to pilot test their survey materials
Before the evaluation period, your evaluator should obtain feedback on their surveys from individuals similar to those who will be surveyed for the evaluation—typically, parties to similar cases. Your evaluator will need your help to gain access to those individuals.
In addition, every court has unique terminology that should be reflected in how questions are worded. Your evaluator should work with your staff to ensure that the correct terminology is used so that it is more likely to be understood by those who will be asked to complete the survey.
Tip 9: Be flexible about the length of time set aside for data collection
The time allocated for data collection needs to be long enough to gather the data necessary for analysis, but short enough to provide timely results. Your evaluator will work with you during the evaluation planning phase to determine the time frame. But be prepared to be flexible. Your evaluator may recommend extending the data collection period if the level of program use and/or survey participation is lower than expected and they need more time to collect data to deliver a useful evaluation.
Tip 10: Survey those who use ODR as well as those who do not
Our evaluations have shown that the motto “If you build it, they will come” does not always apply to ODR. Surveying eligible parties who did not use ODR could help you identify issues that might be driving lower-than-expected usage. Surveys can point to marketing or party education problems, or in the case of voluntary ODR programs, uncover program attributes that parties find unattractive. You can ask parties whether they knew about your program, how and when they learned about it, and whether they knew they were eligible.
In the end, when a court invests resources to establish an ODR program, a major goal is to have it be used. It is imperative to commit resources to effectively market the program, which should include efforts to educate parties and ensure they know they are eligible or required to use it.
Courts that have their ODR programs objectively evaluated should be applauded for their efforts. Evaluations can facilitate program design that is data-driven and evidence-based, rather than guided by anecdotes or hunches. This grounding in data is especially important when making decisions geared toward satisfying the interests of litigants, since understanding their unique perspectives requires collecting data directly from them. Ideally, ODR evaluations will be conducted by neutral third parties who have no stake in the results and meet high research standards. Neutral evaluations are uniquely situated to offer an outside perspective on what works well about a program and to suggest how it might be improved. In addition, constituents, including lawyers, are more likely to accept the findings of a neutral, outside evaluation that concludes that a program delivers beneficial outcomes.