The nation’s major evidence clearinghouses are heavily used but need further improvement to be more useful to frontline practitioners, funders, and policymakers, according to a report released April 10 by Results for America and Bridgespan.
The report reviewed 36 U.S. clearinghouses that rate social interventions based on their impact and/or the rigor of their evaluations. Most focus on specific topic areas such as education, mental health, and criminal justice. The report also identified 15 clearinghouses internationally, most of them in Britain.
As a group, these clearinghouses receive substantial web traffic. The most heavily trafficked is the What Works Clearinghouse at the U.S. Department of Education, which reported an average of 8,000 visits per day. SAMHSA’s National Registry of Evidence-based Programs and Practices averages more than 1,000 visits per day. Most others report several hundred visits per day.
Despite such heavy usage, however, the report identified a number of information gaps and challenges, based on interviews with more than 80 clearinghouse users, suppliers, and other experts.
Clearinghouses Are Not Comprehensive
One problem is a lack of comprehensiveness. Clearinghouses do not include ratings for every intervention and usually omit those that do not meet their evidence standards. In interviews, some clearinghouses said this was because they lacked the resources needed to rate every intervention and preferred to focus on those with enough evidence to meet their criteria.
Purveyors also hesitate to subject their interventions to such ratings if they are not sure they will be rated well. According to the report:
One purveyor admitted it is not on a clearinghouse yet because “you can choose when to be rated, so we chose not to until we can get an A rating.” Uncertainty about negative portrayal can have this type of chilling effect on those who might otherwise be willing to participate in and share evaluations.
This bias toward positive results is reinforced by a similar tendency among academic journals, which often publish only studies with positive findings. The resulting omission of studies with inconclusive results, or studies demonstrating that a program is ineffective, makes clearinghouses much less useful.
“The federal government can be a real leader in providing incentives for people to do research and publish findings even when they are negative,” Kathy Stack, a former official at the White House Office of Management and Budget, told the report’s authors. “We need to change the value system. It’s useful to know what doesn’t work. We need to push for people to preregister studies. This puts it out in the ether that this work is being done.”
Clearinghouses Can Be Difficult to Use
Some clearinghouses can be difficult to navigate. Even when information about a specific intervention is available, it can be confusing when clearinghouses applying different standards rate the same intervention differently. Users can be further confused by separate ratings for program impact and evaluation rigor.
“Users have told us it’s confusing,” said Cambria Rose Walsh, project manager at the California Evidence-Based Clearinghouse for Child Welfare. “They go to one clearinghouse and there is this rating. They go to another, and it’s a different rating. What does that mean?”
There are several efforts underway to address this problem, including a cross-agency effort to coordinate evidence standards at the federal level launched by the U.S. Department of Education and the National Science Foundation. The report also cites the Pew-MacArthur Results First Initiative, which is compiling and comparing ratings across eight different national research clearinghouses.
Some clearinghouses are also beginning to provide user-friendly syntheses and research summaries. According to the report:
The What Works Clearinghouse has launched practice guides, which have been well received. In 2013, there were over 370,000 downloads of the practice guides—more than twice the number of downloads of intervention reports. The What Works Clearinghouse interviewee told us, “For practitioners, our most useful product is the practice guides.” However, most clearinghouses do not play this synthesizer role, nor do they believe it is their role to play.
Users Need More Than Evidence Ratings
The report identified additional user needs that went beyond evidence ratings. These needs include feedback from peers on their experiences implementing the rated interventions, information about upfront and ongoing costs, and cost-benefit analysis. Many practitioners are also looking for ways to incrementally improve existing interventions, not replace them with completely new ones. Such information is rarely available.
Some of these needs are being filled by third-party consultants and intermediaries, such as Hanover Research, an organization that reviews existing research and best practices and provides advice to clients in the education field. Nonprofit organizations like Child Trends and Chapin Hall at the University of Chicago are filling similar roles in child welfare.
Overall, however, the report says this advisory market is “sparse.” More typical sources of information for practitioners include word-of-mouth referrals from peers, national associations, and the marketing efforts of vendors and purveyors of specific interventions. According to the report, “purveyors often tout their products as evidence-based practices, whether or not they have been officially validated.”
Usage By Policymakers Is Low
The report also reviewed evidence use by policymakers at the federal, state, and local levels and found that demand for such information among these individuals and organizations is very limited.
However, the report cited one organization as a promising model for addressing this shortcoming. The Washington State Institute for Public Policy is a nonpartisan research organization that serves the Washington state legislature by reviewing the evidence base for specific policies and conducting cost-benefit analysis. The organization is performing an intermediary role for policymakers similar to the one being played by consultants and some nonprofits for frontline practitioners.
…But Clearinghouses Are Headed in the Right Direction
According to the report, despite the many challenges and gaps facing clearinghouses, these problems are not insurmountable. It characterizes them as “growing pains” and suggests the field is headed in the right direction overall:
Even within their resource-constrained environments, several clearinghouses described their efforts to continually improve their website functionality and content. By soliciting user feedback, they are starting to identify the gaps in meeting users’ needs and plan their responses.
About this improvement process, one clearinghouse interviewee said, “A clearinghouse is a long-term process … It’s a big cultural change; it’s accelerating now, but it’s not going to change overnight. It’s going to require repetition and getting the incentives set up right.”
- Education Week, In What Works Clearinghouse Research, Does High Quality Equal Highly Useful? (March 16, 2016)