Teen Pregnancy Prevention Program Findings Show Benefits, Challenges of Evidence-Based Programming

When the federal government started a new grants program in 2010 to finance and evaluate programs to reduce teen pregnancies, Hennepin County decided to give it a shot.

The Minneapolis-area county already had a pilot project in two suburbs with high teen pregnancy rates that it had developed after intensive consultations with young people, parents, and community organizations. But an influx of federal money would allow it to expand its efforts to encourage young people to delay sex or use contraception.

The county won a five-year grant from the Teen Pregnancy Prevention Program, operated by the Office of Adolescent Health in the U.S. Department of Health and Human Services (HHS). The money paid for two efforts. One, the Teen Outreach Program, was on a list that HHS had compiled of “evidence-based programs” — that is, approaches that had already proven effective through rigorous evaluation.

“It was all the rage,” says Kathy Wick, initiative manager at Better Together Hennepin, the county’s teen-pregnancy program. “It was really a youth-development program that supposedly had positive results in terms of impacting teen behaviors around pregnancy prevention and whether to have sex or not.”

The Teen Outreach Program (TOP), which had shown good results in a 1997 study, involves weekly classroom sessions, community-service learning, and adult support. The county spent about $500,000 a year out of its annual $3.3 million grant to conduct a randomized controlled trial involving 1,644 students in 24 middle and high schools.

The results? The program had no impact.

Three months and 15 months after the intervention, participants were just as likely as students who followed the regular school curriculum to have had recent sexual activity as well as unprotected sex. They also scored no better in areas like school performance, school engagement, educational expectations, and civic responsibility.

“We spent five years very intensively building TOP with a whole lot of school partners,” says Katherine Meerse, former manager of Better Together Hennepin. “Results showed it didn’t have an impact, what do we do about that?”

There was only one answer, she says: “It wasn’t effective for our kids, we needed to move on to something that was.”


A Scientific Approach

Meerse’s philosophy is at the heart of the Obama administration’s approach to the Teen Pregnancy Prevention Program (TPP), a centerpiece of its efforts to promote “evidence-based programming.”

After more than a decade in which most federal funding was diverted to abstinence-only efforts, the TPP program has channeled money to a variety of projects aimed at curbing teen births, which can derail young futures and increase poverty, child abuse, and child neglect.

From 2010 to 2014, it awarded 102 grants that reached nearly half a million young people across the country. But it did not just disburse money. It paid for 41 rigorous evaluations of two types of efforts: 19 that adapted programs on the list of evidence-based programs to new settings or new populations; and 21 that tried out new and innovative approaches. Ninety percent of the findings involved randomized controlled trials, the gold standard of evaluation.

The results of those studies, both summaries and detailed reports, were released this summer, offering a gold mine to advocates of a more scientific approach to government programs.

“You have to give HHS great credit for funding all these rigorous evaluations, to build credible evidence about what works and what doesn’t,” says Jon Baron, vice president of evidence-based policy at the Laura and John Arnold Foundation.* “That separates them from almost every other federal program out there.”

The findings make a couple of things clear:

  • Designing an effective program is hard work. Of the 41 approaches that were evaluated, the Office of Adolescent Health (OAH) identified 12, or about 29 percent, that had changed behavior. “Moving the behavior needle is hard and not for the faint of heart,” says Andrea Kane, vice president for policy and strategic partnerships at the Campaign to Prevent Teen and Unplanned Pregnancy, a nonprofit that has been working for two decades to reduce the adolescent birth rate. “Only some rigorous evaluations can be expected to demonstrate behavior change.” But she and others say that’s to be expected: whether in medicine, business strategy, education, or social policy, rigorous testing routinely shows that most interventions fail.
  • Just because a program works in one setting does not mean it will be effective in another. Only four of the 19 efforts to replicate previously successful programs passed muster this time around. Hennepin County was not the only grantee to strike out with the Teen Outreach Program. TPP grantees evaluated TOP’s impact in seven different settings. Only one, offered in 25 Florida high schools, significantly changed behavior. Replicating a program can be challenging for grantees, says Brian Goesling, senior researcher at Mathematica Policy Research, a group that helps HHS identify effective teen-pregnancy programs by conducting reviews of research literature: “This is a program developed somewhere else. We have to find out how to make this work for us.”


Need for Multiple Interventions

With an annual budget of $110 million — 75 percent spent replicating evidence-based programs and 25 percent on new approaches — the Teen Pregnancy Prevention Program is adapting its approach based on the findings from the first round of grants.

One of the lessons learned, OAH officials say, is that one-shot interventions are not likely sufficient to prevent teen pregnancies or sexually transmitted diseases in the long run. Instead, they say, young people should be exposed to evidence-based programs at multiple times over the course of their adolescence.

Some of the programs that changed behavior did so in a short-term or limited way — for example, the Crossroads Program, a three-day session offered to students who were at risk of dropping out of high school in Arlington, Tex. The evaluation found that participants were less likely to have had vaginal intercourse without a condom six months after the intervention than students offered standard dropout-prevention services.

But that impact was not found at the three-month or 12-month mark. Nor was there any impact on pregnancy rates after 12 months.

The problem with a finding like that is that there is a one in 20 chance of a false positive for any individual outcome, says Baron of the Arnold Foundation. “It’s possible the six-month effect is valid,” he says. “But it’s also possible it was a chance finding just because you measured at three different points in time.”
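The arithmetic behind Baron’s caution can be sketched briefly. Under the simplifying assumption that each follow-up test is independent and uses the conventional 5 percent significance level (real follow-ups on the same students are correlated, so this is only an illustrative upper bound):

```python
# Chance of at least one false positive when the same outcome is
# tested at several follow-up points, assuming independent tests
# at the conventional 5 percent significance level.
alpha = 0.05          # false-positive chance for any single test
time_points = 3       # e.g., 3-, 6-, and 12-month follow-ups

p_at_least_one = 1 - (1 - alpha) ** time_points
print(f"{p_at_least_one:.1%}")  # prints "14.3%"
```

In other words, measuring at three points in time can nearly triple the odds that at least one apparent effect is a statistical fluke, which is why a result that appears at only one follow-up interval warrants skepticism.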

Anita Barbee, professor of social work at the University of Louisville, supervised a $4.8 million, five-year grant to offer two “new and innovative” programs to young people in Louisville.

One of them was Love Notes, which educates young people about healthy relationships. It was offered over two consecutive weekends and involved presentations and educational videos.

The evaluation found that three months after the intervention ended, students in Love Notes were less likely than a control group to have had sex without a condom, gotten pregnant, or caused a pregnancy. Those effects persisted after six months, but the differences disappeared after 12 months.

Barbee suspects that the three-month and six-month followups may have acted as a “booster,” an effect that dimmed after 12 months. “If you’re not reinforcing the training, it starts to fade,” she says.

Barbee is a fan of Love Notes — in fact, she now works as a trainer for the Dibble Institute, advising sites in nine other cities that have adopted the program. She says that when she looked at the results over the whole grant period, instead of just at specific intervals, pregnancy rates were 44 percent lower among Love Notes participants than among the control group.

However, she advises groups to spread out the program beyond just two weekends so students have more time to absorb the information.

In the 2015-2019 round of TPP funding, OAH officials say that 58 of 84 grantees are implementing evidence-based programs in at least three settings — for example, in middle school, high school, and at a clinic or youth organization — so that messages can be reinforced over time.

Current grantees are continuing to conduct rigorous evaluations to help fill gaps in the evidence base and explore issues like whether a “holistic, community-wide approach,” one that offers programs in multiple settings, is effective, they say.


Identifying Evidence-Based Programs

Baron of the Arnold Foundation wishes HHS had been more discriminating about the programs that it originally identified as evidence-based, most of which he says did not show a strong, sustained impact on teen pregnancy. A group he used to head, the Coalition for Evidence-Based Policy, made that point in 2010.

That shortcoming is also true of the new results, Baron says. “You can’t point to anything that came out of the program and say, ‘Let’s do this everywhere and it will affect national teen pregnancies,’” he says.

OAH officials say they continue to adapt the TPP program based on the latest information — and that they steered the current grantees away from some programs that no longer have the strongest evidence.

The TPP’s legislative mandate calls for it to reduce both pregnancy and “associated risk factors,” they say, and many of the evaluations found programs that changed behavior in areas like contraceptive use or delayed sexual activity.

For example, while the Teen Outreach Program did not work for Hennepin County, the county found success with another TPP evidence-based program — Safer Sex — and continues to run that program with a new OAH grant.

Safer Sex offers one-on-one sessions with health educators to young women at clinics.

A randomized controlled trial of programs offered by three grantees found that nine months after taking the program, participants were less likely than the control group to have had sexual intercourse without birth control in the prior 90 days. Those who were sexually inexperienced were also less likely to report engaging in sexual activity.


High Marks for Methodology

A key value of the TPP program is the example it sets for others who want to establish evidence-based programming, some experts say — for example, the Bridgespan Group, which gave the effort high marks in a 2013 report.

The researchers surveyed grantees and interviewed a dozen of them, along with technical-assistance providers and federal officials. “By the end of our research, we came to believe that the TPP program is a model worth emulating,” says the report, “What Does It Take to Implement Evidence-Based Practices?”

It concluded that the federal program had gone to great lengths to ensure that grantees had implemented evidence-based programs with “fidelity” — that is, replicating the intervention as closely as possible to the original.

In the first full implementation year, it said, independent observers sat in on 3,257 of the sessions offered by grantees and reported that 89 percent had an overall quality of very good or better. Furthermore, 92 percent of the survey respondents agreed or strongly agreed they had “sufficient support from OAH to implement the pregnancy prevention programs effectively.”


Teen Birth Rates

So to look at the big picture, have teen-pregnancy programs like those funded by OAH helped prevent adolescent pregnancies nationwide? The statistics are headed in the right direction. Teen birth rates have fallen dramatically over the past two decades, now standing at 22 births per 1,000 girls ages 15 to 19. That’s a 64 percent drop since 1991, according to the National Campaign to Prevent Teen and Unplanned Pregnancy. But there are still troubling signs: The U.S. rate remains much higher than that of other developed countries. And the rates for black and Hispanic girls are more than double those for white girls.

A new analysis published by researchers at the Guttmacher Institute found that improved contraception use was the key factor driving the fall in adolescent pregnancy and birth rates from 2007 to 2012.

It’s probably impossible to know precisely what role the TPP has played in driving the downward trend. “We would never say that’s all because of these programs,” says Kane of the National Campaign, which has a contract to help OAH communicate about the TPP program. “That would be an overstretch.”

But she says the rate of decline in teen births started accelerating in 2010, and it’s reasonable to conclude that the TPP program — along with the State Personal Responsibility Education Program, another HHS evidence-based initiative — contributed to that.

In any case, the Teen Pregnancy Prevention Program has helped communities across the country learn which approaches have evidence behind them and which don’t.

“In the long and unproductive battle in the U.S. that has too often pitted abstinence against contraception,” she says, “the Campaign has always been on the side of science.”


*Disclosure: The Laura and John Arnold Foundation provides funding support for the Social Innovation Research Center (SIRC). SIRC maintains independent editorial freedom and control over the production and publication of all content.
