Pre-surveys are administered before the sessions begin, and post-surveys are administered at the conclusion. Some students are also followed up for behavioral outcomes. The Heritage survey includes demographic, behavioral, and attitudinal items that are later electronically grouped into “mediating variables.” Demographic items include gender, ethnicity/race, and age. The behavioral items ask whether students have ever had sex, with follow-up questions about frequency, most recent encounter, and condom use. The mediating variables are measured by items that group into six constructs:
Abstinence Values: measures students’ commitment to maintaining sexual abstinence until marriage and their acceptance of marriage as the most appropriate context for sexual activity.
Abstinence Efficacy: assesses students’ confidence in their ability to resist peer pressure to have sex, to avoid situations that would compromise their abstinence position, and to disengage from people who pressure them to have sex.
Justifications for Sex: measures the rationalizing that adolescents often engage in to legitimize their initiation of sexual activity, such as being in love.
Behavioral Intentions: measures the students’ level of intent and commitment to abstain from sexual activity in the coming year and until marriage.
Future Orientation: measures the adolescents’ beliefs about the possible future consequences of sexual activity.
Sexual Independence from Peers: measures the ability of the student to reject negative peer pressure to initiate sexual activity.
Outcome objectives are measured by additional survey items, including questions on pregnancy and sexually transmitted diseases, personal impacts, refusal and assertiveness skills, and the intention to abstain from sexual activity outside of marriage. Each student is also asked whether they enjoyed the program and about the educator teaching it.
Students’ pre- to post-program change on the mediating variables is calculated at the end of each presentation of the curriculum and has been verified by an independent third-party evaluator to assess program effectiveness. Internally, to assess educator effectiveness, each educator receives feedback within four weeks of curriculum completion, allowing rapid adjustment of teaching methods when needed. Educators with strong results present their methods at regular meetings of all educators; educators with weak improvement on specific scales are paired with stronger educators for mentoring.
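As an illustration of the kind of pre- to post-program change calculation described above, the sketch below computes mean pre-scores, post-scores, and change per mediating-variable construct. The construct names are taken from this document, but the record layout, field names, and scale values are hypothetical assumptions, not the actual Heritage survey instrument or scoring procedure.

```python
# Illustrative sketch only: pre- to post-program change per construct.
# Field names (e.g. "abstinence_values_pre") and the 1-5 scale are assumptions.
from statistics import mean

CONSTRUCTS = [
    "abstinence_values",
    "abstinence_efficacy",
    "justifications_for_sex",
    "behavioral_intentions",
    "future_orientation",
    "sexual_independence",
]

def pre_post_change(records):
    """Return mean pre-score, post-score, and change for each construct.

    `records` is a list of dicts, one per student, with keys like
    'abstinence_values_pre' and 'abstinence_values_post'.
    """
    results = {}
    for c in CONSTRUCTS:
        pre = [r[f"{c}_pre"] for r in records if f"{c}_pre" in r]
        post = [r[f"{c}_post"] for r in records if f"{c}_post" in r]
        if pre and post:  # only report constructs with data on both surveys
            results[c] = {
                "pre_mean": mean(pre),
                "post_mean": mean(post),
                "change": mean(post) - mean(pre),
            }
    return results

# Example with two hypothetical students (scores on an assumed 1-5 scale):
students = [
    {"abstinence_values_pre": 3.0, "abstinence_values_post": 4.0},
    {"abstinence_values_pre": 2.5, "abstinence_values_post": 3.5},
]
print(pre_post_change(students)["abstinence_values"]["change"])  # 1.0
```

A real evaluation would of course also involve psychometric scale scoring, matched pre/post pairs, and significance testing; this sketch only shows the basic aggregate-change computation.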
Heritage’s program and evaluation process have been approved by, and are under the supervision of, the Heritage Community Services Institutional Review Board #1, which reviews program content, implementation and evaluation forms, and processes and procedures, and which provides protection for the vulnerable populations (minors) served. IRBs are responsible for ensuring that the rights and welfare of research subjects are adequately protected.
Heritage Keepers has two published studies: one on the Heritage Keepers Abstinence Education program and one on the Heritage Keepers Life Skills program. A third study showing effectiveness is under review for publication. For more information on these studies, please visit our evaluation section.
Under the federal Community Based Abstinence Education grants, 5% of each program’s budget was mandated to be spent on third-party independent evaluation. It is common practice in federally funded, and some state-funded, grants for the grantee to secure an “independent evaluator” and identify the appropriated funds in the grant application. Although most people may not be familiar with this process, many examples of the practice can be found at www.grants.gov.
The duty of an independent evaluator is to provide an objective and impartial opinion. The Institute for Research and Evaluation was chosen as the independent third-party evaluator because of its long history of evaluating abstinence education programs. A grantee offering abstinence education would naturally select an evaluator conversant with abstinence education, just as a grantee funded to offer family planning services would secure an evaluation researcher familiar with family planning, and a drug manufacturer would choose someone experienced in pharmaceutical research. Consequently, it would make little sense for Heritage Keepers to use an evaluation consultant unfamiliar with the priorities and mandates of the program being evaluated.
However, the most recent study of the Heritage program was conducted independently by Dr. Stan Weed, who received no compensation for the analysis or for the subsequent article submitted for review for inclusion in the list of proven effective programs.