Truth Initiative (https://truthinitiative.org) aims to reduce tobacco use through public education, research, and community engagement. It has embarked on two programs to address tobacco use among college students by focusing on underserved populations and settings. While the number of colleges and universities with 100% smoke- and tobacco-free policies has grown in recent years, two populations and settings have been overlooked and underserved: Historically Black Colleges and Universities (HBCUs) and community colleges. The community college and HBCU programs are multi-year projects funded by Truth Initiative to support grantees’ policy development, implementation, and enforcement efforts. Truth Initiative staff provide technical assistance and resources to build colleges’ capacity to develop, adopt, and implement 100% tobacco-free and smoke-free campus policies.
Community Science has been an invaluable partner in our college initiative work. They consistently provide clear and concise reports summarizing the many surveys and focus groups they complete for us. These reports allow us to make adjustments to the initiative and also help inform our overall conclusions and lessons learned about the best ways to help colleges go smoke- or tobacco-free.
Community Science’s evaluation of Truth Initiative’s college initiative examines how colleges build capacity for policy change and whether colleges succeed in their policy efforts. This has been a multi-year project to monitor and evaluate tobacco- or smoke-free policy adoption on college campuses. Our evaluation approach included: 1) collaboration with Truth Initiative to co-develop evaluation plans, timelines, and data collection tools; 2) a mixed-methods design combining online data collection, document review, interviews, and progress reports; and 3) shared learning among the client, evaluator, and colleges to support use of evaluation findings. This last component has been a proactive effort to align evaluation with program planning and implementation at both the funder and college levels.
While evaluators aim for their efforts to produce useful information to improve programs and support decision-making (Patton, 1997; Stufflebeam & Shinkfield, 1985), often reality does not live up to this intent (Alkin & Taut, 2003). As evaluators, we must be intentional in our efforts to ensure evaluation findings are used to inform program planning and implementation efforts. Our work with Truth Initiative to evaluate its college-focused policy change efforts is one project in which we have worked consciously and deliberately to ensure findings are shared among stakeholders and used to improve policy change efforts.
Truth Initiative, a nonprofit public health organization, aims to reduce tobacco use through public education, research, and community engagement, and has embarked on two programs to address tobacco use among college students in the underserved settings of community colleges and HBCUs. Tobacco use among college students remains a serious public health issue: in 2008, 18% of college students were smokers (Johnston et al., 2009), and college students have a higher prevalence of smoking than the general adult population (CDC, 2002; Johnston, O’Malley, & Bachman, 2003). Some colleges have implemented comprehensive tobacco-free policies to protect the health of students, faculty, and staff (Lee et al., 2012); however, two populations and settings have been overlooked and underserved: HBCUs and community colleges. Truth Initiative’s efforts are designed to address the gaps in capacity building and support for tobacco- and smoke-free policy efforts in these settings.
Community college and HBCU grantees complete an online progress report approximately every six months, reporting on policy development activities, task force development and functioning, educational and outreach activities, and campus support and advocacy for the smoke- or tobacco-free policy. Often, the data flow in evaluations is unidirectional, from the participant to the evaluator, and evaluation reports and findings are shared solely with the client. However, we have implemented cost-effective processes and tools to ensure that data and results from the grantees’ online progress reports and other data collection efforts flow back not just to the client but to the grantees themselves. In addition to reporting evaluation findings to Truth Initiative, Community Science shares findings and data with the grantees in two ways: 1) a brief synopsis of the aggregate findings for each progress report period; and 2) a simple dashboard system that automates an individual progress report summary for each grantee.
The aggregate findings report pares the lengthy report written for Truth Initiative down to a two- to four-page summary that relies primarily on graphics and visualizations to highlight key evaluation findings. The individual grantee summary reports were developed using a dashboard system that simply and easily creates a report for each grantee after each progress reporting period. Our main objective with the individual summary reports was to share back the data grantees reported to us, and to do so as efficiently as possible. A dashboard system allowed us to automate report generation cost-effectively: we created a template for the summary document, used Excel to organize the “back end” where data were stored, and implemented a quality assurance process to ensure that each generated report was accurate and properly formatted. The dashboard system has been used multiple times, and the automated process allows simple, quick changes to incorporate any updates to the progress report.
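To illustrate the kind of lightweight automation involved, the sketch below generates one plain-text summary per grantee from an Excel workbook. It is a minimal example only: the file name, column names, and summary fields are hypothetical stand-ins rather than the actual Truth Initiative progress report items, and our real system fed a formatted report template rather than plain text.

```python
# A minimal sketch of automated per-grantee summary generation.
# File names, column names, and summary fields are illustrative only;
# they are not the actual Truth Initiative progress report items.
# Requires: pandas and openpyxl (for reading .xlsx files).
from pathlib import Path

import pandas as pd

TEMPLATE = """Progress Report Summary: {grantee}
Reporting period: {period}

Policy development stage: {policy_stage}
Task force meetings held: {task_force_meetings}
Educational and outreach events: {outreach_events}
Campus support for the policy: {campus_support}
"""


def generate_summaries(workbook: str, out_dir: str = "grantee_summaries") -> None:
    """Read one row per grantee per reporting period and write one summary file each."""
    reports = pd.read_excel(workbook)  # the Excel "back end" described above
    out = Path(out_dir)
    out.mkdir(exist_ok=True)

    for _, row in reports.iterrows():
        summary = TEMPLATE.format(
            grantee=row["grantee"],
            period=row["period"],
            policy_stage=row["policy_stage"],
            task_force_meetings=row["task_force_meetings"],
            outreach_events=row["outreach_events"],
            campus_support=row["campus_support"],
        )
        # e.g. grantee_summaries/Example_College_2018_fall.txt
        filename = f"{row['grantee'].replace(' ', '_')}_{row['period']}.txt"
        (out / filename).write_text(summary)


if __name__ == "__main__":
    generate_summaries("progress_reports.xlsx")
```

In practice, each generated summary still passes through the quality assurance review described above before it is shared with a grantee.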
The aggregate and individual summary reports have been shared with the grantees, and we recommended that grantees use them to gauge their progress toward policy adoption and to share and discuss the findings with their campus task force members and other stakeholders to acknowledge successes and identify gaps. Additionally, Truth Initiative has used these documents to guide the one-on-one technical assistance it provides to grantees.
We have learned that we can proactively share evaluation findings and data with grantees across multiple data collection periods in a cost-effective, simple way; it does not have to be complicated or labor- or resource-intensive. This sharing also serves multiple purposes: in addition to returning data to grantees, it informs technical assistance and continuous quality improvement. Our efforts have taught us that it is possible to build in processes, practices, and tools that support evaluation use in low-cost, simple ways with maximum impact.
References
Alkin, M. C., & Taut, S. M. (2003). Unbundling evaluation use. Studies in Educational Evaluation, 29, 1-12.
Centers for Disease Control and Prevention. (2002). Trends in cigarette smoking among high school students—United States, 1991–2001. MMWR Morbidity and Mortality Weekly Report, 51, 409–412.
Johnston, L. D., O’Malley, P. M., & Bachman, J. G. (2003). Monitoring the future: National results on adolescent drug use: Overview of key findings. Focus, 1(2), 213-234. https://doi.org/10.1176/foc.1.2.213
Johnston, L. D., O’Malley, P. M., Bachman, J. G., & Schulenberg, J. E. (2009). Monitoring the Future national survey results on drug use, 1975–2008: Volume II, College students and adults ages 19–50 (NIH Publication No. 09-7403). Bethesda, MD: National Institute on Drug Abuse.
Lee, J. G., Goldstein, A. O., Klein, E. G., Ranney, L. M., & Carver, A. M. (2012). Assessment of college and university campus tobacco-free policies in North Carolina. Journal of American College Health, 60(7), 512-519. https://doi.org/10.1080/07448481.2012.690464
Patton, M. Q. (1997). Utilization-focused evaluation: The new century text (3rd ed.). Thousand Oaks, CA: Sage Publications.
Stufflebeam, D. L. (1985). Stufflebeam’s improvement-oriented evaluation. In D. L. Stufflebeam & A. J. Shinkfield (Eds.), Systematic evaluation (pp. 151-207). Boston, MA: Kluwer-Nijhoff.