Executive Summary Report
REAL School Gardens Program Evaluation
2010 - 2011 Baseline and Initial Findings
REAL School Gardens (RSG) is a nonprofit organization that partners with high-poverty schools to create learning gardens intended to become a central part of a school's culture and community. When this program evaluation began in the Spring of 2010, RSG supported 74 elementary schools in 5 urban school districts in North Texas. RSG engaged external evaluators PEER Associates, Inc. to facilitate and implement a highly collaborative, multi-year, utilization-focused program evaluation process. Evaluators first worked with RSG staff to create a Logic Model, a one-page graphic depiction of the RSG program. Then the team built an online survey that allowed educators to report the extent to which they were achieving each of the outcomes described in the Logic Model. The team then adapted the survey for three
distinct uses: informing selection of new school partners as part of the application process, tracking individual educator and whole school outcomes over time, and tailoring professional development events to the appropriate stage of change of participating teachers. This report summarizes data from the first year of surveys as well as interviews conducted in Fall 2010. The final section of this report provides details about data samples and evaluation methods.
Key Participant Outcome Findings
Educators with more RSG training reported significantly higher outcomes on surveys
Spring 2011 survey responses were significantly (p < .01) higher than Fall 2010 responses by the same people on the same questions. For educators who filled out the survey multiple times over the course of the year (N = 64), the average response increased by approximately half a standard deviation for nearly all intended outcomes. Outcomes included: Subject area content knowledge, Belief in gardens as resources for increasing student achievement, Educator engagement, Teaching skills and methods, Teaching tools and resources, and, most compellingly, the combined Index of all individual-level outcomes. The outcome for Time spent teaching outdoors also increased, but the change was not statistically significant. See Figure 1.
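The repeated-measures comparison above can be illustrated with a minimal sketch. All scores below are invented for demonstration only; they are not the evaluation data.

```python
from statistics import mean, stdev

# Hypothetical paired scores: each position is the same (invented)
# educator's survey response in Fall and again in Spring.
fall = [1.5, 2.0, 2.2, 2.6, 3.0, 3.3, 1.8, 2.8]
spring = [1.8, 2.4, 2.4, 2.9, 3.4, 3.5, 2.2, 3.1]

gain = mean(spring) - mean(fall)
effect_size = gain / stdev(fall)  # gain expressed in baseline SD units

print(f"mean gain = {gain:.2f}, effect size = {effect_size:.2f} SD")
```

With these invented numbers the gain works out to roughly half a standard deviation, the magnitude the survey analysis reported.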
Amount of RSG training positively and significantly predicted 11 of 14 intended outcomes. The more contact a survey respondent's school had with RSG, the more likely the respondent was to report a higher stage of change for target behaviors. The statistical effect sizes for school-level outcomes such as school culture were notably smaller (R² = .00 to .09, p < .01) than those for the individual educator-level behaviors (R² = .05 to .15, p < .01; see Figure 2). This is consistent with the RSG Logic Model. It makes sense that RSG would affect individual educators first (and probably more potently), before building up to longer-term effects at the scale of a whole school.
RSG has selected Partner schools that are generally ready, willing, and appreciative
- All interviewees spoke very highly of the RSG program. Statements such as "I think it's a wonderful program that should continue and not lose its focus," and "RSG is an awesome organization that has truly fostered the love for learning" were fairly typical. Perhaps more compelling were the many stories of specific, often inspiring, student results that interviewees attributed to RSG.
- Surveys from newer Partner schools showed higher family and community engagement. This could indicate that RSG recruiting has been increasingly effective at finding new Partner schools that already have a foundational understanding and practice of stakeholder engagement. Recruiting such schools likely increases the leverage of RSG resources invested.
- Training session participants reported high levels of satisfaction and utility. Respondents to surveys administered immediately after training sessions agreed strongly that their RSG training connected directly with their curriculum standards (average 3.8 on a scale of 1 to 4, N = 409), and that they could apply the training immediately (average 3.6 on a scale of 1 to 4, N = 409).
- The average stage of change reported on surveys was "Designing." This suggests that RSG Partner schools to date have been centered in the "strategic sweet spot" that RSG staff are aiming for (i.e. moving schools from the Designing to the Doing stage of change). Looking forward, the RSG Logic Model defines "sustainability" as "60% of educators at the Doing or Deepening stage." In the current baseline data, the three schools with the highest percentages of educators reporting being at the Doing or Deepening stage for teacher practice and school level outcomes were all schools with established RSG relationships of five or more years. For these older RSG Partner schools, the percentage of responses meeting the target threshold was closer to 30% than to the goal of 60%. Future
evaluation will track the progress of newer RSG Partner schools toward the hypothesized sustainability target.
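The Logic Model's sustainability definition above amounts to a simple threshold check, sketched below with invented example responses (only the stage labels named in this report are used).

```python
# "Sustainability" per the RSG Logic Model: 60% of educators
# at the Doing or Deepening stage. Responses here are invented.
SUSTAINABILITY_THRESHOLD = 0.60

def share_at_target(stages):
    """Fraction of educators reporting the Doing or Deepening stage."""
    return sum(s in ("Doing", "Deepening") for s in stages) / len(stages)

responses = ["Designing", "Doing", "Deepening", "Designing", "Doing",
             "Designing", "Designing", "Doing", "Designing", "Deepening"]
share = share_at_target(responses)
print(f"{share:.0%} at target; sustainable: {share >= SUSTAINABILITY_THRESHOLD}")
```

In this hypothetical sample 50% of educators meet the target, so the school would fall short of the 60% threshold, much like the roughly 30% observed at the oldest Partner schools in the baseline data.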
Several themes from interview data complemented and added detail to findings from surveys
- Sustaining garden work takes time and energy. Nearly half of the interviewees made explicit mention of the large amount of "time and care" it takes to build and/or maintain a vibrant garden. One respondent offered the following metaphor: "It's been growth just like the flower's growth over time... Like the daylily, little by little we've seen it grow over the years."
- Gardens sparked awareness of food systems. Several interviewees mentioned that working in their gardens taught kids that "food doesn't just come from the grocery store." Interviewees mentioned vegetable gardening slightly more often than flower gardening.
- Gardens had a calming effect. Unprompted, several interviewees described taking upset students to the garden to soothe them. One said: "...most of the kids that have behavioral problems are better in the garden because they work out their frustrations."
Opportunities for Improvement and Considerations for Practice
- Proceed with confidence that the program is having many effects as intended. The four long-term student impacts named in the RSG Logic Model resonated strongly with all interviewees. The outcomes in the Train Educators strand of the Logic Model all proved susceptible to measurement with surveys, and showed compelling positive results. Early drafts of the Logic Model prompted more precise and robust reflection on program direction, e.g. contributing partially to the decision to suspend community-wide training events. In short, the Logic Model tool appears to be both accurate and useful. While it is important to continually sharpen and adapt the Logic Model to reflect emerging strategies and context, evaluation data thus far do not suggest that wholesale changes are warranted.
- Systematically investigate student outcomes, but go slowly and cautiously. Measuring academic achievement associated with outdoor learning remains a challenge. Many interviewees made strong claims that working in the garden helped students academically by providing hands-on experiences, inspiration, and conceptual contexts, but they struggled to provide hard, systematic evidence of the connection. Many claimed you could "just see" the effect on the kids. There were differing opinions as to whether standardized test scores are a reasonable measure of gains associated with garden work. No obvious best research design emerged for the upcoming RSG evaluation investigation into student outcomes.
- More explicitly support community connections, skills, and resources in trainings. The topic of gardens as catalysts for parent and community involvement came up unsolicited in most interviews. Perhaps the RSG program could provide more focused support for finding, inspiring, and nurturing the community volunteers whom participants claim are so helpful.
- Tap into clubs as a program delivery method. Nearly every interviewee mentioned that some kind of garden club was an inviting and successful vehicle for student involvement in the garden.
- Track the amount of RSG training at the level of individual teachers, not just whole schools. A system of spreadsheet tools is currently being built to allow for subsequent survey analyses to accurately capture the amount of exposure to RSG training that each individual teacher receives. This should substantially increase the power of the statistical analyses. This system will, however, require ongoing, detailed, reliable attention from RSG staff.
Data and Analysis Summary
Over the course of the 2010-2011 school year, the evaluation team collected and analyzed 1,042 educator surveys. Of these, 633 responses represented nearly the full staff of 18 Partner schools invited to fill out the survey in the Fall of 2010. The Partner school survey sample was composed of roughly equal portions of new schools, established schools, and schools with various other types/amounts of RSG involvement. This diverse sample serves as a baseline for many analyses. The remaining 409 survey responses were collected at the conclusion of 29 of the 91 professional development training events conducted by RSG staff throughout the school year. Statistical analyses included linear and sequential regressions (to test the extent to which school-level participation in RSG predicted individual-level self-reported levels
of intended program outcomes), analysis of variance (to test for significant changes over time in cases where repeated group and individual measures were available), and various descriptive presentations.
During the Fall of 2010, evaluators conducted interviews with 12 RSG "champions" from 10 schools representing a wide range of geography, duration of relationship with RSG, and role within the school. Evaluators systematically coded qualitative interview data for theme discovery and relevance, and integrated the findings with results from survey data.