Educational Research Review 25 (2018) 56–72

Effective universal school-based social and emotional learning programs for improving academic achievement: A systematic review and meta-analysis of 50 years of research☆

Roisin P. Corcoran (b,a,∗), Alan C.K. Cheung (c), Elizabeth Kim (d), Chen Xie (e)

a IRINSTITUTES, Dover, DE, USA
b University of Nottingham, Nottingham, UK
c Department of Educational Administration and Policy, Director of the Centre for University and School Partnership (CUSP) at The Chinese University of Hong Kong, Hong Kong
d Johns Hopkins School of Education, 2800 N Charles St, Baltimore, MD 21218, United States
e Department of Educational Administration and Policy at The Chinese University of Hong Kong, Hong Kong

Keywords: Academic achievement; Meta-analysis; Social and emotional learning programs; Systematic review

ABSTRACT

This review explored the research regarding the effects of pre-K–12 school-based social and emotional learning (SEL) interventions on reading (N = 57,755), mathematics (N = 61,360), and science (N = 16,380) achievement. The review focused on research that met the criteria for high methodological standards. Further, the methodological and substantive characteristics of these studies were examined to investigate the association between SEL and study characteristics. Forty qualifying studies of pre-K–12 participants were included in the final analysis. The results of this review found that SEL had a positive effect on reading (ES = +0.25), mathematics (ES = +0.26), and (though small) science (ES = +0.19) compared to traditional methods, consistent with previous reviews.
However, SEL programs from more rigorous randomized studies with large sample sizes that have dominated the classroom over the last few decades might not have as meaningful effects for pre-K–12 students as once thought. More randomized studies are needed to confirm these conclusions.

☆ This study was funded by the Jacobs Foundation (Johns Hopkins University, 2015). However, opinions expressed do not necessarily represent the policies or positions of the Jacobs Foundation.
∗ Corresponding author. E-mail addresses: [email protected] (R.P. Corcoran), [email protected] (A.C.K. Cheung), [email protected] (E. Kim), [email protected] (C. Xie).
https://doi.org/10.1016/j.edurev.2017.12.001
Received 14 July 2017; Received in revised form 10 November 2017; Accepted 14 December 2017; Available online 18 December 2017
1747-938X/© 2017 Published by Elsevier Ltd.

1. Conceptual definitions

There are strong academic, social, and emotional facets of teaching, learning to teach, and student learning in schools (Corcoran, 2018; Corcoran & Tormey, 2012a,b; Corcoran & O'Flaherty, 2017a,b; Zins, Weissberg, Wang, & Walberg, 2004). There is general agreement among educators, policymakers, and the general public that schools have a crucial role in fostering students' cognitive development, but should engage students' social and emotional development as well (Corcoran, 2017a,b; Association for Supervision and Curriculum Development, 2007; Greenberg et al., 2003). There is both national and international evidence to suggest that improving social and emotional learning (SEL) allows students to connect with others and learn more effectively, thereby increasing their chances of success both in school and in later life (Clarke, Morreale, Field, Hussein, & Barry, 2015; Weare & Nind, 2011; Yoshikawa et al., 2015). As such, many countries have endeavored to embed SEL competencies into school culture and ethos (Department for Education and Skills, 2004; Department of Education, 2015; NCCA, 2017; U.S. Department of Education, 2016).

According to the Collaborative for Academic, Social, and Emotional Learning (CASEL), social and emotional development refers to "the process through which children and adults acquire and effectively apply the knowledge, attitudes, and skills necessary to understand and manage emotions, set and achieve positive goals, feel and show empathy for others, establish and maintain positive relationships, and make responsible decisions" (Collaborative for Academic, Social, and Emotional Learning, 2015, p. 5). CASEL further defines social and emotional learning as comprising five core competencies: self-awareness, self-management, social awareness, relationship skills, and responsible decision making (CASEL, 2015).

Social and emotional interventions are influential in schools because research has indicated that investing in students' social-emotional competencies is predictive of better academic outcomes (Domitrovich, Durlak, Staley, & Weissberg, 2017; Durlak, Weissberg, & Pachan, 2010; Durlak, Weissberg, Dymnicki, Taylor, & Schellinger, 2011) and better long-term economic outcomes (Belfield et al., 2015). CASEL indicates that the short-term goals of SEL programs are to promote students' self-awareness, self-management, social awareness, relationship-building, and responsible decision-making skills, as well as to improve student beliefs and attitudes about others, self, and school. These, in turn, provide a foundation for better academic performance and adjustment (CASEL, 2015).
2. Conceptual model

The conceptual model that guides this research assumes that each of the core competencies contributes to increased skills and knowledge, supportive learning environments, and improved attitudes about school, self, and others; that these in turn lead to reduced problem behaviors, reduced emotional stress, improved social behavior, and improved self-esteem; and that each of these factors then helps contribute to improved academic performance in the classroom (CASEL, 2015). However, alternative models suggest that the core competencies change teaching practices and allow for a richer classroom culture, which elevates engagement as well as feelings of security and support, and that this in turn improves academic skills. Other models distinguish between pro-social behavior and performance-related skills derived from SEL, such as attention, regulation, or grit (Duckworth & Yeager, 2015; Farrington et al., 2012). Such SEL performance-related skills are theorized to have a larger impact on academic outcomes than pro-social behavior. Lastly, the impact of SEL on reading and mathematics outcomes is more prevalent in the literature than its impact on science, though the effects of SEL on science achievement have been examined (Challen, Noden, West, & Machin, 2011; Flay, Allred, & Ordway, 2001; Nix, Bierman, Domitrovich, & Gill, 2013).

3. SEL educational policy

Serving and responding to culturally diverse students is a major challenge for schools (Learning First Alliance, 2001). There is some understanding that part of the student population may lack social and emotional competencies and, as a result, will often become less engaged in school. This lack of engagement may have a negative impact on their behavior, health, and academic performance (Blum & Libbey, 2004). Schools are expected to address all of these areas, yet they have limited resources and are under intense pressure to focus on academic performance.
There is growing demand from policy makers that educational policy and practice be guided by evidence of effectiveness. In the United States of America, the recent passage of the Every Student Succeeds Act (ESSA, 2015) represents a landmark in federal policy: for the first time, schools across the nation are being asked to prioritize educating the whole child. In relation to ESSA, the U.S. Department of Education recommends the use of the federal What Works Clearinghouse's (WWC, 2014) criteria when determining whether a program meets ESSA standards for moderate to strong evidence (U.S. Department of Education, 2016). The new law challenges states and districts to enable teachers, administrators, and parents to implement social, emotional, and academic learning. There has also been growing awareness of the importance of early interventions in helping children develop necessary competencies, so the learning and teaching of social and emotional competencies must begin as soon as possible. It is therefore increasingly imperative for educators and policy makers to obtain reliable and accessible reviews of "what works" in education that apply both rigor and replicability in their evaluative approach. One study has attempted to summarize effective programs in SEL (Durlak et al., 2011). However, Durlak et al.'s searches extended only to the end of 2007; since then, the National Science Foundation and the Institute of Education Sciences at the U.S. Department of Education have supported several experimental evaluations of SEL programs. Thus it has become necessary to subject the findings of this research agenda to a systematic review applying reliable, scientifically justified methods.

4. Previous reviews

There are quite a few reviews of SEL or of programs with similar objectives, but in order to weigh the contribution of this work to the research agenda, certain criteria should be assessed: whether the reviews are outdated, lack sufficient focus on the area in question, are restricted by the age group of the students in the sample, or include studies that used non-robust methodologies. A search of the literature found the following relevant reviews. The first review studied universal school-based social information processing interventions for aggressive behavior (Wilson & Lipsey, 2006). This review contained studies published up until 2003 and focused primarily on reducing and preventing violence rather than research specific to social and emotional learning.
Accordingly, that review did not focus on academic (i.e., mathematics, reading, and science) outcomes. A second review examined school-based interventions aimed at reducing bullying and victimization (Farrington & Ttofi, 2009; last updated in March 2010). It assessed studies that measured bullying or peer-aggression outcomes rather than research specific to social and emotional skills; again, it did not focus on academic (i.e., mathematics, reading, and science) outcomes. A third review covered self-control programs aimed at reducing problem behaviors and delinquency (Piquero, Jennings, & Farrington, 2010). That review did not explicitly focus on universal, school-based interventions, restricted its focus to children under the age of 10, and did not focus on academic (i.e., mathematics, reading, and science) outcomes. A fourth review examined mindfulness-based interventions aimed at improving students' achievement in addition to social and emotional functioning (Maynard, Solis, & Miller, 2015). That review restricted its focus to interventions making use of mindfulness tools rather than including interventions that use a wider range of techniques to improve social and emotional learning. Lastly, Spivak, Lipsey, Farran, and Polanin (2015) carried out a review of practices and intervention elements that focus on increasing pro-social behavior in children and youth. Their review did not include interventions aimed at improving social and emotional outcomes, nor did it have an explicit focus on academic (i.e., mathematics, reading, and science) outcomes.
Within the extant literature, there have been several reviews in the area of social and emotional learning (SEL) programs (i.e., Browne, Gafni, Roberts, Byrne, & Majumdar, 2004; Clarke et al., 2015; Payton et al., 2008; Wilson & Lipsey, 2007). Overall, they concluded that SEL programs in general produce small to moderate effects on achievement. Despite this, very few of them focus exclusively on SEL interventions in order to explore their influence on student academic achievement outcomes, and none of these reviews to date explores the effects of SEL interventions on improvement in science specifically. This is important to note, given the importance of students' success in STEM (Duschl, Schweingruber, & Shouse, 2007; Kilpatrick & Quinn, 2009).

While these reviews offer interesting and valuable information regarding effective practices in SEL interventions, many of them do not employ the rigorous inclusion and review methods that have become a standard for quality research reviews in education. For example, 49 of the 94 studies included in Clarke et al.'s (2015) review did not involve a comparison group. There is therefore a distinct possibility that the positive outcomes reported in those studies were not necessarily the result of the intervention.

The most relevant of these studies is Durlak et al.'s (2011) meta-analysis, which focused specifically on school-based programs and their effect on a variety of student outcomes: social and emotional skills and attitudes; positive social behavior; conduct problems; emotional distress; and academic performance, using composite outcomes for mathematics and reading (N = 35). However, Durlak et al.'s searches were conducted only up until the end of 2007, and thus the findings are now considerably dated. Further, Durlak et al. did not include an explicit focus on the separate outcome domains of mathematics, reading, and science.
Outcome domains matter for two reasons. (1) The federal What Works Clearinghouse's (WWC, 2014) conclusions about an intervention's effectiveness are based on impacts within a given domain. For example, if a study examines impacts on science achievement and mathematics achievement, conclusions are drawn about whether the intervention had an impact on science achievement separately from whether it had an impact on mathematics achievement; an SEL intervention could have a positive impact on mathematics and no impact on science. When domains are defined, conclusions can be drawn about impacts on each broad domain (e.g., mathematics achievement, reading achievement) separately. (2) The WWC has topic area protocols, which guide reviews of interventions in different areas and provide additional detail about how the WWC standards are applied in each specific topic area (e.g., interventions for ELLs; beginning reading; middle school mathematics; dropout prevention). Among other things, topic area protocols provide guidance on outcome domains: they specify which outcome domains are examined when the WWC conducts reviews of interventions within the topic area. As part of its reviews, the WWC determines which domain each outcome measure falls into (and outcomes that measure domains not specified by the topic area are not reviewed by the WWC at all). Regardless of how researchers define domains, the WWC classifies measures according to its own definitions for a given topic area. No topic area protocol aggregates mathematics, reading, and science outcomes into a single domain.

Since these previous reviews were published, the review team has become aware of a number of new evaluations of interventions aiming to improve student academic outcomes through universal SEL programming. A current systematic review that includes the more recent published journal articles is therefore needed.
While there are some ongoing or completed narrative reviews that focus on individual programs, on outcomes that are specific social and emotional measures, or on a specific niche of socio-emotional growth, these reviews tend not to be systematic, nor do they evaluate the varied range of social and emotional programs currently offered in schools for pre-K–12 students. The present review is therefore more extensive and inclusive in scope, which allows for the selection of more recent literature that has been emerging in the field.

The present review explores the research on the effects of SEL interventions on academic achievement (i.e., mathematics, reading, and science). Unlike the majority of prior reviews, this study implements a consistent inclusion standard to concentrate on research that maintained a high level of methodological rigor. Moreover, these studies are evaluated on their methodological and substantive characteristics in order to explore the association between interventions and key study characteristics. Lastly, this review seeks to answer a broader set of questions involving the overall effect of school-based SEL interventions, with a focus on comparing the effectiveness of different types of interventions and subgroups, in order to establish whether significant methodological and substantive characteristics are related to greater effects. This objective requires a more extensive review than has yet been attempted in the field.
5. Method

5.1. Search procedures

The current study used techniques outlined by Glass, McGaw, and Smith (1981) and Lipsey and Wilson (2001). In order to locate all studies with the potential to be included, a search of the extant literature for articles written between 1970 and 2016 was conducted. Online searches were carried out using a variety of educational research databases, multiple combinations of key words, web-based repositories, and relevant journals' tables of contents. Table 1 summarizes all key-word phrases, web-based repositories, and journals sourced for this study. In addition, program-name searches were conducted (e.g., "Roots of Empathy", "Promoting Alternative Thinking Strategies", "RULER", "Positive Action", "Incredible Years", "Tools of the Mind"). Grey literature, or research produced by organizations not affiliated with the traditional academic publishing and distribution channels, was identified from intervention websites and research websites. Education publishers' websites and web-based repositories were examined. Further, SEL program developers were contacted to ask whether they were aware of additional studies. Citations from reviews of SEL programs or potentially related topics were further investigated. Lastly, a search of key journals' recent tables of contents was conducted.

5.1.1. Criteria for inclusion

The search identified 611 studies of SEL programs, which were then reviewed according to the criteria below. Many of these criteria were selected to establish a homogeneous sample of SEL intervention studies, such that valid group comparisons could be made across each of the outcomes of interest.
In addition, some criteria were instituted to establish a basic standard of methodological rigor, so that any large variations in effect sizes would not be due to poorly designed or implemented studies. In order to be included (Corcoran & Slavin, 2016), studies had to meet the following criteria.

1. The studies primarily assessed school-based SEL practices and programs. A program was considered a SEL program if it intentionally promoted students' growth across CASEL's five dimensions of social and emotional competence. This criterion was established to ensure that the SEL interventions were similar in their objectives.
2. The studies evaluated an approach whose implementation started during a child's development in grades pre-K through 12.
3. The studies compared children who received instruction in classes with a SEL intervention or some type of SEL practice against children in control classes that used an alternative intervention or standard approaches.
4. Acceptable studies could have been conducted in any country or locale; however, the report had to be available in English.
5. Acceptable studies had to use random assignment or, alternatively, a matching technique with appropriate controls for any potential pre-test differences. Any study that did not include a control group in the design, such as a single-group pre-post comparison, was excluded.
6. Pre-test measures had to be included to ensure initial equivalence. Substantial pre-test differences cannot be properly controlled for, even with analyses of covariance, as the underlying distributions may be fundamentally different (Shadish, Cook, & Campbell, 2002). The WWC guidelines indicate that controlling for relevant covariates, such as a pre-test on the outcome measure, adds methodological rigor to both RCTs and quasi-experimental research designs.
This additional step affords greater precision by reducing bias in parameter estimates and providing a more accurate evaluation of program effectiveness (WWC, 2014). Controlling for pre-test outcome measures is particularly crucial for quasi-experimental research designs, as these covariates might act as confounding factors that could seriously distort the estimated effect of the intervention (WWC, 2014). Even if there is no evidence of pre-test differences, adjusting for them will control for chance variations and improve the precision of the impact estimates (Corcoran, 2017a,b; Flay et al., 2005).
7. The outcome measures had to include quantitative indicators of student performance in reading, mathematics, and/or science.
8. Study duration had to be at least 12 weeks. Twelve weeks is roughly the duration of a semester and represents programs that can be practically implemented over the course of the school year. This criterion kept the review consistent with previous reviews in reading, mathematics, and science (Samdal, Eide, Barth, Williams, & Meland, 2017).
9. Acceptable studies required at least two teachers and a minimum of 15 students in each condition, to avoid confounding treatment effects with potential teacher effects. There was no upper bound on study size, because large-scale studies provide more reliable estimates. All studies modeled the nested data structure (students within classrooms or schools) in their analysis.

Two authors examined each potential study independently, with inter-rater agreement exceeding 90%. When there were disagreements, the authors reexamined the specific studies and reached final agreement. In total, 40 studies met these criteria and were selected.

5.1.2. Study coding

To critique the strengths and weaknesses of the body of evidence and examine the studies' methodological and substantive characteristics, the studies were coded. Methodological features included research design, designated as
Table 1
Key-word phrases, web-based repositories, journals, and program sources.

Key-word phrases: Social and emotional learning & achievement; Social and emotional learning & achievement & effectiveness; Social emotional program & achievement; Social emotional intervention & achievement; Social and emotional learning effectiveness; Social emotional program & evaluation; Social emotional program & academic; Positive youth development & achievement; Positive youth development program & academic; Positive youth development & effectiveness; Positive youth development & evaluation; Positive youth development & competence & schools; Social skills program & achievement; Social skills program & academic; Social skills intervention & academic; Social skills program & effectiveness; Emotional program & academic; Emotional intelligence & academic & intervention; Coping intervention & academic achievement; Coping intervention & effectiveness & schools; Coping intervention & competence & schools; Self-esteem program & academic; Self-esteem program & achievement; Self-esteem intervention & achievement; Empathy intervention & achievement; Empathy & academic & school; Empathy program & academic; Empathy & academic & effectiveness; Empathy & effectiveness & school; Pro-social behavior & effectiveness & school; Pro-social behavior & academic; Pro-social behavior & achievement; Pro-social behavior & evaluation & school; Mindfulness & academic achievement; Mindfulness intervention & effect & school; Mindfulness intervention & evaluation & school; Mindfulness program & academic achievement; Mindfulness program & achievement & school.

Web-based repositories: JSTOR; ERIC; EBSCO; PsycINFO; Dissertation Abstracts; Google Scholar; NREPP; WWC; Blueprints for Healthy Youth; AERA repository.

Journals: American Educational Research Journal; Journal of Educational Psychology; Learning and Instruction; American Journal of Community Psychology; Child Development; Journal of Research in Adolescence; Journal of Consulting and Clinical Psychology; Journal of Primary Prevention; Journal of School Psychology; Journal of Youth and Adolescence; Prevention Science; Psychology in the Schools; School Psychology Review; Review of Educational Research.

Programs: 4-H; 4Rs; Academic and Behavioral Competency Program; Al's Pals; Bully Proofing Your School; CARE (Cultivating Awareness and Resilience in Education); Caring School Community; CharacterPlus Way; Child Development Initiative; Early Years Development; CLASS; Class-Wide Function-based Intervention Teams; Comer School Development Program; Communities That Care; Community of Caring; Competence Support Program; Consistency Management and Cooperative Discipline; Coping Power; DARE to be You; Early Risers 'Skills for Success'; Emotions Course; Expect Respect; Expeditionary Learning; Facing History and Ourselves; Fast Track; First Step to Success; Fourth Step; FRIENDS; Head Start REDI; HighScope; I Can Problem Solve (formerly called Interpersonal Cognitive Problem Solving Skills); Improving Social Awareness Social Problem Solving Project (ISA-SPS); Incredible Years Training Series; Keepin' it REAL; Kia Kaha; Know Your Body; Learning for Life; Lessons in Character; Linking the Interests of Families and Teachers; Lions Quest; Living with a Purpose Self-Determination Program; Love In a Big World; Mate-Tricks; Michigan Model for Health; MindUp.

Note: The list of program names was compiled based on previous reviews of SEL programs and individual articles that mentioned specific programs.
either a randomized controlled trial (RCT) or a quasi-experiment (QE), and sample size, coded as either small (S: N ≤ 250 participants) or large (L: N > 250 participants). Substantive features included grade level, coded as elementary, secondary, or elementary and secondary; program intensity, coded as low, high, or unknown; and socio-economic status, coded as low, high, or unknown.

5.1.3. Effect size calculations and statistical analyses

Techniques proposed by Lipsey and Wilson (2001) and Sedlmeier and Gigerenzer (1989) were employed to calculate effect sizes. Generally, effect sizes were derived from the mean difference between student posttests in each condition after controlling for covariate measures, divided by the unadjusted pooled posttest standard deviation. Methods proposed by Borenstein, Hedges, Higgins, and Rothstein (2009) were employed for categories with smaller numbers of studies. The statistical analyses were completed using Comprehensive Meta-Analysis software (V3) (Borenstein, Hedges, Higgins, & Rothstein, 2016).

6. Results

6.1. Overall effects

As indicated in Table 2, 40 qualified studies of pre-K–12 participants were included in the final analysis: 35 studies included dependent measures of reading performance (N = 57,755), 33 studies included dependent measures of mathematics performance (N = 61,360), and 5 studies included dependent measures of science performance (N = 16,380). Because the science outcome data came from only five sources, some caution should be exercised when interpreting those results. As reported in Table 3, the overall mean effect sizes for reading, mathematics, and science were +0.25, +0.26, and +0.19, respectively.
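The effect-size rule described in Section 5.1.3 (covariate-adjusted posttest mean difference divided by the unadjusted pooled posttest standard deviation) can be sketched in a few lines. This is a minimal illustration with hypothetical numbers, not the authors' pipeline (the review's analyses were run in Comprehensive Meta-Analysis V3); the function names and inputs are this sketch's own.

```python
import math

def pooled_sd(sd_t: float, n_t: int, sd_c: float, n_c: int) -> float:
    """Unadjusted pooled posttest standard deviation of the two groups."""
    return math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                     / (n_t + n_c - 2))

def effect_size(adj_mean_t: float, adj_mean_c: float,
                sd_t: float, n_t: int, sd_c: float, n_c: int) -> float:
    """Covariate-adjusted mean difference divided by unadjusted pooled SD."""
    return (adj_mean_t - adj_mean_c) / pooled_sd(sd_t, n_t, sd_c, n_c)

# Hypothetical study: treatment posttest mean 52.0, control 48.0 (both
# pretest-adjusted), SDs 15 and 16, with 120 and 130 students.
es = effect_size(52.0, 48.0, 15.0, 120, 16.0, 130)
print(round(es, 2))  # -> 0.26
```

With these invented inputs the standardized difference lands near the review's overall reading and mathematics estimates, which is incidental; the point is only the order of operations (adjust means first, pool the raw SDs second).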
This study employed a random-effects model to account for the distribution of effect sizes, which was found to be highly heterogeneous (Q = 1298.02, df = 34, p < .001 for reading; Q = 743.18, df = 32, p < .001 for mathematics; Q = 57.47, df = 4, p < .001 for science). This finding indicates that the variation in effect sizes across these studies was substantially greater than could be explained by random sampling error alone. Because Cochran's Q is a low-power test when the number of studies is small, a significant result here indicates strong heterogeneity in the distribution of effect sizes for all three measures.

6.2. Sensitivity analysis

A sensitivity analysis was conducted to assess whether any extreme outliers may have influenced the results. A "one-study removal" analysis (Borenstein et al., 2009) was performed, and the resulting range of effect sizes fell within the 95% confidence interval of the overall effect size (0.14 to 0.36 and 0.18 to 0.34 for reading and mathematics, respectively). Hence, removing any single effect size would not substantially impact the overall effect size.

6.3. Publication bias

Classic fail-safe N (Rosenthal, 1979) and Orwin's fail-safe N (Orwin & Boruch, 1982) analyses were employed to check for robustness. The classic fail-safe N test indicated that 6414 studies (reading) and 4538 studies (mathematics) with null results would be required for the effect to be nullified. Orwin's test estimates the number of missing null studies that would be needed to bring the mean effect size down to a level considered trivial; this study set the trivial value at 0.01. The results suggest that 901 (reading) and 577 (mathematics) missing null studies would be needed to reduce the overall mean effect size to 0.01.
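The heterogeneity statistic, random-effects pooling, and Orwin's fail-safe N reported above follow standard meta-analytic formulas. A minimal sketch with hypothetical inputs (not the review's data; the function names are this sketch's own) might look like this:

```python
def cochran_q(es, var):
    """Cochran's Q: weighted sum of squared deviations of study effect
    sizes around the fixed-effect mean (weights = 1 / sampling variance)."""
    w = [1.0 / v for v in var]
    mean = sum(wi * ei for wi, ei in zip(w, es)) / sum(w)
    return sum(wi * (ei - mean) ** 2 for wi, ei in zip(w, es))

def random_effects_mean(es, var):
    """DerSimonian-Laird random-effects pooled mean: estimate the
    between-study variance tau^2 from Q, then reweight each study."""
    w = [1.0 / v for v in var]
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (cochran_q(es, var) - (len(es) - 1)) / c)
    w_re = [1.0 / (v + tau2) for v in var]
    return sum(wi * ei for wi, ei in zip(w_re, es)) / sum(w_re)

def orwin_failsafe_n(k, mean_es, trivial=0.01):
    """Missing null studies needed to drag the mean effect of k studies
    down to the 'trivial' threshold (0.01, as in the review)."""
    return round(k * (mean_es - trivial) / trivial)

# Hypothetical: 35 studies averaging ES +0.25 would need
# 35 * (0.25 - 0.01) / 0.01 = 840 null studies to fall to 0.01.
print(orwin_failsafe_n(35, 0.25))  # -> 840
```

A Q value far above its degrees of freedom (k − 1), as reported for all three outcomes, is what motivates the random-effects model: tau² becomes positive and each study's weight shrinks toward equality.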
Overall, these results indicate that publication bias was not driving the observed effect sizes. To supplement this analysis and further address concerns about publication bias, a mixed-effects model was used to test whether significant differences existed between published journal articles and unpublished reports (e.g., technical reports and dissertations). Table 4 indicates that the overall effect sizes for published and unpublished articles for reading were +0.27 and +0.21, respectively. The Q-value statistic (QB = 0.26, df = 1, p = .61) indicates that publication bias was not present in this selection of studies. Table 5 reports the overall effect sizes for published and unpublished articles for mathematics, +0.32 and +0.18, respectively. The Q-value statistic (QB = 2.93, df = 1, p = .09) likewise indicates that publication bias was not present. Hence, the published journal articles had larger, but not significantly larger, effect sizes than the dissertations and technical reports for reading and mathematics. This finding is compatible with previous meta-analyses (see Glass et al., 1981; Lipsey & Wilson, 2001).

6.4. Publication year

Another objective of this study was to determine whether significant differences existed between studies according to publication year. As indicated in Tables 6 and 7, the largest effects on reading were found in studies published during the 1990s and 2000s (ES = +0.33), followed by the 2010s (ES = +0.17). The largest effects on mathematics were found in studies published during the 1990s and 2000s (ES = +0.31), followed by the 2010s (ES = +0.20). However, no significant differences were found for year of publication for reading or mathematics.
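The moderator comparisons above (published vs. unpublished; decade of publication) rest on a between-groups Q statistic: QB is the total Q minus the sum of the within-subgroup Qs, referred to a chi-square distribution with (number of subgroups − 1) degrees of freedom. A hedged sketch with invented subgroup data (not the review's), using only the standard library:

```python
import math

def fixed_effect_q(es, var):
    """Cochran's Q for one set of studies (weights = 1/variance)."""
    w = [1.0 / v for v in var]
    mean = sum(wi * ei for wi, ei in zip(w, es)) / sum(w)
    return sum(wi * (ei - mean) ** 2 for wi, ei in zip(w, es))

def q_between(groups):
    """QB = Q_total - sum(Q_within) for a list of (es, var) subgroups."""
    all_es = [e for es, _ in groups for e in es]
    all_var = [v for _, var in groups for v in var]
    q_within = sum(fixed_effect_q(es, var) for es, var in groups)
    return fixed_effect_q(all_es, all_var) - q_within

# Two invented subgroups standing in for published vs. unpublished studies:
published = ([0.30, 0.25, 0.35], [0.02, 0.03, 0.02])
unpublished = ([0.20, 0.15], [0.04, 0.05])
qb = q_between([published, unpublished])
# With two subgroups, df = 1; the upper-tail p-value for chi-square(1):
p = math.erfc(math.sqrt(qb / 2.0))
print(qb > 0.0, 0.0 < p < 1.0)  # -> True True
```

A non-significant p, as in Tables 4 and 5, means the subgroup means (here, published vs. unpublished) are statistically indistinguishable, which is the basis for the "no publication bias" conclusion above.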
Table 2
SEL approaches: design (type), intervention, N, SES, grade, program intensity, post-test measures, achievement domain, and effect size for each included study.
[The study-by-study entries of this table could not be reliably recovered from the extracted text.]
Note. In the table, studies are listed alphabetically by author. S indicates a small sample (N ≤ 250 participants); L indicates a large sample (N > 250 participants).
Table 3
Overall effect sizes (random model).

              k    ES     SE     95% CI lower   95% CI upper   Z      p     Q (heterogeneity)   df (Q)   p
Reading       35   0.25   0.06   0.14           0.36           4.57   .00   1298.02             34       .00
Mathematics   33   0.26   0.04   0.18           0.34           6.27   .00   743.18              32       .00
Science        5   0.19   0.07   0.05           0.33           2.72   .01   57.47                4       .00

Table 4
Mean effect sizes (reading) in published and unpublished studies.

                Number of studies   Point estimate   Standard error
Published       22                  0.27             0.08
Unpublished     13                  0.21             0.08
Total between (35 studies): QB = 0.26, df = 1, p = .61

Table 5
Mean effect sizes (mathematics) in published and unpublished studies.

                Number of studies   Point estimate   Standard error
Published       19                  0.32             0.07
Unpublished     14                  0.18             0.05
Total between (33 studies): QB = 2.93, df = 1, p = .09

6.5. Methodological characteristics

A study's research design and overall sample size were examined as characteristics that could help explain variation in effect sizes across studies.

6.5.1. Research design
Randomized experiments in education characteristically yield lower effect sizes than quasi-experiments (Cheung & Slavin, 2016). Two groups of research designs were identified in this selection of studies: randomized experiments (N = 22 and N = 20 for reading and mathematics, respectively), in which schools, classes, or students were randomly assigned to comparison groups, and quasi-experiments (N = 13 for both reading and mathematics), in which participants were assigned to comparison groups through matching on key variables. Tables 8 and 9 present the results of the research design analysis. For RCTs and quasi-experiments, the average effect sizes for reading were +0.20 and +0.34, respectively.
The mean effect sizes for randomized experiments and quasi-experiments for mathematics were +0.23 and +0.31, respectively. For both reading and mathematics, the mean effect size for quasi-experimental studies was larger than that for randomized studies, although the difference was non-significant.

6.5.2. Sample size
Research suggests that large studies produce smaller effect sizes than smaller studies (Liao, 1999). As indicated in Tables 10 and 11, there were no statistically significant differences between large and small studies for reading or mathematics. However, the difference was in the direction found in previous studies and was marginally significant (significant at the 0.10 alpha level, p = .06) for mathematics.

6.6. Substantive features

Grade level, program intensity, and SES were examined as potential sources of variation in the models.

Table 6
Mean effect sizes (reading) in studies with different years of publication.

                Number of studies   Point estimate   Standard error
1990s & 2000s   17                  0.33             0.10
2010s           18                  0.17             0.05
Total between (35 studies): QB = 2.18, df = 1, p = .14
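The moderator comparisons reported in these tables (published vs. unpublished, randomized vs. quasi-experimental, and so on) rest on a between-group Q statistic. A minimal, stdlib-only sketch with hypothetical subgroup data is shown below; the two-group p-value helper covers the df = 1 comparisons used here:

```python
import math

def q_between(groups):
    """Between-group heterogeneity statistic Q_B for subgroup comparison.

    groups: list of (effect_sizes, standard_errors) tuples, one per
    subgroup (hypothetical data, not the review's studies).
    Returns (Q_B, df); under the null of equal subgroup means, Q_B
    follows a chi-square distribution with df = number of groups - 1.
    """
    means, totals = [], []
    for es, se in groups:
        w = [1.0 / s**2 for s in se]           # inverse-variance weights
        means.append(sum(wi * e for wi, e in zip(w, es)) / sum(w))
        totals.append(sum(w))
    grand = sum(t * m for t, m in zip(totals, means)) / sum(totals)
    qb = sum(t * (m - grand) ** 2 for t, m in zip(totals, means))
    return qb, len(groups) - 1

def chi2_sf_df1(q):
    """Chi-square survival function for 1 df: P(X >= q) = erfc(sqrt(q/2)).
    Sufficient for the two-group comparisons (df = 1) in these tables."""
    return math.erfc(math.sqrt(q / 2.0))
```

For instance, the QB = 2.93 reported for published vs. unpublished mathematics studies corresponds to p ≈ .09 under a chi-square distribution with 1 df, matching the value in Table 5.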
Table 7
Mean effect sizes (mathematics) in studies with different years of publication.

                Number of studies   Point estimate   Standard error
1990s & 2000s   18                  0.31             0.05
2010s           15                  0.20             0.06
Total between (33 studies): QB = 1.63, df = 1, p = .20

Table 8
Mean effect sizes (reading) in studies with different research designs.

                   Number of studies   Point estimate   Standard error
Randomized         22                  0.20             0.06
Quasi-experiment   13                  0.34             0.10
Total between (35 studies): QB = 1.39, df = 1, p = .24

Table 9
Mean effect sizes (mathematics) in studies with different research designs.

                   Number of studies   Point estimate   Standard error
Randomized         20                  0.23             0.06
Quasi-experiment   13                  0.31             0.06
Total between (33 studies): QB = 1.03, df = 1, p = .31

6.6.1. Grade levels
This collection of studies was categorized into three grade levels: elementary, secondary, and elementary/secondary. The reading and mathematics findings by grade level are provided in Tables 12 and 13. The effect sizes for elementary (N = 21), secondary (N = 7), and elementary/secondary (N = 7) reading were +0.30, +0.20, and +0.14, respectively; the between-group difference (QB = 2.87, df = 2, p = .24) was non-significant. The effect sizes for elementary (N = 18), secondary (N = 8), and elementary/secondary (N = 7) mathematics were +0.24, +0.18, and +0.43, respectively; the between-group difference (QB = 4.89, df = 2, p = .09) was also non-significant.

6.6.2. Program intensity
Program intensity was categorized as low intensity (less than 15 min a day, or less than 75 min a week), high intensity (15 min a day or 75 min a week or more), or unknown. This coding is consistent with previous research reviews (Cheung & Slavin, 2012). The effect sizes for low and high intensity were +0.32 (N = 16) and +0.12 (N = 6) for reading, and +0.32 (N = 14) and +0.15 (N = 4) for mathematics, respectively.
There were no significant differences between these categories for reading (QB = 4.01, df = 2, p = .14) or mathematics (QB = 2.43, df = 2, p = .30). These results suggest that greater use of SEL programs does not inevitably result in improved outcomes.

6.6.3. Socio-economic status (SES)
SES was categorized as low, mixed, high, or unknown. Low SES denotes studies in which more than 50% of the student population received free or reduced-price lunch; high SES denotes studies in which less than 50% did. No difference was found between the high and low SES groups for reading or mathematics (p's > .05).

7. Discussion

The primary aim of this review was to explore the overall effectiveness of school-based SEL interventions on reading, mathematics, and science outcomes in pre-K-12 classrooms. This study employed crucial substantive and methodological

Table 10
Mean effect sizes (reading) in studies with larger and smaller sample sizes.

                Number of studies   Point estimate   Standard error
Small (≤250)     6                  0.21             0.09
Large (>250)    29                  0.25             0.06
Total between (35 studies): QB = 0.17, df = 1, p = .68
Table 11
Mean effect sizes (mathematics) in studies with larger and smaller sample sizes.

                Number of studies   Point estimate   Standard error
Small (≤250)     6                  0.49             0.14
Large (>250)    27                  0.23             0.04
Total between (33 studies): QB = 3.51, df = 1, p = .06

Table 12
Mean effect sizes (reading) in studies with elementary and secondary school samples.

                         Number of studies   Point estimate   Standard error
Elementary               21                  0.30             0.09
Secondary                 7                  0.20             0.06
Elementary & secondary    7                  0.14             0.04
Total between (35 studies): QB = 2.87, df = 2, p = .24

Table 13
Mean effect sizes (mathematics) in studies with elementary and secondary school samples.

                         Number of studies   Point estimate   Standard error
Elementary               18                  0.24             0.07
Secondary                 8                  0.18             0.04
Elementary & secondary    7                  0.43             0.11
Total between (33 studies): QB = 4.89, df = 2, p = .09

moderator variables, including research design, SES, sample size, and program intensity. These moderators were used to establish whether outcomes differed significantly by important study factors. The findings of this study suggest that SEL interventions generally produced positive effects on reading (ES = +0.25) and mathematics (ES = +0.26), and a smaller effect on science (ES = +0.19), in comparison to traditional methods. These findings are consistent with previous research reviews of similar focus. WWC guidelines identify an effect size of 0.25 standard deviations as constituting a meaningful effect (WWC, 2014, p. 19). However, the WWC is careful to differentiate meaningful from statistically significant: an effect size may be meaningful (larger than 0.25) even when the p-value exceeds the 0.05 alpha level. Rather than dismissing such an effect as no different from zero, it should be noted that low sample size may prevent a small effect from being detected. However, the effects may vary by research design.
For example, the mean effect sizes for randomized and quasi-experimental studies for reading were +0.20 and +0.34, respectively. Of the 40 qualified studies that included dependent measures of reading and mathematics performance, 19 and 18, respectively, were large randomized experiments (e.g., Flay, Acock, Vuchinich, & Beets, 2006; Horner et al., 2009; Schonfeld et al., 2015). The qualified studies were spread fairly evenly across years, and no trend toward greater effects in more recent studies was found. The higher quality randomized studies with larger samples found effect sizes ranging from −0.14 to +0.73 for reading (overall randomized effect size +0.20), from −0.22 to +0.81 for mathematics (overall randomized effect size +0.23), and −0.02 for science. The evidence suggests that some SEL interventions that have dominated classrooms over the last few decades might not have meaningful effects on reading, mathematics, and science for pre-K-12 students, given the WWC guideline that a 0.25 SD effect is the standard cutoff for a meaningful effect. Further research is needed to validate these results.

The largest effects among randomized studies with large samples for reading and mathematics included Positive Action. Positive Action is a six-unit curriculum taught in K-12 classrooms to promote positive actions in the physical, intellectual, emotional, and social areas. Over 140 15-minute scripted lessons per grade level are taught 4 days a week. Moreover, a school-wide climate development program, a counselor's program, a parent program, and a community program are part of this comprehensive system. Yet only three of the five included Positive Action studies conducted to date were randomized, and our findings suggest that non-randomized studies of SEL interventions overstate effect sizes.
Overall, more randomized studies are needed to examine the effectiveness of promising programs. Beyond these overall findings, several specific findings are notable. First, approximately 37% (N = 15) of the qualified studies selected for this review were quasi-experiments. Of the qualified studies, only five examined science as an outcome, and only one of these was a randomized experiment. The present findings point to an urgent need for more practical randomized studies of SEL programs for improving reading, mathematics, and particularly science achievement. Second, our findings suggest that larger studies produced smaller effect sizes than smaller studies for mathematics. It is more difficult to sustain high fidelity in large-scale studies, which may explain this finding.
Third, in terms of research design, our results suggest that true experiments produced smaller effect sizes than quasi-experiments. Fourth, there were no significant differential impacts of SEL programs between grade levels; however, this result should be interpreted with caution, as some subgroups contain few studies with which to compare outcomes by grade level. Fifth, in terms of program intensity, our findings suggest that more SEL does not necessarily result in better outcomes. Finally, no differential impact of SEL programs was found for high SES students; again, this result should be interpreted cautiously given the limited number of studies comparing outcomes by high SES.

The moderator analyses in this review did not reveal statistically significant results, but there are some interesting findings. In terms of grade, the effect sizes for secondary and elementary/secondary studies in reading are similar, yet there are large differences between the secondary and elementary/secondary effect sizes for mathematics. This pattern is consistent with tendencies observed here and in previous reviews: published studies usually had higher effect sizes than unpublished ones, small studies and quasi-experiments had higher effect sizes than large studies and randomized experiments, and elementary studies usually had higher effect sizes than secondary studies. However, the results should be interpreted cautiously because the subanalyses include small numbers of studies. The discrepancy in effect sizes between quasi-experimental and experimental studies was expected, but the smaller effect sizes for the RCTs are close to the WWC 0.25 ES threshold for a meaningful effect. A larger number of randomized designs could increase the magnitude of the effect and would certainly increase the likelihood of detecting a significant effect.
Lastly, the Positive Action intervention appears to have substantial promise as an effective SEL program for academic outcomes. If future designs focused on specific successful SEL interventions, such as Positive Action, this would greatly reduce the variation in observed effect sizes. Researchers could agree to standardize SEL interventions found to have strong effects on students as part of a future research agenda. One area of opportunity for this literature is to explore whether SEL program effects are sustained over the long term, or, put differently, whether the intervention effect decays significantly over time. Another opportunity concerns mediating effects. This study's purpose was to identify a direct link between SEL interventions and student academic achievement in the literature. However, the authors recognize that there may be indirect links, operating through SEL interventions' impact on specific social and emotional student outcomes and those factors' direct impact on academic achievement. The effect sizes found here for the direct link between SEL interventions and academic outcomes appear to be small; however, the effect sizes of SEL interventions on other outcomes (e.g., well-being, motivation, and self-regulation) may be meaningful. As more studies explore SEL's potential causal pathways toward impacting academic achievement, a more systematic meta-analysis should be conducted to capture the overall effectiveness of these programs.

7.1. Limitations

Studies with dependent quantitative measures of mathematics, reading, and science were included; however, qualitative research can help to better understand the 'black box' of SEL program interventions. Second, this study centers on standardized tests of reading, mathematics, and science achievement, although other outcomes may also be of importance to policymakers and practitioners.
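Among the limitations discussed in this section is a post hoc power analysis for an independent-samples t-test. A normal-approximation sketch of such a calculation is given below; it is illustrative only (the review does not specify its exact procedure, and exact values would use the noncentral t distribution):

```python
from statistics import NormalDist

def ttest_power(d, n_per_group, alpha=0.05):
    """Approximate post hoc power of a two-sided independent-samples
    t-test for standardized mean difference d, using the normal
    approximation to the noncentral t distribution."""
    z = NormalDist()
    ncp = d * (n_per_group / 2.0) ** 0.5       # noncentrality parameter
    z_crit = z.inv_cdf(1.0 - alpha / 2.0)      # two-sided critical value
    # power = P(reject H0 | true standardized difference d)
    return (1.0 - z.cdf(z_crit - ncp)) + z.cdf(-z_crit - ncp)
```

With small-to-moderate effect sizes and few studies per subgroup, power under this kind of calculation is well below the conventional 0.80 target, which is the point the limitations make.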
In addition, there were possible dependency issues related to multiple studies with overlapping authors that also implemented similar SEL interventions. However, differences in effect size by type of intervention were not examined as a moderator in this study because of the large number of unique SEL programs. Another important moderator not included in this analysis was the time from intervention to testing, along with whether follow-up effects were measured. Lastly, sample sizes too small to detect a significant effect may be a problem. Given the low number of studies for many of these moderators (N < 35) and the relatively small to moderate effect sizes (ES between 0.14 and 0.49), a post hoc power analysis assuming an alpha level of 0.05 for an independent-samples t-test indicates that observed power for these effect sizes ranges from 0.07 to 0.27. Clearly, more studies with larger samples and strong designs such as RCTs are needed to provide more confidence in the estimated effect sizes.

8. Conclusion

The current findings highlight the need for more large-scale, randomized studies focused on effective SEL programs for improving student academic achievement. In addition, districts should implement and continue to evaluate evidence-based SEL programs to enable researchers to make better comparisons of effective SEL programming. Some of the SEL approaches most widely used in schools, tested via large randomized experiments, do not present strong evidence of effectiveness. However, it is the position of this paper that these findings reflect the variety of SEL interventions, which likely vary in their effectiveness on these outcomes. The U.S. Department of Education and funders should further invest in the creation and assessment of new SEL program interventions for improving achievement, and work toward evaluating and establishing a standard of effective SEL interventions to aid researchers.
In sum, more studies are needed to advance this research agenda: large sample sizes will ensure that studies are appropriately powered and generalizable, randomized designs will allow greater confidence in causal claims, and focusing on similar programs will limit scope and allow better comparisons.

Funding

This study was funded by the Jacobs Foundation (Jacobs Foundation Grant 120813, R. P. Corcoran, Principal Investigator).
References

*Allen, J. P., Pianta, R. C., Gregory, A., Mikami, A. Y., & Lun, J. (2011). An interaction-based approach to enhancing secondary school instruction and student achievement. Science, 333, 1034–1037. https://doi.org/10.1126/science.1207998.
American Guidance Service (2001). Group reading assessment and diagnostic evaluation: Technical manual. Circle Pines, MN: American Guidance Service.
Association for Supervision and Curriculum Development (2007). The learning compact redefined: A call to action. A report of the commission on the whole child. Retrieved from http://eric.ed.gov/?id=ED495964.
Ballard, W. S., & Tighe, P. L. (1999). IDEA oral language proficiency test. Brea, CA: Ballard & Tighe Publishers.
*Barnett, W. S., Jung, K., Yarosz, D. J., Thomas, J., Hornbeck, A., Stechuk, R., et al. (2008). Educational effects of the tools of the mind curriculum: A randomized trial. Early Childhood Research Quarterly, 23, 299–313. https://doi.org/10.1016/j.ecresq.2008.03.001.
*Bavarian, N., Lewis, K. M., DuBois, D. L., Acock, A., Vuchinich, S., Silverthorn, N., ... Flay, B. R. (2013). Using social-emotional and character development to improve academic outcomes: A matched-pair, cluster-randomized controlled trial in low-income, urban schools. Journal of School Health, 83, 771–779. https://doi.org/10.1111/josh.12093.
Belfield, C., Bowden, B., Klapp, A., Levin, H., Shand, R., & Zander, S. (2015). The economic value of social and emotional learning. New York, NY: Columbia University.
*Bierman, K. L., Nix, R. L., Greenberg, M. T., Blair, C., & Domitrovich, C. E. (2008). Executive functions and school readiness intervention: Impact, moderation, and mediation in the head start REDI program. Development and Psychopathology, 20, 821–843. https://doi.org/10.1017/S0954579408000394.
*Bierman, K. L., Nix, R. L., Heinrichs, B. S., Domitrovich, C. E., Gest, S. D., Welsh, J. A., et al. (2014).
Effects of head start REDI on children's outcomes 1 year later in different kindergarten contexts. Child Development, 85, 140–159. https://doi.org/10.1111/cdev.12117.
Blum, R. W., & Libbey, H. P. (2004). School connectedness — strengthening health and education outcomes for teenagers. Journal of School Health, 74, 229–299. Retrieved from http://leohchen.com/wordpress/wp-content/uploads/2011/08/School-of-Health-Journal-School-Spirit.pdf.
Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to meta-analysis. West Sussex, UK: Wiley.
Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2016). Comprehensive meta-analysis version 3.0. Retrieved from: https://www.meta-analysis.com/downloads/Meta-Analysis%20Manual%20V3.pdf.
*Braxton Arnason, P. A. (1998). The effects of a comprehensive early intervention program for at-risk students on self-esteem and academic performance variables (Doctoral dissertation). Dissertation Abstracts International Section A, 58, 3842.
*Brigman, G., & Campbell, C. (2003). Helping students improve academic achievement and school success behavior. Professional School Counseling, 7, 91–98. Retrieved from http://www.coe.fau.edu/research/conferences/osac2007/documents/BrigmanCampbell03.pdf.
*Brigman, G., Webb, L., & Campbell, C. (2007). Building skills for school success: Improving the academic and social competence of students. Professional School Counseling, 10, 279–288. https://doi.org/10.5330/prsc.10.3.v850256191627227.
*Brown, J. L. (2003). The direct and indirect effects of a school-based social-emotional learning program on trajectories of children's academic achievement (Doctoral dissertation). Dissertation Abstracts International, 64, 1923.
Browne, G., Gafni, A., Roberts, J., Byrne, C., & Majumdar, B. (2004). Effective/efficient mental health programs for school-age children: A synthesis of reviews. Social Science & Medicine, 58, 1367–1384.
https://doi.org/10.1016/S0277-9536(03)00332-0.
Brownell, R. (2000). Expressive one-word picture vocabulary test manual. Novato, CA: Academic Therapy Publications.
California Department of Education (2010). California standards test: Technical report, spring 2009 administration. Retrieved July 13, 2010, from http://www.cde.ca.gov/ta/tg/sr/documents/csttechrpt09.pdf.
*Campbell, C. A., & Brigman, G. (2005). Closing the achievement gap: A structured approach to group counseling. Journal for Specialists in Group Work, 30, 1–16. https://doi.org/10.1080/01933920590908705.
*Challen, A., Noden, P., West, A., & Machin, S. (2011). UK resilience programme evaluation: Final report (DFE-RR097). Retrieved from the U.K. Department for Education website: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/182419/DFE-RR097.pdf.
Cheung, A. C., & Slavin, R. E. (2012). How features of educational technology applications affect student reading outcomes: A meta-analysis. Educational Research Review, 7(3), 198–215.
Cheung, A., & Slavin, R. E. (2016). How methodological features of research studies affect effect sizes. Educational Researcher, 45(5), 283–292.
Clarke, A. M., Morreale, S., Field, C. A., Hussein, Y., & Barry, M. M. (2015). What works in enhancing social and emotional skills development during childhood and adolescence? A review of the evidence on the effectiveness of school-based and out-of-school programmes in the UK. Retrieved from the U.K. Health Promotion Research Centre website: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/411492/What_works_in_enhancing_social_and_emotional_skills_development_during_childhood_and_adolescence.pdf.
Collaborative for Social and Emotional Learning (2015). CASEL guide: Effective social and emotional learning programs. Retrieved from http://secondaryguide.casel.org/casel-secondary-guide.pdf.
Commonwealth of Virginia (2005). The Virginia standards of learning: Technical report.
Richmond, VA: Commonwealth of Virginia.
Connecticut State Department of Education (2006). CMT/CAPT skills checklist second generation. Retrieved from http://www.naacpartners.org/presentations/workshops/materials/16300.pdf.
*Cook, T. D., Habib, F., Phillips, M., Settersten, R. A., Shagle, S. C., & Degirmencioglu, S. M. (1999). Comer's school development program in Prince George's County, Maryland: A theory-based evaluation. American Educational Research Journal, 36, 543–597. https://doi.org/10.3102/00028312036003543.
*Cook, T. D., Murphy, R. F., & Hunt, H. D. (2000). Comer's school development program in Chicago: A theory-based evaluation. American Educational Research Journal, 37, 535–597. https://doi.org/10.3102/00028312037002535.
Corcoran, R. P. (2018). An embodied cognition approach to enhancing reading achievement in New York City public schools: Promising evidence. Teaching and Teacher Education, 71, 78–85. https://doi.org/10.1016/j.tate.2017.11.010.
Corcoran, R. P. (2017a). Preparing principals to improve student achievement. Child & Youth Care Forum, 46(5), 769–781. https://doi.org/10.1007/s10566-017-9399-9.
Corcoran, R. P. (2017b). Preparing teachers to raise students' mathematics learning. International Journal of Science and Mathematics Education. https://doi.org/10.1007/s10763-017-9819-1.
Corcoran, R. P., & O'Flaherty, J. (2017a). Longitudinal tracking of academic progress during teacher preparation. British Journal of Educational Psychology, 87, 664–682. https://doi.org/10.1111/bjep.12171.
Corcoran, R. P., & O'Flaherty, J. (2017b). Executive function during teacher preparation. Teaching and Teacher Education, 63, 168–175. https://doi.org/10.1016/j.tate.2016.12.023.
Corcoran, R. P., & Slavin, R. E. (2016). Effective programs for social and emotional learning (SEL): A systematic review.
Campbell Library Database of Systematic Reviews. Available from http://campbellcollaboration.org/lib/project/364/.
Corcoran, R. P., & Tormey, R. (2012a). Developing emotionally competent teachers: Emotional intelligence and pre-service teacher education. Oxford, ENG: Lang.
Corcoran, R. P., & Tormey, R. (2012b). How emotionally intelligent are pre-service teachers? Teaching and Teacher Education, 28, 750–759. https://doi.org/10.1016/j.tate.2012.02.007.
*Corsello, M., & Sharma, A. (2015). The building assets-reducing risks program: Replication and expansion of an effective strategy to turn around low-achieving schools: Final report. Retrieved from http://static1.squarespace.com/static/5613cb59e4b009e45cc5c677/t/56fee985b6aa6038541b7c36/1459546503250/Final+report+for+BARR+i3+Development+grant+-+ERIC+upload.pdf.
CTB Macmillan/McGraw-Hill (1992). The California achievement tests (5th ed.). Monterey, CA: CTB Macmillan/McGraw-Hill School Publishing Company.
1 References marked with an asterisk indicate studies included in the meta-analysis.
CTBS/McGraw-Hill (1981). CTBS/4 technical report manual. Monterey, CA: Author.
Department for Education (2013). National pupil database. Retrieved from https://www.gov.uk/government/collections/national-pupil-database.
Department of Education (2015). Department's new priorities. Retrieved from https://www.gov.uk/government/speeches/nicky-morgan-speaks-at-early-intervention-foundation-conference.
Department for Education and Skills (2004). Every child matters: Change for children. London: UK Government. Retrieved from https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/257876/change-for-children.pdf.
*Diperna, J. C., Lei, P., Bellinger, J., & Cheng, W. (2015). Effects of a universal positive classroom behavior program on student learning. Psychology in the Schools, 53, 189–203. https://doi.org/10.1002/pits.21891.
Domitrovich, C. E., Durlak, J. A., Staley, K. C., & Weissberg, R. P. (2017). Social-emotional competence: An essential factor for promoting positive adjustment and reducing risk in school children. Child Development, 88(2), 408–416. https://doi.org/10.1111/cdev.12739.
Duckworth, A. L., & Yeager, D. S. (2015). Measurement matters: Assessing personal qualities other than cognitive ability for educational purposes. Educational Researcher, 44(4), 237–251.
Dunn, L., & Dunn, L. (1997). Peabody picture vocabulary test (3rd ed.). Circle Pines, MN: American Guidance Service.
Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D., & Schellinger, K. B. (2011). The impact of enhancing students' social and emotional learning: A meta-analysis of school-based universal interventions. Child Development, 82, 405–432. https://doi.org/10.1111/j.1467-8624.2010.01564.x.
Durlak, J. A., Weissberg, R. P., & Pachan, M. (2010). A meta-analysis of after-school programs that seek to promote personal and social skills in children and adolescents.
American Journal of Community Psychology, 45, 294–309. https://doi.org/10.1007/s10464-010-9300-6.
Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (2007). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: National Academies Press.
Dynamic Measurement Group (2008). DIBELS 6th edition technical adequacy information (Tech. Rep. No. 6). Eugene, OR: Author.
*Dynarski, M., James-Burdumy, S., Moore, M., Rosenberg, L., Deke, J., & Mansfield, W. (2004). When schools stay open late: The national evaluation of the 21st century community learning centers program: New findings (NCEE 2004-3001). Retrieved from the U.S. Department of Education, National Center for Education Evaluation and Regional Assistance. Washington, DC: U.S. Government Printing Office.
Every Student Succeeds Act (ESSA) of 2015, Pub. L. 114–95 (2015).
*Farran, D. C., Wilson, S. J., Meador, D., Norvell, J., & Nesbitt, K. (2015). Experimental evaluation of the tools of the mind pre-k curriculum: Technical report. Manuscript in progress. Peabody Research Institute, Vanderbilt University.
Farrington, C. A., Roderick, M., Allensworth, E., Nagaoka, J., Keyes, T. S., Johnson, D. W., & Beechum, N. O. (2012). Teaching adolescents to become learners. The role of noncognitive factors in shaping school performance: A critical literature review. Chicago, IL: University of Chicago Consortium on Chicago School Research.
Farrington, D. P., & Ttofi, M. M. (2009). School-based programs to reduce bullying and victimization. Retrieved from https://www.ncjrs.gov/pdffiles1/nij/grants/229377.pdf.
*Flay, B., Acock, A., Vuchinich, S., & Beets, M. (2006). Progress report of the randomized trial of Positive Action in Hawaii: End of third year of intervention. Retrieved from ResearchGate website: https://www.researchgate.net/publication/224942204.
*Flay, B. R., & Allred, C. G. (2003). Long-term effects of the positive action program. American Journal of Health Behavior, 27, 6–21.
https://doi.org/10.5993/ajhb.27.1.s1.2.
*Flay, B. R., Allred, C. G., & Ordway, N. (2001). Effects of the positive action program on achievement and discipline: Two matched-control comparisons. Prevention Science, 2, 71–89. https://doi.org/10.1023/A:1011591613728.
Flay, B. R., Biglan, A., Boruch, R. F., Castro, F. G., Gottfredson, D., Kellam, S., ... Ji, P. (2005). Standards of evidence: Criteria for efficacy, effectiveness and dissemination. Prevention Science, 6, 151–175. https://doi.org/10.1007/s11121-005-5553-y.
Florida Department of Education (2002). Technical report for operational test administrators of the Florida Comprehensive Assessment Test. Tallahassee, FL: Author.
*Freiberg, H. J., Huzinec, C. A., Rubino, C., Borders, K., Williams, L., & Alexander, R. (2011). The effects of a prosocial classroom management program on student achievement and behavior at a reconstituted inner city ninth-grade academy. Paper presented at the meeting of the American Educational Research Association, New Orleans, LA.
*Freiberg, H. J., Huzinec, C. A., & Templeton, S. M. (2009). Classroom management—a pathway to student achievement: A study of 14 inner-city elementary schools. The Elementary School Journal, 110, 63–80. https://doi.org/10.1086/598843.
*Gibbons, L., Foster, J., Owens, J., Caldwell, S. D., & Marshall, J. C. (2006). The CHARACTERplus way results monograph. Building a healthy school community: Experimental evidence that the CHARACTERplus way works. Retrieved from http://edpluscompdata.net/characterplus/Resources/Cplus_Results_MonographRev11_3.pdf.
Glass, G., McGaw, B., & Smith, M. L. (1981). Meta-analysis in social research. Beverly Hills, CA: Sage.
Greenberg, M. T., Weissberg, R. P., O'Brien, M. U., Zins, J. E., Fredericks, L., Resnik, H., et al. (2003). Enhancing school-based prevention and youth development through coordinated social, emotional, and academic learning. American Psychologist, 58, 466–474.
https://doi.org/10.1037/0003-066X.58.6-7.466.
*Hanson, T., Dietsch, B., & Zheng, H. (2012). Lessons in character impact evaluation: Final report (NCEE 2012-4004). Retrieved from http://eric.ed.gov/?id=ED530370.
*Hanson, T., Izu, J. A., Petrosino, A., Delong-Cotty, B., & Zheng, H. (2011). A randomized experimental evaluation of the tribes learning communities prevention program. Retrieved from the National Criminal Justice Reference Service website: https://www.ncjrs.gov/pdffiles1/nij/grants/237958.pdf.
Hawaii State Department of Education (2005). Hawaii content and performance standards III database. Retrieved from http://165.248.72.55/hcpsv3/.
*Hinojosa, T., Bos, J., O'Brien, B., Park, S., Liu, F., Corsello, M., et al. (2016). Starting strong: A randomized controlled trial of the building assets reducing risks (BARR) model in 9th grade. Paper presented at the meeting of the Society for Research on Educational Effectiveness, Washington, DC.
*Horner, R. H., Sugai, G., Smolkowski, K., Eber, L., Nakasato, J., Todd, A. W., et al. (2009). A randomized, wait-list controlled effectiveness trial assessing school-wide positive behavior support in elementary schools. Journal of Positive Behavior Interventions, 11, 133–144. https://doi.org/10.1177/1098300709332067.
Illinois State Board of Education Division of Assessment (2013). Illinois standards achievement test 2013 technical manual. Retrieved from https://www.isbe.net/Documents/isat_tech_2013.pdf.
*James-Burdumy, S., Dynarski, M., Moore, M., Deke, J., Mansfield, W., & Pistorino, C. (2005). When schools stay open late: The national evaluation of the 21st century community learning centers program: Final report (NCEE 2005-3002). Retrieved from the U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. Washington, DC: U.S. Government Printing Office.
Johns Hopkins University (2015).
Effective programs for social and emotional learning: A systematic review (Jacobs Foundation Grant 120813, R. P. Corcoran, Principal Investigator).
*Jones, S. M., Brown, J. L., & Aber, J. L. (2011). Two-year impacts of a universal school-based social-emotional and literacy intervention: An experiment in translational developmental research. Child Development, 82, 533–554. https://doi.org/10.1111/j.1467-8624.2010.01560.x.
Kilpatrick, J., & Quinn, H. (2009). Science and mathematics education: Education policy white paper. Washington, DC: National Academy of Education.
Learning First Alliance (2001). Every child learning: Safe and supportive schools. Retrieved from the National Criminal Justice Reference Service website: https://www.ncjrs.gov/App/Publications/abstract.aspx?ID=203731.
Liao, Y.-K. C. (1999). Effects of hypermedia on students' achievement: A meta-analysis. Journal of Educational Multimedia and Hypermedia, 8, 255–277. Retrieved from http://www.c3l.uni-oldenburg.de/cde/media/readings/liao.pdf.
*Linares, L. O., Rosbruch, N., Stern, M. B., Edwards, M. E., Walker, G., Abikoff, H. B., et al. (2005). Developing cognitive-social-emotional competencies to enhance academic learning. Psychology in the Schools, 42, 405–417. https://doi.org/10.1002/pits.20066.
Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis (Vol. 49). Thousand Oaks, CA: Sage.
Lonigan, C. J., Wagner, R. K., Torgesen, J. K., & Rashotte, C. A. (2007). TOPEL: Test of preschool early literacy. Austin, TX: Pro-Ed.
Maynard, B. R., Solis, M., & Miller, V. (2015). Mindfulness-based interventions for improving academic achievement, behavior and socio-emotional functioning of primary and secondary students: A systematic review. Retrieved from https://campbellcollaboration.org/lib/download/3829/Maynard_Mindfulness_Protocol.pdf.
*Morris, P., Lloyd, C. M., Millenky, M., Leacock, N., Raver, C. C., & Bangser, M. (2013). Using classroom management to improve preschoolers' social and emotional skills: Final impact and implementation findings from the foundations of learning demonstration in Newark and Chicago. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.2202401.
National Center for Education Statistics (n.d.). Early Childhood Longitudinal Program (ECLS). Retrieved from https://nces.ed.gov/ecls/kindergarten.asp.
NCCA, National Council for Curriculum and Assessment (2017). Junior cycle wellbeing guidelines. Dublin: NCCA. Available at http://www.juniorcycle.ie/NCCA_JuniorCycle/media/NCCA/Curriculum/Wellbeing/Wellbeing-Guidelines-for-Junior-Cycle.pdf.
*Nichols-Barrer, I., & Haimson, J. (2013). Impacts of five expeditionary learning middle schools on academic achievement (No. e4330aa3795e4e87a89ea4b5296e5d65). Cambridge, MA: Mathematica Policy Research.
*Nix, R. L., Bierman, K. L., Domitrovich, C. E., & Gill, S. (2013). Promoting children's social-emotional skills in preschool can enhance academic and behavioral functioning in kindergarten: Findings from head start REDI. Early Education & Development, 24, 1000–1019. https://doi.org/10.1080/10409289.2013.825565.
Northwest Evaluation Association (2011). Technical manual for measures of academic progress & measures of academic progress for primary grades. Portland, OR: Northwest Evaluation Association.
Orwin, R. G., & Boruch, R. F. (1982). RRT meets RDD: Statistical strategies for assuring response privacy in telephone surveys. Public Opinion Quarterly, 46, 560–571. https://doi.org/10.1086/268752.
Payton, J., Weissberg, R. P., Durlak, J. A., Dymnicki, A. B., Taylor, R. D., Schellinger, K. B., et al. (2008). The positive impact of social and emotional learning for kindergarten to eighth-grade students: Findings from three scientific reviews (Technical report).
Retrieved from http://files.eric.ed.gov/fulltext/ED505370.pdf.
Pearson (n.d.). Stanford Achievement Test Series, Tenth Edition. Retrieved from https://www.pearsonassessments.com/learningassessments/products/100000415/stanford-achievement-test-series-tenth-edition.html.
Piquero, A. R., Jennings, W. G., & Farrington, D. (2010). Self-control interventions for children under age 10 for improving self-control and delinquency and problem behaviors. Campbell Systematic Reviews, 6(2). https://doi.org/10.4073/csr.2010.2.
*Raver, C. C., Jones, S. M., Li-Grining, C., Zhai, F., Bub, K., & Pressler, E. (2011). CSRP's impact on low-income preschoolers' preacademic skills: Self-regulation as a mediating mechanism. Child Development, 82, 362–378. https://doi.org/10.1111/j.1467-8624.2010.01561.x.
Renaissance Learning (2009). STAR Math technical manual. Wisconsin Rapids, WI: Renaissance Learning.
Renaissance Learning (2010). STAR Reading technical manual. Wisconsin Rapids, WI: Renaissance Learning.
*Rimm-Kaufman, S. E., Fan, X., Chiu, Y.-J., & You, W. (2007). The contribution of the responsive classroom approach on children's academic achievement: Results from a three year longitudinal study. Journal of School Psychology, 45, 401–421. https://doi.org/10.1016/j.jsp.2006.10.003.
*Rimm-Kaufman, S. E., Larsen, R. A. A., Baroody, A. E., Curby, T. W., Ko, M., Thomas, J. B., ... DeCoster, J. (2014). Efficacy of the responsive classroom approach: Results from a 3-year, longitudinal randomized controlled trial. American Educational Research Journal, 51, 567–603. https://doi.org/10.3102/0002831214523821.
Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological Bulletin, 86, 638–641. https://doi.org/10.1037/0033-2909.86.3.638.
Samdal, G. B., Eide, G. E., Barth, T., Williams, G., & Meland, E. (2017).
Effective behaviour change techniques for physical activity and healthy eating in overweight and obese adults: Systematic review and meta-regression analyses. International Journal of Behavioral Nutrition and Physical Activity, 14(1), 14–42. https://doi.org/10.1186/s12966-017-0494-y.
*Scales, P. C., Blyth, D. A., Berkas, T. H., & Kielsmeier, J. C. (2000). The effects of service-learning on middle school students' social responsibility and academic success. The Journal of Early Adolescence, 20, 332–358. https://doi.org/10.1177/0272431600020003004.
*Schinke, S. P., Cole, K. C., & Poulin, S. R. (2000). Enhancing the educational achievement of at-risk youth. Prevention Science, 1, 51–60. https://doi.org/10.1023/A:1010076000379.
*Schonfeld, D. J., Adams, R. E., Fredstrom, B. K., Weissberg, R. P., Gilman, R., Voyce, C., ... Speese-Linehan, D. (2015). Cluster-randomized trial demonstrating impact on academic achievement of elementary social-emotional learning. School Psychology Quarterly, 30, 406–420. https://doi.org/10.1037/spq0000099.
Sedlmeier, P., & Gigerenzer, G. (1989). Do studies of statistical power have an effect on the power of studies? Psychological Bulletin, 105, 309–316. Retrieved from http://library.mpib-berlin.mpg.de/ft/gg/GG_Do%20Studies_1989.pdf.
Seton Testing Services (n.d.-a). The Iowa Tests. Retrieved from http://www.setontesting.com/iowa-tests/.
Seton Testing Services (n.d.-b). The TerraNova/CAT 6 Tests. Retrieved from http://www.setontesting.com/terranova/.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton-Mifflin.
Snyder, F., Flay, B., Vuchinich, S., Acock, A., Washburn, I., Beets, M., & Li, K.-K. (2010). Impact of a social-emotional and character development program on school-level indicators of academic achievement, absenteeism, and disciplinary outcomes: A matched-pair, cluster randomized, controlled trial.
Journal of Research on Educational Effectiveness, 3, 26–55. https://doi.org/10.1080/19345740903353436.
*Somers, M. A., Corrin, W., Sepanik, S., Salinger, T., Levin, J., & Zmach, C. (2010). The enhanced reading opportunities study final report: The impact of supplemental literacy courses for struggling ninth-grade readers (NCEE 2010-4021). Retrieved from the National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Washington, DC: U.S. Government Printing Office.
Spivak, A., Lipsey, M., Farran, D., & Polanin, J. (2015). Practices and program components for enhancing prosocial behavior in children and youth: A systematic review. Retrieved from https://www.campbellcollaboration.org/library/practices-and-program-components-for-enhancing-prosocial-behavior-in-children-and-youth-a-systematic-submitted for publication.html.
Texas Education Agency (2003). Texas Assessment of Academic Skills and college entrance examination performance trends in Texas (Policy Research Report No. 16; Document No. GE04 601 01). Austin, TX: Author.
Texas Education Agency (n.d.). TAKS. Retrieved from https://tea.texas.gov/student.assessment/taks/.
Torgesen, J. K., Wagner, R. K., & Rashotte, C. A. (1999). Test of word reading efficiency. Austin, TX: PRO-ED.
Touchstone Applied Science Associates (TASA) (2002). Degrees of reading power program. Brewster, NY: TASA.
U.S. Department of Education (2016). Non-regulatory guidance: Using evidence to strengthen education investments. Retrieved from https://goo.gl/b4ESW3.
Weare, K., & Nind, M. (2011). Mental health promotion and problem prevention in schools: What does the evidence say? Health Promotion International, 26(suppl 1), i29–i69. https://doi.org/10.1093/heapro/dar075.
*Webb, L. D., Brigman, G. A., & Campbell, C. (2005). Linking school counselors and student success: A replication of student success skills approach targeting the academic and social competence of students.
Professional School Counseling, 8, 407–413. Retrieved from http://eric.ed.gov/?id=EJ714284.
Wechsler, D. (1989). Wechsler preschool and primary scale of intelligence-revised. New York, NY: The Psychological Corporation.
What Works Clearinghouse (WWC) (2014). WWC procedures and standards handbook (Version 3.0). Retrieved from https://ies.ed.gov/ncee/wwc/Docs/referenceresources/wwc_procedures_v3_0_standards_handbook.pdf.
Wilson, S. J., & Lipsey, M. (2006). The effects of school-based social information processing interventions on aggressive behavior: Part I: Universal programs: A systematic review. Campbell Systematic Reviews, 5. https://doi.org/10.4073/csr.2006.5.
Wilson, S. J., & Lipsey, M. J. (2007). Effectiveness of school-based intervention programs on aggressive behavior: Update of a meta-analysis. American Journal of Preventive Medicine, 33(suppl. 2), S130–S143. https://doi.org/10.1016/j.amepre.2007.04.011.
Woodcock, R., & Johnson, M. (1989). Woodcock–Johnson psycho-educational battery-revised. Allen, TX: DLM Teaching Resources.
Woodcock, R. W., McGrew, K. S., & Mather, N. (2001). Woodcock–Johnson III: Tests of achievement. Rolling Meadows, IL: Riverside.
Woodcock, R., & Munoz-Sandoval, A. F. (1996). Bateria Woodcock–Munoz: Pruebas de Habilidad Cognitiva–Revisada. Chicago, IL: Riverside Publishing.
Yoshikawa, H., Leyva, D., Snow, C. E., Treviño, E., Barata, C., Weiland, C., ... Arbour, M. C. (2015). Experimental impacts of a teacher professional development program in Chile on preschool classroom quality and child outcomes. Developmental Psychology, 51, 309–322. https://doi.org/10.1037/a0038785.
Zill, N. (2003). Early math skills test. Rockville, MD: Westat.
Zins, J. E., Weissberg, R. P., Wang, M. C., & Walberg, H. J. (2004). Building academic success on social and emotional learning: What does the research say? New York, NY: Teachers College Press.