ARTICLE INFO

Keywords: Critical thinking; Academic achievement; Community college; Two-year college; Meta-analysis

ABSTRACT

The purpose of this meta-analysis was to examine the relationship between student levels of critical thinking (as established via critical thinking tests) and community college student success. We conducted a meta-analysis to synthesize the extant literature on critical thinking and community college student success. After systematically searching the relevant literature through electronic databases using an array of search terms, we screened studies and reviewed them for inclusion. Included studies were reliably coded using a protocol to extract correlational effect size data and study characteristics. These effect sizes were aggregated meta-analytically. From a total of 23 studies (27 samples, N = 8233), we found that critical thinking was moderately related to community college student success. The relationship between student achievement and levels of critical thinking (as skills or dispositions) was consistent for both nursing and non-nursing students as well as for grades and individual test outcomes; however, students’ levels of critical thinking were more strongly associated with longer-term outcomes than with shorter-term ones. Meta-regression results indicated some evidence that effects were weaker for male and minority community college students. Implications of this study include the importance of cultivating critical thinking for all community college students and the exploration of understudied areas for future research.
1. Introduction
Critical thinking is arguably one of the most valued learning goals in postsecondary education (Gellin, 2003; Pithers & Soden,
2000; Wilson & Wagner, 1981). In an expanding world of information, there is growing need for individuals to process data, evaluate
ideas, and reason through arguments (Pascarella & Terenzini, 2005). Critical thinking is particularly important for community college students engaged in vocational training or preparing for transfer to four-year institutions. Community college enrollment has rapidly
expanded over the last century, affording greater participation in higher education, especially for individuals with limited oppor-
tunities (Goldrick-Rab, 2010; Stevens & Kirst, 2015; Zientek, Yetkiner Ozel, Fong, & Griffin, 2013). In fact, recent statistics indicated
that nearly 40% of students in higher education are enrolled in community colleges (Shapiro, Dundar, Yuan, Harrell, & Wakhungu,
2014). However, despite the recent rise of community colleges, failure to match students to credentials is a persistent issue. Research
has consistently shown low degree completion rates (Fike & Fike, 2008); in fact, only one third of community college students earned
a credential within six years (Calcagno, Bailey, Jenkins, Kienzl, & Leinbach, 2006). Moreover, some reports indicated that almost half
☆ Parts of this paper were presented at the annual meeting of the Council for the Study of Community Colleges. This project has been funded through an American Psychological Association Educational Psychology Early Career Grant.
⁎ Corresponding author at: Department of Curriculum and Instruction, Texas State University, San Marcos, TX 78666, USA.
E-mail address: carltonfong@txstate.edu (C.J. Fong).
http://dx.doi.org/10.1016/j.tsc.2017.06.002
Received 20 October 2016; Received in revised form 10 April 2017; Accepted 6 June 2017
Available online 15 June 2017
1871-1871/ © 2017 Elsevier Ltd. All rights reserved.
C.J. Fong et al. Thinking Skills and Creativity 26 (2017) 71–83
of community college students dropped out within their first year (Xu & Jaggars, 2011).
The preponderance of the literature examining factors affecting community college students’ academic success and persistence
has tended to concentrate on variables such as first-generation status, socioeconomic status, and prior school achievement. Although
it is vital to identify such background factors, the use of prescriptive measures which assess students’ cognitive, motivational, and
behavioral variables affecting access, success, and retention cannot be overlooked (Fong et al., 2017). By identifying variables which
can be enhanced through educational interventions, such as students’ critical thinking skills, educators and practitioners can design
and implement interventions to help students improve in these areas.
Critical thinking is an elusive construct to define and understand, partly due to a variety of definitions in the field (Moore, 2013).
For our study, a working definition of critical thinking is “purposeful, self-regulatory judgment that results in interpretation, analysis,
evaluation, and inference, as well as explanations of the considerations upon which that judgment is based”. Critical thinking has been
conceived of in two ways: skills and disposition. Critical thinking skills involve a set of capacities including interpreting, predicting,
analyzing, and evaluating (Abrami et al., 2008). On the other hand, dispositions refer to an individual's proclivities and personality
characteristics associated with aspects of critical thinking such as curiosity, inquisitiveness, open-mindedness, and prudence in decision-
making (Facione, Facione, & Sanchez, 1994). Some of the most common measurements of critical thinking skills and/or dispositions
are the Watson-Glaser Critical Thinking Appraisal tool (Watson & Glaser, 1964), Cornell Critical Thinking Test (Ennis,
Millman, & Tomko, 1985), Motivated Strategies for Learning Questionnaire (Pintrich, Smith, Garcia, & McKeachie, 1993), and Cali-
fornia Critical Thinking Disposition Inventory (Facione, Facione, & Giancarlo, 2001).
Critical thinking has been used as an appropriate framework for adult education (Garrison, 1992). Applying critical thinking to
adult learners, Brookfield (1987) outlined five phases: a triggering event, appraisal of the situation, exploration and explanation,
development of alternatives, and integration of perspectives. Particularly important for adult learners who prefer self-directed
learning, critical thinking places on the learner the responsibility of making sense of a situation or integrating new ideas with
previous knowledge. In a variety of educational settings, there is evidence that supports how critical thinking increases learning
outcomes (Facione, 2009; Halpern, 1998; Heijltjes, van Gog, Leppink, & Paas, 2014). Evidence from prior meta-analytic reviews has illustrated the positive impact of the college experience on developing critical thinkers (Huber & Kuncel, 2016; McMillan, 1987).
Moreover, another recent meta-analysis has documented effective strategies that target critical thinking. Despite recent syntheses on
critical thinking, the context of the community college and the role of critical thinking in student achievement have yet to be explored
meta-analytically.
In the community college setting overall, critical thinking is considered an ideal learning outcome (Bers, McGowan, & Rubin,
1996). However, there is inconsistent evidence regarding the boon of critical thinking for community college students. For example,
Thompson (2009) examined two groups of two-year college students: a successful group (grades of A/B) and an unsuccessful group
(grades of C or worse). The two groups differed significantly in critical thinking disposition, with the successful group reporting higher levels. In contrast, in a distance-learning class, Puzziferro (2006) compared successful and unsuccessful students on their course grades and critical thinking but found no significant differences. Similarly,
Singleton-Williams (2009) found a non-significant correlation between students’ levels of critical thinking and final course grades in
an introductory computer application class. Whereas some studies show negative effects (e.g., Foust, 2005), others show no effects or positive relations with achievement (e.g., Thompson, 2006). Due to the discrepancies in the literature as well as the
prevalence of critical thinking as a proposed goal of the community college experience, a research synthesis to aggregate previous
findings is timely and important.
One subpopulation among community college students in which critical thinking is particularly salient is nursing students.
Because of the rapidly changing healthcare environment and the need to quickly process and interpret multiple sources of in-
formation, critical thinking is an indisputable characteristic nursing educators desire to cultivate within their students
(Scheffer & Rubenfeld, 2000). Moreover, because community colleges produce a significant proportion of the nurses in our society
(Jacobs & Dougherty, 2006), cultivating critical thinking at the community college level is becoming increasingly important.
A number of factors may influence the relationship between students’ levels of critical thinking and their academic achievement.
Student background characteristics, such as demographic information like gender and ethnicity, have been examined in the literature
on critical thinking; however, the evidence is mixed. Regarding gender, some studies have reported no differences on critical thinking
measures, differences favoring women, or differences favoring men (Giancarlo & Facione, 2001; King, Wood, & Mines, 1990). Re-
garding ethnicity, Terenzini, Springer, Pascarella, and Nora (1995) found that ethnicity (White vs. non-White) was not a significant
predictor of students’ levels of critical thinking skills. Furthermore, a comparison between traditional college students and first-
generation students, who tend to consist of historically underrepresented minorities, yielded no significant differences in critical
thinking skills (Terenzini, Springer, Yaeger, Pascarella, & Nora, 1996). Although ethnicity may not play a direct role in students’
levels of critical thinking skills, there is compelling evidence regarding the influence of diversity and diversity-related experiences
that enhance students’ cognitive and intellectual development (Mayhew et al., 2016). For example, in a meta-analysis by Bowman
(2010), the strongest positive associations between diversity experiences and college student cognitive development were found
when collegians had interpersonal, interracial experiences. To further clarify how gender, ethnicity, and diversity moderate the
relation between students’ levels of critical thinking and their achievement, we planned moderator tests to assess these student
characteristics.
Another important college student characteristic to consider is students’ academic discipline or major. Similarly, the literature
regarding the influence of academic major on critical thinking is mixed (Mayhew et al., 2016). A large number of studies have
demonstrated that students in the physical sciences, life sciences, engineering, and math generally had significantly higher critical
thinking scores than students in the humanities and social sciences (Arum & Roksa, 2011; Brint, Cantwell, & Saxena, 2012; King et al.,
1990). On the other hand, research has indicated no significant differences among majors (Li, Long, & Simpson, 1999). In particular,
Pike, Kuh, and McCormick (2011) found disciplinary differences on higher order thinking among college freshmen, yet these dif-
ferences were not apparent among college seniors. Categorizing majors into practice disciplines (i.e., nursing, education, business)
and nonpractice disciplines (i.e., English, history, psychology), Walsh and Hardy (1999) found that in general, students in non-
practice disciplines had more favorable levels of critical thinking dispositions. In the community college literature, critical thinking
has been frequently studied in the nursing field. Therefore, the current meta-analysis tests disciplinary differences as a moderator, namely between nursing and non-nursing fields, mirroring the contrast of practice discipline versus nonpractice discipline used by Walsh and Hardy (1999).
Our meta-analytic study was guided by two research questions: 1) What is the relationship (direction and magnitude) between
critical thinking and community college student success in the existing literature? 2) What factors explain variation in the re-
lationship between critical thinking and community college student success?
2. Method
Research syntheses aggregate the findings of studies that address the same research question. Bowman (2012) encouraged the use
of quantitative research synthesis, or meta-analysis, in higher education research. Despite the rise of meta-analytic research in
education, use of this methodology in the community college field is scant. In the following sections, we briefly outline the methodological and analytic approaches used in our meta-analysis, which are closely aligned with the methods of Cooper, Hedges, and Valentine (2009) and Cooper (2010).
The first step in conducting the research synthesis was to exhaustively search the literature for studies on critical thinking and
community college success. In order to do so, we used an array of search strategies to comprehensively uncover relevant studies (see
Table 1). We chose a broad set of keyword terms because researchers may not explicitly mention or describe critical thinking. Retrieved studies must have included at least one search term in each of the following three domains: predictors related to critical thinking, populations related to community college, and outcomes related to achievement and persistence. We searched the following
electronic databases: ERIC, PsycINFO, and Proquest Dissertation and Theses Full Text. The literature search did not have an explicit start
date and ended in January 2015. In addition, we conducted a number of hand searches in the most prominent community college
journals, namely, Community College Journal of Research and Practice and Community College Review.
After all citations were retrieved, we screened the title and abstract of each citation for potential relevance to the topic. The full-text documents of the resulting pool of studies were then located and screened against our inclusion criteria (Fig. 1 depicts the search retrieval and screening flow chart). Within each full-text document, we also conducted an ancestry search, examining the reference list for additional relevant studies.
Table 1
Search Strategy.
Domain Keywords
Predictor “critical thinking” OR “creative thinking” OR “study habits” OR “study skills” OR “problem solving” OR “conflict resolution” OR “decision
making” OR “psychosocial factors” OR “time management” OR adjustment OR “communication skills” OR adaptability OR “career development”
OR “vocational maturity” OR self-efficacy OR “student attitudes” OR planning OR conscientiousness OR “work attitudes” OR involvement OR
determin* OR factor* OR variable* OR parameter* OR reason* OR caus* OR correlat* OR antecedent* OR predictor*
Population “community college” OR “two year college” OR “junior college” OR “two-year college” OR undergraduate OR freshman OR sophomore OR
“junior student” OR “senior student” OR “transfer student” OR “vocational school” OR “vocational college” OR “technical school” OR “technical
college” OR “associate’s degree” OR “city college”
Outcome “academic achievement” OR “academic performance” OR grade* OR scholastic OR grade point average OR GPA OR degree OR “college
performance” OR “college achievement” OR mark OR graduation OR completion OR attainment OR “retention” OR “school holding power” OR
“academic persistence” OR attendance OR dropout OR “dropping out” OR “enrollment management” OR “student attrition” OR truancy OR
withdrawal
Note. Asterisks denote truncation, and quotation marks surround a phrase. The domains of keywords were linked by the Boolean operator AND.
Once full-texts were deemed to meet our inclusion criteria, a team of five coders retrieved information on various characteristics
for each of the research reports. Because every study report differs in how much detail it provides, we inferred a number of codes
when necessary by using pre-established definitions to code ambiguous characteristics. The codes were categorized into four do-
mains: research report, sample, predictor variable, and outcome variable.
Research report characteristics. First, we coded characteristics regarding the report, such as the author name, year of the
report, and type of report. We were mainly interested in the type of report to ascertain whether a work was published in a peer-
reviewed journal or not. This allowed us to assess publication bias, an important feature when conducting a meta-analysis (Polanin,
Tanner-Smith, & Hennessy, 2016).
Sample characteristics. Second, we coded for sample characteristics in each report. We coded for the community college setting
as well as demographic characteristics such as age, gender, ethnicity, and socioeconomic status. We also coded for educational
characteristics related to the sample, including program/major, full-time status, and hours worked per week.
Predictor characteristics. Third, we coded aspects of the predictor variable such as the type of psychosocial variable and study
authors’ descriptions of the predictor. We also noted if there was a scale or instrument name, reliability of the measure, and domain.
Two aspects of interest were whether the critical thinking measure described skills or a disposition and whether the measure was self-
reported (i.e., self-efficacy for or use of skills) versus a formal assessment of their critical thinking abilities.
Outcome characteristics. Fourth, we coded characteristics of the outcome. Namely, we were interested in student success
outcomes, which comprise either persistence-related outcomes (retention in community college, degree attainment, or course completion) or achievement-related outcomes (grades, GPA, or test scores). We also coded the duration of the outcome measure (one semester or beyond one semester) to compare shorter-term impacts with longer-term ones.
Coder reliability. All reports were coded independently by five trained coders. The coders had experience in meta-analytic
coding and were extensively trained on each code using the previously mentioned coding frame. For a reliability check, we compared
all pairs of codes for each study between two coders. We calculated a reliability measure by dividing the number of matched codes by
the total number of codes. Disagreements were noted and resolved by another coder. Coder interrater reliability was high.
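The agreement check described above can be sketched as follows; this is a minimal illustration, and the field names and values are hypothetical, not taken from the coding frame.

```python
# A minimal sketch of the percent-agreement reliability check: two coders'
# codes for one study are compared field by field, and reliability is the
# share of matching codes over all coded fields. Data are illustrative.
def percent_agreement(coder_a: dict, coder_b: dict) -> float:
    fields = sorted(coder_a.keys() & coder_b.keys())
    matches = sum(coder_a[f] == coder_b[f] for f in fields)
    return matches / len(fields)

a = {"report_type": "dissertation", "major": "nursing", "ct_type": "skills", "outcome": "GPA"}
b = {"report_type": "dissertation", "major": "nursing", "ct_type": "disposition", "outcome": "GPA"}
print(percent_agreement(a, b))  # 3 of 4 codes match -> 0.75
```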
Effect sizes were computed as Pearson’s r. We converted the correlations to Fisher’s z, conducted our analyses, then converted
back to r. When possible, we extracted correlations and sample sizes from correlation tables or text in the reports. If data were only
available from means, standard deviations, and sample sizes of two groups (e.g., high achievers and low achievers), we estimated a
correlation. When this information was not reported in a study, corresponding inference test statistics (e.g., t-statistic, F-statistic, p-
values, chi-squared test) were used to derive an effect size. If statistical significance was denoted yet both raw data and inferential test
statistics were unavailable, a conservative effect size was derived with an assumed p-value of 0.05.
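The derivations above can be sketched as follows; these are standard conversion formulas (after Borenstein et al., 2009), the function names are ours, and the inputs are illustrative rather than study data.

```python
import math

# Hedged sketches of the effect size derivations described above.
def r_from_groups(m1, sd1, n1, m2, sd2, n2):
    """Estimate r from the means, SDs, and sizes of two groups via Cohen's d."""
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    a = (n1 + n2) ** 2 / (n1 * n2)          # adjusts for unequal group sizes
    return d / math.sqrt(d**2 + a)

def r_from_t(t, df):
    """Recover r from an independent-samples t statistic and its df."""
    return math.copysign(math.sqrt(t**2 / (t**2 + df)), t)

def fisher_z(r):
    """Analyses are run on Fisher's z; back-transform with math.tanh."""
    return math.atanh(r)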
We used a shifting-unit-of-analysis approach (Cooper, 2010) to address the issue of determining what constitutes an independent estimate of effect. This approach involves coding as many effect sizes from each study as exist as a result of variations within the study. We then averaged effects appropriately in each analysis to avoid violating the assumption of independence across data points.
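The averaging step above can be sketched as follows; the study labels and correlations are illustrative. Averaging is done on the Fisher-z scale and the result is reported as r, so each sample enters the overall pooling exactly once.

```python
import math
from statistics import mean

# Sketch of the shifting-unit-of-analysis step: multiple correlations from
# one sample are averaged in Fisher's z so the sample contributes a single
# independent estimate to the overall analysis. Data are illustrative.
study_effects = {
    "Study A": [0.26, 0.28],   # e.g., one skills and one disposition measure
    "Study B": [0.42],
}
one_per_study = {
    study: math.tanh(mean(math.atanh(r) for r in rs))
    for study, rs in study_effects.items()
}
```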
Accounting for sample size, we used weighted procedures to calculate average effect sizes across all comparisons. Each independent effect size was weighted by the inverse of its variance, and the sum of these weighted effect sizes was divided by the sum of the weights (see Cooper et al., 2009). We also calculated 95% confidence intervals (CI) for each effect size.
Synthesis procedures were conducted on Comprehensive Meta-Analysis software (Borenstein, Hedges, Higgins, & Rothstein, 2005).
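The inverse-variance weighting can be sketched as follows under the usual assumption that Fisher's z has variance 1/(n − 3); the correlations and sample sizes below are illustrative, not the study data.

```python
import math

# Minimal sketch of fixed-effect inverse-variance pooling on Fisher's z.
def pooled_r(rs, ns):
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3 for n in ns]                  # inverse of Var(z) = 1/(n - 3)
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    se = math.sqrt(1 / sum(ws))
    ci = (math.tanh(z_bar - 1.96 * se), math.tanh(z_bar + 1.96 * se))
    return math.tanh(z_bar), ci               # back-transform to r

r_mean, ci = pooled_r([0.26, 0.22, 0.42], [78, 172, 41])
```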
Because of inherent measurement error in instruments (in our case, critical thinking measures), we also calculated unattenuated
correlation effect sizes. Using psychometric methods recommended by Borenstein, Hedges, Higgins, and Rothstein (2009), we
converted observed or attenuated correlations to adjusted or unattenuated correlations, using the reliability coefficient of each
critical thinking measure. We then conducted a second overall meta-analysis using the adjusted effect sizes. Using estimated total
variance of the observed effect estimates and the estimated variance of the true effects, we calculated a proportion of variance in the
observed or attenuated effects explained by artifacts of measurement error or sampling error in the critical thinking variable.
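The correction for attenuation can be sketched as follows; this divides the observed correlation by the square root of the critical thinking measure's reliability (the predictor side only, matching the correction described above). The values are illustrative.

```python
import math

# Hedged sketch of the attenuation correction using a scale reliability.
def unattenuate(r_obs: float, reliability: float) -> float:
    return r_obs / math.sqrt(reliability)

print(round(unattenuate(0.26, 0.80), 3))  # -> 0.291
```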
In a research synthesis, there is a theoretical possibility of not obtaining all studies that have investigated the relationship between
critical thinking and community college student success, either due to failure on the part of the meta-analyst to retrieve all relevant
reports or censoring on the part of individual authors. Therefore, we employed Duval and Tweedie’s (2000) trim-and-fill procedure to
assess whether the effect size distribution differed from normally distributed estimates. Trim-and-fill methods impute missing values
that would be present to achieve an approximately normal distribution of effect sizes. Thus, we provided corrected effect sizes when
effect sizes were deemed missing on the right or left side of the distribution to indicate the impact of data censoring on the effect size
distribution.
Analyses were conducted under both random error (RE) assumptions and fixed error (FE) assumptions. Random error assumptions account for variance at the study level (Hedges & Vevea, 1998), which is assumed to be an additional source of random variation, whereas fixed error assumptions do not and instead treat differences among participants as the sole source of error. However, neither model is perfect. Fixed error assumptions do not take into consideration how study-level features contribute random error to a set of correlational data, and Overton (1998) cautioned that random effects models can overestimate error variance and produce overly conservative confidence intervals when their assumptions are violated. Because both models can be flawed, we provide our results under both FE and RE assumptions as a form of sensitivity analysis, as recommended by Greenhouse and Iyengar (1994). If the two sets of results diverge (e.g., a significant result under FE but not under RE), we interpret our findings on the premise that a result significant under both error models is the most robust (Cooper, 2010). This approach follows other meta-analytic studies in the educational and psychological literature (Dent & Koenka, 2016; Patall, Cooper, & Robinson, 2008).
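The FE/RE sensitivity comparison can be sketched as follows, with the between-study variance tau² supplied by the DerSimonian-Laird estimator (one common choice; the paper does not name its estimator). Effects and variances below are illustrative.

```python
import math

# Sketch: the same Fisher-z effects pooled under fixed and random error.
def pool(zs, vs):
    ws = [1 / v for v in vs]
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return z_bar, 1 / sum(ws)                 # pooled z and its variance

def dl_tau2(zs, vs):
    """DerSimonian-Laird estimate of study-level variance tau^2."""
    ws = [1 / v for v in vs]
    z_fe, _ = pool(zs, vs)
    q = sum(w * (z - z_fe) ** 2 for w, z in zip(ws, zs))
    c = sum(ws) - sum(w * w for w in ws) / sum(ws)
    return max(0.0, (q - (len(zs) - 1)) / c)

zs = [0.27, 0.22, 0.45, 0.05]                 # illustrative Fisher-z effects
vs = [1 / 75, 1 / 169, 1 / 38, 1 / 812]       # 1 / (n - 3) for each sample
tau2 = dl_tau2(zs, vs)
z_fe, v_fe = pool(zs, vs)
z_re, v_re = pool(zs, [v + tau2 for v in vs]) # RE adds tau^2 to each variance
```

Under RE, each study's variance is inflated by tau², which widens the confidence interval relative to FE, mirroring the trade-off discussed above.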
Moderator analyses. Effect sizes may vary even if they estimate the same underlying population value. Therefore, homogeneity analyses were necessary to examine whether the observed variance among effect sizes could be accounted for by sampling error alone or whether it also reflected study features. We tested homogeneity of the observed set of effect sizes using a within-class goodness-of-fit statistic (Qw).
A significant Qw statistic suggests that sampling variation alone could not adequately explain the variability in the effect size esti-
mation, so moderators should then be examined (Cooper et al., 2009). Similarly, homogeneity analyses can be used to determine
whether multiple groups of average effect sizes vary more than predicted by sampling error. In this case, statistical differences among
different categories of studies were tested by computing the between-class goodness-of-fit statistic, Qb. A significant Qb statistic
indicates that average effect sizes vary between categories of the moderator variables more than predicted by sampling error alone.
For continuous moderators, we used meta-regression to assess the moderation of continuous variables on the correlation between
critical thinking and student success.
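The homogeneity partition described above can be sketched as follows: the total Q statistic decomposes into within-class (Qw) and between-class (Qb) components across moderator categories. The groupings, effects, and weights are illustrative, not the study data.

```python
# Sketch of the Q-statistic partition for a categorical moderator.
def weighted_mean(zs, ws):
    return sum(w * z for w, z in zip(ws, zs)) / sum(ws)

def q_stat(zs, ws):
    zb = weighted_mean(zs, ws)
    return sum(w * (z - zb) ** 2 for w, z in zip(ws, zs))

groups = {                                    # Fisher-z effects and weights
    "nursing":     ([0.26, 0.22, 0.48], [75, 169, 182]),
    "non_nursing": ([0.42, 0.14, 0.30], [38, 54, 1012]),
}
q_total = q_stat([z for zs, _ in groups.values() for z in zs],
                 [w for _, ws in groups.values() for w in ws])
q_within = sum(q_stat(zs, ws) for zs, ws in groups.values())
q_between = q_total - q_within                # chi-square test, df = groups - 1
```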
3. Results
Our search for relevant studies uncovered 11,832 unique reports. Upon reviewing titles and abstracts, we selected 697 studies for full-text retrieval in order to evaluate them against the inclusion criteria. Our final pool included 27 samples (k) from 23 studies spanning the years 1976–2014. The total sample size of participants across all studies was 8233 (N), with an average sample size of 316.7 per study. Grubbs’ test detected no outliers within this set of effect sizes.
Table 2
Characteristics of Included Studies.
Author (Year) Publication Type Sample Size (Female%) Minority (%) Major/ Program CT Type CT Measure (reliability) Outcome Duration r r ru
Bachman (1998) Dissertation 78 (92%) NR Nursing D CCTDI (0.90) GPA/Grades > 1 semester 0.26 0.27 0.29
S CCTST (0.78) 0.28
Cornelius (2011) Dissertation 172 (93%) 6% Nursing S CTA (0.69*) GPA/Grades > 1 semester 0.22 0.22 0.26
Criner (1992) Dissertation 41 (73%) NR Non-nursing S NJTRS (0.70*) GPA/Grades > 1 semester 0.42 0.42 0.50
Eason (1986) Dissertation 187 (62%) NR Non-nursing S WGCTA (0.77) GPA/Grades 1 semester −0.05 0.06 0.00
Test score > 1 semester 0.16
Foust (2008) Dissertation 21 (62%) 20% Non-nursing S MSLQ (0.16) GPA/Grades 1 semester −0.35 −0.35 −0.88
Gaythwaite (2006) Dissertation 57 (67%) 23% Non-nursing S MSLQ (0.80) GPA/Grades 1 semester 0.14 0.14 0.15
Guster and Batt (1989) Article 50 (58%) NR Non-nursing S WGCTA (0.83) Test score 1 semester 0.14 0.14 0.15
Hearron (1991) Dissertation 294 (77%) 16% Non-nursing S TOLT (0.83) GPA/Grades 1 semester 0.47 0.47 0.51
Hurov (1987) Report 118 (80%) 10% Nursing S WGCTA (0.80) GPA/Grades 1 semester 0.25 0.25 0.28
Kuznar (2009) Dissertation 136 (53%) 10% Nursing S MSLQ (0.79) Test score 1 semester −0.09 −0.09 −0.10
Lee (2001) Dissertation 84 (85%) NR Nursing S WGCTA (0.73) GPA/Grades > 1 semester 0.13 0.13 0.15
Money (1997) Dissertation 68 (59%) 0% Nursing S CCTT (0.62*) GPA/Grades 1 semester −0.01 −0.01 −0.02
56 (59%) 0% Non-nursing 0.13 0.13 0.17
57 (59%) 0% 0.30 0.30 0.38
Parlett (2012) Dissertation 47 (67%) 9% Nursing S MSLQ (0.70*) GPA/Grades > 1 semester 0.49 0.49 0.58
Pitts (2001) Dissertation 161 (85%) 25% Nursing S CCTST (0.62) GPA/Grades 1 semester 0.44 0.30 0.37
> 1 semester 0.39
67 (90%) 26% D CCTDI (0.88) 1 semester 0.16
> 1 semester 0.17
S CCTST (0.62) 1 semester 0.13 0.14 0.15
> 1 semester 0.30
D CCTDI (0.88) 1 semester −0.01
> 1 semester 0.15
228 (86%) 25% S CCTST (0.62) 1 semester 0.30 0.23 0.30
> 1 semester 0.36
D CCTDI (0.88) 1 semester 0.09
> 1 semester 0.17
49 (88%) 48% S CCTST (0.62) 1 semester 0.31 0.30 0.32
> 1 semester 0.34
D CCTDI (0.88) 1 semester 0.28
> 1 semester 0.28
Puzziferro (2006) Dissertation 815 (80%) 26% Non-nursing S MSLQ (0.80) GPA/Grades 1 semester 0.00 0.00 0.00
Reid (2000) Dissertation 417 (86%) 24% Nursing S CCTST (0.71) GPA/Grades > 1 semester 0.22 0.22 0.26
Silver (1999) Dissertation 398 (72%) NR Non-nursing S SSSES (0.88) GPA/Grades > 1 semester 0.65 0.65 0.69
Singleton-Williams (2009) Dissertation 85 (68%) 67% Non-nursing S MSLQ (0.80) GPA/Grades 1 semester 0.09 0.09 0.10
Sisung (2005) Dissertation 1015 (83%) 6% Non-nursing S CCAP (0.82) Test score 1 semester 0.30 0.30 0.33
Thompson (2006) Dissertation 185 (86%) NR Nursing S NET (0.93*) GPA/Grades 1 semester 0.48 0.48 0.50
Thompson (2009) Dissertation 32 (NR) NR Nursing D CCTDI (0.93) GPA/Grades 1 semester 0.62 0.62 0.64
Yost (2003) Dissertation 40 (90%) NR Nursing S MSLQ (0.80) GPA/Grades 1 semester 0.06 0.05 0.05
Test score 1 semester 0.11
Note. Author names are limited to just the surname of the first author; for the full author list, please refer to the reference list. CT = Critical Thinking; NR = not reported; D = Disposition; S = Skills; CCAP = Collegiate
Assessment of Academic Proficiency; CCTDI = California Critical Thinking Disposition Inventory; CCTST = California Critical Thinking Skills Test; CCTT = Cornell Critical Thinking Test; CTA = Critical Thinking Assessment;
MSLQ = Motivated Strategies for Learning Questionnaire; NET = Nurse Entrance Test; NJTRS = New Jersey Test of Reasoning Skills; SSSES = Study Skills Self-efficacy Survey (Text-based Critical Thinking); TOLT = Test of
Logical Thinking; WGCTA = Watson Glaser Critical Thinking Appraisals. Reliabilities of the scales are provided in parentheses after scale abbreviations. When reliabilities for the included study sample were not provided,
reliabilities from the cited literature were included as estimates (these are marked by an *). r = weighted correlation for each unique sample; ru = unattenuated weighted correlation for each unique sample.
Table 3
Overall Meta-Analysis Results of Critical Thinking and Community College Achievement.
Columns: k, r, 95% CI (FE); k, r, 95% CI (RE); Qw
Note. Trim-and-fill procedures were conducted using both fixed effects (FE) and random effects (RE); trimmed studies were to the right of the mean. ***p < 0.001.
3.1. Characteristics of included studies

In this section, we highlight salient characteristics of our pool of included studies (see Table 2 for the comprehensive list
of characteristics of included studies). The first notable characteristic is the large majority of unpublished studies, namely doctoral
dissertations, in our database of studies. Second, regarding student characteristics, there was a large proportion of nursing majors in
the included samples. Third, measures of critical thinking mostly assessed skills, and only a few studies assessed dispositions. Some
studies used measures that asked students to self-report their confidence in employing critical thinking skills (e.g., the Motivated Strategies for Learning Questionnaire, MSLQ). In contrast, other instruments consisted of test questions (e.g., the California Critical Thinking Skills Test), which assess students’ ability to perform on multiple-choice items. All instruments were established scales,
validated by previous research. Lastly, academic achievement outcomes presented in the included studies were course grades or GPA,
with a smaller segment of studies using individual exams as the dependent variable. Outcomes were assessed at the end of a semester
(the duration of one course) or beyond one semester.
3.2. Overall analysis of the relationship between critical thinking and achievement
First, we examined the overall relationship between critical thinking and community college student achievement (see Table 3).
Under a fixed effects model, the weighted average r was 0.26 with a 95% CI of 0.23–0.28. Under a random effects model, the
weighted average r was 0.24 with a 95% CI of 0.15–0.33. Therefore, the hypothesis that the relationship between critical thinking
Fig. 2. Forest plot of studies. The size of points reflects weight in meta-analysis.
and achievement is equal to zero was rejected under both FE and RE models (p < 0.001). See Fig. 2 for forest plot. When meta-
analyzing the unattenuated correlations (the adjusted effect sizes correcting for measurement error using scale reliabilities), the
weighted average effect size was 0.28. The psychometric analysis estimated the proportion of explained variance to be 0.74, indicating that 74% of the total variance in the attenuated effect sizes is explained by sampling and measurement error.
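The pooling and attenuation steps described above can be sketched in a few lines. The study values below are hypothetical stand-ins (not the coded data), and the reliabilities used in the attenuation correction are assumed for illustration:

```python
import math

# Hypothetical (r, n) pairs standing in for coded study samples; the
# actual analysis pooled 27 samples (N = 8233).
studies = [(0.30, 120), (0.18, 250), (0.35, 90), (0.22, 300), (0.28, 150)]

# Fisher z-transform each r; the sampling variance of z is 1/(n - 3),
# so the fixed-effect weight is n - 3.
zs = [math.atanh(r) for r, _ in studies]
ws = [n - 3 for _, n in studies]

# Fixed-effect pooled estimate and 95% CI, back-transformed to r.
z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
se = math.sqrt(1.0 / sum(ws))
r_pooled = math.tanh(z_bar)
ci = (math.tanh(z_bar - 1.96 * se), math.tanh(z_bar + 1.96 * se))

# Attenuation correction for a single observed r: divide by the square
# root of the product of the (assumed, illustrative) scale reliabilities.
r_corrected = 0.24 / math.sqrt(0.85 * 0.80)
```

A random-effects version would add a between-study variance estimate (tau-squared) to each study's sampling variance before weighting, which widens the interval, as reflected in the wider RE confidence interval reported above.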
Using trim-and-fill analyses to examine funnel-plot asymmetry under both fixed- and random-effects models (Duval & Tweedie, 2000), we searched for possible missing effects on the right side of the distribution, which would increase the positive average r between critical thinking and achievement. Using the fixed-effects model, we found evidence that three effect sizes might have
been missing on the right side. Imputing these values would change the correlation between critical thinking and achievement to
r = 0.27 (95% CI = 0.25, 0.30) under fixed effects and r = 0.28 (95% CI = 0.19, 0.36) under random effects. Using the random-
effects model, we found evidence that five effect sizes might have been missing on the right side. Imputing these values would change
the correlation between critical thinking and achievement to r = 0.32 (95% CI = 0.29, 0.34) under fixed effects and r = 0.30 (95%
CI = 0.21, 0.38) under random effects. Thus, when accounting for possible data censoring, the relationship between critical thinking
and community college student achievement is slightly stronger.
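A simplified sketch of the trim-and-fill idea follows. The effects, weights, and the number of missing studies (k0) are hypothetical; real implementations (Duval & Tweedie, 2000) estimate k0 iteratively from rank statistics rather than fixing it, as assumed here:

```python
# Hypothetical Fisher-z effects and fixed-effect weights (w = n - 3).
zs = [0.10, 0.15, 0.20, 0.25, 0.30]
ws = [200.0, 180.0, 150.0, 120.0, 100.0]
k0 = 2  # assumed count of studies "missing" on the right of the funnel

z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)

# Mirror the k0 effects farthest below the pooled mean to the right
# side of the funnel, then re-pool with the imputed effects included.
left = sorted(range(len(zs)), key=lambda i: zs[i] - z_bar)[:k0]
zs_fill = zs + [2 * z_bar - zs[i] for i in left]
ws_fill = ws + [ws[i] for i in left]
z_adj = sum(w * z for w, z in zip(ws_fill, zs_fill)) / sum(ws_fill)
```

As in the analyses reported above, imputing right-side effects pulls the adjusted pooled estimate slightly upward relative to the unadjusted one.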
In addition, the tests of the distribution of the effect sizes revealed that we could reject the hypothesis that the effects were
estimating the same underlying population value, Q(26) = 247.12, p < 0.001; therefore, there was sufficient heterogeneity among
the effect sizes that could be explained by moderator variables, which we will examine in the following section.
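The heterogeneity test reported here is Cochran's Q; a minimal sketch with hypothetical Fisher-z effects and weights (not the paper's data) also shows the related I-squared descriptive:

```python
# Hypothetical Fisher-z effects and fixed-effect weights (w = n - 3).
zs = [0.40, 0.10, 0.37, 0.22, 0.29]
ws = [117.0, 247.0, 87.0, 297.0, 147.0]

z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)

# Cochran's Q: weighted squared deviations from the pooled estimate,
# compared against a chi-square distribution with k - 1 df.
Q = sum(w * (z - z_bar) ** 2 for w, z in zip(ws, zs))
df = len(zs) - 1

# I^2: the share of total variability beyond what sampling error predicts.
I2 = max(0.0, (Q - df) / Q) * 100.0
```

When Q substantially exceeds its degrees of freedom, as in the Q(26) = 247.12 result above, the effects are heterogeneous and moderator analysis is warranted.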
Next, we examined two categories of moderators: categorical moderators, including publication status (published vs. unpublished), student program/major (nursing vs. non-nursing), predictor type (skills vs. disposition), measure type (self-report vs. ability test), outcome type (test score vs. GPA), and outcome duration (one semester vs. beyond one semester); and continuous moderators, namely percent female composition and percent minority student composition (see Table 4 for categorical moderator results). The continuous moderators were used because effect sizes were seldom reported separately by gender or ethnicity; we therefore treated the percentage of each sample that was female or of minority status as a proxy for these demographic variables.
For the first categorical moderator, we found no significant differences between published and unpublished studies, indicating
that there does not seem to be a publication bias in this literature under both fixed and random effects (FE: Q = 0.35, p = 0.55; RE:
Q = 0.11, p = 0.75). See Fig. 3 for funnel plot. Second, we assessed differences between nursing (k = 14) and non-nursing students
(k = 13); non-nursing students consisted of collegians from business, music, computer science, and biology. Under both fixed and
random effects, we did not observe a significant difference between the two groups of students (FE: Q = 1.86, p = 0.17; RE:
Q = 0.13, p = 0.72). Third, we assessed whether the relationship between critical thinking and achievement would significantly
differ if critical thinking was measured as skills (k = 24) or a disposition (k = 4). Moderator tests revealed no significant differences,
under both fixed (Q = 0.62, p = 0.43) and random effects (Q = 0.76, p = 0.38). Fourth, we assessed if self-reported measures versus
formal assessments moderated the relationship between students’ levels of critical thinking and academic achievement. Under fixed
effects only, there was a significant difference between self-reported measures of critical thinking (r = 0.21) and ability tests of
critical thinking (r = 0.28), Q = 6.48, p = 0.011. A significant difference was not detected under random effects, Q = 0.43,
Table 4
Categorical Moderator Analyses for Critical Thinking and Community College Achievement.
(Table body not recovered in extraction; columns report k, r, Qb, and the 95% CI under the fixed effects and random effects models.)
Note. Sample sizes varied per analysis due to the shifting unit of analysis approach and the existence of multiple effect sizes per study; * p < 0.05, ** p < 0.01, *** p < 0.001.
p = 0.46, but mean point estimates were trending in a similar direction to the fixed effects model.
Fifth, we compared the effects of critical thinking on individual test scores such as nursing licensure exams (k = 3) with overall
GPA (k = 24), which also resulted in nonsignificant differences (FE: Q = 0.03, p = 0.87; RE: Q = 0.79, p = 0.38). Lastly, we tested
outcome duration as a moderator variable, one semester (k = 22) compared to beyond one semester (k = 11). Under fixed effects
only, there was a significant difference with a larger average effect size for achievement outcomes beyond one semester (r = 0.33)
and a smaller average effect size for outcomes at one semester (r = 0.18), Q = 34.06, p < 0.001. A significant difference was not
detected under random effects, Q = 0.65, p = 0.41, but the observed mean effect sizes were in a similar direction to the fixed effects
model. Forest plots for categorical moderator analyses are present in supplementary materials.
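The categorical contrasts above rest on a between-groups Q statistic. The sketch below works through a two-group contrast with hypothetical pooled values; the nonsignificant result simply illustrates the mechanics, not the paper's data:

```python
# Hypothetical (z, w) pairs for two subgroups (e.g., nursing vs. not).
g1 = [(0.28, 117.0), (0.22, 247.0)]
g2 = [(0.35, 87.0), (0.30, 147.0)]

def pool(group):
    """Fixed-effect pooled z and total weight for one subgroup."""
    w_tot = sum(w for _, w in group)
    return sum(w * z for z, w in group) / w_tot, w_tot

z1, w1 = pool(g1)
z2, w2 = pool(g2)
z_all = (w1 * z1 + w2 * z2) / (w1 + w2)

# Q_between with 1 df; it exceeds 3.84 only if the two subgroup
# estimates differ by more than sampling error allows at p < .05.
Qb = w1 * (z1 - z_all) ** 2 + w2 * (z2 - z_all) ** 2
```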
For the continuous moderators, we found significant moderation of gender (k = 25) and minority status (k = 17) under fixed
effects only. Specifically, using the percentage of females in the student sample composition, we found a slope value for gender on
effect sizes significantly different from zero (FE: β = 0.25, p = 0.02; RE: β = 0.29, p = 0.42). This result suggests that as the female percentage increases, the relationship between critical thinking and achievement grows stronger. In addition, using the percentage of historically
underrepresented minority students (non-White, non-Asian) in the student sample composition, we found a slope value for minority
status on effect sizes significantly different from zero (FE: β = −0.27, p = 0.02; RE: β = −0.09, p = 0.68). Thus, as the minority percentage increases, the relationship between critical thinking and achievement weakens. However, due to incomplete data reporting in the
primary studies, the sample sizes of these moderator tests were reduced, and results should be interpreted with caution.
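The continuous-moderator tests are fixed-effects meta-regressions. A weighted least squares sketch, with hypothetical effect sizes, weights, and percent-female values, shows how such a slope is obtained:

```python
import math

# Hypothetical (z effect, weight, percent female) triples per sample.
data = [(0.15, 120.0, 40.0), (0.25, 200.0, 60.0), (0.35, 90.0, 85.0),
        (0.20, 250.0, 55.0), (0.30, 150.0, 75.0)]

w_sum = sum(w for _, w, _ in data)
x_bar = sum(w * x for _, w, x in data) / w_sum   # weighted mean moderator
z_bar = sum(w * z for z, w, _ in data) / w_sum   # weighted mean effect

# Weighted least squares slope of effect size on the moderator.
s_xy = sum(w * (x - x_bar) * (z - z_bar) for z, w, x in data)
s_xx = sum(w * (x - x_bar) ** 2 for z, w, x in data)
slope = s_xy / s_xx               # > 0: larger effects in more-female samples
se_slope = math.sqrt(1.0 / s_xx)  # fixed-effects standard error of the slope
```

A random-effects (mixed) meta-regression would again add an estimate of residual between-study variance to each weight, which is why the RE slopes reported above are nonsignificant despite similar point estimates.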
In sum, critical thinking was moderately associated with community college achievement across a variety of student programs and
outcome types with no sign of publication bias. Under fixed effects only, it appeared that critical thinking was more strongly related
to longer-term achievement outcomes beyond the one semester mark. Examining demographic characteristics as moderators revealed
that critical thinking seemed more impactful for women and majority students but under fixed effects only.
4. Discussion
In sum, the results of our meta-analysis indicated positive relationships between critical thinking and community college student
success. According to Cohen’s (1992) benchmarks for evaluating effect sizes, the magnitude of the relationships was small to
moderate. Additionally, when we conducted sensitivity analyses to impute missing effect sizes and to correct for correlations attenuated by measurement error (using the internal consistency of the critical thinking scales), the adjusted weighted average effect sizes were somewhat larger. Given the low rates of degree completion that plague U.S. two-year colleges and the complexity of community college
persistence and achievement, identifying positive factors, such as critical thinking, that are malleable and can bolster student success
is surely worthwhile. Not accounting for other background characteristics, critical thinking skills and dispositions seem to positively
relate with community college student success.
Comparing the relationship between critical thinking and achievement from previous meta-analytic reviews on four-year college
or university students, we observed that the average weighted correlations in our meta-analysis were markedly higher. In a meta-
analysis by Robbins, Lauver, Le, Davis, and Langley (2004), the observed mean correlation between academic-related skills (aca-
demic and cognitive strategies) and GPA was 0.129. Similarly, in an updated review by Richardson, Abraham, and Bond (2012), the
observed average correlation between critical thinking and achievement (correcting for sampling error) was 0.15. The effects in our
sample of community college studies ranged from 0.24 to 0.32, roughly twice the magnitude. One may argue that, compared to the four-year university setting, attaining higher achievement
at community colleges requires students to develop higher levels of critical thinking because of the increased demands of work and
family responsibilities. Terenzini et al. (1996) suggested that college students who work longer hours off-campus and juggle multiple
responsibilities (like many community college students) can cultivate better time management skills, self-discipline, and academic
focus, which in turn lead to greater cognitive development and higher levels of critical thinking.
Furthermore, given the significant vocational focus at the community college level, perhaps critical thinking is more relevant to success because students directly engage in workforce preparation. For instance,
in community colleges, nurses-in-training, who comprised a large portion of the meta-analytic student sample, are taught to reason
critically about the judgments they make in their day-to-day responsibilities and the life-and-death decisions in nursing practice
(Colucciello, 1997). Thus, the demand for high levels of critical thinking for the success of community college nursing students may
explain the larger associations in our meta-analysis compared to previous reviews.
Moderator results. Results from our moderator tests revealed interesting patterns regarding the demographic characteristics of
the community college samples as well as measurement and methodology issues. Before interpreting our moderator results, one caveat concerns contrasts whose findings differ between the fixed and random effects models. When this occurs, the random effects model is considered more robust, so significant findings that hold only under fixed effects should be
interpreted with caution. In general, moderator results in meta-analysis are exploratory given the nature of across-study synthesis.
Also, we reiterate the caution required when interpreting results from these exploratory tests given the small number of studies that
comprise moderator categories.
Regarding student characteristics, the critical thinking-achievement relationship was not moderated by nursing student status, indicating that critical thinking relates to achievement similarly for nursing and non-nursing disciplines. Despite the emphasis on critical thinking in nursing
education, critical thinking seemed to be equally important across majors. Under fixed effects, it is noteworthy that the relationship between students’ levels of critical thinking and achievement weakened in samples with greater male and minority representation. With regard to the smaller correlations in samples with larger proportions of minority students, this result invites an interesting discussion of the impact of racial diversity and diversity experiences
on cognitive development. From the literature, one may expect that as minority composition increases, critical thinking would be
further cultivated and lead to increased achievement due to the positive influence of diversity experiences on students’ levels of
critical thinking (Bowman, 2010). One possible explanation to disentangle this effect comes from research that demonstrated di-
versity experiences to be more impactful for White students’ levels of critical thinking skills compared to non-White students (Loes,
Pascarella, & Umbach, 2012). Although samples with a greater minority student composition, in theory, have more opportunities for
interpersonal diversity experiences, the minority students themselves may not derive as large a benefit for their cognitive development and, thus, academic achievement.
Although these findings were only apparent under fixed effects, it is interesting that the achievement of minority males did not
benefit as much from critical thinking, especially given the attention focused on their remediation in the community college lit-
erature. Harris and Wood (2014) developed the Socio-Ecological Outcomes model to capture the variety of factors that come into play
regarding male minority student success in community colleges. Factors from the individual, academic and environmental contexts,
and campus ethos (and the interactions among them) are proposed to increase student success. Thus, a multitude of other factors
apart from critical thinking skills and dispositions are required to understand all community college students and their success,
particularly minority male collegians.
From a methodological perspective, moderator tests indicated that skills and dispositions were equally related with community
college achievement, under both fixed and random effects. However, given the few studies that measured dispositions, further
research in this area is required. One significant methodological moderator under fixed effects only involved whether critical thinking
was measured via a self-report survey or ability test, with the latter having stronger associations with academic achievement. This
finding suggests that ability tests may be more valid predictors of achievement, as self-reported surveys may introduce
more measurement error. Regarding outcome characteristics, the relationship between critical thinking and achievement was not
moderated by achievement type (GPA/grades or tests/exams); similarly, the small number of studies contributing effect sizes for tests
and exams warrants additional scholarship. Another significant moderator result in our study (under fixed effects) revealed that effect
sizes were larger when achievement was measured beyond one semester compared to at the one semester mark. This suggests that
critical thinking had a stronger relationship with achievement in the long run, possibly explained by the cumulative effects of psychological strategies that develop and strengthen over time (Dent & Koenka, 2016).
Publication bias. Although the moderator analysis comparing published and unpublished studies did not indicate a publication
bias (given the few published studies in our sample), the more notable result is the large majority of unpublished studies in our
dataset. This is indicative of the field of community college research. Crisp, Carales, and Nuñez (2016) observed how despite the long
history of community colleges in the U.S., only since the 1990s has research in this area emerged. They recognized that the majority
of this research may be unpublished, and our study corroborates such findings along with previous meta-analyses (Fong et al., 2017).
Whereas some scholars may scrutinize the heavy use of unpublished data in a meta-analysis, citing lack of quality as the primary
criticism, the accepted and encouraged practice in contemporary meta-analysis in the education field is to include all viable data
regardless of publication status (Polanin et al., 2016). This can aid in mitigating the threat of publication bias.
This study has several important implications for research and practice, but for the purposes of this study, we focus on a few
salient points. First, regarding implications for future research, our systematic synthesis revealed a few gaps in the literature. For
example, our search of the literature exposed a dearth of studies examining the role of critical thinking in other important community college outcomes such as persistence, degree attainment, transfer, and job placement. Only one study (Eason, 1986) measured credits earned and course completion as outcomes in relation to critical thinking, with small or negative correlations
ranging from −0.04 to 0.05. Further research is needed to examine persistence-related outcomes beyond achievement and test
outcomes.
By affirming the positive relationship between critical thinking and community college success, this study offers administrators
and practitioners accumulated and synthesized evidence for informed decision-making and practices. In particular, our findings can
be used to build programs, initiatives, orientations, success courses, and trainings that focus on bolstering critical thinking in
community college students (Piergiovanni, 2014). Given the substantive relationship between students’ levels of critical thinking and
community college student success, one suggestion is to teach students the importance of critical thinking and its relevance for
academic achievement. Previous research has underscored the use of critical thinking courses to further cognitive development
(Mayhew et al., 2016). These courses along with other opportunities can serve as the context for faculty to discuss with students their
understanding of critical thinking, its bearing on their academic performance, and how it can be developed further. Additionally,
critical thinking can also be infused into instructional practices, as recommended by prior meta-analytic work, which emphasized several effective strategies for increasing critical thinking. First, opportunities for discussion or dialogue, in which teachers pose questions in small groups or whole-class instruction, were helpful. Second, exposing students to authentic tasks involving role playing or applied problem solving promoted critical thinking. Third, combining these two strategies with mentorship appeared to further augment the critical thinking process. Furthermore, when measuring students’ levels of critical thinking, formal ability tests such as the Watson-Glaser Critical Thinking Appraisal may be preferable to self-report surveys, given the stronger associations between tests and achievement.
However, from our moderator results, extra supports beyond critical thinking training need to be offered for collegiate men and
students of color who may not benefit as much from boosts in critical thinking. In general, the small to moderate correlations within
this study, especially the smaller correlations for samples with greater proportions of male and minority students, demonstrate the
need to examine other factors that influence community college achievement.
Finally, we hope our study provides an example and encouragement for the use of meta-analysis as a methodological tool to
understand the landscape of higher education research. Systematic review and synthesis techniques can afford researchers, admin-
istrators, educators, and practitioners the ability to confidently evaluate, measure, target, and change the most important factors
related to community college student outcomes.
There were a number of limitations to this study. The first, common to all research syntheses, is that they rest on synthesis-generated evidence, which should not be interpreted as supporting causal relationships (Cooper, 2010). For example, our meta-analysis cannot establish that critical thinking causes higher achievement, because high-achieving students may simply possess greater critical thinking. Therefore, when significant differences were found within the synthesis, results should be interpreted
with caution and used to direct future research of these factors in a controlled design to appropriately infer causal impact. It is also
important to recognize that the findings were based on small numbers of effect sizes, making it difficult to place great confidence in
the magnitude of the estimated effects.
In addition, meta-analysts depend on the authors of the primary studies to provide all relevant data and characteristics. Thus,
there were many important variables that could not be examined as moderators due to incomplete reporting. Sample characteristics
such as age, full-time status, hours working, and distance versus traditional learning modes (which are very significant factors when
studying community college students) could not be assessed. One important research design issue that was not inferable from the
included studies was the timeframe of critical thinking assessment. A researcher could foreseeably assess critical thinking at the
beginning of community college as a predictor for success or towards the end of their schooling. The latter situation would be
confounded with the influence of postsecondary instruction over the duration of the college experience (Huber & Kuncel, 2016).
Future research is encouraged to systematically examine these factors to see their moderating influence on the relationship between
students’ levels of critical thinking and postsecondary student success.
Another limitation is the psychometric approach taken toward operationalizing critical thinking in this study. Liu, Frankel, and Roohr (2014) argued for balancing authenticity with psychometric quality, noting that measures relying on multiple-choice tests do not fully capture the complexities of critical thinking. Rather, measures that prioritize psychometric
quality may be tapping into other related constructs such as reading comprehension or general linguistic competencies. Although the
current study has solely focused on these types of measures, we encourage future research to examine other ways of understanding
and measuring critical thinking that reflect the authenticity of both the construct and the contexts in which students use their critical
thinking skills.
Supplementary data associated with this article can be found, in the online version, at http://dx.doi.org/10.1016/j.tsc.2017.06.
002.
References
Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Tamim, R., et al. (2008). Instructional interventions affecting critical thinking skills and
dispositions: A stage 1 meta-analysis. Review of Educational Research, 78, 1102–1134. http://dx.doi.org/10.3102/0034654308326084.
Arum, R., & Roksa, J. (2011). Academically adrift: limited learning on college campuses. University of Chicago Press.
*Bachman, M. L. (1998). Anxiety, critical thinking and age as performance predictors of community college nursing students (Unpublished doctoral dissertation). Fort Collins, CO: Colorado State University.
Bers, T. H., McGowan, M., & Rubin, A. (1996). The disposition to think critically among community college students: The California critical thinking dispositions
inventory. The Journal of General Education, 45, 197–223.
Borenstein, M., Hedges, L., Higgins, J., & Rothstein, H. (2005). Comprehensive meta-analysis (Version 2.1) [Computer software]. Englewood, NJ: BioStat.
Borenstein, M., Hedges, L., Higgins, J., & Rothstein, H. (2009). Introduction to meta-analysis. West Sussex, UK: Wiley & Sons.
Bowman, N. A. (2010). College diversity experiences and cognitive development: A meta-analysis. Review of Educational Research, 80, 4–33. http://dx.doi.org/10.
3102/0034654309352495.
Bowman, N. A. (2012). Effect sizes and statistical methods for meta-analysis in higher education. Research in Higher Education, 53, 375–382. http://dx.doi.org/10.
1007/s11162-011-9232-5.
Brint, S., Cantwell, A. M., & Saxena, P. (2012). Disciplinary categories majors, and undergraduate academic experiences: Rethinking Bok’s underachieving colleges
thesis. Research in Higher Education, 53, 1–25.
Brookfield, S. D. (1987). Developing critical thinkers. San Francisco, CA: Jossey-Bass.
Calcagno, J., Bailey, T., Jenkins, D., Kienzl, G., & Leinbach, D. T. (2006). Is student right-to-know all you should know? An analysis of community college graduation
rates. Research in Higher Education, 47, 491–519. http://dx.doi.org/10.1007/s11162-005-9005-0.
Cohen, J. (1992). A power primer. Psychological Bulletin, 112, 155–159. http://dx.doi.org/10.1037/0033-2909.112.1.155.
Colucciello, M. L. (1997). Critical thinking skills and dispositions of baccalaureate nursing students: A conceptual model for evaluation. Journal of Professional Nursing,
13, 236–245.
Cooper, H., Hedges, L. V., & Valentine, J. C. (Eds.). (2009). The handbook of research synthesis and meta-analysis (2nd ed.). New York: Russell Sage Foundation.
Cooper, H. (2010). Synthesizing research: A guide for literature reviews (4th ed.). Thousand Oaks, CA: Sage.
*Cornelius, S. T. (2011). Prior grade point average, social and financial support, and standardized test scores as predictors of success among associates degree nursing students
(Unpublished doctoral dissertation). Ann Arbor, MI: Walden University.
*Criner, L. A. (1992). Teaching thinking and reasoning: A study of critical thinking in adults (Unpublished doctoral dissertation). Fayetteville, AR: University of Arkansas.
Crisp, G., Carales, V. D., & Núñez, A. M. (2016). Where is the research on community college students? Community College Journal of Research and Practice, 40, 767–778.
Dent, A. L., & Koenka, A. C. (2016). The relation between self-regulated learning and academic achievement across childhood and adolescence: A meta-analysis.
Educational Psychology Review, 28, 425–474. http://dx.doi.org/10.1007/s10648-015-9320-8.
Duval, S., & Tweedie, R. (2000). A nonparametric trim and fill method of accounting for publication bias in meta-analysis. Journal of the American Statistical Association,
95, 89–98. http://dx.doi.org/10.1080/01621459.2000.10473905.
*Eason, L. E. (1986). The relationship of critical thinking skills and psychological type in community college students’ responses to science instruction (Unpublished doctoral
dissertation). Gainesville, FL: University of Florida.
Ennis, R. H., Millman, J., & Tomko, T. N. (1985). Cornell critical thinking tests level X & level Z: Manual. Pacific Grove, CA: Midwest Publications.
Facione, N. C., Facione, P. A., & Sanchez, C. A. (1994). Critical thinking disposition as a measure of competent clinical judgment: The development of the California
Critical Thinking Disposition Inventory. Journal of Nursing Education, 33(8), 345–350.
Facione, P. A., Facione, N. C., & Giancarlo, C. A. F. (2001). California critical thinking disposition inventory: CCTDI. California Academic Press.
Facione, P. A. (2009). Critical thinking: What it is and why it counts. [Retrieved from http://www.insightassessment.com/9articles%20WW.html].
Fike, D. S., & Fike, R. (2008). Predictors of first-year student retention in the community college. Community College Review, 36, 68–88. http://dx.doi.org/10.1177/
0091552108320222.
Fong, C. J., Davis, C. W., Kim, Y., Kim, Y. W., Marriott, L., & Kim, S. (2017). Psychosocial factors and community college success: A meta-analytic investigation. Review
of Educational Research, 87, 388–424. http://dx.doi.org/10.3102/0034654316653479.
*Foust, R. A. (2008). Learning strategies, motivation, and self-reported academic outcomes of students enrolled in web-based coursework (Unpublished doctoral dissertation). Detroit, MI: Wayne State University.
Garrison, D. R. (1992). Critical thinking and self-directed learning in adult education: an analysis of responsibility and control issues. Adult Education Quarterly, 42(3),
136–148.
*Gaythwaite, E. S. (2006). Metacognitive self-regulation, self-efficacy for learning and performance, and critical thinking as predictors of academic success and course retention
among community college students enrolled in online, telecourse, and traditional public speaking courses (Unpublished doctoral dissertation). Orlando, FL: University of
Central Florida.
Gellin, A. (2003). The effect of undergraduate student involvement on critical thinking: A meta-analysis of the literature 1991–2000. Journal of College Student
Development, 44, 746–762. http://dx.doi.org/10.1353/csd.2003.0066.
Giancarlo, C. A., & Facione, P. A. (2001). A look across four years at the disposition toward critical thinking among undergraduate students. Journal of General
Education, 50, 29–55.
Goldrick-Rab, S. (2010). Challenges and opportunities for improving community college student success. Review of Educational Research, 80, 437–469. http://dx.doi.
org/10.3102/0034654310370163.
Greenhouse, J. B., & Iyengar, S. (1994). Sensitivity analysis and diagnosis. In H. Cooper, & S. Iyengar (Eds.), Handbook of research synthesis (pp. 383–398). New York:
Russell Sage Foundation.
*Guster, D., & Batt, R. (1989). Cognitive and affective variables and their relationships to performance in a lotus 1–2-3 class. Collegiate Microcomputer, 7, 151–156.
Halpern, D. F. (1998). Teaching critical thinking for transfer across domains: Dispositions, skills, structure training, and metacognitive monitoring. American
Psychologist, 53, 449–455. http://dx.doi.org/10.1037/0003-066X.53.4.449.
Harris, F., III, & Wood, J. L. (2014). The socio-ecological outcomes model: A framework for examining men of color’s experiences and success in community colleges. San Diego,
CA: Minority Male Community College Collaborative.
*Hearron, M. C. (1991). The predictive effect of logical thinking, prior knowledge, and learning style characteristics on academic achievement in an anatomy and physiology
course (Unpublished doctoral dissertation). Commerce, TX: East Texas State University.
Hedges, L. V., & Vevea, J. L. (1998). Fixed-and random-effects models in meta-analysis. Psychological Methods, 3, 486–504.
Heijltjes, A., Van Gog, T., Leppink, J., & Paas, F. (2014). Improving critical thinking: Effects of dispositions and instructions on economics students’ reasoning skills.
Learning and Instruction, 29, 31–42. http://dx.doi.org/10.1016/j.learninstruc.2013.07.003.
Huber, C. R., & Kuncel, N. R. (2016). Does college teach critical thinking? A meta-analysis. Review of Educational Research, 86, 431–468. http://dx.doi.org/10.3102/
0034654315605917.
*Hurov, J. T. (1987). A study of the relationship between reading, computational, and critical thinking skills and academic success in fundamentals of chemistry (Report). St. Louis, MO: Saint Louis Community College.
Jacobs, J., & Dougherty, K. J. (2006). The uncertain future of the community college workforce development mission. New Directions for Community Colleges,
2006(136), 53–62.
Note: References denoted with an * were included in the meta-analysis.
King, P. M., Wood, P. K., & Mines, R. A. (1990). Critical thinking among college and graduate students. The Review of Higher Education, 13, 167–186.
*Kuznar, K. A. (2009). Effects of high-fidelity human patient simulation experience on self-efficacy, motivation and learning of first semester associate degree nursing students.
Minneapolis, MN: University of Minnesota.
*Lee, G. J. (2001). An investigation of the development of critical thinking skills for traditional program and non-traditional program associate degree nursing students
(Unpublished doctoral dissertation). Moscow, ID: University of Idaho.
Li, G., Long, S., & Simpson, M. E. (1999). Self-perceived gains in critical thinking and communication skills: Are there disciplinary differences? Research in Higher
Education, 40, 43–60. http://dx.doi.org/10.1023/A:1018722327398.
Liu, O. L., Frankel, L., & Roohr, K. C. (2014). Assessing critical thinking in higher education: Current state and directions for next-generation assessment. ETS Research
Reports Series, 2014, 1–23. http://dx.doi.org/10.1002/ets2.12009.
Loes, C., Pascarella, E., & Umbach, P. (2012). Effects of diversity experiences on critical thinking skills. Journal of Higher Education, 83, 1–25.
Mayhew, M. J., Rockenbach, A. N., Bowman, N. A., Seifert, T. A. D., Wolniak, G. C., Pascarella, E. T., et al. (2016). How college affects students: 21st century evidence that
higher education works, Vol. 3. San Francisco, CA: Jossey-Bass.
McMillan, J. H. (1987). Enhancing college students’ critical thinking: a review of studies. Research in Higher Education, 26, 3–29.
Money, S. M. (1997). The relationship between critical thinking scores, achievement scores, and grade point average in three different disciplines (Unpublished doctoral dissertation). Lansing, MI: Michigan State University.
Moore, T. J. (2013). Critical thinking: Seven definitions in search of a concept. Studies in Higher Education, 38, 506–522. http://dx.doi.org/10.1080/03075079.2011.586995.
Overton, R. C. (1998). A comparison of fixed-effects and mixed (random-effects) models for meta-analysis tests of moderator variable effects. Psychological Methods,
3(3), 354–379.
*Parlett, D. K. (2012). A comparison of associate and bachelor degree nursing students’ motivation (Unpublished doctoral dissertation). Minneapolis, MN: Walden University.
Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: Vol. 2 (K. A. Feldman, Ed.). San Francisco, CA: Jossey-Bass.
Patall, E. A., Cooper, H., & Robinson, J. C. (2008). The effects of choice on intrinsic motivation and related outcomes: A meta-analysis of research findings.
Psychological Bulletin, 134, 270–300. http://dx.doi.org/10.1037/0033-2909.134.2.270.
Piergiovanni, P. R. (2014). Creating a critical thinker. College Teaching, 62, 86–93. http://dx.doi.org/10.1080/87567555.2014.896775.
Pike, G. R., Kuh, G. D., & McCormick, A. C. (2011). An investigation of the contingent relationships between learning community participation and student engagement. Research in Higher Education, 52, 300–322.
Pintrich, P. R., Smith, D. A., García, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ).
Educational and Psychological Measurement, 53(3), 801–813.
Pithers, R. T., & Soden, R. (2000). Critical thinking in education: A review. Educational Research, 42, 237–249. http://dx.doi.org/10.1080/001318800440579.
*Pitts, L. N. (2001). Critical thinking skill and disposition as predictors of success in associate degree nursing education (Unpublished doctoral dissertation). Gainesville, FL:
University of Florida.
Polanin, J. R., Tanner-Smith, E. E., & Hennessy, E. A. (2016). Estimating the difference between published and unpublished effect sizes: A meta-review. Review of
Educational Research, 86, 207–236. http://dx.doi.org/10.3102/0034654315582067.
*Puzziferro, M. (2006). Online technologies self-efficacy, self-regulated learning, and experiential variables as predictors of final grade and satisfaction in college-level online
courses (Unpublished doctoral dissertation). New York, NY: New York University.
*Reid, H. V. (2000). The correlation between a general critical thinking skills test and a discipline-specific critical thinking test for associate degree nursing students (Unpublished
doctoral dissertation). Denton, TX: University of North Texas.
Richardson, M., Abraham, C., & Bond, R. (2012). Psychological correlates of university students’ academic performance: A systematic review and meta-analysis.
Psychological Bulletin, 138, 353–387. http://dx.doi.org/10.1037/a0026838.
Robbins, S. B., Lauver, K., Le, H., Davis, D., Langley, R., & Carlstrom, A. (2004). Do psychosocial and study skill factors predict college outcomes? A meta-analysis.
Psychological Bulletin, 130, 261–288. http://dx.doi.org/10.1037/0033-2909.130.2.261.
Scheffer, B. K., & Rubenfeld, M. G. (2000). A consensus statement on critical thinking in nursing. Journal of Nursing Education, 39(8), 352–359.
Shapiro, D., Dundar, A., Yuan, X., Harrell, A., & Wakhungu, P. K. (2014). Completing college: A national view of student attainment rates – fall 2008 cohort (Signature report No. 8). Herndon, VA: National Student Clearinghouse Research Center.
*Silver, B. B. (1999). Indicators of academic achievement: A structural equation model (Unpublished doctoral dissertation). Storrs, CT: The University of Connecticut.
*Singleton-Williams, S. D. (2009). Motivational and cognitive learning strategies as predictors of academic success in economically disadvantaged community college students
(Unpublished doctoral dissertation). Minneapolis, MN: Capella University.
*Sisung, J. (2005). Relationship between standardized critical thinking test scores and earned grades in courses purported to teach critical thinking at Kellogg Community College
(Unpublished doctoral dissertation). Lincoln, NE: The University of Nebraska.
Stevens, M., & Kirst, M. (Eds.). Remaking college: The changing ecology of higher education. Stanford, CA: Stanford University Press.
Terenzini, P. T., Springer, L., Pascarella, E. T., & Nora, A. (1995). Influences affecting the development of students’ critical thinking skills. Research in Higher Education,
36, 23–39.
Terenzini, P. T., Springer, L., Yaeger, P. M., Pascarella, E. T., & Nora, A. (1996). First-generation college students: Characteristics, experiences, and cognitive development. Research in Higher Education, 37, 1–22.
*Thompson, M. (2006). A correlational study on critical thinking as a predictor of success in the associate degree nursing program at Seminole Community College (Unpublished
doctoral dissertation). Minneapolis, MN: Capella University.
*Thompson, J. (2009). To question or not to question: The effects of two teaching approaches on students’ thinking dispositions, critical thinking skills, and course grades in a
critical thinking course (Unpublished doctoral dissertation). Minneapolis, MN: Capella University.
Walsh, C. M., & Hardy, R. C. (1999). Dispositional differences in critical thinking related to gender and academic major. Journal of Nursing Education, 38, 149–155.
Watson, G., & Glaser, E. M. (1964). Watson-Glaser Critical Thinking Appraisal manual: Forms. New York, NY: Harcourt, Brace, and World.
Wilson, D. G., & Wagner, E. E. (1981). The Watson-Glaser Critical Thinking Appraisal as a predictor of performance in a critical thinking course. Educational and
Psychological Measurement, 41(4), 1319–1322.
Xu, D., & Jaggars, S. S. (2011, March). Online and hybrid course enrollment and performance in Washington State community and technical colleges (Unpublished manuscript). New York, NY: Community College Research Center, Columbia University.
*Yost, H. D. (2003). Motivational orientation, self-regulated learning, and academic performance in associate degree nursing students (Unpublished doctoral dissertation).
Greensboro, NC: The University of North Carolina.
Zientek, L. R., Yetkiner, Z. E., Fong, C. J., & Griffin, M. (2013). Student success in developmental mathematics. Community College Journal of Research and Practice, 37,
990–1010. http://dx.doi.org/10.1080/10668926.2010.491993.