[net-gold] Active Learning Increases Student Performance in STEM

  • From: "David P. Dillard" <jwne@xxxxxxxxxx>
  • To: Other Net-Gold Lists -- Educator Gold <Educator-Gold@xxxxxxxxxxxxxxx>, net-gold@xxxxxxxxxxxxx, NetGold <netgold@xxxxxxxxxxxxxxx>, K-12ADMINLIFE <K12ADMIN@xxxxxxxxxxxxxxxxxxx>, K12AdminLIFE <K12AdminLIFE@xxxxxxxxxxxxxxx>, Nabble Groups Net-Gold <ml-node+s3172864n3172864h56@xxxxxxxxxxxxx>, Net-Platinum <net-platinum@xxxxxxxxxxxxxxx>, Net-Gold <NetGold_general@xxxxxxxxxxxxxxxxx>, Temple Gold Discussion Group <TEMPLE-GOLD@xxxxxxxxxxxxxxxxxxx>, Temple University Net-Gold Archive <net-gold@xxxxxxxxxxxxxxxxxxx>, "Net-Gold @ Wiggio.com" <netgold@xxxxxxxxxxxxxx>, Health Lists -- Health Diet Fitness Recreation Sports <healthrecsport@xxxxxxxxxxxxxxx>, HEALTH-RECREATION-SPORTS-TOURISM@xxxxxxxxxxxxxxxxxxx
  • Date: Sat, 17 May 2014 03:01:38 -0400 (EDT)






Date: Fri, 16 May 2014 14:36:07 -0700
From: "Richard Hake rrhake@xxxxxxxxxxxxx [Net-Gold]" <Net-Gold@xxxxxxxxxxxxxxx>
To: AERA-L@xxxxxxxxxxxxxxxxx, Net-Gold@xxxxxxxxxxxxxxx
Subject: [Net-Gold] Active Learning Increases Student Performance in STEM



If you reply to this long (66 kB) post, please don't hit the reply button (the bane of discussion lists) unless you prune the copy of this post that may appear in your reply down to a few relevant lines; otherwise the entire, already-archived post may be needlessly resent to subscribers. The abstract reads:

 *********************************************

ABSTRACT: In a recent widely acclaimed report "Active learning increases student performance in science, engineering, and mathematics" Freeman et al. (2014) at <http://bit.ly/1v4JVbW> wrote (my CAPS):

 "To test the hypothesis that lecturing maximizes learning and course performance, we metaanalyzed 225 studies that reported data on examination scores or failure rates when comparing student performance in undergraduate science, technology, engineering, and mathematics (STEM) courses under TRADITIONAL LECTURING versus ACTIVE LEARNING. The effect sizes indicate that on average, student performance on examinations and concept inventories increased by 0.47 SDs under active learning (n = 158 studies). . . . students in classes with traditional lecturing were 1.5 times more likely to fail than were students in classes with active learning. . . . This is the largest and most comprehensive metaanalysis of undergraduate STEM education published to date. THE RESULTS raise questions about the continued use of traditional lecturing as a control in research studies, and SUPPORT ACTIVE LEARNING AS THE PREFERRED, EMPIRICALLY VALIDATED TEACHING PRACTICE IN REGULAR CLASSROOMS."

 That the results of the meta-analysis of Freeman et al. (2014) "support active learning as the preferred practice in regular classrooms" is consistent with e.g.: (a) meta-analyses by Springer et al. (1999) <http://bit.ly/1lbJPZo>, Hake (1998a) <http://bit.ly/d16ne6>, Minner et al. (2010) <http://bit.ly/wdJq4R>, and Ruiz-Primo et al. (2011) <http://bit.ly/1ouNzdm>; and (b) literature reviews by Handelsman et al. (2004) <http://bit.ly/ILrHBK>, Prince (2004) <http://bit.ly/rkiBjq>, Froyd (2007) <http://bit.ly/1lerTBS>, and NRC (2013) <http://bit.ly/126os6j>.

 *********************************************

 In a recent widely acclaimed report "Active learning increases student performance in science, engineering, and mathematics" Freeman et al. (2014) at <http://bit.ly/1v4JVbW> wrote (my CAPS):

 "To test the hypothesis that lecturing maximizes learning and course performance, we metaanalyzed 225 studies that reported data on examination scores or failure rates when comparing student performance in undergraduate science, technology, engineering, and mathematics (STEM) courses under TRADITIONAL LECTURING versus ACTIVE LEARNING. The effect sizes indicate that on average, student performance on examinations and concept inventories increased by 0.47 SDs under active learning (n = 158 studies). . . . students in classes with traditional lecturing were 1.5 times more likely to fail than were students in classes with active learning. . . . This is the largest and most comprehensive metaanalysis of undergraduate STEM education published to date. THE RESULTS raise questions about the continued use of traditional lecturing as a control in research studies, and SUPPORT ACTIVE LEARNING AS THE PREFERRED, EMPIRICALLY VALIDATED TEACHING PRACTICE IN REGULAR CLASSROOMS."
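
 To make the two headline numbers concrete, here is a minimal sketch in Python (not Freeman et al.'s code; all scores and failure counts below are invented for illustration) of how a standardized mean difference in SD units and a failure risk ratio are computed for a single hypothetical course comparison:

# Illustrative sketch only (not Freeman et al.'s data or code): how an effect
# size in SD units and a failure risk ratio are computed for one course pair.
import math

def cohens_d(mean_active, mean_lecture, sd_active, sd_lecture, n_active, n_lecture):
    """Standardized mean difference using a pooled standard deviation."""
    pooled_var = ((n_active - 1) * sd_active**2 + (n_lecture - 1) * sd_lecture**2) / (
        n_active + n_lecture - 2)
    return (mean_active - mean_lecture) / math.sqrt(pooled_var)

def risk_ratio(fail_lecture, n_lecture, fail_active, n_active):
    """How many times more likely students are to fail under lecturing."""
    return (fail_lecture / n_lecture) / (fail_active / n_active)

# Hypothetical course: exam scores out of 100 and failure counts (made up).
d = cohens_d(mean_active=74.0, mean_lecture=68.0, sd_active=12.0, sd_lecture=13.0,
             n_active=120, n_lecture=115)
rr = risk_ratio(fail_lecture=34, n_lecture=115, fail_active=24, n_active=120)
print(f"effect size ~ {d:.2f} SD, failure risk ratio ~ {rr:.1f}")

 With these made-up inputs the sketch returns an effect size near 0.5 SD and a risk ratio near 1.5, i.e., numbers of the same order as those quoted in the abstract; a meta-analysis would then pool many such per-study values with appropriate weights.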

 What do Freeman et al. mean by the key terms "active learning" and "traditional lecturing"? Towards the end of their report, under "Materials and Methods," they write (slightly edited for clarity):

 (a) "To create a working definition of 'active learning' we collected written definitions from 338 audience members, just prior to biology department seminars on active learning at universities throughout the United States and Canada. We then coded elements in the responses to create the following consensus definition: 'Active learning engages students in the process of learning through activities and/or discussion in class, as opposed to passively listening to an expert. It emphasizes higher-order thinking and often involves group work.' "

 (b) "Following Bligh (2001), we defined 'traditional lecturing' as ' . . . . . 
continuous exposition by the teacher.' Under this definition, student activity was 
assumed
to be limited to taking notes and/or asking occasional and unprompted questions of 
the instructor."

 That the results of the meta-analysis of Freeman et al. (2014) "support active learning as the preferred practice in regular classrooms" is consistent with (a) meta-analyses by Springer et al. (1999), Hake (1998a), Minner et al. (2010), and Ruiz-Primo et al. (2011); and (b) literature reviews by Handelsman et al. (2004), Prince (2004), Froyd (2007), and NRC (2013).

 Despite the rave reviews of Freeman et al. (2014) in the "Chronicle of Higher Education" [CHE (2014)], "Inside Higher Ed" [Lederman (2014)], a press release by the NSF (2014), and "Wired" [Bhatia (2014)], I don't think Freeman et al. is without fault. Their first sentence, "To test the hypothesis that lecturing maximizes learning and course performance . . . ," suggests that Freeman et al. regard student "learning" and "course performance" as closely related.

 But Wilbert McKeachie (1987) has pointed out that the time-honored gauges of student learning, course exams and final grades, "typically measure lower-level objectives, such as memory of facts and definitions rather than higher-level outcomes such as critical thinking and problem solving"; witness "Academically Adrift: Limited Learning on College Campuses" [Arum & Roksa (2011)]. Furthermore, traditional introductory physics lecturers have been known to receive teaching awards, usually a sure sign of exemplary student evaluations prompted by the award of many A's and B's, even though students in such courses achieve a near-zero increase in their understanding of Newtonian mechanics - see Hake (1998a,b).

  Under "Results," Freeman et al. wrote [my CAPS]:

 "For the data on examinations and other assessments, a heterogeneity analysis 
indicated that average effect sizes were lower when the outcome variable was an
instructor-written course examination as opposed to performance on a concept 
inventory . . . . .. Although student achievement was higher under active 
learning for both
types of assessments, we hypothesize that [lower] gains for examinations [than 
for] concept inventories may be due to the two types of assessments testing 
qualitatively
different cognitive skills. This explanation is consistent with previous 
research indicating that active learning has a greater impact on student 
mastery of higher- versus
lower-level cognitive skills, and the recognition that MOST CONCEPT INVENTORIES 
ARE DESIGNED TO DIAGNOSE KNOWN MISCONCEPTIONS, in contrast to course 
examinations that
EMPHASIZE CONTENT MASTERY OR THE ABILITY TO SOLVE QUANTITATIVE PROBLEMS. Most 
concept inventories also undergo testing for validity, reliability, and 
readability."

 It would appear that Freeman et al. may think that instructor-written course examinations can show "content mastery" by students whose misconceptions of the fundamentals have not been alleviated.
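
 As a side note on what a subgroup "heterogeneity analysis" of this kind involves, here is a minimal Python sketch of a fixed-effect subgroup comparison; the effect sizes and variances below are invented, and this is not Freeman et al.'s dataset or their actual (more elaborate) meta-analytic model:

# Illustrative sketch only: pool effect sizes separately for two subgroups
# (concept-inventory studies vs. instructor-written-exam studies) and compute
# a simple between-group Q statistic. All numbers are hypothetical.

def pooled(effects, variances):
    """Inverse-variance weighted mean effect and its total weight."""
    weights = [1.0 / v for v in variances]
    total_w = sum(weights)
    mean = sum(w * e for w, e in zip(weights, effects)) / total_w
    return mean, total_w

# (effect sizes in SD units, sampling variances) -- invented for illustration
concept_inventory = ([0.95, 0.80, 0.70, 0.88], [0.04, 0.05, 0.03, 0.06])
course_exam       = ([0.35, 0.28, 0.40, 0.30], [0.03, 0.04, 0.05, 0.04])

groups = {"concept inventories": concept_inventory, "course exams": course_exam}
group_stats = {name: pooled(e, v) for name, (e, v) in groups.items()}

# Overall pooled effect across both subgroups.
overall_w = sum(w for _, w in group_stats.values())
overall = sum(m * w for m, w in group_stats.values()) / overall_w

# Between-group Q: large values suggest the subgroup effects genuinely differ.
q_between = sum(w * (m - overall) ** 2 for m, w in group_stats.values())

for name, (m, _) in group_stats.items():
    print(f"{name}: pooled effect ~ {m:.2f} SD")
print(f"Q_between = {q_between:.2f} (compare to chi-square with 1 df)")

 A Q_between that is large relative to a chi-square with (number of subgroups - 1) degrees of freedom would indicate that the pooled effects for concept-inventory studies and course-exam studies really differ, which is the qualitative pattern Freeman et al. report.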

 Richard Hake, Emeritus Professor of Physics, Indiana University; Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands; President, PEdants for Definitive Academic References which Recognize the Invention of the Internet (PEDARRII); LINKS TO: Academia <http://bit.ly/a8ixxm>; Articles <http://bit.ly/a6M5y0>; Blog <http://bit.ly/9yGsXh>; Facebook <http://on.fb.me/XI7EKm>; GooglePlus <http://bit.ly/KwZ6mE>; Google Scholar <http://bit.ly/Wz2FP3>; Linked In <http://linkd.in/14uycpW>; Research Gate <http://bit.ly/1fJiSwB>; Socratic Dialogue Inducing (SDI) Labs <http://bit.ly/9nGd3M>; Twitter <http://bit.ly/juvd52>.

 "Physicists are out in front in measuring how well students learn the basics, 
as science educators incorporate hands-on activities in hopes of making the 
introductory
course a beginning rather than a finale." – Erik Stockstad (2001)

 Physics educators have led the way in developing and using objective tests to 
compare student learning gains in different types of courses, and chemists, 
biologists, and
others are now developing similar instruments. These tests provide convincing 
evidence that students assimilate new knowledge more effectively in courses 
including active,
inquiry-based, and collaborative learning, assisted by information technology, than in 
traditional courses."   - William Wood & James Gentile (2003).

 

REFERENCES [All URLs shortened by <http://bit.ly/> and accessed on 16 May 2014.]

Arum, R. & J. Roksa. 2011. "Academically Adrift: Limited Learning on College 
Campuses." University of Chicago Press, publisher's information, including a synopsis 
and bio,
is online at <http://bit.ly/RrkWhK>. Amazon.com information at <http://amzn.to/f1f45O>, 
note the searchable "Look Inside" feature.  For a review see Jaschik (2011).

 Bhatia, A. 2014. "Active Learning Leads to Higher Grades and Fewer Failing Students in 
Science, Math, and Engineering," Wired Science Blog "Empirical Zeal," online at
<http://wrd.cm/1k5ogLM>.

 Bligh, D.A. 1998. "What's the Use of Lectures?" 5th edition from Intellect Books, publisher's information at <http://bit.ly/w08hmR>. [The 1st edition was evidently published in 1971.] An expurgated Google book preview of Bligh (1998) is online at <http://bit.ly/uu7BV3>.

 Bligh, D.A. 2000a. "What's the Use of Lectures?" Jossey-Bass. Amazon.com information at <http://amzn.to/uRo4zv>. Note the searchable "Look Inside" feature. CAUTION!! According to Bligh [as quoted in Hake (2003)], this U.S. Jossey-Bass edition is a severely eviscerated version of the British Intellect Books edition [Bligh (1998)]. Evidently all U.S. editions after 1998 have been or will be similarly degraded, presumably in order to increase sales to the primarily non-scientific educational establishment within the U.S.

 CHE. 2014. "Active Learning Is Found to Foster Higher Pass Rates in STEM Courses," 
Chronicle of Higher Education Staff, online at <http://bit.ly/1oP91Hq>.

 DeHaan, R.L. 2005. "The Impending Revolution in Undergraduate Science 
Education," Journal of Science Education and Technology 14(2): 253-269; online as a 
152 kB pdf at
<http://bit.ly/ncAuQa>.

 Freeman, S., S.L. Eddy, M. McDonough, M.K. Smith, N. Okoroafor, H. Jordt, and M.P. Wenderoth. 2014. "Active learning increases student performance in science, engineering, and mathematics," Proceedings of the National Academy of Sciences (PNAS), early edition, 12 May; online at <http://bit.ly/1v4JVbW>. A summary is at <http://bit.ly/1lfjf1f>, which includes a tabulation of the "online impact" (called "metrics"). Supporting information is online as a 156 kB pdf at <http://bit.ly/1gnYZ0f>.

 Froyd, J.E. 2007. “Evidence for the Efficacy of Student-active Learning Pedagogies,” 
online as a 147 kB pdf at <http://bit.ly/1lerTBS>.

Hake, R.R. 1998a. "Interactive-engagement vs traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses," Am. J. Phys. 66: 64-74; online as an 84 kB pdf at <http://bit.ly/d16ne6>. The abstract reads (paraphrased): A survey of pre/post test data using Concept Inventories <http://bit.ly/dARkDY> is reported for 62 introductory physics courses enrolling a total number of students N = 6542. The average effectiveness of a course in promoting conceptual understanding is taken to be the average normalized gain <g> = (%<post> - %<pre>) / (100 - %<pre>) = (actual gain)/(maximum possible gain). Fourteen "traditional" (T) courses (N = 2084) which made little or no use of interactive-engagement (IE) methods achieved average <g> = 0.23 ± 0.04 (std dev). Forty-eight IE courses (N = 4458) achieved average <g> = 0.48 ± 0.14 (std dev), almost two standard deviations above that of the traditional courses. Results for 30 (N = 3259) of the above 62 courses on the problem-solving Mechanics Baseline test imply that IE strategies enhance problem-solving ability. The conceptual and problem-solving test results strongly suggest that the classroom use of IE methods can increase mechanics-course effectiveness well beyond that obtained in traditional practice.
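
 For readers unfamiliar with the average normalized gain used in the entry above, here is a minimal Python sketch of the computation; the class-average scores are invented, chosen only so the two gains land near the 0.23 and 0.48 averages quoted above:

# Illustrative sketch: Hake's average normalized gain from class-average
# pre- and post-test percentages on a concept inventory (scores invented).

def normalized_gain(pre_percent, post_percent):
    """<g> = (%post - %pre) / (100 - %pre): actual gain over maximum possible gain."""
    return (post_percent - pre_percent) / (100.0 - pre_percent)

print(f"traditional lecture class:      <g> = {normalized_gain(35.0, 50.0):.2f}")  # ~0.23
print(f"interactive-engagement class:   <g> = {normalized_gain(35.0, 66.0):.2f}")  # ~0.48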

 Hake, R.R. 1998b. "Interactive-engagement methods in introductory mechanics courses," online as a 108 kB pdf at <http://bit.ly/aH2JQN>. A crucial (but generally ignored) companion paper to Hake (1998a).

 Hake, R.R. 2003. "Re: Should Education Research Be Like Medical Research?" online on 
the OPEN! POD archives at <http://bit.ly/u0MgSz>. Post of 4 Dec 2003 20:40:51-0800 to
EvalTalk, Math-Learn, PhysLrnR, and POD.

 Hake, R.R. 2008. "Design-Based Research in Physics Education Research: A Review," 
in Kelly, Lesh, & Baek (2008); a prepublication version of Hake's chapter is online as a
1.1 MB pdf at <http://bit.ly/9kORMZ>.

 Handelsman, J., D. Ebert-May, R. Beichner, P. Bruns, A. Chang, R. DeHaan, J. Gentile, S. Lauffer, J. Stewart, S.M. Tilghman, & W.B. Wood. 2004. "Scientific Teaching," Science 304: 521-522, 23 April; online as a 90 kB pdf at <http://bit.ly/ILrHBK>. See also the supporting material online as a 344 kB pdf at <http://bit.ly/eOCJmo>. [URLs are specified for some, but (unfortunately) not all, online materials.] See also Miller, Pfund, Pribbenow, & Handelsman (2008).

 Jaschik, S. 2011. "Academically Adrift," Inside Higher Ed, 18 January; online at 
<http://bit.ly/hOOK09>.

 Kelly, A.E., R.A. Lesh, J.Y. Baek. 2008. "Handbook of Design Research Methods in 
Education: Innovations in Teaching." Routledge Education, publisher's information at 
<http://bit.ly/dkLabI>. Amazon.com information at <http://amzn.to/flJaQ9>.

 Lederman, D. 2014. "A Boost for Active Learning," Inside Higher Ed, 13 May; online 
at <http://bit.ly/1st62of>.

 McDermott, L.C. 1991. "Millikan Lecture 1990: What we teach and what is learned - Closing the gap," Am. J. Phys. 59(4): 301-315; online as a 1.9 MB pdf at <http://bit.ly/1t1eAFV>. She wrote: "The usual measure of assessment common in physics courses - the ability to state correct definitions, reproduce proofs, solve standard problems - cannot provide sufficiently detailed information to determine to what degree students achieve the intellectual objectives mentioned earlier . . . [[understanding of the basic concepts of physics]] . . ."

 McKeachie, W.J. 1987. "Instructional evaluation: Current issues and possible 
improvements," Journal of Higher Education 58(3): 344-350. The first page is online 
at
<http://bit.ly/1g6gZMm>. The entire article may be downloaded for free from 
that site if one takes a few minutes to obtain a JSTOR account.

 Miller, S., C. Pfund, C.M. Pribbenow, & J. Handelsman. 2008. "Scientific Teaching in Practice," Science 322 (5906): 1329-1330, 28 November; online as a 418 kB pdf at <http://bit.ly/Jx3TEW>. Supporting online material is online as a 614 kB pdf at <http://bit.ly/IKOSfD>.

 Minner, D.D., A.J. Levy, & J. Century. 2010. "Inquiry-based science instruction. What is it and does it matter? Results from a research synthesis years 1984 to 2002," Journal of Research in Science Teaching 47: 474-496; online as a 197 kB pdf at <http://bit.ly/wdJq4R>. They wrote: "Various findings across 138 analyzed studies indicate a clear, positive trend favoring inquiry-based instructional practices, particularly instruction that emphasizes student active thinking and drawing conclusions from data. . . . We did not find, however, that overall high levels of inquiry saturation in instruction were associated with more positive learning outcomes for students. The only learning associations we found with the amount of inquiry saturation were modest."

 NRC. 2013. "Adapting to a Changing World - Challenges and Opportunities in Undergraduate Physics Education," National Academies Press, online at <http://bit.ly/126os6j>. On page 35 it's stated (slightly edited): "One of the most robust findings from Physics Education Research is that traditional, lecture-style introductory courses have little long-lasting effect on students' erroneous notions about the physical world (McDermott, 1991; Hake, 1998a). . . . This can be assessed by asking students simple questions such as making a prediction or drawing an inference about a physical situation."

 NSF. 2014. "Enough with lecturing," National Science Foundation press release, 14 May, online at <http://1.usa.gov/1sO3SBu>.

 Prince, M. 2004. "Does Active Learning Work? A Review of the Research," Journal of Engineering Education 93(3): 223-231; online as a 770 kB pdf at <http://bit.ly/rkiBjq>. The abstract reads: "This study examines the evidence for the effectiveness of active learning. It defines the common forms of active learning most relevant for engineering faculty and critically examines the core element of each method. It is found that there is broad but uneven support for the core elements of active, collaborative, cooperative and problem-based learning."

 Ruiz-Primo, M.A., D. Briggs, H. Iverson, R. Talbot, & L.A. Shepard. 2011. "Impact of undergraduate science course innovations on learning," Science 331(6022): 1269-1270; online as a 225 kB pdf at <http://bit.ly/1ouNzdm>. Supporting online material is online as a 242 kB pdf at <http://bit.ly/vf2FSu>. They conclude: "This evidence suggests that undergraduate course innovations in biology, chemistry, engineering, and physics have positive effects on student learning. However, some caveats are in order. . . ."

 Springer, L., M.E. Stanne, & S.S. Donovan. 1999. "Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology," Rev. Educ. Res. 69(1): 21-51; online as a 164 kB pdf at <http://bit.ly/1lbJPZo>. They wrote: "The results . . . suggest that small-group learning has significant . . . positive effects on undergraduates in SMET courses and programs. Average main effect sizes are consistently around half a standard deviation, exceeding most findings in comparable reviews of educational innovations."

 Stokstad, E. 2001. "Reintroducing the Intro Course," Science 293: 1608-1610, 31 
August; online at <http://bit.ly/1k4Yp6O>.

 Wood, W.B., & J.M. Gentile. 2003. "Teaching in a research context," Science 302: 
1510; 28 November; online as a 213 kB pdf <http://bit.ly/SyhOvL>, thanks to Ecoplexity
<http://bit.ly/152aFQ9>.  See also DeHaan (2005).



