[net-gold] Is the 'Teacher Effect' the Dominant Factor in Students' Academic Gain?

  • From: "David P. Dillard" <jwne@xxxxxxxxxx>
  • To: Other Net-Gold Lists -- Educator Gold <Educator-Gold@xxxxxxxxxxxxxxx>, Educator Gold <Educator-Gold@xxxxxxxxxxxxxxxx>, net-gold@xxxxxxxxxxxxx, NetGold <netgold@xxxxxxxxxxxxxxx>, Net-Gold <net-gold@xxxxxxxxxxxxxxxx>, K-12ADMINLIFE <K12ADMIN@xxxxxxxxxxxxxxxxxxx>, K12AdminLIFE <K12AdminLIFE@xxxxxxxxxxxxxxx>, NetGold <netgold@xxxxxxxxxxxxxxxx>, Net-Platinum <net-platinum@xxxxxxxxxxxxxxx>, Net-Gold <NetGold_general@xxxxxxxxxxxxxxxxx>, Temple Gold Discussion Group <TEMPLE-GOLD@xxxxxxxxxxxxxxxxxxx>, Temple University Net-Gold Archive <net-gold@xxxxxxxxxxxxxxxxxxx>
  • Date: Fri, 8 Apr 2011 00:39:14 -0400 (EDT)



.

.


Date: Thu, 7 Apr 2011 17:51:59 -0700
From: Richard Hake <rrhake@xxxxxxxxxxxxx>
Reply-To: Net-Gold@xxxxxxxxxxxxxxx
To: AERA-L@xxxxxxxxxxxxxxxxx
Cc: Net-Gold@xxxxxxxxxxxxxxx
Subject: [Net-Gold] Is the 'Teacher Effect' the Dominant Factor in Students'
    Academic Gain?

.

If you reply to this very long (69 kB) post
please don't hit the reply button unless you
prune the copy of this post that may appear in
your reply down to a few relevant lines,
otherwise the entire already archived post may be
needlessly resent to subscribers.

.

**************************************

.

ABSTRACT: In his PhysLrnR post Rob Spencer
pointed out that Sanders & Rivers (1996) in
"Cumulative and Residual Effects of Teachers on
Future Student Academic Achievement" concluded
that "Within grade levels, the single most
dominant factor affecting student academic gain
is teacher effect." This post considers four
items relevant to Spencer's post:

.

I. VALUE-ADDED ASSESSMENT (VAA) criticized
earlier in (a) "First, Let's Fire All the
Teachers!" [Hake (2010a,b)], and (b) "Value-Added
Inequities: Should Value-Added Measures Be Used
to Evaluate Teachers?" [Hake (2011a,b)]; and in
this post by reference to (c) "Using Student
Progress To Evaluate Teachers: A Primer on
Value-Added Models" [Braun (2005)
<http://bit.ly/gZiJdH>], (d) "Challenges of
Value-Added Assessment" [Doran & Fleischman
(2005) <http://bit.ly/dF6CuB>], (e) "Problems with
the use of student test scores to evaluate
teachers" [Baker, Barton, Darling-Hammond,
Haertel, Ladd, Linn, Ravitch, Rothstein,
Shavelson, & Shepard (2010)
<http://bit.ly/h8k5Fw>], and (f) "The Death and
Life of the Great American School System: How
Testing and Choice Are Undermining Education"
[Ravitch (2010) <http://amzn.to/faJ9JZ>].

.

II. SOCIOECONOMIC STATUS (SES) stressed by
Bernard Ricca, who pointed to Toni Feder's 2009
"Physics Today" report on Marder & Bansal (2009).
Their statistical-mechanics study of "Flow and
diffusion of high-stakes test scores"
<http://bit.ly/hYbbLe> indicated that:
"Poverty is a more powerful influence on test
scores than value added by teachers and schools."
The case that poverty is an overriding influence
on students' classroom achievement has been
forcefully argued by David Berliner in "Our
Impoverished View of Educational Reform"
[Berliner (2009) <http://bit.ly/ff8BVj>], and
"Poverty and Potential: Out-of-School Factors and
School Success" [Berliner (2010)
<http://bit.ly/fqiCUA>].

.

III. LESSONS FROM THE PHYSICS EDUCATION REFORM
EFFORT suggest that: (a) high school, college,
and university courses employing "Interactive
Engagement" (IE) methods result in average
normalized gains <g> in conceptual understanding
that are about two standard deviations greater
than the <g>'s achieved by traditional (T)
passive-student lecture courses [Hake (1998a,b;
2008)]; (b) "Teachers who possess both content
knowledge and 'pedagogical content knowledge' are
better equipped to deliver IE instruction" [Hake
(2002)]; and (c) paraphrasing Wells et al.
(1995): "teacher expertise is the critical factor
in improving introductory physics instruction." I
suspect that the growing body of educational
research in disciplines other than physics will
eventually show that these same lessons apply to
most other academic subjects.

.

IV. IS THE 'TEACHER EFFECT' THE DOMINANT FACTOR
IN STUDENTS' ACADEMIC GAIN? Judging from all the
above:

.

(a) if "academic gain" means "gain in
higher-level learning for *U.S. K-12 students*,"
then the answer is: "NO! POVERTY is the dominant
factor - see 'I' and 'II' above - and the next
most important factor is the degree to which
'Interactive Engagement' is provided by courses
and teachers - see III above";

.

(b) if "academic gain" means "gain in
higher-level learning for *students in higher
education*," then the answer is: "The dominant
factor in promoting academic gain is the degree
to which 'Interactive Engagement' is provided by
courses and teachers - see III above."

.

**************************************

.

Rob Spencer (2011), in his PhysLrnR post "single
most dominant factor affecting student academic
gain" wrote (paraphrasing):

.

"Can anyone comment on "Cumulative and Residual
Effects of Teachers on Future Student Academic
Achievement" [Sanders & Rivers (1996)]? In the
conclusion, the authors state 'within grade
levels, the single most dominant factor affecting
student academic gain is teacher effect.' "

.

More precisely, in their conclusion Sanders &
Rivers (1996) state. . . .(my insert at ". . . .
.[[insert]]. . . ."):

.

"Wright, Horn, and Sanders (in press). . . .
.[[now Wright et al. (1997)]]. . . . have
demonstrated that, within grade levels, the
single most dominant factor affecting student
academic gain is teacher effect. The present
studies, expanding on the earlier research,
strongly suggest the presence of cumulative
effects of teachers on student achievement.
Groups of students with comparable abilities and
initial achievement levels may have vastly
different academic outcomes as a result of the
sequence of teachers to which they are assigned."

.

Boris Korsunsky (2011) responded to Spencer as
follows [bracketed by lines "KKKKK. . . . ."; my
inserts at ". . . . .[[insert]]. . . . ."; my
CAPS]:

.

KKKKKKKKKKKKKKKKKKKKKK

.

After what I admit was a quick scan, I think this
article . . . . .[[Sanders & Rivers (1996)]]. . .
is a joke, frankly. Note that the "research"
that they cite in the beginning does not come
from peer-reviewed journal articles. It comes
from the publications like their own: they just
cook and cook them and then cite each other so
that it *looks like* a serious research body.
"VALUE-ADDED ASSESSMENT SYSTEMS" ARE HIGHLY
CONTROVERSIAL precisely because so many "studies"
are done in an arbitrary fashion (often hidden
behind long words and some numbers) - and each of
them, even if it does not pursue a political
agenda (which most do!), can be debunked based on
the methodology.

.

Many of these "studies", essentially, *define*
effective teachers as the ones who produce good
testing gains - and then, surprise, find
correlations between the test gains and the
'effectiveness' of their teachers. I am pretty
sure that's what this 'study' did. Oh, and it
seems that "black-or-white" was the only
non-school-related factor that they considered.

.

. . . . . .

.

Besides, while the teacher quality (whatever it
is) may be, indeed the most important
*school-related* factor, the SES-RELATED
INDICATORS, OVERALL, ARE FAR, FAR MORE IMPORTANT,
as many studies have shown. . . . . . For K-12
folks like me, discussing these articles is more
than an arm-chair exercise, you see: they have a
potential of affecting my and my fellow teachers'
work in very substantial ways.. . . .
Interestingly, since I happen to work in a nice
rich town, I will probably be fine... but I feel
that I must speak for my colleagues.

.

KKKKKKKKKKKKKKKKKKKKKK

.

Boris considers two important areas: I - Value
Added Assessment (VAA), and II - SocioEconomic
Status (SES). I consider those areas below, plus
a third and fourth relevant to Spencer's post:

.

I-I-I-I-I-I-I-I-I-I-I-I-I-I-I-I-I-I-I-I-I-I-I

.

I. VALUE-ADDED MEASURES
For a discussion of the U.S. Dept. of Education's
use of Value Added Assessment (VAA) [advocated by
e.g., William Sanders <http://bit.ly/f5pKV1>;
Sanders & Rivers (1996); Sanders, Saxton, & Horn
(1997); Sanders (2000); Sanders, Wright, Rivers
(2006); Sanders & Wright (2008); Wright, Horn,
Sanders (1997); Ballou, Sanders, & Wright (2004);
Wright, Sanders, & Rivers (2006); Wright &
Sanders (2008); Wright, White, Sanders, & Rivers
(2010); and Wright (2010)]; and the baleful
influence of VAA on Boris' fellow teachers see,
e.g.:

.

a. "First, Let's Fire All the Teachers!" Hake (2010a,b), and

.

b. "Value-Added Inequities: Should Value-Added
Measures Be Used to Evaluate Teachers?" [Hake
(2011a,b)].

.

In Hake (2011a) I pointed to a weakness in VAA
analyses that does not seem to be generally
recognized by VAA statisticians [see that post
for references other than Wright et al. (2010),
National Academies (2008), NCSU (2011), Halloun
et al. (1995), and Campbell (1976)]. I wrote
(slightly edited): "the validity and reliability
of VAA depends critically on validity and
reliability of the tests employed. Wright et al.
(2010) state: 'SAS EVAAS analyses make use of
scores on standardized tests such as those
provided by major educational testing companies
and those used by states to fulfill their NCLB
obligations'. . . . . . .[but such tests] are
probably inferior gauges of *higher-order
learning* as compared to "Concept Inventories" -
see e.g., the Wikipedia entry at
<http://bit.ly/dARkDY>, National Academies
(2008), NCSU (2011), FLAG (2011), Halloun et al.
(1995), & Thornton & Sokoloff (1998). Fortunately
"Concept Inventories," as far as I know, have
only been used in *formative* assessment to
improve the effectiveness of instruction, not in
*summative* evaluation of teachers as for VAA. .
. . [["Concept Inventories," unlike VAA's, are
not therefore subject to the corrupting influence
set forth in Campbell's (1976) Law: "The more any
quantitative social indicator is used for social
decision making, the more subject it will be to
corruption pressures and the more apt it will be
to distort and corrupt the social processes it is
intended to monitor."]]. . . .

.

Among VAA references either absent or not
stressed in Hake (2011a,b) are these four:

.

******************************************

.

c. "Using Student Progress To Evaluate Teachers:
A Primer on Value-Added Models" [Braun (2005)].
The last paragraph of his Executive Summary
reads: "Most importantly, VAM. . . . . .[[Value
Added Model]]. . . . results should not be used
as the sole or principal basis for making
consequential decisions about teachers
(concerning salaries, promotions and sanctions,
for example). THERE ARE TOO MANY PITFALLS IN
MAKING 'EFFECTIVE TEACHER' DETERMINATIONS USING
THE KIND OF DATA TYPICALLY AVAILABLE FROM SCHOOL
DISTRICTS. One can imagine, however, an important
role for a quantitative component in a thorough
teacher evaluation process. Such a process has
yet to be implemented. Although improved teacher
accountability is a legitimate goal, it is only
one of many levers available to states in their
quest to enhance the quality of teaching over the
long term. A comprehensive and sustained strategy
is more likely to be successful than a more
narrowly focused initiative."

.

d. "Challenges of Value-Added Assessment" [Doran
& Fleischman (2005)]. They wrote [bracketed by
lines "D&H-D&H-D&H-. . . ."; my CAPS]:

.

D&H-D&H-D&H-D&H-D&H-D&H

.

The end result of value-added assessment is an
estimate of teacher quality, referred to as a
TEACHER EFFECT in the value-added literature
(Ballou, Sanders, & Wright, 2004). This measure
describes how well the teacher performed in
improving the achievement of the students in his
or her class and how this performance compares
with that of other teachers.

.

Value-added models have surfaced as an important
topic among education policymakers, researchers,
and practitioners. U.S. Secretary of Education
Margaret Spellings has organized a federal
working group to investigate how such models
might be incorporated into NCLB. The Government
Accountability Office is investigating the
integration of these models into state test-based
accountability systems. There is also great
interest in value-added assessment at the state
level, with at least three states - Ohio,
Pennsylvania, and Tennessee - using value-added
assessment statewide.

.

Possibly the most important question about
value-added assessment is whether the estimate
obtained from a value-added model can actually be
called a teacher effect. Can any statistical
model really sift through all the other factors
that may have influenced the student's score (for
example, socio-economic status or early learning
environment) and isolate the learning that we can
specifically attribute to the teacher's methods?
As it currently stands, NO EMPIRICAL RESEARCH
VALIDATES THE CLAIM THAT VALUE-ADDED MODELS
ACCURATELY IDENTIFY THE MOST EFFECTIVE TEACHERS.
The many anecdotal claims have not yet been
verified through experimental research."

.

D&H-D&H-D&H-D&H-D&H-D&H

.

e. "Problems with the use of student test scores
to evaluate teachers" [Baker, Barton,
Darling-Hammond, Haertel, Ladd, Linn, Ravitch,
Rothstein, Shavelson, & Shepard (2010)]. They
wrote: "A review of the technical evidence leads
us to conclude that, although standardized test
scores of students are one piece of information
for school leaders to use to make judgments about
teacher effectiveness, SUCH SCORES SHOULD BE ONLY
A PART OF AN OVERALL COMPREHENSIVE EVALUATION.
Some states are now considering plans that would
give as much as 50% of the weight in teacher
evaluation and compensation decisions to scores
on existing tests of basic skills in math and
reading. Based on the evidence, we consider this
unwise. Any sound evaluation will necessarily
involve a balancing of many factors that provide
a more accurate view of what teachers in fact do
in the classroom and how that contributes to
student learning."



f. "The Death and Life of the Great American
School System: How Testing and Choice Are
Undermining Education" [Ravitch (2010)]. On pages
179-180 Ravitch wrote [bracketed by lines "RRRRR.
. . . ."]:

.

RRRRRRRRRRRRRRRRRRRRRRRR

.

NCLB required that the scores rise in reading and
mathematics in every grade from third through
eighth, which meant that this year's fourth grade
had to get a higher score than last year's fourth
grade. It didn't take long for school officials
to realize that they needed what are called
"growth models," so the progress of individual
students could be tracked over time. This way of
measuring academic improvement was known as
"value added assessment" (VAA), a technique that
was developed mainly by William Sanders of the
University of Tennessee, a statistician and (at
that time) adjunct professor in the university's
College of Business Administration. . . . . .

.

His value-added method aimed to calculate the
extent to which teachers contributed to the gains
made by their students, as compared to other
factors. Drawing on studies, which were purely
statistical in nature (i.e., not involving
classroom observations), Sanders concluded that
"the most important factor affecting student
learning is the teacher. In addition, the results
show wide variation in effectiveness among
teachers. The immediate and clear implication of
this finding is that seemingly more can be done
to improve education by improving the
effectiveness of teachers than by any other
single factor. *Effective teachers appear to be
effective with all students of all achievement
levels, regardless of the heterogeneity of their
classrooms*." [Wright et al. (1997, pp. 57-67) ;
Sanders & Rivers (1996)]

.

. . . . . . . . . . . .
. . . . . . . . . . . .
. . . . . . . . . . . .

.

Value-added assessment is the product of
technology; it is also the product of a
managerial mind-set that believes that every
variable in a child's education can be
identified, captured, measured, and evaluated
with precision. Computers make it possible to
assemble the annual test scores of students and
quickly analyze which students have gained the
most, which gained nothing, and which lost ground
on standardized tests. Sanders the statistician
soon became Sanders the education measurement
guru.

.

As the methodology gained adherents, education
policy increasingly became the domain of
statisticians and economists. With their
sophisticated tools and their capacity to do
multivariate longitudinal analysis, they did not
need to enter the classroom, observe teachers, or
review student work to know which teachers were
the best and which were the worst, which were
effective and which were ineffective.

.

Discussions of what to teach and what constituted
a quality education receded into the background;
those issues were contentious and value-laden,
not worthy of the attention of data-minded policy
analysts. Using value-added models, the
technical experts could evaluate teachers and
schools without regard to the curriculum or the
actual lived experiences of their students. What
mattered most in determining educational quality
was not curriculum or instruction, but data.

.

RRRRRRRRRRRRRRRRRRRRRRRR

.

******************************************

.

Despite the criticism of VAA in "a" - "f" above:

.

(1) The U.S. Department of Education has designed
its "Race to the Top" scoring system to reward
states that use VAA in teacher evaluations :-( -
see e.g., "Obama Unveils Race for School
Billions" [Bruce & Nies (2009)]. For recent news
on the "Race to the Top" see "Nine States and the
District of Columbia Win Second Round Race to the
Top Grants" [USDE (2010, 2011)].

.

(2) Some school districts are using VAA in making
teacher evaluation and compensation decisions :-(
- see e.g., " 'Value-added' teacher evaluations:
L.A. Unified tackles a tough formula" [Watanabe
(2011)]. Watanabe wrote [my insert at ". . .
.[[insert]]. . . .]: "Nor is there widespread
agreement about how much the resulting ratings
should count. Tensions are all the greater
because the stakes for teachers are high as more
districts consider using the evolving. . . .
[[VAA]]. . . . science as a factor in hiring,
firing, promotions, tenure and pay. 'It is too
unreliable when you're talking about messing with
someone's career,' said Gayle Fallon, president
of the Houston Federation of Teachers. She said
many teachers don't understand the calculations.
The general formula for the 'linear mixed model'
used in her district is a string of symbols and
letters more than 80 characters long: y = Xbeta +
Zv + epsilon; where beta is a p-by-1 vector of
fixed effects; X is an n-by-p matrix; v is a
q-by-1 vector of random effects; Z is an n-by-q
matrix; E(v) = 0, Var(v) = G; E(epsilon) = 0,
Var(epsilon) = R; Cov(v, epsilon) = 0. V = Var(y)
= Var(y - X beta) = Var(Zv + epsilon) = ZGZ^T + R.
'It's doctorate-level math,' Fallon said."

.

The doctorate-level equation is on page 2 under
"2. Technical Details, Part I: The Linear Mixed
Model" in Wright & Sanders (2008). Greek letters
have been replaced by ASCII-compliant words.

.
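For readers curious what the "doctorate-level math" in the quoted formula amounts to, here is a minimal pure-Python sketch of the linear mixed model y = X*beta + Z*v + epsilon with Var(v) = G and Var(epsilon) = R, assembling the marginal covariance V = Var(y) = Z G Z^T + R. The matrices are tiny hypothetical examples (real EVAAS matrices have thousands of rows), chosen only to show the structure:

```python
# Sketch of the linear mixed model quoted above:
#   y = X*beta + Z*v + epsilon,  Var(v) = G,  Var(epsilon) = R,
#   so  V = Var(y) = Z G Z^T + R.
# All matrices below are small hypothetical examples, not real EVAAS data.

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

# n = 3 student scores, q = 2 random (teacher) effects
Z = [[1, 0],
     [1, 0],
     [0, 1]]            # maps each score to its teacher's effect
G = [[4, 0],
     [0, 4]]            # Var(v): teacher-effect variance
R = [[9, 0, 0],
     [0, 9, 0],
     [0, 0, 9]]         # Var(epsilon): residual variance

# Marginal covariance of the scores: V = Z G Z^T + R
V = add(matmul(matmul(Z, G), transpose(Z)), R)
for row in V:
    print(row)
```

Students 1 and 2 share a teacher, so V gives their scores a covariance of 4 (the teacher-effect variance), while scores under different teachers are uncorrelated; that shared-teacher correlation structure is what the mixed model exploits to estimate "teacher effects."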

Although VAA may be bad for education, it's
good for business, spawning a thriving
evaluation industry - see e.g., SAS (2011) and TQ
Center (2011).

.

II-II-II-II-II-II-II-II-II-II-II-II-II-II-II

.

II. SOCIOECONOMIC STATUS (SES)

.

As for SES-related indicators, Bernard Ricca
(2011) responded to Spencer (2011) as follows [my
insert at ". . . . .[[insert]]. . . . ."]:

.

". . . . the fact that this study. . . .[[Sanders
& Rivers (1996)]]. . . . is 15 years old ignores
a lot of more recent research that exists. (Okay,
they ignored most of the research on the topic
that was done before 1996 as well.) As one
example probably accessible to this listserv,
look at the work that was shown on pg. 28 of the
December 2009 Physics Today. . . . .[[Feder
(2009)]]. . . . The authors of that research. .
. .[[Marder (2010), Marder & Bansal (2009)]]. . .
. call into question whether or not teacher
effects are sufficient to make changes, or even
if they are that large."

.

Toni Feder (2009) wrote: "By applying his
expertise in statistical physics to analyze
school test scores, Mike Marder. . . .
.[[<http://bit.ly/frCOFB> of the University of
Texas at Austin's Center of Nonlinear Dynamics
<http://chaos.utexas.edu/> and co-director of
U-Teach <http://bit.ly/fs1Nsm> ]]. . . . .
discovered two times at which the performances of
economically disadvantaged kids take a dive: the
transitions to middle school and high school.
And, he says, his findings question the commonly
held idea that teacher effectiveness is the most
important variable in student performance.

.

"POVERTY MAY BE MORE IMPORTANT."

.

At his website <http://bit.ly/gWMKB8> Marder
(2010) states [see the website for figures
relevant to "b" - "d"; for an explanation of the
figures see Feder (2009) and Marder & Bansal
(2009)]:

.

a. Every state in the nation has been gathering
detailed data on the yearly progress of students
in mathematics because of No Child Left Behind.

.

b. The very large numbers of student scores make
it possible to apply techniques from statistical
mechanics used to describe flow and diffusion of
particles. Such methods have been applied to
17,000,000 Texas mathematics scores.

.

c. Results show that scores of low-income
students diverge most strongly from scores of
affluent students between fifth and eighth grade.

.

d. POVERTY IS A MORE POWERFUL INFLUENCE ON TEST
SCORES THAN VALUE ADDED BY TEACHERS AND SCHOOLS.
[My CAPS.]

.

More completely, in "Flow and diffusion of
high-stakes test scores," Marder & Bansal (2009)
wrote [bracketed by lines "M&R-M&R-M&R-. . . .";
references converted to APA style; my CAPS]:

.

M&R-M&R-M&R-M&R-M&R-M&R

.

EDUCATIONAL OUTCOMES FOR STUDENTS FROM WEALTHY
AND POOR FAMILIES ARE VERY DIFFERENT IN TEXAS.
The flow fields show where the greatest
divergences between these groups occur. The flow
patterns in the top and bottom rows of Fig. 1
start out in nearly the same direction until the
transition to middle school between fifth and
seventh grade, when students from economically
disadvantaged backgrounds flow downwards at a
higher pace than their less disadvantaged
counterparts and never recover. Ninth grade is
another crucial time because students who are not
passing the mathematics exams are forced to
repeat a grade and consequently disappear from
schools in large numbers. This effect is much
stronger for those who are economically
disadvantaged than for those who are not, as
shown in Fig. 2B.

.

Flow fields address many questions about the
educational system. There is a debate over the
student variables that should be used to describe
effects of teachers and schools. Sanders (2000)
states that "models should not include
socio-economic or ethnic accommodations but
should only include measures of previous
achievement of individual students." In this
view, prior year scores contain everything one
needs to know about the state of the students.

.

However, differences between flow directions have
great statistical significance. For example,
sixth graders not eligible for free and reduced
meals, with mathematics scores between 90% and
100% in 2006/2007, drop on average in score by
4.4% the next year, whereas those eligible for
free and reduced meals drop in score by 7.0%
(N = 30,000, t = 34, p < 10^-9). Similar
statistical significance applies to the
differences between virtually all the arrows in
the upper and lower rows of Fig. 1.
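The reported t-statistic can be checked approximately against those summary numbers with the standard two-sample (Welch) t formula. The standard deviation used below (about 9.4 points) is a hypothetical value not given by Marder & Bansal, chosen only to show that group means of -4.4% and -7.0% at N = 30,000 per group are consistent with t near 34:

```python
import math

# Two-sample t-statistic from summary statistics (Welch form).
# The means are the reported average score changes; the standard
# deviation and per-group N are illustrative assumptions.
def t_statistic(m1, s1, n1, m2, s2, n2):
    se = math.sqrt(s1**2 / n1 + s2**2 / n2)  # std. error of the difference
    return (m1 - m2) / se

mean_not_poor = -4.4   # avg change, not eligible for free/reduced meals
mean_poor = -7.0       # avg change, eligible for free/reduced meals
sd = 9.4               # hypothetical common standard deviation (points)
n = 30_000             # reported group size

t = t_statistic(mean_not_poor, sd, n, mean_poor, sd, n)
print(f"t = {t:.1f}")  # close to the reported t = 34
```

With samples this large even a 2.6-point difference in mean score change is dozens of standard errors wide, which is why the p-value is vanishingly small.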

.

CHANGES IN SCORES DEPEND STRONGLY, REPRODUCIBLY,
AND WITH HIGH STATISTICAL SIGNIFICANCE, UPON
POVERTY LEVEL EVEN AFTER CONTROLLING FOR PREVIOUS
ACHIEVEMENTS OF STUDENTS. It is possible that
this difference in score changes is entirely due
to the lower quality of teachers assigned to the
least affluent students. However, it is difficult
to reach such a conclusion simply from test data;
the conclusion that ineffective teachers are
largely to blame for unsatisfactory student
performance risks being circular [Kupermintz
(2003)] if ineffective teachers are defined to be
those whose students' test scores decrease
[Jordan et al. (1997)]. Drawing conclusions about
school effectiveness from test data presents
comparable difficulties [Haertel (2005)].

.

M&R-M&R-M&R-M&R-M&R-M&R

.

The case that poverty is an overriding influence
on students' classroom achievement has been
forcefully argued in "Our Impoverished View of
Educational Reform" [Berliner (2009)], and
"Poverty and Potential: Out-of-School Factors and
School Success" [Berliner (2010)]. In the latter
Berliner wrote:

.

BBBBBBBBBBBBBBBBBBBBBBBB

.

This brief details Out-of-School Factors (OSF's)
common among the poor that significantly affect
the health and learning opportunities of
children, and accordingly limit what schools can
accomplish *on their own*:

.

(1) low birth-weight and non-genetic prenatal influences on children;

.

(2) inadequate medical, dental, and vision care,
often a result of inadequate or no medical
insurance;

.

(3) food insecurity;

.

(4) environmental pollutants;

.

(5) family relations and family stress; and

.

(6) neighborhood characteristics.

.

These OSFs are related to a host of
poverty-induced physical, sociological, and
psychological problems that children often bring
to school, ranging from neurological damage and
attention disorders to excessive absenteeism,
linguistic underdevelopment, and oppositional
behavior.

.

Also discussed is a seventh OSF, extended
learning opportunities, such as pre-school, after
school, and summer school programs that can help
to mitigate some of the harm caused by the first
six factors.

.

BBBBBBBBBBBBBBBBBBBBBBBB

.

III-III-III-III- III-III-III-III-III-III-III-III

.

III. LESSONS FROM THE PHYSICS EDUCATION REFORM EFFORT

.

A. In "Interactive-engagement vs traditional
methods: A six-thousand-student survey of
mechanics test data for introductory physics
courses" [Hake (1998a)] I showed that 48 courses
employing "Interactive Engagement" (IE) methods
resulted in average normalized gains <g> in
conceptual understanding that were about two
standard deviations above the <g>'s achieved by
14 traditional (T) passive-student lecture
courses.
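The average normalized gain is the actual class-average gain divided by the maximum possible gain, <g> = (%post - %pre) / (100 - %pre), computed from class-average pre- and post-test percentage scores. A minimal sketch (the example class averages below are hypothetical):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's average normalized gain <g>: the actual class-average
    gain divided by the maximum possible gain, using class-average
    pre- and post-test percentage scores."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical class averages on a concept inventory:
g_ie = normalized_gain(45.0, 75.0)  # interactive-engagement course
g_t  = normalized_gain(45.0, 60.0)  # traditional lecture course

print(f"IE course: <g> = {g_ie:.2f}")  # 30/55 = 0.55
print(f"T  course: <g> = {g_t:.2f}")   # 15/55 = 0.27
```

Dividing by the room left for improvement (100 - %pre) is what lets courses with very different pretest averages be compared on one scale; in Hake (1998a) the IE courses averaged <g> of about 0.48 versus about 0.23 for the T courses.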

.

Here "Interactive Engagement (IE)" courses are
defined *operationally* as those designed at
least in part to promote conceptual understanding
through the active engagement of students in
minds-on (always) and hands-on (usually)
activities that yield immediate feedback through
discussion with peers and/or instructors; and
"traditional (T)" courses are defined
*operationally* as those reported by instructors
to make little or no use of IE methods, relying
primarily on passive student lectures, recipe
laboratories, and algorithmic problem
examinations.

.

As of 2008, results similar to those of Hake
(1998a,b), showing marked superiority in <g>'s
for IE over T courses, had been reported in about
25 physics education research papers, as listed
in Hake (2008).

.

B. Lesson #7 of "Lessons from the physics
education reform effort" [Hake (2002)] is:
"Teachers who possess both content knowledge and
'pedagogical content knowledge' are more apt to
deliver effective instruction." I wrote:

.

HHHHHHHHHHHHHHHHHHHHHH

.

"Pedagogical content knowledge" is evidently a
term coined by Shulman (1986, 1987), but its
importance has long been known to effective
classroom teachers. The difference between
content knowledge and "pedagogical content
knowledge" can be illustrated by considering a
Halloun & Hestenes (1985a,b) HH-type question:

.

A student in a lab holds a brick of weight W in
her outstretched horizontal palm and lifts the
brick vertically upward at a constant speed.
While the brick is moving vertically upward at a
constant speed, the magnitude of the force on the
brick by the student's hand is:

A. constant in time and zero.

B. constant in time, greater than zero, but less than W.

C. constant in time and W.

D. constant in time and greater than W.

E. decreasing in time but always greater than W.

.

Note that the responses include as distractors
not only "D," the common Aristotelian
misconception that "motion requires a net force,"
but also other, less common student
misconceptions, "A" and "E," that might not be
known to traditional teachers. Unfortunately, too
few teachers "shut up and listen to their
students" to find out what they are thinking
(Arons 1981). The distractors are based on my
years of listening to students as they worked
through the experiments in Socratic Dialogue
Inducing Lab #1 "Newton's First and Third Laws"
(Hake 2001). For *actual* HH questions, the
distractors were usually gleaned through careful
qualitative research involving interviews with
students and the analysis of their oral and
written responses to mechanics questions.

.

*Content knowledge* informs the teacher that,
according to Newton's First Law, while the brick
is moving vertically upward at a constant speed
in the inertial reference frame of the lab, the
magnitude of the force on the brick by the
student's hand is constant in time and of
magnitude W, so that the net force on the brick
is zero.
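In symbols, the content-knowledge analysis is just Newton's second law applied to the brick with zero acceleration (constant velocity):

```latex
% Forces on the brick: hand force F (up), weight W (down).
% Constant velocity  =>  zero acceleration  =>  zero net force:
F - W = ma = 0 \quad\Longrightarrow\quad F = W
```

which selects response C.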

.

On the other hand, *pedagogical content
knowledge* would inform the teacher that students
may think that, for example, (a) because a net
force is required to produce motion, the force on
the brick by the student's hand is constant in
time and greater than W, or (b) because the
weight of the brick diminishes as it moves upward
away from the Earth, the force on the brick by
the student's hand decreases in time but is
always greater than W, or (c) no force is exerted
on the brick by the student's hand because as the
student's hand moves up, the brick must simply
move up to stay out of the hand's way.

.

In addition, pedagogical content knowledge
provides a hard-won toolkit of strategies for
guiding the student away from these
misconceptions and towards the Newtonian
interpretation - see, for example, the list of
"Popular IE Methods" in (a) the section of the
same name in Hake (2002), and (b) "Interactive-
engagement methods in introductory mechanics
courses" [Hake (1998b)]. UNFORTUNATELY, SUCH
KNOWLEDGE MAY TAKE MANY YEARS TO ACQUIRE [Wells,
Hestenes, & Swackhamer (1995)].

.

HHHHHHHHHHHHHHHHHHHHHH

.

C. According to Wells, Hestenes, & Swackhamer
(1995) [EMPHASIS in the original]:

.

WHS-WHS-WHS-WHS- WHS-WHS

.

We are now prepared to draw some strong
conclusions about what is most needed to improve
high school physics. . . . .[[the same could be
said for college and university physics]]. . . .
. TEACHER EXPERTISE IS THE CRITICAL FACTOR. The
teacher, above all, determines the quality of
student experience in the classroom. Equipment
and school environment are secondary factors. To
reach and maintain his/her full potential, the
TEACHER MUST BE ENGAGED IN LIFELONG PROFESSIONAL
DEVELOPMENT. It will take at least ten years to
reach the teacher's highest level of competence.
Mere accumulation of academic credits and hours
of classroom teaching count for little, unless
the teacher is consistently engaged in deliberate
effort to improve.

.

TEACHER COMMITMENT IS ESSENTIAL, and individual
teachers, like Malcolm [Wells], can go far in
designing and executing their own programs for
personal development. However, even Malcolm
needed help to reach his peak, so the ULTIMATE
SUCCESS OF EVERY TEACHER DEPENDS ON OPPORTUNITIES
TO DRAW ON THE RESOURCES OF THE PHYSICS
COMMUNITY.

.

. . . . . . . . . . . . . . . .
. . . . . . . .

.

systemic reform will fail unless it
focuses on developing and sustaining teacher
expertise.

.

WHS-WHS-WHS-WHS- WHS-WHS

.

On the other hand, there's no evidence that the
traditional (T) passive-student lecture mode
can result in substantive student learning, even
when administered by teachers thought to be
exemplary on the basis of advanced degrees,
academic credits, hours of classroom teaching,
teaching awards, student evaluations, peer
evaluations, or VAA's.

.

For example, Halloun & Hestenes (1985a) wrote
[bracketed by lines "H&H-H&H-H&H-. . . ."; my
CAPS; my insert at ". . . . .[[insert]]. . . .
."]:

.

H&H-H&H-H&H-H&H-H&H-H&H

.

Within the format of conventional instruction,
wide variations in instructional style are
possible. The styles of the four lecturers in
University Physics listed in Table I differ
considerably. Professor A is a theoretical
physicist. . . .[[Hestenes himself]]. . . .; his
lectures emphasize the conceptual structure of
physics, with careful definitions and orderly
logical arguments. The other professors are
experimental physicists, but with quite different
specialties. Professor B incorporates many
demonstrations in his lectures, and he expends
great time and energy preparing them; he strives
especially to help students develop physical
intuition. Professor C emphasizes problem
solving, and he teaches by example, solving one
problem after another in his lectures. Professor
D is an experimental physicist teaching
introductory physics for the first time; he
followed the book closely in his lectures. All
four professors are known as good teachers
according to informal peer opinion and formal
evaluations by students.

.

INDEED, PROFESSOR B HAS TWICE RECEIVED AWARDS FOR
OUTSTANDING TEACHING. . . . . .[[in research
universities "received teaching awards" is
virtually synonymous with "received outstanding
Student Evaluations of Teaching (SET's)," since
SET's are usually the only gauge of teaching
effectiveness that's used :-(]]. . . . Now, Table
I shows that the BASIC KNOWLEDGE GAIN IS THE SAME
FOR ALL FOUR OF THE CLASSES in University Physics
. . . . .[[the average normalized gains <g> were
deplorably low - about 0.25 - see Hake (1998b,
Table 1c)]]. . . . . All four classes
used the same textbook (Tipler), and covered the
same chapters in it. Considering the wide
differences in the teaching styles of the four
professors, we conclude that THE BASIC KNOWLEDGE
GAIN UNDER CONVENTIONAL INSTRUCTION IS
ESSENTIALLY INDEPENDENT OF THE PROFESSOR. This is
consistent with the common observation among
physics instructors that the most strenuous
efforts to improve instruction hardly seem to
have any effect on general student performance.
H&H-H&H-H&H-H&H-H&H-H&H

.
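For readers outside physics education research: the average normalized gain <g> cited above is Hake's (1998a) measure - the actual class-average pre-to-post-test gain divided by the maximum possible gain, <g> = (%post - %pre)/(100% - %pre). A minimal sketch of the computation (the class scores below are hypothetical, chosen only to reproduce <g> = 0.25):

```python
# Hake's (1998a) average normalized gain <g>: the actual class-average
# pre-to-post gain divided by the maximum possible gain. Illustrative
# sketch only; the pre/post percentages below are hypothetical.

def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Return <g> = (%post - %pre) / (100 - %pre) for class-average scores."""
    if not 0 <= pre_pct < 100:
        raise ValueError("pre-test average must be in [0, 100)")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# A class averaging 40% before and 55% after instruction realizes only a
# quarter of the gain that was possible - the "deplorably low" <g> of
# about 0.25 that Halloun & Hestenes report for all four lecture classes.
print(round(normalized_gain(40.0, 55.0), 2))  # 0.25
```

.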

I suspect that the growing body of educational
research in disciplines other than physics will
eventually show that the above lessons A, B, C
apply to most other academic subjects.

.

IV-IV-IV-IV-IV-IV-IV-IV-IV-IV-IV-IV-IV-IV-IV

.

IV. IS THE 'TEACHER EFFECT' THE DOMINANT FACTOR

.

IN STUDENTS' ACADEMIC GAIN? Judging from all the
above:

.

(a) if "academic gain" means "gain in
higher-level learning for *U.S. K-12 students*,"
then the answer is: "NO! POVERTY is the dominant
factor - see 'I' and 'II' above; the next most
important factor is the degree to which
'Interactive Engagement' is provided by courses
and teachers - see 'III' above";

.

(b) if "academic gain" means "gain in
higher-level learning for *students in higher
education*," then the answer is: "the dominant
factor in promoting academic gain is the degree
to which 'Interactive Engagement' is provided by
courses and teachers - see 'III' above."

.

.

.

Richard Hake, Emeritus Professor of Physics, Indiana University
Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
President, PEdants for Definitive Academic References which Recognize the
Invention of the Internet (PEDARRII)
<rrhake@xxxxxxxxxxxxx>
<http://www.physics.indiana.edu/~hake>
<http://www.physics.indiana.edu/~sdi>
<http://HakesEdStuff.blogspot.com>
<http://iub.academia.edu/RichardHake>

.

.

.

"The [Race to the Top] initiative should support
research based on data that links student test
scores with their teachers, but should not
prematurely promote the use of value-added
approaches (which evaluate teachers based on
gains in their students' performance) to reward
or punish teachers."
"Letter Report to the U.S. Dept. of
Education on the Race to the Top Fund" [NRC
(2009)]

.

.

.

"To develop the genuine understanding of concepts
and theories that underlie [declarative]
knowledge, the college student, no less than the
elementary school child, must engage in deductive
and inductive mental activity coupled with
interpretation of personal observation and
experience. Unfortunately, such activity is
rarely induced in passive listeners, but it can
be nurtured, developed, and enhanced in the
majority of students providing it is
experientially rooted and not too rapidly paced,
and providing the mind of the learner is actively
engaged."
Arnold Arons (1983)

.

.

.

". . . I know from both experience and research
that the teacher is at the heart of student
learning and school improvement by virtue of
being the classroom authority and gatekeeper for
change. Thus the preparation, induction, and
career development of teachers remain the
Archimedean lever for both short- and long-term
improvement of public
schools."
Larry Cuban (2003) in "Why Is It So Hard To Get Good Schools?" (page 1)

.

.

.

REFERENCES [All URL's accessed on 07 April 2011;
some shortened by <http://bit.ly/>.]
Amrein-Beardsley, A. 2008. "Methodological
Concerns About the Education Value-Added
Assessment System," Educational Researcher 37(2):
65-75; online at <http://bit.ly/eSWaQA>. For a
response see Sanders & Wright (2008).

.

.

.

Arons, A.B. 1981. "Thinking, reasoning, and
understanding in introductory physics courses,"
Physics Teacher 19(3): 166-172; online to
subscribers at <http://bit.ly/fjdamv>. See also
Hake (2004).

.

Arons, A.B. 1983. "Achieving Wider Scientific
Literacy," Daedalus, Spring. Reprinted in Arons
(1997).

.

Arons, A.B. 1997. "Teaching Introductory
Physics." Wiley. Amazon.com information at
<http://amzn.to/bBPfop>. Note the searchable
"Look Inside" feature.

.

Baker, E.L., P.E. Barton, L. Darling-Hammond, E.
Haertel, H.F. Ladd, R.L. Linn, D. Ravitch, R.
Rothstein, R.J. Shavelson, & L.A. Shepard. 2010.
"Problems with the use of student test scores to
evaluate teachers," Economic Policy Institute
(EPI) Briefing Paper #278; online as a 315 kB pdf
at <http://bit.ly/h8k5Fw>.

.

Ballou, D., W. Sanders, & P. Wright. 2004.
"Controlling for student background in
value-added assessment of teachers," Journal of
Educational and Behavioral Statistics 29(1):
37-65; online as a 1.8 MB pdf at
<http://bit.ly/fbaDBz>.

.

Berliner, D.C. 2005. "Our Impoverished View of
Educational Reform," Teachers College Record,
August 02, free online as an 872 kB pdf at
<http://bit.ly/ff8BVj>. Berliner argues that:
"poverty places severe limits on what can be
accomplished through school reform efforts,
particularly those associated with the federal No
Child Left Behind law. The data presented in this
study suggest that the most powerful policy for
improving our nation's school achievement is a
reduction in family and youth poverty."

.

Berliner, D.C. 2009. "Poverty and Potential:
Out-of-School Factors and School Success."
Education and Public Interest Center (Univ. of
Colorado) and Education Policy Research Unit,
(Arizona State University); online as a 729 kB
pdf at <http://bit.ly/fqiCUA>.

.

Braun, H.I. 2005. "Using Student Progress To
Evaluate Teachers: A Primer on Value-Added
Models." Educational Testing Service; online as a
29 kB pdf at <http://bit.ly/gZiJdH>.

.

Bruce, M. & Y.D. Nies. 2009. "Obama Unveils Race
for School Billions: Competition for Share of
$4.35B Pot Will Have Winners, Losers, Sec Duncan
Says," ABC News, 24 July; online at
<http://abcn.ws/halIyj>. The authors wrote:
"Despite the many challenges of administering
these grants, the Obama administration is sending
a clear message to America's teachers: Embrace
merit-based pay or risk losing out on millions of
dollars of stimulus money. Long opposed by
teachers' unions, the application requires
educators to be evaluated by the achievement of
their students and calls on states to provide
opportunities for effective teachers to receive
additional compensation. The Race to the Top also
challenges the tenure system by encouraging
states to fire under-performing tenured
teachers." For the latest news on the Race to the
Top see USDE (2010, 2011).

.

Campbell, D.T. 1976. "Assessing the impact of
planned social change," in G. Lyons, ed., "Social
research and public policies: The Dartmouth/OECD
Conference," Chapter 1, pp. 3-45, Dartmouth
College Public Affairs Center, p. 35;
online as a 196 kB pdf at <http://bit.ly/hMsyUr>.

.

Cuban, L. 2003. "Why Is It So Hard To Get Good
Schools?" Teachers College Press, publisher's
information at <http://bit.ly/gSn3P2>. Amazon.com
information at <http://amzn.to/gvxHIb>. Note the
"Look Inside" feature.

.

Doran, H.C. & S. Fleischman. 2005. "Challenges of
Value-Added Assessment," Educational Leadership
63(3): 85-87; online at <http://bit.ly/dF6CuB>.

.

Feder, T. 2009. "What determines how well kids do
in school?" Physics Today 62(12): 28 December, an
abstract is online at <http://bit.ly/hn5u2A>.

.

Haertel, E. 2005. "Using a longitudinal student
tracking system to improve the design for public
school accountability in California," an abstract
is online at <http://bit.ly/gdhf5A>. Haertel
wrote: ". . . .substantial technical hurdles
stand in the way of statewide implementation
within the next few years, especially for
high-stakes accountability purposes in a state as
big and complex as California. As with many
technical innovations, while potential benefits
are real, they may fall well short of some
popular claims and expectations."

.

Hake, R.R. 1998a. "Interactive-engagement vs.
traditional methods: A six-thousand-student
survey of mechanics test data for introductory
physics courses," Am. J. Phys. 66(1): 64-74;
online at <http://bit.ly/d16ne6>. See
also Hake (1998b).

.

Hake, R.R. 1998b. "Interactive-engagement
methods in introductory mechanics courses,"
online at <http://bit.ly/aH2JQN>. Submitted on
6/19/98 to the "Physics Education Research
Supplement to AJP" (PERS), but rejected :-( by
its editor on the grounds that the very
transparent, well-organized, and crystal-clear
Physical-Review-type data tables were
"impenetrable"! This universally ignored crucial
companion paper to Hake (1998a) tabulates and
references: average pre/post test scores,
standard deviations, instructional methods,
materials used, institutions, and instructors for
each of the survey courses of Hake (1998a). In
addition the paper includes: (a) case histories
for the seven IE courses of Hake (1998a) whose
effectiveness as gauged by pre-to-post test gains
was close to those of T courses, (b) advice for
implementing IE methods, and (c) suggestions for
further research.

.

Hake, R.R. 2001. "Socratic Dialogue Inducing
(SDI) labs for introductory physics," online at
<http://www.physics.indiana.edu/~sdi/>.

.

Hake, R.R. 2002. "Lessons from the Physics
Education Reform Effort," Ecology and Society
5(2): 28; online at <http://bit.ly/aL87VT>. For
an update see Hake (2007).

.

Hake, R.R. 2004. "The Arons Advocated Method,"
online as a 144 kB pdf at
<http://bit.ly/boeQQt>. Submitted to the
"American Journal of Physics" on 24 April 2004,
but rejected :-( by an editor who evidently
believed a referee who erroneously claimed that
ARONS DID NO PHYSICS EDUCATION RESEARCH! (Did
ethnographer Margaret Mead <http://bit.ly/eSQat5>
do no anthropological research?) Science
education experts: (a) Anton Lawson
<http://bit.ly/hBRhjb> wrote to me on 29 June
2009: "I liked it. . . . .great job!!" and (b)
Uri Ganiel [<http://bit.ly/diSCGX>/"Professors
Emeriti" where "/" means "click on"] wrote to me
on 6 Feb 2005: "I have by now read your paper:
'The Arons-Advocated Method' and found it very
instructive. I fully agree with your assessment
that Arons was "... along with Robert Karplus one
of the founding fathers of U.S. Physics Education
Research...". I cannot understand the referee's
objection. . . . . The argument of the referee
that you quote: ' ...his activities did not
constitute systematic investigations...' make me
suspect it is someone from the 'educational'
community, with their typical insistence on
'methodologies' taken from psychology or the
social sciences, rather than on a good
understanding of subject matter, identification
of foci of difficulty, combined with sensible
pedagogy - that was what Arons was so good at."
But what do Lawson and Ganiel know
compared with the profound understanding of the
anonymous referee?

.

Hake, R.R. 2007. "Six Lessons From the Physics
Education Reform Effort," Latin American Journal
of Physics, online as a 124 kB pdf at
<http://bit.ly/bjvDOb> (references by number) and
<http://bit.ly/96FWmE> (references by author).
This is a review and update of the six lessons on
"interactive engagement" in Hake (2002).

.

Hake, R.R. 2008. "Design-Based Research in
Physics Education Research: A Review," in
"Handbook of Design Research Methods in
Education: Innovations in Science, Technology,
Engineering, and Mathematics Learning and
Teaching" [Kelly, Lesh, & Baek (2008)] - A pre-
publication version of Hake's chapter is online
as a 1.1 MB pdf at <http://bit.ly/9kORMZ> (1.1
MB).

.

Hake, R.R. 2010a "Re: Fwd: First, Let's Fire All
the Teachers!" online on the OPEN! EDDRA2
archives at <http://yhoo.it/dYsjLB>. Post of 5
March 2010, 10:06 am, to EDDRA2 and ARN-L.

.

Hake, R.R. 2010b. "Re First, Let's Fire All the
Teachers! ADDENDUM," online on the OPEN! EDDRA2
archives at <http://yhoo.it/hUkdvI>. Post of 6
March 2010 to ARN-L, EDDRA2, Math-Teach,
PhysLrnR, and
POD.

.

Hake, R.R. 2011a. "Value-Added Inequities: Should
Value-Added Measures Be Used to Evaluate
Teachers?" online on the OPEN! AERA-L archives at
<http://bit.ly/fN1HmD>. Post of 18 Jan 2011
15:34:47 -0800 to AERA-L and Net-Gold. The
abstract and a link to the complete post were
transmitted to various discussion lists and are
also online on my blog "Hake'sEdStuff" at
<http://bit.ly/h23shQ> with a provision for
comments.

.

Hake, R.R. 2011b. "Re: Value-Added Inequities:
Should Value-Added Measures Be Used to Evaluate
Teachers?" online on the OPEN! AERA-L archives at
<http://bit.ly/hAboJq>, post of 19 Jan 2011
11:36:22 -0800 to AERA-L, EDDRA2, MathEdCC,
Math-Teach, Net-Gold, & PhysLrnR.

.

Halloun, I. & D. Hestenes. 1985a. "The initial
knowledge state of college physics students,"
Am. J. Phys. 53: 1043-1055; online at
<http://bit.ly/b1488v>, scroll down to
"Evaluation Instruments."

.

Halloun, I. & D. Hestenes. 1985b. "Common sense
concepts about motion," Am. J. Phys. 53:
1056-1065; online at <http://bit.ly/b1488v>,
scroll down to "Evaluation Instruments."

.

Halloun, I., R.R. Hake, E.P. Mosca, & D.
Hestenes. 1995. "Force Concept Inventory (1995
Revision)," online (password protected) at
<http://bit.ly/b1488v>, scroll down to
"Evaluation Instruments." Currently available in
20 languages: Arabic, Chinese, Croatian, Czech,
English, Finnish, French, French (Canadian),
German, Greek, Italian, Japanese, Malaysian,
Persian, Portuguese, Russian, Spanish, Slovak,
Swedish, & Turkish.

.

Jordan, H.R., R.L. Mendro, & D. Weerasinghe. 1997.
"Teacher Effects on Longitudinal Student
Achievement: A Report on Research in Progress,"
Dallas Public Schools, online as a 1.1 MB pdf at
<http://bit.ly/eStmZm>.

.

Kelly, A.E., R.A. Lesh, & J.Y. Baek. 2008.
"Handbook of Design Research Methods in
Education: Innovations in Science, Technology,
Engineering, and Mathematics Learning and
Teaching." Routledge, publisher's information at
<http://bit.ly/dkLabI>; Amazon.com information at
<http://amzn.to/aHnWQs>.

.

Korsunsky, B. 2011. "Re: single most dominant
factor affecting student academic gain," PhysLrnR
post of 2 Apr 2011 10:31:12-0400; online on the
PhysLrnR archives at <http://bit.ly/gF3l53>. To
access the archives of PhysLrnR one needs to
subscribe :-(, but that takes only a few minutes
by clicking on <http://bit.ly/beuikb> and then
clicking on "Join or leave the list (or change
settings)." If you're busy, then subscribe using
the "NOMAIL" option under "Miscellaneous." Then,
as a subscriber, you may access the archives
and/or post messages at any time, while receiving
NO MAIL from the list!

.

Kupermintz, H. 2003. "Teacher effects and teacher
effectiveness: A validity investigation of the
Tennessee Value Added Assessment System," Educ.
Eval. Policy Anal. 25: 287-298; an abstract is
online at <http://bit.ly/efeecr>. It reads: "This
article addresses the validity of teacher
evaluation measures produced by the Tennessee
Value Added Assessment System (TVAAS). The system
analyzes student test score data and estimates
the effects of individual teachers on score
gains. These effects are used to construct
teacher value-added measures of teaching
effectiveness. We describe the process of
generating teacher effectiveness estimates in
TVAAS and discuss policy implications of using
these estimates for accountability purposes.
Specifically, the article examines the TVAAS
definition of teacher effectiveness, the
mechanism employed in calculating numerical
estimates of teacher effectiveness, and the
relationships between these estimates and student
ability and socioeconomic background
characteristics. OUR VALIDITY ANALYSES POINT TO
SEVERAL LOGICAL AND EMPIRICAL WEAKNESSES OF THE
SYSTEM, AND UNDERSCORE THE NEED FOR A STRONG
VALIDATION RESEARCH PROGRAM on TVAAS." [My CAPS.]

.

Labov, J.B., S.R. Singer, M.D. George, H.A.
Schweingruber, & M.L. Hilton. 2009. "Effective
Practices in Undergraduate STEM Education Part 1:
Examining the Evidence," CBE Life Sci Educ 8(3):
157-161; online at <http://bit.ly/cRc0JC>. This
article includes a discussion of the "Workshop on
Linking Evidence and Promising Practices in STEM
Undergraduate Education" [National Academies
(2008)].

.

Lissitz, R.W., ed. 2005. "Value Added Models in
Education: Theory and Applications." JAM Press.
Contents and ordering information are online as a
25 kB pdf at <http://bit.ly/fVg407>. You can
safely ignore the phishing site warning.

.

Lissitz, R.W., ed. 2006. "Longitudinal and Value
Added Models of Student Performance." JAM Press.
Contents and ordering information online as a 33
kB pdf at <http://bit.ly/ik8fJk> (You can safely
ignore the phishing site warning.) See also
Lissitz (2005).

.

Marder, M. 2010. "Student Flows in Texas,"
online at <http://bit.ly/gWMKB8>.

.

Marder, M. & D. Bansal. 2009. "Flow and diffusion
of high-stakes test scores," Proceedings of the
National Academy of Sciences 106 (41):
17267-17270, 13 October; online at
<http://bit.ly/hYbbLe>.

.

Millman, J. ed. 1997. "Grading Teachers, Grading
Schools: Is Student Achievement a Valid
Evaluation Measure?" Sage Publications,
publisher's information at
<http://bit.ly/gply6J>. Amazon.com information at
<http://amzn.to/fDKgO6>, note the searchable
"Look Inside" feature.

.

National Academies. 2008. "Workshop on Linking
Evidence and Promising Practices in STEM
Undergraduate Education": (a) introductory
sessions are online at <http://bit.ly/ciNwjQ>;
(b) commissioned Papers are online at
<http://bit.ly/ceg1Bx>. See also the commentary
on this workshop by Labov et al. (2009).

.

NCSU. 2010. "Assessment Instrument Information
Page," Physics Education R & D Group, North
Carolina State University; online at
<http://bit.ly/9gfUpY>.

.

NRC. 2009. National Research Council, Board on
Testing and Assessment (chaired by E.H. Haertel),
"Letter Report to the U.S. Department of
Education on the Race to the Top Fund," online at
<http://bit.ly/dOg8v6>. The NRC's description is:
"This report examines the Race to the Top
initiative - a $4.35 billion grant program
included in the American Recovery and
Reinvestment Act to encourage state-level
education reforms. The report strongly supports
rigorous evaluations of programs funded by the
Race to the Top initiative. THE INITIATIVE should
support research based on data that links student
test scores with their teachers, BUT SHOULD NOT
PREMATURELY PROMOTE THE USE OF VALUE ADDED
APPROACHES, WHICH EVALUATE TEACHERS BASED ON
GAINS IN THEIR STUDENTS' PERFORMANCE, TO REWARD
OR PUNISH TEACHERS. . . . . [[My CAPS]]. . . .
The report also cautions against using the
National Assessment of Educational Progress, a
federal assessment that helps measure overall
U.S. progress in education, to evaluate programs
funded by the Race to the Top initiative."

.

Ravitch, D. 2010. "The Death and Life of the
Great American School System: How Testing and
Choice Are Undermining Education." Basic Books,
publisher's information at
<http://bit.ly/ejC8kb>. Amazon.com information at
<http://amzn.to/faJ9JZ>, note the searchable
"Look Inside" feature. An expurgated Google book
preview is online at <http://bit.ly/e96eJd>.

.

Ricca, B. 2011. "Re: single most dominant factor
affecting student academic gain," PhysLrnR post
of 2 Apr 2011 13:29:56-0400; online at
<http://bit.ly/htAjCj>.

.

Sanders, W.L. & J.C. Rivers. 1996. "Cumulative
and Residual Effects of Teachers on Future
Student Academic Achievement," Research Progress
Report, University of Tennessee Value-Added
Research and Assessment Center; online as a 602
kB pdf at <http://bit.ly/gN8EZI>.

.

Sanders, W.L., A.M. Saxton, & S.P. Horn. 1997.
"The Tennessee Value-Added Assessment System: A
Quantitative, Outcomes-Based Approach to
Educational Assessment," in Millman (1997, pp.
137-162); these pages can be accessed at Amazon's
"Look Inside" feature <http://amzn.to/fDKgO6> for
Millman (1997).

.

Sanders, W.L. 2000. "Value-added assessment from
student achievement data: Opportunities and
hurdles," J. Pers. Eval. Educ. 14(4): 329-339,
online as a 74 kB pdf at <http://bit.ly/ezybVW>.

.

Sanders, W.L., S.P. Wright, & J.C. Rivers. 2006.
"Measurement of Academic Growth of Individual
Students Toward Variable and Meaningful Academic
Standards," in Lissitz (2006).

.

Sanders, W.L. & S.P. Wright. 2008. "A Response to
Amrein-Beardsley (2008) 'Methodological Concerns
About the Education Value-Added Assessment
System,'" online as a 438 kB pdf at
<http://bit.ly/gmnpfX>. They wrote: "The
objective of this response is to give the
rationale for the methodology that we use, to
offer evidence for its robustness, and to note
how this approach mitigates and dampens to near
triviality many of the concerns expressed by
Amrein-Beardsley."

.

Shulman, L. 1986. "Those who understand:
knowledge growth in teaching," Educational
Researcher 15(2): 4-14; online to subscribers at
<http://bit.ly/fMo1de>.



Shulman, L. 1987. "Knowledge and teaching:
foundations of the new reform," Harvard
Educational Review 57: 1-22; online as a 1.7 MB
pdf at <http://bit.ly/iafsB4>.

.

Spencer, R. 2011. "single most dominant factor
affecting student academic gain," PhysLrnR post
of 1 Apr 2011 21:35:31 -0700; online on the
PhysLrnR archives at <http://bit.ly/hAifHY>. To
access the archives of PhysLrnR one needs to
subscribe; see the instructions under Korsunsky
(2011) above.

.

SAS. 2011. Statistical Analysis System -
Education; online at <http://bit.ly/hUpnjT>:
K-12 at <http://bit.ly/h4n1hP>; Higher Education
at <http://bit.ly/hLIQ3K>; White Paper "Ensuring
Effective Data Use in Education" at
<http://bit.ly/hhGvwy> wherein it is stated "How
SAS® Solutions Can Help You Execute the DQC's. .
. .[[probably Data Quality Controls]]. . . . .
Recommended 10 State Actions to Meet NCES
Directives. . . . .[[NCES = National Center for
Education Statistics, part of the U.S. Dept. of
Education's "Institute of Education Sciences"
(IES) that collects, analyzes, and publishes
statistics on education and public school
district finance information in the United
States. ]]. . . .. To track student progress and
trends across districts longitudinally or
historically, the Data Quality Campaign (DQC)
prescribes 10 State Actions. In this paper, learn
how SAS helps states execute on the DQC's 10
State Actions - and ultimately achieve their LDS
goals. . . . [[LDS may stand for "Longitudinal
Data System"]]. . . . Our solutions, consulting
services, and industry best practices not only
give you the integrated functionality you need,
but also dramatically reduce project time, risk
and cost over time."

.

TQ Center. 2011. National Comprehensive Center
for Teacher Quality, online at
<http://www.tqsource.org/>. At "About Us"
<http://bit.ly/iajnqA> under "Mission" it is
stated that: "The TQ Center was created to serve
as the premier national resource to which the
regional comprehensive centers, states, and other
education stakeholders turn for strengthening the
quality of teaching-especially in high-poverty,
low-performing, and hard-to-staff schools-and for
finding guidance in addressing specific needs,
thereby ensuring highly qualified teachers are
serving students with special needs."

.

USDE. 2010. U.S. Dept. of Education, "Nine States
and the District of Columbia Win Second Round
Race to the Top Grants," 24 August; online at
<http://bit.ly/fZys8o>: "The 10 winning Phase 2
applications in alphabetical order are: the
District of Columbia, Florida, Georgia, Hawaii,
Maryland, Massachusetts, New York, North
Carolina, Ohio, and Rhode Island."

.

USDE. 2011. U.S. Dept. of Education, "Race to the
Top Fund," online at <http://1.usa.gov/i5KXLY>.

.

Watanabe, T. 2011. "'Value-added' teacher
evaluations: L.A. Unified tackles a tough
formula," Los Angeles Times, 28 March; online at
<http://lat.ms/h9TUB3>.

.

Wells, M., D. Hestenes, & G. Swackhamer. 1995. "A
Modeling Method for High School Physics
Instruction," Am. J. Phys. 63(7): 606-619; online
as a 115 kB pdf at <http://bit.ly/fT3WDj>.

.

Wright, S.P., S.P. Horn, & W.L. Sanders. 1997.
"Teachers and Classroom Heterogeneity: Their
Effects on Educational Outcomes," Journal of
Personnel Evaluation in Education 11(1): 57-67;
online to subscribers at <http://bit.ly/ha6RHY>.
The abstract reads: "The Tennessee Value-Added
Assessment System (TVAAS) has been designed to
use statistical mixed-model methodologies to
conduct multivariate, longitudinal analyses of
student achievement to make estimates of school,
class size, teacher, and other effects. This
study examined the relative magnitude of teacher
effects on student achievement while
simultaneously considering the influences of
intraclassroom heterogeneity, student achievement
level, and class size on academic growth. THE
RESULTS SHOW THAT TEACHER EFFECTS ARE DOMINANT
FACTORS AFFECTING STUDENT ACADEMIC GAIN and that
the classroom context variables of heterogeneity
among students and class sizes have relatively
little influence on academic gain. Thus, a major
conclusion is that teachers make a difference.
Implications of the findings for teacher
evaluation and future research are discussed."

.

Wright, S.P., W.L. Sanders, & J.C. Rivers. 2006.
"Measurement of Academic Growth of Individual
Students toward Variable and Meaningful Academic
Standards," in Lissitz (2006); online as a 532 kB
pdf at <http://bit.ly/goVdcT>.

.

Wright, S.P. & W.L. Sanders. 2008. "Decomposition
of Estimates in a Layered Value-Added Assessment
Model," presented at the "National Conference on
Value-Added Modeling," online as a 418 kB pdf at
<http://bit.ly/hynTZ0>.

.

Wright, S.P., J.T. White, W.L. Sanders, & J.C.
Rivers. 2010. "SAS-EVAAS Statistical Models,"
online as a 913 kB pdf at <http://bit.ly/hPrO7s>.
See also Wright (2010). For a debate on cons and
pros of SAS-EVAAS see CON: "Methodological
Concerns About the Education Value-Added
Assessment System" [Amrein-Beardsley (2008)];
PRO: "A Response to Amrein-Beardsley (2008)
'Methodological Concerns About the Education
Value-Added Assessment System'" [Sanders & Wright
(2008)].

.

Wright, S.P. 2010. "An Investigation of Two
Nonparametric Regression Models for Value-Added
Assessment in Education," online as a 586 kB pdf
at <http://bit.ly/fkSc1g>.

.

.

.
