Revolution Lullabye

May 1, 2009

Hamp-Lyons and Condon, Questioning Assumptions about Portfolio-Based Assessment

Hamp-Lyons, Liz and William Condon. “Questioning Assumptions about Portfolio-Based Assessment.” CCC 44.2 (1993): 176-190. In Assessing Writing. Eds. Huot and O’Neill. Boston: Bedford/St. Martin’s, 2009. 315-329.

The authors argue that portfolio-based assessments are not inherently better, more valid, or more ethical than other kinds of writing assessments. It takes considerable critical reflection and work on the part of WPAs and writing instructors to make portfolio grading, which is more time-consuming, a better assessment. They point out that having more texts and genres doesn’t always make scoring decisions easier, that pedagogical and curricular values often aren’t taken into account because they are never articulated, and that collaborative portfolio grading is often conflict-ridden, since it is hard to build consensus around assessment and instructional values. They do not argue for abandoning portfolios; rather, they warn that certain conditions – like explicit criteria and conversations about program goals and values – must be in place to make portfolios a better assessment.

Quotable Quotes

“Increased accuracy is not an inherent virtue of portfolio assessment” (327).

Veal and Hudson, Direct and Indirect Measures for Large-Scale Evaluation of Writing

Veal, L. Ramon and Sally Ann Hudson. “Direct and Indirect Measures for Large-Scale Evaluation of Writing.” Research in the Teaching of English 17 (1983): 290-296. In Assessing Writing. Eds. Huot and O’Neill. Boston: Bedford/St. Martin’s, 2009. 13-18.

Using data collected from over 2400 students attending 24 geographically, ethnically, and culturally diverse Georgia high schools, Veal and Hudson argue that holistically scored essays allow for the best balance of validity, reliability, and cost-effectiveness in writing assessments. The students took different kinds of national achievement tests, their essays were scored using holistic, analytic, primary-trait, or mechanics-based measures, and the researchers detailed the validity, reliability, and costs associated with each measure.

Quotable Quotes

“The consideration of the user at this point becomes whether the cost or the face validity of a direct assessment of writing is more critical” (18).

Notable Notes

1983 publication, before portfolios came on the scene, doesn’t address larger questions of ethics or the nature of validity

April 29, 2009

Williamson, The Worship of Efficiency

Williamson, Michael. “The Worship of Efficiency: Untangling Theoretical and Practical Considerations in Writing Assessment.” Assessing Writing 1(1994): 147-174. In Assessing Writing. Eds. Huot and O’Neill. Boston: Bedford/St. Martin’s, 2009. 57-80.

Williamson argues that educators must adopt a different educational model – that of the craft workshop – in order to create an assessment theory and practice that breaks the hold of the god-terms of efficiency, fairness, and reliability. Williamson traces how the concept of efficiency drove assessment and educational practices during much of the 20th century, resulting in invalid assessments based on only one data point – assessments grounded in standardized tests – that allowed for the development and dominance of factory and bureaucratic educational models. He points to other assessment practices, like those in France, that rely on interviews and non-standardized assessments given by the teacher, who knows the curriculum and students best. If teachers are to be treated as the professionals that they are, Williamson argues, they should be given the right and the responsibility to develop and give assessments to their students.

Quotable Quotes

“we will need to begin to trust teachers” (78).

“the privilege of true professionalism” (79).

“For the most part, students are assessed, labeled, and placed in school curricula on the basis of their scores on succeeding standardized tests…these tests remain one of the single most important indicators of a child’s future” (67).

“efficiency has governed both the theoretical and practical developments in assessment” (69).

Notable Notes

development of psychometrics to allow for an objective, outside scorer – this is reversed in the craft workshop model with teacher in charge

child-centered assessment v. system-centered assessment

libertarian assessment

history of shift from oral exams to written exams to multiple-choice testing (Arthur Otis)

efficiency is a key American cultural and social force

craft workshop model (Shedd and Bacharach; Schon’s reflective practitioner)

assessment as a contextual, dynamic, continuous, reflective process

assessments with multiple data points converging = valid

April 28, 2009

Yancey, Looking Back as We Look Forward

Yancey, Kathleen Blake. “Looking Back as We Look Forward: Historicizing Writing Assessment.” CCC 50 (1999): 483-503. Reprinted in Assessing Writing: A Critical Sourcebook. Eds. Huot and O’Neill. Boston: Bedford/St. Martin’s, 2009. 131-149.

Yancey divides the history of writing assessment into three “waves.” The first wave (1950-1970) focused on objective, non-essay testing that prioritized efficiency and reliability. The second wave (1970-1986) moved towards holistic scoring of essays, based on rubrics and scoring guides first developed through ETS and AP. The third wave (1986-present) expanded assessment to include portfolios (consisting of multiple examples of student writing) and larger, programmatic assessments. She looks at these waves from several perspectives: at how the concepts of reliability and validity are negotiated and valued; at the struggle between the knowledge of the assessment expert (and psychometrics) and the contextual, local knowledge of the non-expert teacher (and hermeneutics); and at the move of assessment from outside and before the classroom to within and after the class. She closes with concerns and directions for further scholarship and practice in writing assessment, challenging the field to look for ways to use assessment rhetorically and ethically, both to help students and programs develop and to produce scholarly knowledge.

Quotable Quotes

“It is the self we want to teach, that we hope will learn, but that we are often loathe to evaluate. What is the role of the person/al in any writing assessment?” (132).

Notable Notes

The role of the classroom teacher moving into writing assessment: in the 1st wave, testing specialists evaluated, but through the 2nd and 3rd waves, the roles of teacher and evaluator overlapped into the new discipline of writing assessment.

Questions: Who is authorized to speak about assessment? What is the purpose of writing assessment in education? Who should hold the power?

the use of portfolios shifts the purpose and goals of the assessment: using pass/fail instead of scoring, along with communal grading, moves more towards assessing a program and establishing community values than towards assessing individual students. Use different stakeholders to read?

waves fall into each other, aren’t strict lines that categorize what’s happening.

Moss, Can There Be Validity without Reliability?

Moss, Pamela. “Can There Be Validity without Reliability?” Educational Researcher 23.4(1994): 5-12. In Assessing Writing. Eds. Huot and O’Neill. Boston: Bedford/St. Martin’s, 2009. 81-96.

Moss challenges the primacy of reliability in assessment practices, arguing for the value of contextual, hermeneutic alternative assessments that can more accurately reflect the complex nature of writing tasks, knowledge, and performances. She describes the difference between hermeneutic and psychometric evaluation, the latter of which uses outside scorers or readers who do not know the context of the task, the curriculum, or the student as teachers do. Pointing out that many high-stakes assessments are not standardized or generalizable (like tenure reviews or the granting of graduate degrees), she argues that the warrant writing assessment scholars use in arguments for generalizability – the warrant of standardization – needs to be re-evaluated and rearticulated from a hermeneutic perspective. By making reliability (meaning standardization, I think) an option rather than a requirement, assessment practices can be opened up that reflect more of a range of educational goals.

Quotable Quotes

Hermeneutic: “an ethic of disciplined, collaborative inquiry that encourages challenges and revisions to initial interpretations; and the transparency of the trail of evidence leading to the interpretations, which allows users to evaluate the conclusions for themselves” (87).

“There are certain intellectual activities that standardized assessments can neither document nor promote” (84).

“potential of a hermeneutic approach to drawing and warranting interpretations of human products or performances” (85).

Notable Notes

some hermeneutic assessment practices: allowing students to choose the products they feel best represent them (not just the same tasks for all) – fair, ethical, and places agency in the student; also critical discussion and debate during assessment – disagreement does not equate to invalidity – and the importance of a dialogic perspective of a community (what Broad and Huot draw on)

detached, impartial scorers silence the teachers, those who know students and curriculum best

look @ public education accountability movement

April 27, 2009

Huot and O’Neill, Assessing Writing

Huot, Brian and Peggy O’Neill, eds. Assessing Writing: A Critical Sourcebook. Boston: Bedford/St. Martin’s, 2009.

This edited collection, divided into three sections – Foundations, Models, and Issues – focuses on writing assessment that takes place outside of an individual classroom, namely placement and exit exams and programmatic evaluations. It draws on scholarship within the field of composition and rhetoric as well as that from educational evaluation, K-12 education, and measurement and testing. Huot and O’Neill see much of the scholarship in writing assessment, starting in the 1940s, as negotiating the tension between reliable evaluation and valid evaluation, and they argue that writing assessment needs to be taken up critically and reflectively by comp/rhet scholars as a positive and productive force (not a punitive one).

I’ve surveyed this volume (it contains 24 essays along with a selected bibliography) and have read the selections that I’ve seen cited in other scholarship about assessment as well as those that seem particularly helpful for WPAs.

Quotable Quotes

“Writing assessment is an activity – a practice – as well as a theoretically rich scholarly field” (6).

April 15, 2009

Broad, What We Really Value

Broad, Bob. What We Really Value: Beyond Rubrics in Teaching and Assessing Writing. Logan: Utah State UP, 2003.

Broad introduces the practice of dynamic criteria mapping (DCM) as an inquiry-driven alternative to static, traditional rubrics, which have a normative rather than descriptive function and do not even address many of the things that are taught in writing classes (and are therefore not valid assessments). His book is a case study of the use of DCM at “City University,” a university with 4000 students in a 3-course English sequence that is assessed through portfolios, graded collectively by 3-teacher teams. Instead of starting with certain textual features to check off, DCM asks teachers and assessors to describe what they see in a text (good and bad). Together, the instructors find synonyms and antonyms for what they notice, categorize similar ones, and create a visual map that illustrates the values about good writing that the program’s teachers hold collectively. This method, though time-consuming and messy, better articulates the complex processes and ideas that students are showing in their writing. The process is local and site-based: though the method of DCM can be used elsewhere, individual maps cannot be transported across institutions or even across years; it should be a conversation about values that happens continually.

Quotable Quotes

“We can now face the truth equipped with tools (qualitative inquiry) and attitudes (hermeneutics) that help us tap the energy of apparent chaos without being consumed by it. We can embrace the life of things” (137).

“In their rush toward clarity, simplicity, brevity, and authority, traditional scoring guides make substantial knowledge claims based on inadequate research” (3).

“In pursuit of their normative and formative purposes, traditional rubrics surrender their descriptive and informative potential: responsiveness, detail, and complexity in accounting for how writing is actually evaluated” (2).

“The age of the rubric has passed” (4).

Notable Notes

Vinland map – not appropriate now

move to validity (not the same as reliability)

the DCM finds textual criteria and contextual criteria (things not found in the text but that have an impact on assessing; before DCM these have not been visible)

benefits of DCM: 1. student learning (shows writing is more complex, they have a better understanding of what they’re doing well and what teachers are looking for); 2. professional development and community; 3. program development and evaluation; 4. more valid assessment; 5. better relations with the public (values are made public, written down)

drawbacks? time-consuming and needs constant reflection and revisiting

must happen in communal writing assessment so there will be debate, disagreement, and discussion of values.

once the values are visible, you can start having conversations about whether you should value what you do.

a search for truth through hermeneutics, not psychometrics

Huot, (Re)Articulating Writing Assessment

Huot, Brian. (Re)Articulating Writing Assessment for Teaching and Learning. Logan, Utah: Utah State UP, 2002.

Assessment needs to be rearticulated by composition and rhetoric scholars as an important, necessary part of writing scholarship and teaching. Huot addresses assessment in a different way in each chapter (highlighting its connection to student response, teaching students self-assessment, the need to create a field of writing assessment, and a history of writing assessment practices), but all of his studies and discussions point to central principles for his new theory and practice of writing assessment: assessment must be site-based, locally controlled, context-sensitive, rhetorically based, and accessible (to students, the public, teachers, and administrators). Composition and rhetoric scholars and teachers are doing themselves no favors by abdicating assessment to education or to self-appointed writing assessment specialists; assessment is an issue that must be taken up by every WPA and teacher.

Quotable Quotes

“Instead of envisioning assessment as a way to enforce certain culturally positioned standards and refuse entrance to certain people and groups of people, we need to use our assessments to aid the learning environment for both teachers and students” (8).

“People who write well have the ability to assess their own writing, and if we are to teach students to write successfully, then we have to teach them to assess their own writing” (10).

Notable Notes

assessment is articulating what we value; it marks our identities as teachers, programs, and a field; how do our judgments get articulated into our assessments?

Chapter 2 – need to connect comp/rhet with K-12 assessment to create a writing assessment subfield, pooling knowledge and methods and talking about validity

Chapter 3 – need to teach students how to assess their own writing; writing as reflective judgment; use portfolios to full advantage

Chapter 4 – history of assessment practices

Chapter 5 – teacher response to student writing (drawing on Phelps) and the constraint inherent in the act of reading

Chapter 6 – writing assessment is treated like a technology. It needs to be reimagined as research. This changes the role and activity of the assessors (151)

Chapter 7 – the practice of writing assessment needs to be reflective, conscious, theoretical, and instructive. Assessment can be social action, something that the field claims again, led by WPAs and teachers. (175)

movement away from objective rubric-like assessments, more based on community questions, inquiry, research, and practice

technocentric argument (Hawisher) – the tool of the assessment should not drive the practice
