Revolution Lullabye

May 1, 2009

Hamp-Lyons and Condon, Questioning Assumptions about Portfolio-Based Assessment

Hamp-Lyons, Liz and William Condon. “Questioning Assumptions about Portfolio-Based Assessment.” CCC 44.2 (1993): 176-190. In Assessing Writing. Eds. Huot and O’Neill. Boston: Bedford/St. Martin’s, 2009. 315-329.

The authors argue that portfolio-based assessments are not inherently better, more valid, or more ethical than other kinds of writing assessments. Making portfolio grading, which is more time-consuming, a better assessment takes considerable critical reflection and work on the part of WPAs and writing instructors. They point out that more texts and genres don’t always make scoring decisions easier, that pedagogical and curricular values aren’t taken into account because they are not articulated, and that collaborative portfolio grading is often conflict-ridden, for it is hard to build consensus over assessment and instructional values. They do not argue for abandoning portfolios; rather, they warn that certain stipulations – like criteria and conversations about program goals and values – must be in place to make portfolios a better assessment.

Quotable Quotes

“Increased accuracy is not an inherent virtue of portfolio assessment” (327).

Carter, A Process for Establishing Outcomes-Based Assessment Plans for Writing and Speaking in the Disciplines

Carter, Michael. “A Process for Establishing Outcomes-Based Assessment Plans for Writing and Speaking in the Disciplines.” Language and Learning Across the Disciplines 6.1 (2003): 4-29. In Assessing Writing. Eds. Huot and O’Neill. Boston: Bedford/St. Martin’s, 2009. 268-286.

Carter outlines how the Campus Writing and Speaking Program, a WAC-like program at NC State (where Chris Anson is), helped departments establish speaking and writing outcomes for their undergraduate majors. Outcome-based assessment asks programs what skills and knowledge graduates should have, how the program helps students achieve these outcomes, and how the program could assess their outcomes and use their assessment for program development. The essay contains a list of questions departments can use to develop both objectives and outcomes (which, unlike objectives, are teachable and measurable), and gives an extended example of the outcomes from the anthropology department. Carter argues that such a discipline-specific assessment broadens both the responsibility of teaching writing and speaking skills to all departments and the timeline in which a student will be able to achieve these communication outcomes.

Notable Notes

outcomes need to be student-centered, faculty-driven, and meaningful (271)

outcome-based assessment does not assume that students will achieve something based on one course; it looks holistically at a whole program to assess its effectiveness in helping students achieve outcomes

compare to the continual improvement assessment in industry (ISO certification) and accountability movement in K-12 schools

the departments can state the disciplinary goals for their majors

what about students not in a traditional major? at schools with more blending capabilities?

articulate an assessment procedure with each department – including things like tests, exit interviews

the function of a speaking/writing professional (a WPA?) changes with outcome-based assessment

Durst, Roemer, and Schultz, Portfolio Negotiations

Durst, Russel K., Marjorie Roemer, and Lucille M. Schultz. “Portfolio Negotiations: Acts in Speech.” In New Directions in Portfolio Assessment. Eds. Black, Daiker, Sommers, and Stygall. Portsmouth, NH: Boynton/Cook, 1994. 286-300. In Assessing Writing. Eds. Huot and O’Neill. Boston: Bedford/St. Martin’s, 2009. 218-232.

Using the conversations of two groups of instructors grading portfolios (one beginning TAs, the other veteran teachers), the authors show how the discussion that takes place is a performative speech act (J.L. Austin): the conversations make judgments, conduct negotiations, and set community standards and values for student writing. They argue that grading papers is an act of reading – a complex and inexact process that will result in inconsistency among graders – but that this inconsistency is a powerful force that can be harnessed for further program development and identity-making.

Haswell and Wyche-Smith, Adventuring into Writing Assessment

Haswell, Richard and Susan Wyche-Smith. “Adventuring into Writing Assessment.” CCC 45 (1994): 220-236. In Assessing Writing. Eds. Huot and O’Neill. Boston: Bedford/St. Martin’s, 2009. 203-217.

Haswell and Wyche-Smith, from Washington State University, explain the process by which they gained direct influence and control over the new writing assessment put into place at their institution, and they use their story to give other WPAs and composition faculty advice on how to create writing assessments. Their advice is four-fold: 1. assume administrators want the writing faculty to create the assessment (even if it seems that they don’t); 2. let local context shape the assessment, not vice versa; 3. take into consideration recent scholarship on assessment; 4. solicit advice and suggestions from the teaching staff, who will be using and maintaining the assessment system.

Quotable Quotes

“Writing teachers should be leery of assessment tools made by others…they should, and can, make their own” (204).

Notable Notes

my idea – look at writing assessment arguments K-U with the quality assurance programs put into place with ISO certification

Elbow and Belanoff, Portfolios as a Substitute for Proficiency Exams

Elbow, Peter and Pat Belanoff. “Portfolios as a Substitute for Proficiency Exams.” CCC 37 (1986): 336-339. In Assessing Writing. Eds. Huot and O’Neill. Boston: Bedford/St. Martin’s, 2009. 97-101.

Elbow and Belanoff describe the process and the benefits of the portfolio evaluation system they piloted at Stony Brook University. Instead of focusing on scoring and ranking essays, the portfolio system they put in place – a pass/fail judgment (C or not) made by the student’s teacher and another instructor – is mastery- and competency-based. The focus of the assessment and the course turns to comments, feedback, advice, and revision, as well as collaboration among teachers. Students see the portfolio assessment (which has a dry run mid-semester) as a hurdle to overcome. Elbow and Belanoff argue that even though the assessment process leads to much debate among teachers, this disagreement and chaos is key to learning and to the development of community standards and values.

Veal and Hudson, Direct and Indirect Measures for Large-Scale Evaluation of Writing

Veal, L. Ramon and Sally Ann Hudson. “Direct and Indirect Measures for Large-Scale Evaluation of Writing.” Research in the Teaching of English 17 (1983): 290-296. In Assessing Writing. Eds. Huot and O’Neill. Boston: Bedford/St. Martin’s, 2009. 13-18.

Using data collected from over 2400 students attending 24 geographically, ethnically, and culturally diverse Georgia high schools, Veal and Hudson argue that holistically-scored essays allow for the best balance of validity, reliability, and cost-effectiveness in writing assessments. The students took different kinds of national achievement tests which scored their essays using holistic, analytic, primary-trait, or mechanics-based assessments, and the researchers detailed the validity, reliability, and costs associated with the different measures.

Quotable Quotes

“The consideration of the user at this point becomes whether the cost or the face validity of a direct assessment of writing is more critical” (18).

Notable Notes

1983 publication, before portfolios came on the scene, doesn’t address larger questions of ethics or the nature of validity
