I am delighted to be here for this session; Chris Jackson, CWRA Program Director, and I have been corresponding since fall, and regular readers of my blog know I have frequently touted the virtues of CWRA, the College and Work Readiness Assessment. CWRA first came to my attention last winter, when, as a Klingenstein Visiting Fellow, I was told of it by a fellow fellow, Mark Desjardins, Head of Holland Hall in Oklahoma. But it vaulted further upward in my awareness when I read Tony Wagner's The Global Achievement Gap, in which Wagner raves about CWRA as a very valuable testing tool for assessing the value our schools add for our students, measuring their mastery of 21st century skills such as critical thinking and problem solving.

No bones about it: I am on a CWRA promotion crusade (though I have no affiliation). From what I have learned, CWRA offers a powerful assessment of what our students are learning in the ways that matter most. One of my very first acts as Head-Elect of St. Gregory College Prep has been to secure funding for St. Gregory to join the CWRA family, and I am eager to see many more NAIS schools join the flock.

CWRA is an offshoot, I think it is fair to say, of the Collegiate Learning Assessment (CLA), currently used at 250 colleges and developed by the Council for Aid to Education to measure the value colleges add in these kinds of higher-order thinking skills. But to call it an offshoot is not exactly right; CWRA and CLA are one agency, producing and delivering one test for both college and secondary populations.

The CWRA presentation begins with an overview, energetically presented by the youthful CWRA team, of the data on student critical thinking skills (and the lack thereof). I am a little sorry the room here is not jam-packed; sadly, there are only about 16 of us. We are evaluating tools for assessment, using the example of jumping jacks, which are now performed in front of us. Lively, fun.

Chris is now sharing the CWRA Twitter feed, where you can follow news from the team.

CWRA, imho, has carved out the turf on 21st c. learning assessment, leading the way in measuring critical thinking, analytic reasoning, problem solving, and written communication in a holistic, open-ended manner. Students prepare constructed responses based on real-life scenarios; the test assesses a student's ability to use information to prepare a response.

The presenters share a sample performance task from a "retired" test. The student is working for the mayor of a town where the key issue is the growing crime rate. The artifacts provided include a newspaper article about a crime in which the robbery suspect is a drug addict; a table of statistics on crime and drug use in the community; a research memo from a think tank on how a drug treatment program reduced crime in another community; a graph of the number of crime incidents and of police officers on the beat; a chart of robberies and drug use in the community by zip code; and a research abstract on drug prevention and crime reduction results on a wider basis than the previous item. Students have to select the right sources of data, analyze the data and the issues of correlation and causation, and use their problem-solving and writing skills to produce a memo for the mayor.

Scorers look for whether students can establish a thesis, maintain it, support it with examples, and anticipate and counter opposing arguments; and for whether they can identify logical flaws, evaluate evidence, analyze and synthesize evidence, and draw conclusions.

CWRA is fascinating not just because it measures the right kinds of skills, but because it is structured carefully as a value-add measurement. Students are tested at the beginning of their freshman year and at the end of senior year, so CWRA can gauge how much their thinking skills grew over those four years. An example is provided: one school may have students start at level 5 (on a 1-10 scale) and end at 9; another school might begin kids at a 2 and finish them at an 8. To make the point obvious: the second school, though scoring lower, has done a better job of providing its students a value-add education. (Cf. Malcolm Gladwell, who sounded off very loudly on this topic last year at NAIS in NYC: we should rank colleges by their value-add, not by their performance in the abstract!)

One nice feature of this CWRA session is the smarts of the people working here: listening to them speak about sample size, value-add assessment, and other topics is a real pleasure. The CWRA team is really a joint team, working as one unit on both CWRA and the CLA, the Collegiate Learning Assessment; it is a team that has to work closely with college administrators all over, and to effectively articulate its program for that market it needs to know what it is talking about. Yes, CWRA itself is still very small (only 16 schools at present), but it is "drafting" off the much larger CLA, a draft that makes CWRA an incredible value for secondary schools.

As much as I like to enthuse about CWRA, and I want to heartily commend those NAIS schools already using it, such as Lawrenceville, Holland Hall, Hotchkiss, St. Andrews, Montclair-Kimberley, and Severn (among others), I need to add that there are still important issues to consider. CWRA uses the same performance tasks for its student assessments that CLA uses for college students. These are challenging tasks, and it worries me that they might be awfully onerous for some of our 14-year-old freshmen; I would hate for our kids to be demoralized or discouraged by the experience. I find myself wishing, though the idea requires more thought, that CWRA will eventually separate itself from CLA, or at least develop a more independent identity and test, to alleviate this issue and better serve the specific needs of secondary schools. I should add that the three fine CWRA/CLA staff members say they are not getting negative feedback from CWRA schools on this point, and that they score the tests with a rubric sophisticated and thorough enough to effectively measure ninth grade achievement.