Regular readers know I write frequently about my enthusiasm for the CWRA; one of my most-viewed posts is the video I produced of my students discussing how much they enjoyed taking the test.
Now we’ve received our first institutional report on our students’ performance on the CWRA (College and Work Readiness Assessment), and we are happy to share it here for any and all interested. It runs thirty-plus pages; the report itself is largely narrative and explanation, and the key data are on pages six through eight. Nowhere in the institutional report are individual student names or individual student scores reported.
Our Upper School Head, Susan Heintz, and I have worked hard to distill several key takeaways from this report.
Thirty-three St. Gregory seniors (class of 2010) and forty-seven freshmen (class of 2013) completed the assessment during the school year 2009-10. Forty-nine high schools participated, with a total of 3,332 seniors and 1,775 freshmen. (Not all schools and students were included in all analyses.) Freshmen at 153 colleges took the CLA, the college equivalent of the CWRA, providing the population for data comparisons.
CWRA Mean Scores
| 50th percentile performance | College freshmen | StG freshmen | StG seniors |
|---|---|---|---|
| Mean Raw Score | 1070 | 1107 | 1235 |
| Mean Percentile (compared to college freshmen) | 50 [by definition] | 67 | 97 |
St. Gregory seniors had an “expected” mean raw score of 1185, based on (a) their SAT/ACT scores and (b) the estimated relationship between CWRA scores and SAT/ACT scores among college students. The “observed” or actual mean raw score, seen in the table above, was 1235, indicating that our students demonstrated higher-order thinking proficiency considerably above what their SAT/ACT-measured academic skills would predict.
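For readers who want to see the arithmetic, here is a minimal sketch of the value-add comparison. The regression that produces the “expected” score is CWRA’s, not mine; the numbers below are simply taken from the report:

```python
# Value-add comparison from the report: the "expected" score is CWRA's
# regression-based prediction from our seniors' SAT/ACT scores; the
# "observed" score is the actual mean raw score from the table above.
expected_mean = 1185  # predicted from SAT/ACT (per the report)
observed_mean = 1235  # actual mean raw score

value_added = observed_mean - expected_mean
print(f"Value added: {value_added} raw-score points above expectation")
# Value added: 50 raw-score points above expectation
```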
Decile rank of St. Gregory seniors compared to other CWRA seniors
| Unadjusted | Adjusted for ability* |
|---|---|
| 10th | 8th |
* Ability measured by the Scholastic Level Exam, given when students take the CWRA.
Increase from Freshmen to Seniors
| | St. Gregory seniors |
|---|---|
| Mean Raw Score Points Increase | 128 |
| Percentile Increase, on the College Freshman scale | 30 percentile points (67th to 97th) |
| Median Standard Deviation Increase | .92 (compared to .51 at other high schools) |
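To make the gain figures concrete, here is a small sketch computing them from the tables above. Note one assumption: the report gives the standard-deviation gain (.92) but not the underlying standard deviation itself, so the last step merely back-solves it for illustration:

```python
# Freshman-to-senior gains, computed from the tables above.
freshman_mean, senior_mean = 1107, 1235
freshman_pct, senior_pct = 67, 97

raw_gain = senior_mean - freshman_mean   # 128 raw-score points
pct_gain = senior_pct - freshman_pct     # 30 percentile points

# CWRA expresses the gain as an effect size (gain / standard deviation).
# The SD is not given in the report, so we back it out from the reported
# .92 purely to show the relationship; it is illustrative, not official.
implied_sd = raw_gain / 0.92             # roughly 139 points

print(raw_gain, pct_gain, round(implied_sd))  # 128 30 139
```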
Several notes and observations:
1. I am delighted; I think our students performed beautifully. Not only can we proudly celebrate that our 12th graders performed (at their median) at a thinking level equal to the 93rd percentile of college freshmen, we can also say that our teachers and educational program added enormous value to this thinking proficiency, far more than the norm at other CWRA schools.
2. We will want to track results year to year, and, of course, the big-picture, long-term project would be to do even better in future years: to raise the college-readiness score from the 93rd percentile of college freshmen, for instance, to the 95th. That said, things are always complicated: we know that the higher the baseline performance, the harder it is to raise it further.
We would also want to improve (even further!) our success at raising student proficiency from 9th to 12th grade: to see those 26 percentile points become 30, for instance. There is a glitch, though, at least at present, in comparing freshman to senior performance: we are not comparing the same cohort, but different groups of kids.
Looking back to last year, both the ninth and twelfth grades were perceived to be strong cohorts. Looking ahead to this year, our current twelfth grade certainly has several outstanding individual students (truly so!), but as a whole this smaller-than-average class is not an outstanding one; our new ninth grade, however, appears to be quite strong. Hence, we anticipate a fairly high possibility that the positive value-add effect in the 2011 testing will decline compared to the 2010 value-add.
3. For the results to be more actionable for us, we need more information. This level of information suggests we did well, but it doesn’t tell us where to take the project from here.
I am working on two fronts to address this problem. First, we need more detailed information from CWRA about how our students performed: how well they did in analytic reasoning as compared to problem-solving and written expression. Fortunately, the CWRA folks assure me that in the very next testing cycle, beginning next month, they will report exactly this breakdown to schools.
Second, I need norm groups for comparison composed exclusively of high-performing academic prep schools like ours. I have taken a lead in recent months in working with staff at NAIS to form exactly this: an NAIS cohort of CWRA-administering schools with its own norms for more nuanced comparison; I hope and expect it will be in place a year from now. CWRA, the agency, has been very cooperative with us on this project. Of course, this requires more than establishing the norm group; it also requires more NAIS schools to participate, something I am also working toward by presenting a session about the CWRA and advocating for it at the NAIS annual conference in February.
Other blog posts of mine for more information about CWRA:
CWRA: A True 21st c. Learning Assessment
Excellent CWRA Info. Session at NAIS
2 Articles from Hersh further explaining the CWRA