Pam Moran and I shared and facilitated this conversation today at EduCon; our thanks to the attendees for the rich and meaningful conversation.
October 19, 2010
Performance Task Assessment and Teaching: Learning from Chun and CLA/CWRA
Posted by Jonathan Martin under Uncategorized | Tags: Assessment, CWRA, perf/task/assmt | 4 Comments
The CWRA and CLA ought not to be only tools for assessing educational effectiveness; they must also inform improved learning of what they assess, and ultimately we need to close the gap by linking them back to effective teaching of higher order thinking skills.
It is an empty exercise to assess student learning without providing a means to adjust teaching in response to the deficiencies those assessments reveal.
So argues Marc Chun, director of education for the Council for Aid to Education (CAE), the organization behind the CWRA and CLA, in an article published in Change magazine: Taking Teaching to Performance Task: Linking Pedagogical and Assessment Practices. (Unfortunately, the article is behind a paywall, but someone posted it here.) Marc runs a program called CLA in the Classroom and offers workshops around the country.
The critical thing for Marc and his team is to encourage institutions using the Collegiate Learning Assessment not just to measure overall performance, but to connect it with teaching.
Assessment should align with student learning objectives so that what faculty are teaching maps directly into what is being assessed. However, a way to achieve even closer alignment is to seek convergence between pedagogical practice and assessment tools: in other words, for an institution to teach and assess in the same way. Teaching and assessment, so often seen at odds, instead become coterminous.
To make the learning even more powerful, this can also be done so that both the teaching and assessment mimic how the skills or knowledge will eventually be used. (more…)
September 29, 2010
Secretary Duncan’s Call for Assessment 2.0: Two Cheers and a Caveat
Posted by Jonathan Martin under Uncategorized | Tags: Assessment, CWRA, Data, MAP | 1 Comment
Last week I spoke at the US Education Department about three “next generation” assessments which I believe can really expand and improve the way we assess, and thereby improve, learning in our schools; what I didn’t entirely realize is that only two weeks earlier, Secretary Duncan himself had also spoken about next-gen assessments, and our remarks are remarkably aligned.
I have criticized, and many, many of my friends in the Twitter/blogosphere have attacked, Secretary Duncan’s perpetuation of NCLB’s unwarranted, narrow, student-soul-deadening, and distorting use and even abuse of fill-in-the-bubble, standardized multiple-choice tests of basic skills. I don’t think we should necessarily eliminate these tests altogether, but I very deeply believe they must be supplemented extensively by tests which far more authentically assess the development of higher order thinking skills such as analytic reasoning and innovative problem-solving, things that a bubble test can never capture.
I also believe, and also spoke about at the Ed Department, that when and where we do use multiple-choice tests to evaluate more basic skills in math and reading, we should do so with computer-adaptive methodologies that configure themselves quickly to students’ actual skill levels, inform us far more acutely about students’ proficiencies, and provide that information to teachers immediately, in real time.
These two points are almost exactly parallel to what Secretary Duncan called for in his Assessment 2.0 speech, Beyond the Bubble Test: Next Generation Assessments, on September 2. It was also written about in the New York Times on September 4: US Asks Educators to Reinvent Tests, and How They Are Given. (For the record, I published my first piece on this same topic on August 4.) I have already been using the term “next-generation” assessments, but I also appreciate the term Assessment 2.0, and I think readers here can expect to see it appear frequently in the future.
Duncan’s speech, which I quote extensively below, celebrates emphatically that we will in the future have authentic assessments which measure higher order thinking skills, and that we will have computer-adaptive testing which provides real-time assessment of basic skills. He calls upon us to celebrate these valuable, wonderful steps “beyond the bubble tests,” and I join him in offering two cheers.
However, he also says again and again that these will be coming in the future “for the first time,” and that they will not be available until 2014. Hence my singular caveat and objection to his remarks, and I want to say this loudly: both of these kinds of “next generation” assessments already exist, in the form of the CWRA and the MAP from NWEA. I should also point out that I believe St. Gregory is the only school in the country (or, if I am wrong, one of no more than a dozen at most), public or private, currently administering both of these assessments. (more…)
September 22, 2010
Aligning Data with Mission: US Dept. of Ed. Presentation, Sept. 22
Posted by Jonathan Martin under Uncategorized | Tags: CWRA, Data, HSSSE, MAP | 1 Comment
High School Survey of Student Engagement (HSSSE)
NWEA’s Measures of Academic Progress (MAP)
College and Work Readiness Assessment (CWRA)
NAIS Monograph: Student Outcomes that Measure School’s Value Added
St. Gregory students discuss the CWRA: long version; click on More. (more…)
September 16, 2010
St. Greg’s CWRA report: We are very pleased
Posted by Jonathan Martin under Uncategorized | Tags: CWRA, St. Gregory | Leave a Comment
Regular readers here know I very frequently write about my enthusiasm for the CWRA; one of my most viewed posts is the video I produced of my students discussing how much they enjoyed taking the test.
Now we’ve received our first institutional report regarding our student performance on the CWRA (College and Work Readiness Assessment), and we are happy to share it here for any and all interested. It is thirty-plus pages. The report itself is largely narrative and explanation; the key data are on pages six through eight. Nowhere in the institutional report are individual student names, or individual student scores, reported.
Our Upper School Head, Susan Heintz, and I have worked hard to distill several key takeaways from this report.
Thirty-three St. Gregory seniors (class of 2010) and forty-seven freshmen (class of 2013) completed the assessment during the school year 2009-10. Forty-nine high schools participated, with a total of 3,332 seniors and 1,775 freshmen. (Not all schools and students were included in all analyses.) Freshmen at 153 colleges took the CLA, the college equivalent of the CWRA, providing the population for data comparisons.
CWRA Mean Scores
| 50th percentile performance | College freshmen | StG freshmen | StG seniors |
| Mean Raw Score | 1070 | 1107 | 1235 |
| Mean Percentile (compared to college freshmen) | 50 [by definition] | 67 | 97 |
September 14, 2010
Responding to Alfie Kohn’s Turning Children into Data
Posted by Jonathan Martin under Uncategorized | Tags: CWRA, Data, HSSSE, MAP | 2 Comments
“Not everything that counts can be counted, and not everything that can be counted counts.” Albert Einstein
I greatly admire Alfie Kohn. I have read him since the eighties; I have only heard him speak once, over a decade ago, but it was unforgettable. I am grateful for his piece in a recent Education Week, “Turning Children into Data: A Skeptic’s Guide to Assessment Programs.”
And yet, I wish for more. The quote Kohn provides from Einstein atop his piece (and atop mine) doesn’t call for ignoring all data; it declares that there is indeed data that does and should count, but that we need to be choosy and skeptical about what data we use and how we use it. Unlike Einstein’s quote, Kohn’s piece does not suggest a single thing that should count; it is only about what we shouldn’t count. As a result, it strikes me that he has misappropriated Einstein for his purpose.
As regular readers here know, I think we need to be serious about setting our goals as an educational institution, and then identify what things we should count to measure our progress, hold ourselves accountable for success, and, most importantly, guide our continuous improvement. I think, too, that what we choose to measure sends signals, to teachers, to parents, and to students, about what is most important, and we should be intentional about the signals we send. (more…)