Pam Moran and I shared and facilitated this conversation today at EduCon; our thanks to the attendees for the rich and meaningful conversation.

Above are the slides from my presentation today to the Association of Colorado Independent Schools heads and senior administrators, a three-hour workshop. Sadly, I (again!) made the mistake of trying to stuff in too much information, and several of our intended activities and videos had to be cut.

Below are, first, some of the key links to which I referred, and below those, some of the videos I showed or intended to show as part of the presentation.

A very valuable reference is the excellent presentation Measuring What We Value by Lyons and Niblock. (Note: I borrowed/adapted a small number of their slides for integration into my presentation; my thanks to Lyons and Niblock.)

My thanks to Lee Quinby, the very fine ACIS executive director, and all those who took the time for the session this morning.

Links:


  • Far more detailed institutional reports
  • Student-level scoring validity
  • Possibility of improved student test-taking motivation
  • Available for 8th graders now
  • Flexible scheduling
  • Lower price: $38
  • Special trial price this spring only: $22

Regular readers here know of my interest in, and on balance enthusiasm for, the CWRA (the College and Work Readiness Assessment), which I have administered as a school head over the course of three years, presented on about half a dozen times, and written about here a dozen times.

Run, don’t walk, to register for the free 30-minute webinar CAE, the CWRA’s parent organization, is offering this week and next about the forthcoming changes in the CWRA. If you can’t attend one of those sessions, a good alternative is reviewing the five-page overview of CWRA changes I’ve embedded at the bottom of this post.

As enthusiastic as I’ve been, I’ve also been a gentle critic on the following fronts.

  • The institutional reporting lacks detail and specificity for use in identifying program gaps and targeting institutional improvement.
  • It is too expensive.
  • Students lack motivation to perform because they have no stake in the game– there is no student-level report.
  • There are not enough pertinent norm groups for comparison– particularly lacking are independent school comparisons.
  • It doesn’t have enough possible purposes beyond an institutional check on student learning.
  • It isn’t available for middle school students.
  • Is automated scoring of essays proven and reliable enough?

And now, here it is: the new CWRA plus addresses nearly all of these issues. I feel almost as if they were listening to me. (Smile)

Performance Task Assessment, sometimes referred to simply as Performance Assessment, is coming soon in a substantial and significant way to K-12 schooling; 21st-century principals and other educational leaders would do well to familiarize themselves with this method and begin to make plans for the successful integration of this new, alternative assessment format.

[the following 10 or so paragraphs lay out some background for my “10 Things;” scroll down to the section heading if you want to skip over the background discussion]

President Obama and Secretary Duncan have been assuring us for several years that they will take standardized testing “beyond the bubble,” and both PARCC and Smarter Balanced are working hard at developing new Common Core assessments using the performance task format.

As PARCC explains,

PARCC is… contracting with [other organizations] to develop models of innovative, online-delivered items and rich performance tasks proposed for use in the PARCC assessments. These prototypes will include both assessment and classroom-based tasks.

Smarter Balanced, meanwhile, states that by 2014-15,

Smarter Balanced assessments will go beyond multiple-choice questions to include performance tasks that allow students to demonstrate critical-thinking and problem-solving skills.

Performance tasks challenge students to apply their knowledge and skills to respond to complex real-world problems. They can best be described as collections of questions and activities that are coherently connected to a single theme or scenario.

These activities are meant to measure capacities such as depth of understanding, writing and research skills, and complex analysis, which cannot be adequately assessed with traditional assessment questions.
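
For readers who think in code, here is one way to picture the structure described above: a single scenario with a set of coherently connected activities, each targeting one of the capacities named. This is only an illustrative sketch of the idea; neither PARCC nor Smarter Balanced publishes its tasks in this form, and every name and field in it is my own invention.

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    """One question or activity within a performance task."""
    prompt: str
    capacity: str  # e.g. "complex analysis", "research skills", "writing"

@dataclass
class PerformanceTask:
    """A collection of activities coherently connected to one scenario."""
    scenario: str  # the real-world theme all activities share
    activities: list[Activity] = field(default_factory=list)

    def capacities_measured(self) -> set[str]:
        """The set of capacities this task claims to assess."""
        return {a.capacity for a in self.activities}

# A hypothetical task: one scenario, several connected activities.
task = PerformanceTask(
    scenario="Advise the city council on a proposed recycling program",
    activities=[
        Activity("Summarize the cost data in the supplied documents", "research skills"),
        Activity("Write a memo recommending a course of action", "writing"),
        Activity("Identify the weakest argument in the opposing brief", "complex analysis"),
    ],
)
print(task.capacities_measured())
```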

Samples of the performance tasks being developed for grades K-8 are available here.

Links of interest:

New York Times article about Character report cards: http://www.nytimes.com/2011/09/18/magazine/what-if-the-secret-to-success-is-failure.html

NAIS monograph, Value Add Measurements of Learning: http://www.nais.org/sustainable/article.cfm?ItemNumber=151607

Contact for the PISA-Based Test for Schools pilot in Canada: For any questions, please email pisabasedtestforschools@oecd.org or Charles Cirtwill at charlescirtwill@aims.ca

My thanks to Sarah Hanawald for the following liveblog transcript of our session Friday at NAIS, where I was joined by CWRA administrator Chris Jackson and Lawrenceville Dean of Faculty (and Klingenstein Curriculum Instructor legend) Kevin Mattingly in presenting on the College and Work Readiness Assessment.
At the end of the session, I mentioned my interest in forming a network of folks interested in working to develop a parallel, CWRA-style assessment for middle school students (as is being done in an interesting way in the Virginia Beach School District, in a program run there by Jared Cotton). If you are interested in being a part of this network, or in being apprised of such activities, please let me know by entering your name and email here.
But first, the session slides:

CWRA Session is packed–folks are standing outside.
11:33
Chris Jackson: Opens with a book reference–Academically Adrift
11:36
Chris Jackson:

The mission, to help schools know how they are doing with what other tests don’t measure. Metrics for the schools.

Subscores on essential areas: critical thinking, analytical reasoning, effective writing, and problem-solving.

As Tony Wagner argues in his essential book, The Global Achievement Gap, I too think that we need to be very concerned that our secondary and college students are not learning what they need to be learning.   We can be deceived: they may go through the motions of learning, and the bright ones (bright from unique combinations of lucky genes, supportive parents/households, and strong K-8 education) may score well enough on the SAT to convey to us we are educating them.   But are we, and how do we know we really are, succeeding in facilitating their development of the essential critical thinking, problem-solving, and writing skills they most need?

Academically Adrift, the new book which I haven’t read but have read several articles about, is about college students, not secondary students, but I believe it holds compelling information for us. From the New York Times article, How Much Do College Students Learn, and Study?:

the authors followed more than 2,300 undergraduates at two dozen universities, and concluded that 45 percent “demonstrated no significant gains in critical thinking, analytical reasoning, and written communications during the first two years of college.”

Resources and Links below, or after the jump.

Some comments:

1.  As much as the audience seemed to appreciate my presentation, some who were there, and some who were not, felt that the topic wasn’t ideally suited for an audience of ed. technologists and librarians, because they are not often involved enough in decision-making about assessment and measurement of learning. (And I am apologetic that my topic was a little off-base for some attendees.)

However, there was definitely interest in some areas of the talk: the topic of computer-adaptive assessment as exemplified by MAP, for one. Some asked me why, when the technology for computer-adaptive assessment has been available for years, it is only now coming online (or, according to Sec. Duncan, won’t be available until 2014). I didn’t know the answer, but others in the audience speculated that it might be because the hardware hasn’t been available in the classroom to exploit computer-adaptive assessment software until now. There was also an illuminating conversation among attendees about new tools via Moodle for teachers to design their own computer-adaptive testing, which was fascinating to me.

The CWRA and CLA ought not to be only tools for assessing educational effectiveness; they must also be tools for informing improved learning of what they assess, and ultimately we need to close the gap and link them back to effective teaching for effective learning of higher-order thinking skills.

It is an empty exercise to assess student learning without providing a means to adjust teaching in response to deficiencies revealed through the information gleaned from those assessments.

So argues Marc Chun, director of education for the Council for Aid to Education (CAE), host of the CWRA and CLA, in an article published in Change magazine: Taking Teaching to Performance Task: Linking Pedagogical and Assessment Practices. (Unfortunately, the article is behind a paywall, but someone has posted it here.) Marc runs a program called CLA in the Classroom and offers workshops around the country.

The critical thing for Marc and his team is to encourage institutions using the Collegiate Learning Assessment not just to measure overall performance, but to connect it with teaching.

Assessment should align with student learning objectives so that what faculty are teaching maps directly into what is being assessed. However, a way to achieve even closer alignment is to seek convergence between pedagogical practice and assessment tools: in other words, for an institution to teach and assess in the same way. Teaching and assessment– so often seen as at odds– instead become coterminous.

To make the learning even more powerful, this can also be done so that both the teaching and assessment mimic how the skills or knowledge will eventually be used.

Last week I spoke at the US Education Department about three “next generation” assessments which I believe can really expand and improve the way we assess and improve learning in our schools; what I didn’t entirely realize was that only two weeks earlier, Secretary Duncan himself had also spoken about next-gen assessments, and our remarks are remarkably aligned.

I have criticized, and many, many of my friends in the Twitter/blogosphere have attacked, Secretary Duncan’s perpetuation of NCLB’s unwarranted, narrow, student soul-deadening, and distorting use and even abuse of fill-in-the-bubble, standardized multiple-choice tests of basic skills. I don’t think we should necessarily eliminate these tests altogether, but I very deeply believe they must be supplemented extensively by tests which far more authentically assess the development of higher-order thinking skills such as analytic reasoning and innovative problem-solving, things that a bubble test can never capture.

I also believe, and also spoke about at the Ed Department, that when and where we do use multiple-choice tests to evaluate more basic skills in math and reading, we should do so in computer-adaptive methodologies that adjust quickly to students’ actual skill levels, inform us far more acutely about students’ proficiencies, and provide that information to teachers immediately, in real time.
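
To make the adaptive idea concrete for readers who like code, below is a deliberately toy sketch of the selection loop: the next item gets harder after a correct answer and easier after a miss, so the test homes in on a student’s level. Real computer-adaptive systems such as NWEA’s MAP use item response theory rather than this simple staircase, so treat every name and number here as illustrative only.

```python
import random

def ask(item):
    """Stand-in for administering an item to a student; here, a coin flip."""
    return random.random() < 0.5

def adaptive_test(item_bank, n_items=10, start_difficulty=5):
    """Toy staircase sketch of computer-adaptive item selection.

    A correct answer raises the next item's difficulty; an incorrect
    answer lowers it. The final difficulty level roughly approximates
    the student's skill level.
    """
    lo, hi = min(item_bank), max(item_bank)
    difficulty = start_difficulty
    responses = []
    for _ in range(n_items):
        item = item_bank[difficulty]        # select an item at the current level
        correct = ask(item)                 # administer it to the student
        responses.append((difficulty, correct))
        # step up after a correct answer, down after an incorrect one
        difficulty = min(difficulty + 1, hi) if correct else max(difficulty - 1, lo)
    return difficulty, responses

# A hypothetical bank of items at difficulty levels 1-10.
item_bank = {d: f"item at difficulty {d}" for d in range(1, 11)}
print(adaptive_test(item_bank))
```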

These two points almost exactly parallel what Secretary Duncan called for in his September 2 Assessment 2.0 speech, Beyond the Bubble Test: Next Generation Assessments. It was also covered in the New York Times on September 4, US Asks Educators to Reinvent Tests, and How They Are Given. (For the record, I published my first piece on this same topic on August 4.) I have already been using the term “next-generation” assessments, but I also appreciate the term Assessment 2.0, and I think readers here can expect to see it appear frequently in the future.

Duncan’s speech, which I quote extensively below, celebrates emphatically that in the future we will have authentic assessments which measure higher-order thinking skills, and computer-adaptive testing which provides real-time assessment of basic skills. He calls upon us to celebrate these valuable, wonderful steps “beyond the bubble tests,” and I join his celebration in offering two cheers.

However, he also says again and again that these will be coming in the future “for the first time,” and that they will not be available until 2014. Hence my singular caveat and objection to his remarks, and I want to say this loudly(!): both of these kinds of “next generation” assessments already exist, in the form of the CWRA and the MAP from NWEA. I should also point out that I believe St. Gregory is the only school in the country (or, if I am wrong, one of no more than a dozen at most), public or private, currently administering both of these assessments.

High School Survey of Student Engagement (HSSSE)

NWEA’s Measures of Academic Progress (MAP)

College and Work Readiness Assessment (CWRA)

NAIS Monograph: Student Outcomes that Measure School’s Value Added

St. Gregory students discuss the CWRA (long version).

Regular readers here know I very frequently write about my enthusiasm for the CWRA; one of my most viewed posts is the video I produced of my students discussing how much they enjoyed taking the test.

Now we’ve received our first institutional report regarding our student performance on the CWRA (College and Work Readiness Assessment), and we are happy to share it here for any and all interested. It is thirty-plus pages. The report itself is largely narrative and explanation; the key data are on pages six through eight. Nowhere in the institutional report are individual student names, or individual student scores, reported.

Our Upper School Head, Susan Heintz, and I have worked hard to generate from this report several key takeaways.

Thirty-three St. Gregory seniors (class of 2010) and forty-seven freshmen (class of 2013) completed the assessment during the school year 2009-10. Forty-nine high schools participated, with a total of 3,332 seniors and 1,775 freshmen. (Not all schools and students were included in all analyses.)  Freshmen at 153 colleges took the CLA, the college equivalent of the CWRA, providing the population for data comparisons.

CWRA Mean Scores (50th percentile performance)

                                          College freshmen    StG freshmen    StG seniors
Mean Raw Score                            1070                1107            1235
Mean Percentile (vs. college freshmen)    50 [by definition]  67              97
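
For those curious about the mechanics behind a line like “mean percentile compared to college freshmen,” here is the generic percentile-rank calculation: the percent of the norm group scoring below a given raw score. The norm scores below are made-up placeholders, not CAE’s data, and CAE’s actual scaling is more involved; only the arithmetic is the point.

```python
from bisect import bisect_left

def percentile_rank(score, norm_scores):
    """Percent of the norm group scoring below `score`.

    This is the textbook formula, not CAE's proprietary method.
    """
    ranked = sorted(norm_scores)
    below = bisect_left(ranked, score)  # count of norm scores below `score`
    return 100 * below / len(ranked)

# Hypothetical norm group standing in for the college-freshman sample
# (the real report draws on CLA results from freshmen at 153 colleges).
norm = [900, 980, 1020, 1070, 1100, 1150, 1180, 1240, 1300, 1350]
print(percentile_rank(1107, norm))  # a raw score of 1107 against this norm
```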

“Not everything that counts can be counted, and not everything that can be counted counts.” – Albert Einstein

I greatly admire Alfie Kohn. I have read him since the eighties; I have heard him speak only once, over a decade ago, but it was unforgettable. I am grateful for his piece in a recent Education Week, “Turning Children into Data: A Skeptic’s Guide to Assessment Programs.”

And yet, I wish for more. The quote Kohn provides from Einstein atop his piece (provided atop my piece) doesn’t call for ignoring all data; it declares that there is indeed data that does and should count, but that we need to be choosy and skeptical about what data we use and how we use it. Unlike Einstein’s quote, Kohn’s piece does not suggest a single thing that should count; it is only about what we shouldn’t count. As a result, it strikes me that he has misappropriated Einstein for his purpose.

As regular readers here know, I think we need to be serious about setting our goals as an educational institution, and then identify what we  things we should count to measure our progress, hold ourselves accountable for success, and most importantly, guide our continuous improvement.   I think too that what we choose to measure sends signals, to teachers, to parents, and to students, about what is most important, and we should be intentional about the signals we send.   (more…)