Another Sputnik, it was called last week: the latest OECD PISA results were released, and Shanghai schools topped the list, with the US far down the ladder. This is disappointing, to say the least, and we in the US should indeed be deeply concerned.

But let’s be clear about what exactly we should be concerned about. Readers of the New York Times article, for instance (probably the most common source), might not have the opportunity to recognize and appreciate what is really being tested by PISA until the very last sentence.

I fear NY Times readers might read carefully only through to the quote from Secretary Duncan: “The United States came in 23rd or 24th in most subjects. We can quibble, or we can face the brutal truth that we’re being out-educated.”  In doing so, they might think that PISA is a conventional bubble test of “basic skills,” and that what Duncan is suggesting we take from it is that we need more NCLB-type teaching and basic skill development, because unfortunately that is what Duncan’s Department of Education has become known for: NCLB on steroids.

(I don’t think this is an entirely fair characterization of Duncan’s leadership and vision, but it is what has become the connotative representation of his administration thus far.)

This would be wrong. In fact, the PISA results suggest quite the opposite: we need to break away from the curricular narrowing effects of conventional standardized testing, basic skills emphasis, and rote memorization, and unleash in our schools a revolution of applied problem-solving in real-world situations.

Resources and Links below, or after the jump.

Some comments:

1.  As much as the audience seemed to appreciate my presentation, some who were there, and some who were not, felt that the topic wasn’t ideally suited for an audience of ed. technologists and librarians, because they are not often involved enough in decision-making about assessment and the measurement of learning. (I apologize that my topic was a little off-base for some attendees.)

However, there was definite interest in some areas of the talk: the topic of computer adaptive assessment as exemplified by MAP, for one. Some asked me why, when the technology for computer adaptive assessment has been available for years, it is only now coming on-line (or, according to Sec. Duncan, won’t be available until 2014). I didn’t know the answer, but others in the audience speculated that it might be because the hardware hasn’t been available in the classroom to exploit computer adaptive assessment software until now. There was also an illuminating conversation among attendees about new tools in Moodle that let teachers design their own computer adaptive testing, which was fascinating to me.

Our middle school has launched a new pilot project using computer adaptive testing: the Measures of Academic Progress (MAP), from NWEA, the Northwest Evaluation Association.

We have administered it only once so far, in September, to the sixth and seventh grades, and our MS teachers are beginning to use the results and consider how they can best inform and improve learning for our students.

Regular readers know that this blog frequently advocates best practices in next-generation assessments.   I will use this space to update you regularly about our progress.