Delighted to have the opportunity to share these slides and thoughts with folks here at OESIS today.  I continue to think that using technologies, current and emerging, to reinvent testing and assessment is among the primary projects for 21st-century K-12 learning in the current decade, and I'm going to continue to do my best to support this reinvention.

As I explained at some length in the opening of my session, and I realize I may stand a bit alone here, I still love tests of all kinds, including the "test" of asking students to demonstrate their learning in challenging ways. A huge part of my personal mission is to make testing more engaging and meaningful for students: let's improve the way we use assessment as, for, and of learning!

I spent most of the day yesterday working with our fine middle school head, Heather Faircloth, preparing the presentation above for last evening's program about our use of MAP, the Measures of Academic Progress.  This is the tool we use for standardized testing, and we administer it three times a year to our students in grades six, seven, and eight.

Often I write about our work to enrich our students with leadership and innovation education, our focus on higher-order thinking skills, our advisory programs, project-based learning, and academic extracurriculars.  But we never forget that all of this rests on a serious and strong foundation of core academic skills, the skills assessed in the MAP testing.

Until very recently here, standardized testing was administered only once a year, as a one-size-fits-all, uniform, paper-and-pencil bubble test.  The results came only months later and were promptly filed, with very little attention given to them, and even then, with very few resources available to make good use of them.  As Mrs. Faircloth explained last night, and as you can see in the above, our use of MAP is different in nearly every way from that previous practice.

MAP is administered three times a year, with results received nearly immediately; over the course of nine test segments, it creates a motion picture of a student's learning in progress, rather than a static snapshot once a year.  It is a computer-adaptive assessment, meaning it quickly conforms itself to the student's individual learning level, and then gives a far closer and clearer view of that student's individual proficiencies and areas of proximate growth, things which are unique to every student. (more…)
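For readers curious about the mechanics: MAP's actual item bank and scoring are proprietary, but the core adaptive loop is simple to illustrate. The minimal Python sketch below is hypothetical, not NWEA's implementation; it just shows the basic idea that difficulty steps up after a correct answer and down after a miss, so the test homes in on a student's level. (Real instruments use item-response-theory scoring, not this crude ladder.)

```python
# Hypothetical sketch of a computer-adaptive test loop.
# Not NWEA's algorithm; real adaptive tests use item response theory.
class AdaptiveTest:
    def __init__(self, start_level=5, min_level=1, max_level=10):
        self.level = start_level       # current difficulty estimate
        self.min_level = min_level
        self.max_level = max_level
        self.history = []              # (level, correct) for each item

    def record(self, correct):
        """Record a response, then step difficulty up or down one level."""
        self.history.append((self.level, correct))
        step = 1 if correct else -1
        self.level = max(self.min_level,
                         min(self.max_level, self.level + step))

    def estimate(self):
        """Crude proficiency estimate: mean difficulty of correctly answered items."""
        right = [lvl for lvl, ok in self.history if ok]
        return sum(right) / len(right) if right else float(self.min_level)
```

Simulating a student who can answer anything below level 7, the test quickly climbs from the starting level and then oscillates around that student's true level, which is the point of adaptivity: most of the items administered are near where the student actually is, rather than spread across the whole scale.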

I greatly enjoyed a six-hour session yesterday with NAIS President Pat Bassett; he spoke on critical trends facing our industry and on generative questions framing the "School of the Future."

I have, for this post, picked out seven of the trends he discussed, summarizing his points and offering a small response to each.

1. Marketing and Communicating Value: Competition among school sectors (public, charter, private, independent, on-line, home schooling) will only continue to intensify.  To flourish, NAIS schools will need to seek and gain a larger share of a declining market: we must work harder to demonstrate value, to make the case for quality education of our kind, and we must discover and distribute a sticky message.  Bassett suggests one possible message for the many consumers struggling with the costs of a quality, national-caliber independent education: You Can't Afford Not To Afford an Independent School!  Many high school graduates attend college, he points out, but far fewer succeed brilliantly there, graduate, and go on successfully to grad school; NAIS grads do, in high numbers.  The investment is worth it, Bassett argues: independent school grads succeed in university in unparalleled proportions.  (more…)

Resources and Links below, or after the jump.

Some comments:

1.  As much as the audience seemed to appreciate my presentation, some who were there, and some who were not, felt that the topic wasn't ideally suited for an audience of ed. technologists and librarians, because they are not often involved enough in decision-making about assessment and the measurement of learning.  (And I do apologize that my topic was a little off-base for some attendees.)

However, there was definite interest in some areas of the talk: the topic of computer-adaptive assessment as exemplified by MAP, for one.  Some asked me why, when the technology for computer-adaptive assessment has been available for years, it is only now coming on-line (or, according to Sec. Duncan, won't be available until 2014).  I didn't know the answer, but others in the audience speculated that it might be because the classroom hardware needed to exploit computer-adaptive assessment software hasn't been available until now.  There was also an illuminating conversation among attendees about new tools via Moodle for teachers to design their own computer-adaptive testing, which was fascinating to me.  (more…)

Our middle school has launched a new pilot project using computer-adaptive testing, the Measures of Academic Progress (MAP), from NWEA, the Northwest Evaluation Association.

We have administered it only once so far, in September, to the sixth and seventh grades, and our MS teachers are beginning to use the results and to consider how they can best inform and improve learning for our students.

Regular readers know that this blog frequently advocates best practices in next-generation assessments.   I will use this space to update you regularly about our progress.

Last week I spoke at the US Education Department about three "next generation" assessments that I believe can really expand and improve the way we assess and improve learning in our schools; what I didn't entirely realize is that only two weeks earlier, Secretary Duncan himself had spoken about next-gen assessments, and our remarks are remarkably aligned.

I have criticized, and many of my friends in the Twitter/blogosphere have attacked, Secretary Duncan's perpetuation of NCLB's unwarranted, narrow, soul-deadening, and distorting use, even abuse, of fill-in-the-bubble, standardized multiple-choice tests of basic skills.  I don't think we should necessarily eliminate these tests altogether, but I very deeply believe they must be supplemented extensively by tests which far more authentically assess the development of higher-order thinking skills such as analytic reasoning and innovative problem-solving, things a bubble test can never capture.

I also believe, and also spoke about at the Ed Department, that when and where we do use multiple-choice tests to evaluate more basic skills in math and reading, we should do so with computer-adaptive methodologies that configure themselves quickly to students' actual skill levels, inform us far more acutely about students' proficiencies, and provide that information to teachers immediately, in real time.

These two points are almost exactly parallel to what Secretary Duncan called for in his Assessment 2.0 speech of September 2, "Beyond the Bubble Test: Next Generation Assessments."  It was also written about in the New York Times on September 4, "US Asks Educators to Reinvent Tests, and How They Are Given."  (For the record, I published my first piece on this same topic on August 4.)  I have already been using the term "next-generation" assessments, but I also appreciate the term Assessment 2.0, and I think readers here can expect to see it appear frequently in the future.

Duncan's speech, which I quote extensively below, emphatically celebrates that we will in the future have authentic assessments which measure higher-order thinking skills, and that we will have computer-adaptive testing which provides real-time assessment of basic skills.  He calls upon us to celebrate these valuable, wonderful steps "beyond the bubble tests," and I join in, offering two cheers.

However, he also says again and again that these will be coming in the future "for the first time," and that they will not be available until 2014.  Hence my singular caveat and objection to his remarks, one I want to state loudly: both of these kinds of "next generation" assessments already exist, in the form of the CWRA and the MAP from NWEA.  I should also point out that I believe St. Gregory is the only school in the country (or, if I am wrong, one of no more than a dozen at most), public or private, currently administering both of these assessments. (more…)

High School Survey of Student Engagement (HSSSE)

NWEA's Measures of Academic Progress (MAP)

College and Work Readiness Assessment (CWRA)

NAIS Monograph: Student Outcomes that Measure School’s Value Added

St. Gregory students discuss the CWRA: long version; click on More. (more…)

"Not everything that counts can be counted, and not everything that can be counted counts." (Albert Einstein)

I greatly admire Alfie Kohn.  I have read him since the eighties; I have only heard him speak once, over a decade ago, but it was unforgettable.  I am grateful for his piece in a recent Education Week, "Turning Children into Data: A Skeptic's Guide to Assessment Programs."

And yet, I wish for more.  The quote Kohn provides from Einstein atop his piece (provided atop mine) doesn't call for ignoring all data; it declares that there is indeed data that does and should count, but that we need to be choosy and skeptical about what data we use and how we use it.  Unlike Einstein's quote, Kohn's piece does not suggest a single thing that should count; it is only about what we shouldn't count.  As a result, it strikes me that he has misappropriated Einstein for his purpose.

As regular readers here know, I think we need to be serious about setting our goals as an educational institution, and then identify what things we should count to measure our progress, hold ourselves accountable for success, and, most importantly, guide our continuous improvement.  I think too that what we choose to measure sends signals, to teachers, to parents, and to students, about what is most important, and we should be intentional about the signals we send.  (more…)

I'm delighted and excited to have been invited as a panelist at September's annual Private School Leadership Conference of the US Department of Education's Office of Non-Public Education, Office of Improvement and Innovation (that is a mouthful).  I've been invited to present on the topic of "aligning data collection with school mission."

Regular readers here know I have long used this forum to advocate for the College and Work Readiness Assessment (CWRA) and the High School Survey of Student Engagement (HSSSE); speaking at this event to an audience of influential private school educators and association executives will give me a great opportunity to carry forward my advocacy.

It is also a chance to think more thoroughly and more deeply about the panel's helpful title and framing.  I hadn't myself focused squarely enough until now upon the simple but elegant and critically important concept of "aligning data collection with school mission," but that is of course exactly what I have been circling around and trying to get more fully into focus.  I will be developing my remarks and presentation over the next few weeks, and I will certainly share/post them here, but here is a first stab at summarizing my thesis:

Most of us who are leading in private and independent education place high priority, in our educational missions and throughout our school cultures, upon three core goals:

  • upon delivering and achieving personalized and differentiated teaching and learning which has a significant and positive impact on the educational progress of individual learners of a wide range of abilities, maintaining a focus upon the individual rather than the mass of learners;
  • upon forging and sustaining a connected community of engaged, active, intrinsically motivated, extracurricularly involved, technology-using, hard-working learners; (more…)