As Jamie Reverb and Lee Burns write in this fine and valuable piece, among the most important to appear in NAIS’s Independent School magazine in recent years, it is no longer enough to discuss what the aspects of 21st century learning might be; it is time to “actually start being 21st century schools.”

There’s a tendency to continue to do school as usual — tweaking things, rather than embracing serious and necessary innovation.

The authors organize their article, and their work at Lee’s school, Presbyterian Day School (PDS), around Google’s Nine Principles of Innovation.

Ideas Come from Everywhere.

At PDS, they are cracking open the walls of their campus to engage with and study cool developments happening all over the world.   For me, the most important suggestion here is what they are doing with faculty meetings:

At PDS, we have restructured faculty meetings and retreats so that the focus is far less on logistics and far more on provocative questions that engage all of us in discussions.

We are working to reinvent faculty meetings in these same ways at St. Gregory, both in full sessions and in our incredibly valuable Critical Friends Groups, where many “provocative questions” are being pursued.

Share Everything You Can.

The argument here is for transparency in schools: that we find every way we can to put ideas and actions out into the ether to be seen and considered.

Schools are siloed geographically with their egg-carton designs and siloed psychologically with their role-specific emphasis.

Knock down the opaque walls, literally and metaphorically, is what I frequently call for.  I adore our series of classrooms with glass walls along the walkways because of the signal they send: that learning at St. Gregory is visible.

Resources and Links below.

Some comments:

1.  As much as the audience seemed to appreciate my presentation, some who were there, and some who were not, expressed or felt that the topic wasn’t ideally suited for an audience of educational technologists and librarians, because they are not often involved enough in decision-making about the assessment and measurement of learning.  (And I apologize that my topic was a little off-base for some attendees.)

However, there was definitely interest in some areas of the talk: the topic of computer adaptive assessment as exemplified by MAP, for one.  Some asked why, when the technology for computer adaptive assessment has been available for years, it is only now coming online (or, according to Sec. Duncan, won’t be available until 2014).  I didn’t know the answer, but others in the audience speculated that the classroom hardware needed to exploit computer adaptive assessment software simply hasn’t been available until now.  There was also an illuminating conversation among attendees about new Moodle tools that let teachers design their own computer adaptive tests.
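
For readers who haven’t seen computer adaptive assessment in action, here is a minimal conceptual sketch of the idea, in Python. It is emphatically not NWEA’s MAP algorithm or Moodle’s implementation; the item bank, difficulty scale, and step size are all invented for illustration. The point is simply that the test estimates a student’s level as it goes and chooses each next question near that estimate.

```python
# A conceptual sketch of computer adaptive testing (not NWEA's actual MAP
# algorithm): keep a running estimate of the student's level, always ask the
# unused question whose difficulty sits closest to that estimate, and nudge
# the estimate up after a correct answer and down after a miss.
# The item bank, difficulty scale, and step size below are all made up.

ITEM_BANK = {          # hypothetical items, keyed by difficulty
    200: "2 + 3 = ?",
    210: "14 - 6 = ?",
    220: "7 x 8 = ?",
    230: "3/4 + 1/8 = ?",
    240: "Solve for x: 2x + 5 = 17",
}

def run_adaptive_test(answer_is_correct, start=220, step=10, num_items=5):
    """Administer num_items questions, adapting difficulty as we go.

    answer_is_correct(difficulty, question) -> bool stands in for actually
    presenting the question to a student and scoring the response.
    """
    bank = dict(ITEM_BANK)                 # work on a copy so the bank is reusable
    estimate = start
    for _ in range(num_items):
        # pick the remaining item closest in difficulty to the current estimate
        difficulty = min(bank, key=lambda d: abs(d - estimate))
        question = bank.pop(difficulty)
        if answer_is_correct(difficulty, question):
            estimate += step               # correct: try something harder next
        else:
            estimate -= step               # missed: step back down
    return estimate                        # real-time proficiency estimate for the teacher

# Example: a student who answers correctly up through difficulty 225
print(run_adaptive_test(lambda d, q: d <= 225))   # prints 230
```

Real systems use far more sophisticated psychometrics (item response theory rather than a fixed step up or down), but the adaptivity, matching questions to the individual learner in real time, is the heart of it.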

Our middle school has launched a new pilot project using computer adaptive testing, the Measures of Academic Progress (MAP), from NWEA, the Northwest Evaluation Association.

We have administered it only once so far, in September, to the sixth and seventh grades, and our middle school teachers are beginning to use the results we have gotten and to consider how they can best inform and improve learning for our students.

Regular readers know that this blog frequently advocates best practices in next-generation assessments.   I will use this space to update you regularly about our progress.

Our 2nd HSSSE (High School Survey of Student Engagement) report has arrived, and we are delighted with our results.  We are one of only three Arizona schools that administer the HSSSE, because we are serious about our students’ engagement in learning.  Our students continue to outpace the national norms in every category of engagement in learning.  Our students report, at double the rate of the national HSSSE population, that they often write papers of more than five pages, that they often receive helpful feedback on assignments, and that the school emphasizes analyzing ideas in depth.

Two notes on the slides above:  1. The graphs are not easy to read; to view them more clearly, click on the menu in the lower left-hand corner, then “view full screen.”  2. Because the all-school results changed very little from 2009 to 2010, we kept the graphs simpler and less cluttered by using the 2009 all-HSSSE school results as the baseline.

However, in several of the data points we track, our school-wide results declined from 2009 to 2010.  After careful review and scrutiny, we believe this is primarily due to the school population surveyed having reconfigured quite significantly from 2009 to 2010: it went from only 19% freshmen in 2009 to 33% in 2010, so in areas where 9th graders are less likely to respond positively, our overall numbers declined.  For example, 9th graders are likely to experience less opportunity to have a “voice in the classroom,” because our 9th grade classes are more content-driven and more lecture-oriented; 9th graders are also not asked as often to write papers of more than five pages, so our results in those areas declined.  This is not to say we will not give ongoing attention to improving these areas.  We will.
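
To make the arithmetic behind this explanation concrete, here is a small back-of-the-envelope sketch. The only real figures in it are the 19% and 33% freshman shares; the response rates are invented purely for illustration. It shows how a school-wide number can fall even when no grade level’s answers change at all.

```python
# Back-of-the-envelope illustration of the composition effect: the 19% and
# 33% freshman shares come from the post; the response rates are made up.

def school_wide_rate(freshman_share, freshman_rate, upper_rate):
    """Blend two groups' positive-response rates by their share of those surveyed."""
    return freshman_share * freshman_rate + (1 - freshman_share) * upper_rate

# Hypothetical rates: 50% of 9th graders and 80% of upperclassmen answer positively.
rate_2009 = school_wide_rate(0.19, 0.50, 0.80)
rate_2010 = school_wide_rate(0.33, 0.50, 0.80)
print(f"2009: {rate_2009:.0%}  2010: {rate_2010:.0%}")  # 2009: 74%  2010: 70%
```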

A very positive result for us is in the kind of discussions we are facilitating.  A year ago we gave particular attention to this question: “How often have you discussed questions in class that have no clear answers?” (slides 5 and 6).  While other schools’ students reported they did so often or sometimes 72% of the time, our students in 2009 reported that to be the case 82% of the time.  In reviewing it last year, we discussed the importance of having students do so; this also came up repeatedly in our discussions as a faculty last fall in reviewing our 2009 summer reading, Tony Wagner’s The Global Achievement Gap.  The new results are in, and students answering often or sometimes soared to 92%.  (Those answering “often” held even at 45%, so there is still room for improvement there.)

Last week I spoke at the US Education Department about three “next generation” assessments which I believe can greatly expand and improve the way we assess learning in our schools; what I hadn’t entirely realized is that only two weeks earlier, Secretary Duncan himself had also spoken about next-gen assessments, and our remarks are remarkably aligned.

I have criticized, and many, many of my friends in the Twitter/blogosphere have attacked, Secretary Duncan’s perpetuation of NCLB’s unwarranted, narrow, soul-deadening, and distorting use and even abuse of fill-in-the-bubble, standardized multiple-choice tests of basic skills.  I don’t think we should necessarily eliminate these tests altogether, but I very deeply believe they must be supplemented extensively by tests which far more authentically assess the development of higher-order thinking skills such as analytic reasoning and innovative problem-solving, things that a bubble test can never capture.

I also believe, and also spoke about at the Ed Department, that when and where we do use multiple-choice tests to evaluate more basic skills in math and reading, we should do so with computer-adaptive methodologies that quickly configure themselves to students’ actual skill levels, inform us far more acutely about students’ proficiencies, and provide that information to teachers immediately, in real time.

These two points are almost exactly parallel to what Secretary Duncan called for in his September 2 Assessment 2.0 speech, Beyond the Bubble Test: Next Generation Assessments.  It was also written about in the New York Times on September 4, in “US Asks Educators to Reinvent Tests, and How They Are Given.”  (For the record, I published my first piece on this same topic on August 4.)  I have already been using the term “next-generation” assessments, but I also appreciate the term Assessment 2.0, and I think readers here can expect to see it appear frequently in the future.

Duncan’s speech, which I quote extensively below, celebrates emphatically that we will in the future have authentic assessments which measure higher-order thinking skills, and that we will have computer-adaptive testing which provides real-time assessments of basic skills.  He calls upon us to celebrate these valuable, wonderful steps “beyond the bubble tests,” and I join his celebration in offering two cheers.

However, he also says again and again that these will be coming in the future “for the first time,” and that they will not be available until 2014.  Hence my singular caveat and objection to his remarks, and I want to say this loudly (!): both of these kinds of “next generation” assessments already exist, in the form of the CWRA and the MAP from NWEA.  I should also point out that I believe St. Gregory is the only school in the country (or, if I am wrong, one of no more than a dozen at most), public or private, that is currently administering both of these assessments.

High School Survey of Student Engagement (HSSSE)

NWEA’s Measures of Academic Progress (MAP)

College and Work Readiness Assessment (CWRA)

NAIS Monograph: Student Outcomes that Measure School’s Value Added

St. Gregory students discuss the CWRA: long version.

Susan Engel strikes another positive blow for reasonable discourse in thinking about education in her piece entitled “Scientifically Tested Tests”; she is a great asset for those of us trying to chart the right middle course.  In contrast to Alfie Kohn on the left and, let’s say, Secretary Duncan on the right, she holds and articulates the thoughtful centrist position that this blog regularly seeks.

By shifting our assessment techniques, we would learn more of what we really need to know about how children, teachers and schools are doing. And testing could be returned to its rightful place as one tool among many for improving schools, rather than serving as a weapon that degrades the experience for teachers and students alike.

This centrist course is not one of obsessively testing with multiple choice scantrons; it is also not about summarily judging and condemning schools and teachers on narrow bases.

there are few indications that the multiple-choice format of a typical test, in which students are quizzed on the specific formulas and bits of information they have memorized that year, actually measures what we need to know about children’s education.

These tests are easy to administer, but to what end?  I am not the first to say that frequent multiple-choice testing recalls the old joke about looking for the keys under the lamppost: they were dropped in the dark, but the man looks under the light because he can’t (easily) see in the dark.

There are flashlights; let’s use them.  Engel, whom I have praised here before, offers interesting suggestions about authentic assessments that do get at what we really want students to learn.

we should come up with assessments that truly measure the qualities of well-educated children: the ability to understand what they read; an interest in using books to gain knowledge; …

“Not everything that counts can be counted, and not everything that can be counted counts.” – Albert Einstein

I greatly admire Alfie Kohn.  I have read him since the eighties; I have only heard him speak once, over a decade ago, but it was unforgettable.  I am grateful for his piece in a recent Education Week, “Turning Children into Data: A Skeptic’s Guide to Assessment Programs.”

And yet, I wish for more.  The quote Kohn provides from Einstein atop his piece (provided atop my piece as well) doesn’t call for ignoring all data; it declares that there is indeed data that does and should count, but that we need to be choosy and skeptical about what data we use and how we use it.  Unlike Einstein’s quote, Kohn’s piece does not suggest a single thing that should count; it is only about what we shouldn’t count.  As a result, it strikes me that he has misappropriated Einstein for his purpose.

As regular readers here know, I think we need to be serious about setting our goals as an educational institution, and then identify what things we should count to measure our progress, hold ourselves accountable for success, and, most importantly, guide our continuous improvement.  I think, too, that what we choose to measure sends signals, to teachers, to parents, and to students, about what is most important, and we should be intentional about the signals we send.

I’m delighted and excited to have been invited recently to serve as a panelist at September’s annual Private School Leadership Conference, hosted by the US Department of Education’s Office of Non-Public Education within the Office of Innovation and Improvement (that is a mouthful).  I’ve been invited to present on the topic of “aligning data collection with school mission.”

Regular readers here know I have long used this forum to advocate for the College and Work Readiness Assessment (CWRA) and the High School Survey of Student Engagement (HSSSE); speaking at this event to an audience of influential private school educators and association executives will give me a great opportunity to carry forward my advocacy.

It is also a chance to think more thoroughly and more deeply about the panel’s helpful title and framing.  I hadn’t myself focused squarely enough until now upon the simple but elegant and critically important concept of “aligning data collection with school mission,” but that is of course exactly what I have been circling around and trying to bring more fully into focus.  I will be developing my remarks and presentation over the next few weeks, and I will certainly share and post them here, but here is a first stab at summarizing my thesis:

Most of us who are leading in private and independent education place high priority, in our educational missions and throughout our school cultures, upon three core goals:

  • upon delivering and achieving personalized and differentiated teaching and learning which has a significant and positive impact on improving the educational progress of individual learners across a wide range of abilities, maintaining a focus upon the individual rather than the mass of learners;
  • upon forging and sustaining a connected community of engaged, active, intrinsically motivated, extracurricularly involved, technology-employing, hard-working learners; …