Links of interest:

New York Times article about Character report cards:

NAIS monograph, Value Add Measurements of Learning:

Contact for the PISA-Based Test for Schools pilot in Canada: for any questions, please email Charles Cirtwill at


Resources and Links below, or after the jump.

Some comments:

1.  As much as the audience seemed to appreciate my presentation, some who were there, and some who were not, felt that the topic wasn't ideally suited for an audience of ed. technologists and librarians, because they are not often involved enough in decision-making about assessment and the measurement of learning.  (I am apologetic that my topic was a little off-base for some attendees.)

However, there was definitely interest in some areas of the talk: the topic of computer adaptive assessment as exemplified by MAP, for one.   Some asked me why, when the technology for computer adaptive assessment has been available for years, it is only now coming online (or, according to Sec. Duncan, won't be available until 2014).  I didn't know the answer, but others in the audience speculated that it might be because the hardware hasn't been available in the classroom to exploit computer adaptive assessment software until now.  There was also an illuminating conversation among attendees about new tools via Moodle for teachers to design their own computer adaptive testing, which was fascinating to me.  (more…)

Our 2nd HSSSE (High School Survey of Student Engagement) report has arrived, and we are delighted about our results.   We are one of only three AZ schools that administer the HSSSE, because we are serious about our students' engagement in learning.  Our students continue to outpace the national norms in every category of engagement in learning.  Our students report at double the rate of the national HSSSE population, for instance, that they often write papers of greater than five pages, often receive helpful feedback on assignments, and that the school emphasizes analyzing ideas in depth.

Two notes on the slides above:  1. The graphs are not easy to read; to view them more clearly, click on the menu in the lower left-hand corner, then "view full screen."  2. Because the all-school results changed very little from 2009 to 2010, we kept the graphs simpler and less cluttered by using the 2009 all-HSSSE school results as the baseline.

However, in several of the data points we track, our school-wide results declined from 2009 to 2010.  After careful review and scrutiny, we believe this is primarily due to the school's population having reconfigured quite significantly from 2009 to 2010: the population surveyed went from only 19% freshmen in 2009 to 33% in 2010.  So in areas where 9th graders are less likely to respond positively (for example, 9th graders are likely to experience less opportunity to have a "voice in the classroom" because our 9th grade classes are more content-driven and more lecture-oriented), our overall numbers declined.  Another example is that 9th graders are not asked as often to write papers of more than five pages in length, so our results in that area declined.  This is not to say we will not give these areas ongoing attention and work to improve them.  We will.

A very positive result for us is in the kind of discussions we are facilitating.   A year ago we gave particular attention to this question: "How often have you discussed questions in class that have no clear answers?" (Slides 5 and 6)  While other schools' students reported they did so often or sometimes 72% of the time, our students in 2009 reported that to be the case 82% of the time.    In reviewing it last year, we discussed the importance of having students do so; this also came up repeatedly in our discussions as a faculty last fall in reviewing our 2009 summer reading, Tony Wagner's Global Achievement Gap.   The new results are in, and students answering often or sometimes soared to 92%.  (Those answering "often" remained even at 45%, so there is still room for improvement there.) (more…)

High School Survey of Student Engagement (HSSSE)

NWEA's Measurement of Academic Progress (MAP)

College and Work Readiness Assessment (CWRA)

NAIS Monograph: Student Outcomes that Measure School’s Value Added

St. Gregory students discuss the CWRA: Long version– click on More. (more…)

"Not everything that counts can be counted, and not everything that can be counted counts." – Albert Einstein

I greatly admire Alfie Kohn.  I have read him since the eighties; I have only heard him speak once, over a decade ago, but it was unforgettable.   I am grateful for his piece in a recent edweek, “Turning Children into Data: A Skeptic’s Guide to Assessment Programs.”

And yet, I wish for more.  The quote Kohn provides from Einstein atop his piece (provided atop my piece) doesn't call for ignoring all data; it declares that there is indeed data that does and should count, but that we need to be choosy and skeptical about what data we use and how we use it.   Unlike Einstein's quote, Kohn's piece does not suggest a single thing that should count; it is only about what we shouldn't count.  As a result, it strikes me he has misappropriated Einstein for his purpose.

As regular readers here know, I think we need to be serious about setting our goals as an educational institution, and then identify what things we should count to measure our progress, hold ourselves accountable for success, and, most importantly, guide our continuous improvement.   I think too that what we choose to measure sends signals, to teachers, to parents, and to students, about what is most important, and we should be intentional about the signals we send.   (more…)

Continuing my review of HSSSE materials, I am seeking to learn more about how schools are using student engagement data (in part in preparation for a presentation next month).     One great source of information is the 2009 report: a 25-page letter on the HSSSE data from 2009 and, more importantly, case studies of how the data are being used for school improvement in five schools or districts.

The only independent school profiled, Explorations Academy, takes its HSSSE data very seriously.

When the HSSSE data come back to the school, there are usually two kinds of initial analyses that emerge from the data: One set of responses are the “congratulations,” the things that students affirm the school is doing well.

Another set of responses are the “eye-openers” for staff, the areas that students say need more work…the school works on these issues, through “robust” staff discussions in which “HSSSE figures pretty prominently”; assumptions are uncovered and tested, and student engagement data are used to plan programs and processes, driven by an important central question: “Will something new gain us an additional unit of educational growth?” (more…)

Next month, as previously mentioned here, I am presenting at the US DoE's Annual Private School Leadership Conference, on the topic of Aligning Data and School Mission.   They have asked me to speak particularly about HSSSE, the High School Survey of Student Engagement; I will also discuss CWRA, the College and Work Readiness Assessment, and MAP, the Measurement of Academic Progress.

In preparation, I am doing some research to learn more about HSSSE, which we have administered here at St. Gregory since 2009.   On Friday, I had a terrific hour-long conversation with Ethan Yazzie-Mintz, HSSSE's director, and he offered me a set of resources that I am now reviewing and will be sharing and discussing on my blog this week.

I have also launched HSSSE user groups on two Nings: on ise-net, for independent school educators, and at EDU-PLN, for the broader audience.

Ethan appears in the video above, which is a nice, gentle introduction to administering the survey at one public high school in Indiana. At this school, the principal explains, they had received a grant to update instructional strategies, and HSSSE gave them data about how students viewed learning.   (more…)

I’m delighted and excited to have been invited recently as a panelist at September’s annual US DoE’s  Office of Non-Public Education-Office of Improvement and Innovation’s Private School Leadership Conference (that is a mouthful).  I’ve been invited to present on the topic of “aligning data collection with school mission.”

Regular readers here know I have long used this forum to advocate for the College and Work Readiness Assessment (CWRA) and the High School Survey of Student Engagement (HSSSE); speaking at this event to an audience of influential private school educators and association executives will give me a great opportunity to carry forward my advocacy.

It is also a chance to think more thoroughly and more deeply about the panel's helpful title and framing. I hadn't myself focused squarely enough until now upon the simple but elegant and critically important concept of "aligning data collection with school mission," but that is of course exactly what I am circling around and trying to get more fully in focus.  I will be developing my remarks and presentation over the next few weeks, and I will certainly share and post them here, but here is a first stab at summarizing my thesis:

Most of us who are leading in private and independent education place high priority, in our educational missions and throughout our school cultures, upon three core goals:

  • upon delivering and achieving personalized and differentiated teaching and learning that has a significant, positive impact on the educational progress of individual learners across a wide range of abilities, maintaining a focus upon the individual and not the mass of learners;
  • upon forging and sustaining a connected community of engaged, active, intrinsically motivated, extracurricularly involved, technology-employing, hard-working learners; (more…)
Law School Survey of Student Engagement

I have written often here about the importance of measuring what matters, and treating the data seriously; I have also written here about the value of the High School Survey of Student Engagement (HSSSE), which has been called an idea to save the world in the Atlantic Monthly.   At my behest as the incoming head,  St. Gregory administered the HSSSE for the first time last spring, and we now have these results to share with readers.

Inside the PowerPoint are a series of 20 graphs, representing our results on about 50 different criteria, with our average in every case compared to the average of the respondents nationally who participated.  The national respondent base represents over 100 diverse schools: private and public; urban, suburban, and rural.   (We believe there is a slight self-selection bias: schools more committed than average to promoting student engagement are more likely to participate in the survey!)    For each set of questions, two graphs are provided: the first compares averages for the single "top" option (among four), the "Often" or "Agree strongly" option.    The second compares the averages for the top two options (among four): "often" and "sometimes," or "agree strongly" and simply "agree."

I am also providing (after the jump) a table of the categories of greatest difference: those where our students reported much stronger engagement than their national peers (there were NO categories where St. Gregory students reported lesser engagement).

Meanwhile, we are also analyzing the results to identify areas which we as a faculty wish to target for improvement in the years to come, a list I will provide at some time in the future.

Category                                                   St. Gregory   National Avg.   Difference
Written paper more than five pages: Often                       51             17            34
Written paper more than five pages: Often and Sometimes         94             51            43
I place a high value on learning: Strong agreement              63             35            28

(more…)

The current Atlantic offers a thought-provoking list of "ideas to save the world."  Leaping out at this blogger is the one entitled "Tell the Truth About Colleges."  Thomas Toch directs a think tank called Education Sector, and here he argues that

influential college rankings like the one published by U.S. News & World Report measure mostly wealth and status (alumni giving rates, school reputation, incoming students’ SAT scores); they reveal next to nothing about what students learn. We need to shed more light on how well colleges are educating their students—to help prospective students make better decisions, and to exert pressure on the whole system to provide better value for money.

I agree; more to the point is my enthusiasm for the tools Toch recommends to do this, to "shed more light on how well colleges are educating": the National Survey of Student Engagement and the Collegiate Learning Assessment.   Like Toch, I think these two tools, when used in combination, can reveal a great deal about how well schools are engaging and preparing their students, and regular readers of this blog know both that I have previously enthusiastically endorsed the secondary school analogues of each of these, the HSSSE and the CWRA, and that we are implementing both at St. Gregory College Prep.   It is great to see these vehicles for promoting school excellence advocated in a national magazine; indeed, to see them labeled as ideas to save the world!   Surveying and testing kids: we are saving the world!