
  • Far more detailed institutional reports
  • Student-level scoring validity
  • Possibility of improved student test-taking motivation
  • Available for 8th graders now
  • Flexible scheduling
  • Lower price– $38
  • Special trial price this spring– only $22

Regular readers here know of my interest in, and on balance enthusiasm for, the CWRA– the College and Work Readiness Assessment, which I have administered as a school head over the course of three years, presented on about half a dozen times, and written about here a dozen times.

Run, don’t walk, to register for the 30-minute free webinar CAE, CWRA’s parent organization, is offering this week and next about the forthcoming changes in the CWRA.  If you can’t attend one of these sessions, you’ll do pretty well as an alternative by reviewing the five-page overview of CWRA changes I’ve embedded at the bottom of this post.  (Be sure to click “more” to see it if you are interested.)

As enthusiastic as I’ve been, I’ve also been a gentle critic on the following fronts.

  • The institutional reporting lacks detail and specificity for use in identifying program gaps and targeting institutional improvement.
  • It is too expensive.
  • Students lack motivation to perform because they have no stake in the game– there is no student-level report.
  • There aren’t enough pertinent norm groups for comparison– particularly independent school comps.
  • It doesn’t have enough possible purposes beyond an institutional check on student learning.
  • It isn’t available for middle school students.
  • Is automated scoring of essays proven and reliable enough?

And now, here it is: the new CWRA+ addresses nearly all of these issues.  I feel almost as if they were listening to me.  (Smile) (more…)

SSATB Think Tank


The New York Times Notable Books of 2012 list was published this week, and it was good to see its recognition of Paul Tough’s new book, How Children Succeed: Grit, Curiosity, and the Hidden Power of Character. Surely many blog readers who have not yet gotten to this new book will recall Tough’s widely circulated article featured on the cover of the New York Times Magazine in September 2011: “What if the Secret to Success Is Failure?”

That article, which sits at the center of the new book, describes the work being conducted at two New York City schools that serve very different populations: Riverdale Country School (an SSATB member) and a KIPP middle school. School leaders at both schools, however, have teamed to develop new tools and techniques to both cultivate and assess a set of character skills and attributes– particularly so-called “performance character”– believing them to be equally important or, if we look to the Darwin quote above, superior to traditionally defined intelligence in making for future success.

As is fairly well known within our association, Choate Rosemary Hall has undertaken such an experiment over the past decade in an extraordinarily impressive way, as part of a collaboration with Dr. Robert Sternberg, a former Choate parent, and at the time head of the PACE (Psychology of Abilities, Competencies, and Expertise) Center. Choate Admission Director Ray Diffley’s leadership of this project, and the lessons he has learned from it, have brought him to the leadership of our Think Tank.

Choate’s work expanding its range of admission assessments has had several iterations. In its earlier, expansive version, which included a wide array of student assessments and tasks, it found that “three consistent variables among students best predicted a student’s ability to thrive at Choate….

Read the full post by clicking here.

The Innovation Portal - Online Collaboration for the Creation of Engineering Portfolios

Two of my great interests and enthusiasms regarding 21st century learning have, until now, felt a bit divorced from and at odds with each other.  Yesterday, however, I learned more about a fascinating bridge developing between them.

The first is high quality, authentic 21st century assessment: if we are going to make new pathways to learning that are more meaningful for students, better preparation for the futures they are inheriting, and more engaging for the people they are today, we need tools that allow us to evaluate their learning effectively– both to provide meaningful endorsement of these learning paths for the skeptical and, more importantly, to correct our courses so we can keep improving.

The second is the joyful messiness of open-ended and unstructured project-based learning that is found in Fab labs, design-build studios, design thinking centers, and maker-faire type spaces.   These places ought to be free from tight strictures– they should celebrate experimentation, learning by doing, trial and error, fast-failure, and never be stifled by narrow or miserable “testing.”

It might seem cruel to introduce assessment to these labs and studios, but I want those teachers and students who wish to build in more structure– so that they can better evaluate their own progress, get external feedback, and meaningfully improve their work– to have quality ways to do so.

Clearly I am not the only one to think this (and I never am).

The Innovation Portal was launched in the last year or so (with strong support from Project Lead the Way, itself also a valuable resource), and as you view the site you can see it is still developing and rounding out.  It provides a platform for

students to create, maintain and share digital portfolios. The portfolios can be used to meet a class requirement or they can be used to submit the portfolio to a scholarship or open contest. The contest owners – or anyone else invited by the student – can evaluate a student’s portfolio. (more…)

The Fall issue of the Secondary School Admission Test Board member newsletter, Memberanda, was just published, with my article introducing its new think tank.  Click here to read the full article; below is the top section.

—–

“Sure she’s smart, but I wish I could tell how creative she is: can she think out of the box?”

“His scores are middling, but he seems pretty motivated and persevering: too bad there isn’t a way to measure that.”

“You know, she had some difficulties on the test, but this teacher says she is a leader in her class – will the committee trust this one recommendation?”

Gather any group of admission directors together and before long, the conversation invariably turns to the issue of testing. While testing is useful and does tell us something about how a student will perform, there is so much more that we want to know about our applicants and so much more that is important about what they can bring to our school. How can we capture more information about our applicants?

SSATB recognizes that in the 21st century the nature of testing and assessment is changing and that its member schools are seeking new ways to assess diverse applicants’ readiness for their academic programs and educational settings. In response, SSATB has convened a Think Tank on innovation in assessment…. Read more. 

One of my main projects this year is serving as a member and consultant/writer for the Secondary School Admission Test Board (SSATB) Think Tank on the Future of Admissions Assessment.  More information on the Think Tank is here; its charge is here. As part of my work I am posting a monthly column for the Think Tank; below is a “teaser” for that post.  Click the link here or at bottom to read it in full.

“Creativity,” Dr. Sternberg replied, when asked what addition to admissions assessment he would recommend if he had to limit himself to just one. Coming from the SSATB 2012 Annual Meeting’s keynote speaker, the former President of the American Psychological Association, and arguably the world’s foremost scholar of – and experimental practitioner in – expanded admissions assessment, this is compelling counsel for our Think Tank’s work.

Using Sternberg as a framer and guide for the work of the Think Tank on the Future of Admissions Assessment is a no-brainer, and our time with him in Chicago was enormously valuable. In this post, we’ll take a deeper dive into assessing creativity; in future posts we’ll look at other Sternberg recommendations and many other aspects of expanded assessment for admissions.

Sternberg’s recommendation to prioritize creativity is both narrowly pragmatic and broadly idealistic.  Read on….

Creativity and Ethical Mindset: These are what we should be assessing in learning, PK-16, in addition to analytic intelligence. So says renowned author and academic, and former President of the American Psychological Association, Robert Sternberg.

Important: All slides above are Sternberg’s, from his presentation today at SSATB.

Sternberg matters, and he deserves even wider and deeper appreciation and influence in PK-12 education than he already has; I am sometimes surprised that he is not more frequently a reference point in 21st century learning. It may be because his work has been primarily in post-secondary education that K-12 folks overlook him, but as he said today, and as I believe firmly, his work is in almost every way entirely suited for application in our domain.

Ray Diffley, the trailblazing Director of Admissions for Choate Rosemary Hall, introduced Sternberg and labeled him the single most important thinker on expanding and revamping educational assessment in the nation today.

Sternberg, who is in his sixties and has 20-month-old triplets (!), couldn’t actually attend in person, due to his airplane’s equipment failure, but his virtual contribution worked just fine.

A few observations about Sternberg follow– but before you read any further, be sure to view the slides, all of them, if you haven’t seen him present. This session was a very valuable and sweeping overview of his essential themes and thoughts, and the slides convey a very high proportion of what he said in this session.

1. He clearly deeply cares about kids, his own kids and all others, and works always from a foundation of personal experience, his own learning journey.

2. He has walked this talk– he doesn’t just research how assessment can change, or theorize about what it could be; again and again, at different universities and schools, he has implemented these assessments. Rarely do you find someone who has done more to blend theorizing and implementation.

3. Expanding assessment is both practical and idealistic.
(more…)

Across the breadth of the 21st century learning movement, the question of which critical skills and aptitudes we assess (and how we do so) looms large.

Very often we are discussing how to broaden what we assess beyond the narrowly defined cognitive skills that are most frequently in our sights: reading comprehension and literacy, math and quantitative analysis.  We argue for assessing creativity, collaboration, and communication; we are looking for ways to evaluate student character: integrity, compassion, resilience, perseverance, grit, empathy.

Keith Stanovich, who is new to me, I am afraid to say, offers a different tack in his highly acclaimed 2009 book, What Intelligence Tests Miss: The Psychology of Rational Thought, which won the 2010 Grawemeyer Award in Education.  Stanovich pleads with his readers, quite passionately I would say, to broaden our gaze beyond intelligence (IQ-g, and SAT-style testing, which Stanovich says “has remained constant[ly] a stand-in for an IQ test”), but not beyond the cognitive.  We must divide the cognitive into two quite distinct arenas– so distinct that it is easy to find abundant examples of individuals strong in one but not the other: intelligence (IQ-g) and “rational thought.”

Beyond asking us to recognize that there are two distinct cognitive domains, Stanovich makes the case that rational thought is the ugly step-sister of intelligence, neglected and overlooked by society, and yet is ultimately of equal or greater significance for the makings of success in all that we do, and hence deserving of a great leap forward in valuation by schools and employers.

In short, we have been valuing only the algorithmic mind and not the reflective mind.  This is in part the result of historical accident: we had measures of algorithmic-level processing efficiency long before we had measures of rational thought and the operation of the reflective mind.

The lavish attention devoted to intelligence (raising it, praising it, worrying when it is low, etc.) seems wasteful in light of the fact that we choose to virtually ignore another set of mental skills with just as much social consequence– rational thinking mindware and procedures.

I simply do not think that society has weighed the consequences of its failure to focus on irrationality as a real social problem.    These skills and dispositions profoundly affect the world in which we live.

As an illustration to open the book, Stanovich offers a very engaging discussion comparing Presidents, quoting in an opening epigram a wonderfully revealing self-assessment from George W. Bush:

I’m also not very analytical.  You know I don’t spend a lot of time thinking about myself, about why I do things. (more…)

Performance Task Assessment, sometimes referred to simply as Performance Assessment, is coming soon in a substantial and significant way to K-12 schooling; 21st century principals and other educational leaders would do well to familiarize themselves with this method and begin to make plans for successful integration of this new, alternative assessment format.

[the following 10 or so paragraphs lay out some background for my “10 Things;” scroll down to the section heading if you want to skip over the background discussion]

President Obama and Secretary Duncan have been assuring us for several years that they will take standardized testing “beyond the bubble,” and both PARCC and Smarter Balanced are working hard at developing new Common Core assessments using the performance task format.

As PARCC explains,

PARCC is… contracting with [other organizations] to develop models of innovative, online-delivered items and rich performance tasks proposed for use in the PARCC assessments. These prototypes will include both assessment and classroom-based tasks.

Smarter Balanced, meanwhile, states that by 2014-15,

Smarter Balanced assessments will go beyond multiple-choice questions to include performance tasks that allow students to demonstrate critical-thinking and problem-solving skills.

Performance tasks challenge students to apply their knowledge and skills to respond to complex real-world problems. They can best be described as collections of questions and activities that are coherently connected to a single theme or scenario.

These activities are meant to measure capacities such as depth of understanding, writing and research skills, and complex analysis, which cannot be adequately assessed with traditional assessment questions.

Samples of the performance tasks being developed for grades K-8 are available here.   (more…)

As I wrote about last year, I am greatly enthusiastic about the role open computer testing can play in assessing our students’ development of 21st century skills.  I think it is a really exciting way to take assessment into 21st century information environments, and to situate students in much more real-world situations as we prepare them for the contemporary world of work.

The above slideshow displays what I think is a very well designed “Open computer/open internet” exam, and Dr. Scott Morris has added on every slide his annotations of how he expects students to use the internet in answering these questions, and how these questions, with those resources, demand more of his students, particularly the higher order thinking skills of critical thinking and analytic reasoning, and a deeper understanding of the course material. (If the slides are too small for the print to be legible, click on the full screen link at the bottom right).

As I frequently discuss with Dr. Morris, with whom I have prepared this post and with whom I am co-presenting on this topic at the NAIS Annual Conference in March, our rationale is that our students are preparing to work in professional environments where they must tackle and resolve complex problems, and we know that in nearly every such environment we can envision, they will have laptops or other mobile, web-connected, digital tools to address those problems.  Let’s assess their understanding in situations parallel to those for which we are preparing them.

But it is not just a matter of situating them in real-world environments: with open computer testing, the format of exams changes in really meaningful ways.  (And yes, as much as I am a huge fan of PBL, exhibitions, and portfolios, call me retrograde, but I still think there is a place and a role for exams in the range of assessment tools we use.) (more…)

I spent most of the day yesterday working with our fine Middle school head, Heather Faircloth, preparing the presentation above for last evening’s program about our use of MAP, the Measures of Academic Progress.  This is the tool we use for standardized testing, and which we are administering three times a year to our students in grades six, seven, and eight.

Often I write about our work to enrich our students with leadership and innovation education, our focus on higher order thinking skills, our advisory programs, project-based learning, and academic extracurriculars.  But we never forget that all of this stands upon the bedrock of a very serious and strong foundation of core academic skills, the skills assessed in the MAP testing.

Until very recently here, standardized testing was administered only once a year, as a one-size-fits-all, uniform, paper-and-pencil bubble test, the results of which came only months later and were promptly filed, with little attention given to them and, even then, few resources available to make good use of the results.  As Mrs. Faircloth explained last night, and as you can see in the above, our use of MAP is different in nearly every way from the previous practice.

MAP is administered three times a year with results received nearly immediately; over the course of nine test segments, it creates a motion picture of a student’s learning in progress, rather than a static snapshot once a year.  It is a computer-adaptive assessment, meaning it quickly conforms itself to the student’s individual learning level, and then gives a far closer and clearer view of that student’s individual proficiencies and areas of proximate growth, things which are unique to every student. (more…)
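For readers curious how a computer-adaptive test “conforms itself” to a student, here is a minimal sketch of the general idea– my own illustration, not NWEA’s actual MAP algorithm or scale: each correct answer raises the difficulty of the next item, each miss lowers it, and the adjustment step shrinks so the estimate homes in on the student’s level.

```python
# Illustrative sketch of computer-adaptive item selection (hypothetical;
# real instruments like MAP use item response theory, not this simple rule).

def adaptive_test(answer_item, start_difficulty=200.0, num_items=10):
    """Run a short adaptive test.

    answer_item(difficulty) -> True/False simulates a student's response
    to an item at the given difficulty on a hypothetical scale.
    """
    difficulty = start_difficulty
    step = 32.0
    for _ in range(num_items):
        correct = answer_item(difficulty)
        # Move the next item up after a correct answer, down after a miss.
        difficulty += step if correct else -step
        # Narrow the search each round so the estimate converges.
        step = max(step / 2, 1.0)
    return difficulty  # final estimate of the student's level

# Example: a student who answers correctly whenever the item is at or
# below 230 on this hypothetical scale; the estimate converges near 230.
estimate = adaptive_test(lambda d: d <= 230)
```

Because each item’s difficulty depends on the previous answer, ten adaptive items can locate a student’s level far more precisely than ten fixed-difficulty items– which is the “far closer and clearer view” described above.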

In recent posts I have made the point that we can better promote the learning we want for our students via “backward design,” assessing what we aim for and then working backwards to promote learning that will ensure success on our assessments, and the point that “digital natives” may be digitally comfortable, but that does not mean they are digitally sophisticated (or digitally fluent).

So it is in keeping with both previous posts that I write to share my interest in, and at least preliminary enthusiasm for, a recently retooled and now more broadly available assessment from ETS, the “iSkills.” The test aims to provide schools a fascinating way to assess (and, as a result, stimulate and motivate) the teaching and learning of more sophisticated “digital fluency” in our schools.

One important quick note: the online site for iSkills gives the distinct appearance of being available only to higher ed, but I have been assured that secondary schools of all kinds are welcome to participate, and that ETS thinks it suitable for students in tenth grade and higher.

I had the good fortune to participate recently in an ETS iSkills webinar, and I am fascinated by the tool. (The 70 slides displayed above were the program of the webinar, and are extremely informative for interested parties.)

Much of the session was dedicated to defining the importance and nature of digital fluency; it occupies a spot among critical thinking, 21st century skills, information literacy, and ICT (information and communication technology) proficiency.

“Digital Fluency” as a term aims to capture critical thinking and communication in an online environment.  Surely most of us recognize that the world will increasingly require of our students (and ourselves) powerful online and digital savvy in critical thinking, creativity, communication, and collaboration, and we should be looking for terms and concepts to capture this emerging and essential skill.  (more…)

In recent weeks I’ve observed a growing conversation about how best to advance both character education and the learning of 21st century skills.

What is increasingly widely recognized is the idea of backward design: we can promote learning of our intended outcomes if we put greater emphasis on assessing those intentions.  Students know what to strive for, and teachers over time find themselves giving greater attention to teaching, and to having students demonstrate, the things they know they’ll have to assess and report on.

The recent New York Times Magazine cover story, What if the Secret to Success Is Failure?, tells the story of two schools in New York City working to develop a clear set of intended character outcomes for their students.  At one school, KIPP, they embedded these outcomes in a formal report card element, highly quantified and, if I understand correctly, a part of the permanent record.

[KIPP] started working to turn it into a specific, concise assessment that he could hand out to students and parents at KIPP’s New York City schools twice a year: the first-ever character report card.

At the other school, Riverdale Country School, the school was avoiding formalizing the character goals into a formal report card, in part because of concerns that students would “game the system” if it became high stakes, and so instead they were working to find ways to bring these character goals into the culture of the school.

“I have a philosophical issue with quantifying character,” [Riverdale Head of School Dominic Randolph] explained to me one afternoon. “With my school’s specific population, at least, as soon as you set up something like a report card, you’re going to have a bunch of people doing test prep for it. I don’t want to come up with a metric around character that could then be gamed. I would hate it if that’s where we ended up.”

We here at St. Gregory believe we are seeking and finding a middle ground between Riverdale and KIPP in our approach to a character and 21st century skills report card supplement.  This has been a central thrust of our efforts over the past three years to elevate the importance and development of these skills in our program, and to better fulfill our mission to promote and cultivate in our students Character, Scholarship, Leadership, and Innovation.  Our approach has been to develop a KIPP-like report card for these qualities, but to use it as a formative guide for students to self-assess, collect feedback from their teachers, and set goals with their advisers, rather than as a high-stakes summative assessment which could then be “gamed.” (more…)

Links of interest:

New York Times article about Character report cards: http://www.nytimes.com/2011/09/18/magazine/what-if-the-secret-to-success-is-failure.html

NAIS monograph, Value Add Measurements of Learning: http://www.nais.org/sustainable/article.cfm?ItemNumber=151607

Contact for the PISA Based testing for Schools pilot in Canada: For any questions, please email pisabasedtestforschools@oecd.org or Charles Cirtwill at charlescirtwill@aims.ca