1. As much as the audience seemed to appreciate my presentation, some attendees (and some who were not there) felt that the topic wasn’t ideally suited for an audience of ed. technologists and librarians, because they are not often involved enough in decision-making about the assessment and measurement of learning. (And I apologize to those for whom my topic was a little off-base.)
However, there was definitely interest in some areas of the talk: the topic of computer adaptive assessment as exemplified by MAP, for one. Some asked why, when the technology for computer adaptive assessment has been available for years, it is only now coming on-line (or, according to Sec. Duncan, won’t be available until 2014). I didn’t know the answer, but others in the audience speculated that the classroom hardware needed to exploit computer adaptive assessment software simply hasn’t been available until now. There was also an illuminating conversation among attendees about new Moodle tools that let teachers design their own computer adaptive tests, which was fascinating to me.
Participants also expressed special interest in the demonstrations-of-learning approach, and some discussed how it might be united with the growing movement for digital portfolios, something ed-tech directors are very much a part of implementing. Ed-tech directors, we realized in discussion, can assert leadership by helping other educators in their schools think through what should go into digital portfolios, and they can frame this conversation by suggesting that students’ mastery accomplishments be what is highlighted in those portfolios, using Bassett’s list for reference.
The audience’s conversations about demonstrations of learning, and about HSSSE, were particularly animated; many in the audience took note of and appreciated that the high school student survey revealed that “projects involving technology” were far and away rated the most engaging by students.
We ran out of time, even with a two-hour block, to do as much work with performance task assessment as I had intended, but I think there is an opportunity for technology educators, focused on teaching inquiry, on-line research, and the critical evaluation of on-line information, to use performance task assessment models for this learning.
2. Like many presenters, I am not entirely happy with the way I use slides, and I continue to experiment with different models. I had intended to use the Prezi format, and worked briefly on this with a student of mine, but still found the learning curve too steep to master the Prezi tools. I chose instead to use slides in a way that didn’t call too much attention to themselves, but framed and outlined the talk in an uncluttered format. I also like using slides for quotes, because I think that when quotes are read aloud, people just don’t appreciate them as well; when I put them on a slide, people can read them for themselves. I did get the feedback, which I appreciate, that folks found themselves starved for visual stimulation and imagery to match the ideas, so for future sessions I may work harder to include many more images of my school and students, as I try to do here on the blog.
3. I had worried that some of the many very progressive educators in the room would rebel fiercely against, or be closed-minded to, my argument that there is a place for data gathering and measurement of student learning. Instead, I found my terrific audience to be open, respectful, engaged, and thoughtful, and I am grateful for some of the positive words on Twitter. As an example, it was great to read Fred Bartels, who is always an opinionated colleague, tweet: “Great talk by @JonathanEMartin on alternative assessment tools at #neit2010. Opened many minds to new possibilities.” Thanks, Fred!
4. One additional note about Twitter: I have greatly enjoyed the backchanneling I have participated in at many recent conferences. In my two NEIT sessions, I intentionally took time-outs, during which I asked audience members to discuss various topics, to check the Twitter feed conversation, and it was really helpful to me. In at least a couple of cases, I was able to answer questions, or speak to topics, that were being discussed on the Twitter backchannel.
Resources & Links:
- Pat Bassett’s NAIS Monograph: Student Outcomes that Measure School’s Value Added
- St. Gregory’s HSSSE results on SlideShare
- High School Survey of Student Engagement (HSSSE)
- Secretary Duncan’s Assessment 2.0 speech
- My post on Duncan’s Assessment 2.0
- NWEA’s Measures of Academic Progress (MAP)
- College and Work Readiness Assessment (CWRA)
- St. Gregory students discuss the CWRA
- St. Gregory’s CWRA institutional report
- My post about St. Gregory’s CWRA results
- Taking Teaching to the (Performance) Task: post about Marc Chun’s article
- Examining a Performance Task
- Performance Task Assessment: CLA in the Classroom
- A great book on problem-based learning: McCain’s Teaching for Tomorrow