As a follow-up to yesterday's post on Angela Duckworth, whose research and advocacy argue that cultivating self-control and grit/perseverance in our students matters as much as, or more than, traditionally defined intelligence, it seemed useful to share this fine short lecture from Paul Kim of Stanford.

I’ve been participating, with some moderate profit, in Professor Kim’s Stanford MOOC, Designing Learning Environments. (Any other readers out there participating? I still need to build my team.) Kim’s most recent lecture was on a topic that I’ve increasingly recognized must be woven into every aspect of technology integration: the importance of, and more importantly the techniques of, attention, concentration, and self-control.

Even better, Kim’s lecture synthesizes the argument for self-control with the technology of student self-monitoring and self-tracking, an element of learning analytics I’m especially interested in.  (All quotes are from the video above.)

Basically, I would like to ask you to consider designing a learning environment that can trigger and help your students learn to better manage their own learning. I can never overemphasize the importance of this need for a learning environment design.

Kim offers the useful observation that while online learning, relative to traditional school-based learning, offers enormous advantages of convenience, it poses far greater challenges of self-control for students. Simply put, if students have to attend school every day, or are very strongly expected to, and go to scheduled classes for an hour or two at a time, they have far fewer decisions to make about their learning program, and they have, at least much of the time, a peer/social group there with them and supportive, encouraging adults personally steering them forward. But in an asynchronous online learning environment, none of this is the case, and so self-discipline to keep one’s work progressing is far more important.

As George Siemens has written in EDUCAUSE Review, learning analytics, though still a young term, already suffers from “term sprawl.” Nevertheless, he offers this definition:

 “learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.”11

I’ve written previously about learning analytics, particularly in my review of the NMC Horizon Report, which predicts that learning analytics will become increasingly significant in K-12 education in the coming two to three years:

The goal of learning analytics is to enable teachers and schools to tailor educational opportunities to each student’s level of need and ability. Learning analytics promises to harness the power of advances in data mining, interpretation, and modeling to improve understanding of teaching and learning, and to tailor education to individual students more effectively.

What I wrote then continues to capture my ambivalence about learning analytics: “I can’t seem to decide if it perpetuates and worsens our viewing students as assembly line widgets to process and manipulate, or if it is a great breakthrough in differentiation and individualizing instruction.”

One way to recast my concern is to ask: is learning analytics essentially about large-scale, institution-level (or higher) views of learning, or can it be situated closer to the learning experience, in the hands of teachers and students?

Teacher-Centered Analytics

I am eager to support teachers’ use of action research protocols and procedures; last spring at St. Gregory this was my biggest single push. I think school leaders should work hard to embed this work within professional learning communities (such as Critical Friends Groups), and action research teams should be given quality time for it. Now, I haven’t seen these two concepts, learning analytics and teacher action research, conjoined, but I think they should be able to be linked, right? To quote the second definition above, it would seem they can: in action research, teachers could conduct “data mining, interpretation, and modeling to improve understanding of teaching and learning.”

But learning analytics has most often been associated with the looming construct of “Big Data,” and even in the passage above we see the term “data mining,” which suggests combing through massive pools of data unlike anything teachers are likely to generate in their own action research. So the difference is one of scale: big versus small data. Can learning analytics truly occur inside small data?
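To make the “small data” case concrete, here is a minimal sketch of the kind of analysis a teacher action research team could run on a single class, using nothing beyond a spreadsheet export and the Python standard library. The quiz scores and the before/after framing are invented for illustration; they are not from any study cited here.

```python
# A small-data sketch: analysis a teacher's action research team might run
# on one class. All scores are hypothetical, invented for illustration.
from statistics import mean, stdev

# Quiz scores for the same seven students before and after a
# teaching intervention (hypothetical data).
before = [62, 70, 55, 81, 66, 74, 59]
after = [71, 78, 60, 85, 69, 80, 68]

# Per-student gain: positive means the student improved.
gains = [a - b for b, a in zip(before, after)]

print(f"mean gain: {mean(gains):.1f} points (sd {stdev(gains):.1f})")
print(f"students improving: {sum(g > 0 for g in gains)} of {len(gains)}")
```

Nothing here is “Big Data,” yet it is still measurement, analysis, and reporting of data about learners, which is exactly what the Siemens definition asks for.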

Student-Centered Analytics

Empowering students to command their own learning is a consistent theme here at 21k12, and I’m quite excited about the prospect of putting learning analytics directly in the hands of students. In the EDUCAUSE Review article, Siemens lists nine possible values learning analytics can provide to higher education, and number nine is the following:

They can provide learners with insight into their own learning habits and can give recommendations for improvement. Learning-facing analytics, such as the University of Maryland, Baltimore County (UMBC) Check My Activity tool, allows learners to “compare their own activity . . . against an anonymous summary of their course peers.”15
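The idea behind a tool like Check My Activity can be sketched in a few lines: show a learner their own activity next to an anonymized summary of their peers. This is a hypothetical illustration of the concept only, not UMBC’s actual implementation; the student names, activity counts, and the `check_my_activity` function are all invented.

```python
# Hypothetical sketch of a Check My Activity-style comparison: one learner's
# own activity shown against anonymous peer statistics. Invented data.
from statistics import mean, median

# Invented weekly LMS activity counts (e.g., logins or page views).
activity = {
    "student_a": 42,
    "student_b": 17,
    "student_c": 35,
    "student_d": 8,
}

def check_my_activity(student_id, activity):
    """Return the student's own count plus anonymized peer statistics.

    Peers are summarized in aggregate only, so no individual
    classmate's activity is exposed.
    """
    peers = [count for sid, count in activity.items() if sid != student_id]
    return {
        "mine": activity[student_id],
        "peer_mean": mean(peers),
        "peer_median": median(peers),
    }

report = check_my_activity("student_a", activity)
print(report)
```

The design point worth noticing is that the learner sees only aggregates of everyone else, which is what keeps the comparison “anonymous” in Siemens’s sense.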