June 2013

The new NMC report is out, and it is, as always, a fascinating, useful, and thought-provoking document.  Above is the promotional video capturing most of the highlights; below, at bottom, is the full report.

My post on the 2012 report is here.

A few comments and observations:

1. The set of “six emerging technologies most likely to influence their sectors” evokes a mixed reaction.   Cloud computing, mobile technologies, Open Content, and 3D printing are developments of enormous value and significance to student learning: they make for greater connectivity to information and networks, greater affordability and flexibility, tremendous creative opportunities, and improved personalization of learning and customization of course design.

Learning Analytics is something I’m all for, when done well and for good and with nuance and care, but we all know there is much to be concerned about here: the potential for loss of privacy and the mechanized automation of learning.  These concerns are, to my eyes, underappreciated and underemphasized in the discussion inside the report.

Remote and virtual laboratories wouldn’t have made my list.

Remote laboratories enable users to conduct experiments and participate in activities via the Internet using remotely controlled but real laboratory equipment.

Virtual laboratories are interactive online environments for performing experiments with simulated equipment. Both, however, offer the promise of authentic laboratory experiences regardless of the locale of the user.

I don’t mean to dismiss their value or significance altogether, but they are hard to get too excited about.   In an era of maker spaces and in a time when we need again, or as always, tremendous bursts of innovation in our student laboratories, it is hard not to fear that virtual and remote labs are as much a step back or step away as a step forward.  Sure, bring them, but don’t prioritize funding or time for them over real labs.

2.  The timeline seems a bit off to my eyes, also.  I’d have put 3D printing in the mid-time range, 2-3 years, and I think Learning Analytics is going to be slower in development and implementation by most regular users; it belongs in the further-out time range, 4-5 years.

3.  For a reason not explained (or did I overlook it?), this year’s report halves the number of emerging trends when compared to the 2012 report: only six rather than twelve.

Some of the exemplary elements in last year’s report are missing: collaborative environments, personalized learning environments, semantic applications (an abstract term, but best exemplified in tools such as Wolfram Alpha and one of my very favorite iPhone/iPad apps, TripIt), and tools for assessing 21st century skills.  I’m sad to see them missing, and I would have placed any one of them over virtual/remote labs.

4. The two pages summarizing key trends, as differentiated from emerging technologies, are also useful (and entirely absent from the video).  I’m especially taken with these three, which deserve careful consideration and effective implementation in every school journeying toward becoming a school of the future: (more…)


I continue to be impressed and informed by the fine work happening at SETDA, the State Educational Technology Directors Association, and the above is a useful report on a topic which nearly all of us in K-12 are working hard to get our arms around: making the leap from print to digital texts.

It is a report written more for state-level educational administrators, and to some extent district officials, than for individual school leaders, but the latter can still find plenty of useful information.

Highlights and observations: 

1.  SETDA lays out the key benefits for going digital, and provides elaboration and examples for each:

SETDA sees four primary interrelated advantages to increasing the use of digital content in today’s schools. Over time and with good implementation, a shift to digital content will:

  • Increase student learning and engagement
  • Accommodate the special learning needs of students
  • Facilitate the search and discovery of unbundled resources
  • Support educators in personalizing learning

These all make sense to me, and I especially appreciate the open-ended nature of the third,

“unbundled resources: The ability of educators to locate just the right resource, lesson, or chapter as they need it is an important consideration with digital content. There may be hundreds of potential resources to use for any given lesson when the teacher has the entire World Wide Web to choose from.”

There are many benefits that I, and I am sure readers here, would quickly add to this list, including supporting the development of connected learners, improving the opportunity for teachers to serve as curators and authors, and the potential for significant cost savings.

What’s missing in this discussion is any exploration of what might be lost in digital reading.  They make a nice point that students are actually relatively more able to “mark up” a digital text than a school-owned, “permanent” textbook, which does facilitate reading for understanding.

But I think there might be value in acknowledging that we are all still groping our way toward understanding the advantages and disadvantages of reading on digital devices, and, if it is SETDA’s confident opinion that digital texts are equal to print as educational resources, in stating and defending that position.


On the one hand, as TIME magazine reported in a 2012 article entitled “Do E-Books Make It Harder to Remember What You Just Read?”: “different media have different strengths — and it may be that physical books are best when you want to study complex ideas and concepts that you wish to integrate deeply into your memory.”  Frankly, as much of an enthusiast as I am for digital tools, I fear this is my own experience.

However, the Atlantic reported this month (June 2013), in a piece entitled “Study: Reading in Print, Versus on a Computer or Kindle, Doesn’t Change Comprehension,” that researchers at one New York university found “Readers scored the same on comprehension tests regardless of the medium.”  This research is hardly conclusive; to my mind, the jury is still out.

2.  The  state case studies provided by SETDA are illuminating, especially, I found, the Utah example. (more…)

Courtesy of my good friend and Tucson neighbor, Bob Pearlman (whose website is a must-visit resource for everyone in 21st century learning), it is great to share with you the above video featuring President Obama’s visit to Manor New Tech High School in Texas.

As regular readers may recognize, I’ve been a fan and advocate for the quality of the New Tech program for five years: it is great to see it get this recognition and admiration from the President.

This video does a great job interweaving Obama’s speech to the students and his visit to their classrooms.  This is authentic learning: hands-on, engaged, challenging, rigorous, collaborative, meaningful.


“But what about collaboration?   There certainly aren’t any assessments available to evaluate our students’ proficiency in that critical 21st century skill.”

A common question and remark in my presentations about assessment, it hits on an important issue.  If we prize connected learning and collaboration not only as key 21st century workforce skills (they are almost always among the various listings of skill sets prized in hiring) but also as key to lifelong learning and, indeed, to the larger work of creativity and innovation, then shouldn’t we take steps to ensure our students are learning this key skill?

(I’m tempted here to make the case for the critical power of collaboration, and its fast-rising significance in the discovery and development of new ideas and new tools, but I think it is well enough understood that it can be stipulated here.)

An ongoing thesis of this blog is that we should all, as K-12 educators, be clearer with ourselves and our students about our specific and key intended learning outcomes; that if we are going to test or assess their learning at all, we should align that testing and assessing with those ILOs; and that we should use the information we collect to improve their learning.

Accordingly, we need more and better tools and resources for better assessing collaboration— something many of the best assessment minds have been working on in really exciting ways.

Clearly, authentic assessment of collaboration is key to this work.  Many teachers are using PBL programs to provide more opportunities for student collaboration, and are building comprehensive rubrics to assess such collaboration.  The Buck Institute for Education makes one such rubric freely available here.

I’ve written and presented with some regularity about open internet testing as a way to improve the teaching and learning of information literacy, web research, critical thinking, and applied problem-solving, and Will Richardson has built on my open internet posts to argue for the importance and value of “open network” testing, in which it is not just websites that are accessible during testing but also other persons, collected into a learning network, who can be interviewed or surveyed to gather information for use in problem-solving.

But as regular readers know, I’m always interested in how we can mix and match both internal measurements, such as the two examples above, and external measurements of what we think matters.  In Connecticut, I’ve heard from a source there, they are developing a new section of their secondary science standardized test for collaborative problem-solving.

As the video at top displays, there is exciting work in this direction coming from some super-smart Australians at the University of Melbourne and its ATC21S: Assessment and Teaching of 21st Century Skills.  To the best of my understanding, the kind of collaborative problem-solving assessment tool featured in the video is not yet available, but still in development.

PISA, from the OECD, is perhaps the broadest and most significant player in the new field of assessing collaboration and, more specifically, collaborative problem-solving.  As is being increasingly reported, its 2015 test administration will include this key element for the first time.

As an example, creativity expert Keith Sawyer wrote about this development (and the work of ATC21S) on his blog recently. 

This new assessment will be included in the international PISA assessments to be implemented in 2015. Look for news stories describing which countries are “better at collaboration” based on these skills! The need for standardized tests is unfortunate in many ways, but I believe that having politicians and parents focused on collaboration will lead our teachers, students, and schools to emphasize collaboration more.

The PISA definition (version A):

Collaborative problem solving competency is the capacity of an individual to effectively engage in a process whereby two or more agents attempt to solve a problem by sharing the understanding and effort required to come to a solution.

These slides from PISA lay out a bit more information.

As the slides explain, PISA has been on a journey of many steps in improving its assessment of problem solving– which is, after all, quite well situated as the ultimate expression of what we want students to be able to do.

The article embedded below quotes Karl Popper to the effect that “All life is problem solving.”