In recent posts I have made two points: first, that we can better promote the learning we want for our students via "backward design," deciding first how we will assess what we aim for and then working backward to promote the learning that will ensure success on those assessments; and second, that "digital natives" may be digitally comfortable, but that does not mean they are digitally sophisticated (or digitally fluent).

So it is in keeping with both previous posts that I write to share my interest in, and at least preliminary enthusiasm for, a recently retooled and now more broadly available assessment from ETS, the "iSkills." The test aims to provide schools a fascinating way to assess (and, as a result, stimulate and motivate) the teaching and learning of more sophisticated "digital fluency" in our schools.

One important quick note: the online site for iSkills gives the distinct appearance of being available only to higher ed, but I have been assured that secondary schools of all kinds are welcome to participate, and that ETS considers the test suitable for students in tenth grade and above.

I had the good fortune to participate recently in an ETS iSkills webinar, and I am fascinated by the tool. (The 70 slides displayed above were the program of the webinar, and are extremely informative for interested parties.)

Much of the session was dedicated to defining the importance and nature of digital fluency; it occupies a spot among critical thinking, 21st century skills, information literacy, and ICT (Information and Communication Technology) proficiency.

"Digital Fluency" as a term aims to capture critical thinking and communication in an online environment. Surely most of us recognize that the world will increasingly require of our students (and of ourselves) powerful online and digital savvy in critical thinking, creativity, communication, and collaboration, and we should welcome terms and concepts that capture this emerging and essential skill.

Many students may have a reasonable grasp on technical literacy, but any of us who work with these fine young people worry about their ability to effectively access and acquire the appropriate information and then to evaluate multiple information sources. And then, can they (can we as educators?) effectively assemble the appropriate information into digital forms and vehicles that communicate our message?

Having defined Digital Fluency, the ETS webinar went on to explain carefully the test they have designed to measure and assess it. Their goal is, of course, to "Provide test takers, instructors, and institutions with useful data and feedback based on solid measurement principles." The test they have designed has the following elements:

  • Interactive tasks, not multiple choice
  • Organized around real-world scenarios
  • Taken on computers with a downloaded program
  • Measures higher-order problem-solving and critical thinking skills
  • Seven digital fluency content areas
  • Fourteen short, four-minute tasks
  • Administered in two 30-minute sections

ETS explains that the test is "not a multiple choice test, but rather interactive tasks." When I inquired further about how they explain the tasks, my new good friend, iSkills developer and representative Bill Wynne, offered this helpful clarification:

iSkills is a constructed-response assessment: People respond to each task by using simulated software rather than by selecting from a short list of options.  The entire test is machine-scored, and in order to do that reliably, there needs to be a finite set of actions for the test taker to do, all of which need to be recognizable by our scoring algorithms.  As an example, one type of task asks test takers to search for information.

To allow for machine scoring, we created a self-contained search engine instead of allowing students to access Google or the like directly, because ETS cannot evaluate on the fly the relevance of every website on the internet.  So we needed to create a proprietary search engine with a finite number of websites for which we could predetermine the relevance to solving the iSkills problem.

In addition, the search terms that test takers enter are compared with lists of possible expected search terms, and the quality of the search terms (allowing for common spelling errors) is factored into the score.

ETS also emphasized that this is not a test of narrowly defined technical proficiencies: "in all of the tasks, the technology is minimal– these are very generic technical elements– the focus is on the cognitive, on problem solving and critical thinking."

I have written often here about the CWRA, usually with great enthusiasm, and so the question quickly arises: how does the iSkills compare to the CWRA? One clear distinction: CWRA does evaluate writing effectiveness in a way iSkills does not seek to do. Both tests lay claim to evaluating critical thinking, but they define the term a bit differently and capture different elements of it. ETS wouldn't, understandably, offer any direct comparison of the two, but Bill Wynne did articulate the distinct qualities of their assessment this way:

The short answer to the question of comparing iSkills and other performance-based measures is that we shouldn’t without the appropriate research.  Other performance assessments do not purport to measure digital fluency or information literacy, and iSkills does not purport to measure strictly critical thinking and does not measure writing at all. Performance assessments based on essay responses can measure written communication because the test taker writes.

iSkills does not claim to measure written communication because at no point does the student write, but at the same time iSkills does include tasks that get at communication skills in the form of considering an audience's point-of-view and recognizing the key points in a persuasive communication.  Although many would argue that essay-based tests and an assessment like iSkills both measure critical thinking, just in different ways, research is always required to determine if two tests correlate in any meaningful way.

And what about the results? Do users get "news we can use"? My preliminary and tentative answer is yes: it would appear that the data provided give schools very detailed and valuable results that can be drawn upon to guide continuous improvement.

At St. Gregory, I am delighted to be working with our new Librarian and Director of Information Literacy, Laura Lee Calverley, to pilot this new iSkills test: we are recruiting about 8-10 student volunteers to take the test in the next few weeks.   We will then interview our students about the experience (I hope to do so on video and share the video here), and examine closely the results we get to determine their value and utility for informing and improving our work developing our students’ information literacy, analytic and communication skills, and sophisticated digital fluency.

More to come on the blog in future weeks about the new ETS iSkills. Meanwhile, if you have experience with iSkills, please share it with a comment or, if you would like, a guest blog post.