About a month ago, I posted here a lengthy piece sharing what I learned about the ETS iSkills test/assessment. In that post, which includes nearly 80 slides explaining the test in great detail and which was based on a webinar I “attended” and on my follow-up interviews with an ETS program director, I reported that I found the test intriguing conceptually, but I was unsure about the quality of its practical implementation. Surely many of us believe that “digital fluency,” or proficiency in using digital tools to effectively access, analyze, organize, and communicate information, is enormously important for students and professionals, so the test’s goals are worthy.
Recently, our new Librarian and Director of Information Literacy, Laura Lee Calverley, managed a pilot of the iSkills with seven of our students who volunteered to participate. This is her report on our pilot of the test, which, on balance, she found disappointing. We don’t expect to move ahead with using it more broadly anytime soon.
———-
The iSkills assessment measures digital fluency through a range of activities designed to simulate real-world, information-literacy-dependent scenarios. We were very excited to run our own pilot of the iSkills test with a small group of students here at St Gregory, where technology is so much a part of our school.
Our experience with the test was somewhat mixed. Ordering and purchasing were simple, but we had some frustrating issues with setup. The web-based exam is currently available only on PCs running Internet Explorer, a browser that we do not use or support on school computers. Since our computer labs and library computers are all Macs, setting up PCs for testing added a lot of extra hassle. I have a hard time fathoming how a “digital fluency” test can expect to be taken seriously and adopted widely when it works only in such a limiting browser. The requirement is not just annoying; the fact that every step of the iSkills exam seems to involve Internet Explorer also makes it pretty hard to trust ETS as any kind of authority on digital skills.
Proctoring the exam was simple and straightforward. Students took about an hour to complete it. Some reported that the iSkills exam was fun and “like playing a game,” while others complained about the tasks. It was interesting to hear reactions to the test, especially how distracted some of our students were by the real-world-style simulations within the exam. In one scenario, for example, the test-taker is asked to examine a few emails to gather information, a task that stood out to several students who were bothered by the “creepiness” of reading someone’s emails.
After hearing so many interesting comments, I took the test myself. Most of the real-world activities were pretty transparent and, to me, it felt like a very straightforward test. Some of the tasks, such as creating a presentation slide using information from a website, seemed pretty good at testing my ability to filter and present information; however, far too many of the tasks seemed to depend more on my ability to follow detailed sets of instructions than on my ability to think critically.
When I received my iSkills score, I was once again disappointed by the reliance on Internet Explorer. Score reports displayed only in Explorer, leaving many students unable to view their own scores. From my experience taking the test, I suspect that scores may reflect a student’s attention to detail more than any kind of skill or ease with technology. We tested only a small group of students, though, so perhaps a larger sample would reveal something more universal.
The slides below (click to enlarge them to full-screen size if they are too small to read) provide an annotated view of the student score report, with notes on the utility of these results for informing instruction in this critical area of digital fluency.
[Slides: annotated student score report]
Overall, I’m not sure how the iSkills test will fit into St Gregory, especially when the constant reliance on Internet Explorer makes it impossible for me to imagine us using this test widely with our Mac computer labs. Part of me wonders whether we need a test to simulate real-world digital fluency when many measurable activities are taking place every day here at St Gregory. Our students create presentations and papers using a variety of information sources and technologies, they learn to pay attention to detail, and they use email efficiently, prioritizing and sorting as needed. Students master these skills in order to succeed at St Gregory, and perhaps those results are more meaningful than the scores on a standardized test could ever be.