Reflections on Week 2 Class and Screencasting
A few ideas and issues that are still on my mind after Monday's class:
- As I was scrolling through my Google Reader Monday night, I saw this posting on ALA Job List for an Assistant Librarian at LSU, which lists "experience in creating interactive tutorials" as a preferred candidate qualification. While I will not be job hunting until this time next year, I find it reassuring that the skills I am learning now can give me an edge in the academic library job market.
- I am stuck on the idea that there really is no such thing as a "perfect" screencast, only the decisions we must make to design the best tutorial we can. During class discussion, I noticed some disagreement among my classmates about the features they like to see in a screencast (e.g., to close caption, or not to close caption?). Additionally, with regard to the subject material for the screencast, we have to strike a balance in the type, amount, and depth of information provided.
- Trying to find this balance has my head spinning, and I think the ADDIE model could be helpful for addressing the problem. For instance, in assessing needs, one might decide that hearing-impaired users are likely to use the tutorial, and therefore that closed captioning would be a worthwhile design feature. Then, through evaluation, such features can be reconsidered and retooled to continually meet evolving user needs. This speaks to the importance of planning as an iterative process.
- Or, some of these issues may be addressed using the model in this article from Library Journal, "Screencasting for an Audience of One" (as sent by our fearless instructor).
Information Literacy Literature
This week, we were asked to choose three articles on "information literacy" in our field of interest. Broadly, I am interested in academic libraries, and I lean more toward undergraduate institutions and community colleges that focus on learning and instruction (rather than research), so I tried to target my readings to that purpose.
- "Information Literacy Learning Outcomes and Student Success" - S. Samson (2010). The Journal of Academic Librarianship, 36(3); 202-210
The purpose of this article was to compare student learning outcomes with the ACRL Information Literacy Competency Standards for Higher Education. The study examined the final projects of two groups of college students - first-year and "capstone" students - and analyzed the projects against the ACRL's five standards and performance indicators (see the list on the page linked above for specific guidelines). Interesting results included:
- First-year students cited more newspapers and web-based resources, while capstone students cited more primary source materials
- First-year students were more likely to use "general" databases, while capstone students were more likely to use subject-specific databases
- Capstone students were more likely to identify bias in information
- Of the capstone students who presented copyrighted material, 43.8% did not include permission-granted notices (note: first-year students did not report using copyrighted material)
The several significant differences between these populations are a good demonstration of the importance of assessment and planning. They also demonstrate the importance of understanding the context and background of your learners; even within the "undergraduate" population, it is important to consider how the needs of first-year students differ from those of more advanced students. Additionally, the article proposed this type of study as a method for identifying areas in which students need more IL instruction. For instance, since students struggled with the legal aspects of information use, the library should incorporate this more readily into its instruction. I think this is an interesting framework for evaluation, but I also think it is dangerous if it is the only evaluative criterion. Maybe students understood that they were supposed to cite a copyrighted work, but were confused about how to do the citation, or simply did not want to take the time to do it. There still needs to be more direct interaction with students and faculty to address issues of causal ambiguity.
- "Information-seeking Habits and Information Literacy of Community and Junior College Students A Review of Literature" - H. Groce (2008). Community & Junior College Libraries, 14(3); 191-199
This article is interesting because it covers several aspects of information literacy from the perspective of community and junior colleges, drawing on a review of literature published between 1996 and 2006. The major issues identified are: the internet and the digital age; required information-seeking classes; technology overload; legitimacy, reliability, and authority of resources; use of scholarly/peer-reviewed articles; and subject content of assigned research projects.
With respect to instruction, the article suggests that IL analysis and assessment projects "need to stress that information literacy needs to be based on the students' needs." This is great because it addresses the importance of planning and analyzing user needs before implementing a program, and it implies that understanding the student (not just their project or their faculty member) is crucial in designing instruction. Both of these ideas have been important items of discussion in the first few weeks of class.
- "New Tools for Online Information Literacy Instruction" - S. Williams (2010). The Reference Librarian, 51(2); 148-162
The final article I chose for the week discussed various web-based tools for online information literacy instruction. Since we discussed tutorials and screencasts this week, I thought it would be interesting to read how different online methods have been used for IL instruction. The basic premise of the article is that since students are increasingly turning to web-based resources for information, it may be important for IL instruction to begin migrating to the web as well. The methods covered in this article include: course management software; academic organizations' websites; blogs; podcasts; screencasts; web-based board games; and Second Life. Of course there are pros and cons to each method, but the overall assessment was that online tools can be a valuable addition to an academic library's IL instruction. I was, however, left wondering - does anyone ever actually use Second Life?
The article also spends a bit of time comparing face-to-face and online IL instruction, which I think is an important topic. While it would be nice to have IL instruction online (and therefore meet students at their point of need), if it is not shown to be an effective method of instruction, then what's the point? One study found that a "hybrid" of face-to-face and online IL instruction was more effective than either method alone. This point has been made several times in our class and blog discussions already, and it was interesting to read some empirical data to back up our assumptions.
Conclusions
So what does this all mean? While there was much more information in each article than I could ever cover here (even given this lengthy post), the three points I took away were:
1) It is important to understand who your users are when designing information literacy instruction, particularly with respect to the aspects of information literacy with which they need the most help
2) Studying how students interact with information can be a tool for assessing user needs and evaluating existing IL instruction programs
3) As predicted, hybrid IL instruction can be more effective than either face-to-face or online-only modes of instruction because students can enjoy the benefits of both learning environments