Curriculum Mapping
Curriculum mapping is the identification of which information literacy competencies or other student learning outcomes are addressed by library instruction (whether delivered in person, online, or hybrid) for a particular department, program, course, or class level. Once the curriculum is mapped, specific courses, programs, and class levels can be logically targeted for assessment. The following template, examples, and links provide more information, and a brief illustrative sketch follows the list:
- Sumsion, J., & Goodfellow, J. (2004). Identifying generic skills through curriculum mapping: A critical evaluation. Higher Education Research & Development, 23(3), 320-346.
- Curriculum Mapping: What is It? (.ppt)
- Course Alignment Matrix/Planning Guide (template)
- General Education IC Map Survey of Librarians (template)
- Information Competence Matrix (example)
- Information Competence Matrix Report (example)
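One way to see why a mapped curriculum makes assessment targets easy to choose is to treat the map as a course-by-outcome matrix. The short Python sketch below does exactly that; the course numbers and outcome labels are hypothetical placeholders, not drawn from the templates linked above.

```python
# A curriculum map sketched as a course-by-outcome matrix.
# Course numbers and outcome labels are hypothetical placeholders.

curriculum_map = {
    "ENGL 113": {"define_need", "search_strategies"},
    "HIST 270": {"search_strategies", "evaluate_sources"},
    "BIOL 495": {"evaluate_sources", "cite_ethically"},  # capstone course
}

all_outcomes = {"define_need", "search_strategies",
                "evaluate_sources", "cite_ethically"}

def courses_covering(outcome):
    """Courses that address an outcome: logical targets for assessment."""
    return [course for course, outcomes in curriculum_map.items()
            if outcome in outcomes]

# Outcomes no course addresses: gaps in the mapped curriculum.
covered = set().union(*curriculum_map.values())
gaps = all_outcomes - covered

print(courses_covering("evaluate_sources"))  # ['HIST 270', 'BIOL 495']
print(gaps or "no gaps")                     # no gaps in this toy map
```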
Specific Methods
Rubrics
Rubrics are ranked measures of student learning outcomes presented as a matrix of criteria and performance levels. A rubric might rate, for example, the quality of the resources cited in an assignment (authority, source, currency, breadth, depth, etc.) or the efficacy of student search strategies as documented in a search journal or another summary of how and from where resources were obtained. Rubric scoring is both a direct and an authentic assessment of student learning because it focuses on actual student output. Many information literacy rubrics are available on the Internet; a simple scoring sketch follows the examples below. Rubric examples:
- AAC&U Information Literacy VALUE Rubric
- Annotated Bibliography Rubric (CSU San Marcos)
- Annotated Bibliography Rubric (Santa Clara University)
- Information Competence Rubric (CSUN)
- Information Literacy Guidelines for Rubrics (Delaware Tech)
- Rubrics for Assessing Information Competence in the California State University (reprint, see p. 7)
- Rubric for Assessing Research Papers (ACRL, CLIP Note #32)
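To make the matrix-of-criteria-and-performance-levels structure concrete, here is a minimal Python sketch of rubric scoring. The criteria, level labels, and point values are illustrative assumptions, not taken from any of the rubrics listed above.

```python
# Rubric scoring sketched as a matrix of criteria x performance levels.
# Criteria, level labels, and point values are illustrative only.

LEVELS = {"beginning": 1, "developing": 2, "proficient": 3, "advanced": 4}
CRITERIA = ("authority", "currency", "breadth", "depth")

def score_artifact(ratings):
    """Total one rater's performance levels for one student artifact."""
    missing = [c for c in CRITERIA if c not in ratings]
    if missing:
        raise ValueError(f"unrated criteria: {missing}")
    return sum(LEVELS[ratings[c]] for c in CRITERIA)

# One rater's judgments of the sources cited in one term paper.
paper = {"authority": "proficient", "currency": "advanced",
         "breadth": "developing", "depth": "proficient"}

top = len(CRITERIA) * max(LEVELS.values())
print(f"{score_artifact(paper)} of {top}")  # 12 of 16
```

Keeping the levels as labels mapped to points keeps the rater-facing rubric readable while still allowing totals to be computed and compared across artifacts.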
Surveys/Tests
Surveys and tests consist of closed-ended questions (multiple-choice, or forced-choice formats such as yes/no and true/false) or open-ended short-answer and essay questions. They can serve as direct or indirect assessment of student learning, but the method is not authentic in that it does not measure actual student work or student search behavior.
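As a simple illustration of why this method scales, closed-ended responses can be machine-scored against an answer key. The items, keys, and responses in the Python sketch below are made up for illustration.

```python
# Scoring a closed-ended IL test against an answer key.
# Items, keys, and student responses below are made up.

answer_key = {1: "b", 2: "true", 3: "d", 4: "false"}

def percent_correct(responses):
    """Percent of keyed items a student answered correctly."""
    correct = sum(responses.get(q) == a for q, a in answer_key.items())
    return 100 * correct / len(answer_key)

pretest  = {1: "a", 2: "true", 3: "d", 4: "true"}
posttest = {1: "b", 2: "true", 3: "d", 4: "false"}

# Pre/post comparison before and after library instruction.
print(percent_correct(pretest), "->", percent_correct(posttest))  # 50.0 -> 100.0
```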
Embedded IL Assessment
The following table of assessment tools was drawn from the results of a survey of departments at California State University, Northridge conducted by the Coordinator of Academic Assessment and Program Review. The columns labeled "Survey/Test," "Rubric," and "Other" indicate methods to consider when embedding information literacy assessment in these general tools. The advantages of this approach are a 100% return rate and the application of authentic IL assessment to actual student work. A small illustrative sketch follows the table.
| Tool | Survey/Test | Rubric | Other |
|---|---|---|---|
| Term paper/speech outline, etc. (requires outside research, possibly research summary) | X | | |
| Capstone course thesis/project/activity (requires outside research, possibly research summary) | X | | |
| Embedded exam questions (objective or open-ended within a quiz/midterm/final) | X | | |
| Portfolio (requires outside research, possibly research summary) | X | | |
| Pretest/posttest | X | | |
| Fieldwork/internship/student teaching evaluation (embedded research summary) | X | X | |
| Interviews/focus groups (pre- or post-library instruction) | X | | |
| Self-rating skills or attitudes toward library research/instruction/use | X | X | |
| Survey | X | X | |
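For planning purposes, the table above can also be treated as data. The abridged Python sketch below (tool names shortened; method markings follow the table as shown) queries which tools can also carry rubric-based, authentic IL assessment.

```python
# The embedded-assessment table treated as data (abridged; markings
# follow the table above).

tools = {
    "term paper/speech outline": {"survey/test"},
    "embedded exam questions": {"survey/test"},
    "pretest/posttest": {"survey/test"},
    "fieldwork/internship evaluation": {"survey/test", "rubric"},
    "self-rating of skills/attitudes": {"survey/test", "rubric"},
}

# Tools that can also carry rubric-based (authentic) IL assessment.
rubric_ready = [t for t, methods in tools.items() if "rubric" in methods]
print(rubric_ready)  # ['fieldwork/internship evaluation', 'self-rating of skills/attitudes']
```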
Further Reading
- Knight, L.A. (2006). Using rubrics to assess information literacy. Reference Services Review, 34(1), 43-55. doi: 10.1108/00907320610640752
- Oakleaf, M., & Kaske, N. (2009). Guiding questions for assessing information literacy in higher education. Portal: Libraries & the Academy, 9(2), 273-286. doi: 10.1353/pla.0.0046
- Sobel, K., & Wolf, K. (2011). Updating your tool belt: Redesigning assessments of learning in the library. Reference & User Services Quarterly, 50(3), 245-258.