This article describes how redesigning a program's assessment practices for teaching with primary sources (TPS) can provide a clear framework for talking about the impact of educators' work in archives and can provide feedback on how to refine instruction practices for better results. The authors share a description of their assessment redesign process accompanied by analysis of the implementation of their new assessment tool in the hope that others will consider the design and goals of their own assessment practices. The authors' work demonstrates that reflection on existing tools, development of new goals, and design of new assessment strategies can yield inspiring new data on program impact and highlight areas for improvement. By illustrating the authors' redesign process, this article also demonstrates the types of impacts and outcomes that educators can measure for TPS and points to the considerable potential of TPS in local history contexts and elsewhere. The authors' revised student assessment moved archives staff from relying on self-reported, affect-focused data to better understanding the outcomes of their work with students: the impact of project-based learning in archives; the value that students find in various aspects of their encounters with archives; the role that TPS in local history contexts plays in connecting students to their community; and the transferability of research skills that students learn through TPS activities.

This article describes how redesigning a program's assessment practices for teaching with primary sources (TPS) can provide a clear framework for understanding and talking about the impact of our educational work and can give us as educators feedback on how to refine instruction practices for better results. Examination of data from student surveys implemented regularly over ten years of a K–12 TPS program, Brooklyn Connections, helped us reflect on and revise our assessment goals. With the support of researchers at Knology, we redesigned our student assessment tools to gather data more closely aligned with new assessment goals and research questions. In this article, we share a description of our redesign process accompanied by an example of the implementation of our new assessment tool and an analysis of results. The primary goal of this article is not to assess program outcomes, although our assessment results are discussed in the context of analyzing the redesigned tool, but to examine the process and outcomes of redesigning assessment tools. We hope this work will inspire others to reflect on the types of impacts and outcomes they can measure for similar instruction programs in archives and to consider how they might refine their assessment practices.

We understand our work on this project exists in the wider context of assessment in libraries, in library instruction, and in the field of TPS. To situate this work, it is helpful to consider literature on assessment in libraries and more specifically with respect to library instruction, TPS, and instruction for K–12 students. We are also interested in thinking about assessment of assessment: have others reflected on and revised their assessment practices to refine or change possible findings?

Assessment in Libraries

Broadly speaking, assessment of library services is a common practice. The American Library Association's Office for Research and Evaluation supports projects related to best practices for data collection and use in libraries and provides training on various aspects of library assessment.1 The Public Library Association initiated Project Outcome to provide assessment tools to help public libraries increase their community impact and use resources as efficiently as possible.2 Megan Oakleaf's work on library assessment is a common reference point across the profession and situates assessment as a tool to prove library value.3 More directly related to the impact of library programs specifically with K–12 audiences, Oakleaf's focus on the value of libraries recalls much earlier research by Mary Gaver, which assesses K–12 school library (and librarian) value through improved test scores.4

Assessment of K–12 Library Instruction

With respect to library instruction specifically, assessment has been used beyond demonstration of value to revise and refine program design and to improve librarians' instruction skills.5 For K–12 audiences, assessment has evolved from Gaver's research toward determining the impact of specific library interactions.6 One example is the Tool for Real-time Assessment of Information Literacy Skills (TRAILS), which was developed to assess whether K–12 library instruction programs help students acquire information literacy skills aligned with state education standards. Beyond communicating the impact of library instruction, TRAILS helped revise instructional practices.7 This protocol has been used in a variety of settings to assess information literacy competency and to measure skill development over time for grade school students.8 TRAILS was an online tool from 2006 through 2019, with functionality for librarians to create accounts and administer tests to students; the basic components of TRAILS, including assessment questions and related teaching resources, continue to be freely available online in the TRAILS archives.9

Tools for Library Assessment

Mechanisms for assessment vary across libraries and programs. Common approaches or tools used in academic libraries today include fixed-choice tests, such as the TRAILS method; performance assessment; and rubrics.10 Self-reporting as assessment is also common, although it is understood to be problematic because students are highly likely to evaluate their own abilities inaccurately.11 Megan Oakleaf describes fixed-choice assessment tools as highly reliable, although Andrew Walsh expresses concern about overreliance on this type of assessment.12 Walsh notes that fixed-choice was the most common assessment tool found in his survey of the literature and suggests that it is most meaningfully implemented when designed in conjunction with clear information literacy standards.13 TRAILS is one example of a fixed-choice assessment aligned to standards, in this case those developed by the Ohio Department of Education, the American Association of School Librarians, and the Association for Educational Communications and Technology.14

Assessment in Teaching with Primary Sources

Specifically in the field of instruction with archives collections, commonly referred to as teaching with primary sources (TPS), there is a noted dearth of assessment literature from which to draw; historically, this field has largely relied on anecdotal feedback or self-assessment, or has lacked assessment entirely.15 A few assessment models based on rubrics have been developed to evaluate student engagement with digital and physical archival materials and development of document analysis skills through archival instruction.16 When writing about their own rubric for assessing instruction in archives, Press and Meiman note that rubric assessment can also be impressionistic.17

Practitioners across the field of TPS note a need for better assessment practices.18 The most prominent quantitative tool created in response to this demand is the Archival Metrics Toolkit. A Student Researcher version of the toolkit evaluates the quality and impact of general orientation in an archives and the quality of user interaction with archives staff, physical materials, and access tools.19 The toolkit relies on self-reported data; however, as noted, this type of assessment is limited in its reliability.20 Carini suggests that the Archival Metrics Toolkit could be more helpful if it were based on a set of standards or goals.21 This speaks to the necessity of identifying goals and metrics for measurement when designing assessment practice. Elizabeth Yakel and Deborah Torres advocate that instructors aim for a type of “archival intelligence” that encompasses a general understanding of archival practice, research strategies for navigating archives-based projects, and “intellective” skills for analyzing and interpreting the types of materials found in archives.22 The 2018 Guidelines for Primary Source Literacy set new benchmarks for literacy skills and have led to the creation of a TPS assessment rubric that breaks assessment into five categories: conceptualize; find and access; read, understand, and summarize; interpret, analyze, and evaluate; and use and incorporate.23 However, parallel to the development of these standalone guidelines are conversations about the value of TPS for teaching transferable skills—not just archives-specific skills.24 Although skills transfer may be a more immediate benefit as students use research and critical thinking skills in other areas of their academic studies, others note the helpfulness of TPS in imparting habits for civic engagement and skills that can be used in the job market.25

Redesign of Assessment Tools

While assessment is still new and uncertain territory for some, especially in the field of TPS, others have conducted substantial program assessment over a number of years and have considered whether they can use this information to redesign their own assessment practices. Susan Searing reflects on a decade of assessment of an information literacy program and concludes that revised assessment tools would not necessarily improve their process. Instead, Searing recommends application of lessons learned from previous assessment protocols as the most useful focus of future efforts.26 By contrast, in his interview with Daniel Callison, Keith Curry Lance explains the reasoning behind development of new assessment protocols for measuring the impact of school library services. Curry Lance says that existing assessment practices had become “exhausted”; extremely consistent findings over time and across a number of states made it difficult to learn new things from existing assessment tools.27

Our own assessment redesign project focuses on the Brooklyn Connections program at Brooklyn Public Library's Center for Brooklyn History (a department formed through the merger of the Brooklyn Historical Society and Brooklyn Public Library's Brooklyn Collection). The Center for Brooklyn History provides access to the largest collection of Brooklyn history in the world. Brooklyn Connections works with students in grades 4 through 12 at local schools, through a residency-style partnership program, to teach research skills through the lens of local history. Teachers apply for up to four of their classes (of the same grade level) to work with a Brooklyn Connections educator over the course of the school year; the program accepts teachers at up to thirty-five schools in a given school year, with preference given to Title 1 public schools.28 The majority of classes in the program are at either an ICT (integrated co-teaching) or general education level, although self-contained and honors classes also participate.29

Within the scope of their partnership, teachers work with their Brooklyn Connections educator to select a local history topic they can research at the Center for Brooklyn History and a series of skills-based information literacy lessons the educator will teach during four to six in-class visits. These might include analyzing historic documents, taking notes, citing sources, writing a research question, and crafting a thesis statement. All classes visit the archives at least once and complete a research project, which is displayed at a year-end convocation ceremony attended by students from every partner school.30

Dating back to its start in 2007, program assessment for Brooklyn Connections has been conducted both formally and informally. Informal assessment happens on a continuous basis: Brooklyn Connections educators are constantly in touch with partner teachers to discuss what has worked well during each classroom visit and what could be modified to better suit classroom learning styles and student needs. In addition, educators observe student learning during in-class sessions to better understand what teaching techniques, archival materials, and activity styles connect best with students.

Formal assessment has historically comprised anonymous pre- and posttest surveys completed by all partner teachers and students at the beginning and end of the school year. For the purposes of this article, we focus on the posttest student assessment tool; presurveys did not gather information that could be used to analyze program impact.31 The Student Exit Survey provides anonymous self-report data, so results cannot be broken down by school but instead give an overall measure of the program's affective impact: Did students enjoy working on research? Did they enjoy visiting the archives and seeing historic documents? Did they find the Brooklyn Connections program helpful for learning research skills and completing a project? Do they feel confident about completing a future research project?

In 2019, Brooklyn Connections reached out to Knology, a social science research nonprofit with experience in evaluating educational initiatives, for assistance in understanding the data collected in formal assessments from the 2012–2013 school year through the 2018–2019 school year. Previously, assessment data had been used primarily to demonstrate program value to funders. Survey feedback provided easy quantitative and qualitative data for sharing in grant reports, such as the finding that 80% of students said they would be more confident working on a future research project after participating in Brooklyn Connections, or that students enjoyed “getting to explore more about Brooklyn history.” The Brooklyn Connections team examined whether existing data could be used to understand more about the program's impact on students and teachers, as well as whether the assessment tool could be strengthened to provide more useful information about our instruction practice and the outcomes of our program.

Researchers at Knology confirmed that, overall, existing assessment data showed that students and teachers were very happy with the Brooklyn Connections program. Knology's analysis organized student survey responses according to experiences with the program, self-assessment of skills learned, and self-assessment of overall perception. While responses were generally skewed toward the positive end of their scales, there was sufficient variability to test the strength of correlations between responses. The data showed a strong positive relationship between response types: increased ratings on experiences and skills learned were accompanied by increasingly positive perceptions of the Brooklyn Connections program as well as of history (or social studies) overall.32 The results were corroborated by a computational analysis of the underlying affective associations conveyed by word choices in free-text responses to qualitative survey questions. On average, students “felt good and ready for action” after completing the Brooklyn Connections program.33
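The specific computational method behind this affective analysis is not described here; as one illustration of how word choices in free-text responses can be scored for affect, the sketch below uses NLTK's VADER sentiment analyzer. This is an assumed stand-in for Knology's tooling, not a description of it.

```python
# Illustrative affect scoring of free-text survey responses with NLTK's
# VADER analyzer; an assumed stand-in, not the tool Knology used.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # lexicon VADER relies on
sia = SentimentIntensityAnalyzer()

responses = [
    "I loved getting to explore more about Brooklyn history",
    "The project was hard but the archive visit was fun",
]
for text in responses:
    # compound ranges from -1 (most negative) to +1 (most positive)
    print(f"{sia.polarity_scores(text)['compound']:+.2f}  {text}")
```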

Researchers at Knology highlighted the consistency of student report data across all years and suggested modifying the assessment strategy going forward; this echoes the sentiments of Keith Curry Lance, noted previously, regarding modification of assessment tools after they have become “exhausted.”34

With this analysis of past assessments in mind, Knology researchers met with the Brooklyn Connections team to imagine a redesign of program assessment tools. While the entire scope of this conversation and subsequent work involved redesign of teacher and student assessment tools, for the purposes of this article, we focus on redesign of student assessment surveys.

A first step in this process was setting goals for assessment. First, after realizing that the previous survey design limited findings to self-assessment of perception and outcomes, the Brooklyn Connections team was interested in assessing more concretely whether students were in fact learning research skills and whether they could transfer these skills to contexts outside archival research on local history topics. Clearer assessment of the success (or lack thereof) of skill transfer could help the Brooklyn Connections team refine instruction techniques for specific skills or modify supplemental materials to reinforce skills transfer, such as providing classroom teachers with clear mechanisms for repeating and reinforcing these skills in other subject areas. Second, the team wondered whether the impact of Brooklyn Connections on identity formation could be assessed: while educators often see students indicate a closer connection to the place they live after studying local history, previous assessment tools had not been designed to capture this. Understanding this could help Brooklyn Connections educators advise classroom teachers on how to incorporate the program into their curriculum and would also help communicate this program outcome to teachers who might not otherwise recognize the value of the program for their students. Finally, Brooklyn Connections was interested in continuing to gather data on student experiences with the program, as well as student perceptions of the library and of history research; these data remain invaluable for communicating with program funders. Staff at Knology helped craft these goals into a series of research questions that would drive redevelopment of the survey tool.35

Assessment format was a key consideration in the planning process. While other options were explored, the nature of Brooklyn Connections program delivery across an entire school year and with approximately 1,500 students per school year made a survey (delivered on paper or electronically) the simplest option for receiving and analyzing data. A survey could easily be administered by a Brooklyn Connections educator at the end of each program year, and construction of the survey tool accounted for the fact that it may need to be delivered either electronically or in person, depending on the school and on available technology. In the 2020–2021 school year, online learning made an electronic survey easy for teachers to share as a link through their respective virtual learning environments. We anticipate that future years will return to in-person teaching, and our experience has been that the majority of partner schools do not have classroom technology for all students to complete a task on a computer. Therefore, it was important to design a survey adaptable for both electronic and paper delivery.

While previous years of the partnership program had included a pre- as well as a postsurvey, pretest instruments did not contain data that could be compared to assess change resulting from the program. Moreover, the most meaningful analysis of a pre- and posttest would require assignment of unique IDs to compare student responses. Because of the complexity of the school environments Brooklyn Connections usually works in, assignment of unique IDs was seen as having a high risk of noncompliance and was ruled out. However, student surveys were designed to include space for students to indicate their teacher; this provides the option to compare student responses with feedback provided by their teacher on a separate teacher survey (the teacher survey is not discussed within the scope of this article). A decision was made to design the new assessment tool as a postsurvey only, delivered either by the teacher or by a Brooklyn Connections educator during class time after the culmination of the school year partnership.

A new survey instrument with three modules was created to gather data related to our three new assessment goals: understanding student experiences with the Brooklyn Connections program; capturing program impact on identity and students' feelings about Brooklyn as a place; and analyzing whether students demonstrate the transferable research literacy skills that they were expected to learn through the program.36

In the first module, on students' experiences and enjoyment of the Brooklyn Connections program, students are invited to indicate how much they remember various aspects of the Brooklyn Connections program. They are asked to report if they visited any libraries for their research project (options include the Center for Brooklyn History, Brooklyn Public Library locations, the school library, or other), and to select the types of activities they did at the library. Students can then select the types of activities, from a similar list, that they might do if they had to complete another research project.

Module One also asks students to reflect on how much they enjoyed Brooklyn Connections. They are first invited to “self-calibrate” their sense of enjoyment by writing about their favorite school activity. Alongside the use of unbiased language in the assessment tool, this self-calibration is our technique for preempting response bias toward socially desirable answers. Students are then asked to use this benchmark of what they really like to rank various aspects of their experience with Brooklyn Connections on a scale from “really disliked” to “really liked.” An open-ended question asks students to explain, to another student who has not participated in Brooklyn Connections, why they might like to do the program.

The second module seeks to understand the “ultimate” impact of Brooklyn Connections. On a practical level, Brooklyn Connections helps students conduct archival research on Brooklyn history, using the Center for Brooklyn History at the Brooklyn Public Library. Beyond its educational benefits, the process of archival research may spark recognition of people, places, and events from the past. This recognition may, in turn, spark affiliation with those people, attachment to those places, and engagement with those events, connecting people, places, and events from the past to people, places, and events in the present. Through this connection, archival research may amplify a Brooklyn (local) identity—affiliation, attachment, and engagement—that binds across time and other differences. This Brooklyn identity may motivate students to action (or, at least, aspirations of future action) that serves the Brooklyn community. Educators informally witnessed this in teachers and students over past years of the partnership but had not previously sought to capture this impact through an assessment tool. Our new survey tool aims to trace the ultimate impact of Brooklyn Connections along a path of research → recognition → affiliation/attachment/engagement → identity → motivation → aspirations/actions.

To achieve this, a series of survey questions invites students to indicate where they fall along the continuum of “strongly disagree” to “strongly agree” for the following statements, designed to gauge recognition, affiliation, identification, and motivation/aspiration respectively:

  • While doing research for my Brooklyn Connections project, I found that people, places, and events from Brooklyn's past reminded me of people, places, and events in Brooklyn today.

  • Learning about people, places, and events from Brooklyn's past makes me feel closer and more connected to Brooklyn's past.

  • Learning about people, places, and events from Brooklyn's past makes me feel closer and more connected to Brooklyn today.

  • Learning how people from Brooklyn's past shaped what Brooklyn is today makes me want to take action that will shape the Brooklyn of the future.

The third module of the new student survey aims to assess the “proximal” impact of Brooklyn Connections: are students in fact learning research skills, and can they transfer these to other research contexts? Design of this section took into consideration the fact that, over the course of the partnership, each teacher chooses a series of skills-based information literacy lessons the educator will teach during four to six in-class visits. This means that not every student taking this survey will have received the same skills-based lessons. However, educators were able to pinpoint a number of skills that all Brooklyn Connections students are usually introduced to through whichever lessons their teacher selects:

  1. Understand the difference between a primary and secondary source

  2. Understand the difference between fact and opinion

  3. Judge the trustworthiness of a source

  4. Ask questions for research

  5. Understand that the research process has multiple steps

To assess whether students attained and could transfer these research skills, questions were adapted from the TRAILS assessment protocol described earlier. Knology staff and Brooklyn Connections educators reviewed TRAILS assessment items corresponding to the skills noted, at elementary, middle, and high school levels, and selected five questions for each of the three learning levels. When the questions are delivered online, students can be directed to those appropriate for their learning level by selecting their grade level. When they are delivered as a paper survey, Brooklyn Connections educators provide students with the version of the survey appropriate for their grade level. The goal of using TRAILS assessment items was to draw on an existing standards-aligned assessment tool that has been widely implemented. We were also aware that TRAILS had previously published a set of benchmarks that could allow us to reflect on how Brooklyn Connections students' responses measured against those of their peers.
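As a minimal sketch of this grade-based routing, the snippet below maps a student's reported grade to a TRAILS-derived question set. The function, data structure, and exact grade boundaries are our own illustrative assumptions, not the survey platform's actual logic.

```python
# Hypothetical routing of students to the TRAILS-derived question set
# for their learning level; boundaries are assumptions for illustration.

QUESTION_SETS = {
    "elementary": "TRAILS grade 3 items",
    "middle": "TRAILS grade 6 items",
    "high": "TRAILS grade 9 items",
}

def learning_level(grade: int) -> str:
    """Map a grade (4-12, the program's audience) to a learning level."""
    if grade <= 5:
        return "elementary"
    if grade <= 8:
        return "middle"
    return "high"

# Example: a seventh grader is served the grade 6 TRAILS items.
print(QUESTION_SETS[learning_level(7)])
```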

While this survey instrument was designed for a specific student population in a specific context, the methodology used is not limited to this context, and sections of it could be adapted to other educational settings (including those outside K–12) that have similar desired outcomes. Specifically, the ultimate impact items in Module Two could be edited to refer to a different location, and skill-based questions in Module Three could be swapped with other questions, for the age group desired, within the TRAILS database.37

This new student survey was delivered for the first time at the end of the 2020–2021 Brooklyn Connections partnership year. The 2020–2021 partnership year was a very different experience for students and teachers because of the COVID-19 pandemic. The majority of New York City students were engaged in full or partial online learning, and so we adapted our partnership program for virtual delivery. Because of ongoing uncertainties during the school year due to changing school schedules and hybrid learning models, fewer teachers were able to engage with the partnership program. While past years have engaged sixty to seventy teachers working with approximately 1,500 students at thirty to thirty-five Brooklyn schools, in 2020–2021, we worked with fourteen teachers and 362 students in sixteen classes at ten schools. Eighty percent of these schools were Title 1. Participating classes included general education, integrated coteaching, self-contained, and honors level.

All instruction from Brooklyn Connections was remote, and students were not able to visit the Center for Brooklyn History archives in person. Only one of the ten schools had students in the classroom for all of their Brooklyn Connections visits (the educator joined remotely). Most students received all of the Brooklyn Connections learning materials electronically through an online learning platform, although some schools were able to receive physical materials—primary source packets and worksheets—through coordination between school administration and Brooklyn Connections. All partner schools completed a research project; in lieu of an in-person project exhibition and convocation ceremony, projects were displayed in a virtual exhibition using Padlet. Brooklyn Connections educators made a short convocation video to celebrate the work of partner schools. A link to the online student survey was sent to teachers with the link to this convocation video and virtual exhibition; some teachers scheduled class time to watch the convocation video and have students complete the survey, while other teachers assigned the survey as a task for students to complete outside of class time.

Out of 362 students who participated in the Brooklyn Connections program during the 2020–2021 school year, 156 responded to our student survey, including students from each of the 10 participating schools. Given the difficulties that we have seen teachers face in engaging students with online assignments, we were satisfied with this 43% response rate. We were also aware that some high school classes that had engaged in the partnership program in the fall semester were no longer accessible to their teachers through the virtual learning platform when the survey was administered, because of how access permissions work for semester-length classes in that digital learning environment; as a result, these students did not receive the survey.

We assessed written responses to check for nonsensical entries that might indicate a lack of comprehension or attention to the task; we did not identify any concerning responses. Of the 156 responses received, 5 students stopped the survey at the end of the first module. An additional 2 elementary school students, 3 middle school students (grades 6 through 8), and 1 high school student (grades 10 through 12)—6 in total—began but did not complete the third module. The overall percentage of responses by age range as compared to a percentage breakdown of partnership participants by age range is illustrated in Figure 1. This comparison does not reveal any large enough discrepancies across age range to impact our use of the data for program assessment.
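As a hedged sketch of how a first-pass screen for nonsensical free-text entries might be automated, the snippet below flags empty, very short, or keyboard-mash responses; the heuristics and thresholds are illustrative assumptions, not the procedure used in this study.

```python
# Illustrative heuristics for flagging possibly nonsensical free-text
# survey responses; thresholds are assumptions, not the study's method.
import re

def looks_nonsensical(text: str) -> bool:
    """Flag entries that are empty, very short, or keyboard mashing."""
    t = text.strip()
    if len(t) < 4:                       # empty or near-empty
        return True
    if re.fullmatch(r"(.)\1{2,}", t):    # one character repeated, e.g. "aaaa"
        return True
    vowels = sum(c in "aeiouAEIOU" for c in t)
    return vowels == 0                   # strings with no vowels at all

responses = ["I liked seeing old maps of my block", "sdfghj", "!!!"]
print([r for r in responses if looks_nonsensical(r)])  # ['sdfghj', '!!!']
```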

FIGURE 1. Breakdown of survey responses and program participation

We also tested the data for response bias to verify whether responses skewed in a specific direction. All skewness values were less than 1 in absolute value and thus did not raise any concerns about response bias. For example, in Module One, when we asked students to “Rate how much you liked or disliked the following aspects of Brooklyn Connections,” responses gave the skewness values shown in Table 1.

Table 1. Response Areas and Themes

The only response in this list that we noted as “suspicious” related to liking the Brooklyn Connections educator, but its skew was well within the acceptable range (−1 < −0.83 < 1), so we did not flag it as response bias.
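A minimal sketch of this skewness check, assuming the ratings are coded as integers in a pandas DataFrame (column names and data are invented for illustration):

```python
# Compute per-item skewness of Likert-style ratings; |skew| >= 1 would
# suggest responses piling up at one end of the scale. Data invented.
import pandas as pd
from scipy.stats import skew

ratings = pd.DataFrame({
    "working_with_educator": [5, 5, 4, 5, 3, 4, 5, 4],
    "seeing_historic_documents": [4, 3, 5, 4, 4, 2, 5, 3],
})

for col in ratings.columns:
    s = skew(ratings[col], bias=False)
    flag = "suspicious" if abs(s) >= 1 else "within range"
    print(f"{col}: skewness = {s:.2f} ({flag})")
```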

Module One Results

The first module of the survey measured students' experiences with Brooklyn Connections as well as their recall of various aspects of the program, resources they used for research, and tools that they would like to use for future research.

Students were provided with a list of 5 items and asked to indicate whether they “really remember,” “sort of remember,” or “completely forgot” each. For working on a research project, 118 students selected “really remember,” while 109 students really remembered studying a Brooklyn history topic, and 107 students really remembered working with a Brooklyn Connections educator. Ninety-three students really remembered seeing historic documents, and 38 really remembered visiting the Center for Brooklyn History archive (see Figure 2).

FIGURE 2. Students' rating of the aspects they remember of Brooklyn Connections

Calculating a score for each of those categories, with a value of 3 assigned to “really remember,” 2 for “sort of remember,” and 1 for “completely forgot,” provides a mean score for each category. Creating a research project received the highest score, at 2.71. Studying a Brooklyn history topic (2.69) and working with a Brooklyn Connections educator (2.64) were close behind; these mean scores are illustrated in Figure 3.
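A minimal sketch of this scoring, assuming responses are collected in a pandas DataFrame; the response wording comes from the survey, but the data and column names are invented:

```python
# Map the three response options to scores of 3/2/1 and take the mean
# per item, mirroring the coding described above.
import pandas as pd

CODES = {"really remember": 3, "sort of remember": 2, "completely forgot": 1}

recall = pd.DataFrame({
    "Working on a research project": ["really remember", "sort of remember",
                                      "really remember"],
    "Visiting the archives": ["completely forgot", "sort of remember",
                              "completely forgot"],
})

mean_scores = recall.replace(CODES).mean().sort_values(ascending=False)
print(mean_scores)
```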

FIGURE 3. Mean score of student responses on how much they remember aspects of Brooklyn Connections

It should be noted that, because of the COVID-19 pandemic, students did not visit the Center for Brooklyn History archives in person during the 2020–2021 partnership program. This option was left on the survey to allow comparison in future years. Students may have indicated that they remembered visiting the archives because they had done the program in a previous year and recalled a previous visit to the archives; because they recalled watching a brief video of the archives that was shared with their teachers as an optional activity; or because they were confused.

Students were asked if they visited any libraries or archives outside class time to do their research, including Brooklyn Public Library's Central Library, a different library in Brooklyn, a library not located in Brooklyn, their school library, the Center for Brooklyn History archives, the Brooklyn Public Library website, or none of these places. In the 2020–2021 partnership year, we expected responses to be low for this question because of library closures due to the COVID-19 pandemic. Responses are depicted in Figure 4.

FIGURE 4. Places students visited to work on their projects outside of school

Students were also able to share what they did if they visited a library (see Figure 5). Twelve indicated that they talked to a librarian, 29 talked to a Brooklyn Connections educator, 23 looked for information in books, 11 used a library computer, and 59 indicated that they looked at online resources. Students were able to indicate other resources that they used at the library, and several commented that they were not able to visit libraries and had not had any in-person classes but had to do all their work remotely. One student mentioned, “I would not go to the an [sic] Library because I think looking online is much easier.” Other students indicated that they used the library to print things.

FIGURE 5. Student responses on using library resources for research

In thinking about which of these library resources they would use or reuse if they had to do another research project, 28 indicated that they would speak with a librarian, 54 would speak with a Brooklyn Connections educator, 83 would look for information in books, 56 would use a computer, and 80 would use online resources (see Figure 5). In the space supplied for other comments, one student reiterated how much they would like to look for books, while another indicated that they would like to use microfilm. Twenty-four indicated that they would not visit a library. Overall, 113 students identified that they would use a type of resource that they did not previously identify using on their Brooklyn Connections project. Resources students indicated they had used versus resources students indicated they would like to use on a future project are compared in Figure 5.

Module One concluded with a focus on how much students enjoyed the program. After calibrating their own measure of enjoyment by writing a brief reflection on their favorite school activity, students were asked to rate a series of aspects of Brooklyn Connections on a 5-point scale, from “really disliked” (1 point) to “really liked” (5 points). Responses were assigned values of 1 to 5 points, and the mean score is displayed alongside a total number of students who selected “really liked” in Table 2. “Working with your Brooklyn Connections educator” was the aspect of the program that received the highest mean score (4.14) as well as the greatest number of “really liked” designations (67 students).

Table 2. Students' Ratings on How Much They Liked Aspects of Brooklyn Connections

Module Two Results

In the second module, we aimed to measure the ultimate impact of Brooklyn Connections, looking for evidence that the process of archival research on local history topics sparks a recognition that in turn generates affiliation with the past, identification with the present, and aspiration for the future. Students answered a series of questions designed to measure whether the partnership program generated each of these four reactions, by moving a slider along a continuum of “really disagree” to “really agree.” These slider responses were translated into a number from 0 to 100, where 0 is “really disagree” and 100 is “really agree.” The mean of these scores is recorded in Table 3.

Table 3. Measuring the Ultimate Impact of Brooklyn Connections

We identified that the correlation between scores for recognition and affiliation was reliable (r = .44, p < .001), as was the correlation between recognition and identification (r = .58, p < .001). If students gave a high rating to the connection they felt with Brooklyn's past while doing research, they were more likely to give a strong rating to how closely they feel connected to both the past and the present of the place they live in. We used a linear regression to predict the aspiration ratings from the ratings on the antecedent constructs—recognition, affiliation, and identification—as well as performance on the TRAILS-derived measure. Except for identification,38 the hypothesized antecedents contributed significantly39 to the aspiration ratings. For every unit increase in the ratings on recognition (on a scale from 0 to 100), the aspiration ratings increased by 7.30 (again, on a scale from 0 to 100). Similarly, aspiration ratings increased by 5.56 for every unit increase in affiliation. Altogether, the ratings on the three antecedent constructs explained 32% of the variance in aspiration ratings; this is a large effect40 and provides strong evidence of a substantive relationship between future aspiration and feelings of recognition and affiliation. Performance on the TRAILS test accounted for only 2% of the variance in aspiration ratings. In other words, aspirations were negligibly associated with learning performance; feelings of recognition and affiliation better predicted aspiration.
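A minimal sketch of this analysis, assuming slider responses have been rescaled to 0–100 scores in a pandas DataFrame; all data below are invented, and ordinary least squares is used as a standard form of the linear regression described:

```python
# Pearson correlations between constructs, then an OLS regression
# predicting aspiration from the antecedents and TRAILS performance.
import pandas as pd
import statsmodels.api as sm
from scipy.stats import pearsonr

df = pd.DataFrame({
    "recognition":    [80, 65, 90, 40, 70, 55, 85, 60],
    "affiliation":    [75, 60, 85, 35, 72, 50, 80, 58],
    "identification": [78, 58, 88, 42, 69, 52, 83, 55],
    "trails_score":   [4, 3, 5, 2, 4, 3, 5, 3],   # out of 5
    "aspiration":     [82, 55, 90, 30, 75, 48, 88, 50],
})

r, p = pearsonr(df["recognition"], df["affiliation"])
print(f"recognition ~ affiliation: r = {r:.2f}, p = {p:.3f}")

X = sm.add_constant(df[["recognition", "affiliation",
                        "identification", "trails_score"]])
model = sm.OLS(df["aspiration"], X).fit()
print(model.params)     # per-unit changes in aspiration
print(model.rsquared)   # proportion of variance explained
```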

Module Three Results

The third module drew on TRAILS assessment protocols to assess the proximal impact of the Brooklyn Connections partnership program: are students learning new research skills that they can transfer to other research contexts? Students were given different questions based on their self-identification as elementary, middle, or high school students, drawn from the TRAILS protocols for grades 3, 6, and 9 respectively. All students were presented with 5 multiple-choice questions appropriate to their grade level, designed to assess whether they could

  1. Identify a good research question,

  2. Understand the best order of steps in the research process,

  3. Identify primary sources,

  4. Differentiate between fact and opinion, and

  5. Judge whether a source is trustworthy or reliable.

Students were given a score out of 5 for the 5 questions, and the mean score was calculated for elementary, middle, and high school. Score breakdown by grade and by age group is laid out in Table 4.

Table 4. Summary of Module Three Scores

As illustrated in Table 4, mean scores vary considerably by grade level within each larger category of elementary, middle, and high school. The significance of this variation at the grade level is unclear because of the small sample size of some grades.

We compared percentage scores with benchmarks released by TRAILS from the 2016–2017 school year (the last year benchmark data were published), with an understanding that these benchmark data were collected in a variety of posttest and pretest settings and allow one to compare Brooklyn Connections students against both national averages and New York State averages.41 These comparisons are illustrated in Table 5.

Table 5. Comparison of TRAILS and Brooklyn Connections Assessment Data

On a more granular level, elementary and middle school students consistently struggled with the first question in Module Three, which asked them to identify the research question best suited to studying a given topic. Twenty-four of 59 elementary school students answered this question correctly (40%), as did 18 of 44 middle school students (40%). High school students struggled most with questions 2 and 4, which focus on understanding the steps of the research process (question 2) and differentiating between fact and opinion (question 4). For question 2, 16 out of 42 students answered correctly (38%). On question 4, 13 out of 42 students answered correctly (30%).

To understand the reliability of comparison between benchmark scores and our data, we ran a t-test to corroborate the apparent differences between the scores for Brooklyn Connections students and the US benchmarks at each of the learning levels: elementary, middle, and high school. Specifically, we tested whether the average (arithmetic mean) scores observed for Brooklyn Connections students would be “surprising,” or unlikely, given the benchmark scores. In performing a t-test, one compares a hypothetical score (i.e., the benchmark score for each learning level) against the estimated range of probable scores based on the mean and standard deviation of the observed scores.42 Our tests corroborated that, on average, both elementary and middle school students in Brooklyn Connections performed better than the US benchmark. The hypothetical score based on the grade 3 US benchmark (M = 5 × 0.532 = 2.66) fell outside the estimated range for elementary students in Brooklyn Connections (3.01 < M < 3.70). Cohen's d statistic measures the magnitude of the difference between two average scores (effect size), in this case the observed average score and the benchmark score. Research conventions specify a value of d ≅ 0.2 as a small effect, d ≅ 0.5 as medium, and d ≅ 0.8 as large. By these conventions, Brooklyn Connections had a medium effect (Cohen's d = 0.52) on the scores of elementary students. Likewise, the hypothetical score based on the grade 6 US benchmark (M = 5 × 0.471 = 2.36) fell outside the estimated range for middle school students in Brooklyn Connections (3.25 < M < 3.84). Brooklyn Connections had a very large effect (Cohen's d = 1.22) on the scores of middle school students. For high school students in Brooklyn Connections, we could not rule out chance in the difference between the observed average score and the hypothetical score: the hypothetical score based on the grade 9 US benchmark (M = 5 × 0.502 = 2.51) fell within the estimated range for high school students in Brooklyn Connections (2.44 < M < 3.27). In other words, high school students in Brooklyn Connections likely performed about as well as the average high school student in the United States.
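A minimal sketch of this benchmark comparison; the scores below are invented, and only the grade 3 benchmark proportion (0.532) comes from the text:

```python
# One-sample t-test of observed Module Three scores (out of 5) against
# the hypothetical mean implied by the TRAILS US benchmark, plus
# Cohen's d and the confidence interval for the observed mean.
import numpy as np
from scipy import stats

scores = np.array([4, 3, 5, 3, 2, 4, 3, 4, 5, 3])  # elementary, invented
benchmark = 5 * 0.532                                # grade 3 US benchmark

t_stat, p_val = stats.ttest_1samp(scores, popmean=benchmark)

# Cohen's d: difference between observed mean and benchmark, in SD units
d = (scores.mean() - benchmark) / scores.std(ddof=1)

# 95% confidence interval for the observed mean (the "estimated range")
ci = stats.t.interval(0.95, df=len(scores) - 1,
                      loc=scores.mean(), scale=stats.sem(scores))

print(f"t = {t_stat:.2f}, p = {p_val:.4f}, d = {d:.2f}")
print(f"95% CI for the mean: {ci[0]:.2f} to {ci[1]:.2f}")
```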

Overall, an important first step in reflecting on the redevelopment of our program assessment tools is asking whether the tools themselves are appropriate for our target population. The noncompletion rate we saw (11 of 156 students, or 7%, did not finish the survey) feels acceptable to us for a survey of this length. In future years, when we expect to have larger cohorts of students in our partnership program, we will watch to see whether the completion rate changes dramatically. We also feel confident about administration of the survey in an online format (in this case, through SurveyMonkey). Future years will require administration of the survey in both paper and online formats; we anticipate increased work resulting from data entry of paper survey responses and translation of sliding-scale responses into numeric values, and we will watch for any other unexpected issues in administration of the paper survey.

The breakdown of survey respondents by grade level (see Figure 1) as compared to a breakdown of participants in the program overall shows similar response and participation rates for high school: 27.6% of responses were from high school students, versus 32.9% of program participants who were high school students. Elementary and middle school response rates were more variable; while 37.6% of students in our partnership program were in middle school, only 30.1% of responses came from middle school. And, while only 29.6% of students in our partnership program were in elementary school, 37.9% of survey responses came from this group. These discrepancies are not large enough for us to be concerned, but we find them interesting; if this pattern continues, it suggests that we should be mindful of how surveys are administered at different grade levels. We suspect that elementary school teachers dedicate more in-class time and support to students doing this survey because they understand that it will be challenging for younger students, while middle school teachers may provide less support and class time based on the assumption that the task will not present a considerable challenge for their students. In the future, we can share more appropriate guidance with teachers on how best to support their students with survey completion, based on grade level.

Further discussion of the findings should be framed by our initial goals for assessment redesign and the questions these led us to. Our redesign was motivated by a desire to derive more useful information from our assessment practice: could assessment have a greater impact on our instruction, and could it give us better frameworks for describing the impact of our work? We developed three guiding research questions which, in turn, framed the three modules of our new assessment tool:

  1. What can we learn about student experiences with the program, as well as student perceptions of the library and of history research?

  2. Is the impact of Brooklyn Connections on identity formation possible to assess, and do students actually feel a closer connection to the place they live after studying local history?

  3. Are students learning research skills, and can they transfer these skills to contexts outside archival research on local history topics?

A discussion of the results of each module, framed by these guiding questions, is provided here, along with suggestions for how redesign of our assessment tool provides new real-world insight for our work.

Module One: What can we learn about student experiences with the program, as well as student perceptions of the library and of history research?

Module One of the survey explored student experiences and perceptions of the program they participated in (see Figures 2 and 3). Analysis of what students remember most from the program shows that working on a research project is most memorable. This confirms the value of incorporating project-based learning into TPS. “Studying a Brooklyn history topic” and “Working with a Brooklyn Connections educator” were both close behind the top-ranked item. This is valuable for us in confirming the impact of teaching through the lens of local history and in highlighting the value to students of building a relationship with an archives educator.

We recognize that the results for most memorable aspects of this program could change in future years. While visiting the archives and working with historic documents ranked lower, in the 2020–2021 school year, students were not able to visit the archives because of COVID-19 closures. And, while students worked extensively with digital reproductions of historic documents and, in a few cases, with print reproductions, they were not able to interact with original documents in the archives. We will be interested to see in future years if these aspects of the program prove more memorable.

A section of Module One asks what places students visited independently and outside of class time while working on their research project (see Figure 4). It is difficult to extrapolate findings from this section because of the impact of COVID-19 closures during the 2020–2021 school year. Sixty students indicated that they used the Brooklyn Public Library website; we are grateful to see that a number of students realized this would be a useful resource, as we know that many of our students rely entirely on search engines for their research. Eighty-two students (53%) indicated that they did not visit any of the places we mentioned. As baseline data, this is useful for us in terms of comparing how these responses change in future years when more options become available after the COVID-19 pandemic.

The next section of Module One invites students to reflect on the library resources they used for their research, as well as library resources they would use in the future if they had to complete another research project (see Figure 5). The most prominent takeaway for us in this section is that 113 students indicated they would use a library resource that they hadn't previously. We understand that many students were not able to use library resources as they would have liked because of library closures due to the COVID-19 pandemic; however, responses show that they are not entrenched in the research behavior they practiced while in our program but are instead eager to try out resources and strategies that they did not have access to. In the midst of a pandemic school year, the second most-used library resource was the Brooklyn Connections educator (29 students), which reinforces for us the importance of humans—whether classroom teachers or visiting educators—as a resource for students who are learning online.

The greatest increase between resources used and resources students would want to use in the future falls under “Look for information in books.” While 23 students were able to use library books for their research, 83 expressed that they would like to on a future project. We also saw an impressive increase between the number of students who used a library computer (11) and the number who would like to use one for a future project (56). This reinforces a critical need for public access computers in public libraries; we suspect that, although most of our students received some kind of device (often only a Chromebook or tablet) from their school for remote learning, many recognized that they needed access to a more powerful computer or their own dedicated computer (many of our students shared a device with family members) to complete a research project.

At the conclusion of Module One, we asked students what they liked most about the Brooklyn Connections program (see Table 2). Again, we expect these scores to change in future years, as one of our favorite aspects of the program—“Visiting the Center for Brooklyn History archives”—was not possible in the 2020–2021 school year. However, we were gratified to see “Working with your Brooklyn Connections educator” receive the highest number of “really liked” ratings (67) as well as the highest mean score (4.14 out of 5). Combined with an expressed desire to use a Brooklyn Connections educator as a resource on future projects (see Figure 5) and the high score that “Working with a Brooklyn Connections educator” received on the question about how much students remember various aspects of the program (see Figures 2 and 3), we understand this data as affirming the value of designated archives educators who are available to build relationships with students while supporting them in the research process.

Through the redesign of our assessment tool, Module One provided us with clearer information on the affective impact of the program. While our previous assessment tool had given overall positive results and students reported that they enjoyed “getting to explore more about Brooklyn history,” the breakdown of affective impacts in our redesigned assessment tool helped us understand program factors that are especially important: providing educators who can build relationships and act as resources for students and classroom teachers, creating opportunities to research local history topics, and giving access to technology that students can use for research. Similar survey questions, adapted to include relevant program elements, could be designed for other assessment tools to provide similarly granular insight.

Module Two: Is the impact of Brooklyn Connections on identity formation possible to assess, and do students actually feel a closer connection to the place they live after studying local history?

Module Two provides us with insight into the impact of Brooklyn Connections on identity formation: whether students feel a closer connection to the place they live after studying local history. We already learned in Module One that studying a local history topic was memorable (see Figures 2 and 3). In Module Two, students indicated that studying topics from Brooklyn's past helped them recognize similarities in Brooklyn (a mean response of 71.13 on a scale of 100 for recognition), and feelings of affiliation with Brooklyn's past and identification with Brooklyn's present correlate strongly with this initial recognition. Moreover, recognition and affiliation contribute reliably to a sense of aspiration: students are interested in taking actions that will shape Brooklyn's future after learning about and connecting with Brooklyn's past. While our past experience in delivering the Brooklyn Connections program led us to theorize that study of local history leads to a sense of affiliation with place and an aspiration to take actions in the local community,43 Module Two provided us with data to confirm this theory.

Reflecting on the redesign of our survey to incorporate the program's impact on identity formation, we understand that Module Two, as currently designed, does not provide a clear indication of how much Brooklyn Connections changed students' sense of recognition, affiliation, identification, and aspiration. In future years, we will explore administering a version of Module Two alone as a brief, anonymous assessment at the start of our first session with students in our partnership program; these pretest scores could be compared to the posttest Module Two to present a clearer picture of the degree to which these feelings change after participation in the program. However, we were thrilled to confirm through our first implementation of this tool that it is possible to collect these data, and we are excited about how others might replicate and adapt this portion of our assessment tool to measure identity formation in related work, both in TPS and in other instruction areas.

Module Three: Are students learning research skills, and can they transfer these skills to contexts outside archival research on local history topics?

Module Three provides insight into whether students are learning research skills that they can transfer to contexts outside archival research on local history topics. We understand that many other factors may contribute to the results we see for this module: students may come to the program already possessing these skills or may learn these skills during the partnership program from their classroom teacher alongside the work of the Brooklyn Connections educator. However, we believe that the results in this section are not wholly disconnected from our own work. As noted, Module Three results across elementary and middle school students showed Brooklyn Connections students achieving higher percentage results than national or New York State benchmarks and high school students achieving at least on par with benchmarks (see Table 5). Given the variety of schools we work with—80% Title 1, with classroom types including general education, self-contained, integrated coteaching, and honors classes—we feel that, while other factors may be at play, these results cannot be attributed entirely to type of student, school, or school resource level.

Moreover, the results in this section are tremendously insightful for improving our instruction practice. We noted specific areas of weakness in each age level: elementary and middle school students struggled with question one, which asked them to identify a good research question. We have long been aware of the challenges students face when creating questions for research.44 These new assessment results demand that we do even further work to differentiate our instruction of this skill for elementary and middle school audiences. We can also use these results when planning out our partnership program with elementary and middle school teachers; we might suggest more strongly that they include at least one class session entirely devoted to asking questions for research. At the high school level, we need to reflect on and change our instruction practice for teaching the multipart process of doing research and for teaching differentiation between fact and opinion. While survey results show that our instruction may be appropriate for the skill level expected of elementary and middle school students in these areas, TRAILS questions for the high school learning level posed a greater challenge than our students were prepared to meet. Cumulatively, these results provide us with something we hoped our assessment redesign would accomplish: feedback that could help us adjust and improve our instruction practices.

While it is perhaps disheartening to conduct assessments that reveal areas of weakness, we see the implementation of Module Three as highly successful in that it provided us with honest results and pointed to areas where our skill-based instruction should be adjusted for some grade levels for stronger impact. Others could replicate our use of TRAILS questions in assessment by selecting questions from the freely available TRAILS question bank that specifically address the skill areas targeted by their work.45

Overall, our work on redesigning our assessment protocols for a TPS initiative has confirmed that improving our assessment tools can give us better frameworks for describing the impact of our work and can reveal ways to adjust our instructional practice for better results. We also answered our three research questions. We were able to gather information on student experience to understand the value of archives educators, the impact of project-based learning, and the value students see in different types of library resources, among other things. We were able to confirm that the Brooklyn Connections program, through a focus on local history research, helps students feel a closer connection to the place where they live and fosters a desire to take actions that will shape the future of their community. Finally, we were able to confirm that our students are learning research skills that they can transfer to other research contexts.

Data do not tell us everything about how we teach and how our students learn. The results of this new assessment tool reveal nuances of our program's impact that we could not see before and confirm some ideas we had theorized through close observation. This reaffirms that our observational assessment remains valuable and may, in the future, help us build new assessment tools to measure new impacts that we believe we see.

Going forward, we are eager to administer this survey in future years and compare the results against the baseline data from this first year. We are also excited to see how others use and build on the assessment tools we've developed, and we hope this can help us, as a TPS community, refine our teaching practices and collectively shape a clearer picture of the overall impacts of teaching with primary sources.

This project succeeded in confirming for us that redesigning our assessment protocols could provide insight for speaking about the outcomes of our work and for improving implementation of our program.

We understand that this assessment was redesigned in a specific context—TPS specifically with local history content—and our own assessment findings may not be replicable in every other instance of TPS, let alone other contexts of library instruction. However, we encourage readers to note which of our findings resonate with impacts they have observed in their own work and to borrow our assessment tools as appropriate in the process of redesigning their own program assessments. We encourage others to adapt our assessment tool to show how results differ in other types of situations: when working on a one-off instruction basis instead of a partnership program; when looking at topics aside from local history; when working with a variety of K–12 and higher education audiences.

The primary goal of this article, however, is not only to understand our unique program outcomes but more broadly to explore the potential of redesigning a program assessment tool. Our work confirms that clear reflection on existing tools, development of new goals, and design of new assessment strategies can result in inspiring new data on program impact and can highlight areas for improvement. This project provides us with a road map that we hope others might adapt for thoughtful reflection on their own library and archives program assessment, whether within the landscape of TPS or not:

  • Step One: Reflect on the findings of current assessment tools. Do they provide you with the information you want to know? Is there anything you'd like to know that your current assessment tools do not tell you?

  • Step Two: Set new goals. Think about what you'd like to learn from your assessment tools. Reframe these ideas as research questions that can guide your work.

  • Step Three: Design a new assessment tool. This might incorporate parts of your previous assessment tool, or you may decide to start from scratch. Consider the format of your assessment tool and what will be most appropriate for your audience. In our own context, we scrapped a previously implemented pretest because we realized that, within our context, it could not be implemented in a way that would provide the data we wanted from it.

  • Step Four: Implement your new assessment tool, and analyze the results.

  • Step Five: Reflect: Does your new assessment tool answer the new research questions you created? Is it appropriate for your audience? Is there anything you'd like to change?

Reflecting on Keith Curry Lance's comment that library assessment tools can become “exhausted” and fail to provide new insight, the redesign of our own assessment tool provided excitement and new energy not only for the Brooklyn Connections program but for the huge potential of TPS in local history contexts and elsewhere.46 We hope that others will find this assessment work equally inspiring for breathing new life into the ways we conceive of, implement, and talk about TPS.

Appendix A: Brooklyn Connections Exit Survey, 2008 to 2019

  1. What grade are you in?

  2. Do you have a library card? (yes /no)

  3. Rate how your opinion of history has changed after participating in the Brooklyn Connections program:

    • ○ I like history LESS

    • ○ I like history the SAME

    • ○ I like history MORE

  4. Did you visit any of the below libraries to work on your project outside of school?

    • A branch/neighborhood library

    • The Central Library

    • The Brooklyn Collection archives

  5. If you visited the library after school, what did you find most helpful?

    • Librarian

    • Brooklyn Connections Educator

    • Books

    • Computer

    • Other sources

    • I did not visit the library outside of school

  6. True or false?

  7. Rate how your opinion of the library has changed after participating in the Brooklyn Connections program:

    • ○ I like the library LESS

    • ○ I like the library the SAME

    • ○ I like the library MORE

  8. Rate how much you liked or disliked the following aspects of Brooklyn Connections:

  9. In one or two sentences, tell us what you enjoyed most about Brooklyn Connections.

    _______________________________________________________________

    _______________________________________________________________

    _______________________________________________________________

  10. Describe to someone whose class doesn't participate in Brooklyn Connections why they should be a part of the program.

    _______________________________________________________________

    _______________________________________________________________

    _______________________________________________________________

  11. What, if anything, would you change about Brooklyn Connections?

    _______________________________________________________________

    _______________________________________________________________

    _______________________________________________________________

  12. Would you describe your overall experience with Brooklyn Connections as positive or negative?

    _______________________________________________________________

    _______________________________________________________________

    _______________________________________________________________

Appendix B: New Brooklyn Connections Student Exit Survey

It has been a joy working with you this year! We want to hear how your experience was so we can make sure Brooklyn Connections is even better next year. Please fill out this survey and remember, your answers are anonymous so be as honest as you want. Thank you!

Part One

  • Who is your teacher? [drop-down list of options]

  • Rate how much you remember the following aspects of Brooklyn Connections:

  • Did you visit any of these places to work on your project outside of school (not with your whole class)? Select all that apply:

    • ○ The Central Library in Brooklyn

    • ○ A different library in Brooklyn

    • ○ A library not located in Brooklyn

    • ○ My school library

    • ○ The Center for Brooklyn History archives

    • ○ The Brooklyn Public Library website

    • ○ I did not visit any of these places

  • If you visited a library to work on your project, what did you do there? Select all that apply:

    • ○ Talked to a librarian

    • ○ Talked to a Brooklyn Connections educator

    • ○ Looked for information in books

    • ○ Used the library computer

    • ○ Looked at online resources

    • ○ Other (please explain): __________________________________

  • If you had to do another research project, what would you really want to do at the library? Choose up to 3:

    • ○ Talk to a librarian

    • ○ Talk to a Brooklyn Connections educator

    • ○ Look for information in books

    • ○ Use the library computer

    • ○ Look at online resources

    • ○ Other (please explain): __________________________________

  • Tell us about your favorite school activity. It could be anything: reading, group projects, eating lunch, or anything that you like most about your school day.

  • Think about your favorite school activity as something you REALLY like. Now, rate how much you liked or disliked the following aspects of Brooklyn Connections:

  • Describe to someone whose class doesn't participate in Brooklyn Connections why they should be a part of the program:

    _______________________________________________________________

    _______________________________________________________________

    _______________________________________________________________

Part Two

On this page, move the slider to the answer that reflects how much you agree with the following statements:

Part Three

Note: Questions in this section are adapted from the TRAILS Item Bank for grades 3, 6, and 9, https://trails-archive.org/archive.

What grade are you in? (Responses from this question determine whether respondents receive elementary [grades 4 and 5], middle [grades 6 through 8], or high school [grades 9 through 12] questions for part three.)
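(For anyone rebuilding this branching in a survey platform or script, a minimal sketch of the routing logic follows; the function name and error handling are our own illustrative assumptions, with the grade bands taken from the note above.)

    def question_set(grade: int) -> str:
        # Route a respondent to the Part Three question set for their grade band.
        if 4 <= grade <= 5:
            return "elementary"
        if 6 <= grade <= 8:
            return "middle"
        if 9 <= grade <= 12:
            return "high school"
        raise ValueError("grade outside the surveyed range (4-12)")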

Elementary School (correct responses notated as <*>)

  1. Your teacher asks you to research elected officials in your community. Which question would best help you find the facts you need to know to complete your research?

    • ○ Where is my state located?

    • ○ Who was elected in my community's last election? <*>

    • ○ What types of trees grow in my community?

    • ○ Who is the governor of my state?

  2. When writing a report, what is the best order of the steps you should take?

    • ○ Find sources of information, list questions you want to answer, take notes, write your paper, choose a topic

    • ○ Take notes, write your paper, list questions you want to answer, choose a topic, find sources of information

    • ○ Choose a topic, list questions you want to answer, find sources of information, take notes, write your paper <*>

  3. You decide to make a gift for your grandmother's birthday and want to include ideas from sources you found in her home, including a letter, diary, piece of clothing, and photograph. What kind of sources are these?

    • ○ Primary sources <*>

    • ○ Secondary sources

  4. You need to find facts about Mexico for a presentation. Which of the following statements is an opinion and should not be included in your list?

    • ○ Mexico City is the capital of Mexico.

    • ○ Mountains cover much of Mexico.

    • ○ The Pacific Ocean borders the west coast of Mexico.

    • ○ Mexican food tastes good. <*>

  5. Your sister shared some information with you that she found on a website about dogs. What is the best way to find out if what she's sharing is true?

    • ○ Ask one of your classmates what they know.

    • ○ Look at an .edu or .gov website about dogs. <*>

    • ○ Read a fictional book about a dog named Rex.

Middle School (correct responses notated as <*>)

  1. Imagine you're conducting research about westward expansion in the United States. Which question would be most helpful in finding relevant information?

    • ○ Who was the President?

    • ○ What states were included in westward expansion? <*>

    • ○ Why did people move west?

    • ○ When was the west founded?

  2. Your teacher wants you to write a report about a natural disaster. Choose the correct order of the steps from the choices below.

    • A) Make a list of questions about your topic that you would like to learn the answers to.

    • B) Choose a natural disaster topic.

    • C) Find information about your topic.

    • D) List what you already know about your topic.

    • ○ A, B, C, D

    • ○ B, D, A, C <*>

    • ○ D, A, C, B

    • ○ B, A, C, D

  3. Your teacher has asked you to find a primary source that tells you about a soldier's life during war. Which of the following examples is a primary source?

    • ○ Book about the war

    • ○ Biography about the soldier

    • ○ Encyclopedia article

    • ○ Soldier's diary <*>

  4. Read the text below. Which sentence demonstrates the author's opinion about the topic rather than fact?

    Many species have become endangered due to the destruction of the rainforests. People around the world have become more aware of how destructive deforestation is to the environment and wildlife. This is a cause that all people should be willing to give money towards to stop it from happening.

    • ○ People around the world have become more aware of rainforest destruction.

    • ○ Species are becoming endangered due to habitat destruction in the rainforest.

    • ○ People should be willing to give money to help save the rainforests. <*>

    • ○ Deforestation hurts the environment.

  5. You're writing a paper on the safety of tanning beds. Which website would have the most authority on the topic?

High School (correct responses notated as <*>)

  1. You have been asked to write a five-page research paper about hydraulic fracturing or “fracking.” Which of the following research questions will lead to the most relevant and focused information for your paper?

    • ○ What are the possible environmental dangers with fracking? <*>

    • ○ How much water is used in fracking to get the natural gas?

    • ○ What fracking cases have come before the Supreme Court?

    • ○ Who invented fracking?

  2. Your English class group is creating a pamphlet about drug abuse. This pamphlet will be distributed in your school and community. Select the correct order of steps your group needs to take in order to complete this project:

    • A) Select sources, evaluate, and record information.

    • B) Organize information and create a rough draft of your pamphlet.

    • C) Identify information needed and likely sources.

    • D) Get review comments and revise for final version.

    • E) Review the success of your research and final pamphlet.

    • F) Focus the topic for the intended audience.

    • ○ F, C, A, B, D, E <*>

    • ○ C, F, A, B, E, D

    • ○ F, B, A, C, E, D

    • ○ C, F, B, A, D, E

  3. You're writing a research paper on Albert Einstein that has to include reference to at least one primary source. Which of the following is a primary source?

    • ○ A biography of Albert Einstein

    • ○ The chapter about Albert Einstein in the book Great Physicists

    • ○ The World Book Encyclopedia entry on Albert Einstein

    • ○ A speech by Albert Einstein <*>

  4. The excerpt below comes from a travel industry magazine. Does it illustrate fact, opinion, or bias?

    Deputy Director Hill of Horseshoe Cruise Line stated the following about the mysterious illness aboard the 900-person cruise ship Royal Lady, “Including crew and guests, we believe that 766 passengers are presently ill. That is certainly not an epidemic. In fact, there is no reason to believe that this illness has anything to do with the food or facilities.”

    • ○ Fact

    • ○ Opinion

    • ○ Bias <*>

  5. Which of the following provides the best evidence that a website is an authoritative and trusted source?

    • ○ The website is endorsed by a well-known celebrity.

    • ○ The website is near the top of search results found in Google.

    • ○ Other reputable sources refer to the website in their work. <*>

    • ○ There are a number of spelling mistakes on the website.

1

“ORE Staff,” American Library Association, http://www.ala.org/aboutala/offices/ors/staff, captured at https://perma.cc/FE63-BGLF; “Office for Research and Evaluation,” American Library Association—eLearning, http://www.ala.org/educationcareers/elearning/unit/ore, captured at https://perma.cc/LH8A-7385.

2

Project Outcome: Insights for Adopting the Model, prepared by ORS Impact for Project Outcome and the Public Library Association (2018), 5.

3

Megan Oakleaf, “Dangers and Opportunities: A Conceptual Map of Information Literacy Assessment Tools,” portal: Libraries and the Academy 8, no. 3 (2008): 233–34, http://dx.doi.org/10.1353/pla.0.0011; Association of College and Research Libraries, Value of Academic Libraries: A Comprehensive Research Review and Report, researched by Megan Oakleaf (Chicago: Association of College and Research Libraries, 2010).

4

Daniel Callison, “Enough Already?: Blazing New Trails for School Library Research: An Interview with Keith Curry Lance, Director, Library Research Service, Colorado State Library & University of Denver,” School Library Media Research 8 (2005): 3, http://www.ala.org/aasl/sites/ala.org.aasl/files/content/aaslpubsandjournals/slr/vol8/SLMR_EnoughAlready_V8.pdf, captured at https://perma.cc/TB4K-6AVM.

5

Susan Searing, “Integrating Assessment into Recurring Information Literacy Instruction: A Case Study from LIS Education,” in The Teaching Library: Approaches to Assessing Information Literacy Instruction, ed. Scott Walter (Binghamton, NY: Haworth Press, 2007): 208–10; Susan Searing, “In It for the Long Haul: Lessons from a Decade of Assessment,” author's manuscript of article published in Journal of Library & Information Services in Distance Learning 7 (2013): 3, https://doi.org/10.1080/1533290X.2012.705684.

6

Callison, “Enough Already?,” 7–9.

7

Patricia Owen, “Using TRAILS to Assess Student Learning: A Step-by-Step Guide,” Library Media Connection 28, no. 6 (2010): 38, https://eric.ed.gov/?id=EJ887721; Barbara Schloman and Julie Gedeon, “Creating TRAILS: Tools for Real-Time Assessment of Information Literacy Skills,” Knowledge Quest 35, no. 5 (2007): 45, https://eric.ed.gov/?id=EJ826458.

8

Rebecca Scott, “Assessing the Impact of a Guided Inquiry Unit on Year 5 Pupils' Information Literacy: A Student Case Study,” Journal of Information Literacy 11, no. 1 (2017): 221, https://doi.org/10.11645/11.1.2211.

9

“History of Trails,” TRAILS archives (2019), https://trails-archive.org/history, captured at https://perma.cc/LP4C-8AB2.

10

Oakleaf, “Dangers and Opportunities,” 324.

11

Andrew Walsh, “Information Literacy Assessment: Where Do We Start?,” Journal of Librarianship and Information Science 41, no. 1 (2009): 23, https://doi.org/10.1177/0961000608099896.

12

Oakleaf, “Dangers and Opportunities,” 235; Walsh, “Information Literacy Assessment,” 19.

13

Walsh, “Information Literacy Assessment,” 22.

14

Schloman and Gedeon, “Creating TRAILS,” 45.

15

Barbara Rockenbach, “Archives, Undergraduates, and Inquiry-Based Learning: Case Studies from Yale University Library,” American Archivist 74, no. 1 (2011): 304, 308, https://doi.org/10.17723/aarc.74.1.mml4871x2365j265; Peter Carini, “Information Literacy for Archives and Special Collections: Defining Outcomes,” portal: Libraries and the Academy 16, no. 1 (2016): 192, https://doi.org/10.1353/pla.2016.0006; Sarah M. Horowitz, “Hands-On Learning in Special Collections: A Pilot Assessment Project,” Journal of Archival Organization 12, nos. 3–4 (2015), 216–17, https://doi.org/10.1080/15332748.2015.1118948; Anne Bahde and Heather Smedberg, “Measuring the Magic: Assessment in the Special Collections and Archives Classroom,” RBM: A Journal of Rare Books, Manuscripts, and Cultural Heritage 13, no. 2 (2012): 159–60, https://doi.org/10.5860/rbm.13.2.380; Lori Lynn Dekydtspotter and Cherry Dunham Williams, “Alchemy and Innovation: Cultivating an Appreciation for Primary Sources in Younger Students,” RBM: A Journal of Rare Books, Manuscripts, and Cultural Heritage 14, no. 2 (2013): 76, https://doi.org/10.5860/rbm.14.2.402.

16

Magia G. Krause, “Undergraduates in the Archives: Using an Assessment Rubric to Measure Learning,” American Archivist 73, no. 2 (2010): 512, https://doi.org/10.17723/aarc.73.2.72176h742v20l115; Horowitz, “Hands-On Learning in Special Collections,” 221; and Meggan Press and Meg Meiman, “Comparing the Impact of Physical and Digitized Primary Sources on Student Engagement,” portal: Libraries and the Academy 21, no. 1 (2020): 108, http://doi.org/10.1353/pla.2021.0007.

17

Press and Meiman, “Comparing the Impact of Physical and Digitized Primary Sources on Student Engagement,” 108.

18

Patricia Garcia, Joseph Lueck, and Elizabeth Yakel, “The Pedagogical Promise of Primary Sources: Research Trends, Persistent Gaps, and New Directions,” The Journal of Academic Librarianship 45, no. 2 (2019): 100, https://doi.org/10.1016/j.acalib.2019.01.004; Anne Bahde, Heather Smedberg, and Mattie Taormina, eds., Using Primary Sources: Hands-On Instructional Exercises (Santa Barbara, CA: Libraries Unlimited, 2014), 156–57.

19

Wendy Duff et al., “The Development, Testing, and Evaluation of the Archival Metrics Toolkits,” American Archivist 73, no. 2 (2010): 579, https://doi.org/10.17723/aarc.73.2.00101k28200838k4.

20

Bahde and Smedberg, “Measuring the Magic,” 157–58; Krause, “Undergraduates in the Archives,” 510–11.

21

Peter Carini, “Archivists as Educators: Integrating Primary Sources into the Curriculum,” Journal of Archival Organization 7, nos. 1–2 (2009): 45, https://eric.ed.gov/?id=EJ866946.

22

Elizabeth Yakel and Deborah Torres, “AI: Archival Intelligence and User Expertise,” American Archivist 66, no. 1 (2003): 51–78, https://doi.org/10.17723/aarc.66.1.q022h85pn51n5800.

23

Teaching with Primary Sources Collective, “Guidelines for Primary Source Literacy Rubric—TPS Collective: Resource Exchange (BETA),” https://rbms.info/tpscollective/guidelines-toolkit/guidelines-for-primary-source-literacy-rubric, captured at https://perma.cc/7PEZ-XLTM.

24

Garcia, Lueck, and Yakel, “The Pedagogical Promise of Primary Sources,” 98.

25

Jen Hoyer, “Out of the Archives and into the Streets: Teaching with Primary Sources to Cultivate Civic Engagement,” Journal of Contemporary Archival Studies 7, Article 9 (2020), https://elischolar.library.yale.edu/jcas/vol7/iss1/9; Christine Woyshner, “Inquiry Teaching with Primary Source Documents: An Iterative Approach,” Social Studies Research and Practice 5, no. 3 (2010): 42, https://doi.org/10.1108/ssrp-03-2010-b0003; Katie Atkins, Eleonora Gandolfi, and Matt Phillips, Ithaka S+R UX Project, Supporting Undergraduate Teaching with Primary Sources, University of Southampton (2020): 9, https://eprints.soton.ac.uk/444023.

26

Searing, “In It for the Long Haul,” 215.

27

Callison, “Enough Already?,” 7–8.

28

According to the United States Department of Education, Title 1 schools are defined as “schools with high numbers or high percentages of children from low-income families.” See U.S. Department of Education, https://www2.ed.gov/programs/titleiparta/index.html, captured at https://perma.cc/5TVG-BYZP.

29

A General Education Classroom is one where all students are expected to learn at the same level, without additional supports. A Self-Contained Classroom is a type of special education environment usually limited to twelve students, many of whom are assigned a teaching aide. An Integrated Co-Teaching Classroom brings together learners with specific learning needs and age-appropriate peers who are learning at a general education level. An Honors Classroom provides more challenging instructional material for students who need less support.

30

While a lengthy description of the program's implementation in various settings is outside the scope of this article, further description of this program's implementation with elementary, middle, and high school audiences can be found in Jen Hoyer, Kaitlin Holt, and Julia Pelaez, “Crafting a Research Question: Differentiated Teaching for Instruction with Primary Sources Across Diverse Learning Levels,” Case Studies on Teaching with Primary Sources, Society of American Archivists, Case 4, https://www2.archivists.org/sites/all/files/TWPSCase_4_Crafting_A_Research_Question.pdf, captured at https://perma.cc/6Y3M-2SVQ; and Hoyer, “Out of the Archives and into the Streets.” Additional teaching materials for the program, including descriptive lesson plans, are freely available at Bklyn Public Library, https://www.bklynlibrary.org/cbh/connections/resources.

31

A full copy of this Student Exit Survey is included in Appendix A.

32

John Voiklis, Elizabeth Attaway, Rebecca Joy Norlander, and Nezam Ardalan, Brooklyn Connections: Analysis of 2012–2013 to 2018–2019 Data Looking Back to Plan Ahead (New York: Knology, 2020), 5, https://www.bklynlibrary.org/sites/default/files/documents/brooklyn-collection/connections/Brooklyn%20Connections%20Looking%20Back%20to%20Plan%20Ahead.pdf, captured at https://perma.cc/V2JS-3GWA.

33

Voiklis, Attaway, Norlander, and Ardalan, Brooklyn Connections, 7. Note: the full details of this analysis are outside the scope of this article but can be read in Voiklis et al., Brooklyn Connections.

34

Voiklis, Attaway, Norlander, and Ardalan, Brooklyn Connections, i; Callison, “Enough Already?,” 7–8.

35

For a complete list of research questions developed through this process, see Rebecca Joy Norlander et al., Understanding Program Impact & Preparing for Remote Learning (New York: Knology, 2020), 1–2, https://www.bklynlibrary.org/sites/default/files/documents/brooklyn-collection/connections/Brooklyn%20Connections%20Understanding%20Program%20Impact.pdf, captured at https://perma.cc/E7EG-ZXHU.

36

A full copy of this new survey tool is available in Appendix B.

37

The full bank of TRAILS questions is available at TRAILS archives, https://trails-archive.org/archive.

38

This is likely because identification is redundant with recognition, as revealed by the correlation analysis.

39

In the context of linear regression, a predictor's effect is considered statistically significant when the covariation observed between the outcome variable and the predictor would be “surprising,” that is, unlikely to fall within the range of estimated covariation values one would expect to observe in 95% of samples drawn repeatedly from the same population.
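As a sketch in standard regression notation (our illustration, not language from the Knology report), the usual large-sample criterion is that the estimated slope's 95% confidence interval excludes zero:

    % Illustrative: large-sample 95% confidence interval for a slope estimate
    \[ \hat{\beta} \pm 1.96 \times \mathrm{SE}(\hat{\beta}) \]

When zero lies outside this interval, the predictor is conventionally judged statistically significant at the 5% level.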

40

Cohen (1988) provides the following benchmarks for linear regression effect sizes: η² ≅ 0.02 (i.e., explaining 2% of variance) is a small effect, η² ≅ 0.13 (13% of variance) is a medium effect, and η² ≅ 0.26 (26% of variance) is a large effect.
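For readers unfamiliar with this statistic, η² is the proportion of outcome variance that the model explains; in standard notation (our illustration, not Cohen's wording):

    % Illustrative: eta squared defined as explained variance
    \[ \eta^{2} = \frac{SS_{\mathrm{effect}}}{SS_{\mathrm{total}}} \]

For example, a model with SS_effect = 13 and SS_total = 100 yields η² = 0.13, a medium effect by these benchmarks.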

41

TRAILS Benchmark Data 2016–2017. Originally published by Kent State Libraries. Shared via personal email by Kenneth Burhanna, May 25, 2021.

42

In general, this estimates the range of values one might observe in 95% of samples that could be drawn repeatedly from the same population (i.e., a 95% confidence interval).

43

Hoyer, “Out of the Archives and into the Streets.”

44

Hoyer, Holt, and Pelaez, “Crafting a Research Question.”

45

The TRAILS question bank is available in its archives at https://trails-archive.org/archive.

46

Callison, “Enough Already?,” 7–8.