Date of Award


Level of Access Assigned by Author

Campus-Only Thesis

Degree Name

Master of Science (MS)

Advisor

Sara Lindsay

Second Committee Member

Molly Schauffler

Third Committee Member

Natasha M. Speer


Program-level assessment in undergraduate education provides information about student learning that is not easily accessible from in-class metrics; for example, it can reveal where in a program understanding is first gained and then mastered, and the degree to which knowledge is retained over time. This project encompasses the development of an assessment tool for measuring student content knowledge specific to the curriculum of the four-year undergraduate program in the School of Marine Sciences at the University of Maine, and an evaluation of the results from a pilot deployment.

Creation of the content-knowledge assessment tool involved gathering information about the implemented curriculum through faculty interviews and identifying, from a free-response assignment, specific concepts about which students might hold misconceptions. These initial data-gathering steps informed the development of a set of multiple-choice and short-answer questions aligned with the curriculum. Faculty feedback on the clarity and scientific accuracy of the drafted questions, along with opinions from a pilot student group, helped refine the questions further. The finalized assessment survey was deployed to the entire undergraduate population in the School of Marine Sciences in the fall of 2010 and the spring of 2011. Student response data from the full deployment were used to evaluate the performance of individual questions and the reliability of the assessment overall.

Results indicate that students’ overall content knowledge of the assessed topic, marine primary production, increased by 45% between the first and last years of the program, consistent with content-knowledge gains reported for other life-science programs in the literature. Achievement was highly consistent across the subtopics identified from the faculty interviews about the implemented curriculum, regardless of the amount of instruction students received on each. In addition, student achievement on the assessment increased with year level for questions with lower cognitive demands (remembering and understanding), but not for questions with higher demands (applying, analyzing, and evaluating).

Another facet of this study was an examination of how students’ self-reported confidence in their question responses and familiarity with the question content related to each other and to assessment performance. We found that, in general, students were over-confident in their self-assessments, but that students who selected correct answers were more confident than students who did not, suggesting a general ability among students to recognize whether they had selected the correct answer. However, this self-assessment ability decreased for students at the end of their undergraduate careers. In contrast, self-reported familiarity with the assessment content was a poor predictor of achievement. Evaluating the two measures together, we found that familiarity and confidence were correlated, suggesting that, for these students, familiarity breeds confidence.

This program-level assessment is a first step within the School of Marine Sciences toward evaluating student understanding throughout its curriculum. Future assessments can be modeled after this project and can help generate robust quantitative and qualitative information for continued program improvement, as well as for reporting to accrediting bodies.