These work better in some disciplines than others, but where they are appropriate, they can provide very clear data about student learning. Departments have taken a number of approaches to when the exams are administered, from pre- and post-tests in a single course (Geology, Physics), to using a test multiple times throughout the curriculum (Modern Languages and Literatures), to designing a test administered at the end of the entry-level course and again at the end of senior year (Economics). There is also variation in whether the exams are national instruments (Physics, Modern Languages and Literatures), designed by a course instructor, or designed by the department (Economics). The use of these exams has, again, demonstrated significant student learning, but it has also helped departments identify places where they would like to improve. Such exams can indicate particular concepts that need more emphasis in a course or departmental curriculum.
In addition to these exams that measure student learning in a specific area, we have used the Research Practices Inventory to assess departmental courses (Government, Global Studies, History, as well as FYS). Because we intend to administer this survey (which has both direct and indirect measures) regularly to both pre-FY students and students at the end of the FYS, we encourage departments to consider using it as a tool for assessing information literacy.
Evaluation of Student Work Using Rubrics
A number of departments (Philosophy, Global Studies, Sociology, English, Anthropology) are doing rubric assessment of student work, often, but not exclusively, senior-level work. These departments are doing holistic analysis to see whether their seniors have developed the overall skills and perspectives defined in departmental learning goals. Philosophy, for example, assesses work from seniors and evaluates a subset of departmental learning goals each year. The English Department has assessed both SYEs and Sr. Seminar work, and is considering assessing work in lower-level courses (using narrative methods rather than a scoring rubric).
This method has also been used by History in sophomore courses that are pivotal for the development of research and writing skills. Government has likewise developed a tool to do this assessment in GOVT 290; in this instance, it is focused on writing and research skills in the discipline. Religious Studies has developed a rubric that it will use at both the 100 and 400 levels.
In addition, Music has developed a series of questions for students to use in reflecting on pieces of music they have listened to. Faculty will then look at the ways in which students are able to discuss the music.
Oral presentations have been evaluated in order to assess student learning in relation to departmental content goals (Global Studies) or both content and communication goals (Chemistry, Math, CS, and Stats). This assessment has taken place in relation to senior presentations, and it has been done by faculty who are present for the presentations. We believe that there is potential for the assessment of oral presentations in classes, and we will solicit departmental representatives to work on developing techniques for course-embedded assessment of oral communication this summer.
During the Spring 2013 semester, faculty in Psychology, Economics, and FYS are piloting course-embedded assessment of student writing. In this model, the faculty agree on and norm a rubric; the faculty member teaching the class assesses student work using the rubric as part of the grading process; and an additional faculty member later uses the rubric to assess the same work. Our goal is to test the hypothesis that we can do meaningful assessment of particular skills as part of grading, and therefore generate assessment results in a timely and cost-effective way.