About Collaborative Assessment

Collaborative Assessment is the centerpiece of SLO assessment at Cerritos College. Assessment in general involves a four-step process. Collaborative Assessment asks that all full-time faculty and as many part-time faculty as possible work together to accomplish each step of the process. In this way, assessment becomes a collegial endeavor that shares best practices among faculty and promotes instructional consistency across sections.

Departments can best implement collaborative assessment by including SLOs in their monthly department meetings. In this way, departments involve more faculty members in the discussion and assessment of SLOs and are better able to document their participation in the process, since these discussions appear in the agenda and minutes of each meeting.

Following is an explanation of each step of the Collaborative Assessment process.

Step One: Creating/Reviewing SLOs

At the beginning of each academic year, department faculty work collaboratively to select the courses whose SLOs they wish to assess and review, write, or revise. For courses without SLOs or with only one or two SLOs, faculty write a core set of SLOs (approximately four to six for a three-unit course). For courses that already have four to six SLOs, faculty may wish to revise the wording of particular SLOs or rewrite them altogether, based on the previous academic year's assessment.

Also, when course outlines are due to be updated, department faculty can collaborate to align course outlines with course SLOs, determining whether the course objectives address the SLOs and vice versa. A similar alignment process can be applied to course syllabi: by doing so, faculty can determine how well each SLO is addressed in lectures, readings, and other assignments.

Step Two: Developing/Reviewing Assessment Tools

After deciding which SLOs to assess, department faculty develop or review the assessment tools for each SLO. In general, SLO assessment at Cerritos College takes five forms: essay, short answer, multiple choice, skills testing, and surveys. Faculty determine which assessment type works best for the SLO they will be assessing, and they either develop an assessment tool or ensure that an existing one measures the knowledge and/or skills articulated in the SLO.

Next, faculty define the criteria for the three assessment categories commonly used at Cerritos College: Good, Satisfactory, and Emergent. The "Good" and "Satisfactory" categories designate students who have, to differing degrees, demonstrated adequate mastery of the SLO. The "Emergent" category designates students who have not demonstrated adequate mastery of the SLO.

Faculty choosing an essay or short-answer assessment tool work together to create a rubric that identifies the criteria for each category. After developing the rubric, department faculty either develop a common assignment to be used by all instructors, or they each develop an assignment that corresponds to the commonly developed rubric.

For example, the Math Department, after developing a common rubric, had instructors include the same word problem on all final exams, whereas the Earth Science Department, after developing their common rubric, had instructors write their own questions for their final exams. Both methods have worked successfully.

While the short-answer and essay assessments rely on a common rubric to maintain consistency across different sections of the same course, the multiple-choice and skills-testing assessments must rely on a single set of questions or performances to maintain the same consistency. For this reason, faculty members spend much less time defining criteria and far more time developing common questions or performances.

For example, the Political Science Department faculty developed a set of multiple choice questions to assess how well students understand the three branches of the federal government. These questions were then distributed to each section of Political Science 101. Similarly, the Auto Body Department selected a set of skills that assessed the students' ability to prepare and paint a section of an automobile. Students in each section of the course performed these skills and the instructor assessed them accordingly.

While determining the questions or skills takes time, establishing criteria is much simpler. For these types of assessments, the criteria are defined by the percentage of questions answered or skills performed correctly. For example, many departments in the Humanities/Social Science Division use multiple-choice tests, and each one uses essentially the same criteria: "Good" is 85 percent or more correct; "Satisfactory" is 70 to 84 percent correct; and "Emergent" is below 70 percent correct. These thresholds are common across departments.
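The percentage-based criteria above amount to a simple threshold rule. A minimal sketch, using the 85/70 cutoffs from the Humanities/Social Science example (individual departments may of course choose different thresholds):

```python
def rate(percent_correct: float) -> str:
    """Map a percent-correct score to an assessment category,
    using the illustrative 85/70 thresholds."""
    if percent_correct >= 85:
        return "Good"
    if percent_correct >= 70:
        return "Satisfactory"
    return "Emergent"

# Hypothetical scores from three students
scores = [92, 78, 64]
print([rate(s) for s in scores])  # ['Good', 'Satisfactory', 'Emergent']
```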

Step Three: Assessing Student Work

Those departments using multiple-choice or skills testing assessments do not assess student work collaboratively. Faculty members collect the results for their respective sections and report them to their department heads. The department head then aggregates the results and reports to the department the totals for the course.
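The department head's aggregation step can be sketched as a simple tally across sections. The per-section counts below are hypothetical:

```python
from collections import Counter

# Each instructor reports per-section counts for the three categories;
# the department head totals them for the course.
section_results = [
    {"Good": 10, "Satisfactory": 12, "Emergent": 5},
    {"Good": 8, "Satisfactory": 15, "Emergent": 4},
    {"Good": 11, "Satisfactory": 9, "Emergent": 7},
]

course_totals = Counter()
for section in section_results:
    course_totals.update(section)

print(dict(course_totals))  # {'Good': 29, 'Satisfactory': 36, 'Emergent': 16}
```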

Those departments using short answer or essay questions do assess collaboratively. These departments collect a random sample of student work from each section and, using the rubric, assess the work as a group.

The first step in the process is to determine the sample size. The consultants for the ACCJC have recommended between 100 and 125 pieces of work. Next, the department divides the sample size by the number of sections; the result is the number of samples the department needs to collect from each section.
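The division above can be made concrete with hypothetical numbers: a 120-piece sample target (within the ACCJC consultants' recommended 100-125 range) spread across 30 sections of a course.

```python
import math

target_sample = 120  # hypothetical target within the 100-125 range
num_sections = 30    # hypothetical number of course sections

# Round up so the total collected never falls below the target.
per_section = math.ceil(target_sample / num_sections)
print(per_section)  # 4
```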

The second step in the process is to determine how to select the work randomly. The Director of Research for Cerritos College (Chris Meyers) recommended selecting students by a fixed interval. For instance, if each section needs to provide four examples of student work, then the instructor would provide work from the fourth, eighth, twelfth, and sixteenth students on the roster. If one of those students did not turn in the work, then the faculty member selects work from the next student on the roster.
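The fixed-interval selection, with its fallback rule for missing work, can be sketched as follows. The interval of four matches the four-samples-per-section example; the roster names are hypothetical:

```python
def select_by_interval(roster, submitted, interval, count):
    """Select `count` students at every `interval`-th roster position
    (1-based), moving to the next student on the roster when a selected
    student's work is missing."""
    chosen = []
    position = interval - 1  # 0-based index of the interval-th student
    while len(chosen) < count and position < len(roster):
        j = position
        # Walk forward to the next student who turned in work and has
        # not already been selected.
        while j < len(roster) and (roster[j] not in submitted or roster[j] in chosen):
            j += 1
        if j < len(roster):
            chosen.append(roster[j])
        position += interval
    return chosen

roster = [f"Student {n}" for n in range(1, 21)]
submitted = set(roster) - {"Student 8"}  # Student 8 did not turn in work
print(select_by_interval(roster, submitted, interval=4, count=4))
# ['Student 4', 'Student 9', 'Student 12', 'Student 16']
```

Note how the eighth student's missing work shifts that pick to the ninth, while the remaining picks stay on the fixed interval.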

The third step in the process is to prepare the student work for assessment. This requires that the student's and the instructor's names be blacked out, that each piece of work be assigned a number, and that the work be placed into sets of 10, 15, or 20, depending on the number of raters.

The fourth and final step in the process is for faculty to assess the student work. Faculty should set aside a couple of hours to assess the work. The assessment should begin with a norming session. Then, each faculty member should be given a set of student work, a copy of the rubric, and a rating sheet. The faculty member reads and rates the work in his or her set and reports the results to the department head. The department head totals the results and reports those numbers to the department.

Step Four: Analyzing Results/Creating Improvement Plan

Once the assessment is complete and the results aggregated and distributed, faculty members meet to discuss the student work and the results. In reflecting on the assessment and reviewing the results, faculty determine the knowledge or skills that students learned and those that students need to improve upon. Once faculty members determine the areas needing improvement, they discuss the possible causes for students' misunderstanding of the material.

Finally, faculty members discuss possible plans for helping students to improve their learning. In these discussions faculty consider the five areas that directly impact student success as determined by the Cerritos College Student Success Plan: Teaching Practices, Student Engagement, Academic Infrastructure, Program Improvement, and Academic Resources.

The Photography Department submitted a very good example of the analysis of results and development of an improvement plan. Based on the results of their assessment, the faculty members determined that students did not understand how to find "equivalent exposures using different shutter speeds and lens apertures."

Moreover, they decided that students did not understand this concept because they lacked an "understanding of basic mathematics." To correct this problem, the department decided to "seek out assistance from a mathematics instructor to help [photography faculty] expand our techniques for explaining the fundamentals of equivalent exposures." This improvement strategy falls under the area of "Teaching Practices" in the Cerritos College Student Success Plan.

By following this four-step collaborative assessment process, faculty will focus their energies on improving student success in their courses. No doubt, these conversations already occur informally across campus, but when they take place under the auspices of an assessment process, they become part of a consistent and coordinated effort to improve student learning.