Date: 10/20/1999

To: Discipline Coordinators and Unit Directors

From: Vicky Demos, Chair of the Assessment of Student Learning Committee

Subject: Unit Assessment Plans

Cc: Unit faculty/staff

As you know, on October 27, 1998, Dr. Cecilia Lopez, Associate Director of the North Central Association of Colleges and Schools (NCA), approved our plan for the "Assessment of Student Learning".

With NCA approval accomplished, we are at the point of putting the assessment process "to work" for both our programs and students. We need to implement the assessment plans and to take action to enhance our students' learning.

Enclosed is your unit assessment plan, a set of guidelines, and an alternative assessment model. The Assessment of Student Learning Committee (ASLC) requests that each unit review its assessment plan and consider the following questions:

• Does the plan still seem appropriate?

• Do parts of the plan seem obscure or otherwise questionable?

• Do you need to obtain additional information to properly assess your students?

• How can your assessment plan be strengthened?

• What kind of assistance or information do you need to implement your plan?

• What actions have you already taken to implement the plan?

• What actions do you plan to take in the immediate future to implement the plan?

The ASLC can assist you in your work. The Unit Assessment Subcommittee of ASLC can meet with you and respond to questions you may have about the assessment process. It can also clarify the self-evaluation criteria and provide handouts on various aspects of the process. In addition, if you decide that the assessment model your unit developed is inappropriate, the committee can discuss with you the alternative model included in this mailing.

The unit assessment process should begin by February 11, 2000. The NCA accreditation team will be very interested in your work, and will be asking you about it. We look forward to hearing from you. Let us know how we can help you with the assessment process in order to improve student learning.

 

 

MODEL 1 (EXISTING MODEL)

DEFINITIONS OF LEARNING OBJECTIVES & EXPECTED OUTCOMES IN THE PLAN

A. Learning Objectives

Learning objectives will flow from the unit’s mission and goals and will be detailed enough to cover the different functions of the unit. Based upon the unit's goals, an individual instructor for a course, or the discipline faculty in the case of a major, will identify the specific learning objectives. They may be as specific as those for a particular course (for example, understanding a cost/benefit analysis) or as general as those for the major (for example, providing students with a basic understanding of the nature and functioning of the economic system).

B. Expected Outcomes

Units must next specify, based upon their learning objectives, a variety of expected outcomes, measurable in qualitative or quantitative terms. Depending upon the unit's goals, the expected outcomes may be stated as cognitive, behavioral, or attitudinal characteristics. The outcomes can be as specific as being able to solve differential equations, being able to integrate trigonometric functions, or being able to interpret the results of a factor analysis, or as broad as being able to explain how the development of mathematics has been part of the evolution of civilizations and is intimately interwoven with their cultural and scientific development. At this stage in the assessment cycle the expected outcomes represent predictions of how student learning will be demonstrated.

[Outcomes should be measurable, specific, and reflective of student learning]

Steps in Establishing Assessment of Student Learning in

Specific Discipline/Activity

1. List the learning objectives of the major -- be specific and clear.

A. Cognitive (e.g., Knowledge acquisition)

Example:

 

B. Behavioral (e.g., Skill acquisition)

Example:

 

C. Attitudinal or affective (e.g., Changes in values, beliefs, or sensitivity/tolerance)

Example:

 

2. Restate the objectives in terms of measurable student learning outcomes.

Examples:

 

3. Design appropriate measures

A. These must be based directly on the foregoing learning outcomes.

B. Resist the temptation to insert items that are not relevant to the outcomes. Use a different instrument if you want to evaluate the discipline/activity itself.

4. Collect data--be sure everyone involved understands how to use the measure.

 

5. Analyze and interpret the data--be sure to focus on what the data indicate about the outcomes themselves.

 

6. Revise courses, resources, or activities as needed to improve subsequent student performance on the outcomes.

 

7. Revise objectives, outcomes, or measures as needed to reflect changes in the major, or to remove items that turn out to be obscure or otherwise not useful.

MODEL 2 (ALTERNATIVE MODEL):

INQUIRY/HYPOTHESIS-BASED ASSESSMENT MODEL

 

Since UMM is a small college, faculty have an opportunity to observe students' learning processes more closely. By drawing on their experience and prior knowledge, units can pinpoint key areas and set up hypotheses about student learning to be tested. This model is more flexible and allows units to concentrate on the areas they believe are important and timely. Hypotheses may be constructed through a dynamic process and may address areas where the unit thinks student learning is deficient, problematic, or strong, or areas where no information (positive or negative) is available.

 

Steps in Establishing Assessment of Student Learning in

Specific Discipline/Activity by Using the Alternative Model

1. Set a hypothesis about student learning that you think is crucial and timely -- be specific and clear. The hypothesis could relate to any one of the following areas:

A. Cognitive (e.g., Knowledge acquisition)

Example:

 

B. Behavioral (e.g., Skill acquisition)

Example:

 

C. Attitudinal or affective (e.g., Changes in values, beliefs, or sensitivity/tolerance)

Example:

 

2. Select measurable student learning outcomes.

A. These must be based directly on the hypothesis stated in Step 1.

B. Resist the temptation to insert items that are not relevant to the outcomes. Use a different instrument if you want to evaluate the discipline/activity itself.

Examples:

 

3. Design the assessment instrument (e.g., test, survey, interview) and the data collection process (e.g., sampling, census).

 

4. Collect data and interpret the results.

  

5. Determine the actions that need to be taken, if any, to improve students’ learning, and take these actions (e.g., revise courses, resources, or activities as needed to improve subsequent student performance on the outcomes).

  

6. Formulate new hypotheses for the next cycle.

 

INQUIRY/HYPOTHESIS-BASED ASSESSMENT FORM

Unit:

Date Endorsed by the Unit:

Planning

Learning Hypothesis

 

 

 

 

Measurable Student Learning Outcomes

 

 

 

 

Assessment Instrument & Data Collection Process

 

 

 

 

Implementation

Data and Interpretation

 

 

 

 

Actions that Need to Be Taken to Improve Student Learning

 

 

 

 

New Learning Hypothesis for the Next Cycle

 

 

 

 

 

 

Guidelines for Unit Assessment Plans

These guidelines may be used by the units to self-evaluate their plans.

Unit Mission/Goal(s):

o YES o NO Plan includes a statement of unit mission/goal

o YES o NO Unit mission/goal relates to the institutional mission

Student Learning Objectives/Expected Outcomes:

o YES o NO Learning objectives/outcomes are stated in terms of important student achievements (e.g., knowledge, skills, behaviors, competencies, and attitudes)

o YES o NO Outcomes identified are relevant to the mission and goals

o YES o NO A reasonable number of outcomes (3-4) is selected

o YES o NO Outcomes include at least one cognitive (knowledge-based) or performance-based outcome

Assessment Methods & Tools:

o YES o NO Provides a detailed description of the assessment methods that will be used to measure each expected outcome

o YES o NO Defines the measure(s) and instruments that will be used for each expected outcome

o YES o NO Involves content across courses and/or disciplines

o YES o NO Considers validity of measures and instruments

o YES o NO Considers reliability of measures and instruments

Procedure:

o YES o NO Gives a detailed description of the procedure for measuring each expected outcome

o YES o NO Specifies an implementation timeline

o YES o NO Assigns responsibility for data collection and analysis

o YES o NO Assesses student learning across the discipline

o YES o NO Assesses student learning independently of course assessment/evaluation

o YES o NO Describes who will administer the assessment

o YES o NO Specifies who the assessors are

o YES o NO Specifies how public the assessment results are

Possible Use of Observed Outcome and Actions:

o YES o NO Describes how the results of the assessment will be communicated to faculty

o YES o NO Identifies mechanisms and processes for using results to improve student learning and programs

o YES o NO Has feedback loops to related university processes (e.g., planning (academic and nonacademic), curriculum review)

o YES o NO Describes how the results of the assessment could change the unit mission/goal(s)

Overall:

o YES o NO Evidence of faculty involvement (planning, implementation, and evaluation stages)

o YES o NO Evidence of student involvement (planning, implementation, and evaluation stages)

o YES o NO Plan will provide information that can be used to improve teaching and learning processes and curricula

o YES o NO Assessment plan elicits performance with sufficient data to provide diagnostic, structured feedback to students on their strengths and weaknesses

o YES o NO Plan considers effectiveness over time

o YES o NO Considers effectiveness of important academic processes (e.g., teaching, learning, and advising)

Notes:

To produce a "better" assessment procedure

• Do not rely heavily or exclusively on assessment at the level of the individual classroom or course

• Do not rely on course completion

• Inclusion of a direct measure of student learning is necessary; indirect measures may also be included

Here are some examples of direct and indirect measures of learning:

A. Some Direct Measures of Learning

• Ratings of students' knowledge, skills, or values by faculty, site supervisors, external evaluators, etc.

• Student self-ratings on attitudes and values

• Note that students DO NOT rate themselves on knowledge or skill. Thus, surveys are NOT valid as direct measures of knowledge or skill.

B. Some Indirect Measures of Learning

• Surveys and questionnaires

• Job placement or job satisfaction

• Testimonials, anecdotal reports, or other self-reports

• Include an analysis of the results and of changes recommended and implemented as a consequence of the analysis of assessment data.

ASSESSMENT OF STUDENT LEARNING FORM*

Student's Name:

 

Assessment**    Assessor    Assessment Instrument    Comments

0    1    2    3    4    5

 

 

 

Learning Objective 1:

 

 

 

Expected Outcome 1

 

 

 

 

 

 

o Discipline Faculty

o External Evaluator

o Peer Evaluator(s)

o A Team of Faculty

Date:

Signature:

Expected Outcome 2

 

 

 

 

o Discipline Faculty

o External Evaluator

o Peer Evaluator(s)

o A Team of Faculty

Date:

Signature:

Expected Outcome 3

 

 

 

 

 

 

o Discipline Faculty

o External Evaluator

o Peer Evaluator(s)

o A Team of Faculty

Date:

Signature:

Date:

Signature: