Assessment

Minnesota Campus Compact is committed to deepening the evaluation and assessment of the outcomes of civic engagement for students, institutions, and communities. This section provides resources related to our collaborative work on assessing the effectiveness of civic engagement.

Featured Resources

Upcoming civic learning assessment rubrics. Created by the Massachusetts Department of Higher Education, these two rubrics measure civic values and civic knowledge and are under consideration by the AAC&U for inclusion in its suite of assessment tools. The civic knowledge rubric will be available for testing on February 29 and can be found at the link above.

The Global Learning Assessment Opportunity: the Global Engagement Survey (GES). The GES is seeking institutions to participate in the survey, which assesses global learning outcomes by examining institutions and their summer and spring programming. Apply to participate by March 15, 2016.

MNCC Assessment Publications

From Activities to Benefits – This report reflects on the 2009 Civic Engagement Forums that Minnesota Campus Compact convened with the Minnesota Evaluation Association. We aimed to bring out multiple perspectives on which civic engagement outcomes matter, to identify models and resources, and to generate new ideas for implementing and learning from evaluation.

Outcomes – Minnesota Campus Compact’s periodic assessment brief – Issue 1, Fall 2010 | Issue 2, Fall 2011

Guiding Principles for Evaluation

The Joint Committee on Standards for Educational Evaluation recommends four basic attributes (and numerous sub-standards) for any program evaluation:

  1. Utility, including identification of stakeholders, credibility of evaluators, pertinence of information, and clarity and timeliness of reporting.
  2. Feasibility, including practicality of procedures, political viability, and cost-effectiveness.
  3. Propriety, including service to participants, community, and society; respect for the rights of those involved; and provisions for complete and fair assessment.
  4. Accuracy, including program documentation, use of valid and reliable procedures, appropriate analysis, impartial reporting, and justified conclusions.

Catherine A. Palomba and Trudy W. Banta, Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education (San Francisco: Jossey-Bass, 1999).

Eight additional principles were developed in relation to student success but are relevant to evaluation in general:

  1. Context is everything.
  2. The whole is greater than the sum of the parts.
  3. Evidence is essential: the more, the better.
  4. Test prevailing assumptions.
  5. Cast a wide net.
  6. Use outsiders to ask hard questions.
  7. Focus on what matters to student success.
  8. Stay the course.

George D. Kuh et al., Assessing Conditions to Enhance Educational Effectiveness: The Inventory for Student Engagement and Success (San Francisco: Jossey-Bass, 2005), pp. 10-16.

Finally, a common acronym to help people remember what makes a good evaluation objective is SMART:

  • Specific (does the objective link a behavior to an anticipated numeric or descriptive result?)
  • Measurable (is there a reliable system in place to demonstrate achievement of the objective?)
  • Achievable (can the objective realistically be accomplished?)
  • Relevant (are the people named in the objective in a position to affect the situation?)
  • Time-based (is there a clearly defined start and end for the objective?)
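
For example, a SMART objective might read: by May 2017, 60 percent of first-year students will have completed at least one service-learning course, as documented in the registrar’s enrollment records. (The figures here are purely illustrative.)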

A Glossary of Terms

Basics

Evaluation: the systematic process of determining the merit, value, and worth of someone or something

Assessment: the process of measuring, quantifying, and/or describing results related to the attributes covered by the evaluation

Formative Evaluation: intended to help “form” (develop, change, or improve) a program; conducted before or during implementation so that feedback can be acted upon

Summative Evaluation: intended to judge a program’s effectiveness and demonstrate accountability; conducted after the program has ended or has operated for a significant period of time so that actual outputs and outcomes can be compared to the original objectives; may also inform the program if it will continue or be offered again

Results

Goal: intended results in general terms (e.g., an institutional culture that is supportive of civic engagement)

Objective: intended results in precise terms, identifying specific policies, behaviors, etc. (e.g., the institution’s strategic plan includes civic engagement as an explicit priority)

Measure/Indicator: a specific standard or criterion by which something can be judged (e.g., students graduate with a stronger commitment to ongoing civic engagement)

Benchmark: a piece of evidence to which later results/performance can be compared (e.g., the number of service-learning courses offered in a particular year)

Output: activities done to achieve intended results (e.g., a two-day orientation and ongoing mentoring were offered to first-generation college students)

Outcome: meaningful change, the intended results of activities/programs/partnerships (e.g., retention rates among first-generation college students increased)

Techniques

Tool/Instrument: a device used to collect data, information, or evidence (e.g., tests, questionnaires, interview schedules, checklists, rating scales, observation records)

Audit/Inventory: documentation of what is already going on and where there might be additional interest (e.g., a survey of campus faculty and staff about their civic engagement activities)

Quantitative Methods: techniques that generate numerical scores, ratings, or findings (e.g., survey questions answered yes/no or on a scale)

Qualitative Methods: techniques that yield descriptive information, stories, and narrative analysis (e.g., open-ended survey questions, journals, ethnographic field studies, participant observation)

Direct Measures: techniques in which knowledge, skills, or other desired outcomes are demonstrated in response to the instrument or are evident in their own right (e.g., tests in which students show what they have learned, official tenure and promotion policy guidelines that explicitly recognize engaged scholarship)

Indirect Measures: techniques in which people reflect on or report outcomes rather than demonstrating them (e.g., surveys asking students to report what they have learned, interviews asking faculty how they perceive engaged scholarship to be recognized in tenure and promotion)

Levels of Analysis

Micro: changes in individuals or their circumstances
Macro: changes in broader structures that impact people’s lives
Meta: evaluation of the evaluation

For more on the language of research, see this useful article at Journalist’s Resource: “Statistical terms used in research studies; a primer for media.”

Select Evaluation Resources

Service-Learning/Civic Engagement Related Tools

Minnesota Resources

Professional Organizations/Associations

Survey Resources

Free Online Survey Tools

  • Google Docs Forms (http://www.google.com/google-d-s/forms/) – Free and easy to use; view and download results in a spreadsheet (a short analysis sketch follows this list).
  • LimeSurvey (http://www.limesurvey.org/) – A robust open-source survey program for advanced users that includes survey logic and analysis features; it requires some technical knowledge, but a growing support community is available to assist you.
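
Either tool can export responses as a spreadsheet or CSV file, which makes simple quantitative summaries easy to script. Below is a minimal sketch in Python, assuming a hypothetical export named survey_results.csv with a yes/no column called “participated” and a 1–5 scale column called “satisfaction” (the file and column names are illustrative, not part of either tool):

    import csv
    from collections import Counter

    # Load the exported survey responses (hypothetical file name).
    with open("survey_results.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Count answers to a yes/no question (a simple quantitative summary).
    participation = Counter(row["participated"].strip().lower() for row in rows)
    print("Participation:", dict(participation))

    # Average a 1-5 scale question, skipping blank responses.
    ratings = [int(row["satisfaction"]) for row in rows if row["satisfaction"].strip()]
    if ratings:
        print(f"Mean satisfaction: {sum(ratings) / len(ratings):.2f} (n={len(ratings)})")

Open-ended responses in the same export yield qualitative data and are better read and coded by hand than scripted.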

Several other popular sites offer limited free basic services along with fee-based advanced features.

Focus Group Resources

Interview Resources

Other Resources and Articles
