Student Affairs Assessment Rubric Critical Thinking

In CARS, we think of the assessment process as an ongoing cycle, in which assessment of student learning outcomes is used to improve programming. Student learning outcomes are things we want students (or program participants) to know, think, or do upon completing the offered program. For more information on the assessment cycle, see the resources below. The following sections include resources for each step of the cycle.

Assessment Cycle

            Video: An Overview of the Assessment Cycle
            Slides: Workshop on the Assessment Cycle
            Webinar: One Size Does Not Fit All: Developing Custom Assessment Solutions

Step 1: Writing Learning Outcome Objectives

            Video: Writing Objectives
            Handout: Writing Clear Objectives, the ABCD Method
            Handout: Common Mistakes in Writing Objectives
            Handout: The Do’s and Don’ts of Objective Writing
            Handout: Checklist for Effective Objectives
            Slides: Objective Writing Workshop

Step 2: Mapping Objectives to Programming

            Video: Program Theory
            Video: Mapping Objectives to Program Components

Step 3: Selecting and/or Designing Instruments

            Video: Selecting/Designing Instruments
            Handout: Overview of Selecting/Designing Instruments 
            Handout: How to Find Pre-existing Instruments 
            Handout: Comprehensive Guide to Selecting and Designing Instruments 
            Slides: Overview of Writing Instrument Items 
            Slides: Item Writing Workshop 
            Video: Designing and Using Rubrics 

Step 4: Collecting Data on Learning Outcomes and Implementation Fidelity

            Video: Collecting Data on Learning Outcomes
            Video: Evaluating Implementation Fidelity 
            Video: Introduction to Implementation Fidelity 
            Slides: Implementation Fidelity Workshop with an Applied Example 
            Article: Measuring Implementation Fidelity (Gerstner & Finney, 2013)
            Article: Practical Approach to Assessing Implementation Fidelity (Swain, Finney, & Gerstner, 2013)
            Article: Importance of Implementation Fidelity (Fisher, Smith, Finney, & Pinder, 2014)   

Step 5: Analyzing Data

            Video: Analyzing Student Learning Outcomes Data 

Step 6: Using Assessment Results to Improve Programming and Learning Outcomes

            Video: Using Assessment Results          

JMU Rubrics

Please see our Examples of Learning Improvement for examples of how JMU programs have used assessment results to improve their courses, activities, and other programming components.

Academic Degree Programs

*For resources on the assessment process and each step of the assessment cycle click here.

*For an example of learning improvement from one of the Academic Degree Programs at James Madison University click here.

The Assessment Progress Template (APT)

The APT is to be submitted by the program with Department Head approval on or before June 1.

Program Assessment Coordinators are typically the lead authors of the APT and submit it for departmental review. To access the APT submission system, a program username and password must be entered; this information is provided in an email sent out on March 1.

Department Heads review and approve program APTs on the department site. This site requires a username and password, which are provided in an email sent out on March 1. Some department heads may also choose to co-author program APTs.

Example APTs

The purpose of the template is to provide the most current assessment-related information for each of JMU's academic programs. A separate template is completed for each academic and certificate program offered at JMU.

            Exemplar APT: College of Business  
            Exemplar APT: College of Science and Math  
            Exemplar APT: College of Health and Behavioral Studies 
            Exemplar APT: College of Visual and Performing Arts 
            Exemplar APT: College of Arts and Letters 


            APT Rubric
            General Information for Contents of Each Section of the APT 
            Including Tables and Graphs in the Report 
            Complete How-To for the APT 
            Update to Rubric Rationale 
            APT System Guide for Dept. Head 
            APT System Guide for Assessment Coordinators 

It is highly recommended that the APT be drafted in Microsoft Word before it is copied and pasted into the APT submission system as a final version (i.e., a version that department heads will review). Drafting the template in Word makes it easier to edit and to collaborate with others. Additionally, the Word document can serve as a backup copy.

Alternative Option Information

These example APT alternatives can help guide your program in designing an alternative-year project for learning improvement or for the assessment process, as well as in writing and submitting the one-page alternative. The APT alternative is to be submitted by the program with Department Head approval on or before June 1.

            Alternative Option Information Sheet
            80's Pop Culture Alternative Example - Assessment Process
            80's Pop Culture Alternative Example - Learning Improvement
            CBIS Alternative Example - Assessment Process
            CBIS Alternative Example - Learning Improvement

Overview of Meta-Assessment

Meta-assessment is the process of evaluating and providing diagnostic feedback on academic degree programs’ assessment plans. The following presentation, given at SACSCOC in December 2016, provides a brief overview of the meta-assessment process at JMU. Please contact Program Assessment Support Services with further questions. A shorter webinar of the presentation can be found here.

            SACSCOC Meta-Assessment JMU Workshop (ppt)

The APT Template

We understand that writing and organizing the APT can be difficult and that it may be unclear exactly what to include in the APT. PASS has developed a template for the APT that may be helpful in this regard. The APT Template has been designed to make your reporting as streamlined, efficient, and organized as possible. For every element of the APT rubric, there is a corresponding section in the APT Template. Tables are provided for many of the sections in the APT Template; you may use these as given or adapt them as you see fit. The APT Template is completely optional, but we encourage you to examine it and determine whether it would assist you in the reporting process. If you have any questions about the APT Template, please feel free to contact PASS.

APT Template

Student Affairs Resources

*For resources on the assessment process and each step of the assessment cycle click here.

*For an example of learning improvement in Student Affairs at James Madison University, see the Examples of Learning Improvement page.

Student Affairs Assessment Reports

Reporting Template: The LADOR

General Education

*For resources on the assessment process and each step of the assessment cycle, see the General Resources section above.

*For an example of learning improvement in General Education at James Madison University, see the Examples of Learning Improvement tab under General Education Assessment.

Examples of Cluster Reports

Cluster 1: Skills for the 21st Century:

Cluster 4: Social and Cultural Processes: Fall 14-Spring 15 Report
Cluster 5: Individuals in the Human Community: Fall 13-Spring 15 Report

Summer Assessment Fellows

During the month of June, the Center for Assessment & Research Studies, with the support of University Programs, hosts a summer fellowship program where faculty and administrators from across campus can come to learn about assessment in a focused environment. The program is led by Assessment Specialists and Graduate Student Consultants within CARS. The program typically ranges from 2-4 weeks and has evolved in structure over the years to best accommodate the training needs of the University. For more information, please click here.

Assessment Advisory Council

The current Assessment Advisory Council Charge is:

This body is charged with advising the Provost and Senior Vice President for Academic Affairs regarding procedures and practices of assessment across the campus. The Provost has charged every member with reporting back to their constituents on current JMU assessment practice and policy.

The AAC should periodically:

  • Review the procedures and processes pertaining to first-year and late sophomore/early junior assessment days; 
  • Review the procedures and processes pertaining to assessment in academic and degree programs and certificates; 
  • Review the procedures and processes pertaining to assessment in library and educational technologies;
  • Review the reporting of assessment data to students;
  • Review external reporting (e.g., SCHEV) of assessment data off-campus;
  • Suggest ways to make better use of assessment data in APR, accreditation, and C&I processes;
  • Review implementation of competency requirements for on-campus applications such as General Education and for off-campus applications such as community college articulation agreements;
  • Suggest ways to better display JMU program learning objectives, descriptions of assessment instruments, results, and reported uses of results;
  • Offer general recommendations regarding improvements to campus assessment practice;
  • Work closely with the Student Affairs & University Planning Assessment Advisory Council on matters pertaining to assessment in student affairs. 

Assessment Advisory Council 2016 – 2017

Chair: Dr. Herb Amato
Associate Vice Provost, University Programs

Council Members listing can be found here.

The VALUE rubrics were developed by teams of faculty experts representing colleges and universities across the United States through a process that examined many existing campus rubrics and related documents for each learning outcome and incorporated additional feedback from faculty. The rubrics articulate fundamental criteria for each learning outcome, with performance descriptors demonstrating progressively more sophisticated levels of attainment. The rubrics are intended for institutional-level use in evaluating and discussing student learning, not for grading. The core expectations articulated in all 16 of the VALUE rubrics can and should be translated into the language of individual campuses, disciplines, and even courses. The utility of the VALUE rubrics is to position learning at all undergraduate levels within a basic framework of expectations such that evidence of learning can be shared nationally through a common dialog and understanding of student success.

Preview the Critical Thinking VALUE Rubric:

Download the Critical Thinking VALUE Rubric at no cost via AAC&U's Shopping Cart (links below):


Critical thinking is a habit of mind characterized by the comprehensive exploration of issues, ideas, artifacts, and events before accepting or formulating an opinion or conclusion.

Framing Language

This rubric is designed to be transdisciplinary, reflecting the recognition that success in all disciplines requires habits of inquiry and analysis that share common attributes. Further, research suggests that successful critical thinkers from all disciplines increasingly need to be able to apply those habits in various and changing situations encountered in all walks of life.

This rubric is designed for use with many different types of assignments and the suggestions here are not an exhaustive list of possibilities. Critical thinking can be demonstrated in assignments that require students to complete analyses of text, data, or issues. Assignments that cut across presentation mode might be especially useful in some fields. If insight into the process components of critical thinking (e.g., how information sources were evaluated regardless of whether they were included in the product) is important, assignments focused on student reflection might be especially illuminating.


The definitions that follow were developed to clarify terms and concepts used in this rubric only.

  • Ambiguity: Information that may be interpreted in more than one way.
  • Assumptions: Ideas, conditions, or beliefs (often implicit or unstated) that are "taken for granted or accepted as true without proof."
  • Context: The historical, ethical, political, cultural, environmental, or circumstantial settings or conditions that influence and complicate the consideration of any issues, ideas, artifacts, and events.
  • Literal meaning: Interpretation of information exactly as stated. For example, "she was green with envy" would be interpreted to mean that her skin was green.
  • Metaphor: Information that is (intended to be) interpreted in a non-literal way. For example, "she was green with envy" is intended to convey an intensity of emotion, not a skin color.

Acceptable Use and Reprint Permissions

For information on how to reference and cite the VALUE rubrics, visit: How to Cite the VALUE Rubrics.

Individuals are welcome to reproduce the VALUE rubrics for use in the classroom, on educational web sites, and in campus intra-institutional publications. A permission fee will be assessed for requests to reprint the rubrics in course packets or in other copyrighted print or electronic publications intended for sale. Please see AAC&U's permission policies for more details and information about how to request permission.

VALUE rubrics can also be used in commercial databases, software, or assessment products, but prior permission from AAC&U is required. For all uses of rubrics for commercial purposes, each rubric must be maintained in its entirety and without changes.
