Assessment Terminology

Assessment Terms

  • Approaches are the procedures used to gather the information needed to assess how well students have met the learning objectives. They are the courses of action through which evidence about courses, programs, majors, and the like will be gathered. To provide quality information, multiple approaches should be used.
  • Assessment refers to a continuous process instituted to understand and improve student learning. While academic units may find alternative pathways to this goal, the process needs to begin with the articulation of educational goals for all programs and courses. These goals should be expressed as measurable objectives, followed by the selection of reliable and valid methods and measures. After collecting, interpreting, and sharing findings, the aim is to use these learning outcomes to better understand how and what students learn, how well students are meeting expected objectives, and how to develop strategies to improve the teaching and learning processes.
  • Benchmark is the actual measurement of group performance against an established standard of performance, often one set externally.
  • Criterion is the standard of performance established as the passing score for a test or other measure. Performance is compared to an expected level of mastery in an area rather than to other students’ scores.
  • Cross-sectional studies provide information about a group of students at one point in time.
  • Evaluate and Evaluation indicate the interpretation of findings and are often used as synonyms for assess and assessment. Many, however, distinguish the two: assessment is a process predicated on knowledge of intended goals or objectives, while evaluation is a process concerned with outcomes without prior concern for, or knowledge of, goals.
  • Goals are statements about the general academic aims or ideals to which an educational unit aspires. Goal statements allow us to share with others our hopes regarding the learning achievements of our students. Further, goals at the unit level should align with the mission of the university. As stated, goal statements are not amenable to measurement.
  • Longitudinal studies provide information from the same group of students at several different points in time.
  • Measures are the specific instruments or performances used to provide data about learning. They are the tools that provide information about the level of achieved results or outcomes. To avoid systematic bias in findings, multiple measures are required. There are two types of measures: a baseline measure reflects where a department is currently performing, and a target measure reflects where the department wishes to perform.
  • Methods - see approaches.
  • Objectives are the redefinition of learning goals in a way that permits their measurement. Objectives express the intended results or outcomes of student learning and clearly specify the criteria by which student knowledge, performance, or values will be evaluated.
  • Process is a method generally involving steps or operations that are ordered and/or interdependent.
  • Qualitative and Quantitative Research describe two research methods, both valuable for assessing student learning outcomes. In a practical and somewhat philosophical sense, the difference is that quantitative research uses objective measures to test hypotheses and to allow for controlling and predicting learning, while qualitative research relies on more subjective observations of learning.
  • Reliability is the extent to which studies or findings can be replicated.
  • Sampling consists of obtaining information from a portion of a larger group or population. When a sample is chosen at random, its findings are more likely to be representative of the larger group.
  • Validity refers to the degree to which a study accurately reflects or assesses the specific concept that the researcher is attempting to measure. Validity has three components:
    • relevance - the option measures your educational objective as directly as possible
    • accuracy - the option measures your educational objective as precisely as possible
    • utility - the option provides formative and summative results with clear implications for educational program evaluation and improvement

TracDat Terms

  • Assessment Plan: Used at the department level to record Objectives, relate the Objectives to Goals at various organizational levels, document assessment measures, and relate department courses (Activities in a non-academic department) to the Objectives.
  • Assessment Plan Review Process: Used to document how and how often the assessment plan will be reviewed.
  • Assessment Process: Schedule of assessment tasks and activities.
  • Assessment Method: Used to document how attainment of a department Objective will be measured.
  • Assessment Method Criterion: Standard of achievement for an assessment method.
  • External Unit: Organizations outside of the institution to which assessment data needs to be related (Higher Learning Commission, NCATE, AACSB, and other accrediting bodies).
  • Feedback Loop (How will you use this data?): How the results of the assessment efforts will be used to impact the department.
  • Goal Type: Allows institutions to classify goals into categories (Academic, Strategic, Financial, etc.).
  • Goal: A component of the organization’s or department’s mission statement.
  • Measurement Link: Used to store any document or URL that relates to a specific Assessment Method.
  • Mission Statement: A clear statement of an organizational unit’s intended accomplishments.
  • Objective: An intended student learning outcome stated in measurable terms.
  • Objective Name: Brief name of an Objective.
  • Observation: A conclusion or hypothesis derived from the analysis of assessment data. Observations can be based on formal, informal, quantitative or qualitative data samples.
  • Observation Data Source: The source of the data that was used to make an observation.
  • Observation Name: Brief name given to an Observation.
  • Observation Level: Whether the observation relates to a department or a course.
  • Observation Type: Used to classify an Observation as a problem/limitation or distinction/strength.
  • Remedy: An intervention proposed to address the noted problem/limitation.
  • Related Data: Provides a means of retaining and managing data that supports an Observation.
  • Schedule: When and how often each measure will be taken.