MCCESC Teaching & Learning

August: Assessment Literacy

Welcome Back!

Some of us are in denial that the summer is nearly over, while others of us are eager to get back into the swing of a new school year. No matter which end of the spectrum you find yourself on, the beginning of the school year is near.


Nearly all of Ohio's school districts have shifted to the Ohio Teacher Evaluation System (OTES) 2.0, and with it comes that ever-daunting phrase "HQSD," or "High-Quality Student Data."

We promise, it is not as scary as it sounds. When we stop to consider how many instructional decisions we make throughout a school year, think of this as simply using strong (or high-quality) data to make those decisions. The stronger the instrument used to gather the data, the stronger the data; therefore, it is important to focus on creating high-quality assessments.

Because of a Teacher (A Tribute to All of Those Making a Difference)

Validity: Does it measure what it intends to measure?

Before you create an assessment, you must ensure that you thoroughly understand what the content standards expect your students to know and be able to do.


This requires unpacking, unwrapping, and deconstructing content standards to establish the skills and content required and the level of rigor expected (through either Depth of Knowledge or Cognitive Demands).


Consider viewing this video on Unpacking Content Standards for additional assistance.


The Ohio Department of Education suggests the following as ways to improve validity:

  • Eliminate assessment items that contain unrelated content.
  • Ensure representative distribution of assessment items.
  • Ensure item alignment to both the content and skill levels.

Reliability: Does it provide trustworthy results?

A reliable assessment will provide similar scores for similar students when the same or similar assessment is given at different times.


How does this differ from validity? Think of a scale. If you step on the same scale every day for a week and get the same reading each time - 67, 67, 67, 67, 67, 67, 67 - the scale is reliable: it is providing consistent results. However, if you purchased the scale to measure your weight but it is actually reporting your height, the scale is not valid: it is not measuring what it is intended to measure.
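The scale analogy can be expressed as a small check: repeated readings that agree with each other indicate reliability, while readings that track the attribute we intended to measure indicate validity. This is just an illustrative sketch; the numbers are the hypothetical values from the analogy, not real assessment data.

```python
# Illustrative sketch of the scale analogy (hypothetical numbers).
from statistics import mean, pstdev

true_weight = 82   # the attribute we intend to measure (kg) - assumed value
# Seven daily readings from the scale in the analogy:
readings = [67, 67, 67, 67, 67, 67, 67]

# Reliability: do repeated measurements agree with each other?
is_reliable = pstdev(readings) == 0   # zero spread = perfectly consistent

# Validity: do the readings reflect the attribute we meant to measure?
is_valid = abs(mean(readings) - true_weight) < 1   # readings track height, not weight

print(is_reliable, is_valid)   # prints: True False - reliable, but not valid
```

The point of the sketch is that the two properties are independent: an instrument can pass the consistency check while still measuring the wrong thing.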


The Ohio Department of Education suggests the following as ways to improve reliability:


  • Allow enough time to complete the assessment.
  • Include enough items to accurately measure the content and skill indicated, including items of various complexity.
  • Avoid ambiguous test questions.
  • Provide clear directions.
  • Develop a systematic administration procedure.
  • Ensure consistent use of rubrics.

Bias: Does it offend or unfairly penalize?

When assessment items are biased, they can skew the results/data, leading to misinterpretation and misinformed instructional decisions.


The most common sources of assessment bias are racial/ethnic bias, gender bias, and socio-economic bias. A more in-depth study of these assessment biases can be read in Assessment Bias: How to Banish It.


What does a biased assessment question look like?


An item from an IQ test: Rifle is to Hunter as Saw is to ________


This item is biased because many students may live in geographical areas where hunting is not common. The correct answer (carpenter) may also be unfamiliar to students in certain areas or those who have never seen someone use tools. Additionally, students may have negative feelings toward some of these words.


In an effort to eliminate bias, consider the following questions, provided by Kansas State University, when creating or reviewing assessments. Are there any test items that:

  • Contain language that is not commonly used or has different connotations in different parts of the state or country, or in different cultural or gender groups?
  • Portray anyone in a stereotypical manner?
  • Contain any demeaning or offensive materials?
  • Have any religious references?
  • Have references that mean different things to different cultures?
  • Assume that all students come from the same socioeconomic or family background?
  • Contain information or ideas that are unique to the culture of one group AND this information or idea is not part of the content standards?
  • Measure membership in a group more than measure a content objective?
  • Put up barriers preventing any group of students from demonstrating their knowledge and abilities?
  • Portray a group unfavorably or in a stereotypical manner?
  • Contain language or symbolism that can be interpreted in an offensive or emotionally charged way to a person or group?

Means of Assessment

Assessments can be much more than just a paper-pencil test, even when it comes to being considered for High-Quality Student Data (HQSD).


Selected Response Assessments include:

  • Multiple Choice
  • Matching
  • True/False


Constructed Response Assessments include:

  • Short Answer
  • Extended Response
  • Essay


Performance Assessments include:

  • Product
  • Visual
  • Verbal
  • Physical


The Ohio Department of Education explains that performance assessments:

  • Have multiple criteria that are being assessed
  • Are reserved for high-level cognitive skills
  • Allow for multiple approaches
  • Are multi-stepped
  • Allow for reflection and revision
  • Are accompanied by a high-quality rubric

Data Use in the OTES 2.0 Rubric

Within the new OTES 2.0 Rubric, there are two components that specifically list the use of High Quality Student Data (HQSD):

(1) Organizational Area: Instructional Planning

Domain: Focus for Learning

Component: Use of High-Quality Student Data

Skilled description: The teacher thoroughly and correctly analyzes patterns in at least two sources of high-quality student data to develop measurable and developmentally appropriate student growth goal(s) and monitors student progress toward goal(s).


(2) Organizational Area: Instruction & Assessment

Domain: Assessment of Student Learning

Component: Evidence of Student Learning*

Skilled description: The teacher uses at least two sources of high-quality student data to demonstrate growth and/or achievement over time, showing clear evidence of expected growth and/or achievement for most students.


HOWEVER, teachers are constantly using data to make instructional decisions throughout their planning. This data use is addressed in the following components:

  • Connections to Prior & Future Learning
  • Planning Instruction for the Whole Child
  • Communication with Students
  • Monitoring Student Understanding
  • Student-Centered Learning
  • Use of Assessments

*This component may not be evaluated this school year.

WE ARE HERE TO HELP

If you are interested in learning more, please reach out; we can schedule opportunities within districts, online, or in person at our agency.


Reach out - we are here to help. tandlsupport@mccesc.org

Madison-Champaign ESC

We Work to Serve!

Department of Teaching & Learning

Check out our Instagram: @tandlmccesc


Feel free to use our hashtags:

#MCCESCTeachingandLearning #M_C_ESC

ESC Connection

If you have a chance, and have not yet read our quarterly ESC Newsletter, the ESC CONNECTION, please feel free to peruse at your leisure. Lots of great things happening at the Madison-Champaign ESC.