Impact Measures

CAEP Standard 4 is about program impact. We are asked to measure the following areas for our program completers:

Impact on P-12 Student Learning

  1. Our case study was implemented in the 2019 - 2020 academic year to measure impact on P-12 student learning and to observe four of our program completers in the classroom (three childhood completers and one adolescence completer). Attached are the results.

All case study participants completed the Impact on Student Learning assignment in Spring 2020 (March – May). All participants received a score of 2 (Acceptable) or 3 (Adept) for every criterion on the Impact on Student Learning rubric. The average rubric score across all participants was 2.67, with individual scores ranging from 2.50 to 3.00. Individual scores were higher for completers who had been working in the field longer.

Assessment Strategies for Impact on P-12 Learning:

  • All four teachers used multiple assessment strategies, both informal and formal.
  • All four teachers used at least some self-created assessments, although some relied more heavily on pre-made assessments than others.
  • Most teachers consistently used questioning as an assessment strategy, but only once (during the first observation) did all four teachers specifically cite questioning as an assessment strategy.
  • Summative assessments tended to be used only at the adolescent level; the elementary teachers consistently used a wider range of informal, formative assessment strategies.
  • Most of the teachers consistently stated that the data they gathered from their informal assessments would impact their future instruction.
  • Entrance/exit tickets were consistently used as an assessment strategy by multiple teachers.

Successes for Impact on P-12 Learning:

  • All four teachers consistently felt that their lessons were successful overall across the observations.
  • All four teachers consistently claimed that their students had learned what was intended, although each teacher used different evidence to support his/her claim depending on the grade level and/or content being taught.
  • The high school teacher consistently asserted that success in their lessons was due to students’ use of prior knowledge and learning.
  • The most experienced elementary teacher consistently used multiple forms of assessment as well as ecological considerations in determining whether their lesson had been successful; success, for them, came from a variety of factors.

Artifacts of Student Learning:

  • All participants used pre-assessments and data about their students, as well as their understanding of the curriculum being taught, to justify the learning goals of their lessons. All participants made modifications in their instructional planning based on students’ prior learning and understanding to ensure differentiation.
  • The three elementary teachers consistently cited observation of students as an informal assessment.
  • The use of questioning as an assessment method was reported as inconsistent across the three observations. In the first observation, all teachers did so; in the second observation, two teachers did so; in the third observation, no teachers directly reported using questioning as an assessment method.
  • The most experienced elementary teacher consistently cited the widest range of formal and informal assessment methods as artifacts of their students’ learning.
  • Two teachers consistently used student writing samples as a form of assessment.

Indicators of Teaching Effectiveness

  1. Employer surveys from 2015 and 2018.
  2. We are implementing a case study (see above) that uses classroom observations of our program completers teaching in P-12 classrooms.

Observations of Completers:

The total average score on the student teaching evaluation increased from 2.33 at the first observation in September to 2.49 at the second observation in December (an increase of 0.16 points), then decreased slightly to 2.44 at the third observation (not a significant difference from the second observation).

Rubric Item 6 (Plans for meaningful instruction for all students by drawing on curriculum knowledge of their discipline and related content areas, as well as on knowledge of students and the community) showed the greatest increase in scores over time, moving from one of the lowest-scoring items in the first observation (1.58) to one of the highest-scoring items in the second and third observations (2.75 and 2.50, respectively), an increase of roughly 1.0 point. Every individual’s score on this item also increased.

The highest-scoring rubric item overall is Item 17 (Demonstrates strong moral character and professionalism), with a score of 3.00 across all three observations for all case study participants.

Satisfaction of employers and employment milestones:

  1. Employer surveys from 2015 and 2018. The 2018 survey includes questions about promotion and retention.
  2. Summer 2018 employer focus group that measures employer satisfaction of our program completers. 

Satisfaction of completers

  1. Alumni surveys from 2015 and 2018. The 2018 survey also includes a question about starting salary (CAEP outcome measure 8).
  2. Summer 2018 alumni focus group that measures completer satisfaction. 

For the 2015 alumni survey, the results are attached in the following downloadable link: 2015 Alumni Survey Results

For the 2018 alumni survey, 83% of respondents (both undergraduate and graduate alumni) were in a position related to their field of study at the time they took the survey (one to three years after graduation). This aligns with CAEP outcome measure 7: ability of completers to be hired in education positions for which they have been prepared. Alumni were also asked about their current salary if employed in their field of study. The responses were spread fairly evenly across $5,000 increments ranging from less than $30,000 to more than $60,000. Respondents earning $35,000 or less made up 25% of the sample, and those earning more than $50,000 made up 26%. The largest share of respondents (49%) earned between $35,001 and $50,000. This aligns with CAEP outcome measure 8: other consumer information about starting salary of program completers. Finally, alumni rated questions about whether their School of Education preparation was relevant and effective in their current work. On average, they selected "completely prepared" or "mostly prepared" 85% of the time. The results are attached in the following downloadable link: 2018 Alumni Survey Results

As a follow-up to the 2018 alumni survey, respondents were invited to participate in one of two one-hour focus group meetings held on campus. Eleven alumni (undergraduate, graduate, or completers of both programs) took part in the focus group discussion. Focus group interview sessions were recorded and transcribed for analysis. Focus group questions were developed based on survey responses to provide more insight into completers' perceptions of weaker areas of preparation and trends observed. The analysis of the alumni focus group is attached in the following downloadable link: 2018 Alumni Focus Group Analysis

For the 2015 employer survey, results are attached in the following downloadable link: 2015 Employer Survey Results

For the 2018 employer survey, employers rated teachers who had completed a School of Education program within the last three years on whether the program preparation was relevant and effective (the same questions rated by the alumni). On average, they selected "completely prepared" or "mostly prepared" 95% of the time, and no question was rated "not at all prepared." When asked whether they believe their teachers are on an employment trajectory that would advance them to a position of leadership, 75% of employers responded yes. Finally, when asked how long they foresee their teachers remaining employed in their district, 50% of employers responded more than 10 years; the remaining 50% were fairly evenly split across responses ranging from less than 1 year to 6 - 10 years. The results are attached in the following downloadable link: 2018 Employer Survey Results

As a follow-up to the 2018 employer survey, administrators were invited to participate in a one-hour focus group meeting held on campus. Two building administrators (one superintendent and one principal) took part in the focus group. The focus group interview session was recorded and transcribed for analysis. Focus group questions were developed based on survey responses to provide more insight into the administrators' perceptions of completers' weaker areas and trends observed. The analysis of the employer focus group is attached in the following downloadable link: 2018 Employer Focus Group Analysis