
Accreditation

Department of Education

CAEP Annual 2024

The College of Education, Nursing, and Health Professions represents the Educator Preparation Provider (EPP) at the University of Hartford. In Spring 2023, the Council for the Accreditation of Educator Preparation (CAEP) granted the Department of Education at the University of Hartford accreditation at both the initial-licensure and advanced levels for a seven-year period ending in 2030. The CAEP Accreditation Letter and the CAEP Accreditation Action Report are both available for review online.

The EPP consists of six undergraduate initial-level programs, three graduate initial-level programs, and one advanced-level program. These programs are offered by departments in the College of Education, Nursing, and Health Professions; the College of Arts and Sciences; and the Hartt School.

Program Accreditation Status

| Program Name | Degree Level | Licensure Level | Method of Delivery | Current Accreditation |
| --- | --- | --- | --- | --- |
| Early Childhood | Baccalaureate | Initial | In Person | CAEP Accreditation, effective Spring 2023 – Spring 2030 |
| Elementary | Baccalaureate | Initial | In Person | CAEP Accreditation, effective Spring 2023 – Spring 2030 |
| Integrated Elem/Special Education | Baccalaureate | Initial | In Person | CAEP Accreditation, effective Spring 2023 – Spring 2030 |
| Secondary Education English | Baccalaureate | Initial | In Person | CAEP Accreditation, effective Spring 2023 – Spring 2030; NCTE (since 2022) |
| Secondary Education Math | Baccalaureate | Initial | In Person | CAEP Accreditation, effective Spring 2023 – Spring 2030; NCTM (since 2022) |
| Music Education | Baccalaureate | Initial | In Person | CAEP Accreditation, effective Spring 2023 – Spring 2030; NASM (since 2017) |
| Early Childhood | Masters | Initial | Distance Learning | CAEP Accreditation, effective Spring 2023 – Spring 2030 |
| Elementary | Masters | Initial | Distance Learning | CAEP Accreditation, effective Spring 2023 – Spring 2030 |
| Special Education | Masters | Initial | Distance Learning | CAEP Accreditation, effective Spring 2023 – Spring 2030 |
| School Psychology | Masters | Advanced | Distance Learning | CAEP Accreditation, effective Spring 2023 – Spring 2030; NASP (since 2020) |

Accredited Programs

The following graduate advanced programs have earned their accreditation within their specialty areas, outside CAEP:

  • S. in Music Education – Accredited by the National Association of Schools of Music (NASM) in 2017

Programs Nationally Recognized by SPAs

The following programs achieved National Recognition by their Specialized Professional Associations:

  • S. Secondary English (7-12) – Full recognition by the National Council of Teachers of English (NCTE) in 2022
  • S. Secondary Mathematics (7-12) – Full recognition by the National Council of Teachers of Mathematics (NCTM) in 2022
  • Masters in School Psychology – Full recognition by National Association of School Psychologists (NASP) in 2020

CAEP Annual Reporting Measures

The EPP uses various tools to measure completer effectiveness, including candidate impact on P-12 learning and employer satisfaction. The EPP currently uses a combination of data collection methods, including surveys, focus groups, and data advisory board meetings, to inform our understanding of completer effectiveness. EPP completers and their employers were surveyed as described below, and completers and employers will participate in focus groups to clarify their ideas.

  1. Completer Effectiveness Survey

The EPP collaborates with the University's Office of Development and Alumni Affairs to obtain contact information and send an EPP-designed survey to candidates who have graduated from our programs and have been employed for at least a year. The survey collects data on completer perceptions of the knowledge, skills, and dispositions they obtained as candidates and how those skills directly impact their ability to instruct P-12 students. In Spring 2024, the survey was emailed directly to 103 EPP completers who had graduated in 2020 or more recently. Some of the email addresses were not valid, so 86 emails were successfully delivered via the online survey platform SurveyMonkey. Additionally, direct invitations were offered via a QR code posted on social media. Seven completers responded to the survey. The table below provides data for key questions from this survey that offer insight into candidates' perceptions of their effectiveness.

Completer Effectiveness Self-Report (N = 7)

| Item | Strongly Disagree | Disagree | Undecided | Agree | Strongly Agree |
| --- | --- | --- | --- | --- | --- |
| I feel prepared to design and implement developmentally appropriate and challenging learning experiences. | 12.5% | 0% | 12.5% | 50% | 25% |
| I feel prepared to create inclusive learning environments. | 0% | 0% | 0% | 50% | 50% |
| I feel prepared to create learning experiences that assure mastery of the content. | 0% | 0% | 25% | 50% | 25% |
| I am able to connect concepts and use differing perspectives to engage learners. | 0% | 0% | 25% | 37.5% | 37.5% |
| I feel prepared to use multiple methods of assessment to monitor and evaluate progress. | 0% | 25% | 0% | 50% | 25% |
| I consistently evaluate and reflect upon my practice and adapt my practice to improve instruction and meet the needs of each learner. | 0% | 0% | 0% | 62.5% | 37.5% |
| I feel prepared to collaborate effectively with colleagues and school professionals to ensure learner growth and advance the profession. | 0% | 0% | 0% | 75% | 25% |
| I feel prepared to communicate with colleagues and school professionals to ensure learner growth and advance the profession. | 12.5% | 0% | 0% | 75% | 12.5% |
| I engage in ongoing professional development opportunities to improve student learning. | 12.5% | 0% | 0% | 62.5% | 25% |
| I feel prepared to utilize technology, as needed, to improve instruction and advance student learning. | 0% | 14.29% | 28.57% | 28.57% | 28.57% |

Looking across these data, we see that completers felt overwhelmingly prepared to (1) create inclusive learning environments, (2) evaluate and reflect upon their own practice to improve it and meet the needs of each learner, and (3) communicate with colleagues and school professionals to ensure learner growth and advance the profession. At the same time, these data indicate that EPP completers could use additional support in utilizing technology to improve instruction and advance student learning. Data from EPP surveys are shared with faculty and used to make continuous program improvements.

To further validate completer impact on P-12 student learning and development, the EPP will invite completers finishing their first or second year of teaching to join a focus group discussion during June 2024.

 

  2. School University Partnership Advisory Board (SUPAB) meetings

The EPP hosts regular meetings with its partners to discuss and receive direct feedback from administrators on completers' effectiveness with P-12 students. This advisory group also addresses current early-career professional development needs that might be addressed in the program or in partnership with P-12 districts. The leadership team documents the suggestions and recommendations to improve the program and strengthen partnerships. The consensus is that our programs offer strong, integrated preparation experiences and that completers are career ready because of the content and experiences provided in the teacher preparation program. As we begin a new accreditation cycle, we are redefining the work of this advisory board, including its name, by building on the success of the SUPAB structure while making changes that reflect post-2020 educational realities and needs, including unifying the professions that serve P-12 students, families, and communities.

 

  3. Connecticut State Department of Education (CSDE) Dashboard

The data dashboard is a state-sponsored database maintained by the CSDE to provide EPPs with data about their completers. Data include information about the certification and employment status of completers. One benefit of the Data Dashboard is that it provides information on completers' demographics, current place of work, diversity, and more. In combination with survey data and other measures, these data help us look for trends across groups of completers.

 

  4. Work in Progress

Our program will be updating our surveys to reflect more delineated data and the shifts in the educational landscape since COVID in 2020.

We are seeking to partner with districts to share their CT Smarter Balanced Assessment results (grades 3 through 8) so we can better assess the impact of our candidates on their students' learning.

We are implementing focus groups of completers finishing their first or second year of teaching; this new measure will begin in June 2024.

We will request that candidates in their second and third years of teaching share their TEAM professional growth work with us. This is particularly pertinent because the new State of Connecticut Educator Evaluation takes effect July 1, 2024: https://portal.ct.gov/-/media/sde/evaluation-and-support/ctleadereducatorevalsupportplan2024.pdf

 

Our stakeholders are integral to our data-driven decision-making process, providing invaluable feedback that drives continuous improvements to our programs. Through this collaborative approach, we ensure that our teacher candidates are prepared to become successful educators, ultimately benefiting P-12 learners. To measure the effectiveness of our completers, our Educator Preparation Program (EPP) utilizes the following two measures to actively engage our partners:

  • Employer Satisfaction Survey
  • Individual Employer Outreach Discussions

The EPP assesses the effectiveness of completers in enhancing diverse P-12 student learning growth through the Employer Satisfaction Survey and Individual Employer Outreach. The survey employs a Likert scale spanning from 1 (strongly agree) to 6 (not yet observed), while the Individual Employer Outreach facilitates the gathering of qualitative feedback from employers.

1. Employer Satisfaction Survey

The EPP sent a 44-question employer satisfaction survey to principals/administrators of schools that employed our EPP completers. The purpose of the survey is to determine the impact our completers have on P-12 student learning and to gauge overall employer satisfaction. In Spring 2024, surveys were sent to 103 administrators; 14 responded. The table below provides data for key questions on the survey.

Employer Satisfaction Survey (N = 14)

| Item | Strongly Disagree | Disagree | Undecided | Agree | Strongly Agree | Not Yet Observed |
| --- | --- | --- | --- | --- | --- | --- |
| The teacher implements developmentally appropriate instruction that accounts for learners' strengths, interests, and needs. | 0% | 0% | 14.29% | 35.71% | 50% | 0% |
| The teacher demonstrates thorough knowledge that learners are individuals with differences in their backgrounds as well as their approaches to learning and performance. | 0% | 0% | 7.14% | 42.86% | 50% | 0% |
| The teacher exhibits respect and high expectations for each learner; communicates with diverse learners in a fair and respectful manner; and consistently provides equitable opportunities to meet the diverse needs of learners. | 0% | 0% | 15.38% | 23.08% | 61.54% | 0% |
| The teacher collaborates with learners to facilitate self-reflection and ownership for ongoing improvement of the classroom community. | 0% | 7.14% | 14.29% | 21.43% | 57.14% | 0% |
| The teacher develops a highly engaging learning environment, taking into account student differences and learning needs. | 0% | 0% | 14.29% | 21.43% | 64.29% | 0% |
| The teacher clearly communicates expectations for appropriate student behavior. | 0% | 0% | 7.14% | 35.71% | 57.14% | 0% |
| The teacher monitors student behavior and responds appropriately on a consistent basis. | 0% | 0% | 0% | 50% | 50% | 0% |
| The teacher guides learners in using technologies in appropriate, safe, and effective ways. | 0% | 0% | 7.14% | 50% | 35.71% | 7.14% |
| The teacher displays mastery of content knowledge and learning progressions that allow flexible adjustments to address needs of all learners. | 0% | 0% | 7.14% | 42.86% | 50% | 0% |
| The teacher creates an interactive environment where learners take the initiative to master content and engage in meaningful learning experiences to master the content. | 0% | 14.29% | 14.29% | 28.57% | 42.86% | 0% |
| The teacher designs and facilitates challenging learning experiences related to the students' real-life experiences and relevant core content. | 0% | 0% | 14.29% | 28.57% | 57.14% | 0% |
| The teacher designs activities for learners to engage with subject matter from a variety of perspectives and develops interdisciplinary connections. | 0% | 7.14% | 7.14% | 50% | 35.71% | 0% |
| The teacher uses relevant content to engage learners in innovative thinking & collaborative problem solving. | 0% | 7.14% | 7.14% | 42.86% | 42.86% | 0% |
| The teacher designs and modifies multiple formative and summative assessments that align with learning targets. | 0% | 0% | 7.14% | 57.14% | 28.57% | 7.14% |
| The teacher provides effective feedback to learners that aids in the improvement of the quality of their work. | 0% | 0% | 15.38% | 38.46% | 46.15% | 0% |
| The teacher uses appropriate data sources to identify student learning needs. | 0% | 0% | 7.14% | 50% | 42.86% | 0% |
| The teacher engages students in self-assessment strategies. | 0% | 7.14% | 14.29% | 35.71% | 35.71% | 7.14% |
| The teacher connects lesson goals with school curriculum and state standards. | 0% | 0% | 7.14% | 64.29% | 28.57% | 0% |
| The teacher uses assessment data to inform planning for instruction. | 0% | 0% | 7.14% | 57.14% | 35.71% | 0% |
| The teacher proactively addresses student learning needs through ongoing collaboration with other teachers and/or specialists. | 0% | 0% | 7.14% | 50% | 42.86% | 0% |
| The teacher integrates a variety of instructional approaches to engage students. | 0% | 0% | 14.29% | 35.71% | 50% | 0% |
| The teacher uses technology appropriately to engage learners and enhance instruction. | 0% | 0% | 7.14% | 57.14% | 35.71% | 0% |
| The teacher differentiates instruction in the areas of content, process, product, or learning environment to meet the needs of all students. | 0% | 0% | 14.29% | 41.86% | 42.86% | 0% |
| The teacher articulates thoughts and ideas effectively using oral, written and nonverbal communication skills in a variety of forms and contexts to inform, instruct, and motivate during instruction. | 0% | 0% | 14.29% |  |  |  |

The response rate for the survey was 13.6%, and the results indicate that employers agreed that our completers can differentiate instruction in the areas of content, process, product, or learning environment to meet the needs of all students. Completers can transfer theory to practice by assessing and expanding students' prior knowledge. Additionally, our completers are tech savvy and can create technology-based interactive lessons. We find this point useful for our triangulation of data because completer self-reports indicated less confidence around using technology; one possible interpretation is that completers held higher standards for integrating technology than administrators expected. More importantly, completers can design learning experiences that integrate culturally relevant content to build on learners' cultural backgrounds and experiences. Completers were also seen as able to exhibit respect and high expectations for each student and to communicate with diverse learners in a fair and respectful manner. In addition, employers noted that our completers consistently provide equitable opportunities to meet the diverse needs of P-12 students. Based on these data, the EPP believes completers are successful at contributing to diverse P-12 student learning growth. Moving forward, in Spring 2024 we will deploy a revised pilot survey so that we can refine it for use in the 2024 – 2025 academic year. The survey will be sent each Spring to all employers listed on the CT State Department of Education Data Dashboard.

  2. Individual Employer Outreach Discussions

Following the employer satisfaction survey (reported in Measure 2), representatives from the EPP engaged in discussions with two principals to gain insight into their experiences and perspectives regarding four completers. The principals were specifically asked about the completers' contributions to student learning and their capacity to handle responsibilities independently, without necessitating further training. Overall, the feedback from the principals was positive, indicating satisfaction with the competencies and abilities demonstrated by the completers in their respective classrooms. However, one notable concern highlighted by the principals was their desire for the completers to receive better training in the ongoing monitoring of student progress. This feedback underscores the importance of continuous improvement within our program, and we will take proactive steps to address the identified areas for enhancement, particularly by providing comprehensive training on monitoring student progress.

  3. School University Partnership Advisory Board (SUPAB) meetings

The EPP hosts regular meetings with its partners to discuss and receive direct feedback on programmatic effectiveness in preparing candidates to work with P-12 students. Administrators, coaches, specialists, state agency leads, partner instructors, classroom teachers, and alumni bring key perspectives on the current early-career preparation continuum and professional development needs, which inform our continuous program improvement through content, assignment, assessment, and curriculum adjustments.

Instruments and Responses
The EPP uses both proprietary assessments (certification tests, edTPA, Title II) and EPP-designed assessments (lesson plan assessment, student teaching evaluation, portfolio, and the Candidate Effect on Student Learning/Teacher Inquiry Project (CESL/TIP)) to assess the learner and learning. The data provide information to identify whether candidates can apply their knowledge of the learner and learning at various progression levels. To keep abreast of national and state standards, workforce changes, and feedback from key stakeholders (candidates, clinical partners, faculty), the EPP closely reviewed its existing curricula across programs.

The EPP ensures that instruments and methods are designed to elicit responses specific to the criteria in Standard 1 (learner and learning, content, instructional practice, professional responsibility, and technology) through regular data review meetings and ongoing adjustments. The purpose of the department and data review meetings that include partners is to bring together key stakeholders to review data and identify program strengths and weaknesses so we can better prepare our teacher candidates. Data review meetings are hosted with faculty, clinical partners, and other stakeholders, and we also set aside time during monthly department meetings to discuss items that need additional or ongoing attention.

In addition, School University Partnership Advisory Board (SUPAB) meetings are held regularly, allowing us to check in with our clinical partners and address issues as they arise. At SUPAB meetings we discuss both the data and how we can improve the curriculum. We also discuss partner needs and how we as an EPP can provide support. For example, during the pandemic we organized remote tutoring sessions hosted by our candidates for a local middle school because its students needed additional support with schoolwork. In the last academic year, we worked on bridging the gap between pre-service preparation and in-service practice through the full employment of the Common Core of Teaching (CCT) Rubric for Effective Teaching in our student teaching experiences.

Examining data about completers, the EPP finds that, by and large, it is currently meeting teacher preparation program and state requirements for licensure. However, we recognize there is room for improvement and that we need to make the necessary improvements across the programs, including in our collaborative efforts with partners. Moving forward, we will use assessments with established validity to measure candidate performance: for example, the edTPA (or an equivalent) as a key assessment, along with the Common Core of Teaching (CCT) Rubric to evaluate candidates' teaching practice during student teaching. These changes also better prepare our candidates for first-year expectations, because their employers use the same CCT Rubric to observe their teaching.

Evidence of Stakeholder Involvement in Program Design, Evaluation and Data Driven Decision Making for Continuous Improvement

Our stakeholders are an integral part of our EPP and collaborate with us regularly on several levels, from the design of programs/coursework, assignments, and rubrics to data collection, evaluation, and data-based decisions for continuous program improvement. The following outlines institutional participation of stakeholders in evaluating program and completer quality and effectiveness.

Program Level – The EPP is in continuous communication with school partners about what our teacher preparation programs can do to support their P-12 needs. For example, key assessments and rubrics are co-constructed during our regular SUPAB meetings to ensure our assessments are meaningful and relevant for producing successful teachers. Recent collaborative efforts have included piloting a dual enrollment program with a local school district to encourage high school students to enroll in teacher preparation programs, based on requests made by stakeholders; creating pathways that allow partners' non-certified staff members to seek certification; and developing early intervention Part C pathways for early intervention employment opportunities. The use of next-generation clinical options and residency experiences is furthering our partnership work.

Course/Assignment Level – At the course level, we collaborate with our partners to design meaningful and relevant assignments and involve them in professional development events. For example, field-based course assignments have been aligned so that expectations are clear to candidates, instructors, partners, and cooperating practitioners. Feedback received at a SUPAB meeting from partners who hired our candidates resulted in a change to include data collection and progress monitoring skills in our curriculum. Our adjuncts and clinical educators include teachers and administrators who provide direct, immediate, and consistent feedback that enables us to make continuous program improvements.

Evaluation and Data Review Meetings – Stakeholders are involved throughout the program in reviewing candidates' performance data from fieldwork and student teaching placements. SUPAB and faculty data review meetings (to discuss, analyze, and disseminate results) are held at mid-semester and end of semester to make data-informed changes to our curriculum. Data for key assessments are collected on our assessment platform, Watermark Student Learning and Licensure. The process is overseen by the Programmatic Leads in collaboration with the Manager of Assessment, Accreditation, and Certification, who is responsible for ensuring all stakeholders have access to the data. Instructors teaching courses with key assessments also play an important role in data review: at the end of each semester, these instructors share summaries and interpretations of their key assessment data with the department. At the data review meetings, faculty and clinical partners review the Student Learning and Licensure data and the one-page instructor data summaries. Subsequently, subsets of faculty and staff create professional growth plans for candidates who need additional support.


Clinical Data – Stakeholders are involved in evaluating clinical experiences (fieldwork in the first, sophomore, and junior years) and practicum/student teaching (senior year, and the last semester for graduate students). Clinical faculty (university supervisors and cooperating teachers) and the teacher candidate (self-evaluation) review formal data for final clinical experiences at the data review meetings described above. Stakeholders collect data during the mid- and final-semester student teaching evaluations, enabling us to triangulate data on half-day and full-day culminating clinical experiences at the undergraduate and graduate levels. We use discussions from our data review meetings to continuously improve the programs. Moving forward, we would like to collect data from fieldwork teachers; this change will allow us to gather feedback from clinical partners, strengthen our programs, and engage in continuous program improvement.

Stakeholder Involvement in Completer Data Decision Making
Our stakeholders are actively involved in our data-based decision-making process and provide meaningful feedback that allows us to engage in continuous program improvement and better prepare our teacher candidates to be successful teachers with a positive impact on P-12 learners. Our confidence in completer effectiveness rests on collecting meaningful and relevant feedback from all stakeholders. These measures are as follows:

  • Completer Effectiveness Survey
  • Employer Satisfaction Survey
  • Individual Employer Outreach Discussions
  • School University Partnership Advisory Board (SUPAB) meetings, held at least three times per year

Advanced Program

Satisfaction of Employers
The School Psychology Program utilizes a survey to measure the satisfaction of employers of graduates. The rubric used within this measure ranged from 1 (strongly agree) to 5 (strongly disagree). The average ratings obtained indicated overall strong agreement from employers regarding the performance of graduates from the School Psychology program. The average rating on items ranged from 1.00 to 1.22.

School Psychology Program Employer Satisfaction Survey Results, 2022-23 (N = 9)

Scale: 1 = Strongly Agree ….. 5 = Strongly Disagree

| Graduates Excel in… | Average (2022-23) |
| --- | --- |
| Applying school psychology-specific knowledge and skill effectively in their work as a school psychologist. | 1.00 |
| Presenting and applying research to assist in meeting school/district needs. | 1.00 |
| Participating in and leading collaborative activities involving other school/district personnel. | 1.22 |
| Communicating with other professionals and those they serve. | 1.00 |
| Practicing according to professional school psychology standards and codes of ethics. | 1.00 |
| Understanding and respecting diversity in their work as a school psychologist. | 1.00 |

| Overall Satisfaction | Average |
| --- | --- |
| How well the University of Hartford school psychology graduate program prepared school psychologists for employment in your agency. | 1.00 |

The results indicated overall positive responses with no significant weaknesses identified. The School Psychology program continuously reviews survey results and outcome data in order to ensure that graduates are receiving exemplary training in school psychology.

Satisfaction of Completers
The School Psychology Program utilizes a survey to measure the satisfaction of program completers. The rubric used within this measure ranged from 1 (strongly agree) to 5 (strongly disagree). The average ratings obtained indicated overall strong agreement from program completers regarding their satisfaction with the skills acquired throughout their training as well as their sense of being supported throughout their time in the program. The average rating on all items was 1.00.

School Psychology Program Completer Satisfaction Survey Results, 2022-23 (N = 9)

Scale: 1 = Strongly Agree ….. 5 = Strongly Disagree

| Graduates Excel in… | Average (2022-23) |
| --- | --- |
| Apply school psychology-specific knowledge and skill effectively in my work as a school psychologist. | 1.00 |
| Apply research to my work as a school psychologist. | 1.00 |
| Participate in and/or lead collaborative activities involving other school/district personnel. | 1.00 |
| Provide school psychology services that meet professional standards and codes of ethics. | 1.00 |
| Understand and respect diversity in my work as a school psychologist. | 1.00 |

| Overall Satisfaction | Average |
| --- | --- |
| How well the University of Hartford program prepared me for my current employment. | 1.00 |
| How well the curriculum covered knowledge and skill required for my work as a school psychologist. | 1.00 |
| How well the faculty provided support and resources during all aspects of my graduate training. | 1.00 |

The information from the Completer Satisfaction Survey is reviewed by the program regularly.  It was noted that all of the respondents indicated Strongly Agree on all items. We will continue monitoring student outcomes to ensure that our candidates are receiving exemplary training.

Title II Data Summary

Data used in the tables below were collected from the public records of the Connecticut State Department of Education Data Dashboard. Employment data include candidates employed in Connecticut public schools, including approved private special education programs.

Table 1: Candidates Employed in CT within One Year

| Graduating Year | N | Employment in CT | Employment Rate |
| --- | --- | --- | --- |
| AY 2014 – 2015 | 53 | 24 | 45.3% |
| AY 2015 – 2016 | 51 | 24 | 47.1% |
| AY 2016 – 2017 | 49 | 17 | 34.7% |
| AY 2017 – 2018 | 57 | 19 | 33.3% |
| AY 2018 – 2019 | 63 | 28 | 44.4% |
| AY 2019 – 2020 | 59 | 29 | 49.2% |
| AY 2020 – 2021 | 70 | 38 | 54.3% |
| AY 2021 – 2022 | 75 | 53 | 70.7% |

 

Table 2: Candidate Persistence Levels for 2nd and 4th Year of Employment

| Graduating Year | 2nd Year N | 2nd Year Percentage | 4th Year N | 4th Year Percentage |
| --- | --- | --- | --- | --- |
| AY 2014 – 2015 | 22 | 91.7% | 20 | 83.3% |
| AY 2015 – 2016 | 21 | 87.5% | 17 | 70.8% |
| AY 2016 – 2017 | 14 | 82.4% | 9 | 52.9% |
| AY 2017 – 2018 | 16 | 84.2% | 13 | 76.5% |
| AY 2018 – 2019 | 26 | 92.9% | 22 | 78.6% |
| AY 2019 – 2020 | 27 | 93.1% | N/A | N/A |
| AY 2020 – 2021 | 35 | 92.1% | N/A | N/A |
| AY 2021 – 2022 | N/A | N/A | N/A | N/A |

Table 2 reports persistence rates for the second and fourth years of employment. For example, if a candidate completed their program in 2015-16 and was employed within one year, the 2nd-year column reflects employment in 2017-18 and the 4th-year column reflects employment in 2019-20. N/A is displayed when the 2nd- or 4th-year data point has not yet occurred for a given completion year.