Department of Education

CAEP Annual 2022

The College of Education, Nursing, and Health Professions houses the Educator Preparation Provider (EPP) at the University of Hartford. The EPP is currently accredited by NCATE and will be evaluated by the Council for the Accreditation of Educator Preparation (CAEP) in Fall 2022.

The EPP consists of six undergraduate initial-level programs, three graduate initial-level programs, and one advanced-level program. These programs are offered by departments in the College of Education, Nursing, and Health Professions; the College of Arts and Sciences; and the Hartt School.

Programs and Accreditation Status

| Program Name | Degree Level | Licensure Level | Method of Delivery | Current Accreditation | CAEP Visit in Fall 22 |
|---|---|---|---|---|---|
| Early Childhood | Baccalaureate | Initial | Face-to-Face | NCATE (Since 2016) | |
| Elementary | Baccalaureate | Initial | Face-to-Face | NCATE (Since 2016) | |
| Integrated Elem/Special Education | Baccalaureate | Initial | Face-to-Face | NCATE (Since 2016) | |
| Secondary Education English | Baccalaureate | Initial | Face-to-Face | NCATE (Since 2016); NCTE (Since 2022) | |
| Secondary Education Mathematics | Baccalaureate | Initial | Face-to-Face | NCATE (Since 2016); NCTM (Since 2022) | |
| Music Education | Baccalaureate | Initial | Face-to-Face | NCATE (Since 2016); NASM (Since 2017) | |
| Early Childhood | Masters | Initial | Face-to-Face | NCATE (Since 2016) | |
| Elementary | Masters | Initial | Face-to-Face | NCATE (Since 2016) | |
| Special Education | Masters | Initial | Online | New Program | |
| School Psychology | Masters | Advanced | Face-to-Face | NCATE (Since 2016); NASP (Since 2020) | |

Accredited Programs

The following program has earned accreditation within its specialty area, outside CAEP:

  • B.S. in Music Education – Accredited by the National Association of Schools of Music (NASM) in 2017

Programs Nationally Recognized by SPAs 

The following programs achieved National Recognition by their Specialized Professional Associations:

  • B.S. Secondary English (7-12) – Full recognition by National Council of Teachers of English (NCTE) in 2022
  • B.S. Secondary Mathematics (7-12) – Full recognition by National Council of Teachers of Mathematics (NCTM) in 2022
  • Masters in School Psychology – Full recognition by National Association of School Psychologists (NASP) in 2020

CAEP Annual Reporting Measures

The EPP uses a variety of tools to measure completer effectiveness, including candidate and completer impact on P-12 learning and employer satisfaction. The EPP has adopted a Ten Measure Strategy to demonstrate the impact of candidates and completers on P-12 students. Data on all measures included in the ten-measure strategy are collected regularly and reviewed to make continuous program improvement decisions. Following are details about the measures included in the Ten Measure Strategy.

1. Candidate/Alumni/Completer Survey

The EPP collaborates with the university's Office of Development and Alumni Affairs to send out an EPP-designed survey to candidates who have graduated from our programs within the last five years. The purpose of the survey is to collect data on completer satisfaction with the knowledge, skills, and dispositions obtained as candidates and how those skills directly impact their ability to instruct P-12 students. The alumni survey was piloted in Fall 2020 and implemented again in Spring 2021. In Fall 2020, the survey was sent to 294 alumni, with a response rate of 10% (N = 30), while in Spring 2021, the survey was sent to 290 completers, with a response rate of 3.7% (N = 11). The survey was sent out again in Spring 2022, and we are awaiting results.

The qualitative feedback received from alumni was mixed, depending on whether the alumni had completed the online or the in-person program. Candidates in the in-person programs expressed high levels of satisfaction, while candidates in the online programs felt they needed more support to prepare for the state certification assessments (e.g., Praxis II). They also expressed the need for more experience with tools and strategies for assessing P-12 students for special education eligibility. Feedback from candidates in the in-person programs suggests that they were noticeably confident and ready to implement the skills and assessment tools they had learned about while in the program. Most alumni felt they were well versed in designing and implementing lesson plans, modifying lessons as required, and differentiating instruction to meet the needs of all students. More importantly, the alumni attributed their success to the preparation they received in the programs offered by the EPP and felt they had been prepared to successfully meet the needs of their first teaching position. Their descriptions of the EPP focused on the quality of instruction, real-life experiences related to teaching, and fieldwork experiences from the very first semester through student teaching that prepared them well to be first-time teachers. Candidates who did not feel adequately prepared to work independently said that they learned new skills from their colleagues and mentors that allowed them to be successful in their positions as first-year teachers. Data from the survey are shared with program faculty and partners and are used to make continuous program improvement decisions.

2. Completer Focus Group Sessions

To further validate completer impact on P-12 student learning and development, the EPP invited 35 completers to participate in focus group discussions. The most recent session was attended by 18 completers (51%). Results from this session demonstrated that our completers are confident that the EPP prepared them to be effective classroom teachers. Completers indicated they felt prepared and received positive feedback from their P-12 mentors. Focus group data indicated that most of our candidates were satisfied with our programs. For example, 94% of the completers agreed that they obtained knowledge and skills on how to successfully connect to concepts being introduced and to differentiate instruction to meet the needs of all learners. Approximately 83% of the completers felt prepared to use multiple methods of assessment to monitor and evaluate student progress, and 95% indicated that the program prepared them to create inclusive learning environments.

3. Action Research Project (Completer impact on P-12 Students) 

Completers from all programs were invited to participate in an Action Research Project designed to measure the direct impact of our completers on their P-12 students. Completers were asked to select a content area, conduct a pre-assessment, design and implement a series of lessons, and conduct a post-assessment to evaluate the impact of their instruction on student performance. Five completers (N=5) representing the elementary and the integrated elementary/special education programs participated in this project. Data from the action research project indicated that our completers were having a positive impact on their students.

4. P-12 Student Survey

Three completers shared a survey with their P-12 students, ranging from kindergarten to second grade, to collect feedback on their teaching effectiveness. Completers were allowed to adapt the survey into a grade- and age-appropriate format that would work for their students (e.g., circling emojis for kindergarten students). A total of 22 students responded to the survey.

Survey results indicate that all P-12 students agreed that their teacher had a positive impact on their learning. They were motivated to attend classes and learn new information from their teachers. They also indicated feeling safe in the classroom because their teacher created a caring environment for them and made time to help them understand their tasks.

5. Teacher Education and Mentoring (TEAM) Program

The TEAM program is a professional growth model that incorporates the Connecticut Standards for Professional Learning to assess teacher effectiveness. It provides beginning teachers with multiple opportunities to reflect on their practice, analyze student data and outcomes, and identify areas for growth and improvement in their individual professional learning. Two completers (one elementary education teacher and one special education teacher), with permission from their school administrators, shared their TEAM data with us. These data are critical because they are a direct indication of completers' teaching effectiveness on student learning. Although we have limited data, our completers have had a positive impact on P-12 learners and their development. Moving forward, we will continue to work with partners and completers across programs to collect this information.

6. Employer Satisfaction Survey

The EPP sent a 44-question employer satisfaction survey to the principals/administrators who had hired 10 of our completers. The purpose of the employer satisfaction survey is to determine the impact our completers have on P-12 student learning and to gauge overall employer satisfaction. The response rate for the survey was 61%, and the results indicate that the employers agreed unanimously that our completers can differentiate instruction in the areas of content, process, product, and learning environment to meet the needs of all students. Completers can transfer theory to practice by assessing and expanding students' prior knowledge. Additionally, our completers are tech savvy and can create technology-based interactive lessons. More importantly, completers can design learning experiences that integrate culturally relevant content to build on learners' cultural backgrounds and experiences. Completers were also seen as able to exhibit respect and high expectations for each student and to communicate with diverse learners in a fair and respectful manner. In addition, employers noted that our completers consistently provide equitable opportunities to meet the diverse needs of P-12 students. Based on these data, the EPP believes the completers are successful at contributing to diverse P-12 student learning growth. Moving forward, the survey will be sent to all employers listed on the CT State Department of Education Data Dashboard at the end of every academic year in the spring.

7. Employer Focus Group Meetings

Following the employer satisfaction survey described above, the EPP met with a small group of administrators to hear their experiences and views about our completers. Three principals were interviewed; the questions asked at these one-on-one meetings concerned completers' impact on student learning and their ability to take on responsibilities independently without requiring additional training. The principals' reports were favorable. However, the principals indicated that our completers required additional support in implementing the IEP process. We have subsequently made changes to our programs: candidates now complete multiple IEPs based on case studies and complete a mock IEP with their cooperating teacher for a student they are working with in their student teaching placement (EDH 420, EDH 421, and EDH 601).

8. School University Partnership Advisory Board (SUPAB) meetings 

The EPP hosts quarterly meetings with its partners to discuss and receive direct feedback from administrators on completers' effectiveness with P-12 students. The leadership team takes note of the suggestions and recommendations to improve the program. The consensus is that our completers are highly trained and career ready because of the content and experiences provided in the teacher preparation program.

9. Connecticut State Department of Education (CSDE) Dashboard

The Data Dashboard is a state-sponsored database maintained by the CSDE to provide EPPs with data about their completers. Data include information about the certification and employment status of completers. One benefit of the Data Dashboard is that we have information on completers' demographics, current place of work, diversity, etc.

10. Completer Satisfaction Survey and Focus Group Meetings

Data on completer satisfaction are collected using the completer satisfaction survey followed by focus group sessions. Data indicate that while teacher candidates view the teacher preparation programs offered by the EPP very favorably, there are areas for improvement. One such area is assessment and progress monitoring that mirrors school district requirements. Based on this feedback, we made changes to the curriculum: candidates now learn to create and maintain a gradebook in Excel to collect and monitor student progress, and we have added progress monitoring as a topic in our Screening and Diagnosis class.

Overall survey and focus group data indicate that 100% of the completers are prepared to create inclusive learning environments to meet the needs of diverse learners. Approximately 89% of the completers are confident that they can use multiple methods of assessment to monitor and evaluate progress. Additionally, 83% of the completers report that they are comfortable making real-time changes to instruction; they are prepared to design and implement developmentally appropriate and challenging learning experiences, including the use of technology to improve instruction and advance student learning. About 78% of the completers report that they can create learning experiences that ensure learners' mastery of the content.

Instruments and Responses

The EPP uses both proprietary assessments (certification tests, edTPA, Title II) and EPP-designed assessments (lesson plan assessment, student teaching evaluation, portfolio, and the Candidate Effect on Student Learning/Teacher Inquiry Project (CESL/TIP)) to assess the learner and learning. The data provide information to identify whether candidates can apply their knowledge of the learner and learning at various progression levels. To keep abreast of national and state standards, workforce changes, and feedback from key stakeholders (candidates, clinical partners, faculty), the EPP closely reviewed its existing curricula across programs.

The EPP ensures that instruments/methods are designed to elicit responses specific to the criteria in Standard 1 (learner and learning, content, instructional practice, professional responsibility, and technology) through regular data review meetings. The purpose of the department and data review meetings, which include partners, is to bring together key stakeholders to review data and identify program strengths and weaknesses so that we can better prepare our teacher candidates. Data review meetings are hosted with faculty, clinical partners, and other stakeholders. We also set aside time during monthly department meetings to discuss items that need additional or ongoing attention.

In addition, SUPAB meetings are held quarterly that allow us to regularly check in with our clinical partners and address issues as they come up. At SUPAB meetings we not only discuss the data and how we can improve the curriculum, but we also engage in discussion about partner needs and how we as an EPP can help. For example, during the pandemic we were able to organize remote tutoring sessions hosted by our candidates for a local middle school because their students needed additional support with schoolwork.  

Examining the completer data we collected, the EPP finds that the current instruments/methods to elicit responses indicate that, to a large extent, the EPP meets the teacher preparation program and state requirements for licensure. However, we recognize there is room for improvement and that we need to make the necessary improvements across programs and in our collaborative efforts with partners. For example, moving forward we will be using edTPA as a key assessment, and we will also be adopting the CCT Rubric to evaluate candidates in their student teaching placements. These changes will allow us to use assessments with pre-established validity to measure candidate performance. They will also allow our candidates to be better prepared as first-year teachers because their employers will be using the same CCT Rubric to observe their teaching.

Evidence of Stakeholder Involvement in Program Design, Evaluation and Data Driven Decision Making for Continuous Improvement 

Our stakeholders are an integral part of our EPP and collaborate with us regularly on several levels, from the design of programs/coursework, assignments, and rubrics to data collection, evaluation, and data-based decision making for continuous program improvement.

The following outlines institutional participation of stakeholders in evaluating program and completer quality and effectiveness. 

Program Level – The EPP is in continuous communication with school partners about what our teacher preparation programs can do to support their P-12 needs. For example, feedback from CSDE partners prompted us to offer an online master's in special education program to support a shortage area. Key assessments and rubrics are also co-constructed during our quarterly SUPAB meetings to ensure our assessments are meaningful and relevant for producing successful teachers. Recently, based on requests made by stakeholders, we piloted a dual enrollment program with 10 students from a local school district to encourage high school students to enroll in teacher preparation programs. In addition, we are working with another district to create pathways that would allow their non-certified staff members to become certified.

Course/Assignment Level – At the course level, we collaborate with our partners to design meaningful and relevant assignments and involve them in professional development events. For example, our partners provided training to our candidates on the new IEP process that will be implemented in Fall 2022. Feedback received from partners who hired our candidates resulted in a change to include data collection and progress monitoring skills in our curriculum. Our adjuncts and clinical educators include teachers and administrators who provide direct, immediate, and consistent feedback that enables us to make continuous program improvements.

Evaluation and Data Review Meetings – Stakeholders are involved throughout the program in reviewing candidates' performance data from fieldwork and student teaching placements. Data review meetings, where results are discussed, analyzed, and disseminated, are held at mid-semester and end of semester to make data-based changes to our curriculum. Data for key assessments are collected in Student Learning and Licensure, and the process is overseen by the Manager of Assessment, Accreditation, and Certification, who is responsible for ensuring that all stakeholders have access to the data. Instructors teaching courses with key assessments submit a one-page summary with data and interpretation at the end of each semester. At the data review meetings, faculty and clinical partners review the Student Learning and Licensure data and the one-page summaries. At these meetings, the team puts together professional growth plans for candidates who need additional support.

Clinical Data – Stakeholders are involved in evaluating clinical experiences (e.g., fieldwork in the first, sophomore, and junior years) and practicum/student teaching (senior year, and the last semester for graduate students). Clinical faculty (university supervisors and cooperating teachers) and the teacher candidate (self-evaluation) review formal data for senior-year clinical experiences at the data review meetings described above. Stakeholders collect data during the mid-semester and final student teaching evaluations, enabling us to triangulate data on half-day and full-day culminating clinical experiences at the undergraduate and graduate levels. We use discussions from our data review meetings to continuously improve the programs. Moving forward, we would like to collect data from fieldwork teachers. This change will allow us to collect feedback from clinical partners, strengthen our programs, and engage in continuous program improvement.

Stakeholder Involvement in Completer Data Decision Making  

Our stakeholders are actively involved in our data-based decision-making process and provide meaningful feedback that allows us to engage in continuous program improvement and better prepare our teacher candidates to be successful teachers with a positive impact on P-12 learners. We selected five measures from the ten-measure strategy described above that require partner involvement to evaluate completer effectiveness and that include collecting meaningful and relevant feedback from all stakeholders. These measures are as follows:

a. P-12 Student Survey of Completer Effectiveness
b. Teacher Education and Mentoring (TEAM) Program
c. Employer Satisfaction Survey
d. Employer Focus Group Meetings
e. School University Partnership Advisory Board (SUPAB) meetings

Advanced Program

Satisfaction of Employers
The School Psychology Program developed and began implementing a survey to measure the satisfaction of employers of graduates. The rubric used within this measure ranged from 1 (strongly agree) to 5 (strongly disagree). The average ratings obtained indicated overall strong agreement from employers regarding the performance of graduates from the School Psychology program. The average rating on items ranged from 1.00 to 1.25.

School Psychology Program Employer Satisfaction Survey Results
Spring 2019-2021

1 = Strongly Agree … 5 = Strongly Disagree

| | Spring 2019 (N=8) | Spring 2020 (N=6) | Spring 2021 (N=8) |
|---|---|---|---|
| Graduates Excel in… (Average) | | | |
| Applying school psychology-specific knowledge and skill effectively in their work as a school psychologist | 1.25 | 1.00 | 1.25 |
| Presenting and applying research to assist in meeting school/district needs | 1.25 | 1.25 | 1.25 |
| Participating in and leading collaborative activities involving other school/district personnel | 1.25 | 1.00 | 1.00 |
| Communicating with other professionals and those they serve | 1.25 | 1.00 | 1.00 |
| Practicing according to professional school psychology standards and codes of ethics | 1.10 | 1.00 | 1.00 |
| Understanding and respecting diversity in their work as a school psychologist | 1.00 | 1.00 | 1.00 |
| Overall Satisfaction (Average) | | | |
| How well the University of Hartford school psychology graduate program prepared school psychologists for employment in your agency | 1.18 | 1.04 | 1.08 |

The results indicated overall positive responses with no significant weaknesses identified. The School Psychology program continuously reviews survey results and outcome data in order to ensure that graduates are receiving exemplary training in school psychology. 

 

Satisfaction of Completers
The School Psychology Program developed and began implementing a survey to measure the satisfaction of program completers. The rubric used within this measure ranged from 1 (strongly agree) to 5 (strongly disagree). The average ratings obtained indicated overall strong agreement from program completers regarding their satisfaction with the skills acquired throughout their training as well as their sense of being supported throughout their time in the program. The average rating on items ranged from 1.00 to 1.50.

School Psychology Program Completer Satisfaction Survey Results
Fall 2019-2021

1 = Strongly Agree … 5 = Strongly Disagree

| | Fall 2019 (N=10) | Fall 2020 (N=8) | Fall 2021 (N=8) |
|---|---|---|---|
| Graduates Excel in… (Average) | | | |
| Apply school psychology-specific knowledge and skill effectively in my work as a school psychologist | 1.00 | 1.00 | 1.00 |
| Apply research to my work as a school psychologist | 1.00 | 1.50 | 1.25 |
| Participate in and/or lead collaborative activities involving other school/district personnel | 1.00 | 1.00 | 1.00 |
| Provide school psychology services that meet professional standards and codes of ethics | 1.00 | 1.00 | 1.00 |
| Understand and respect diversity in my work as a school psychologist | 1.00 | 1.00 | 1.00 |
| Overall Satisfaction (Average) | | | |
| How well the University of Hartford program prepared me for my current employment | 1.00 | 1.00 | 1.00 |
| How well the curriculum covered knowledge and skill required for my work as a school psychologist | 1.00 | 1.00 | 1.00 |
| How well the faculty provided support and resources during all aspects of my graduate training | 1.00 | 1.25 | 1.25 |

The information from the Completer Satisfaction Survey is reviewed by the program regularly. The majority of respondents indicated Strongly Agree on all items. We will continue monitoring student outcomes to ensure that our candidates are receiving exemplary training.

Candidate Readiness to Move into the Profession 

The EPP licensure programs aim to prepare effective educators with a deep understanding of critical concepts, content knowledge, principles, skills, and dispositions. These skills are essential to advance the learning of all students toward attainment of college- and career-ready standards. To that end, candidates are provided with exposure to diversity and opportunities to develop proficiencies associated with the design and implementation of college- and career-ready standards. All program coursework and clinical experiences enrich candidates' exposure to specific content and pedagogical knowledge in the licensure areas. For example, candidates in the elementary and special education programs develop an Individualized Education Plan (IEP) through course assignments (EDH 420, EDH 601, EDH 611), and candidates in the early childhood program learn about the Individualized Family Service Plan (IFSP) (EDY 334) to reflect on the role of early childhood special education. Instructional strategies acquired in these courses in turn support P-12 students' critical thinking, collaboration, and communication skills. The EPP Candidate Disposition Inventory and InTASC alignment tables further detail course and clinical assignments. Readings and experiences are aligned with the EPP's commitment to ensuring that candidates are prepared to support access for all P-12 students to college and career readiness.

Our programs use the adapted version of Connecticut’s System for Educator Evaluation and Development (SEED) CCT rubric to evaluate candidates’ performance and practice during their student teaching observations. The rationale for using the adapted CCT rubric is to ensure that candidates show mastery in essential and critical aspects of a teacher’s practice as required by the state. However, moving forward, we will be using the CCT rubric to ensure our candidates are better prepared to be successful first year teachers.  

Multiple Sources of Evidence to Triangulate Preparation for Certification  

The EPP prepares and monitors candidate progress toward certification using multiple sources of evidence. These sources include both proprietary and EPP-designed assessments. Systematic progress monitoring allows us to triangulate data to ensure that our candidates are prepared to apply for certification. When candidates apply for certification, we ensure that they have met all program requirements, passed all certification tests, and completed edTPA requirements. For example, candidates who do not have a 3.0 GPA or have not received a B or better in required coursework cannot be recommended for certification.

Academic Areas: We measure candidate content knowledge across multiple courses and assignments. In addition to program assessments that measure content knowledge across programs, we have program-based assessments that help us ensure that our candidates are prepared to meet certification requirements. The data for academic content are triangulated across 1) proprietary assessments, 2) EPP assessments, and 3) program-based key assessments.

For clinical experiences, data on student teaching evaluations are triangulated first by the number of times the assessment is completed (midterm and final) and second by the different evaluators (cooperating teacher, university supervisor, and candidate).

Another example is triangulation across assessments: data from the candidate effect on student learning/teacher inquiry project, student teaching evaluations, and edTPA. We also triangulate data across student teaching evaluations, edTPA, and certification tests.

Non-Academic Areas: We triangulate disposition data across coursework stages (preprofessional program, professional program, and end of program). We also triangulate disposition data across individuals (university supervisor, cooperating teacher, and teacher candidate). In addition to triangulating academic and non-academic data, we triangulate data at each of the six progress monitoring points. This allows us to ensure that our candidates are making progress toward becoming certified, successful educators.

Candidate Impact on diverse P-12 Student Learning and Development 

During culminating half-day and/or full-day clinical experiences, candidates are formally evaluated on how they impact diverse P-12 learners. These evaluations are completed by cooperating teachers, university supervisors, and the candidates themselves to determine the impact that their day-to-day teaching has on their P-12 students. Data for Spring and Fall 2021 show that all candidates met the end-of-program requirements at benchmark and mastery levels. This shows that both cooperating teachers and university supervisors believe that our candidates are having a positive impact on P-12 student learning and development.

Another way to measure candidate impact on P-12 students is edTPA, a performance-based assessment that directly measures the impact candidates have on their students. The edTPA assesses candidates on their ability to plan, assess, and instruct P-12 students effectively. Overall, the edTPA data show a slight improvement in scores from 2018–2019 to 2019–2020 at the institutional level. Although our scores did not exceed the state and national means, the data show growth in that direction.

Candidate impact is also measured at the program level by EPP-designed key assessments. Assessments that directly measure impact on student learning include the candidate effect on student learning (CESL, for candidates in the elementary, special education, and secondary programs) and the Teacher Inquiry Project (for candidates in the early childhood program). The CESL data indicate that both the undergraduate and graduate candidates are at benchmark and mastery levels. All female candidates were at mastery on all the criteria on which they were evaluated, while 75% (N=3) of male candidates were at mastery in analyzing pre- and post-assessments and evaluating instruction. The overall data indicate that candidates are proficient in carrying out action research projects.

Overall data for assessments discussed indicate that our candidates are proficient in effective teaching and have a positive impact on diverse P-12 student learning and development.  

Candidates’ Critical Dispositions: Commitment to Growth in Cultural Awareness and Reflection on Bias and Equitable Practices

Important to the success of the candidates are positive dispositions in the following areas: ethical behavior, professional behavior, commitment to collaboration, appreciation of diversity, and commitment to professional growth. Although we have not explicitly collected data on candidates’ commitment to cultural awareness and their reflection on bias and equitable practices, discussions of these elements have always been part of classroom conversations in the courses candidates take. Candidates are encouraged to share their cultural experiences during class discussions to benefit everyone in the classroom. Moreover, faculty members come from diverse linguistic and cultural backgrounds and bring their experiences into the classroom. Evaluating these elements is a recent initiative, and we plan to use the CCT and edTPA rubrics to evaluate candidates. Future candidates will be required to participate in implicit bias training and obtain a certificate as proof of participation.

Admission and Completion 

The EPP disaggregates completion data for the Title II report each year, as required by federal law. EPP data are analyzed across demographic groups and are available on the CSDE Data Dashboard. The EPP has taken on the charge of diversifying our candidate pool and has included it in our marketing and recruitment plan. We have also diversified student teaching placements by placing students in different District Reference Groups (DRGs; e.g., urban, suburban), different grade levels (elementary, middle, high), and different classrooms (inclusion classrooms, self-contained classrooms, resource rooms, transition academies) to ensure candidates are exposed to different demographic groups and settings.    

The EPP collaborates with the Office of Development and Alumni Relations (IA) to send a survey to program completers from the past five years. Candidates who responded to the survey were invited to participate in a focus group. At this focus group, the department leadership asked completers a series of questions related to the survey: the completer’s background, professional aspirations, views and opinions on the professional program, opinions on what could be improved or changed in the program, and, lastly, opinions on their ability to be effective teachers after going through our professional program. The data from this survey provided helpful feedback for program improvement.   

The Connecticut State Department of Education has begun to collect and share completer data with EPPs through a software interface called the Data Dashboard. The dashboard includes two data sources: completers employed in their first year of teaching and completers employed in their second year of teaching. In the future, the EPP hopes to capture more demographic information and refocus the department survey questions. 

Lessons Learned and Data Based Decisions  

Recruitment 

The EPP adopts a variety of strategies to recruit and retain candidates. We work closely with the admissions offices at both the undergraduate and graduate levels. Declining enrollment has been a nationwide trend in teacher preparation programs, and our data indicate the same. However, in the last few years we have made numerous changes to program offerings (an online master's in special education, a dual enrollment program, 4+1 programs) as well as to the strategies we use to market our programs (Teach CT, Ed Rising, Oakhill). As a university, our focus has also been on retention, and, as indicated above, we have seen some increase in retention rates, including retention rates of diverse students.  

Monitoring and Supporting Candidates 

The EPP has a strong, well-scaffolded system in place to monitor and support candidates. Monitoring of candidate progress begins with the candidates themselves, as we teach them to be reflective practitioners. The next level of support is provided by instructors, advisors, other program faculty, clinical educators, and the department leadership. Outside the department, our candidates are supported by the dean's office, other university offices (accessibility services, counseling and psychological services, the reading and writing center, the center for student success, etc.), and committees (the academic standing committee). Our candidates are made aware of these supports from the very beginning and are encouraged to reach out as needed.  

Competency and Completion 

The EPP prepares candidates for certification using multiple sources of evidence. First, candidates must earn a minimum grade of B in all professional courses and maintain a 3.0 GPA. Second, candidates must complete the practicum and student teaching and meet the professional dispositions requirement. To be eligible for a teaching certificate, candidates in all teacher preparation programs must complete and pass the edTPA, a subject-specific teacher performance assessment completed during student teaching. Additionally, candidates must pass all applicable subject area tests (Praxis II, the Early Childhood Education Test, and the CT Foundations of Reading test). Academic performance, student teaching, the edTPA, and the professional exams (Praxis II and Foundations of Reading) are the ways in which the EPP triangulates candidates’ preparation for program completion and certification. Data from these assessments also guide us on what we should continue doing and what needs revision to best prepare our candidates.   

A.3.4 Selection at Completion

The School Psychology program's graduation criteria include GPA, passing the Comprehensive Examination, satisfactory completion of all internship requirements, and passing the Praxis II exam at the national level. The program director is responsible for monitoring candidate progress through the program, with support from practicum and internship supervisors.

 

Graduation Completers for Academic Year 2020-2021

| Program | Summer 2020 | Fall 2020 | Spring 2021 | Total |
| --- | --- | --- | --- | --- |
| UG Early Childhood | - | - | 4 | 4 |
| UG Elementary | - | - | - | 0 |
| UG Integrated Elementary and Special Education | - | - | 24 | 24 |
| UG Secondary Mathematics | - | - | 2 | 2 |
| UG Secondary English | - | - | 2 | 2 |
| Graduate Early Childhood | 2 | 2 | 4 | 8 |
| Graduate Elementary | 0 | 3 | 0 | 3 |
| Graduate Special Education | 0 | 3 | 13 | 13 |
| Graduate School Psychology | 0 | 0 | 9 | 9 |

 

150% Graduation Rates by Degree, Program

Table 1. Bachelor's Degree Programs (Fall 2015 Entering Cohort, 6-Year Graduation Rate)

| Program | 150% | Total in Cohort |
| --- | --- | --- |
| 285 – Early Childhood Education | 75% (6) | 8 |
| 340 – Elementary Education | 71% (12) | 17 |
| 438 – General Education/Undecided | 80% (4) | 5 |
| 7400 – Music Education/Vocal | 83% (5) | 6 |
| 7402 – Music Education/Instrumental | 100% (6) | 6 |
| 775 – Special Ed/Elementary | 73% (16) | 22 |
| Grand Total | 77% (49) | 64 |

*Rates include new, first-time first-year students entering the program in Fall 2015 who completed before 8/31/2021.

 

Table 2. Master's Degree Programs (Fall 2018 Entering Cohort, 3-Year Graduation Rate)

| Program | 150% | Total in Cohort |
| --- | --- | --- |
| 286 – Early Childhood Education | 43% (3) | 7 |
| 341 – Elementary Education | 75% (3) | 4 |
| 766C – Special Education OL | 80% (8) | 10 |
| Grand Total | 66% (29) | 44 |

*Rates include new graduate students entering the program in Fall 2018 who completed before 8/31/2021.

 

Title II ETS Reporting Services
HEOA – Title II
2020 – 2021 Academic Year

| Group | Number Taking Assessment¹ (University of Hartford) | Number Passing Assessment² (University of Hartford) | Institutional Pass Rate | Number Taking Assessment¹ (Statewide) | Number Passing Assessment² (Statewide) | Statewide Pass Rate |
| --- | --- | --- | --- | --- | --- | --- |
| All Program Completers, 2020-21 | 57 | 48 | 84% | 1111 | 948 | 85% |
| All Program Completers, 2019-20 | 49 | 41 | 84% | 1155 | 1007 | 87% |
| All Program Completers, 2018-19 | 58 | 56 | 97% | 1234 | 1142 | 93% |

Note: In cases where there are fewer than ten students taking the assessment for licensure/certification, the number passing and pass rate are not reported.

¹Number of completers taking one or more assessments within their area of specialization.

²Summary-level “Number Taking Assessment” may differ from assessment-level “Number Taking Assessment” because each student is counted once at the summary level but may be counted in multiple assessments at the assessment level.

Title II Data Summary

Completers Employment Data 

The data in the tables below were collected from the public records of the Connecticut State Department of Education. The employment data do not include employment in private schools in CT.

Employed in CT within One Year¹

| Graduating Year | N | Employed in CT | Employment Rate |
| --- | --- | --- | --- |
| AY 2014 – 2015 | 53 | 24 | 45.3% |
| AY 2015 – 2016 | 51 | 24 | 47.1% |
| AY 2016 – 2017 | 49 | 17 | 34.7% |
| AY 2017 – 2018 | 57 | 19 | 33.3% |
| AY 2018 – 2019 | 63 | 28 | 44.4% |
| AY 2019 – 2020 | Data not available on State Data Dashboard | | |
| AY 2020 – 2021 | Data not available on State Data Dashboard | | |

¹Employed in CONNECTICUT public schools, including approved private special education programs ONLY.

Employed in CT within 2nd and 4th Year¹ ²

| Graduating Year | 2nd Year N | 2nd Year Percentage | 4th Year N | 4th Year Percentage |
| --- | --- | --- | --- | --- |
| AY 2014 – 2015 | 22 | 91.7% | 20 | 83.3% |
| AY 2015 – 2016 | 21 | 87.5% | 17 | 70.8% |
| AY 2016 – 2017 | 14 | 82.4% | 9 | 52.9% |
| AY 2017 – 2018 | 16 | 84.2% | NA | NA |
| AY 2018 – 2019 | 25 | 89.3% | NA | NA |
| AY 2019 – 2020 | Data not available on State Data Dashboard | | | |
| AY 2020 – 2021 | Data not available on State Data Dashboard | | | |

¹Employed in CONNECTICUT public schools, including approved private special education programs ONLY. ²For candidates employed within one year of program completion, Indicator 7 displays whether these candidates were also employed in the 2nd and 4th years after program completion. For example, if a candidate completed their program in 2015-16 and was employed within one year, the 2nd year would reflect employment in 2017-18 and the 4th year would reflect employment in 2019-20. N/A is displayed when the 2nd or 4th year has not yet occurred for a given completion year.