RESEARCH DESIGN AND EVALUATION

The Center for Advanced Study in Education (CASE) of the City University of New York will undertake process and outcome evaluation of ATEP. CASE was integrally involved in the development of the ATEP proposal, helping to make linkages between Project activities and expected outcomes. ATEP will benefit from a strong working relationship between Hofstra and CASE on numerous NSF projects. The evaluation and research team will work closely with Project management to ensure that all assessments and multimedia learning tools are relevant and appropriate, and that the evaluation findings are shared in timely and meaningful ways. Research and external evaluation (formative and summative) will be conducted by Dr. Deborah Hecht at CASE. ATEP evaluation and research will examine questions about the content and format of the modules, how the modules are used, dissemination activities, and student impact.

Formative evaluation will assess and optimize progress on key activities focusing on the development and use of ATEP. Examples of formative evaluation questions related to module development are: How well does ATEP connect high school (HS) curriculum materials with community college (CC) ATE programs in the three technological domains? How well is the ATEP hybrid text and Web-based cyber-learning system aligned with national standards? What are the key features of ATEP, and how relevant, aligned, and comprehensive are the textual content, animations, simulations, and practical applications?

Examples of formative evaluation questions related to ATEP module use are: What are the optimal structure, format, and content of ATEP professional development for teachers? How do teachers and students use ATEP materials in classes? Do students and teachers believe that the content and key elements of ATEP are engaging, doable, and easily institutionalized? Is there evidence that the ATEP model is transferable to other content areas? How effective is ATEP at disseminating the curriculum and materials?

The procedures used to develop and refine the hybrid model; to identify, select, and enhance the text, online browser-based simulations, animations, videos, and practical activities; to connect text and multimedia materials; and to create module-specific assessments will be documented and subjected to expert review. Periodic surveys and interviews with the development team, along with reviews of tasks completed and materials developed, will be conducted to determine whether team progress and outcomes are in alignment with the Project work plan, timeline, and goals.

As components of the hybrid curriculum are identified or created (e.g., simulations, videos), they will be reviewed by relevant educational and industry experts and recommendations made for any changes. The evaluation team will assist developers in creating embedded assessments. As the modules are created (modules A & B – summer and fall of Year I; modules C & D – summer and fall of Year II), they will be reviewed by Project staff (including members of other module teams), teachers, students, content specialists, and industrial partners to help ensure the materials are clear, coherent, relevant, factual, and essential and that all components are aligned. 

The modules will then be pilot tested with students from the classes of the teacher-developers to ensure they are doable and understandable; to examine how the curriculum is used; and to further guide development and revisions of the teacher instructional tools.

All dissemination activities will be documented using an online data management system and posted on the Project Moodle. Feedback from dissemination activities (e.g., feedback surveys from workshops) along with reviews by experts will help assess whether the model is transferable. 

Summative evaluation and research activities will focus on the study of ATEP classrooms (students and teachers). Summative evaluation and research questions are: 1) To what degree does the ATEP hybrid model, and what components of it, lead to deeper student understanding of content, positive attitudes, enhanced self-efficacy, and interest in and awareness of technical STEM education and careers? 2) How have students’ interests in careers in the three ATEP domains been stimulated through Project involvement? 3) In what way does the curricular treatment of manufacturing, biotechnology, and ICT affect student engagement and promote interest in further study in those domains? 4) Is there a differential use and impact of ATEP in terms of school, type of student (e.g., gender), teacher experience, and type of multimedia introduced? 5) What are the characteristics of schools where ATEP is most easily implemented?

Expert reviews and pilot testing will provide initial data about ATEP’s impact. They will also provide opportunities to refine the assessment tools used to evaluate the accomplishment of Project goals. Using the assessment tools developed and refined during pilot testing, pre- and post-surveys will be collected from students and teachers before and after participation in ATEP field testing (Year III) to assess each of the key Project goals. Student surveys will examine understanding of ATEP content, interest in and awareness of technical careers, interest in STEM careers, and relevant self-efficacy. Teacher surveys will examine interest and self-efficacy in using ATEP materials and attitudes toward the program.

Demographic and background data will be collected to characterize the participating schools, types of students, gender, and teacher experience. Using statistical techniques such as multivariate analysis of covariance and discriminant analysis, the differential impact related to these and other curriculum variables (e.g., curriculum topic, number of simulations) will be examined. The relationship between teacher ratings of curriculum effectiveness and various demographic variables will be explored. Results will inform the development of text- and Web-based materials and contribute new knowledge related to the use of the Project’s hybrid model as a transformative educational methodology. In particular, differences across the three technology domains will be examined within the context of the structure and format of the curriculum.
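As a minimal sketch of how a discriminant analysis of this kind might be run, the following Python example fits a linear discriminant function to synthetic pre-post gain scores. The group labels, survey scales, and all data here are hypothetical placeholders, not the Project's actual instruments or results.

```python
# Hypothetical illustration: discriminant analysis of survey gain scores
# by student group. Synthetic data stands in for real ATEP survey results.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic gain scores on three survey scales (content understanding,
# career interest, self-efficacy) for two hypothetical student groups.
group_a = rng.normal(loc=[0.8, 0.5, 0.6], scale=0.3, size=(50, 3))
group_b = rng.normal(loc=[0.3, 0.2, 0.2], scale=0.3, size=(50, 3))
X = np.vstack([group_a, group_b])
y = np.array([0] * 50 + [1] * 50)

# Fit the discriminant function separating the two groups.
lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

# Classification accuracy indicates how well the scales separate the
# groups; the coefficients indicate which scales carry the most weight.
accuracy = lda.score(X, y)
print(f"classification accuracy: {accuracy:.2f}")
print("discriminant coefficients:", lda.coef_.round(2))
```

In practice, the same fitted model would be inspected for which survey scales most strongly distinguish groups defined by school, gender, teacher experience, or multimedia type, complementing the MANCOVA-style tests of overall group differences.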