Beginning in 2013, Ohio instructors participated in workshops at CETE, part of The Ohio State University, to create test item banks for Career-Technical Education End-of-Course (EOC) Tests for Ohio's CTE Testing System. Required tests are post-tests, administered after instruction and funded by the Ohio Department of Education; their scores can be used in the calculation of the state's technical skill attainment indicator of performance (2S1) specified by the Carl D. Perkins Career and Technical Education Improvement Act of 2006. Pretests are an optional, but valuable, way to measure gain.
When designing test forms for the Technical Testing Project, the primary goals are content coverage, reliability, and validity. Panels of subject matter experts (SMEs) – Ohio secondary and postsecondary instructors, and business and industry practitioners – write test questions linked to outcomes and competencies in the Career Field Technical Content Standards. Quality assurance of test questions is performed using a two-part review: first by checking the content and accuracy of the questions, and then by rating each test question. Test items are then field tested with students in Ohio CTE programs across the state. Results are analyzed by the CETE psychometric staff. Poorly performing questions are dropped, and final forms are chosen to optimize test reliability, content coverage, and instructor ratings of essentiality and quality.
Before each school year, CETE staff choose two 40-item test forms for finalized EOC Tests – one a pretest and the other a posttest. The two tests are not identical, but a percentage of the questions is common to the pretest and posttest.
In general, each EOC Test is a blend of questions at two levels of challenge: C1 and C2. Approximately 30-40 percent of items will be at the C2 level (higher in Bloom’s taxonomy – mainly application, analysis, or evaluation). The remaining items are at the C1 level (knowledge and comprehension). In addition, approximately 30-40 percent of items for each test are scenario-based. A scenario describes a short entry-level workplace situation that provides information needed to answer associated items. Graphics (tables, charts, figures, maps) may also be included to provide information needed to answer the test question.
Each EOC Test used for pretesting corresponds to a course as defined by the Ohio Department of Education. Specific information regarding each individual pathway test (EOC Tests to administer, EOC Tests included in each Pathway Test, and the number of questions in each EOC Test) follows in the next section. For more information about the Ohio Career Field Technical Content Standards covered by each course outline and associated EOC Test, go to the WebXam Information Resources page at https://www.webxam.org/legacy/AboutTheTests.
As stated above, not every pathway and course has an associated pretest at the time of system launch. Available course-based pretests represent a subset of the courses released by ODE across the Career Fields in transition to courses (Construction, Engineering, Health Sciences, Law and Public Safety, Manufacturing, Information Technology, Cosmetology, Ground Transportation, and Agricultural-Environmental Systems). This staged release reflects several factors: 1) not all local districts have adopted courses, as they have a two-year window to do so; 2) during each school year, earlier high school classes (sophomores and juniors) are initially the only cohort available for field testing, so some courses do not reach the number of test-takers needed for item analysis (this is expected to change each successive school year as seniors receive instruction and take field tests in their second year); and 3) the development of Career Field Technical Content Standards (CFTCS), course outlines, and item banks is staggered over three years. Item writing follows content standards revision, and the ODE-CTE website includes the proposed renewal schedule.
Beyond reliability and validity for assessing student knowledge and skill, CETE, as the test developer, does not yet provide any warranty for the pretests. Staff are looking to partner in collecting data that will help support score interpretations for purposes of teacher effectiveness; that is, evidence that gain is related to other measures of teacher effectiveness. The pretest user is defined as the school district, which assumes its own risk in using this tool to measure student gain. Finally, we remind local districts that, according to the Ohio Revised Code interpretation contained in the ODE Administrative Rule, districts are not required to use Category B vendors for OTES if another acceptable option is available (for example, an SLO).