| Test | Programs |
|---|---|
| AP subject test | 38 |
| CAPE (BYU) | 21 |
| Wisconsin Test | 8 |
| ACTFL OPI | 7 |
| CLEP | 5 |
| SAT II subject test | 2 |
| Michigan Test | 1 |
| SOPI | 1 |
| Japanese Proficiency Test | 1 |
| MLPA | 1 |
| AATG Test* | 1 |
| University of Oregon NFLRC STAMP* | 1 |
| College Board* | 1 |
| sras* | 1 |
| Reason | Respondents |
|---|---|
| Widely used in other programs | 21 |
| Gives consistent results | 18 |
| Have always used it | 8 |
| Familiar to teachers | 7 |
| No other option | 4 |
| Ease of administration* | 1 |
| Cost* | 1 |
| Convenience* | 1 |
| Faculty not available in summer* | 1 |
| Dean's decision* | 1 |
As with the internally developed tests, content review and piloting on current students were the most common ways of validating externally produced tests (Table 21). One program cited a high correlation between the new test and a previously used test. Perhaps because of the inherent mismatch between a standardized test and an individual program's curriculum, matching test content to course objectives was not used to the extent it was for locally produced tests.
| Validation method | Responses |
|---|---|
| Review by instructors/department head | 37 |
| Piloting on current students | 31 |
| Matching contents to course objectives | 7 |
| Item analysis | 2 |
| Compared favorably to previous test* | 1 |
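The single "compared favorably to previous test" response refers to the kind of concurrent evidence mentioned above: correlating students' scores on the newly adopted external test with their scores on the instrument it replaced. The short Python sketch below illustrates that check; the score lists and function name are hypothetical and are not drawn from any surveyed program.

```python
# Minimal sketch of a concurrent-validity check: correlate the same students'
# scores on the new external test with their scores on the previously used test.
# All data here are hypothetical, for illustration only.
from statistics import mean, stdev

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Hypothetical placement scores for the same ten students on both instruments.
new_test_scores = [52, 61, 47, 75, 68, 55, 80, 62, 58, 71]
old_test_scores = [48, 65, 45, 78, 70, 50, 84, 60, 55, 74]

print(f"r = {pearson_r(new_test_scores, old_test_scores):.2f}")
```

A high coefficient on such a comparison would support the decision to adopt the new test, although it says nothing about how well either test matches a particular program's curriculum.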