
Internally developed tests

Tables 13 through 18 concern tests developed locally by individual programs. Better control over test content (including maximizing the match between student ability and test difficulty) and the unsuitability of commercially produced tests were the most frequently cited reasons for developing a test locally (Table 13). Three programs noted that internally developed tests cost less than commercial tests.


Table 13: Reasons for locally developing placement tests
Reason Responses
To better control test contents 38
To better reflect ability range of students in program 31
Commercial tests unsuitable 31
No commercial test exists for language 20
Availability of funding/resources for development 19
As temporary measure only 8
To supplement external test 3
No cost/less expensive* 3
External assessments not well-known* 1
Create state-wide instrument for articulation* 1
Complement oral interview* 1
*Write-in response

As can be seen in Table 14, teachers bear most of the burden of developing tests at the program level, with test content drawn from textbooks, course objectives, and authentic and original materials (Table 15). Two programs reported having their tests developed through a testing division or foreign language office. Perhaps because of their role in the test creation process, teachers' involvement in locally produced tests extends through all phases of the placement process, as seen in Tables 8 through 10.


Table 14: Responsibility for test creation
Test developer Responses
Current/past teachers 77
Current/past administrator 5
Special committee 5
Language resource center 5
Individual test developer 4
Foreign language office* 1
Testing division* 1
*Write-in response


Table 15: Primary source of test content
Source Responses
Course objectives 40
Original materials 31
Course textbooks 28
Authentic materials 25
State/local content standards 11
MLA* 1
Proficiency guidelines* 1
Combination including national exams* 1
Past exams* 1
*Write-in response

Because teachers have other duties in addition to test creation, test revision appears to be driven by changes in course content rather than by test development per se (see Tables 16 and 17). One program reported revising the cut scores each year, but not necessarily the test contents; a sketch of how cut scores separate placement decisions from test content follows Table 17.


Table 16: Test revision frequency
Frequency of test revision Responses
When necessary 37
Once every several years 16
When conditions (personnel, financial) permit 11
Once or twice a year 10
Never revised 10
With each new intake* 1
Only cut scores are revised* 1
*Write-in response


Table 17: Reason for revising test
Reason Responses
To better reflect course contents 42
To address a deficiency in current test 28
To assess a skill not previously assessed 15
To make the test contents more timely 13
To lengthen or shorten the test 8
To prevent cheating 7
Demography of students* 1
More contextualization* 1
Improve validity, delivery, student-friendliness* 1
Upgrade/develop* 1
Prepare test for web delivery* 1
*Write-in response  
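
To illustrate the cut-score remark above: because placement is determined by comparing a raw score against a set of cut scores, those scores can be revised without touching the test items themselves. The following is a minimal sketch in Python; the function name, score thresholds, and course numbers are hypothetical and not drawn from any surveyed program.

def place_student(score, cut_scores):
    """Return the course placement for a raw test score.

    cut_scores is an ascending list of (minimum_score, course) pairs,
    e.g. [(0, 101), (25, 102), (50, 201)].  Revising the cut scores,
    as one program reported doing each year, means editing this list
    rather than revising the test contents.
    """
    placement = cut_scores[0][1]
    for minimum, course in cut_scores:
        if score >= minimum:
            placement = course
    return placement

# Hypothetical example: a raw score of 37 places into course 102.
print(place_student(37, [(0, 101), (25, 102), (50, 201)]))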

The validation process for internally developed tests (Table 18) included piloting the test on current students, matching the contents to course objectives, and review by language specialists. A minority of programs reported performing an item analysis; a sketch of that technique follows Table 18. Some programs also reported that the review by language specialists was performed only when the test was first developed, and one program reported that its test was not validated in the traditional sense but seemed to be working fine.


Table 18: Validation method for internally developed tests
Validation method Responses
Piloting on current students 57
Matching contents to course objectives 43
Review by language specialists 33
Item analysis 18
Not validated but seems to work* 1
*Write-in response
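
Since only a minority of programs reported performing an item analysis, a brief sketch of the technique may be helpful. This is a generic classical-test-theory computation, not any surveyed program's procedure; the data are fabricated and the names are illustrative.

import statistics

def item_analysis(responses):
    """Classical item analysis for dichotomously scored test items.

    responses[s][i] is 1 if student s answered item i correctly, 0
    otherwise.  Returns, for each item, the difficulty index
    (proportion correct) and a point-biserial discrimination index
    (correlation between the item and the total score).
    """
    totals = [sum(row) for row in responses]
    mean_total = statistics.mean(totals)
    sd_total = statistics.pstdev(totals)
    n_students = len(responses)
    results = []
    for i in range(len(responses[0])):
        item = [row[i] for row in responses]
        p = sum(item) / n_students  # difficulty: proportion answering correctly
        if 0 < p < 1 and sd_total > 0:
            mean_correct = statistics.mean(
                t for t, x in zip(totals, item) if x == 1)
            # Point-biserial correlation of item score with total score
            r_pb = (mean_correct - mean_total) / sd_total * (p / (1 - p)) ** 0.5
        else:
            r_pb = 0.0  # item answered by all or by none; no discrimination
        results.append((i + 1, round(p, 2), round(r_pb, 2)))
    return results

# Fabricated data for illustration only: four students, three items.
data = [[1, 1, 0],
        [1, 0, 0],
        [1, 1, 1],
        [0, 0, 0]]
for item, difficulty, discrimination in item_analysis(data):
    print(f"item {item}: difficulty = {difficulty}, r_pb = {discrimination}")

Items that nearly all students answer correctly (or incorrectly), or that correlate weakly with the total score, are candidates for revision, which ties this validation method back to the revision reasons in Table 17.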

