With recent developments in computer-based testing, equating techniques that incorporate automated essay scoring systems such as e-rater suggest promising new directions for equating mixed-format tests of writing that include multiple-choice (MC) items and an essay. Generic e-rater essay scores were included in the anchor set to adjust for differences in essay prompt difficulty. A pseudo-test form study was conducted to investigate the impact of using generic e-rater scores to equate mixed-format tests consisting of MC items and a single essay. Equating outcomes for the proposed methodology (MC + e-rater scores), evaluated as an alternative to the current MC-only anchor-item approach for mixed-format tests, were promising. Kappa coefficients and observed agreement rates for pass/fail status determined from the composite scores were very large and similar across all six equating methods when the alternative method was compared to the current approach. The findings indicate that MC + e-rater equating outcomes are as strong as the MC-only equating results, and stronger under some conditions.
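The agreement statistics reported above can be illustrated with a minimal sketch. The code below, which is not from the study, computes observed agreement and Cohen's kappa for pass/fail classifications produced by two hypothetical equating methods; the score vectors and cut score are invented for illustration.

```python
# Sketch: observed agreement and Cohen's kappa for pass/fail status
# under two equating methods. All data below are hypothetical.

def pass_fail(scores, cut):
    """Classify composite scores as pass (1) or fail (0) at the cut score."""
    return [1 if s >= cut else 0 for s in scores]

def observed_agreement(a, b):
    """Proportion of examinees given the same pass/fail status by both methods."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary classifications."""
    n = len(a)
    po = observed_agreement(a, b)
    # Expected chance agreement from the marginal pass rates of each method.
    pa, pb = sum(a) / n, sum(b) / n
    pe = pa * pb + (1 - pa) * (1 - pb)
    return (po - pe) / (1 - pe)

# Hypothetical equated composite scores for ten examinees.
mc_only   = [62, 71, 55, 80, 67, 49, 75, 58, 90, 63]
mc_erater = [61, 72, 54, 81, 66, 51, 74, 60, 89, 64]
cut = 60  # hypothetical pass/fail cut score

status_a = pass_fail(mc_only, cut)
status_b = pass_fail(mc_erater, cut)
print(observed_agreement(status_a, status_b))   # 0.9
print(round(cohens_kappa(status_a, status_b), 3))  # 0.737
```

High observed agreement with a correspondingly high kappa, as in this toy example, is the pattern the study reports when comparing the MC + e-rater anchor to the MC-only anchor.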