Single-Correct Answer (SCA) and Multiple-Correct Answer (MCA) in Multiple-Choice Computer Assisted Language Testing (CALT) Program

Herri MULYONOa*, Gunawan SURYOPUTROa & Tri Wintolo APOKOa

aUniversity of Muhammadiyah Prof. DR. HAMKA (UHAMKA), Indonesia

*hmulyono@uhamka.ac.id

Computers have been widely used to assess language proficiency (Coniam, 2006; Dunkel, 1991; Lee, 2004; Sawaki, Stricker, & Oranje, 2009). In promoting the benefits of computers for language testing, some studies (e.g. Choi, Kim, & Boo, 2003; Coniam, 2006; Lee, 2004; Sawaki, 2001) evaluate computer-based testing in comparison with conventional paper-based testing. Coniam (2006) evaluated computer-based and paper-based versions of an English listening test. His evaluation of 115 Grade 11 and 12 students from two schools, who took both modes of the test, showed that the students performed better on the computer-based test than on the conventional one. In Choi et al.'s (2003) comparative study of a paper-based and a computer-based language test at five universities in Korea, the reading section of the computer-based test received the weakest support. Choi et al. (2003) believe that eye fatigue may be a factor harming students' concentration while reading passages on the computer. This corresponds to the suggestion offered by Bridgeman and Rock (1993) that computer-based tests need to pay attention to the length of the instructions given.

Some authors view open-ended questions (e.g. essays) as offering wider room for measuring cognitive processes as well as behaviour (e.g. Bennett et al., 1990; Birenbaum & Tatsuoka, 1987; Brown, 2004). Although the multiple-choice test format is often viewed as inferior in exploring students' problem-solving ability and as constraining teachers from retrieving much information from students or test takers (Birenbaum & Tatsuoka, 1987), the format may be seen as an alternative that is both less stressful for students and more practical. Within the multiple-choice format, students are supported by the answer options provided for each item (Cheng, 2004) and can receive immediate feedback, as the format lends itself to practical scoring (see Birenbaum & Tatsuoka, 1987; Bridgeman & Rock, 1993).

In a study conducted by Coniam (1999), students preferred multiple-choice tests delivered in a computer-based program to paper-based tests. Coniam (1999) argues that students' preference for the computer-based program derives from its simplicity, such as clicking an answer option in a multiple-choice application. In addition, a study by Cheng (2004) showed that students preferred the multiple-choice test to both the multiple-choice cloze and the open-ended tests. Cheng (2004) attributes this preference mainly to the stimuli available in the multiple-choice format. Furthermore, the answer options available in multiple-choice tests clearly encourage students to guess. However, it remains unclear whether the scoring methods applied in the three test formats also contribute to students' preference.

In the Indonesian context, computers are widely used to facilitate language testing in the evaluation of school teachers' competence. For this purpose, the Indonesian government has developed a web-based application for online teacher-competency testing, available at http://www.ukg.web.id. Although teachers have gained considerable exposure to and experience with completing this online testing, they appear to take little interest in using computers to evaluate their students' language proficiency. There are two indications of this reluctance to use computers for testing in the Indonesian secondary school context: 1) teachers' inadequate knowledge of the testing principles applied in computer-based testing, and 2) teachers' inability to design computer-based tests for classroom use. This paper describes the principles applied in designing SCA and MCA CALT programs for assessing secondary school students' grammar proficiency. It also discusses students' preferences regarding the SCA and MCA CALT programs they experienced.
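To make the SCA/MCA distinction concrete, the two item types differ chiefly in how responses are scored. The sketch below is a minimal illustration of one common scoring scheme, not the authors' actual CALT implementation; the function names and the partial-credit formula are our assumptions.

```python
def score_sca(selected, key):
    """Single-Correct Answer item: full credit only when the one keyed
    option is chosen, otherwise zero (assumed dichotomous scoring)."""
    return 1.0 if selected == key else 0.0


def score_mca(selected, keys, options):
    """Multiple-Correct Answer item: one common partial-credit scheme —
    each correct option chosen earns 1/len(keys); each incorrect option
    chosen deducts 1/(number of distractors); floored at zero."""
    chosen = set(selected)
    keyed = set(keys)
    correct = len(chosen & keyed)
    wrong = len(chosen - keyed)
    raw = correct / len(keyed) - wrong / (len(options) - len(keyed))
    return max(0.0, raw)


# Hypothetical grammar item: "Which sentences are grammatical?"
# Options A-D; the keyed answers are A and C.
options = ["A", "B", "C", "D"]
print(score_sca("A", "A"))                      # 1.0
print(score_mca(["A", "C"], ["A", "C"], options))  # 1.0 (both keys found)
print(score_mca(["A", "B"], ["A", "C"], options))  # 0.0 (one key, one distractor)
```

Under a scheme like this, guessing is less rewarding on MCA items than on SCA items, which is one reason the scoring method itself may influence the test-taker preferences discussed above.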
