Open-ended questions

(By: Hilde ter Horst (Zoëzi) and Riet Martens)

There are various types of open-ended questions: completion items, short-answer questions, long-answer questions, essay-type questions. Examples can be found in publications such as the CITO Manual.

We will now examine the following aspects in more detail:

· Advantages of open-ended questions

· Disadvantages of open-ended questions

· Composing tests as a whole

· Composing open-ended questions

· Checking test questions

· The scoring system and mandatory data retention

Advantages of open-ended questions

- for some learning objectives, tests that contain open-ended questions are more valid than tests containing only closed-ended questions; this is particularly true when assessing complex skills;

- open-ended questions allow students greater freedom in formulating their answers, thus allowing them to exercise creativity;

- students have to make productive use of technical terms (this is not the case with closed-ended questions).

Disadvantages of open-ended questions

- it is difficult to formulate open-ended questions in such a way that students clearly understand what type of answer is expected of them;

- marking can vary considerably from one lecturer to another, so an accurate scoring system is needed;

- marking can be very time-consuming.

Composing tests as a whole

A test is more than just a set of questions. The test as a whole must exhibit content validity. To this end, it is vital that there is a balanced emphasis on the relevant learning objectives throughout the test. The best way to produce a representative test is to create a specification table (also known as a test blueprint). This is used to distribute the test questions across the main topics and across the requisite level (knowledge, understanding, skills). This specification table serves both as a guideline and as a source of inspiration when developing the questions. A test blueprint can also be used to compose multiple tests, with comparable content, on the same topic.
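As an illustration, a specification table can be thought of as a cross-tabulation of main topics against cognitive levels, with the number of questions in each cell. The sketch below is a minimal, invented example (the topic names and counts are assumptions, not taken from the CITO Manual):

```python
# Minimal sketch of a specification table (test blueprint):
# the number of test questions per main topic and per level.
# Topic names and counts are invented purely for illustration.
blueprint = {
    "Topic A": {"knowledge": 2, "understanding": 2, "skills": 1},
    "Topic B": {"knowledge": 1, "understanding": 2, "skills": 2},
}

# The blueprint immediately shows whether emphasis is balanced:
# totals per topic and for the whole test can be read off directly.
total_questions = sum(sum(levels.values()) for levels in blueprint.values())
print(total_questions)  # 10
```

Laying the table out like this makes it easy to check that every main topic and every level receives a share of the questions before any question is actually written.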

Composing open-ended questions

When composing an open-ended question, it is best to use the following procedure:

1. First formulate a model answer. If you start by formulating the answer, it is easier to compose a focused question.

2. Next formulate the question. The emphasis should preferably be on higher cognitive skills rather than on recalling knowledge. Accordingly, ask about the application, analysis, synthesis and evaluation of knowledge. Use verbs that are in keeping with this approach.

3. Incorporate restrictions on responses as necessary to prevent students from over-elaborating on the answer. Use restrictions to help them formulate the essence of the answer.

4. Each test question (or sub-question) should be accompanied by an indication of the maximum number of points that can be scored.

Consider the following aspects when formulating a question:

- If possible, divide the question into an information section and a question section (containing clear instructions on how to answer).

- Ideally, formulate the question in positive terms; if that is not possible, highlight the NEGATIVE parts of the question.

- Be as specific as possible when formulating questions and assignments, and ensure that they cannot be misinterpreted.

- Avoid phrases that do not evoke the required answer.

- Formulate the question in a way that is linguistically comprehensible to the student. Avoid words that students do not know, as well as long, complex sentence structures; keep sentences short and to the point.

Sample questions, together with examples of dos and don'ts and of common mistakes, are listed in the CITO Manual.

Checking test questions

Check the test against the checklist for composing, assessing and editing open-ended questions. If possible, ask a colleague to take the test that you have prepared for your students. However careful you have been in composing the questions, experience shows that questions can often be interpreted in more than one way. Compare your colleague’s (or colleagues’) answers to the model answer that you yourself formulated. Modify the questions if necessary.

The scoring system and mandatory data retention

With open-ended questions, our aim is to achieve the most objective assessment possible. Where there are multiple examiners (but even if you are the sole examiner), it is particularly important that the students' answers are assessed as accurately and consistently as possible; this increases the reliability of the test. While assessments will never be totally consistent, there are ways to counteract, at least to some extent, adverse assessor effects. Scoring systems are useful tools in this regard. These are lists of correct (and sometimes partially correct) and incorrect answers, provided as guidelines for examiners. Based on the elements present in an answer, the scoring system indicates which answers merit the maximum score and which only a partial score, and what each element is worth. It is best to create the scoring system at the same time as formulating the questions, as an aid to marking (example of a scoring system - in Dutch).
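In essence, such a scoring system maps answer elements to point values. A minimal sketch of this idea follows; the element names and point values are invented for illustration, not taken from any real scoring system:

```python
# Minimal sketch of a scoring system for one open-ended question.
# The answer elements and point values are invented for illustration;
# a real scoring system lists the elements agreed on by the examiners.
SCORING_SYSTEM = {
    "names both advantages": 2,      # element meriting the most points
    "gives a concrete example": 1,   # partially correct element
    "mentions validity": 1,
}

# Maximum number of points for this question (stated next to the question).
MAX_POINTS = sum(SCORING_SYSTEM.values())

def score_answer(elements_found):
    """Return the points earned for the answer elements an examiner ticked."""
    return sum(points for element, points in SCORING_SYSTEM.items()
               if element in elements_found)

# Example: an answer containing two of the three listed elements.
print(score_answer({"names both advantages", "mentions validity"}))  # 3
```

Because every examiner works from the same list of elements and point values, two examiners marking the same answer should arrive at the same score, which is exactly the consistency the scoring system is meant to provide.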

Once you have marked a couple of tests, you will be able to refine your scoring system further; the answers given by students are a useful basis for this. It is also advisable to have a second examiner check any assessments that are close to the pass/fail borderline.

NOTE: Students have the right to inspect their own past examination papers (from the previous two years) in preparation for a resit and/or “to ascertain which standards were applied in assessing said examination” (OER Art. 12.5). Accordingly, you are required to retain both the students’ answer sheets and the scoring system for a period of two years. If so desired, you can post the scoring system on Blackboard so that it is accessible to students.

Sources:

- CITO group: Tests with open-ended questions. http://195.169.48.3/html/literatuur/openvraag.pdf

- Erkens, T. (2006). Toetsen met open vragen (Tests with open-ended questions). In: Berkel, H. van & Bax, A., Toetsen in het hoger onderwijs (Testing in higher education). Bohn Stafleu Van Loghum (ISBN 9031336394). http://195.169.48.3/html/literatuur/openvragen_toetsenho.pdf
