Designing a test

The first step of the test cycle is designing your test. Here you formulate your learning objectives, decide on the purpose of testing, and draw up a test plan to check whether your assessment program is in line with your learning objectives and your teaching activities.

  • Prerequisite: Formulating good learning objectives

    Objectives refer to learning outcomes: statements of what a learner is expected to know, understand and/or be able to do and demonstrate after completing a process of learning. Outcomes should be observable and measurable.

    Some tips:  

    1. Start with: At the end of … the students are able to… / … will show that they’re able to…
    2. Use action verbs (e.g. describe, analyse), terms that demonstrate that students have learned and achieved skills and competences at a specific proficiency level.
    3. Be as specific and clear as possible (see the small sketch after the list of resources below).

    Please watch the following video or read the following hand-outs to learn more about formulating good learning objectives:

    • Hand-out Writing learning outcomes: A very helpful, practical article about formulating learning objectives.
      In this hand-out you can also find information about Benjamin Bloom's taxonomy. Bloom identified several levels, each with a list of suitable verbs, for describing objectives. The levels are arranged from the least complex to the most complex levels of thinking. NB: we are not sure about the copyright; if you know more about it, we would like to give credit.
    • Writing good learning objectives: an article that explains what a learning objective is, why objectives are important, and that gives practical tips for formulating good learning objectives.
    • A very useful site about learning objectives (and much more) is the site of the Eberly Center at Carnegie Mellon University.
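
    To make tips 1 and 2 above concrete, here is a minimal Python sketch that combines the stem "At the end of … the students are able to …" with an action verb chosen per Bloom level. The verb lists are indicative (based on the commonly used revised taxonomy) and the course name and content are made-up examples, not taken from the hand-outs.

        # Minimal sketch: composing a learning objective from a stem plus an action verb.
        # The verb lists are indicative; the course name and content are illustrative assumptions.
        bloom_verbs = {
            "remember": ["define", "list", "name"],
            "understand": ["describe", "explain", "summarise"],
            "apply": ["apply", "use", "demonstrate"],
            "analyse": ["compare", "differentiate", "examine"],
            "evaluate": ["judge", "argue", "critique"],
            "create": ["design", "construct", "formulate"],
        }

        def learning_objective(course: str, level: str, content: str) -> str:
            """Build an objective: stem + action verb at the chosen Bloom level + specific content."""
            verb = bloom_verbs[level][0]
            return f"At the end of {course} the students are able to {verb} {content}."

        print(learning_objective("the statistics module", "apply", "a t-test to a two-group data set"))
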
  • Decide on the purpose of testing

    Before we think about how we should assess, what methods we will use or what we have to do, it is important to consider why we test in the first place. What is our purpose when we test?

    There are many purposes for testing, as stated on the homepage of this website, but here we will make a distinction between testing as a means of process evaluation and testing as a means of grading/judgement as an end evaluation.

    The difference between formative testing and summative testing can be explained like this:
    Formative assessments are in-process evaluations of student learning that are typically administered multiple times during a course/module. The general purpose of formative assessment is to give lecturers real-time feedback about what students are learning or not learning so that instructional approaches, teaching materials, and academic support can be modified accordingly. Formative assessments are usually not graded, and they may take a variety of forms, from more formal quizzes and assignments to informal questioning techniques and discussions with students.

    Summative assessments are used to evaluate student learning at the conclusion of a specific instructional period—typically at the end of a course/module. Summative assessments are graded tests, assignments, or projects that are used to determine whether students have learned what they were expected to learn during the defined instructional period.


    Interactive assessment tools 

    You can use interactive assessment tools to get insight into your students' understanding of the material. For example, during your lecture you can check at what level of understanding the students master the material, or see whether there are any misconceptions.


  • Choosing the right testing method

    In order to ensure that your learning design is sound, your learning outcomes or objectives should be in line with the assessment you use to test whether those outcomes have been achieved. In addition, both the learning outcomes and the assessment should be aligned with the teaching method. Biggs refers to this as “constructive alignment” (Biggs, 1999). The relationship between these three concepts can be pictured as a triangle; consequently it is often referred to as the “instructional triangle of learning designs”.


    The appropriate methods for testing …

    • provide students with adequate opportunity to demonstrate that they have achieved the learning goals
    • provide evidence that students have achieved the goals
    • assess and grade students in a reliable way

    What should you take into consideration when choosing an assessment method?

    • suitability and your own experience with the method
    • purpose (summative vs formative; motivating students; monitoring progress)
    • practicality and efficiency (workload: is it feasible given the costs, the time available for teachers and the rooms you have available? Is it too risky to do it in real life?)
    • the program: vision on education, policies on testing, Examination Rules (OER)

    Advice:

    • Use different types of assessment methods (to motivate students, to take learning styles into account, and to allow students to compensate for shortcomings)
  • Different testing methods

    There are many different testing methods you can use, as long as they fit your learning objectives and your teaching methods. In the following articles you can find more information about the different testing methods.

  • Test plan

    A test plan is also called an assessment plan, an assessment scheme or a test scheme; these are different words for the same thing.

    A test plan helps to make sure your test is valid: that you test what you want to test. It provides an overview of all tests involved in your course/module in relation to the learning outcomes/objectives of the course/module.

    A test plan provides you with a blueprint: a way to…

    • ensure that proper emphasis is given according to the importance of each of the objectives.
    • show the match between what should be learned and what is tested.
    • ensure the test is representative of all that should be learned.
    • ensure that we test at the intended level (Bloom).
    • help make sure that tests for a specific unit are similar (each year, re-sit).
    • help you to construct suitable items. 

    There are many formats you can use. Below is an example of a test plan for a course with the most basic information in it: the learning outcomes, all tests with their testing methods, the weight per test and (if there are any) special conditions. A small sketch after the module example shows how such weights combine into a final grade.

    This is another example, in a format that can be used for a module:

    Module part                  | Learning goals part | Relation to overall module goals | Testing method(s)
    1) Project                   | 1.1) …  1.2) …      | (1) (4) (6) *                    | A) Individual reflection; B) Group product; C) Individual presentation
    2) History and theories of … | 2.1) …  2.2) …      | (1) (5)                          | Written test, open questions

    * The overall module goals are stated elsewhere. NB: module goals can be tested in more than one part of the module.
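
    As a small illustration of how the weights in a test plan work out in grading, here is a minimal Python sketch that combines per-test grades into a course grade. The test names, weights and grades are made-up example values, not taken from the examples above.

        # Minimal sketch: combining test grades according to the weights in a test plan.
        # Test names, weights and grades are illustrative assumptions.
        test_weights = {
            "Written test, open questions": 0.60,   # weight per test, as listed in the test plan
            "Group product": 0.30,
            "Individual presentation": 0.10,
        }

        grades = {
            "Written test, open questions": 7.5,    # hypothetical grades on a 1-10 scale
            "Group product": 8.0,
            "Individual presentation": 6.5,
        }

        assert abs(sum(test_weights.values()) - 1.0) < 1e-9, "the weights should add up to 100%"

        # Weighted average over all tests in the plan.
        final_grade = sum(weight * grades[test] for test, weight in test_weights.items())
        print(f"Final course grade: {final_grade:.2f}")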

     

  • Test specification table

    A test specification table is also called a test matrix.

    A test specification table helps to ensure that there is a match between what should be learned (objectives), what is taught and what is tested. It also ensures transparency (for yourself but also for colleagues) and repeatability.
    A test specification table zooms in on an individual (written) test. For assignments such as a report or a presentation, a test specification table isn't necessary.

    Lecturers cannot measure every topic or objective and cannot ask every question they might wish to ask. A test specification table allows you to construct a test which focuses on the key areas and weights those different areas based on their importance. A test specification table provides you with evidence that a test has content and construct validity.

    There are many possible formats for a test specification table. You need to indicate, for each individual question, the learning objective it relates to, the question format (open, closed, essay, etc.), the score/number of points per question, and the weight per learning objective and/or per question. You can also add the book/lesson material the question belongs to; that way you immediately know whether you need to adjust your written exam if you change anything in your lesson material. However, this is not obligatory. (A small sketch at the end of this section shows how such a table can be checked against the intended weights.)

    Below you'll find an example of a test specification table:

     

    You can also use this format for making your own test specification table.
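
    If you keep the test specification table in a structured form, you can check automatically whether the points per question add up to the intended weight per learning objective. A minimal Python sketch, with made-up questions, formats, points and weights:

        # Minimal sketch of a test specification table as a data structure, with a check
        # that the points assigned per learning objective match the intended weights.
        # Questions, formats, points and weights are illustrative assumptions.
        questions = [
            # (question number, learning objective, question format, points)
            (1, "objective 1", "closed", 2),
            (2, "objective 1", "closed", 2),
            (3, "objective 2", "open", 6),
            (4, "objective 3", "essay", 10),
        ]

        intended_weights = {      # weight per learning objective, as planned
            "objective 1": 0.20,
            "objective 2": 0.30,
            "objective 3": 0.50,
        }

        total_points = sum(points for _, _, _, points in questions)

        # Points actually assigned per objective in this test.
        points_per_objective = {}
        for _, objective, _, points in questions:
            points_per_objective[objective] = points_per_objective.get(objective, 0) + points

        for objective, intended in intended_weights.items():
            actual = points_per_objective.get(objective, 0) / total_points
            print(f"{objective}: intended {intended:.0%}, actual {actual:.0%}")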

  • Website

    score.hva.nl

    On score.hva.nl, the University of Applied Sciences of Amsterdam presents very practical (Dutch-language) information about almost every aspect of assessment.

  • Student involvement in assessment

    Students learn a lot when they are involved in their own learning process (Vermunt & Sluijsmans, 2015). Likewise, they can learn a lot when you involve them in the design of your teaching, including the assessments (Bron & Veugelers, 2014).

    7 ways to involve students in summative assessment

    The University of Applied Sciences of Amsterdam lists 7 ways to involve students in summative assessment:

    1. Involve students in formulating the learning goals
    2. Involve students in formulating the assessment criteria
    3. Involve students in the choice of assessment methods
    4. Self-assessment by students
    5. Peer-assessment by students
    6. Evaluating the tests by giving students feedback.

    Here you can find more detailed information about every aspect (Dutch).