5 tips for scaling up your EPA Knowledge Tests


Whether you are introducing new EPAs or considering how to scale up your existing portfolio, chances are you will be perfecting your process for developing paper or online tests.

These may only make up a small proportion of your overall assessments, but they are likely to contribute significantly to your EPA resource allocation.

By choosing the right tools and project management methods you can make best use of your budget so that your tests remain valid, reliable, and comparable as you grow.

Here are five recommendations for developing or scaling up your EPAO’s examined components. They are drawn from our experience in helping awarding organisations in the UK and around the world improve efficiency in developing high-stakes tests.

Choose a flexible delivery format

If you are moving your tests online, you may be tempted to use your delivery system’s authoring tool to develop your content. Our advice is to store your test content in a separate, specialist system.

Authoring capabilities aside, storing content in a test delivery system is restrictive because it binds you to using one provider over the long term. Keeping your content in a separate and secure system with a flexible output option means you can choose the best-value system as your testing approach changes or as better-value vendors enter the market.

It’s also highly likely you’ll need to offer printed paper options for apprentices with access requirements or in less well-connected locations. Your authoring system should therefore allow for dual output – e-test and print – so you don’t waste time re-keying the same content into different systems.

In fact, a system that allows you to automate some or all of your typesetting tasks for print would be a distinct advantage and could significantly reduce your design costs.

Adopt a system that prepares you for EQA inspection

Now that your assessments are subject to External Quality Assurance by Ofqual or OfS, there’s an increased need to consider how you track quality assurance at each stage of the test development process.

We recommend you choose an authoring system that allows you to keep an audit trail of all reviewer and approver activity, as this will significantly reduce your workload.

Your system should support:

  • Version control – so test developers know they are using the most up-to-date items;
  • QA – so approvers know that items have been thoroughly checked against set criteria before they are published – and that your tests effectively assess candidate knowledge across the entire standard;
  • Security control – so you can show evidence that authors and reviewers are not able to leak draft tests to candidates – and take swift action if you identify a breach;
  • Usage tracking – so project managers can check who has accessed specific content if a conflict of interest is declared.

Bank your items for longer term efficiency and cost reduction

In these early days of EPAs it’s likely you’ll be creating new tests as they are needed. But over time you’ll appreciate having an authoring system that offers you the flexibility to:

  • Add new items to a bank from multiple contributors;
  • Create new versions of old items as workplace regulations change;
  • Add performance data and reject items that are not differentiating adequately;
  • Re-use older items – either in live tests, ensuring re-sit candidates have not seen them before, or as practice tests.

Adopting an item banking model will allow you to do all of the above. It can also support more cost-effective and streamlined trialling of questions – something you will likely need to do in future if you’re not doing so already.

With the right system you can create separate workflows for items destined for trials, upload facility and discrimination values back into the system after trialling, and include pre-tested items as anchors in future tests.
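To illustrate the kind of statistics uploaded after a trial, here is a minimal sketch of how facility (the proportion of candidates answering an item correctly) and discrimination (the point-biserial correlation between item score and total test score) might be computed from trial responses. The data, function names, and the 0.2 flagging cut-off are illustrative assumptions, not features of any particular authoring system.

```python
def facility(item_scores):
    """Facility (p-value): proportion of candidates answering correctly."""
    return sum(item_scores) / len(item_scores)

def discrimination(item_scores, total_scores):
    """Point-biserial correlation between item score (0/1) and total score."""
    n = len(item_scores)
    mean_i = sum(item_scores) / n
    mean_t = sum(total_scores) / n
    cov = sum((i - mean_i) * (t - mean_t)
              for i, t in zip(item_scores, total_scores)) / n
    sd_i = (sum((i - mean_i) ** 2 for i in item_scores) / n) ** 0.5
    sd_t = (sum((t - mean_t) ** 2 for t in total_scores) / n) ** 0.5
    return cov / (sd_i * sd_t)

# Hypothetical trial data: each row is a candidate, each column an item
# (1 = correct, 0 = incorrect).
responses = [
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 1, 1],
    [1, 1, 0],
]
totals = [sum(row) for row in responses]

for col in range(3):
    scores = [row[col] for row in responses]
    p = facility(scores)
    d = discrimination(scores, totals)
    flag = "review" if d < 0.2 else "ok"  # 0.2 is an illustrative cut-off
    print(f"item {col + 1}: facility={p:.2f}, discrimination={d:+.2f} ({flag})")
```

In practice these values would come back from a delivery or trialling platform; the point is that an item bank gives you somewhere structured to store them against each item so weak items can be flagged and revised.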

Take the strain out of project management

Knowledge Tests are scalable EPA methods but developing an error-free, valid test can suck up time and resource. After all, developing any high-stakes test is one of the hardest jobs in publishing.

Tools that facilitate project management or automate parts of the process can therefore be extremely valuable.

Specialist software enables you to automate how content moves through the review cycle and track contributions along the way. We have seen that using this type of functionality can significantly reduce the time it takes to create an exam paper.

This is one of the reasons why specialist software trumps standard office packages when used to develop exams. I offer a few more in this LinkedIn post.

Get ready to enhance your test format

Right now, your examined components may just consist of standard multiple-choice questions, but over time there may be opportunities to design more sophisticated tests. For example, you might want to:

  • Include images, diagrams, videos, or audio in your tests;
  • Include additional material, such as technical information, that candidates look up alongside your questions;
  • Add stimulus for a set of questions such as a case study;
  • Include more open-ended questions, introducing the need for mark schemes and grade descriptors.

If you foresee doing any of these in future, you’ll need to consider how to adapt your digital infrastructure.

For example, how will you:

  • Keep track of copyright clearance and asset usage?
  • Manage ‘composite’ items (items comprising a stimulus and associated questions) while they are in development?
  • Manage audio transcripts and brief recording artists?
  • Hold together the different components of an item (question, mark scheme, and additional materials) as it travels through a review cycle?
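To make the “composite item” idea concrete, here is a minimal sketch of how such an item might be modelled so that its stimulus, questions, mark schemes, and media assets travel through the review cycle as one versioned unit. All class names, fields, and status values are hypothetical, offered only to show the shape of the problem.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """A media asset with its copyright clearance status (illustrative)."""
    filename: str
    rights_cleared: bool = False

@dataclass
class Question:
    """A single question with its mark scheme kept alongside the stem."""
    stem: str
    mark_scheme: str
    max_marks: int

@dataclass
class CompositeItem:
    """A stimulus plus its associated questions, versioned as one unit."""
    stimulus: str
    questions: list[Question] = field(default_factory=list)
    assets: list[Asset] = field(default_factory=list)
    version: int = 1
    status: str = "draft"  # e.g. draft -> in_review -> approved

# A case study with two questions and one diagram moves through review
# as a single object, so nothing gets orphaned mid-cycle.
item = CompositeItem(
    stimulus="Case study: a fault report from a hydraulic press.",
    questions=[
        Question("Identify the likely fault.", "1 mark for seal failure", 1),
        Question("Describe the safe isolation procedure.", "Level-based scheme", 4),
    ],
    assets=[Asset("press_diagram.png", rights_cleared=True)],
)
item.status = "in_review"
```

The benefit of holding the pieces together like this is that a reviewer approving version 2 of the stimulus is necessarily looking at the questions and mark schemes that go with it, rather than chasing them through separate documents.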

Good specialist authoring software offers a solution to each of these requirements, as well as many other development issues that may arise as your EPA gets established.

A final thought

A specialist exams authoring system could add efficiency to your overall EPA assessment portfolio, not just your Knowledge Tests. It can act as a secure place to hold, and collaborate on, the development of your entire set of grade descriptors.

By housing your KSB themes, observation guidelines and assessor interview questions in the same system as your question bank, you can maintain coherence in your EPA portfolio and create opportunities for linking different assessment methods later on.

If you would like to learn more about how a specialist assessment development service could benefit your EPAO, contact me for a no-obligation online demonstration.

Or, if you are attending the FAB EPA conference in Warwick on 19th May, visit the GradeMaker stand and we’ll show you how our technology can deliver some of the benefits summarised in this post.

Shaun Crowley

Shaun is Head of Sales and Marketing at GradeMaker, helping assessment providers to improve the quality, efficiency, and security of their exam operation.

Prior to joining GradeMaker, Shaun worked for Oxford University Press in the Education and English Language Teaching divisions, introducing curriculum services and resources to schools around the world. He was a founding member of the leadership team responsible for launching international school qualifications through OxfordAQA, OUP’s joint venture with AQA, in 2015.
