Case Study 11 – Assessing the Knowledge of the People Managing and Working in the Water Industry – Learning & Development Associates

Background

The Office for National Statistics indicates that the number of jobs in the water supply, sewerage, waste and remediation sector has risen steadily since 2008, with a particularly steep rise of 19,000 between 2022 and 2023. It is predicted to rise further in PR24 / AMP8, with a step change in both capital investment and consequent staffing numbers, said to rise by 44,000. This, coupled with the skills shortage that seems to be affecting the water industry particularly, means that training requirements in the sector are increasing dramatically. The inclusion of additional subject areas in qualifications, such as Adaptation to Climate Change, and regulator-led imperatives like Innovation and Resilience put further pressure on training and the subsequent assessment of staff.

Many apprenticeship qualifications are specific about the assessment methodologies to be used, covering direct observation and portfolios of evidence for practical skills, and direct questioning or multiple-choice questions for the knowledge-based parts of the qualification.

There are other training needs, such as staff requiring non-vocational qualifications and those undertaking refresher training, perhaps as part of a licence to operate. Training providers can use the outputs from assessments as part of competency checks, where multiple-choice questioning, usually in the form of an end test, is seen as a cost-effective way to check a candidate’s knowledge.

Innovation

L&DA, in common with other providers, made a rapid transition to online delivery of courses during lockdown. Our colleagues had to adapt to our own online delivery platform as well as to companies’ own or preferred systems, some of which presented unique challenges in their operability. The experience gained and the rapid increase in expertise gave L&DA the confidence to embark on the development of an IT-based solution to address the limitations of current testing methods. This was undertaken in tandem with the provision of water treatment training for the subsidiary of a major UK water company.

A system was designed in-house on a Windows-based platform, with the ability to provide bespoke question papers using a mixture of multiple-choice, ordering and table-based questions, including short written answers. The tests were undertaken by candidates on their laptops or mobile phones, with onsite invigilation by L&DA staff.

On completion of the test, most questions were automatically marked by the system, with a manual intervention for assessment of written or tabular questions. This allowed candidates to be advised of their scores before leaving the test.

A major benefit of this system was its ability to provide data not only on the performance of individual candidates, but also on their performance in different areas of the syllabus.
Other benefits of this system were:

  • The client had an input into the questions to make them more relevant to their needs
  • Management reports from the training events could be provided quickly
  • The client had an input into the contents and style of the management report

The client was delighted with the functionality of the system and the information it provided. Testing the market for this product showed significant interest from a number of clients. However, from the L&DA perspective, the system was labour intensive in providing the data and lacked some functionality.

The new KAT™ system

Armed with a new set of design requirements, L&DA staff set about finding a different platform to host a new bespoke system. The Knowledge Assessment Tool (KAT™) was designed to fulfil all of the requirements of the original system, but with greater flexibility and some new features to make deployment of the tool more straightforward. Significant cost savings were predicted for both the company and the client. This system was in development whilst the pilot system was being used. An opportunity to use the KAT™ commercially presented itself when the company was contracted by a UK water company to provide a level 3 knowledge qualification in Wastewater Treatment. It was agreed with the client that it would be valuable to run the KAT™ both before and after the training. This would give an indication of the current level of knowledge, with the post-course assessment showing both the enhanced level of knowledge and the value added by the training. Undertaking the pre-course KAT™ would also allow the course content to be modified in line with the areas where scores were weakest.

The portal can select random questions to formulate a test paper, or allow the administrator to select bespoke questions, and can present the questions in a random order to each candidate to limit the opportunities for copying.
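The selection and per-candidate randomisation described above can be sketched as follows. This is a minimal illustration only: the question bank, category names and seeding scheme are assumptions, not the published KAT™ implementation.

```python
import random

# Hypothetical question bank; categories are illustrative wastewater topics.
CATEGORIES = ["Preliminary treatment", "Primary settlement",
              "Secondary treatment", "Tertiary treatment", "Sludge handling"]
question_bank = [
    {"id": i, "text": f"Question {i}", "category": CATEGORIES[i % len(CATEGORIES)]}
    for i in range(60)
]

def build_test_paper(bank, num_questions, seed=None):
    """Select a random subset of questions to form the test paper."""
    rng = random.Random(seed)
    return rng.sample(bank, num_questions)

def paper_for_candidate(paper, candidate_id):
    """Present the same questions in a different order for each candidate,
    limiting the opportunity for copying."""
    rng = random.Random(candidate_id)  # per-candidate deterministic shuffle
    shuffled = list(paper)
    rng.shuffle(shuffled)
    return shuffled

paper = build_test_paper(question_bank, 30, seed=42)
alice = paper_for_candidate(paper, "alice")
bob = paper_for_candidate(paper, "bob")
# Both candidates receive the same 30 questions, in different orders.
```

Seeding the shuffle on a candidate identifier means each candidate's ordering is reproducible, which is convenient if a candidate reconnects mid-test.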

Case Study 11 - KAT example question

Deployment of the KAT™

Case Study 11 - figure 2

For this deployment on the level 3 wastewater qualification, the system selected a random set of questions to form the test paper. In this case the client was invited to comment on the questions and decide whether the same set of questions would be used in the post-training KAT™. The client decided to use the same questions in order to check the improvement in knowledge more accurately.

The candidates were invited by MS Teams to join a meeting hosted by the invigilator from which they followed a link to the assessment portal. The invigilator was also logged into the system and could monitor individuals’ progress throughout the test and see the current marks allotted to each candidate.

Since candidates were using laptops or mobile phones, the camera facility was enabled so that the invigilator could see them from his or her location. After a brief introduction, the candidates individually started their own assessments, moving at their own pace through the questions, guided by a countdown timer shown on the screen. The question paper auto-submits on expiry of the timer. There was some concern that candidates might attempt to use artificial intelligence to answer the questions. This can be controlled to some degree by careful selection of the test duration; in addition, a feature can be set to warn the invigilator that a candidate has switched screens away from the test. It also warns the candidate that their behaviour has been noted, as per the warning shown.
Case Study 11 - figure 3

On completion of the test, the invigilator could view pre-set management information reports or export data to Excel to produce bespoke reports. The client had requested that free-text questions be included, which the system is unable to mark automatically. A subject matter expert (SME) therefore accessed the portal to view the written answers and manually assign a mark, which was incorporated into the overall score fed back to the client.
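The marking flow, automatic marks for closed questions plus the SME's manual marks for free-text answers, could be combined along these lines. The field names, mark values and marks-available figures below are assumptions for illustration.

```python
def combined_score(auto_marks, manual_marks, max_marks):
    """Merge automatically marked questions with SME-assigned marks for
    free-text answers, returning an overall percentage for the candidate."""
    total = sum(auto_marks.values()) + sum(manual_marks.values())
    return round(100 * total / max_marks, 1)

# Hypothetical candidate: 25 auto-marked questions (1 mark each),
# plus 5 free-text questions marked by the SME out of 2 marks each.
auto = {f"q{i}": 1 for i in range(1, 21)}          # questions answered correctly
auto.update({f"q{i}": 0 for i in range(21, 26)})   # questions answered incorrectly
manual = {"q26": 2, "q27": 1, "q28": 2, "q29": 0, "q30": 2}

score = combined_score(auto, manual, max_marks=25 + 10)
# 20 auto marks + 7 SME marks = 27 of 35 available marks
```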

At the follow up meeting held the next day, the client was impressed with the flexibility of the system to provide the data so quickly, in the form that they wanted. The main area of interest was the individual candidate performance, but the spread of knowledge and individual areas of weakness were also of interest. The tutor also attended the meeting to get a view of the weakest subject areas in order to make slight adjustments to the delivery plan to cover those areas.

The training course was delivered face to face to two cohorts over the following two weeks. Previous discussions with the client had indicated their wish for a period of time to elapse before the final assessment was undertaken.

The follow-up KAT™ was arranged and undertaken in the same way as before, using a mixture of laptops, tablets, mobile phones and PCs. Issues experienced in the initial KAT™, such as video loss, extraneous noise and data connection problems, were mitigated before this, possibly more important, final assessment took place. The data analysis took place immediately, and various tables and graphs were auto-generated, a small sample of which is shown below:

The Results

Case Study 11 - figure 4

The graph shows the average score of all the candidates on each of the 30 questions that were set.

It also classifies each question as Easy, Medium or Difficult.
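One simple way to derive such a classification is to band each question by the group's average score on it. The thresholds below are illustrative assumptions, not the published KAT™ bands.

```python
def classify_question(avg_pct):
    """Class a question as Easy, Medium or Difficult from the group's
    average score on it. Band thresholds are assumed for illustration."""
    if avg_pct >= 75:
        return "Easy"
    if avg_pct >= 40:
        return "Medium"
    return "Difficult"

# Hypothetical average scores (percent) for three questions.
avg_scores = {"q1": 92.0, "q2": 55.0, "q3": 31.0}
classes = {q: classify_question(s) for q, s in avg_scores.items()}
```

Banding on the group average, rather than on a question author's judgement, lets the difficulty rating adjust itself as more cohorts sit the test.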

The table below shows the results from one candidate’s pre course assessment. This is shown as an overall percentage and also as a breakdown of score per category or subject of question.
Case Study 11 - figure 5

This table shows the results from the same candidate’s end test assessment. It shows a dramatic rise in the score and the categories or subjects that were most improved.

Case Study 11 - figure 6

The final graph is one example of a number of graphs available to demonstrate group performance, showing the percentage of the group that answered each question correctly. The original scores, in green, are compared with the final KAT™ scores, in blue, giving a view of the value of the training and of the areas where candidates’ current level of knowledge may need addressing further in the next round of interventions.

Case Study 11 - figure 7
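The pre/post comparison behind a graph of this kind can be computed from the two sets of results: for each question, the percentage of the group answering it correctly before and after training. The data layout below is an assumption for the sketch.

```python
def pct_correct_per_question(results):
    """results: list of per-candidate dicts mapping question id -> bool.
    Returns the percentage of the group answering each question correctly."""
    questions = results[0].keys()
    n = len(results)
    return {q: round(100 * sum(r[q] for r in results) / n, 1) for q in questions}

# Hypothetical two-candidate cohort, before and after training.
pre = [{"q1": True, "q2": False}, {"q1": False, "q2": False}]
post = [{"q1": True, "q2": True}, {"q1": True, "q2": False}]

pre_pct = pct_correct_per_question(pre)
post_pct = pct_correct_per_question(post)
improvement = {q: post_pct[q] - pre_pct[q] for q in pre_pct}
```

Plotting `pre_pct` and `post_pct` side by side per question gives the green/blue comparison described above, and `improvement` highlights where the training added most value.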

Benefits of KAT™

  • Invigilator and candidates can be remote from each other.
  • Candidates can undertake the test without travelling to a central location.
  • The ability to use laptops and mobile phones means that the test can be undertaken on site, at home or even in a vehicle. The only requirement is a reliable data connection, and 4G data can be used.
  • Candidate travelling time and costs are not incurred.
  • Invigilation costs are reduced from both the time and travelling aspect.
  • Certificates can be produced automatically, branded as required for either L&DA or the client.
  • Results are available immediately after the KAT™.

Quality Assurance

The key to the success of knowledge assessment is the questions. Their type, validity, difficulty, wording and number all have a bearing on the accuracy of the assessment.

Standardisation of question writing has been undertaken by following industry guidelines to prepare procedures outlining the requirements for the various types of questions:

  • Single Best Answers (SBA)
  • Multiple Best Answers (MBA)
  • Very Short Answer Questions (VSAQ)
  • Short Answer Questions (SAQ)

All of these can be used with the tool.

The KAT™ currently has seven pathways at levels 3 and 5:

  • Water Production
  • Water Network
  • Wastewater Network
  • Wastewater Treatment
  • Asset Failure, Incident and Event Management
  • Developer Services Management
  • Adaptation to Climate Change

Summary

The flow chart below illustrates the incorporation of KAT™ into training delivery:

Case Study 11 - figure 8

The future

There is significant interest in the tool, not just from the water industry. It can be used in any situation where remotely managed testing takes place, providing there are sufficient questions. Working with another client, we have included mid-course knowledge assessments to check learning throughout the course. The current focus for L&DA is their level 3 and 5 qualifications in Water Production and Wastewater Treatment, with Water Networks and Wastewater Networks in development. The use of in-course assessments has started, and we look to increase this through brief “smartphone tests” to check on progress during longer courses and gauge candidate satisfaction.

L&DA are also working with other training providers to carry out online assessment on their behalf. The tests can be branded to the clients’ requirements for both test and results production, including certificates.

If questions can be written on the subject, KAT™ can assess it!
