The role of AI in English assessment

Jennifer Manning

Digital assessment has become increasingly widespread in recent years. But what's the role of digital assessment in teaching today? We'd like to give you some insight into digital assessment and automated scoring.

Just a few years ago, there may have been doubts about the role of AI in English assessment and the ability of a computer to score language tests accurately. But today, thousands of teachers worldwide use automated language tests to assess their students' language proficiency.

For example, Pearson's suite of Versant tests has been delivering automated language assessments for nearly 25 years. And since its launch in 1996, over 350 million tests have been scored. The same technology is used in Pearson's Benchmark and Level tests.

So what makes automated scoring systems so reliable?

Huge data sets of exam answers and results are used to train machine learning models to score English tests the same way that human markers do. This way, we're not replacing human judgment; we're teaching computers to replicate it.

Of course, computers are much more efficient than humans. They don't mind monotonous work and they don't make careless mistakes: the standard marking error of an AI-scored test is lower than that of a human-scored test. So we can get unbiased, accurate, and consistent scores.
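To make the idea concrete, here is a minimal, hypothetical sketch of that workflow: a simple model is fitted to synthetic "human-marked" responses and then checked for agreement with human scores on held-out data. The features, data and model are invented for illustration and are not the scoring engine behind Versant.

```python
# A minimal, hypothetical sketch of automated scoring: fit a model to
# human-marked responses, then check how closely it agrees with human
# markers on unseen responses. All data here is synthetic.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-response features (e.g. fluency, vocabulary range,
# grammatical accuracy) and the scores human markers awarded.
n_responses = 5000
features = rng.normal(size=(n_responses, 3))
human_scores = 50 + features @ np.array([6.0, 3.0, 4.0]) + rng.normal(scale=3.0, size=n_responses)

X_train, X_test, y_train, y_test = train_test_split(
    features, human_scores, test_size=0.2, random_state=0
)

# "Teach the computer to replicate human judgment": fit on human marks.
model = Ridge(alpha=1.0).fit(X_train, y_train)
machine_scores = model.predict(X_test)

# Agreement with human markers on held-out responses.
print("correlation with human scores:", round(float(np.corrcoef(machine_scores, y_test)[0, 1]), 3))
print("mean absolute difference:", round(float(mean_absolute_error(y_test, machine_scores)), 2))
```

In practice the features would come from speech recognition and language analysis of real test responses, but the principle is the same: the model is only ever trained to reproduce the judgments human markers have already made.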

The top benefits of automated scoring are speed, reliability, flexibility, and freedom from bias.

Speed

The main advantage computers have over humans is that they can quickly process complex information. Digital assessments can often provide an instant score turnaround. We can get accurate, reliable results within minutes. And that's not just for multiple-choice answers but for complex responses, too.

The benefit for teachers and institutions is that they can have hundreds, thousands, or tens of thousands of learners taking a test simultaneously and instantly receive a score.

The sooner you have scores, the sooner you can make decisions about placement and students' language levels, or benchmark a learner's strengths and weaknesses and adjust learning in ways that drive improvement and progress.
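As a small, hypothetical illustration of that placement step, the snippet below maps a numeric score to a class level. The band names and cut-off scores are made up for this example; real placement rules depend on the test and the institution.

```python
# Hypothetical placement bands: (minimum score, maximum score, level).
PLACEMENT_BANDS = [
    (0, 29, "Beginner"),
    (30, 49, "Elementary"),
    (50, 69, "Intermediate"),
    (70, 100, "Advanced"),
]

def place_learner(score: int) -> str:
    """Return the class level a learner would be placed into."""
    for low, high, level in PLACEMENT_BANDS:
        if low <= score <= high:
            return level
    raise ValueError(f"Score out of range: {score}")

print(place_learner(63))  # -> Intermediate
```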

Flexibility

The next big benefit of digital assessment is flexible delivery. This has become increasingly important as online learning has become more prominent.

Accessibility became key: how can your institution provide access to assessment for your learners if you can't deliver tests on school premises?

The answer is digital assessment.

For example, Versant, our web-based test, can be delivered online or offline, on-site or off-site. All test-takers need is a computer and a headset with a microphone. They can take the test anywhere, at any time of day, any day of the week, making it easy to fit into someone's schedule or situation.

Free from bias

Impartiality is another important benefit of AI-based scoring. The AI engine used to score digital proficiency tests is completely free from bias. It doesn't get tired, and it doesn't have good and bad days like human markers do. And it doesn't have a personality.

While some human markers are more generous and others stricter, AI is always equally fair. Thanks to this, automated scoring provides consistent, standardized scores, no matter who's taking the test.

If you're testing students from around the world, with different backgrounds, they will be scored solely on their level of English, in a perfectly objective way.

Additional benefits of automated scoring are security and cost.

Security

Digital assessments are more difficult to monitor than in-person tests, so security is a valid concern. One way to deal with this is remote monitoring.

Remote proctoring adds an extra layer of security, so test administrators can be confident that learners taking the test from home don't cheat.

For example, our software captures video of the test-taker, and the AI detection system automatically flags suspicious test-taker behavior. Test administrators can access the video at any time for audits and reviews, and easily find the suspicious segments highlighted by our AI.

Here are a few examples of suspicious behavior that our system might flag (a simplified code sketch follows the list):

Image monitoring:

  • A different face or multiple faces appearing in the frame
  • Camera blocked

Browser monitoring:

  • Navigating away from the test window or changing tabs multiple times

Video monitoring:

  • Test taker moving out of camera view
  • More than one person in the camera view
  • Looking away from the camera multiple times
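To make this more concrete, below is a simplified, hypothetical sketch of how rule-based flags for events like these could work. It is not the detection system described above: the event names, thresholds and logic are invented for illustration only.

```python
# A simplified, hypothetical sketch of rule-based flagging for the kinds
# of monitoring events listed above. Real proctoring relies on ML-based
# face, gaze and browser monitoring; everything here is invented.
from dataclasses import dataclass

@dataclass
class MonitoringEvent:
    timestamp: float   # seconds into the test session
    kind: str          # e.g. "faces_in_frame", "camera_blocked", "tab_change", "gaze_away"
    value: int = 0     # e.g. number of faces detected in the frame

def flag_session(events: list[MonitoringEvent],
                 max_tab_changes: int = 3,
                 max_gaze_away: int = 5) -> list[str]:
    """Return human-readable flags for a proctored test session."""
    flags = []
    tab_changes = sum(1 for e in events if e.kind == "tab_change")
    gaze_away = sum(1 for e in events if e.kind == "gaze_away")

    for e in events:
        if e.kind == "faces_in_frame" and e.value != 1:
            flags.append(f"{e.timestamp:.0f}s: expected one face, saw {e.value}")
        if e.kind == "camera_blocked":
            flags.append(f"{e.timestamp:.0f}s: camera blocked")
    if tab_changes > max_tab_changes:
        flags.append(f"navigated away from the test window {tab_changes} times")
    if gaze_away > max_gaze_away:
        flags.append(f"looked away from the camera {gaze_away} times")
    return flags

# Example: two faces in the frame and repeated tab switching get flagged.
events = [MonitoringEvent(12.0, "faces_in_frame", 2)] + [
    MonitoringEvent(30.0 + i, "tab_change") for i in range(4)
]
print(flag_session(events))
```

In a production system these flags would point reviewers to the matching video segments rather than being printed, but the principle is the same: the AI narrows down what a human administrator needs to look at.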

Cost

Last but not least, cost is another benefit of automated English assessment. Automated scoring can be a more cost-effective way of marking tests, primarily because it saves time and resources.

Pearson English proficiency assessments are highly scalable and don't require extra time from human scorers, no matter how many test-takers you have.

Plus, there¡¯s no need to spend time and money on training markers or purchasing equipment.

AI is helping to lead the way in efficient, accessible, fair and cost-effective English test marking and management. Given time, it should develop even further, becoming more advanced and even more helpful to the world of English language learning and assessment.
