“They can barely talk about the time of day, let alone go into complex details about their dissertation research or…understand the feedback they’ve got [on] their assignment.”
These are the concerns of an associate professor at one of London’s top universities about some of his international students. “You can hear they’re talking Chinese, then you come closer and then they stop talking. Now, is that because they’re lacking confidence, or is it because they don’t speak very good English?” he asked. And in some cases, he strongly suspects that it is the latter: “How did they manage to get 6.5 in IELTS? When they’re sitting in front of me, they clearly don’t have that qualification.”
Confiding such concerns is not easy, the academic confessed. They don’t apply to all students, and he doesn’t want to come across as racist or Sinophobic. But he is far from the only one quietly asking these questions. As UK universities, facing rising costs and stagnating domestic income, have become more dependent than ever on foreign fee income, there are concerns that, somewhere along the way, quality has faltered.
The growing noise around this is particularly uncomfortable for the language testing organisations that gatekeep international education by assessing students’ English abilities. This industry has changed significantly worldwide over the past five years, with one disruptor in particular forcing incumbents to reassess how they operate – and bringing attention to the question of whether convenience and quality are compatible.
The online language learning behemoth Duolingo’s English test (DET) rose to prominence during the pandemic. When lockdowns began and traditional companies were left floundering after their test centres were forced to close, the language learning app offered students the chance to prove their competencies from their bedrooms, and universities rushed to adopt it under temporary emergency policies even though its use of remote proctoring was relatively untested.
Given how consequential language tests can be to the trajectory of an individual’s life, “the test provider has a responsibility to rigorously validate their tests,” said Talia Isaacs, professor of applied linguistics and TESOL at UCL. “Some of that research is done in-house, but that should be complemented by research undertaken by independent researchers who are external to the testing organisation.”
Regarding the DET, she added that “at the time of the pandemic, there was very limited validity evidence to support the use and interpretation of test scores for higher education admissions purposes. However, universities were in a situation where they needed to adopt some kind of alternative. This was a ready-to-go option and so many did [adopt the DET].”
Moreover, the benefits of the DET soon became clear: the test is quicker and less expensive than most traditional tests and can be taken at home, making it accessible to more students. “Universities like the fact that they can recruit talent from markets that are less served by the test centre infrastructure,” said Michael Lynas, UK country director at Duolingo. “I think that’s good for students; it’s good for universities. It’s probably less good for incumbent testing companies that have business models that are based on this.”
More than five years on from the first lockdowns, DET still may not have usurped the traditional language testing giants – namely, Cambridge University Press & Assessment, IDP and the British Council, which jointly own and operate IELTS; and ETS, which operates TOEFL – but it has certainly established a strong foothold in the market. It is now accepted by more than 6,000 institutions globally, including all US Ivy League colleges and more than 50 UK universities.

At the same time, DET continues to be dogged by accusations of academic inferiority – a claim it firmly denies. For instance, one report quoted academics who claimed their foreign students were performing worse in class after being admitted to university using Duolingo tests.
Other test companies started commissioning their own research, too. A 2025 study by researchers from the universities of Cambridge and Dundee, as well as the British Council and Cambridge University Press & Assessment, found that Duolingo’s test was “viewed with scepticism” among academic staff, with concerns raised about “the validity, security, and overall suitability of these newer, more efficient or less established tests”.
Studies that go beyond perceptions and probe DET’s actual efficacy are scarcer, partly because large-scale data about DET-takers’ subsequent academic performance has only become available recently. One study found that university students who arrive with higher DET overall scores do go on to achieve higher academic grades in their first year of study than do lower DET performers. However, it also found that students accepted with DET experienced lower academic success than those accepted with IELTS or TOEFL iBT.
On the other hand, another study found that there were no statistically significant differences between the mean grades of students who had entered a US institution through IELTS, TOEFL or DET. Similarly, other research shows that the performance of graduate students admitted with DET is comparable to that of those admitted with traditional exams. Some researchers argue that language proficiency is only one of many factors that affect academic success anyway.
Meanwhile, as Duolingo attempts to bolster the evidence base for DET’s effectiveness, other tests on the market have begun to change in response to DET’s emergence.
“The established test providers sometimes have felt compelled – because of this market pressure and competition – to either rapidly develop a brand new test that mimics the shorter, cheaper, more accessible fully automated tests of their competitors, or to replace tasks on their existing test with new tasks that are less resource-intensive to administer and score,” said Isaacs. “They often claim that there are no differences in quality and may even generate validity evidence to support that, but, of course, there are trade-offs.”
While the emergence of new providers may have driven innovation, with tests becoming cheaper and shorter, Karen Ottewell, director of academic development and training for international students at the University of Cambridge, worries that we are seeing a “race to the bottom” among test providers.
“If it’s cheap and quick, it probably isn’t the quality that the universities are looking for,” she said.
One issue is that automated testing is likely to demand less discursive answers from test takers and may therefore be a less effective assessment of overall communicative ability. Sara Cushing, professor of applied linguistics at Georgia State University, noted that the TOEFL test developed in the 1960s was focused on grammar and vocabulary, with questions centred around short sentences and paragraphs – much like DET today. Hence, admitted students were “great at grammar and vocabulary” but some of them “couldn’t write more than a paragraph, if that. They couldn’t speak very well or understand what was being said to them. So it was teachers who pushed back and said, ‘We need a test that will really let us know whether these students are ready.’”
The introduction of TOEFL iBT in 2005 thus included longer reading and listening passages, along with speaking and writing. “I think what we may see in another 10 years is that if all the tests become more like DET, then we’re going to have the same problem,” Cushing said. “Students aren’t going to come to campus prepared to do the things they need to do to be successful.”
At the same time, academics stress that universities also have a level of responsibility, not only to demand rigorous tests but to set acceptance scores responsibly – something that does not always appear to be happening. One study, for instance, found that UK universities tend to set cut-off scores towards the lower end of what is permitted by the Home Office, rather than at the level recommended by the language testing companies.
“Minimum entry for visa and study requirements were introduced to be just that – to be the absolute minimum with which you should be able to cope with the demands, but, increasingly, they’re being interpreted as the target,” said Danijela Trenkic, professor of second language education at the University of York.
For entry to pre-sessional and pathway programmes, language requirements are even lower, on the assumption that students will undergo intense language training as part of those courses. But “universities and providers are usually overly optimistic about how much change can happen”, said Trenkic. “Even students who meet only the minimum standard for normal university admissions tend to do better than the ones who come through pre-sessional programmes because at the end of the pre-sessional programmes, you do not have to…pass a secure test.”
She believes that regulators should require universities to communicate to students not just the minimum requirements but the language scores “that would best support them to fulfil their potential”. So even if financial exigencies drive universities to accept individuals who only meet the minimum requirements, the students themselves can make a more informed choice about whether they are ready to start their studies.
But another problem is a lack of language testing expertise among those setting admissions standards, according to Ottewell – particularly given the large amount of money spent on marketing the tests to them. It is unclear even to scholars themselves how far institutions take academic research into account when selecting which tests and scores to accept.
“Universities can make whatever decisions they wish to…but they also have to know what that [decision] means,” Ottewell said. “If students got, say, a six in IELTS, what does that mean? What can they do? What can’t they do? And what will the university need to do in order to support that student when they arrive?”
There are also cultural factors to consider. Ruolin Hu, a lecturer and researcher at UCL’s Institute of Education who herself came to the UK as an international student from mainland China, believes there are “some misunderstandings” within institutions about students’ language proficiency. “Across the programmes I work with, the language bars are set really high,” she said. “I am confident that the students who come actually have a grasp of language that is good enough to at least sustain their academic studies.”
However, that may not always be evident due to some students’ culturally inculcated reticence. In China, for instance, “we were taught to think three times before speaking our minds”. As a consequence, it costs Chinese students “some effort to come out of their shell and communicate in a way that typically would come across as confident and fluent in UK academia”.
Like Ottewell, she thinks universities need to be better prepared to support students once they arrive – but also accept that their communications styles may differ.

Meanwhile, as test companies continue to fight it out for dominance – research papers and conference panel sponsorships are their weapons – there are more changes on the horizon, with generative AI likely to impact language testing just as much as it will affect academia more widely. For instance, beyond writing questions, AI is starting to be used to match content to a test taker’s ability as they are being examined, making the exam more efficient and personalised.
“Technology is really, really changing things,” Isaacs said. “All of this represents a really golden opportunity for newer or less established test providers to break into what was before a much more difficult market in which a couple of giants dominated.”
Hu is excited by what AI can bring to the market. “It does give us the opportunity really to [have a] rethink about tests,” she said, including capturing more data not just on what students know, but how they learn and think. It will also obviate the need for university applicants to go to test centres to sit hours of exams, she continued: “not a nice experience and quite traumatising”.
One strong argument for test centres, of course, is that they make it easier to prevent cheating; one major concern with the online proctoring developed during Covid was that it was perceived to be relatively easy for students to fool. That said, centres are no guarantee of rectitude either; a decade ago, amid concerns that, even then, UK universities were admitting students with substandard English, an investigation uncovered widespread cheating on ETS’ TOEIC (Test of English for International Communication) test at independent centres in the UK, leading to all ETS tests, including the TOEFL, being suspended as a proof of English ability and the suspension of several universities’ licences to sponsor international student visas.
That cheating involved having other people sit students’ exams for them. But Lynas said Duolingo hopes to use AI to improve test security. “Sadly, in high-stakes environments, people will try to cheat. Every test at Duolingo is…invigilated one-to-one by an invigilator – a human – but AI can almost turbocharge those humans and do things that maybe humans are less good at,” he said.
Because the company has a centralised digital system, rather than distributed test centres, it also has access to the biometrics of everyone who takes the DET. Using AI, “we can then put that into a system to check against millions of other test sessions” to crack down on paid cheating. Machine learning, Lynas said, will be much better at working out whether someone has taken the test before than a human could ever be.
ETS, however, insists that human proctoring is still superior – which is why the company is spending “millions of dollars” on it, according to Omar Chihane, TOEFL general manager. “If we could leverage AI to save that money, we would love to, but today the technology simply is not there,” he said. “It does cost us more than competitors…but there is a certain quality behind it.”
And while AI might make test development easier, that’s not necessarily a good thing, he added. “It is very easy to just build the test. It is very hard to build a test that is cemented in research and that is secure beyond doubt.”
Realising that aspiration appears to be what has driven the UK Home Office’s recent decision to commission its own English language test for visa applicants. Historically, the department has opened applications for test suppliers to join its list of Secure English Language Tests (SELT) for visa purposes every few years. However, the government last year opened a tender for one supplier to develop a single bespoke test, known as Home Office English Language Testing, or HOELT.
As trusted sponsors, universities will remain able to select their own language tests for academic visas, but the move is still likely to reshape the industry. “For any of the big test providers, if they were to get the tender for it, then of course it would have quite a big impact on their UK share of the market,” said Ottewell. “It would…be an opportunity to raise standards and close some of the loopholes, because some of the newer tests that are coming out at the moment…haven’t gone through the regulation process from the SELT list.”
However, the Home Office’s move will not necessarily favour the traditional test suppliers over Duolingo. The Home Office has also said it is exploring transitioning to a remote testing model, with a new round of engagement taking place to “gather market insights on newly available and emerging technology in relation to remote testing, and the viability of incorporating this into the HOELT service”.
Duolingo itself is certainly convinced that its service fits the bill, even as it shies away from denigrating its competitors. “It doesn’t help students, doesn’t help universities, when you have organisations trying to trash-talk other organisations,” said Lynas. “But I think it is important, in the face of that, to just point people to the data and the evidence. It’s all out there.”
Whether the promised improvements to language testing rigour improve the mood of academics at the teaching coalface, however, remains to be seen. As Trenkic notes, the UK’s “broken” funding model, which the government shows little willingness to address, leaves universities more dependent than ever on international students’ fees – and a similar picture is emerging in Australia. In such circumstances, “There’s always the temptation to gradually adjust the entry requirements to cast your nets wider, to get more students in,” Trenkic said. “And that, I think, is what’s been happening.”