
Impact case study database

The impact case study database allows you to browse and search for impact case studies submitted to REF 2021.

A better international higher education experience through better assessment of Academic English proficiency

1. Summary of the impact

Academic English skills are vital for students to engage effectively at higher education (HE) institutions. Research conducted by the Centre for Research in English Language Learning and Assessment (CRELLA) into the specification and measurement of Academic English proficiency, including language abilities needed by students in the digital age, has led to significant improvements in the design of tests used by universities to select millions of international students globally. In addition, improved test quality underpins the growth of the language testing industry, and enables HE institutions to provide appropriate support, bringing maximum benefit to students.

2. Underpinning research

In response to demand from HE for improvements in the assessment of Academic English proficiency, staff at the University of Bedfordshire’s Centre for Research in English Language Learning and Assessment (CRELLA) have conducted extensive research into the design, validity, and use of tests of English for Academic Purposes (EAP) in the UK and internationally. Recognised as a world-leading research centre by the global language testing and assessment community (see 5.3, 5.5, 5.7, 5.10), CRELLA has been at the forefront of EAP assessment research. For example, the centre has conducted 20 studies (16 completed and 4 in progress) under the IELTS-Joint Funded Research/IELTS Research Partnership grant schemes (see 5.3 for the full list), with funding totalling over GBP400,000 (e.g., 3.1, 3.2, 3.5, 3.6). These all investigated IELTS (International English Language Testing System): the world’s premier Academic English proficiency test for students seeking access to higher education. There were over 3.75 million candidates for IELTS in the 2018/19 academic year alone.

Aiming towards improved measurement of Academic English proficiency, CRELLA’s research has focused on the language demands of English-medium HE in local and global contexts, examining the extent to which test tasks reflect these language demands. For example, focusing on the relationship between test and academic study materials, Weir (Powdrill Chair Professor in Applied Linguistics), Hawkey (Senior Research Fellow), and Green (Professor in Language Assessment), with Ünaldi and Devi (former CRELLA doctoral students) (3.1), pioneered the use of automated digital text analysis to investigate a range of Academic Reading demands in UK universities, exploring the challenges that international students encounter.

Regarding the cognitive demands placed on HE students and test takers, Weir and Chan (Senior Lecturer specialising in integrated assessment of Reading and Writing) conducted a comprehensive synthesis of the literature on Academic Reading to model the intricate relationships between Reading and Writing in academic study (3.2). Field (Reader in Cognition in Language Learning) developed a research methodology for investigating cognitive processes employed in real-world Academic Listening events and identified the effects of test methods on listener behaviour (3.1). The extent to which interactional EAP Speaking performance requires students’ Listening proficiency was established through Nakatsuhara’s (Reader specialising in Listening and Speaking) mixed-methods research (3.1). This research has shown that ‘Listening’ - often neglected in such Speaking tasks - is an important factor that particularly affects lower-level learners and thus needs accounting for in interactive Speaking tasks. Khabbazbashi (Senior Lecturer specialising in Speaking) investigated topic effects, a controversial issue in EAP research, establishing the extent to which students’ background knowledge affected their performance in tests of Speaking skills for academic purposes (3.3).

Outside the UK, Chan, Wu (a former CRELLA doctoral student), and Weir’s research redefined the academic literacy skills needed by Chinese university students. The research established key features of test tasks involving test-takers reading source materials to produce a piece of Writing (Reading-into-Writing tasks), as they would in real-life academic studies (3.4).

Through the body of research exemplified above, CRELLA has provided a solid basis for test providers to align their EAP tests more appropriately with the real-life contextual and cognitive demands of tertiary-level study. CRELLA’s research extends to the development and trialling of practical EAP test specifications that operationalise key parameters of real-life academic study, as well as the provision of supporting evidence for the value of such tests. For example, in 2017-18, with funding of GBP75,000 from the IELTS Partners, CRELLA conducted a comprehensive research synthesis of how each of the four IELTS subtests (Reading, Writing, Speaking & Listening) should respond to the changing nature of English-medium tertiary study, taking account of new technology (e.g., 3.2). Between 2013 and 2018, Nakatsuhara and Inoue, in collaboration with Berry and Galaczi (researchers from the IELTS Partners), conducted three phases of research in the UK, China and three Latin American countries, investigating the use of video-conferencing technology for IELTS Speaking test delivery (3.5). The research received funding of GBP68,985 under the IELTS Research Partnership grant scheme. Between 2015 and 2018, Chan, Bax (Professor in Applied Linguistics) and Weir (3.6) researched the computer delivery of the IELTS Writing test (with funding of GBP32,790 under the IELTS-Joint Funded Research programme). These studies investigated the feasibility of digitally mediated test delivery; established the comparability of results from novel and original versions of IELTS; and provided validity evidence supporting use of the computer-delivered mode in relation to test scores, characteristics of test-taker language, processes in which test-takers engage, and examiners’ test administration behaviours.

3. References to the research

3.1 Taylor, L., & Weir, C. J. (Eds.) (2012). IELTS Collected Papers 2: Research in reading and listening assessment. Studies in Language Testing 34. Cambridge: Cambridge University Press. (7/10 chapters in this volume report IELTS research by CRELLA staff: Green, Field, Hawkey, Nakatsuhara, Taylor, & Weir, first published in IELTS Research Reports, www.ielts.org/researchers/research.aspx)

3.2 Weir, C., & Chan, S. (2019). Research and practice in assessing academic English: the case of IELTS. Studies in Language Testing 51. Cambridge: Cambridge University Press.

3.3 Khabbazbashi, N. (2017). Topic and background knowledge effects on performance in speaking assessment. Language Testing, 34(1), 23-48. DOI: 10.1177/0265532215595666

3.4 Chan, S., Wu, R. Y. F., & Weir, C. J. (2014). Examining the context and cognitive validity of the GEPT Advanced Writing Task 1: A comparison with real-life academic writing tasks. LTTC-GEPT Research Report, 3, 1-91. https://www.lttc.ntu.edu.tw/lttc-gept-grants/RReport/RG03.pdf

3.5 Nakatsuhara, F., Inoue, C., Berry, V., & Galaczi, E. (2017). Exploring the use of video-conferencing technology in the assessment of spoken language: A mixed-methods study. Language Assessment Quarterly, 14(1), 1-18. DOI: 10.1080/15434303.2016.1263637

3.6 Chan, S., Bax, S., & Weir, C. J. (2018). Researching the comparability of paper-based and computer-based delivery in a high-stakes writing test. Assessing Writing, 36, 32-48. DOI: 10.1016/j.asw.2018.03.008

4. Details of the impact

To engage with and benefit fully from academic study, tertiary students studying through the medium of English need appropriate and sufficient English language skills. If under-qualified applicants are admitted, the economic, social, and personal costs can be enormous. Academic English tests serve a key role in providing universities with the information they need to make admissions decisions and to determine whether any remedial language instruction is required (5.1). CRELLA’s research has contributed to a deeper understanding of the nature of Academic English and provided a theoretically sound and practically efficient framework for its measurement, giving test providers the basis for high-quality assessment. CRELLA’s work plays an integral role in the validation and further improvement of existing EAP tests and guides the development of new tests. It generates significant commercial impact for test providers and considerable educational impact for receiving HE institutions, English language teachers and students globally.

CRELLA’s research substantially informed the design and validation of all four UK-based Academic English tests that have been officially (re)approved as meeting the criteria for secure English language tests (SELT) by UK Visas and Immigration (UKVI): IELTS, Trinity ISE, PTE Academic, and LanguageCert. CRELLA has thus contributed to the implementation of international student recruitment policy and to more stringent standards for selecting appropriately qualified students for HE in the UK.

Examples of Commercial Impact

  • IELTS

CRELLA’s research has informed the agenda for the IELTS Partnership Research and Development group responsible for major revisions to the test. Shared with the IELTS Partners through strategic meetings, presentations and over 20 collaborative publications with their staff, the research has helped to distinguish IELTS from prominent international competitors, assisting the IELTS Partners to standardise the content of the test through improved test writing and scoring procedures across 86 versions in concurrent operation. It has been empirically demonstrated that IELTS is a better predictor of academic success than alternative HE entry pathways (5.2). As stated by the Head of Research Strategy at Cambridge Assessment English (on behalf of the IELTS Partners), CRELLA’s research has also “contribute[d] to the test’s state-of-the-art standing, international reputation, and commercial success” in recent years (5.3). Global recognition has grown by over 40%, from 7,000 to over 10,000 educational institutions, government agencies and professional organisations. During the current REF period, annual test-taker numbers have risen from 2 million to 3.75 million (5.4), making IELTS the industry leader and generating an estimated GBP650 million p.a. (GBP170-195 per test) for the IELTS Partners. Furthermore, the new online format of IELTS (IELTS Indicator), launched across 50 countries in April 2020, features a video-conferencing ‘Speaking’ test and a computer-based ‘Writing’ test. These were introduced as a direct result of CRELLA’s validation research (3.5, 3.6), enabling IELTS to remain the preferred option for international student admissions during the global COVID-19 pandemic (5.3).

  • Password

CRELLA’s research underpinned the design and development of a new EAP test: Password (5.5). In 2008, CRELLA’s original commercial proposal led to a joint venture business (English Language Testing Ltd). The company now employs 8 permanent staff members in its London-based office, together with 6 part-time item writers and 25 examiners who work from test specifications and scoring rubrics developed by CRELLA (headcount: 39; FTEs: 15). Following the success of Password Language Knowledge, CRELLA’s research led to the centre being commissioned to develop a further 6 test modules, targeting different student populations and a broader range of language skills: Password Intro, Password Pupil, Password Reading, Password Writing, Password Speaking, and Password Listening. Since 2013, the Password tests have been taken by approximately 1 million learners and used by about 400 universities around the world as an efficient means of assessing students’ language levels and identifying those needing language support. This represents a tenfold increase in candidature and a more than threefold increase in institutional recognition (cf. 120 institutions and 100,000 candidates during the REF2014 period) (5.5).

Examples of Educational Impact

  • Trinity Integrated Skills in English (ISE)

In 2013, CRELLA was commissioned to redesign the suite of ISE tests (5.6) to reflect the integrated nature of academic English skills, based on CRELLA’s research (3.1, 3.2, 3.4; 5.6). Both the Reading & Writing and Listening & Speaking components were fully re-developed and validated across four proficiency levels. As a result of CRELLA’s contribution, the new ISE launched in 2015 was one of only two UKVI SELT tests approved at that time. The test was awarded the Association of Language Testers in Europe Q-mark, having met all 17 of their stringent quality standards (5.6). ISE is recognised by 98% of UK universities and regarded as “a truly state-of-the-art exam” (Director, University-wide Language Programmes, University of Manchester; 5.6). Commenting on the success of the test, the Academic Manager at St Edmund’s College Summer School underlined CRELLA’s accomplishment, noting that:

“the teachers…have been very enthusiastic about the exam and preparing students for it. Feedback about the methodology of the exam has been very positive. I was impressed by how they were able to integrate the exam skills into their normal class teaching — the exams fit well into our style of lessons” (5.6).

  • The Test of English for Academic Purposes (TEAP) in Japan

Until the Eiken Foundation (Japan’s leading English language testing organisation) and Sophia University (Japan) commissioned CRELLA to develop a university entrance examination, four-skills language testing had rarely been employed for university entrance in Japan. TEAP, launched in 2014, has over 30,000 candidates annually and is recognised by 184 universities and colleges. In 2020, TEAP was delivered in 26 prefectures nationally. As the Chief Editor (Test Production Division) at the Eiken Foundation states:

“[t]he long-term impact of the TEAP project is not only limited to those universities that recognize the results: the test has played a significant role in improving EFL teaching and learning practices in Japan by overcoming the barrier to the reform imposed by the established university entrance examinations” (5.7).

As one journal author put it, TEAP has “led to an increase in the amount of time students spent studying speaking” (5.8). This is a further demonstration of how TEAP (based on CRELLA research) has brought major benefits for high school English language education.

  • General English Proficiency Tests (GEPT) in Taiwan (‘Advanced’ level used for university admission purposes)

CRELLA collaborated with the Language Training & Testing Centre (LTTC) in Taiwan on quality assurance for its locally developed tests (3.4). Improved quality increased acceptance of the GEPT for HE admissions within Taiwan and among prestigious institutions in the UK, US, Europe, and elsewhere, facilitating access to international HE for local students, especially those in remote locations (5.9). As the most widely taken English test in Taiwan, the GEPT has had positive effects on English language education in the country, promoting academic language use and the teaching of speaking and listening (5.10). As well as noting that CRELLA’s research was “instrumental” in ensuring that the test would “reflect the real-life tasks that students are expected to encounter in the academic context and to elicit the same underlying cognitive processes that real-life tasks do” (5.10), the R & D Program Director at LTTC also acknowledges that:

“CRELLA has been LTTC’s long-term partner in research, and partly as a result of the quality assurance and quality enhancement measures [that] their research has informed, the GEPT has seen a considerable growth in the past 6 years, marking 2 million candidates since 2014” (5.10).

5. Sources to corroborate the impact

5.1 Migration Advisory Committee (2018). Impact of international students in the UK: Call for evidence responses (1 of 3). https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/739092/Impact_intl_students_CfE_1of3.pdf, p. 252. Provided as PDF.

5.2 Thorpe, A., Snell, M., Davey-Evens, S., & Talman, R. (2017). Improving the academic performance of non-native English-speaking students: the contribution of pre-sessional English language programmes. Higher Education Quarterly, 71(1), 5-32. DOI: 10.1111/hequ.12109. Provided as PDF.

5.3 Testimonial from Head of Research Strategy at Cambridge Assessment English, on behalf of the IELTS Partners, comprising Cambridge Assessment English, the British Council and IDP Australia. Provided as PDF.

5.4 University of Cambridge Local Examinations Syndicate (UCLES) (2019). Annual Review 2018-19. https://www.cambridgeassessment.org.uk/news/our-publications/annual-review/, p. 12. Provided as PDF.

5.5 Testimonial from Chief Executive, English Language Testing Ltd. Provided as PDF.

5.6 Trinity College London (2020). ISE Guide for Higher Education and Universities. https://www.trinitycollege.com/about-us/recognition/english-language, p. 3. Provided as PDF.

5.7 Testimonial from Chief Editor, Test Production Division, Eiken Foundation of Japan. Provided as PDF.

5.8 Sato, T. (2018). The Impact of the Test of English for Academic Purposes (TEAP) on Japanese Students’ English Learning, JACET Journal 62, 89–107. DOI: 10.32234/jacetjournal.62.0_89, p.102. Provided as PDF.

5.9 LTTC (2016). About GEPT. https://www.lttc.ntu.edu.tw/e_lttc/e_gept.htm Provided as PDF.

5.10 Testimonial from R & D Program Director, The Language Training & Testing Centre (LTTC), Taiwan. Provided as PDF.

Additional contextual information

Grant funding

Grant number    Value of grant
N/A             £26,000
N/A             £14,943
N/A             £37,700
N/A             £37,300
N/A             £18,000
N/A             £8,303
N/A             £30,760
N/A             £21,750
N/A             £16,475
N/A             £32,790