Competencia lectora digital en inglés para ADE y Turismo: La importancia de las microdestrezas para los planes de estudios
Competència lectora digital en anglès per a ADE i Turisme: La importància de les microdestreses per als plans d’estudis
Facultad de Empresa, Finanzas y Turismo
Departamento de Filología Inglesa
Universidad de Extremadura
acurado@unex.es
Received: 29/10/2023 | Accepted: 29/02/2024 | Published: 22/07/2024
Abstract
Optimal reading performance is a key goal in university degrees, and yet research on digital academic reading has received far less attention than writing skills (Boulton & Cobb, 2017; Pérez-Paredes, 2019; Curado-Fuentes, 2023). This paper analyzes learners’ development of digital reading skills in English at university. The study was conducted as a mixed-methods analysis in two consecutive academic years (2021 and 2022) with first-year students in the double degree of Business Management and Tourism at the University of Extremadura. The two groups used different digital tools over four-week reading sessions and were tested on their reading comprehension before and after the sessions. Additionally, students answered various questionnaires about the activities. Learners’ use of digital data-driven learning (DDL) techniques was combined with other types of online tools, and face-to-face interviews with students were conducted at the end of one course. The test results indicated significant reading comprehension improvements in both years, whereas no significant difference was found between the two years. In the second academic year, the DDL tool was favored much more, in addition to online dictionaries, and students were more open to reading comprehension work. Overall, it is found that the reading of academic texts can be fostered by empowering learners with meta-cognition via specific digital reading micro-skills.
Keywords
reading skills; DDL; English for Specific Purposes
Resumen
Un rendimiento de lectura óptimo es deseable como objetivo clave en los grados universitarios y sin embargo, el enfoque de las investigaciones es mucho menor en la lectura académica digital que en las destrezas de escritura (Boulton & Cobb, 2017; Pérez-Paredes, 2019; Curado-Fuentes, 2023). Este artículo analiza el desarrollo de habilidades de lectura digital en inglés en la universidad. Esta investigación sigue un método mixto de análisis en dos años académicos (2021 y 2022) con estudiantes de primer curso en el doble grado de Administración y Dirección de Empresas y Turismo de la Universidad de Extremadura. Los dos grupos utilizaron diferentes herramientas digitales a lo largo de sesiones de lectura de cuatro semanas y se evaluó su comprensión lectora antes y después de las sesiones. Además, los estudiantes cumplimentaron cuestionarios después de las actividades. Se comparó el uso de técnicas de aprendizaje digital basado en datos (DDL) por parte de los alumnos con otros tipos de herramientas en línea. También se realizaron algunas entrevistas con los estudiantes al final de un curso. Los resultados de las pruebas indican mejoras significativas en la comprensión lectora en ambos años, mientras que no se encuentra diferencia significativa entre los dos años académicos. En el segundo curso, la herramienta DDL fue mucho más favorable, junto con el uso de diccionarios online, y los estudiantes se mostraron más abiertos al trabajo de comprensión lectora. En general, se observa que la lectura de textos académicos puede fomentarse mediante el empoderamiento de los alumnos con meta-cognición fomentada a través de microdestrezas específicas de lectura digital.
Palabras clave
destrezas de comprensión lectora; DDL; Inglés para Fines Específicos
Resum
Malgrat que en els graus universitaris és un objectiu clau desitjable un rendiment de lectura òptim, la recerca se centra molt menys en la lectura acadèmica digital que en les destreses d’escriptura. (Boulton & Cobb, 2017; Pérez-Paredes, 2019; Curado-Fuentes, 2023). Aquest article analitza el desenvolupament de les habilitats de lectura digital en anglès en la universitat. La recerca segueix un mètode mixt d'anàlisi en dos anys acadèmics (2021 i 2022) amb estudiants de primer curs en el doble grau d'Administració i Direcció d'Empreses i Turisme de la Universitat d'Extremadura. Els dos grups van utilitzar eines digitals diferents durant sessions de lectura realitzades al llarg de quatre setmanes i se'ls va avaluar la comprensió lectora abans i després de les sessions. A més, els estudiants van emplenar qüestionaris després de les activitats. Es va comparar l'ús que van fer de tècniques d'aprenentatge digital basat en dades (DDL) amb altres tipus d'eines en línia. També es van realitzar algunes entrevistes amb els estudiants al final d'un curs. Els resultats de les proves van indicar millores significatives en la comprensió lectora en tots dos anys i que no hi havia diferència significativa entre l’un i l’altre. Al segon curs acadèmic, l'eina DDL va ser molt més favorable, juntament amb l'ús de diccionaris en línia, i els estudiants es van mostrar més oberts al treball de comprensió lectora. En general, s'observa que la lectura de textos acadèmics es pot promoure mitjançant l'empoderament dels alumnes amb metacognició fomentada a través de microdestreses específiques de lectura digital.
Paraules clau
destreses de comprensió lectora; DDL; anglès per a fins específics
Practitioner notes
What is already known about this topic
• Reading comprehension is a key objective and common activity in academic settings, especially in tertiary education.
• Data-driven learning (DDL) provides language learners with many possibilities for academic writing and vocabulary development.
• Digital apps and web-based resources can foster motivation for foreign language learning.
What this paper adds
• Reading skills have been investigated less than writing skills in academic L2 contexts using digital tools, so more studies like this one are needed.
• Reading comprehension can improve with digital methods tailored to suit specific learning needs.
• Learners’ metacognition is fostered as a result of dealing with digital reading comprehension.
Implications for practice and/or policy
• University students should be motivated to use specific digital tools for reading in their fields within and outside the classroom to increase content and language knowledge.
• University curricula should enhance a combined focus on reading skills together with sub- or micro-skills, such as noticing, discovery, inducing, formulation, and critical thinking.
University curricula generally include reading competence as a major goal in students’ first and target foreign languages. For example, Spain’s Verifica Report (2008)1 for the Business Management and Tourism degree indicates that academic reading skills are key for various subjects (e.g., sociology, economics, marketing, and so on). The online material in English that students are expected to read effectively for their studies increasingly includes textbook chapters, news reports, and research articles, and most content subjects list these requirements in their syllabi. Content lecturers often assume that learners already have the competence required to understand digital academic English texts effectively, an indication of the content / language teaching divide (Kuteeva & Airey, 2014; Brown, 2017). The reality, however, is that many Spanish students entering university are at the B1 (intermediate) level in English (Pérez-Basanta, 2005; Mancho-Barés & Llurda, 2013; Ohata & Fukao, 2014; Curado-Fuentes, 2015). It is therefore unsurprising that first-year students tend to have difficulty understanding “the main ideas of complex text on both concrete and abstract topics” (the step from B1 to B2 in the Common European Framework of Reference for Languages: Council of Europe, 2018, pp. 24–26).
In this context, this study set out to examine the process of academic English reading comprehension in two first-year university courses (2021 and 2022) in the dual degree of Business Management and Tourism at the University of Extremadura. Few studies have addressed digital academic reading comprehension compared to writing and translation in the L2 classroom (Boulton & Cobb, 2017; Pérez-Paredes, 2019; Curado-Fuentes, 2023). The main objective of this study was therefore to compare English learners’ written reception and their perceptions of exploiting digital tools and DDL (Data-Driven Learning), defined below, for reading and vocabulary comprehension. The hypothesis was that this digital method could lead to positive reading achievements and / or reactions if suitable micro-skills were targeted. To this end, a mixed-methods analysis was conducted over two academic semesters using pre- and post-reading tests, student questionnaires, and interviews.
What follows is a review of key ideas related to reading comprehension based on DDL and digital technologies: first, learning approaches to reading, including DDL, are defined and described; then the relevance of vocabulary exploitation is discussed; and, finally, some features of digital tools for reading are mentioned. These concepts and approaches underpin the methodological design adopted in this study.
In ESP (English for Specific Purposes), students should approach “authentic-as-possible tasks (…) [that] serve as vehicles for developing communicative competencies (…) [and that] equip students with language learning and problem-solving strategies” (Belcher, 2009, p. 9). However, this objective is not easy to achieve because the academic culture in Spain, as in, for example, France (Boulton, 2010), tends to emphasize the role of the instructor as provider of knowledge and answers, not as task mediator or advisor. In ESP, teachers should thus mediate more and engage students by providing them with ample opportunities and alternatives in the form of “problem-solving tactics”, “collaborative work”, and “support strategies” (e.g., with online dictionaries, glossaries, examples, concordances) that aid readers in comprehending what is being read (Mokhtari & Reichard, 2002). This type of approach can integrate three scales of reading motivation (Han, 2021): efficacy, engagement (as intrinsic motivation), and academic / utility value (extrinsic motivation).
In this learning context, DDL can be a key strategy, since learners assume a central role as language “detectives” (Johns, 1994). They can, for instance, induce content meaning from authentic texts, which is a key feature of ESP reading. In this process, the way of reading digital texts changes compared to the traditional left-to-right-only direction. In DDL, learners can also approach texts “from the centre outwards, and vertically up and down, which the majority of learners will not have done before” (Timmis, 2015, p. 138). Utilities such as corpus-driven concordances and online dictionary examples can favour this type of reading dynamic at tertiary level (Frankenberg-García, 2012) because, among other reasons, university students tend to engage in more cognitively demanding processes (Grabe & Stoller, 2013). Students’ participation and collaboration in these dynamic exchanges can thus be encouraged with digital tactics (Paltridge et al., 2016).
This collaborative approach in class tends to be valued positively by Spanish university students (Mancho-Barés & Llurda, 2013). Collaboration in reading activities should be encouraged, supervised, and scaffolded by instructors so that learners reach their zone of proximal development for linguistic intake (Curado-Fuentes, 2023; Templeton & Timmis, 2023). The scaffolding need not be linear, since the teacher is there to mediate in troubleshooting and to re-direct activities by asking different questions (Templeton & Timmis, 2023, p. 46). A strong “peer effect” on learner engagement is also desirable, whereby students interact using metacognitive skills during the reading process, co-constructing meaning in the text (Jin et al., 2022). In these collaborative situations, the concept of Broad Data-Driven Learning (BDDL) can play a significant role (Pérez-Paredes, 2024). This notion means that learners interact with key linguistic data extracted from texts not only by looking at key words in different co-texts taken from corpora (a major feature of DDL), but also by sharing information with other tools, such as Google resources, online dictionaries, translators, and AI services (Crosthwaite & Baisa, 2023).
The relationship between academic vocabulary and reading comprehension has been extensively explored (e.g., Nagy & Townsend, 2012; Nation & Coady, 1988; Xu et al., 2019, among others). Focusing on key academic vocabulary in the texts can assist students’ critical reading skills (Xu et al., 2019, p. 23). DDL can be a useful instrument for this focus on key lexical items during the reading process (Argyroulis, 2022; Lee et al., 2019), mainly because the exploration of key words in context via concordances can reveal key meanings in texts (Chambers, 2022). In this process, important strategies include the annotation of vocabulary by noticing “immediate collocates (and translation of the whole), [also] the value of example sentences; how to recognize a word’s syntactic role; how to use context to disambiguate, and so on” (Philip, 2010, p. 12). Effective and efficient comprehension tends to occur when a threshold of lexical knowledge is attained (Laufer & Ravenhorst-Kalovski, 2010). Knowing many words helps, but depth of word knowledge in specialized domains is a key factor for reading comprehension in ESP (Song & Reynolds, 2022). This lexical depth includes knowledge of word families, which often involves different word classes, e.g., the verb find and the frequent academic noun findings (Laufer & Cobb, 2019).
Reading discipline-specific texts entails engagement with “certain preferences for word selections and collocation patterns” (Zhang, 2013, p. 46). This reliance on “discipline-based lexical repertoires” is essential to academic literacies (Hyland & Tse, 2009, p. 127). To cope with this specific lexical behaviour effectively, learners in specialized situations should access and exploit “corpora of closely associated texts” for narrow reading (Ballance, 2021, p. 12). These sources may be selected, compiled, and managed by students themselves by identifying required readings in their disciplines, a process that can contribute to a sense of ownership of the texts (Charles, 2012). Students can also better perceive the connection between specific lexical associations and text topics / subjects when they detect prototypical lexical expressions that correlate with examples found in their authentic readings (Flowerdew, 2015). In first-year university courses, a common academic genre is the textbook chapter (Ismayilli Karakoç et al., 2022). Learners can thus be provided with opportunities for autonomous chapter selection and navigation, focusing on key vocabulary in these academic texts.
Reading digital texts involves the deployment of higher cognitive skills (Jones & Hafner, 2012, pp. 39–42). Therefore, the apps and digital tools used for reading should adapt to and challenge university learning demands (Xiangming et al., 2020). These tools should provide opportunities for peer collaboration (Tate & Warschauer, 2022) so that students may realize the positive learning aspects of technology use (Choubsaz et al., 2023; Lai, 2015). There is also a need for “simple query tools” readily available as open access (Ronan, 2023, p. 34), since increasingly user-friendly interfaces can open up and ease metacognitive opportunities among lower-level undergraduates (Boulton, 2012; Timmis, 2015). Such technologies should be introduced in order to “fill a need not adequately covered by other provision” (Charles & Frankenberg-García, 2021, p. 8). For example, reading can be fostered by using vertical and horizontal approaches to texts, enabled by concordancers, decoding key words in context and, at the same time, activating students’ schemata (Hirata & Thompson, 2022).
Another key idea is that online resources should ease task development, not work against it (e.g., by overwhelming learners or getting in the way of learning objectives and procedures; Friginal, 2018). The combination of hands-on corpus tools and bilingual dictionaries can provide a suitably user-friendly digital environment in this direction (Altun, 2021; Zhang et al., 2023). Also, in terms of computer devices, mobile electronics should be supported, as 95 percent of students prefer smartphones for vocabulary learning (Pérez-Paredes et al., 2019; Steel, 2016). Finally, a key idea is that DDL should be transferred beyond the classroom so that students can apply this methodology to other learning scenarios (Ackerley, 2021, p. 93). Yet few students actually use DDL technologies for linguistic exploration after language courses (Charles, 2022). This limited use is yet another reason for broadening DDL by integrating other resources for linguistic exploration, such as interactive videos, Google tools, writing assistants, online dictionaries, collocation / grammar checkers, concordancers, and so on.
The English subject in which this study was conducted is “Specific English I”, a one-semester first-year ESP course in the double degree of Business Management and Tourism at the University of Extremadura. The participants were 41 first-year students: 22 in the 2021 Spring semester (average age = 18.2 years; female = 15; male = 7), and 19 in 2022 (average age = 18.7; female = 10; male = 9). Their general English level was tested at the beginning of each semester on the Cambridge Assessment English platform (cambridgeenglish.org/test-your-english/general-english/). The results showed that, in 2021, 14 students had a B1 level, six A2, and two B2, whereas, in 2022, 15 students had B1, two B2, one A2, and one A1. This type of mixed-level group is common in our courses because no minimum linguistic level is required prior to enrollment.
Students completed a preliminary questionnaire about their English learning background and course preferences. The answers mostly reflected a priority for improving oral (speaking and listening) skills (86 percent in 2021 and 63 percent in 2022) and indicated that their secondary school classes had mostly dealt with writing and grammar (64 percent in 2021 and 63 percent in 2022). Most learners also appreciated the use of digital tools for language learning (59 percent in 2021 and 63 percent in 2022). These students belong to Generation Z, a cohort described as highly open to, and adaptable with, digital tools for all types of purposes (Schroer, 2008). They also tend to come from mixed language-level and teaching backgrounds in which both traditional and innovative approaches merge.
The digital reading modules lasted four weeks in each semester (a total of 16 class hours each, excluding pre-class / homework activities). Various digital tools were deployed during the module sessions. The skills targeted with these resources are competences included in the Common European Framework of Reference for Languages (Council of Europe, 2018) and analysed in IDEAL (Integrating Digital Education in Adult Language Teaching), a European project in which the author of this study participated (IDEAL Desktop Project, 2020). This project provided detailed analyses, descriptions, and illustrations of key digital skills and of tools that exploit competences for language learning. Of the nine skills classified in IDEAL, “Reading reception”, “Mediating a text”, and “Written and online interaction” were entered as parameters for the selection of the tools. Thus, Kahoot (kahoot.it), Playposit (playpos.it), Acadly (acadly.com), Quizlet (quizlet.com), Versatext (versatext.versatile.pub), and Compleat Lexical Tutor (lextutor.ca) were used. Additionally, two online bilingual (English / Spanish) dictionaries (WordReference: www.wordreference.com/es, and Cambridge: dictionary.cambridge.org/dictionary/english-spanish) and the Reverso online translator (www.reverso.net/text-translation) were integrated in 2022. All the tools used are freely available on the web (subject to prior registration in most cases).
The reading material consisted of 12 short (300–800 words) texts about innovation, digital technologies, and marketing in business and tourism (selected from open-access textbook chapters: open.umn.edu/opentextbooks). The sessions included the use of digital tools for initial explorations of the readings as well as for vocabulary exploitation. Table 1 displays the organization of the sessions and tools in both modules. Additionally, reading comprehension and vocabulary pre-tests (before the module) and post-tests (after the module) were answered individually in class by students on a Google form (see Appendix 1). The online platforms where learners accessed and uploaded the material were the university’s Moodle site (campusvirtual.unex.es) in both semesters and Padlet (padlet.com) in 2022.
Table 1. Sessions and activities during the 4-week reading modules
2021 semester | 2022 semester | Sessions and changes
Poll / discussion (Acadly); Introducing DDL; Uploading texts and using concordances (Lextutor) | Poll / discussion (Acadly); Introducing DDL; Uploading texts and using concordances (Versatext) | Days 1 & 2: Fewer explanations and guidance in 2022; more work in pairs / groups in 2022; DDL tools changed
Vocabulary activities (Acadly / Playposit); Reading questions and listening comprehension (Lextutor); Concordances (Lextutor); Quiz (Kahoot) | Vocabulary activities (Versatext / Reverso); Reading comprehension (Versatext); Concordances (Versatext); Quiz (Kahoot); Poll (Acadly) | Days 3 & 4: More independent (not teacher-guided) work by groups in 2022; follow-up questionnaire (2022); DDL tools changed
Reading and vocabulary (Acadly); Listening / reading (Playposit); Quiz (Quizlet); Concordances in other corpora (Lextutor) | Text uploads and vocabulary (Acadly and Versatext); Quiz (Quizlet and Kahoot); Reading (online dictionaries); Translation (Reverso) | Days 5 & 6: More autonomous text selection (2022); group work-focused (2022); DDL tools changed
Exploring corpora (Lextutor); Vocabulary (Quizlet); Reading (Acadly); Quiz (Kahoot) | Reading (Versatext); Vocabulary (online dictionaries); Video comprehension (Playposit); Poll (Acadly) | Days 7 & 8: More independent work (2022); follow-up questionnaire in 2022; DDL tools changed
As can be observed, DDL was mainly conducted in Lextutor (in 2021) and Versatext (in 2022). The 2022 sessions featured more independent group work, mainly because Versatext was easier to use than Lextutor, which, in 2021, had required more teacher explanations and guidance (Curado Fuentes, 2023). In 2022, students became more active readers, engaging with online texts not only to answer specific questions posed by the instructor but also in more meta-cognitive procedures as they realized what the tool could achieve in terms of linguistic exploitation and learning. In order to evaluate this meta-cognition, two between-session questionnaires (after days 4 and 8) were used as within-subjects design instruments (Argyroulis, 2022). In these questionnaires, participants could “express their feelings and attitudes to a greater extent” (Argyroulis, 2022, p. 175) regarding their impressions of the corpus and dictionary tools for reading comprehension. In addition, one global questionnaire was administered after the modules were completed.
With the corpus tools (Lextutor and Versatext), learners had to answer reading and vocabulary comprehension questions about the textbook chapter excerpts provided by the teacher, which they uploaded and managed with the tools (Days 1, 2, 3, and 4 in Table 1). Specific questions related to key vocabulary and the readings were posed (see Appendix 2 for an example). In sum, three main text comprehension goals were set in these activities: 1) lexical meaning decoding; 2) realization of meta-textual references, and understanding of concepts and ideas; and 3) sense disambiguation by contrasting linguistic information. All these activities were conducted in the computer lab under the teacher’s supervision. Figure 1 illustrates a keyword being contrasted across concordance lines for meaning and part-of-speech decoding. This keyword was used significantly more in the texts than any other member of its lexical family. This keyword-in-context arrangement enabled linguistic decoding and meaning comprehension (Laufer & Cobb, 2019).
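The keyword-in-context (KWIC) display generated by tools such as Lextutor and Versatext can be illustrated with a minimal concordancer. The Python sketch below is only an illustration of the general technique (the file name and search word are hypothetical, and neither tool’s actual implementation is reproduced): it centres every occurrence of a keyword with a fixed window of co-text, producing the vertical reading format students worked with.

```python
import re

def kwic(text, keyword, width=40):
    """Return simple keyword-in-context lines: each match of the keyword
    (and its derived forms) centred, with `width` characters of co-text."""
    lines = []
    for match in re.finditer(rf"\b{re.escape(keyword)}\w*", text, re.IGNORECASE):
        start, end = match.span()
        left = text[max(0, start - width):start].replace("\n", " ")
        right = text[end:end + width].replace("\n", " ")
        lines.append(f"{left:>{width}} {match.group(0)} {right:<{width}}")
    return lines

# Hypothetical usage with a textbook-chapter excerpt saved as plain text
with open("tourism_chapter.txt", encoding="utf-8") as f:
    chapter = f.read()

for line in kwic(chapter, "market"):  # lists market, marketing, markets, etc.
    print(line)
```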
In 2021, on days 6 and 7, Lextutor was used to find out about word meaning in other types of business-related texts integrated in the tool (i.e., not from the ad hoc corpus of textbook chapter texts). Specific vocabulary and phraseology questions were handed out, and students used the Hypertext and Web/Text/Concordance utilities in Lextutor for these activities. This process was highly teacher-directed.
In contrast, in 2022, starting on day 5, students managed online dictionaries and online translators in addition to Versatext. They were also free to choose any academic texts from the online open textbook resource for their corpus analysis. Because, at this point, they were familiar with the procedure, they mostly operated the tool on their own, examining keyword clouds in context, running concordances, and analyzing vocabulary with the Text Profiler function. At the end, they also produced a one-page report (in pairs or groups) about their linguistic findings. Overall, these participants obtained more correct answers in the reading activities (around 70 percent versus 50 percent in 2021). They also completed the tasks on time, and even before the classes ended, especially in the last two sessions.
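The keyword clouds and Text Profiler views mentioned above rest on frequency profiling of the uploaded text. As a rough, tool-independent sketch (the stop-word list and file name below are assumptions, not features of Versatext), the following code extracts the most frequent content words from a chapter, the kind of keyword list that served learners as an entry point into the readings.

```python
import re
from collections import Counter

# Minimal stop-word list for illustration only; real profilers use fuller lists
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "on", "for",
              "is", "are", "be", "as", "that", "this", "with", "by", "it", "at"}

def top_keywords(text, n=15):
    """Count word-form frequencies, excluding stop words and very short tokens,
    and return the n most frequent items as (word, count) pairs."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(t for t in tokens if t not in STOP_WORDS and len(t) > 2)
    return counts.most_common(n)

# Hypothetical usage with a student-selected open-textbook chapter
with open("innovation_chapter.txt", encoding="utf-8") as f:
    print(top_keywords(f.read()))
```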
Non-corpus tools were also deployed in order to review vocabulary and to test text comprehension. In both courses, Acadly was used to launch pre-class and in-class quizzes before and during the DDL sessions. For example, synonym-matching exercises evaluated learners’ receptive knowledge of key words seen in previous sessions, and comprehension questions about short texts were posed via discussions in the tool. Quizlet and Kahoot were used in competitive modes to test learners’ lexical knowledge (individually and in teams). In general, team competition was preferred. In Playposit, interactive videos were played by the instructor and watched by the whole class at the same time in order to respond to listening comprehension questions related to specific topics on innovation and technologies in Tourism.
Comparing pre- and post-test results, all the 2021 participants except one improved in the vocabulary part, whereas in the reading section, one participant did worse, three obtained the same results, and the rest improved. In 2022, all participants did better, except for one student who did worse in vocabulary and two who scored the same in the reading. Since there were fewer than 30 scores per test in each semester, and the scores varied widely (so a normal distribution could not be assumed), the Wilcoxon signed-rank test, a non-parametric measurement, was applied.
Table 2 displays the Wilcoxon signed-rank test values derived from this intra-group comparison for each semester. Both p and z values were significant for vocabulary and reading comprehension; therefore, each group significantly increased its lexical and reading scores in the post-tests. Additionally, a Mann-Whitney U-test was run with all the post-test scores from both semesters for the inter-group analysis (Table 2). The aim was to examine whether one group significantly differed from the other in overall post-test scores. In this case, the measurement was conducted by taking the first 19 scores in 2021 and comparing them with the 19 students’ scores from 2022 (without test section distinctions). According to the values obtained (U, Z-ratio, and p), there was no significant difference between the two semesters, as the obtained p value exceeded 0.05. Therefore, neither group significantly outperformed the other.
Table 2. Significant intra-group test and non-significant inter-group test differences
Test sections | Wilcoxon signed-rank test (intra-group measurement) | Mann-Whitney U-test (inter-group measurement)
Vocabulary | 2021 (pre / post-tests): p = .00001 (significant at p ≤ 0.05); z = 4.0145 (z ≥ 1.96). 2022 (pre / post-tests): p = .00003 (significant at p ≤ 0.05); z = 3.6214 (z ≥ 1.96) | Post-tests (2021 / 2022), both sections combined: U = 180 (critical U at p ≤ 0.05 = 123); Z-ratio = 0; p = 0.5 (not significant at p ≤ 0.05)
Reading | 2021 (pre / post-tests): p = .00094 (significant at p ≤ 0.05); z = 3.3137 (z ≥ 1.96). 2022 (pre / post-tests): p = .00194 (significant at p ≤ 0.05); z = 3.1025 (z ≥ 1.96) | (single inter-group measurement across both sections; see above)
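For readers wishing to retrace the statistical procedure, both tests are available in standard libraries. The sketch below uses SciPy with placeholder score lists (the actual participant data are not reproduced here): paired pre-/post-test scores within a semester go into the Wilcoxon signed-rank test, and the two semesters’ independent post-test scores go into the Mann-Whitney U-test.

```python
from scipy import stats

# Placeholder score lists for illustration; the real data are the participants' test scores
pre_2021 = [12, 15, 9, 18, 14, 11]     # pre-test scores, 2021 (one value per student)
post_2021 = [16, 19, 13, 20, 17, 15]   # post-test scores, 2021 (same students, same order)
post_2022 = [17, 18, 15, 21, 16, 19]   # post-test scores, 2022 (independent group)

# Intra-group comparison: paired pre/post scores within one semester
w_stat, w_p = stats.wilcoxon(pre_2021, post_2021)
print(f"Wilcoxon signed-rank: statistic = {w_stat}, p = {w_p:.5f}")

# Inter-group comparison: independent post-test scores across semesters
u_stat, u_p = stats.mannwhitneyu(post_2021, post_2022, alternative="two-sided")
print(f"Mann-Whitney U: U = {u_stat}, p = {u_p:.5f}")
```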
The two groups of participants answered the same online questionnaire at the end of the semester. Their numerical answers, on a 1-5 Likert scale, are shown in Appendix 3. All learners expressed overall satisfaction with the modules and resources, appraising digital tools as positive in most cases (versus traditional tools, which received lower scores). Speaking and listening were marked as favourite skills, and in 2022, text-based activities received higher scores than in 2021. Both groups valued working in groups, and pre-class preparation was deemed less important in 2021 than in 2022.
Regarding the tools, most learners in both years favoured Acadly and Kahoot, followed by Playposit. This general positive reaction was often observed in class through their excitement to play with these tools (contrary to the belief that they might get bored with too much use of them). In 2022, Versatext was also rated positively (compared to the other corpus tool, Lextutor, in 2021) and was described as “appealing” (“chula”, “interesante”) by three participants. Reverso and the online dictionaries also received good scores. Other comments by students about the tools referred to their “fun” aspect (Kahoot, Playposit), “interactivity” and “challenging” features (Kahoot, Quizlet), “easy-to-use” display (Acadly), and “usefulness” for learning (Acadly, Playposit, Versatext, WordReference). In 2021, two learners stated their opinions about Lextutor, finding it “difficult”, “messy” and “not very motivating”. In 2022, more students appreciated the corpus utilities, and two people voiced a complaint about deadlines.
In 2022, the positive impressions of the module and tools were not surprising, since students had already reacted positively to the learning mechanics via questionnaires about each SM (sub-module) during the 4-week module. Each questionnaire was answered online to compare students’ impressions immediately after using Versatext (first SM: SM1) and the online dictionaries (second SM: SM2). The questionnaires were integrated in Acadly in the form of multiple-choice polls. Table 3 displays the rounded percentages of students choosing a given answer for each item about SM1 (answered by all 19 participants) and SM2 (answered by 17 students).
Table 3. Questions and percentages of students’ answers in the two SM questionnaires
SM 1 questions (Versatext) | Choice percentages
1. How complicated was this activity? (a. very complicated, b. complicated, c. average, d. easy, e. very easy, f. N/A) | Average (53), Easy (37), Complicated (5), No answer (5)
2. How was the management of the texts (access, upload, work)? (a. very bad, b. bad, c. average, d. good, e. very good, f. N/A) | Average (53), Good (27), Very good (15), Bad (5)
3. How was the Word Cloud utility? (a. very difficult, b. difficult, c. average, d. easy, e. very easy, f. N/A) | Average (47), Easy (43), Very easy (5), Difficult (5)
4. How was the Concordance utility? (a. very difficult, b. difficult, c. average, d. easy, e. very easy, f. N/A) | Average (75), Difficult (20), Easy (5)
5. For word meaning decoding, Concordance was: (a. very bad, b. bad, c. OK, d. good, e. very good, f. N/A) | OK (58), Good (27), Very good (15)
6. To find the comprehension answers in the text, Concordance made it: (a. very hard, b. hard, c. average, d. easy, e. very easy, f. N/A) | Easy (65), Very easy (15), Average (15), Very hard (5)
7. For sense disambiguation in the expressions, Concordance made it: (a. very hard, b. hard, c. average, d. easy, e. very easy, f. N/A) | Easy (65), Very easy (25), Average (10)
8. The easiest aspect of Versatext was: (a. management, b. text uploading, c. Word Cloud, d. Concordance, e. interface / layout, f. N/A) | Word Cloud (75), Concordance (15), Management (10)
9. The hardest aspect of Versatext was: (a. management, b. text uploading, c. Word Cloud, d. Concordance, e. interface / layout, f. nothing, g. N/A) | Nothing (47), Concordance (28), Management (25)
SM 2 questions (online dictionaries) | Choice percentages
A. How complicated was the activity? (a. very hard, b. hard, c. average, d. easy, e. very easy, f. N/A) | Average (65), Easy (30), Very easy (8)
B. How did you find the online dictionaries for vocabulary? (a. of no use, b. of little use, c. average, d. useful, e. very useful, f. N/A) | Very useful (53), Useful (41), Average (6)
C. How were the online dictionaries for reading comprehension? (a. of no use, b. of little use, c. average, d. useful, e. very useful, f. N/A) | Useful (41), Very useful (35), Of little use (12), Of no use (12)
D. To check meanings in Spanish, I used the dictionary: (a. never, b. sometimes, c. average, d. often, e. all the time, f. N/A) | Often (53), Sometimes (41), All the time (6)
E. In the dictionary, I checked: (a. meanings only, b. examples only, c. explanations, d. meanings and examples, e. nothing, f. N/A) | Meanings and examples (53), Meanings only (47)
F. Of the two online dictionaries, I used more: (a. WordReference, b. Cambridge, c. both, d. none, e. N/A) | WordReference (76), Both (18), Cambridge (6)
According to the percentages given by participants in both SM questionnaires (Table 3), several observations can be made. Most students felt these tools entailed average difficulty in their use, as can be seen in questions 1 and A. The features and procedures in both tools were also regarded as mostly positive (with “average”, “easy”, “good”, “useful” and “very useful” the most frequently selected choices for items 2, 3, 4, B, and C). Additionally, almost half of the class found nothing hard in Versatext (see item 9), whereas the online dictionaries were deployed often by most students (see D). Most learners also seemed to understand the purposes of concordances for reading and vocabulary comprehension (see items 5 and 6). WordReference was favored (item F) and was found useful not only for vocabulary exploitation but also for reading comprehension (B and C).
In the case of concordances, this feature was difficult for some students, but an overwhelming majority found it easy to use for reading comprehension and sense disambiguation (see 6 and 7). It is also noteworthy that learners realized and correlated the usefulness of dictionary examples and concordances for lexical meaning induction, given their positive answers for both affordances in items 6 and E. In this respect, learners did not just rely on dictionary translations and definitions to answer the activities, but mostly explored examples and concordances for meaning decoding (as indicated in their answers and also observed in class).
Short interviews were also conducted with five students after the module in 2022. The main objective was to record students’ direct personal opinions; the interviews were conducted in Spanish so that participants could express themselves in more depth. Their answers were highly informative regarding the use of digital technologies for reading and vocabulary development, and some meta-cognitive feedback could be inferred from interviewees’ answers as they reflected on their learning process. These five students were selected because they had attended all the class sessions and had B1 levels in English. For the analysis of their answers, their opinions were grouped into four thematic categories, based on the coding techniques explained by Baralt (2012): 1) situation or context (CAT1); 2) task dynamics (CAT2); 3) learning process (CAT3); and 4) technology (CAT4).
Question 1: Did you like the digital tools and do you think they helped you to improve your English? Why or why not?
CAT1: Two students referred to a favorable context of “fun / entertaining activities” with Kahoot and Quizlet. Another participant preferred the “lab practice”, and two students said the classes were “enjoyable”. Another learner said the tasks were “appropriate for my level”.
CAT2: One student said that the tasks in “the computer lab motivated me”, two people preferred the “group competition challenge” with Kahoot, and two students said that Versatext presented “challenging” and “interesting” activities.
CAT3: In terms of learning gain, three students recognized their “increased vocabulary” thanks to Acadly activities. Two others pointed to the vocabulary in Versatext as productive because, in one student’s words, “they could annotate word meanings and functions”.
CAT4: Regarding technology, all the students preferred “interactive apps” (Kahoot, Quizlet, and Playposit), and one of them said that “interactive videos challenge my listening comprehension and I can test my knowledge better”.
Question 2: Do you think the reading and vocabulary activities helped you to gain linguistic knowledge / competence? Why or why not?
CAT1: The activities in Versatext were “interesting” and “motivating” according to three students, with one learner pointing out that this tool motivated her to “discover hidden meanings of words”.
CAT2: The competitive mode in Quizlet and Kahoot was highlighted by two students as motivating and “good for vocabulary knowledge”. Two other interviewees recalled that “key phrases” could be easily identified in Versatext.
CAT3: Two students recognized the value of learning “the keywords and patterns” shown in Versatext to understand texts, and one mentioned that “my vocabulary increased” with this corpus tool. Another student said that “I could discover meanings of new words” with online dictionaries and Versatext, and another participant added that “the vocabulary could be learned better” with these two tools.
Question 3: Did you prefer the online dictionaries or Versatext for the vocabulary and reading activities? Why?
Two people mentioned the dictionary, two others preferred Versatext, and one said both.
CAT1: In WordReference, the “level” was “easy”, according to one student. In Versatext, “topic-specific texts” were “better understood”, and another student alluded to the “easy aspect” of Versatext for “lexical searches”.
CAT2: In the activities, WordReference was “more clear for word searches”. In Versatext, the usefulness of “keyword lists” was mentioned by two students, and one participant found that “doing the task together” was “better”, and they “could manage” their “own vocabulary”, and “discover important phrases and ideas”.
CAT3: For Versatext, one student indicated that “I could learn new words with this tool” and “I organized myself better with the examples shown on screen”.
Question 4: Did you prefer to work with your own selection of texts? Why or why not?
CAT2: In Versatext, one student agreed that using one’s own texts was important, as “I could choose the topics I liked”, and “manage my own material”. In contrast, two students preferred the text selection made by the teacher, and one mentioned that she preferred to be given “task directions”. Another participant said that “I don’t mind either but I prefer the teacher’s texts”.
CAT3: One student explained that she could “discover new words” in her own texts, and two learners said that “the teacher’s texts were interesting”, that they “learned from the terminology extracted from these texts”, and that “these texts were easy”.
The findings above lead to four main observations concerning digital reading comprehension in this university ESP context. Firstly, first-year university students seem to have approached digital texts as important instruments for reading, which is in line with their overall satisfaction with digital resources for language teaching and learning (stated in questionnaires). Both the 2021 and 2022 modules, despite featuring different approaches and tools, reflected, via students’ impressions and attitudes, that the overall experience with the reading module had been positive. Test results also attested to significant text comprehension and vocabulary improvement. Further evidence is that no significant difference was detected between 2021 and 2022 in terms of linguistic performance.
Secondly, it is noteworthy that more positive reactions resulted from DDL with Versatext, online dictionaries, and more independent group work in 2022. The tools and task dynamics reinforced each other, as the learners eagerly worked in groups with specific texts, exploring keywords and expressions. Reading assignments were mostly completed effectively with these tools, so DDL seemed to meet learning needs in this context. Students often used different types of devices (mobile phones, laptops, and PCs) interchangeably, enhancing digital versatility and class dynamics. These approaches were mostly appraised as beneficial in the SM questionnaires and interviews.
Thirdly, the ad hoc topic-based corpus of authentic texts generally met students’ learning expectations. In both academic years, the interactive tools (e.g., Kahoot and Quizlet) were highly valued for reviewing vocabulary from these texts. The 2022 students appreciated working with these specific texts in Versatext more, as keywords were represented more visually, leading to more accessible search techniques. In 2022, another qualitative difference was that participants understood and completed the activities more easily and faster, especially in the last two or three classes. DDL thus tended to engage learners in the learning process, since applying DDL strategies to reading and lexical development fostered enjoyable interaction via incentivized collaboration and problem-solving, as also observed by Hadley and Charles (2017).
Fourthly, it is noteworthy that, even if reading and DDL tools are not students’ favourite foci and resources in language learning (according to their opinions), these instruments and methods enabled them to probe textual and lexical meaning in innovative ways for written reception, and this novelty appealed to them, as also noted by Timmis (2015). Challenging learners with different angles and ways of approaching digital texts enables them to acknowledge the efficacy of DDL utilities for problem-solving activities. Participants can thus realize the importance of developing micro-skills such as noticing and discovering linguistic patterns, inducing lexical / textual meaning, and formulating ideas related to specific concordances and text chunks. These skills can be located directly within the framework of multiliteracies pedagogies in language learning (Paesani & Menke, 2023).
In conclusion, DDL can be effectively applied to digital reading comprehension in the ESP context. In terms of pedagogy, the lessons / sessions should be designed, devised, and scaffolded according to learning situations (e.g., focusing on specific vocabulary and different texts within a given topic). Learners should be provided with means and opportunities for working autonomously and collaboratively. The exploitation of critical thinking should enter the picture whenever possible so that students can reflect on their cognitive processing during the sessions.
Therefore, the university curriculum should emphasize the exploitation of linguistic and meta-cognitive micro-skills in digital reading. Digital tools can vary, with one specific resource working in one learning situation but not in another. Therefore, the focus should not be the use of digital tools per se (the so-called general digital literacies), but rather the specification of digital strategies that can make students efficient learners (in this case, active / involved readers). It would thus be interesting to explore other tools and tactics for reading in different registers, contexts, and / or for miscellaneous aims and levels (e.g., using larger corpora, Generative AI, different genres, outside-of-class / informal learning resources, and so on).
CHOOSE only ONE option that corresponds in each question:
1. A synonym for ADVERTISING is
a. publicity b. marketing c. management d. relations
2. A synonym for PROMOTIONAL SKILLS is
a. Abilities to promote a product b. Actions for promotion
c. developments in production d. tasks for promotion
3. What is the meaning of RETAIL?
a. Large company b. small company c. partnership d. multinational
4. What is the meaning of “to target an audience”?
a. Work with an audience b. talk to an audience c. aim at an audience
d. listen to an audience
5. What is the meaning of “product development”?
a. Design of a product b. destruction of a product c. thinking about a product d. selling of a product
6. Which word goes in the blank: “to spend money + ____ something”?
a. In b. on c. by d. to
7. Which word goes in the blank: “to invest money _____ something”?
a. In b. on c. by d. to
8. Which word goes in the blank: “Marketing _________”?
a. Work b. techniques c. sales d. purchase
9. Which word goes in the blank: “to address a / an _________”?
a. Outline b. issue c. post d. school
10. Which word goes in the blank: “To grab __________ from”?
a. Productivity b. campaign c. attention d. endorsement
Answer the questions below about this text:
Tourism and environmental sustainability are fast becoming natural partners, their agendas increasingly intertwined. No other industry has to walk the narrow line of environmentally responsible growth as carefully as the tourism industry; no other industry has as much to gain or as much to lose.
More and more, for example, environmentally-concerned tourists are seeking out green tourist destinations — those that make a proactive effort to address critical issues such as carbon emissions, biodiversity conservation, waste management, and water supply. A 2005 survey by the United Kingdom’s Devon County Council found that 54 percent of respondents consider environmental issues when booking a trip and 82 percent are willing to pay more for green services and products. As a bonus, some 72 percent of respondents think a green business is more likely to be quality conscious.
Feeling the push from tourists, leading tour operators such as TUI and Thomas Cook Group are giving marketing and booking preference to environmentally sustainable destinations and demanding higher green standards from hotels and resorts. In addition, major global travel societies such as National Geographic now use environmental sustainability as a key criterion in their destination rankings. In short, if tourist destinations are to stay competitive, they will need to adopt sustainable policies or risk alienating an important and growing customer base.
1. What are Tourism and environmental sustainability becoming and why?
2. Explain one type of tourist’s interest and concern when travelling.
3. What kind of instrument was used to find information about green tourists’ interests and demands?
4. What are leading tour operators doing? Why?
5. What will happen if tourist destinations do not change or adopt new sustainable measures?
1. What does the word “above” refer to in the text (three things)?
2. What does “management” mean in this text? Give an example of its use in an expression.
3. What does “expenses” mean in this text? Give an example of its use in an expression.
4. What does the word “media” mean in this text? Give an example of its use in an expression.
5. Which preposition follows “provide”? Give an example of its use within an expression.
1. What is very important for projects to be successful?
2. Why can social media be important for dissemination?
3. What are customer-based platforms? Give two examples of digital customer-based platforms.
4. Why would main expenses be incurred at the beginning of a multimedia project?
5. What is the highest cost in the estimated budget for a project?
6. Why is the aspect of fun important in a project?
Item | 2021 | 2022
a. The module topic was interesting and fitted well in the course | 2.4 | 3
b. The activities were interesting and motivated me to learn English | 2.5 | 3
c. I preferred the vocabulary parts in the activities | 2.5 | 2.7
d. I preferred the writing and reading parts in the activities | 2 | 2.8
e. I preferred the listening and speaking parts in the activities | 3.3 | 3.3
f. I preferred working in groups (in class) | 3.1 | 3.5
g. I preferred working individually (in class) | 1.2 | 1.5
h. I liked doing pre-class preparation | 1.5 | 2.8
i. I would have liked having more oral activities | 3.1 | 3.3
j. I would have liked having more reading and writing activities | 1.5 | 2.5
k. I liked using digital tools for the activities | 3.7 | 3.9
l. I would have liked to have more traditional ways of doing the activities (e.g., pdf material and blackboard) | 1.2 | 1.1
m. I liked the on-line university platform | 4 | 3.9
II. TOOLS: Evaluate the tools used in class from 1 (very bad) to 5 (very good):
Tool | 2021 | 2022
KAHOOT | 3.8 | 4.1
ACADLY | 3.6 | 3.8
QUIZLET | 2.4 | 3
PLAYPOSIT | 3 | 3.2
LEXTUTOR | 2.1 | Not used
VERSATEXT | Not used | 3.8
DICTIONARIES | Not used | 3
REVERSO | Not used | 3.2
1. Name some tools you enjoyed and explain why in a short sentence.
2. Name some tools you liked less and explain why in a short sentence.
Ackerley, K. (2021). Exploiting a genre-specific corpus for academic writing: Students’ preferences and strategies. In M. Charles, & A. Frankenberg-García (Eds.). Corpora in ESP / EAP writing instruction. Preparation, exploitation, analysis (pp. 78–99). Routledge.
Altun, H. (2021). The learning effect of corpora on strong and weak collocations: Implications for corpus-based assessment of collocation competence. International Journal of Assessment Tools in Education, 8(3), 509–526. https://doi.org/10.21449/ijate.845051
Argyroulis, V. (2022). Investigating student motivation in the use of corpus concordancing in ESP learning at university level. ESP Today, 10(1), 71–98. https://doi.org/10.18485/esptoday.2022.10.1.4
Ballance, O. (2021). Narrow reading, vocabulary load and collocations in context: Exploring lexical repetition in concordances from a pedagogical perspective. ReCALL, 33(1), 4–17. https://doi.org/10.1017/S0958344020000117
Baralt, M. (2012). Coding qualitative data. In A. Mackey, & S. M. Gass (Eds.). Research methods in second language acquisition: A practical guide (pp. 222–244). Wiley-Blackwell.
Belcher, D. (2009). What ESP is and can be: An introduction. In D. Belcher (Ed.), English for specific purposes in theory and practice (pp. 1–19). Michigan ELT.
Boulton, A. (2010). Data-driven learning: Taking the computer out of the equation. Language Learning, 60(3), 534–572. https://doi.org/10.1111/j.1467-9922.2010.00566.x
Boulton, A. (2012). Hands-on/hands-off: Alternative approaches to data-driven learning. In J. Thomas, & A. Boulton (Eds.), Input, process and product: Developments in teaching and language corpora (pp. 152–168). Masaryk University Press.
Boulton, A., & Cobb, T. (2017). Corpus use in language learning: A meta-analysis. Language Learning, 67(2), 1–46. https://doi.org/10.1111/lang.12224
Brown, H. (2017). Cooperation and collaboration in undergraduate EMI: Adapting EAP to the emergence of blended academic norms and practices in a Japanese university. In J. Valcke, & R. Wilkinson (Eds.). Integrating content and language in higher education: Perspectives and professional practice (pp. 151–166). Peter Lang.
Chambers, A. (2022). What is data-driven learning? In A. O`Keeffe, & M. McCarthy (Eds.). The Routledge handbook of corpus linguistics (pp. 416–429). Routledge.
Charles, M. (2012). ‘Proper vocabulary and juicy collocations’: EAP students evaluate do-it-yourself corpus-building. English for Specific Purposes, 31, 93–102. https://doi.org/10.1016/j.esp.2011.12.003
Charles, M. (2022). The gap between intentions and reality: Reasons for EAP writers’ non-use of corpora. Applied Corpus Linguistics, 2(1), 100032. https://doi.org/10.1016/j.acorp.2022.100032
Charles, M., & Frankenberg-García, A. (2021). Introduction. In M. Charles, & A. Frankenberg-García (Eds.). Corpora in ESP / EAP writing instruction. Preparation, exploitation, analysis (pp. 1–9). Routledge.
Choubsaz, Y., Jalilifar, A., & Boulton, A. (2023). A longitudinal study of highly cited papers in four CALL journals. ReCALL, 1–18. https://doi.org/10.1017/S0958344023000137
Council of Europe. (2018). Common European framework of reference for languages. Language policy unit. Retrieved from https://rm.coe.int/16802fc1bf
Crosthwaite, P., & Baisa, V. (2023). Generative AI and the end of corpus-assisted data-driven learning? Not so fast! Applied Corpus Linguistics, 3(3). https://doi.org/10.1016/j.acorp.2023.100066
Curado Fuentes, A. (2015). Exploiting keywords in a DDL approach to the comprehension of news texts by lower-level students. In A. Lenko-Szymanska, & A. Boulton (Eds.). Multiple affordances of language corpora for data-driven learning (pp. 177–198). John Benjamins.
Curado Fuentes, A. (2023). Corpus affordances in foreign language reading comprehension. In K. Harrington, & P. Ronan (Eds.). Demystifying corpus linguistics for English language teaching (pp. 99–118). Palgrave Macmillan.
Flowerdew, L. (2015). Corpus-based research and pedagogy in EAP: From lexis to genre. Language Teaching, 48(1), 99–116. https://doi.org/10.1017/S0261444813000037
Frankenberg-García, A. (2012). Learners’ use of corpus examples. International Journal of Lexicography, 25(3), 273–296. https://doi.org/10.1093/ijl/ecs011
Friginal, E. (2018). Corpus linguistics for English teachers: New tools, online resources, and classroom activities. Routledge.
Grabe, W., & Stoller, F. (2013). Teaching and researching reading (2nd ed.). Routledge.
Hadley, G., & Charles, M. (2017). Enhancing extensive reading with data-driven learning. Language Learning & Technology, 21(3), 131–152. https://doi.org/10125/44624
Han, F. (2021). The relations between motivation, strategy use, frequency, and proficiency in foreign language reading: An investigation with university English language learners in China. Foreign Language in a Global Nexus, 1–12. https://doi.org/10.1177/21582440211008423
Hirata, Y., & Thompson, P. (2022). Communicative data-driven learning: A two-year pilot study. ELT Journal, 76(3), 365–366. https://doi.org/10.1093/elt/ccab066
Hyland, K., & Tse, P. (2009). Academic lexis and disciplinary practice: Corpus evidence for specificity. International Journal of English Studies, 9(2), 111–129. Retrieved from https://revistas.um.es/ijes/article/view/90781
IDEAL Desktop Project. (2020). Desktop research and needs analysis on the mapping of content for digitally competent language teachers. Erasmus+ Project. Retrieved from http://ideal-project.eu/
Ismayilli Karakoç, A., Ruegg, R., & Gu, P. (2022). Beyond comprehension: Reading requirements in first-year undergraduate courses. Journal of English for Academic Purposes, 55. https://doi.org/10.1016/j.jeap.2021.101071.
Jin, T., Jiang, Y., Gu, M. M., & Chen, J. (2022). “Their encouragement makes me feel more confident”: Exploring peer effects on learner engagement in collaborative reading of academic texts. Journal of English for Academic Purposes, 60, 101–117. https://doi.org/10.1016/j.jeap.2022.101177
Johns, T. (1994). From printout to handout: Grammar and vocabulary teaching in the context of data-driven learning. In T. Odlin (Ed.), Perspectives on pedagogical grammar (pp. 27–45). Cambridge University Press.
Jones, R. H., & Hafner, C. A. (2012). Understanding digital literacies: A practical introduction. Routledge.
Kuteeva, M., & Airey, J. (2014). Disciplinary differences in the use of English in Higher Education: Reflections on recent language policy developments. Higher Education, 67, 533–549. https://doi.org/10.1007/s10734-013-9660-6
Lai, C. (2015). Modeling teachers' influence on learners' self-directed use of technology for language learning outside the classroom. Computers & Education, 82, 74–83. https://doi.org/10.1016/j.compedu.2014.11.005
Laufer, B., & Cobb, T. (2019). How much knowledge of derived words is needed for reading? Applied Linguistics, 41(6), 1–29. https://doi.org/10.1093/applin/amz051
Laufer, B., & Ravenhorst-Kalovski, G. C. (2010). Lexical threshold revisited: Lexical text coverage, learners’ vocabulary size and reading comprehension. Reading in a Foreign Language, 22, 15–30.
Lee, S., Kim, S., & Jung, Y. (2019). Effects of DDL approach on English productive and receptive vocabulary knowledge and reading ability. The Journal of Studies in Language, 35(3), 341–360. https://doi.org/10.18627/jslg.35.3.201911.341
Mancho-Barés, G., & Llurda, E. (2013). Internationalization of business English communication at university: A threefold needs analysis. Ibérica, 26, 151–170. Retrieved from https://revistaiberica.org/index.php/iberica/article/view/277
Mokhtari, K., & Reichard, C. (2002). Assessing students' metacognitive awareness of reading strategies. Journal of Educational Psychology, 2, 249–259. https://doi.org/10.1037/0022-0663.94.2.249
Nagy, W. E., & Townsend, D. (2012). Words as tools: “Learning academic vocabulary” as language acquisition. Reading Research Quarterly, 47(1), 91–108. https://doi.org/10.1002/RRQ.011
Nation, I.S.P., & Coady, J. (1988). Vocabulary and reading. In R. Carter & M. McCarthy (Eds.), Vocabulary and language teaching (pp. 97–110). Routledge.
Ohata, K., & Fukao, A. (2014). L2 learners’ conceptions of academic reading and themselves as academic readers. System, 42, 81–92. https://doi.org/10.1016/j.system.2013.11.003
Paesani, K., & Menke, M. (2023). Literacies in language education: A guide for teachers and teacher educators. Georgetown University Press.
Paltridge, B., Starfield, S., & Tardy, C. M. (2016). Ethnographic perspectives on academic writing. Oxford University Press.
Pérez-Basanta, C. (2005). Assessing the receptive vocabulary of Spanish students of English philology: An empirical investigation. In E. Martínez-Dueñas (Ed.). Towards an understanding of the English language, past, present and future: Studies in honour of Fernando Serrano (pp. 545–564). Universidad de Granada.
Pérez-Paredes, P. (2019). A systematic review of the uses and spread of corpora and data-driven learning in CALL research during 2011–2015. Computer Assisted Language Learning. https://doi.org/10.1080/09588221.2019.1667832
Pérez-Paredes, P. (2024). Data-driven learning in informal contexts? Embracing Broad Data-driven learning (BDDL) research. In P. Crosthwaite (Ed.). Corpora for language learning: Bridging the research-practice divide. Routledge.
Pérez-Paredes, P., Ordoñana Guillamón, C., Van de Vyver, J., Meurice, A., Aguado Jiménez, P., Conole, G., & Sánchez Hernández, P. (2019). Mobile data-driven language learning: Affordances and learners’ perception. System, 84, 145–159. https://doi.org/10.1016/j.system.2019.06.009
Philip, G. (2010). Classroom concordancing in the 21st century: The new generation. Language Forum, 36(2), 1–20.
Ronan, P. (2023). Learning to teach EFL with corpus linguistics approaches: A survey of teacher training students’ attitudes. In K. Harrington, & P. Ronan (Eds.). Demystifying corpus linguistics for English language teaching (pp. 19–38). Palgrave Macmillan.
Schroer, W. (2008). Defining, managing, and marketing to generations X, Y, and Z. The Portal, 10, 9–15. Retrieved from https://www.yumpu.com/en/document/view/10000289/defining-managing-and-marketing-to-generations-x-y-and-z-iam
Song, T., & Reynolds, B. L. (2022). The effects of lexical coverage and topic familiarity on the comprehension of L2 expository texts. TESOL Quarterly, 56(2), 763–774. https://doi.org/10.1002/tesq.3100
Steel, C. (2016). Students' perspectives on the affordances and constraints of using mobile devices and applications for learning languages. In A. Gimeno, M. Levy, F. Blin, & D. Barr (Eds.), Worldcall: Sustainability and computer-assisted language learning (pp. 230–243). Bloomsbury.
Tate, T., & Warschauer, M. (2022). Digital tools for promoting social reading. In M. Brooke, S. Zhu, A. Ramanujan, T. Smotrova, S. Sim (Eds.), Selected papers from the 6th CELC symposium for English language teachers (pp. 4–17). Centre for English Language Communication.
Templeton, J., & Timmis, I. (2023). A flexible framework for integrating DDL. In K. Harrington, & P. Ronan (Eds.), Demystifying corpus linguistics for English language teaching (pp. 39–58). Palgrave Macmillan.
Timmis, I. (2015). Corpus linguistics for ELT: Research and practice. Routledge.
Verifica Report (2008). Informe Verifica del grado de Administración y Empresa de la Universidad de Extremadura. ANECA. Retrieved from https://www.unex.es/conoce-la-uex/centros/ciencias/sgic/comision-de-calidad-de-las-titulaciones/MFPES/inf_ANECA
Xiangming, L., Liu, M., & Zhang, C. (2020). Technological impact on language anxiety dynamic. Computers & Education, 150, 1–12. https://doi.org/10.1016/j.compedu.2020.103839
Xu, M., Xiao, C., Xiaobin, L., Xiaoyue, L., & Qiaoxin, Z. (2019). Using corpus-aided data-driven learning to improve Chinese EFL learners’ analytical reading ability. In S. K. S. Cheung, J. Jiao, L. Lee, X. Zhang, K. C. Li, & Z. Zhan (Eds.). Technology in education. Pedagogical innovations (pp. 15–26). Springer.
Zhang, M. (2013). The application of corpus tools in the teaching of discipline-specific academic vocabulary: A case study for information. International Journal of Computer-Assisted Language Learning and Teaching, 3(4), 33–47. https://doi.org/10.4018/ijcallt.2013100104
Zhang, D., Hennessy, S., & Pérez-Paredes, P. (2023). An investigation of Chinese EFL learners’ acceptance of mobile dictionaries in English language learning. Computer Assisted Language Learning. https://doi.org/10.1080/09588221.2023.2189915
_______________________________
1 These reports are based on “Verifica” (a tool of the Spanish Agency of Academic Evaluation, ANECA) for the evaluation of general, transversal, and specific competences in the different specializations and academic domains.