https://so05.tci-thaijo.org/index.php/reflections/issue/feed rEFLections 2025-12-30T11:37:56+07:00 Thanis Tangkitjaroenkun thanis.bun@kmutt.ac.th Open Journal Systems <p><em><strong>rEFLections</strong></em> is a double-blind refereed English language journal devoted to research in applied linguistics and English language teaching. It is published three times a year and is sponsored by the School of Liberal Arts, King Mongkut's University of Technology Thonburi.</p> <p><em><strong>rEFLections</strong></em> currently has only one format: electronic (ISSN 2651-1479), first published in 2018. The original print format (ISSN 1513-5934), first published in 2001, was discontinued in 2024. In its place, a special printed issue containing 10–12 of the year's most notable academic works is now released at the end of each year.</p> https://so05.tci-thaijo.org/index.php/reflections/article/view/286060 100 Great Activities in Language Teaching by P. Ur & S. Thornbury 2025-12-30T11:22:22+07:00 Trung Kien Pham trungkienbmhg@gmail.com 2025-12-30T00:00:00+07:00 Copyright (c) 2025 School of Liberal Arts, King Mongkut’s University of Technology Thonburi https://so05.tci-thaijo.org/index.php/reflections/article/view/286058 Evaluation of Technical Description Writing: An Assessment for ESP Learners in Engineering Programs 2025-12-30T11:37:56+07:00 Samia Naqvi snaqvi@mec.edu.om <p>This paper reports an empirical evaluation of a Closed Book Test (CBT) designed to assess technical description writing skills among first-year engineering students enrolled in an English for Specific Purposes (ESP) module. Grounded in Bachman and Palmer’s (1996) test usefulness framework, the study examines the assessment in terms of its validity, reliability, practicality, authenticity, interactiveness, and impact. 
The CBT required students to produce a written description of an electronic object using appropriate terminology, critically evaluate the product, and suggest improvements. Test development involved content-expert validation, internal and external moderation, and alignment with the ESP module outcomes. Data were collected through test scripts from the entire student cohort (N = 34), expert Content Validity Index (CVI) ratings, post-test survey responses (Likert-scale and open-ended items), and moderators’ comments. Analysis included blind marking of all test scripts by two examiners using a standardised analytic rubric, a paired-samples t-test for inter-rater reliability (<em>p</em> = 0.163), and exploratory factor analysis for construct validity. The mixed-methods approach combined quantitative analysis (survey ratings, statistical tests) with qualitative analysis of open-ended survey responses and moderator feedback. The post-test student survey yielded consistently high mean scores (4.1–4.5) across all six usefulness dimensions. The evaluation confirmed the CBT's overall test usefulness across all six dimensions through multiple validation methods, with 85% of students affirming its effectiveness in improving their technical writing skills. Limitations include the small sample size, single-institution context, and potential response bias. Future research should focus on scaling the CBT model across institutions and disciplines, implementing hybrid automated scoring systems, refining rubric analytics, and conducting longitudinal studies to examine skill transfer to professional contexts.</p> 2025-12-30T00:00:00+07:00 Copyright (c) 2025 School of Liberal Arts, King Mongkut’s University of Technology Thonburi