I am in the first year of a four-year PhD program in Educational Studies at the University of Victoria (UVic). My dissertation will build on my Master’s research in Educational Technology by exploring Emergent Process-Based Strategies for Writing Assessment in the Age of Large Language Models (LLMs). I bring expertise in process-based educational approaches, which emphasize the learning journey – the iterative cycles of inquiry, problem-solving, assessment, and reflection in the development of knowledge. This expertise is grounded in my Bachelor of Applied Science in engineering, a discipline inherently concerned with process, and reinforced by over 10 years of professional experience in design, making, and process-based teaching.
Emergent Process-Based Strategies for Writing Assessment in the Age of Large Language Models
The rise of LLMs has drastically shifted the media landscape, exacerbating a performance economy and an immediacy of ‘truths’ (de Castell et al., 2025). LLMs, a form of ‘artificial intelligence’ (AI), are simply statistically correlative prediction machines, yet they are capable of instantly generating text that is indistinguishable from human writing (Nikolic et al., 2024; Suchman, 2023). Sociotechnical systems theory proposes that social, technical, and environmental factors are inherently interdependent, existing in continuous interplay and redesign (Pasmore et al., 2019); the proliferation of LLMs marks an especially intense period of redesign that highlights the need to evaluate emergent educational practices.
Human-LLM dynamics alter learning, cognition, and relationships to knowledge (Hwang et al., 2020), with a profound impact on the assessment of student writing (Corbin et al., 2025; Khlaif et al., 2025). An outdated perspective views the learner as a sole cognitive actor and dismisses tool use as a substitute for cognition (Fawns & Schuwirth, 2024). Separating learners from contemporary tools stifles opportunities to connect learning with its application (Brown et al., 1989) and contrasts with cognitive load theory, which posits that minimizing extraneous processing (e.g., through favourable cognitive offloading) enhances focus on meaningful tasks and promotes effective learning (Mayer & Fiorella, 2021). The more sophisticated perspective of distributed cognition views technology as an integral component of a system in which competency is the ability to engage with and manage a distributed system of individuals, environments, and tools (Fawns & Schuwirth, 2024; Pea, 1993). This lens, coupled with constructivism – the view that people actively create their own meaning (Powell & Kalina, 2009) – is concerned with the interplay of human-LLM interaction (e.g., collaboration in writing) in context, and with how learners engage with distributed cognitive systems in their pursuit of intelligence.
Unfortunately, profound inequities are emerging as only some students have the access and skill to effectively leverage LLMs, and among them, some unethically present AI-generated text as their own (Chan & Hu, 2023; Nikolic et al., 2024). Educational institutions are responding reactively, through several unsophisticated approaches:
- use of unreliable AI detection tools that have serious consequences with regard to academic integrity (Chaka, 2024; Elkhatat et al., 2023),
- replacing take-home assignments with live or oral assessments, raising extensive accessibility issues (e.g., anxiety, neurodivergence) (Ng et al., 2025; Nikolic et al., 2024), and/or
- implementing AI policies that depend “entirely on student awareness, understanding, and voluntary compliance” (Corbin et al., 2025, p. 5).
These responses are inadequate, as they fail to support learners in ways that acknowledge the broader impact of LLMs on education (Ajjawi et al., 2022; Lund et al., 2026). Scholars are calling for a radical change to assessment, recommending process-based strategies to reliably support and measure the pursuit of human intelligence in the age of AI (Corbin et al., 2025; Siddiqui et al., 2025).
For over 50 years, scholars have highlighted the benefits of process-based evaluation (Sowell, 2020), which has been shown to:
- promote critical thinking (Glaser, 1984),
- foster motivation and persistence (Black & Wiliam, 1998; Bloom, 1968),
- address concerns with equity, diversity, inclusion, and accessibility (EDIA) (e.g., reduce performance gaps, address neurodiversity) (Black & Wiliam, 1998; Guskey, 2005; Wiggins, 2011),
- support academic integrity (Ajjawi et al., 2024; Jantos, 2021; Morris, 2016; Nikolic et al., 2024), and
- assess student writing augmented by an LLM (Ajjawi et al., 2024; Khlaif et al., 2025; Zhao, 2025).
The process of writing – the thinking, drafting, and revising – is where the learning occurs, yet often only the product is evaluated (Siddiqui et al., 2025). Techniques like pre-writing tasks, staged drafts, and oral checkpoints remain uncommon and impractical due to the time demands they place on educators (Fleckenstein et al., 2023; Sadler, 2010). Bringing further visibility to the process of writing may allow deeper scrutiny of the process of learning and the pursuit of human intelligence (Ajjawi et al., 2024; Siddiqui et al., 2025).
Purpose
I will produce a three-paper dissertation that develops theory and understanding around process-based evaluation of writing. This work explores emerging process-based practices in undergraduate writing assignments and examines how the proliferation of LLMs may have shaped process-based assessment of writing.
Methods
The proposed research will be grounded in the theoretical frameworks of constructivism and distributed cognition. Given the recency of LLMs, this study adopts an emerging grounded theory design, in which data acquisition and analysis occur concurrently, allowing patterns and concepts to develop directly from the data (Creswell, 2012). A literature review will begin broadly with process-based assessments of writing, then narrow to process-based assessment approaches that adapt to diverse media ecologies and/or acknowledge learners’ engagement with LLMs.
Current instructors of undergraduate classes that use process-based writing assessment will be recruited across multiple post-secondary institutions in British Columbia. Recruitment will seek to represent diversity in gender, race, ethnicity, age, dis/ability, and practices and policies of student LLM use. Data will be collected through audio-recorded, transcribed, semi-structured interviews. The number of participants will be determined by data saturation, with an estimated 30 participants (Baker & Edwards, 2012). Qualitative data will be analyzed using the constant comparative method to code and compare interview transcripts and identify emergent themes (Glaser & Strauss, 1967). Qualitative techniques to promote methodological rigour (e.g., triangulation, debriefing, member checks) will be employed.
The first phase of interviews is anticipated to yield novel and/or innovative process-based assessment strategies, tools, and/or interventions that will inform subsequent phase(s) of data collection. Retesting emergent themes reinforces their validity and enhances comprehension of the phenomenon (Creswell, 2012). Further data collection and analysis phase(s) (e.g., additional interviews, case studies, classroom interventions) will continue until a core theme is discovered.
Feasibility
The success of this research is supported by substantial literature, relevant expertise, and a strong academic network. I served as lead author of a refereed chapter, delivered two conference presentations, conducted an active study involving diverse participant recruitment and thematic transcript analysis, and managed several technically complex projects requiring advanced project management skills. Through my affiliation with the CFI-funded Technology Integration and Evaluation Lab, I will deepen my experience in research design, instrumentation, analysis, and advanced learning technologies while contributing to SSHRC-funded initiatives. The PhD program combines essential training with the advantages of dual supervision and a cohort of eight doctoral students in educational technology. My supervisors offer international expertise in educational technology: Dr. Michael Paskevicius has multinational experience in open education, technology-integrated learning design, and the development of digital literacies; Dr. Valerie Irvine has earned over $1.7 million in funding and effects policy change through research and leadership in the EDIA of technology-integrated education.
Knowledge Mobilization
This research advances process-based approaches to writing assessment in the age of LLMs, addressing a timely and highly relevant problem. This topic aligns with SSHRC’s priority for EDIA and applies to the Future Challenge Area of Humanity+, via augmented cognition. Findings will be disseminated at peer-reviewed conferences (e.g., EDEN, CSSE, OTESSA at Congress) and in scholarly journals (e.g., Assessment & Evaluation in Higher Education, Assessing Writing).