Quality Enhancement In Your Context

Enhancing Playtesting and Feedback Literacy Through Structured Iteration and Stakeholder-Informed Redesign

PHE707 Using Evidence to Support Student Success - Coursework 1

I. Introduction

Playtesting is the moment a game stops belonging to its developer and starts belonging to the player. In professional game development, playtesting is a key aspect of iteration, the process through which ideas are tested, challenged, and refined. As an educator working in game development, I regard making that process work for students as a pedagogic goal and an important area of professional practice.

CRE344: 3D Game Creation is a 40-point, Level 5 practice-based module in which students work collaboratively to design and develop a 3D game. The module emphasises practical development, teamwork, communication, and the creation of a portfolio-ready product. Its pedagogic approach is rooted in studio-style learning where students work through iterative cycles of design, build, testing, and refinement, mirroring the workflows of a small development team. A coursework component specifically requires students to produce a test plan, organise a playtest session, and document the results. The associated rubric expects clear goals, structured methods, analysis of findings, and actionable proposals for improvement.

Playtesting therefore already occupies a meaningful place in the module’s design. However, this enhancement inquiry argues that while students recognise playtesting as valuable, it is currently under-structured as a feedback-led learning activity. The core issue is not whether playtesting belongs in the module, but whether students are sufficiently supported to generate, interpret, and act on useful feedback. The inquiry draws on scholarship, student evidence, colleague consultation, and industry-informed perspectives to propose a redesign that makes playtesting more dialogic, feedback-literate, and explicitly iterative. This evidences A1 (design of learning activities) and reflects V4 and K2 by situating enhancement within a wider disciplinary and professional context.

II. Contextualised Research and Analysis

Feedback as Dialogue

A productive starting point for analysing playtesting is considering it as a form of feedback. Nicol (2010) argues that dissatisfaction with feedback often stems from a model in which it is treated as one-way transmission rather than dialogue. Ajjawi and Boud (2017) reinforce this by framing feedback as interaction rather than product, shifting attention from the comment itself to the social process through which meaning is negotiated. These perspectives are directly relevant, highlighting that playtesting is most valuable when it becomes a process of discussion and response, not a list of isolated comments. The implication for CRE344 is that playtesting sessions need to be designed to promote exchange, not simply to collect reactions (V3, K3).

Feedback Literacy and Evaluative Judgement

Carless and Boud (2018) argue that students need to develop feedback literacy, defined as the capacity to appreciate feedback, make judgements, manage affect, and take action. This is essential to understanding why playtesting can underperform pedagogically even when students value it: they may recognise its importance while still lacking the skills to give specific critique, interpret peer responses, or translate comments into design decisions. Ajjawi et al. (2018) extend this through the concept of evaluative judgement, the capacity to discern quality in one’s own work and the work of others. In a playtesting context, this means students need support not just to say whether something “works” or is “good or bad”, but to identify what quality looks like in gameplay, usability, or “game feel”. The implication is that the problem is not poor comments alone, but an underdeveloped framework for helping students learn how to judge and use feedback (K1, A4).

Critique in Creative Disciplines

Game design and development sits adjacent to studio and critique cultures in art and design. Blair (2007) demonstrates that poorly structured critique can inhibit participation, heighten emotional risk, and reduce the educational value of feedback. Orr and Bloxham (2013) highlight the ambiguity and subjectivity often present in creative assessment, where quality is experienced tacitly rather than through explicit criteria. These insights suggest that in a studio-style environment such as CRE344, critique can easily become unfocused without clear scaffolding, and feedback may be experienced as ambiguous or unclear. This has direct implications for inclusive practice. Without structure, participation becomes uneven, and students who are less confident in peer settings may disengage (V1, V2, K2).

Authentic and Experiential Learning

Despite these challenges, playtesting remains one of the module’s strongest authentic learning activities, rooted in the wider culture of game development. Kolb’s (1984) experiential learning model maps onto the playtesting cycle, in which students build, test, reflect, and revise. Villarroel et al. (2018) describe authentic assessment as work that mirrors real-world practice and requires knowledge to be applied in context, which playtesting does. Kultima (2015) adds a professional games industry perspective, identifying iteration as essential and natural to game development. Hamilton et al. (2023) argue that practice-based tasks can deepen learning but require deliberate pedagogic design, while Liang and Kang (2024) show that situated, collaborative activities also strengthen peer connection and belonging. The enhancement case is therefore not about replacing playtesting, but about making it function more effectively as the authentic, iterative activity it is already intended to be (V4, A1).

The literature suggests that playtesting is most effective when feedback is treated as a conversation rather than a collection of comments, when students are supported to develop the skills to give and use critique, and when sessions are deliberately structured to ensure inclusive participation. Whether these conditions are currently being met in CRE344 is the focus of the following section, which draws on engagement with three stakeholder groups: students, colleagues, and industry-informed perspectives (V3, V4, K2, K3).

III. Stakeholder Engagement

Three stakeholder groups were engaged to triangulate the contextual analysis: students as the primary participants in playtesting, colleagues as internal stakeholders with insight into delivery and assessment design, and industry-informed perspectives as an external frame for professional practice. This multi-stakeholder approach evidences V5 (collaborative working) and V4 (engagement with wider context).

Student Stakeholders

Students were engaged through survey data exploring their experiences of playtesting and feedback. Responses confirm strong recognition of playtesting as a valuable development activity. Students associate it with bug identification, understanding player experience, refining mechanics, and testing whether design ideas work in practice (Figure 1).

At the same time, students consistently identify limitations. Recurring concerns include vague comments, insufficient detail, subjectivity, and a lack of time to discuss findings (Figure 1). Sessions are described as under-structured, and there is clear appetite for better guidance and organisation. These findings, though limited by sample size (n=9), connect directly to the literature. In Carless and Boud’s (2018) terms, students value feedback but lack the literacy to generate and use it effectively. From Nicol’s (2010) perspective, the issue is one of insufficient dialogue. Blair’s (2007) and Orr and Bloxham’s (2013) work on creative critique helps explain why feedback feels ambiguous when expectations are not made explicit. Student stakeholders point clearly towards enhancement in three areas: feedback clarity, session structure, and support for using feedback in iteration (V1, V2, A4, K1).

Colleague Stakeholders

Colleagues were consulted through conversations and informal meetings with both the course director and fellow lecturers involved in delivering studio-based modules. These ongoing discussions within the programme team reinforce the importance of playtesting as a legitimate and critical part of game development teaching. Colleagues share the view that iteration is central to studio practice and that students benefit from exposing their work to peer evaluation. However, colleagues also recognise that feedback quality can be inconsistent and that students often need more support in how to give critique, what to focus on, and how to convert comments into concrete design change. There is a shared recognition that authentic practice does not automatically produce effective learning without pedagogic scaffolding. This aligns with Orr and Bloxham’s (2013) discussion of ambiguity in creative judgement and with Hamilton et al.’s (2023) argument that practice-based activities require deliberate design. Colleague perspectives therefore reinforce the case for refinement that preserves studio authenticity while improving consistency and guidance (V5, A1, K5).

Industry-Informed Perspective

An industry-informed perspective is important because CRE344 is explicitly oriented towards professional game development practice. In professional settings, playtesting is not optional or incidental; it is part of the core iterative workflow through which games are refined, usability issues are identified, and player experience is evaluated (Kultima, 2015). Critically, industry practice treats feedback as useful only to the extent that it is actionable: it must inform design decisions, prioritisation, and revision.

This perspective reframes the current weaknesses in playtesting as more than classroom inefficiencies. They represent a gap between the intended authenticity of the activity and the professional quality of the evaluative practice it should model. Industry-informed perspectives strengthen the argument that playtesting should function as a structured iterative practice, not an ad hoc peer exercise (V4, K2). Across all three stakeholder groups, the message converges: playtesting is valued, but its current implementation lacks the structure, feedback literacy support, and explicit connection to iteration that would make it function as the educative activity it is intended to be. This convergence informs the revisions proposed in the following section.

IV. Intended Revisions and Innovations

In response to the evidence from contextualised analysis and stakeholder engagement, four linked revisions to playtesting in module CRE344 are proposed. The aim is not to replace the existing task, but to strengthen its learning design so that it functions more effectively as an authentic, inclusive, and educative activity.

Revision 1: A structured playtesting framework. Each session would include a defined test goal, a clear indication of what is being tested, assigned roles for players and observers, and a short debrief. This responds directly to student requests for better organisation and aligns with the expectations already built into the coursework rubric. Structuring sessions this way also supports more inclusive participation by giving all students a clear role, reducing the risk of uneven participation and disengagement identified in the critique literature (Blair, 2007; Orr and Bloxham, 2013). This evidences A1 and V2.
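
To illustrate how such a framework could be captured and shared by a team, the sketch below expresses a session plan as a simple Python structure. It is purely illustrative: the class name, fields, and example values are my own assumptions rather than anything prescribed by the module or its rubric.

    # Illustrative only: one way a team might record a playtest session plan.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class PlaytestSessionPlan:
        test_goal: str                     # what the session is trying to find out
        feature_under_test: str            # the part of the build being examined
        players: List[str] = field(default_factory=list)    # students assigned to play
        observers: List[str] = field(default_factory=list)  # students assigned to watch and take notes
        debrief_minutes: int = 10          # short structured discussion after play

    # Hypothetical example values
    plan = PlaytestSessionPlan(
        test_goal="Can a first-time player complete the tutorial without help?",
        feature_under_test="Tutorial onboarding and camera controls",
        players=["Student A", "Student B"],
        observers=["Student C"],
    )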

Revision 2: Scaffolded feedback prompts. A short, structured feedback form would ask students to identify what worked, where confusion occurred, one key issue, and one suggested improvement. This attempts to address the recurring concern that feedback is too vague, and supports more consistent peer critique across teams (Blair, 2007; Nicol, 2010; Carless and Boud, 2018). This reflects A4 and V1.
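
A minimal sketch of how the four prompts might be recorded consistently across teams is given below, again in Python and again purely illustrative; the field names and sample responses are assumptions, not a prescribed format.

    # Illustrative only: the four scaffolded prompts as a structured record.
    from dataclasses import dataclass

    @dataclass
    class PlaytestFeedback:
        what_worked: str            # something that played well or felt good
        where_confused: str         # a moment of confusion or friction
        key_issue: str              # the single most important problem observed
        suggested_improvement: str  # one concrete, actionable change

    # Hypothetical example responses
    feedback = PlaytestFeedback(
        what_worked="Jumping felt responsive and satisfying",
        where_confused="I did not realise the red door needed a key",
        key_issue="The objective is never restated after the opening cutscene",
        suggested_improvement="Show an on-screen objective reminder after 30 seconds of inactivity",
    )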

Revision 3: Explicit teaching of feedback literacy. Short guidance on constructive critique, examples of usable and unusable feedback, and modelling of how feedback informs revision would be built into the module, and potentially into other modules focused on design and analysis. The evidence shows that students do not simply need more feedback, but more support in how to engage with it productively (Ajjawi et al., 2018; Carless and Boud, 2018). This evidences K1 and V3.

Revision 4: Linking playtesting explicitly to iteration. Students would record what feedback they received, what changed, and why, through a short iteration record in the form of a developer log or reflective commentary. This makes the learning process visible and ensures feedback is used, not merely collected. It aligns with experiential learning principles (Kolb, 1984) and the iterative practice of game development (Kultima, 2015). This reflects V4 and K2.
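
One lightweight way a team might keep such a log alongside its build is sketched below, appending each entry to a shared JSON file. The file name, field names, and example content are assumptions chosen for illustration; a written reflective commentary would serve the same purpose.

    # Illustrative only: recording iteration-log entries that link feedback to change.
    import json
    from datetime import date

    def log_iteration(feedback_received: str, change_made: str, rationale: str,
                      path: str = "iteration_log.json") -> None:
        """Append one entry recording what feedback was received, what changed, and why."""
        entry = {
            "date": date.today().isoformat(),
            "feedback_received": feedback_received,
            "change_made": change_made,
            "rationale": rationale,
        }
        try:
            with open(path) as f:
                log = json.load(f)
        except FileNotFoundError:
            log = []
        log.append(entry)
        with open(path, "w") as f:
            json.dump(log, f, indent=2)

    # Hypothetical example entry, for illustration only
    log_iteration(
        feedback_received="Testers could not find the exit in level 2",
        change_made="Added a lit doorway and a directional sound cue near the exit",
        rationale="Makes the goal legible without adding explicit UI markers",
    )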

Collectively, these revisions retain the authenticity of playtesting while making it more structured, dialogic, and inclusive. They respond to student and colleague concerns, align with professional expectations, and create a clearer bridge between feedback and iteration that potentially extends beyond the module into the wider game design and development degree programme. The proposed revisions move towards more intentional design (A1), stronger learner support (A4), evidence-informed enhancement (V3), and closer alignment between pedagogy and disciplinary context (V4, K2).

V. References

Ajjawi, R. et al. (2018) ‘Conceptualising evaluative judgement for sustainable assessment in higher education’. Available at: https://doi.org/10.4324/9781315109251-2.

Ajjawi, R. and Boud, D. (2017) ‘Researching feedback dialogue: an interactional analysis approach’, Assessment & Evaluation in Higher Education, 42(2), pp. 252–265. Available at: https://doi.org/10.1080/02602938.2015.1102863.

Blair, B. (2007) ‘At the end of a huge crit in the summer, it was crap I’d worked really hard but all she said was fine and I was gutted.’, Art, Design & Communication in Higher Education, 5(2), pp. 83–95. Available at: https://doi.org/10.1386/adch.5.2.83_1.

Carless, D. and Boud, D. (2018) ‘The development of student feedback literacy: enabling uptake of feedback’, Assessment & Evaluation in Higher Education, 43(8), pp. 1315–1325. Available at: https://doi.org/10.1080/02602938.2018.1463354.

Hamilton, E. and Margot, K. (2023) ‘Using Practice-Based Learning to Extend Undergraduate Teaching and Learning’, International Journal for the Scholarship of Teaching and Learning, 17(1), pp. 1–11. Available at: https://doi.org/10.20429/ijsotl.2023.17123.

Kolb, D.A. (1984) Experiential learning: experience as the source of learning and development. Englewood Cliffs, NJ: Prentice-Hall.

Kultima, A. (2015) ‘Developers’ perspectives on iteration in game development’, Proceedings of the 19th International Academic Mindtrek Conference. AcademicMindTrek’15: Academic Mindtrek Conference 2015, Tampere Finland: ACM, pp. 26–32. Available at: https://doi.org/10.1145/2818187.2818298.

Liang, L.R. and Kang, R. (2024) ‘Increasing Commuter Students’ Sense of Belonging with Situated Learning in a First-year Computer Programming Course’, International Journal for the Scholarship of Teaching and Learning, 18(2). Available at: https://doi.org/10.20429/ijsotl.2024.180203.

Nicol, D. (2010) ‘From monologue to dialogue: improving written feedback processes in mass higher education’, Assessment & Evaluation in Higher Education, 35(5), pp. 501–517. Available at: https://doi.org/10.1080/02602931003786559.

Orr, S. and Bloxham, S. (2013) ‘Making judgements about students making work: Lecturers’ assessment practices in art and design’, Arts and Humanities in Higher Education, 12(2–3), pp. 234–253. Available at: https://doi.org/10.1177/1474022212467605.

Ulster University (2025) CRE344 3D Game Creation: Module Handbook 2025/26. Belfast: Ulster University.

Villarroel, V. et al. (2018) ‘Authentic assessment: creating a blueprint for course design’, Assessment & Evaluation in Higher Education, 43(5), pp. 840–854. Available at: https://doi.org/10.1080/02602938.2017.1412396.