Case Study 3: Assessing Learning and Exchanging Feedback

Connecting Feedback, Peer Review, and Digital Tools for Learning

Contextual Background

As a Specialist Technician at London College of Fashion, I deliver technical workshops across various courses within the School of Media and Communication. These workshops have defined learning outcomes but do not directly contribute to students’ grades. This can affect engagement: some students are fully committed, while others see less value in participating. In addition, my limited insight into students’ prior work and course requirements makes it difficult to provide meaningful, targeted feedback. A key concern is to ensure that formative feedback is recognised as an essential part of student learning while bridging the gap between technical and academic development.

I currently provide real-time, hands-on guidance during workshops to support skill refinement and experimentation. This aligns with Nicol and Macfarlane-Dick’s (2006) principles of good feedback practice, particularly in fostering self-regulation. However, this feedback lacks structure and reflection, making it difficult for students to link their technical progress to broader academic goals. Furthermore, the absence of technical staff in formal assessment creates a disconnect between practical learning and its academic recognition. While students value the informal workshop environment, structured feedback mechanisms could help them to articulate their technical development more effectively (Addison, 2014).

Moving Forward

Strengthening structured feedback frameworks – Providing structured feedback rubrics within workshops could ensure that feedback is clear, consistent with unit learning outcomes and conducive to self-reflection (Nicol & Macfarlane-Dick, 2006). Explicitly linking technical learning to academic progression would also help students understand how their skills contribute to their overall development.

Increasing collaboration with academic staff – Closer collaboration with course tutors could improve the integration of technical and theoretical learning. Nicol and Macfarlane-Dick (2006) emphasise the importance of clarifying learning expectations, which could be achieved by aligning workshop feedback with academic assessment criteria. Addison (2014) also criticises rigid learning outcomes in creative education and suggests that a more negotiated approach to assessment could improve student learning.

Promoting peer review and collaborative learning – Encouraging students to engage in structured peer feedback could help them to critically reflect on their progress, gain diverse perspectives and develop self-assessment skills. The ‘Make the Grade’ strategy (Finnigan, n.d.) suggests that increasing student engagement with assessment criteria could reduce resubmissions and improve performance. Incorporating peer feedback into workshops may also help students to better understand the assessment criteria in an interactive way.

Maximising blended learning and digital feedback – Blended learning and asynchronous resources are already available, but their role in supporting formative feedback could be enhanced. Regular updates, structured prompts for reflection and interactive elements such as self-assessment checklists can improve student engagement (Nicol & Macfarlane-Dick, 2006). Digital feedback should encourage dialogue rather than one-way communication, so that students actively engage with feedback rather than passively receive it.

Creating a reflective culture – Encouraging students to document their technical progress in journals, visual blogs, or process logs on platforms such as Miro and Padlet can help them connect experimentation to conceptual development. Reflection is a key principle in formative assessment (Nicol & Macfarlane-Dick, 2006) and could provide a more flexible approach that acknowledges the iterative nature of creative practice (Addison, 2014).

Advocating for inclusion in assessment conversations – Technical learning plays a crucial role in creative disciplines, yet technical staff are excluded from assessment discussions. Working with academic teams to include technical learning in assessment processes could create a more holistic approach to student assessment. Addison (2014) argues for a move beyond performative learning outcomes towards a model that values emergent and situated knowledge. Furthermore, Nicol and Macfarlane-Dick (2006) emphasise the role of feedback in clarifying performance expectations — an area that technical staff could contribute to by providing insights into students’ engagement with practical learning.

References

Addison, N. (2014) Doubting Learning Outcomes in Higher Education Contexts: From Performativity Towards Emergence and Negotiation. International Journal of Art & Design Education, 33(3), pp. 313–325.

Finnigan, T. (n.d.) Make the Grade. University of Derby PReSS Pack.

Nicol, D.J. and Macfarlane-Dick, D. (2006) Formative Assessment and Self-Regulated Learning: A Model and Seven Principles of Good Feedback Practice. Studies in Higher Education, 31(2), pp. 199–218.

Reflection: On Assessment Dimensions and Art Criticism

Reflections on Assessment, Art Criticism, and Student Attainment

Assessment plays a critical role in shaping students’ learning and influencing how they engage in creative practice. Dimensions of Assessment (Anon, n.d.) highlights the need for varied methods of evaluation that balance formative and summative approaches, while bell hooks’ essay Talking Art as the Spirit Moves Us (1995) critiques the power structures that shape artistic validation. Reflecting on these readings within my role as a technician, I recognise the challenge of ensuring students value formative feedback while advocating for assessment practices that acknowledge diverse artistic expressions.

The role of formative feedback in technical learning

In my workshops, I primarily provide formative feedback, offering students real-time guidance on their technical and creative decisions. However, as Dimensions of Assessment suggests, formative work is often perceived as less critical when it does not contribute to the final grade (Anon, n.d.). This is consistent with my observations: students sometimes overlook the importance of these sessions to their academic progress. To address this, structured reflection and peer feedback mechanisms are needed to help students recognise the formative process as essential to their learning.

Beyond the product: Assessing process and artistic intent

Traditional assessments in creative education often emphasise the end product, such as a fashion campaign or a 3D rendering, over the actual creative process. hooks (1995) refers to Sylvia Ardyn Boone’s discussion of the Mende aesthetic, in which true artistic perception requires a deep intellectual and cultural initiation. Similarly, assessment should move beyond superficial judgement and consider artistic intent and process. By encouraging students to document their experimentation, decision-making, and influences, a more holistic approach to assessment can be developed (Anon, n.d.).

Addressing power structures in assessment and art criticism

hooks (1995) critiques how mainstream art institutions often validate artists of colour only when their work conforms to prevailing narratives. This raises critical questions about assessment in arts education: Who sets the criteria for success? Whose artistic values are given priority? Standardised grading systems run the risk of reinforcing dominant perspectives and excluding diverse, situated knowledge (Anon, n.d.). As educators, we must advocate for assessment frameworks that recognise multiple artistic languages and perspectives.

Using ‘Make the Grade’ to reduce referrals and resubmissions

Dimensions of Assessment (Anon, n.d.) suggests that students often lose marks because they misunderstand the assessment criteria or overlook key elements. One possible solution is to implement structured interventions, such as the Make the Grade approach. Finnigan (n.d.) explains that Make the Grade aims to help students manage assessment expectations by unpacking assignments, building checklists, and conducting structured workshops. Integrating this approach into technical workshops could give students a clearer understanding of what is expected, reducing referrals and resubmissions and improving performance. In addition, using self-assessment checklists prior to submission can help students identify gaps in their work and make necessary adjustments (Finnigan, n.d.).

Conclusion: Rethinking assessment as a space for dialogue

hooks (1995) calls for a more engaged and dialectical approach to art criticism, one that encourages meaningful discourse rather than prescribing a rigid framework. Similarly, assessment should not just be a tool for judgement but a space for dialogue, reflection, and growth. By integrating structured feedback, process-based evaluation, and inclusive assessment practices, we can better support students in bridging technical skills with conceptual depth, ultimately fostering a more critically engaged learning environment.

References

Anon. (n.d.) Dimensions of Assessment. Unpublished document.

Finnigan, T. (n.d.) Make the Grade. University of Derby.

hooks, b. (1995) Art on My Mind: Visual Politics. New York: The New Press.